Research brief: Why Social Enterprises Resist or Collectively Improve Impact Assessment

Research paper by Jarrod Ormiston, published August 2022
Summary by Kate Ruff, May 2025

At Common Approach, we discuss the importance of flexible, shareable impact measurement and the need to move away from a one-size-fits-all approach. Ormiston’s paper addresses a key relationship that suffers under inflexible impact measurement solutions—the investor-investee dynamic and how it differs based on the investee’s measurement experience. We now have a clearer understanding of why some investees are grateful for their investors’ top-down solutions, while others are not. The author’s conclusions and insights on “lock-in” can help us—and investors—better understand and tailor data collection to the needs of investees.

Summary of the research paper

Source: Ormiston, J. (2022). Why Social Enterprises Resist or Collectively Improve Impact Assessment: The Role of Prior Organizational Experience and “Impact Lock-In”. Business & Society, 62(5), 989-1030.

In this article, Ormiston explored why some social enterprises are receptive and grateful when their investors tell them which impact metrics to track, while others resist investor influence over impact measurement.

Impact measurement “lock-in”

Ormiston found that inexperienced social enterprises (SPOs) welcome investor direction and suggestions, but that those who are locked into their own impact measurement practices view investor influence as an unwelcome burden, whether it takes the form of outright investor directives or more moderate investor co-creation of impact metrics.

Ormiston refers to lock-in as “a state where a social enterprise [SPO] is committed to particular approaches to understanding, measuring, and reporting impact” (p. 992). Organizations lock in when they have sufficient impact measurement experience to know what works for them. It is important to note that being “locked in” does not necessarily imply high-quality, expert, or sophisticated impact measurement. The organization may have settled on a “good enough” approach to measurement that fits their resource constraints and information needs.

“[We] decided to ditch some of our own standard reporting because we can’t afford the time to do both [our reporting and the reporting government wants]… although it’s better and it’s more information-rich. It doesn’t get us the money” (p. 1021).

Competitive vs collaborative relations with investors

The author observes that organizations experienced with impact measurement are more likely to adopt a combative stance toward investors’ social impact assessments, either because they are protecting practices they have already developed themselves or because they are challenging measurement obligations they deem insufficient or overly cumbersome. These more experienced organizations were more likely to describe investor-driven metrics as burdensome, irrelevant or “lost in translation” (p. 1001). Even so, experienced organizations often complied with investors’ requests because they held less power in the investor-investee relationship.

Organizations that are less experienced with impact measurement are more likely to adopt a collaborative stance. These organizations see the potential benefits they can derive from impact measurement, and they and their investors see opportunities to strengthen their collective work by bringing more measurement to it.

“I think the stakeholder analysis was important as well. . . I don’t think we had put a lot of thought into who the stakeholders were in our business and what impact we had on them” (p. 1023).

Takeaways

Ormiston suggests that investors who wish to be supportive and non-burdensome to their investees need to customize their impact measurement frameworks and data collection methods to their investees’ impact measurement experience.

Implications for Common Approach to Impact Measurement

We couldn’t agree more that there is a need for customizing impact measurement to investees, and the Common Approach standards can do just that.

Building on Ormiston’s observations and the theory of lock-in, we have segmented social purpose organizations (SPOs) into three categories:

  • Nascent, when the organization has no existing impact measurement system.
  • Emergent, when the organization is beginning to take ownership of metrics that work for them. They are thinking strategically about their purpose, those they serve, and the data they need to make decisions; however, they remain funder-driven and are still working to find their own footing.
  • Established (our term for locked-in), when the SPO is confident in adapting its metrics to stay aligned with evolving strategy and insight, and is not at all funder-driven. If the SPO undertakes additional measurement to comply with funders, it is out of obligation.

These categories consider not just impact metrics, but also digital infrastructure for measuring impact (the digital tools, such as software, that collect, store and exchange data), and data on the demographics of their management and board.

In our work, SPOs that are established demonstrate this in three related areas, all associated with how they measure and report their impact.

  • Social purpose organizations are established in specific impact measures and measurement approaches when they have sufficient experience to know which metrics they need to manage, when they know which metrics those they serve find meaningful, and when they have built data collection systems around those metrics.
  • Social purpose organizations are established in data collection, data structure and data storage after they have invested significant time into their systems. They may have undergone a significant process of choosing and onboarding software, such as a Salesforce customization or impact measurement software. Or, they may have developed complex and intricate spreadsheets that seamlessly populate impact management dashboards.  
  • Social purpose organizations are established in specific demographic data collection once they have undertaken a thoughtful, likely time-consuming, exercise on how to collect this data and what categories to use.

We believe that it is a good thing when SPOs are established. It takes time for an SPO to get their impact measurement systems to a point where they can use the data to make decisions and improve their work. Once they have figured out their systems and fully own their impact measurement, investors telling them what to do or requesting additional measurement becomes burdensome and interferes with that work.

This burden doesn’t affect locked-in organizations alone: we observe that when a grantmaker or investor imposes metrics and forms on emergent SPOs, it prevents them from becoming established. While emergent organizations are trying to think strategically and take ownership, imposed measurement keeps them in a funder-reporting trap that supplants the work of aligning measurement with management and makes finding their footing that much harder.

Each of these three categories has unique needs, and imposing the best practices for one category onto another will not produce the desired results. Investor behaviour that is ideal (helpful, non-burdensome) for an SPO with nascent systems will be burdensome and unhelpful to an SPO that is established, and vice versa.

Ormiston’s findings help us understand the disconnect between organizations that thrive under top-down metrics and those that struggle with them. The tension isn’t ungrateful SPOs or unhelpful investors, but the use of one-size-fits-all approaches to impact measurement. In line with the author’s conclusions and Common Approach’s ethos, the solution to this burden is flexibility. Nascent, emergent and established form a useful framework that helps investors apply that flexibility: by understanding an SPO’s impact measurement needs, they can tailor their impact data collection accordingly. Common Approach is here to help investors support those locked-in enterprises, and to help others on their journey towards that moment when impact measurement and management clicks.

Join the Common Approach community to stay up to date on our efforts to make impact measurement better, and help shape impact measurement standards!

📣 Follow us on LinkedIn, YouTube and Instagram

📌 Subscribe to our mailing list

