The Self-Assessment: Getting to 100%
The Common Foundations set a minimum standard for the “how” of impact measurement. Made up of five essential practices, the Foundations outline the fundamental processes an organization should have in place to confidently say, “We are doing impact measurement.”
When Common Foundations Version 2.0 was being developed, the goal was to convey the essentials of impact measurement based on the practices common to the many tools and approaches available globally. We were thorough—we reviewed over 500 tools and resources!
We also set a goal that the Common Foundations should meet social purpose organizations (SPOs) where they’re at. The indicator we chose for this measurement: 60% of SPOs should be able to score 100% on the Common Foundations self-assessment.
So far, we have not hit this goal. Far fewer than 60% of the organizations that take the self-assessment answer “Yes” to every question. However, on a question-by-question basis, no question has a “Yes” response rate below 82%, so no single sub-practice is clearly tripping up most organizations.
The Common Foundations are a minimum standard; they are not meant to be difficult or overly rigorous. In reviewing the submissions so far, we can see that organizations we are familiar with, and know to have good impact measurement practices, have said “no” to questions we believe they could answer “yes” to.
So we are left wondering: are organizations taking the assessment being too hard on themselves?
Here’s what we are seeing:
- The seven lowest-scoring questions in the self-assessment were spread across four of the five practices (all but the first one, “Describe your intended change”).
- These questions garnered “no” responses from only 15% to 18% of the organizations that have taken the self-assessment.
- Most organizations that do not score 100% are very close! Fewer than 20% of organizations have recorded scores below 50%.
Let’s take a closer look at the seven questions with the highest “no” responses, and explore what it takes to answer “yes”.
If you have taken the time to consider what indicators you are using and you believe these are the best indicators for your organization to use right now, you can answer “Yes” to this question.
Under “Collect Useful Information”, this sub-practice asks organizations to have a clearly articulated plan or process for how data will be gathered. This does not need to be elaborate or complicated; it simply ensures that data collection happens consistently and thoughtfully. Examples:
- A program coordinator will conduct short intake interviews with all new program participants. The answers are transcribed into a spreadsheet during the interview, or written down and input afterwards.
- The communications team sends an annual survey to our database each July. The questions remain largely the same each year, so we can track changes. The data is stored in our online survey tool, and exported into a spreadsheet that we upload into our impact measurement software.
- Our onsite staff conducts a count of participants during each session and enters it into the daily logs. The administrative coordinator inputs this data into our impact measurement software each week.
If you have these sorts of data collection processes in place (even if they are just starting points, and you know you will be building on them!), then you can answer “Yes” to this question.
This might mean ensuring your impact measurement software has reporting set up for at-a-glance check-ins on your key measurements. Or, it might mean a team member periodically uses Excel functionality to generate charts or graphs to represent the data collected in a more digestible format.
If you are regularly summarizing your data for senior leadership to refer to or for board or funder reports, then you can answer “Yes” to this question.
If you compare your data to a desired outcome, an ideal target, or to past performance, you can answer “yes” to this question.
If you produce a detailed report for specific audiences (such as your board), but then share key findings on social media or include learnings in other public communications, you can say “yes” to this question.
Basically, if your organization shares results and learnings in different ways to reach different people, you can answer “yes” to this question.
If your organization has the means to regularly communicate with those you are serving and your supporters, you can likely answer “yes” to this question.
If you think you can now answer “yes” to more of these questions, we encourage you to take the self-assessment again! You can also review the questions for all five practices offline with your team using this PDF worksheet.
If you can now answer “yes” to every question, your organization has successfully adopted the Common Foundations!
If there are still questions you cannot answer “yes” to, we hope this helps make the steps needed to get to a “yes” more manageable.
Do you have questions, suggestions, or general feedback about the self-assessment? We would love to hear from you! Please get in touch.