Common Foundations

The Self-Assessment: Getting to 100%

The Common Foundations set a minimum standard for the “how” of impact measurement. Made up of five essential practices, the Foundations outline the fundamental processes and practices an organization should have in place to confidently say, “We are doing impact measurement.”

When Common Foundations Version 2.0 was being developed, the goal was to convey the essentials of impact measurement based on the practices common to the many tools and approaches available globally. We were thorough—we reviewed over 500 tools and resources!

We also set a goal that the Common Foundations should meet social purpose organizations (SPOs) where they’re at. The indicator we chose for this measurement: 60% of SPOs should be able to score 100% on the Common Foundations self-assessment.

So far, we have not hit this goal. Far fewer than 60% of the organizations that take the self-assessment are answering “Yes” to every question. However, on a question-by-question basis, no question has a “Yes” response rate lower than 82%. So, there is no one sub-practice that is clearly tripping up most organizations.

The Common Foundations are a minimum standard; they are not meant to be difficult or overly rigorous. In reviewing the submissions so far, we have seen organizations that we know have good impact measurement practices answer “no” to questions we believe they could answer “yes” to.

So we are left wondering: are organizations taking the assessment being too hard on themselves?

Here’s what we are seeing:

  • The seven lowest-scoring questions in the self-assessment were spread across four of the five practices (all but the first one, “Describe your intended change”).
  • Each of these questions garnered “no” responses from only 15% to 18% of the organizations that have taken the self-assessment.
  • Most organizations that do not score 100% are very close! Fewer than 20% of organizations have recorded scores below 50%.

Let’s take a closer look at the seven questions with the highest rates of “no” responses, and explore what it takes to be able to answer “yes”.

2.3 Refine indicators so they are time-specific and can be observed or measured. Our qualitative and/or quantitative indicators have clear timelines and are focused on changes that can be observed or measured. (17% of SPOs answered “No”.)
A sub-practice of “Using Indicators”, Question 2.3 asks SPOs if they have taken the time to assess and adjust their indicators to ensure they are useful. A good indicator should clearly state what change it is measuring, and over what time period. For example, “the number of participants reporting improved confidence six months after completing the program” is time-specific and observable, while “increased confidence” alone is neither. Demonstrating Value explains this on page 18 of their guide, and Better Evaluation also provides some useful guidance.

If you have taken the time to consider what indicators you are using and you believe these are the best indicators for your organization to use right now, you can answer “Yes” to this question.

3.3 Have a clear plan for data collection. We plan our data collection including who collects the data, how we collect it, from whom and when. (15% of SPOs answered “No”.)

Under “Collect Useful Information”, this sub-practice asks organizations to have a clearly articulated plan or process for how data will be gathered. This does not need to be elaborate or complicated; it simply ensures that data collection happens consistently and thoughtfully. Examples:

  • A program coordinator will conduct short intake interviews with all new program participants. The answers are transcribed into a spreadsheet during the interview, or written down and input afterwards.
  • The communications team sends an annual survey to our database each July. The questions remain largely the same each year, so we can track changes. The data is stored in our online survey tool, and exported into a spreadsheet that we upload into our impact measurement software.
  • Our onsite staff conducts a count of participants during each session and enters it into the daily logs. The administrative coordinator inputs this data into our impact measurement software each week (a minimal sketch of this weekly roll-up follows the list).
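
As one hedged illustration of the third example, here is a minimal Python sketch of that weekly roll-up. The file names and column names (daily_logs.csv, weekly_totals.csv, date, participants) are hypothetical, not part of the Common Foundations; any spreadsheet or tool that does the same job is equally valid.

```python
# A minimal sketch: roll daily attendance counts up into weekly totals.
# File and column names are hypothetical; "daily_logs.csv" is assumed to
# have ISO-formatted "date" values and integer "participants" counts.
import csv
from collections import defaultdict
from datetime import date

weekly_totals = defaultdict(int)

with open("daily_logs.csv", newline="") as f:
    for row in csv.DictReader(f):
        day = date.fromisoformat(row["date"])
        year, week, _ = day.isocalendar()  # key by ISO year and week number
        weekly_totals[(year, week)] += int(row["participants"])

# Write the weekly roll-up the administrative coordinator would transfer
# into the impact measurement software.
with open("weekly_totals.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["year", "week", "participants"])
    for (year, week), total in sorted(weekly_totals.items()):
        writer.writerow([year, week, total])
```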

If you have these sorts of data collection processes in place (even if they are just starting points, and you know you will be building on them!), then you can answer “Yes” to this question.

The next three questions all fall under the practice “Gauge performance and impact”.
4.2 Assemble, organize and review your information. Our data is stored in ways that are useful and accessible to those who are learning about and improving the organization. They review it periodically (monthly, yearly, and as needed). (16% of SPOs answered “No”.)
This question asks whether you are set up to actually look at the data that has been collected and stored. Do you have a process in place for periodically checking that your data is accessible and useful to those who need it?

This might mean ensuring your impact measurement software has reporting set up for at-a-glance check-ins on your key measurements. Or, it might mean a team member periodically uses Excel functionality to generate charts or graphs to represent the data collected in a more digestible format.
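
As one concrete (and entirely optional) illustration, that summarizing step could also be scripted in Python instead of Excel. This is a minimal sketch: the file responses.csv and the columns year and score are hypothetical, and it assumes the pandas and matplotlib libraries are available.

```python
# A minimal sketch: summarize a raw data export into a simple chart.
# "responses.csv" and its columns "year" and "score" are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("responses.csv")

# Average score per year: a digestible at-a-glance summary of a key measurement.
summary = df.groupby("year")["score"].mean()

summary.plot(kind="bar", title="Average participant score by year")
plt.ylabel("Average score")
plt.tight_layout()
plt.savefig("score_by_year.png")  # share the chart in reports or check-ins
```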

If you are regularly summarizing your data for senior leadership to refer to or for board or funder reports, then you can answer “Yes” to this question.

4.3 Analyze data. We analyze our data to understand if and how changes are occurring. (15% of SPOs answered “No”.)
This question asks if you are going beyond merely looking at your data. Are you noticing patterns? Are you actively reflecting on what it means and how you might use the information to improve products or services?

If you compare your data to a desired outcome, an ideal target, or past performance, you can answer “yes” to this question.
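
To make that concrete, the comparison can be as simple as the sketch below. Every number here is a hypothetical placeholder; the point is the act of comparing, not the tooling.

```python
# A minimal sketch of comparing one indicator against a target and against
# past performance. All values are hypothetical placeholders.
target = 0.75      # e.g., goal: 75% of participants report improved wellbeing
last_year = 0.68   # the same indicator, measured a year ago
this_year = 0.71   # this year's observed value

print(f"Against target: {this_year - target:+.2%}")        # gap to the goal
print(f"Against last year: {this_year - last_year:+.2%}")  # year-over-year change

if this_year >= target:
    print("Target met. Is it time to raise the bar?")
elif this_year > last_year:
    print("Below target but improving. What drove the gain?")
else:
    print("Below target and not improving. Review the program and the data.")
```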

4.5 Base conclusions about impact on reasonable assumptions. We have considered what changes might have happened even without our work and we moderate our claims as necessary. (15% of SPOs answered “No”.)
Here, the self-assessment asks if you’ve identified where your data might be flawed. Are you actively avoiding overstating your organization’s impact? Are your findings presented in a way that gives readers an honest and nuanced picture of your data quality?

If you include a sentence or two about the limitations of the data you’ve collected when presenting it to any audience (such as pointing out where other factors may have contributed to the change, or challenges encountered during data collection), you can answer “yes” to this question.

The final two questions we’re looking at fall under “Communicate and use results”.
5.2 Choose reporting methods and communication styles targeted to the needs of different groups of people affected by your work. We create a few different reports (different lengths, detail, reading level and language) so that they are accessible to the groups affected by our work. (18% of SPOs answered “No”.)
This section asks you to look at how your results are being communicated and used. In this question, the self-assessment asks, “Are you being thoughtful in how you reach your target audience(s)?” For example, we have observed that most organizations share impact information on a user-friendly website, in a more detailed report, and in conversation with clients.

If you produce a detailed report for specific audiences (such as your board) and then share key findings on social media or include learnings in other public communications, you can answer “yes” to this question.

In short, if your organization shares results and learnings in different ways to reach different people, you can answer “yes”.

5.3 Report on performance and impact regularly. We routinely make public updates on the main things our organization has achieved (or not) and changed (or not). (15% of SPOs answered “No”.)
Does your organization have processes for regularly communicating with your community? If you have a newsletter for donors, a social media platform you maintain, or a website that is periodically updated, you are likely making regular updates about what your organization has accomplished.

If your organization has the means to regularly communicate with those you are serving and your supporters, you can likely answer “yes” to this question.

Given this additional context, would your organization answer any of these questions differently?

If so, we encourage you to take the self-assessment again! You can also review the questions for all five sub-practices offline with your team using this PDF worksheet.

If you can now answer “yes” to every question, your organization has successfully adopted the Common Foundations!

If there are still questions you cannot answer “yes” to, we hope this helps make the steps needed to get to a “yes” more manageable.

Do you have questions, suggestions or general feedback about the self-assessment? We would love to hear from you! Please get in touch.

Be sure you don’t miss Common Approach to Impact Measurement news, updates and resources: sign up for our newsletter and follow us on Twitter and LinkedIn.

