
When Impact Measurement Misfires: How Charities Can Cut Through the Noise

  • Writer: Helen Vaterlaws
  • Mar 4, 2025
  • 6 min read

Updated: Jan 22


We’ve all been there. You’re staring at a spreadsheet full of green KPIs and "successful" outcomes, yet on the ground, your team is exhausted, and your beneficiaries are still struggling. There’s often a quiet gap between what the Board sees and how the work actually feels.


I’ve seen this from both sides. As a Chief Scientific Officer, I designed measurement frameworks with an emphasis on methodological rigour. Later, in operational leadership roles, I applied those frameworks while running services at scale. That mix sharpened my view of what ‘good’ measurement looks like. It’s not just statistical rigour. It’s whether the measure informs decisions and improves real outcomes in the real world.


With AI making it easier than ever to collect, summarise, and report data, there’s a real risk of things getting even more cluttered. Most charities don't need more data; they need the right data. I’ve put together a five-step plan to help you move away from data overload and toward a system that actually supports your mission. This is for you if:


  • You’re collecting plenty of data but aren't sure it’s actually making your service better.

  • You’re a leader trying to give funders what they need without burning out your staff.

  • You want your measurement to be about people, not just numbers.


For a staff-friendly tool stack and my one-page cheat sheet, see: Charity Impact Measurement Tool Cheat Sheet (updated annually).


Where impact measurement often goes wrong (and the traps to watch for)


Before we look at how to fix things, we need to be honest about why so many systems break down. In my experience, these patterns are common and fixable without a full overhaul.


1. The 'homework' trap: When data collection feels like a chore



If your staff and the people you support don’t see the point of the data, they’ll treat it as a box-ticking exercise. When data collection feels imposed, driven by board reporting or funder requirements rather than frontline usefulness, the quality often drops and resentment can grow.


What to do instead: Work with your team to decide what actually matters. People are much more likely to collect good data if they know it helps them do their job better.


2. Focusing on statistics but losing the human story



It’s easy to get obsessed with spreadsheets, but stats alone rarely tell the whole story. You might have great numbers, but if you aren't capturing the lived experience of the people you’re helping, you’re missing the point of why you’re there.


What to do instead: Don’t let numbers stand alone. Use beneficiary stories (with informed consent and appropriate anonymisation) to explain the "why" behind the "what."


3. Using measurement tools that don't fit your daily operations



I’ve seen many charities try to use "off-the-shelf" metrics that just don’t fit their culture or their daily workflow. If a reporting tool is clunky or uses language that your team wouldn't use in real life, it’s likely going to fail.


What to do instead: Keep it simple and keep it relevant. Choose measures that feel like a natural part of the conversation, not an interruption to it.


4. Ignoring the external context behind your results



Charities don’t operate in a vacuum. Outcomes are shaped by economic conditions, policy shifts, demographic changes, and service demand that sit outside the organisation’s direct control.

Results can look weaker on paper while still reflecting strong operational performance. Context isn’t an excuse; it’s part of interpreting outcomes responsibly.


For example, maintaining service levels during a cost-of-living crisis or adapting to a change in local policy may represent effective leadership, not underperformance.


What to do instead: Build contextual intelligence into your measurement and reporting. Track the external conditions that influence demand, delivery, and outcomes, and be explicit about how they interact with your results. This allows boards to understand not just what changed but why, and to assess performance with greater accuracy and confidence.


5. Tracking metrics that no longer align with your mission



Too often, charities keep measuring things they stopped doing years ago, simply because "that’s how we’ve always done it." If your goals have changed but your metrics haven't, you're wasting time on data that no longer has a use.


What to do instead: Review your metrics regularly. If a piece of data isn't helping you make a decision, evidence your impact, or meet a funding requirement, is it still worth collecting?


Five steps to improve your impact measurement (without an overhaul)


You don’t need a massive budget or a total system redesign to start making your data more meaningful. Here is a simple, five-step plan to help you move toward a system that supports your mission.


[Diagram: five potted plants at increasing stages of growth, labelled Why, How, Choose, Engage, Evolve.]

1. Start by clarifying the ‘Why’ behind your mission


Before you look at spreadsheets, get your leadership and frontline staff in a room (or on a call). You need to agree on what you’re actually trying to achieve. If the goal isn't clear, the data will always feel like a burden.


  • The conversation to have: Ask your team: "What is the one outcome we exist to achieve?" and "What would failure look like for the people we support?"

  • Try this: Find the one goal your team would rally around, even if it feels ambitious. That’s your starting point.


2. Define the ‘How’ through shared experience


Metrics shouldn’t be handed down from the board; they should be grounded in the reality of the work. The best way to do this is to talk to the people who are actually involved.


  • The conversation to have: Host a small workshop with a mix of staff and beneficiaries (voluntary, inclusive and accessible). Ask them: "What changes would show us that we’re actually succeeding?"

  • Try this: When was the last time you asked the people you support what success looks like to them? Their answers might surprise you and lead to much better metrics.


3. Choose metrics that matter (not just ones that flatter)


It’s tempting to pick the numbers that always look good, but surface-level metrics rarely help you improve in the long-term. You need metrics that tell you the truth, even when it’s uncomfortable.


  • The conversation to have: For every metric you track, ask: "Is this tied to our mission?" and "Can we actually take action if this number changes?"

  • Try this: Identify the one metric that, if it dropped, would signal an immediate crisis. That is a metric that matters.

What to avoid (surface metric), why it’s misleading, and what to track instead:

  • Total number of interventions: shows how busy you are, not the result. Meaningful alternative: a simple, consistent, co-designed progress measure.

  • Event attendance numbers: counts people in the room, not impact. Meaningful alternative: how skills were used after the event.

  • Number of website hits: shows reach, but not engagement. Meaningful alternative: conversion rate to service inquiry.
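To make the last contrast concrete, here is a minimal sketch of the shift from a surface metric (raw website hits) to a meaningful one (conversion rate to service inquiry). The figures are purely illustrative, not drawn from any real charity.

```python
def conversion_rate(inquiries: int, visits: int) -> float:
    """Share of website visits that led to a service inquiry."""
    if visits == 0:
        return 0.0  # avoid division by zero for a quiet month
    return inquiries / visits

monthly_visits = 1200    # raw reach: looks impressive on its own
monthly_inquiries = 48   # engagement: what the service actually cares about

rate = conversion_rate(monthly_inquiries, monthly_visits)
print(f"Conversion rate: {rate:.1%}")  # prints "Conversion rate: 4.0%"
```

The point is not the arithmetic; it is that the denominator forces the conversation from "how many people saw us" to "how many people we actually reached."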

4. Change how you talk about data with your team


If data training is just a dull slide deck, people will switch off. To get buy-in, you need to make it relevant to their daily work and show them how it makes their lives easier.


  • The conversation to have: Sit down together and look at real data. Ask: "What is this telling us about our service today?"

  • Try this: Bring in the colleagues who are most hesitant and involve them in designing the training. If you can convince them, the rest of the team will follow.


5. Be brave enough to stop measuring what isn't working (within the bounds of your reporting obligations)


Before dropping a metric, distinguish between internal utility (what helps you run the service) and external compliance (what the funder requires). If a measure serves neither, consider retiring it and record the rationale so you can explain the change to auditors, funders, and your board.


  • The conversation to have: Every six months, look at your internal-utility metrics and ask: "Has this sparked any action lately?" and "Would we actually miss this if it was gone?"

  • Try this: Ask your team: "Which metric would you drop tomorrow if you had the choice?" If they all point to the same one, it might be time to let it go.
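The keep-or-retire test above can be sketched as a simple sorting rule. Everything here is hypothetical, including the metric names and the two flags; a real review would of course be a conversation, not a script.

```python
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    informs_decisions: bool  # internal utility: has it sparked action lately?
    funder_required: bool    # external compliance: is it a contractual obligation?

def review(metrics: list[Metric]) -> dict[str, list[str]]:
    """Sort metrics into 'keep' and 'retire' using the two tests above."""
    keep, retire = [], []
    for m in metrics:
        if m.informs_decisions or m.funder_required:
            keep.append(m.name)
        else:
            retire.append(m.name)
    return {"keep": keep, "retire": retire}

portfolio = [
    Metric("Co-designed progress measure", True, False),
    Metric("Total interventions", False, True),  # kept only because a funder requires it
    Metric("Legacy event headcount", False, False),
]
print(review(portfolio))
# {'keep': ['Co-designed progress measure', 'Total interventions'],
#  'retire': ['Legacy event headcount']}
```

Recording why each metric landed in its bucket gives you the rationale the section recommends keeping for auditors, funders, and your board.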


Ready to move forward with your charity impact measures?


Impact measurement isn’t just another task on your to-do list. It’s how you protect what actually matters for the people you support and stop wasting energy on the things that don't. By following these five steps, and only introducing new tools when your system is ready, you can turn your data from a heavy administrative burden into a tool for making better decisions.


  • A quick reflection: Which metric on your report feels like noise today? What would happen if, after checking your obligations, you simply stopped tracking it?


  • Stay Compliant: Always check your funder requirements and your organisation’s GDPR/retention policies. In the UK, the Information Commissioner's Office guidance is a good starting point for data protection and ethical use of personal data.


  • Next step: If you want to see where your gaps are, you can take my two-minute impact self-assessment here.



Change doesn’t start with a workshop; it starts with one honest conversation.

 



Note: These insights are general guidance based on practitioner experience and are not legal or regulatory advice. Always review your specific funder contracts and data protection policies (e.g. GDPR) before making significant changes to data collection or retention schedules. Examples are for illustrative purposes only; no official affiliation with the organisations or tools mentioned is claimed.

© 2026 Insights2Outputs Ltd. | All rights reserved | Privacy Policy

Disclaimer: This content is provided for informational and illustrative purposes only. It does not constitute professional advice and reading it does not create a client relationship. Always obtain professional advice before making significant business decisions.
