How can addiction recovery services include impact evaluation in their strategy?
- Dragos Dragomir
- Dec 4, 2025
- 7 min read
A few years back I sat with the leadership team of an addiction recovery charity. They were tired, proud, and slightly on edge. Referrals were higher than ever. Groups were packed. Staff were doing great work.
Funders were asking for “impact data” and they were not sure how exactly to get it. So I asked one simple question: “How do you know that what you're doing is working?”
Someone pointed to their spreadsheet of key performance indicators. Someone else talked about a powerful story from a service user.
Another pulled up a funder report that looked impressive but did not actually answer the question.
This is where so many great organisations live. Doing important work. Delivering hard-won services. Caring, stretching, firefighting.
But without a clear, shared, honest way of knowing whether all that effort is actually changing lives in the way you hope.
I understand that resistance. After twenty years in this field, I've felt it myself. When you're holding space for someone's recovery, when you're designing services with limited resources, when you're fighting for every pound of funding, evaluation can feel like one more demand from people who don't understand the work. Another box to tick. Another report to write for commissioners who've never sat with someone through their first week of sobriety.
That is what this article is about.
Not evaluation as yet another chore. Not evaluation as “the thing we do at the end for the funder”. Evaluation as the beating heart of your recovery work.
Evaluation has to sit at the centre, not on the edge
If you work in the addiction recovery space, you already care deeply. You already know the stakes. This is not about convincing you that outcomes matter.
It is about naming why evaluation must move from “nice to have” to “non-negotiable” in your strategy.
Here are four reasons why:
Because people deserve the best you can offer, not just the best you can describe
A powerful story can open a door. It cannot carry a system. Evaluation helps you see who is benefitting, who is not, and who is being left behind. That is a matter of ethics, not admin.
Because money follows clarity
Commissioners and funders are under pressure too. They are being asked to back interventions that can show change, not just activity. If you want to be in the room when those decisions are made, you need more than warm words and attendance stats.
Because staff need evidence that their work matters
Recovery work is emotionally heavy. People burn out when they feel they are pouring themselves into a black hole. Thoughtful evaluation feeds back real progress, real change, real learning. That is one of the most powerful antidotes to burnout.
Because systems change starts with honest feedback loops
You cannot build better models of care if you are not seeing clearly what your current model is doing. Evaluation is the mirror that lets you redesign the room.
Most organisations in our field have at least some kind of business strategy.
We're good at what we do in addiction recovery: excellent at developing strategy, articulating vision, and building services that respond to real need. We understand trauma-informed care, we grasp the neurobiology of addiction, we know that recovery is a process, not an event.
Yet somewhere between the vision and the delivery, between the mission statement and the monthly team meeting, evaluation becomes an afterthought.
Something we'll get to. Something we'll add when we have time, when we have funding, when things calm down - which, in our world, they never do.
If that is where your organisation is right now, you are not alone. But you also have a huge opportunity.
Thinking like a recovery organisation that evaluates on purpose
So how do you begin to shift from “we do collect some data and gather some feedback from our service users” to “evaluation is part of who we are”?
If I were sitting with your board or leadership team, I would start with three questions.
What are we really trying to change?
Not “what do we deliver” but “what do we want to see in people’s lives, families, and communities as a result of our work”.
How would we recognise that change if it walked through the door?
What would people be doing, feeling, saying differently at 3 months, 6 months, 12 months after they have engaged with you?
What is the minimum information we need, collected in the least painful way, to know whether that is happening?
Evaluation should feel lean and purposeful, not like an extra job piled on top of already stretched staff.
Once you have honest answers to those questions, everything else flows more easily.
You are not building “an evaluation system”. You are building a way to pay attention to what matters most.
How to weave evaluation into your strategy, not just your reporting
Here is a simple way to think about embedding evaluation into your organisational strategy.
1. Make evaluation a strategic aim, not a side project
If you have 3 to 5 strategic priorities, one of them should explicitly relate to understanding and improving your impact.
Spell it out. For example:
“We will build and sustain a practical impact evaluation approach that tells us how our services support long-term recovery and where we need to improve.”
This should be discussed at board level, reported on, and resourced, in the same way as any other strategic aim. Evaluation is not something your “data person” does in the corner. It is a leadership responsibility.
2. Choose a small set of core outcomes
Resist the urge to measure everything. Instead, identify 5 to 7 outcomes that reflect what recovery means in your context.
For example:
Stability in housing and finances
Reduced harmful use or severity of gambling behaviour
Improved mental health and wellbeing
Stronger supportive connections
Increased sense of hope, purpose, or identity
Use existing recovery frameworks and tools where you can rather than inventing everything from scratch. Resist the urge to create your own evaluation tool, as that would make it impossible to compare your results with the wider sector. The key is consistency, not perfection.
3. Design a simple learning cycle
A strategy that never lands in the diary is just a wish. Agree on a rhythm. For example:
Frontline staff collect outcome data and feedback at key points (start, end, and a 3- or 6-month follow-up).
A designated evaluation lead or small internal evaluation group turns that into a simple dashboard.
Every quarter, your leadership team holds a learning meeting:
What are we seeing?
What is improving?
Where are we stuck?
What do we want to test or change in the next quarter?
Then, crucially, you feed those insights back to staff and, where possible, to the people who use your services. Evaluation without feedback is just extraction.
4. Invest in people and tools
You cannot evaluate well on fumes. If you are serious about this, you will need to:
Set aside budget for evaluation work.
Give someone clear responsibility (and time) to lead it.
Provide basic training so staff understand why you are asking these questions and how to ask them safely and respectfully.
Use simple systems that staff can actually work with. If your database makes people want to throw their laptop out of the window, it is not going to support a culture of learning.
You may also choose to build a small internal evaluation team over time, or work with external partners on specific pieces of analysis or research.
I know what you might be thinking: "Resources? We're already stretched thin." But consider this: how much time do we spend explaining our work to funders, defending our budgets, justifying why our services should continue? What if that time was supported by robust, embedded evaluation that told our story with evidence and clarity?
What should you actually do in practice?
This will look different depending on your size and resources, but most recovery organisations can aim for a core evaluation toolkit that includes:
Routine outcome measurement
A brief set of validated or well-constructed questions at start, end, and follow-up, linked to your core outcomes.
Lived experience voice built in, not bolted on
Short feedback questions after groups or sessions. Regular listening spaces. Occasional deeper interviews or focus groups to explore what the numbers cannot tell you.
A simple visual dashboard
Nothing fancy. Just enough to see trends, patterns, and outliers. Something your team can look at together and make sense of.
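To make “nothing fancy” concrete, here is one minimal sketch of where a dashboard could start. It is written in Python with pandas and assumes, purely for illustration, that your system can export outcome scores to a CSV with hypothetical columns client_id, outcome, timepoint (start, end, followup) and score; adapt the names to whatever your own tools actually record.

```python
# A rough sketch, not a finished product: it summarises average change in
# each core outcome from start to end and from start to follow-up.
import pandas as pd

# Hypothetical export: one row per client, per outcome, per timepoint.
df = pd.read_csv("outcome_scores.csv")  # columns: client_id, outcome, timepoint, score

# One row per client and outcome, with a column for each timepoint's score.
wide = df.pivot_table(
    index=["client_id", "outcome"],
    columns="timepoint",
    values="score",
).reset_index()

# Change since the start of support, at the end and at follow-up.
wide["change_at_end"] = wide["end"] - wide["start"]
wide["change_at_followup"] = wide["followup"] - wide["start"]

# Average change and number of people measured, per outcome.
summary = wide.groupby("outcome")[["change_at_end", "change_at_followup"]].agg(
    ["mean", "count"]
)
print(summary.round(1))
```

Even a summary this simple gives a leadership team something concrete to sit around in a quarterly learning meeting: which outcomes are moving, which are flat, and where follow-up data is missing.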
Periodic deep dives
Once or twice a year, pick a big question. For example:
“How well are we supporting people with co-occurring mental health conditions?”
“What happens after people leave our residential programme?”
Use mixed methods to explore it properly, then make clear decisions on the back of what you find.
Closing the loop
Every evaluation activity should end with:
What did we learn?
What are we going to do differently?
How will we know if that change made a difference?
If your evaluation work is not changing practice, it is just admin. The goal is always better decisions in service of better recovery.
If this feels heavy, you are not failing
If you are reading this and thinking “this sounds right, but I have no idea where to start”, you are not alone.
Most organisations were never given a map for this. They were given a contract, a set of targets, and a reporting template. The rest they had to figure out while dealing with crisis after crisis.
Over the last few years, I have been working with recovery organisations to turn all of this into a clear, usable framework that helps them put evaluation at the heart of their strategy without overwhelming their teams. It breaks the work into manageable steps and keeps the focus on learning, not blame.
I will share more of that in future articles, but for now, here is the invitation.
Pick one small move.
Name one outcome that really matters.
Ask yourself how you would know if it was changing.
Start paying attention to that, on purpose.
Evaluation is not about proving you are perfect. It is about staying in honest relationship with the people you exist to serve.
And that, in the end, is the heart of recovery work.



