
What does “evidence-based” really mean in addiction care?

In addiction treatment circles, "evidence-based" has become something of a sacred phrase: invoked by clinicians, demanded by commissioners, and sought by people desperately searching for help.


Yet for all its ubiquity, the term remains curiously opaque.


What does it actually mean when we say a treatment is evidence-based?


How can an addiction recovery programme genuinely embrace an evidence-based practice approach?


And why should it matter to someone standing at the threshold of recovery?


In this article I unpack those questions and offer a way through the maze that is the evidence-to-practice gap: what it means, how to do it, and how to know it’s working.


Understanding standards of evidence


At its core, evidence-based treatment represents a commitment to practices that have been rigorously tested through scientific inquiry, subjected to clinical judgment and determined to be appropriate for the treatment of a given individual, population, or problem area.


It is important to acknowledge that diverse types of evidence must be considered when evaluating addiction treatments, and that interpreting the available data is not always a clear-cut process. Clinical trials, reviews, and meta-analyses are all useful sources, but each should be evaluated in terms of internal validity, external validity, replicability, clarity of implementation guidance that providers can reference, and applicability to the client population and the expertise of the clinicians involved.


And… not all evidence carries equal weight. The research field operates with a hierarchy of evidence, with the randomised controlled trial (RCT) standing as the gold standard: the most reliable and robust form of scientific proof we possess.


Think of an RCT as a carefully orchestrated experiment: researchers randomly assign participants to different treatment groups, meticulously control for variables, and measure outcomes with precision. This rigorous standard is designed to produce the best scientific evidence of a treatment’s effectiveness and to protect people from ineffective or potentially harmful interventions. In the USA, the Food and Drug Administration requires at least two independent RCTs demonstrating significantly better results than placebo or "treatment as usual" before approving any new medication or medical device. In the UK, NICE uses RCT evidence to guide commissioning and clinical practice.


In addiction treatment, this standard becomes particularly crucial. We're not simply evaluating whether something feels helpful or seems promising; we're asking whether it demonstrably changes lives, reduces substance use or engagement with a particular addictive behaviour, and supports sustained recovery. The stakes - public health, public safety, individual wellbeing - demand nothing less than this level of scientific rigour.


However, it is not only academic research that can create an ‘evidence base’.

Practice generates evidence too. The patterns clinicians see across caseloads, the insights from lived experience, service dashboards and audits, client-reported outcomes, case reviews and the like: all of these count. Case studies, ethnographies, and lived experience knowledge help us understand why something works, for whom, and in what conditions.


Evidence-based practice (EBP) is often defined as the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individuals. True EBP requires the addiction professional to consider the characteristics of the practice context in which they work. EBP is thus a process that integrates the client’s needs, values and preferences with relevant research evidence and clinical expertise in health care decision-making. That is, addiction professionals who follow this practice will consider scientific findings, their own clinical experience, and the client’s desires, strengths and needs when prescribing, recommending, or delivering treatment.


Evidence-based treatments, like cognitive behavioural therapy, can be an important part of the EBP process, but not all evidence-based practices use or prescribe evidence-based treatments.


When we combine rigorous studies with practice-based evidence, routine outcome monitoring, and co-produced learning, we get a truer picture, one that is more humane, more useful and far closer to the realities of addiction care.


Key components of evidence-based practice


EBP rests on three interlocking components: the best available research, the clinical expertise required to apply it safely and effectively, and the values and context of the person receiving care. In tackling addiction-related harms, all three matter, and together they create a system that only works when the parts talk to each other.


1) Best available research (what works, for whom, under what conditions)


What it is. High-quality studies (RCTs, implementation studies, meta-analyses, and qualitative research) establish whether an intervention has a reliable signal of benefit and in which contexts. In addiction, this includes psychosocial approaches such as motivational interviewing (MI), cognitive behavioural therapy (CBT), contingency management (CM), the community reinforcement approach (CRA) and family work; medications (where indicated); and system-level strategies (harm reduction, community recovery).


How it looks in practice.


Start from trusted meta-level summaries (guidelines, systematic reviews) and pay close attention to details such as inclusion criteria, comparators, outcomes and follow-up processes.


Critically appraise the evidence and prioritise studies with good external validity (real-world settings, diverse populations) when planning services.


Combine “does it work?” evidence with implementation evidence (can it be delivered here, or are there structural barriers - housing, transport, digital exclusion and so on - that could torpedo effectiveness? Fidelity matters, but so does fit) and mechanism evidence (why does it work, and which mechanisms of change deserve closer attention?).


2) Clinical expertise (safe, skilled, adaptive delivery)


What it is. The judgement, tacit knowledge, and technical skills clinicians and other addiction professionals use to assess risk, design and sequence interventions, maintain the therapeutic alliance, and adapt care while preserving the active ingredients.


How it looks in practice.


Comprehensive assessment that covers addiction history, co-occurring mental health, risk (including suicidality), influence of trauma, physical health issues, safeguarding concerns, and social determinants.


Formulation-driven support planning: link problems to desired goals and outcomes, identify realistic change mechanisms and choose interventions that target them.


Fidelity with flexibility: use evidence-based treatment models, but adapt language, pacing, and delivery mode to individual strengths and needs (readiness, culture, literacy, complexity, etc.).


Team-based practice: routine MDT case discussion, clear escalation protocols, and warm handovers across medical, psychological, and peer support staff. Avoid heroic individual practice with no supervision, audit, or outcome monitoring.


3) Patient values & context (goals, identity, constraints, and strengths)


What it is. The person’s aims (abstinence, reduction, stability), preferences, cultural identity, lived experience, and practical constraints. In addiction, readiness, safety, and environment are decisive.


How it looks in practice.


Co-created goals that are specific, meaningful, and revisited often. Remember that people need options, plus guidance. Person-centred does not mean that the person chooses every step of the intervention.


Trauma-informed stance: safety, choice, collaboration, trust, empowerment.


Accessibility by design: flexible appointment times, remote options, attention to social constraints and difficulties, group-specific interventions (by gender, culture, ethnicity, etc.), language choice, and cost/transport solutions.


Peer support involvement: use lived experience roles for engagement, credibility, and hope.


The glue: measurement-based care and continuous learning


These three components only come alive when services commit to a disciplined loop of measuring, reviewing, and adapting. Start with simple, repeatable outcomes: abstinence or reduced use, cravings management, psychological wellbeing, physical wellbeing, social functioning and quality of life, therapeutic alliance, and service utilisation. Plot them over time so they can guide decisions throughout support and aftercare.


Build in regular structured reviews: if the needle isn’t moving, change something - intensity, modality, sequencing, or the social conditions that keep tripping people up. Treat practice as a source of evidence: record the right data; maintain a clear effectiveness overview; run case reviews after incidents as well as after successes; capture implementation notes about what helped or hindered uptake; then feed those lessons back into delivery.


Wrap this in strong governance: standard operating procedures (SOPs) that make safety routine, regular supervision and fidelity checks to protect core ingredients, safeguarding pathways that activate quickly, and equity and impact audits that reveal who isn’t getting in, who drops out, who isn’t benefiting, and why.


Operationally, this means matching the right care to the right need at the right time, stepping intensity up or down based on data rather than hunch. It means blending modalities, medications when indicated, targeted psychosocial interventions, and harm-reduction and community recovery support, so people receive coherent, reinforcing help rather than disconnected offerings. It means holding the line on fidelity while adapting for culture, literacy, and logistics. It means treating safety as proactive, anticipating deterioration and responding warmly and fast. And it means designing with an equity lens from the outset, measuring access and outcomes by key demographics, and closing the gaps we find. This is how evidence becomes practice and practice becomes better evidence.


Why evidence-based practice is important


Because people don’t arrive at our doors with unlimited time, energy or trust. Evidence-based practice is how we honour that reality. When we use approaches that are shown to work, and deliver them with skill, humility, and a person-centred approach, we increase the chances that someone doesn’t just enter treatment but actually gets their life back. That’s the core promise: a higher likelihood of meaningful, sustained recovery.


It also sharpens decision-making. Evidence clarifies what to do first and why, so clinicians aren’t left guessing, and people in care aren’t left churning. It helps us sequence support (what now, what next), match intensity to need, and explain choices in plain language. That transparency builds trust, between practitioners and patients, across teams, and with funders and the public.


Evidence brings discipline to evaluation. Instead of relying on anecdotes or marketing claims, we look at patterns across many studies and our own data: who benefits, under what conditions, and at what cost. This lets services focus precious resources on the most effective interventions, reduce waste and scale what works.


Crucially, an evidence base that includes lived experience, qualitative research, and implementation learning helps us see the whole picture: barriers and facilitators to access, cultural and logistical fit, the importance of safety and trauma-informed practice, and the social conditions that enable change. That broader lens is how we close equity gaps and design pathways that reach those who are least likely to be offered, accept, or benefit from care.


For leaders, evidence is the backbone of accountability. It supports smarter commissioning, clearer standards and honest conversations about outcomes. For frontline teams, it protects the craft and the fidelity to what works, while still allowing flexibility for the person in front of you. For communities, it signals that addiction care is not guesswork but a learning system worthy of trust and investment.


Addiction care fails when any leg of the stool is missing: research without craft becomes brittle; expertise without evidence drifts; choice without structure overwhelms. When we integrate all three and keep learning through routine evaluation and co-production, we move from “delivering interventions” to building recovery-oriented systems that actually work: safer, fairer, more effective and worthy of the people we serve.

 
 
 


© 2025 by Dragos Dragomir.
