THE IDEA
Measuring what actually shifts
Traditional evaluation counts outputs: how many people attended the training, how many reports were published, how many meetings were held. These are easy to measure and nearly useless for understanding whether change is happening. You can train a thousand people and change nothing if the training doesn’t change how anyone acts.
Outcome mapping takes a different approach. Instead of tracking what the programme produces, it tracks what changes in the behaviour, relationships, and actions of the people the programme is trying to influence. These people are called “boundary partners” - the individuals, groups, or organisations at the boundary of the programme’s influence, whose changed behaviour is the pathway to wider impact.
The insight is that in complex change, you can’t control the final outcome. You can’t make poverty decrease or ecosystems recover or organisations transform. What you can do is influence the people and groups who are in a position to make those things happen. Outcome mapping focuses your attention and measurement on those changes in behaviour - the things you can realistically influence and meaningfully track.
IN PRACTICE
Watching for what’s different
A programme supporting smallholder farmers in adopting sustainable practices. Traditional evaluation: how many farmers attended training sessions? Outcome mapping: are farmers experimenting with new techniques on their own plots? Are they sharing what they learn with neighbours? Are local agricultural suppliers starting to stock the new inputs? These behavioural changes are the mechanism through which the programme produces impact. Tracking them tells you whether the programme is working, long before the impact data arrives.
An initiative to improve collaboration between two departments. Traditional evaluation: how many joint meetings were held? Outcome mapping: are people from each department reaching out to the other without being prompted? Are joint decisions being made faster? Is information flowing in both directions? The meetings are outputs. The changed behaviour is the outcome - and it’s a much better predictor of whether the collaboration is real or performative.
A personal development effort - say, someone trying to be a better listener. Traditional evaluation: how many books about listening did they read? How many workshops did they attend? Outcome mapping: are the people around them experiencing them differently? Are conversations lasting longer? Are family members sharing more? Is the person asking more questions and offering fewer solutions? The behavioural changes in the person and in their relationships are the real measure of whether the effort is working.
WORKING WITH THIS
Designing for behavioural signals
When planning any change initiative, identify who needs to behave differently for the change to succeed. These are your boundary partners. Then describe what changed behaviour would look like - specifically, observably, in terms of what people do, not what they think or feel.
Create “progress markers” at three levels. Expect to see: early signs of engagement and openness to change. Like to see: active adoption of new behaviours. Love to see: deep integration and independent action. This gives you a gradient along which to track progress, rather than a binary success/fail judgement.
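The structure above can be sketched as data. This is a minimal illustration, not a standard outcome-mapping tool: the class names (`ProgressMarker`, `BoundaryPartner`) and the example markers for the farmer scenario are all hypothetical.

```python
# Minimal sketch: progress markers for one boundary partner, tracked as a
# gradient across the three levels rather than a single pass/fail flag.
# All names and example markers here are illustrative assumptions.
from dataclasses import dataclass, field

LEVELS = ("expect", "like", "love")  # expect-to-see .. love-to-see

@dataclass
class ProgressMarker:
    description: str   # an observable behaviour, not an attitude or feeling
    level: str         # one of LEVELS
    observed: bool = False

@dataclass
class BoundaryPartner:
    name: str
    markers: list[ProgressMarker] = field(default_factory=list)

    def progress(self) -> dict[str, float]:
        """Share of markers observed at each level - a gradient, not a binary."""
        return {
            lvl: sum(m.observed for m in self.markers if m.level == lvl)
                 / max(1, sum(1 for m in self.markers if m.level == lvl))
            for lvl in LEVELS
        }

# Hypothetical markers for the smallholder-farmer example.
farmers = BoundaryPartner("smallholder farmers", [
    ProgressMarker("attend a demonstration-plot visit", "expect", observed=True),
    ProgressMarker("experiment with one technique on own plot", "like", observed=True),
    ProgressMarker("train neighbours without programme support", "love"),
])

print(farmers.progress())  # {'expect': 1.0, 'like': 1.0, 'love': 0.0}
```

The point of the shape is that each marker names a behaviour someone could witness, and the summary stays per-level, so early engagement is visible long before deep integration shows up.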
The discipline is focusing on what you can influence (behaviour) rather than what you can’t control (outcomes). This isn’t lowering your ambitions. It’s being honest about where your agency ends and the system’s complexity begins. If you change the right behaviours in the right people, the outcomes follow - but through pathways you can’t fully predict or control. Outcome mapping keeps your attention on the part of the chain you can see and affect.
THE INSIGHT
The line to remember
You can’t control outcomes in a complex system. You can influence the behaviour of the people who shape those outcomes. Measure that, and you’ll know whether change is real before the results are in.
RECOGNITION
When this is in play
You need outcome mapping when your programme is delivering outputs but you’re unsure whether anything is changing. When the impact data won’t arrive for years and you need to know now whether you’re on track. When different stakeholders define success differently and you need a shared way to track progress. When the change you’re seeking is behavioural, not technical - and counting workshops doesn’t tell you whether anyone’s behaving differently.