Leverage and intervention

Intervention Side Effects

Every action in a complex system produces effects beyond the one you intended

Also known as: Iatrogenic effects, Collateral effects

THE IDEA

Nothing lands in isolation

When you intervene in a complex system, you’re not doing one thing. You’re doing many things, only one of which you intended. The rest are side effects - consequences that ripple outward through the system’s connections, sometimes reinforcing your goal, sometimes undermining it, sometimes affecting parts of the system you never thought about.

This isn’t a failure of planning. It’s a property of complexity. In a system where everything is connected to everything else, any change propagates. The question is never “will there be side effects?” but “which side effects will matter, and when will they show up?”

The medical world has a word for harm caused by treatment itself: iatrogenic. It’s a useful concept to borrow. Every intervention in any system carries iatrogenic risk - the possibility that the act of helping makes something worse. The skill isn’t avoiding side effects entirely (that’s impossible) but designing interventions that are aware of where the ripples are likely to go.

IN PRACTICE

The ripples you didn’t draw on the map

A team introduces daily standups to improve communication. Communication improves. But the standups also create a pressure to always have something to report, which subtly discourages the kind of slow, exploratory work that doesn’t produce daily updates. The intervention solved one problem and quietly created another - not because standups are bad, but because changing the rhythm of communication also changes what kind of work feels valued.

Antibiotics cure the infection. They also disrupt the gut microbiome, potentially creating conditions for a different kind of illness down the line. The doctor who prescribes them isn’t making a mistake - they’re making a trade-off. The good doctor is the one who knows it’s a trade-off and watches for the secondary effects. The same principle applies to every organisational restructure, policy change, and product launch.

A city builds a bypass to reduce congestion in the centre. Traffic through town drops immediately. Over two years, the bypass attracts new housing developments, retail parks, and commuter traffic. Congestion returns - just in a different place, on roads that weren’t designed for it. The side effect wasn’t a surprise to anyone who studies transport systems. It’s called induced demand, and it’s so reliable it has a name. But it wasn’t on the original planning document.

WORKING WITH THIS

Designing for ripples

Before any significant intervention, ask two questions. First: what else is connected to the thing we’re changing? Trace the connections outward. If you’re changing a process, who else uses that process? If you’re changing an incentive, what other behaviours does that incentive currently support? Second: what will this look like in six months, not just next week? Side effects often operate on different timescales to the intended effect.
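The first question, "what else is connected to the thing we're changing?", is essentially a reachability question over a who-affects-what map. A minimal sketch in Python, assuming you model the system as a hypothetical adjacency map (all the node names here are illustrative, drawn from the standup example above):

```python
from collections import deque

def trace_ripples(system, start):
    """Breadth-first walk over a who-affects-what map, returning
    everything reachable from the thing we plan to change."""
    seen = {start}
    queue = deque([start])
    affected = []
    while queue:
        node = queue.popleft()
        for neighbour in system.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                affected.append(neighbour)
                queue.append(neighbour)
    return affected

# Illustrative map: changing standup cadence changes reporting
# pressure, which changes what kind of work feels valued.
system = {
    "standup cadence": ["reporting pressure"],
    "reporting pressure": ["exploratory work", "perceived value of work"],
    "exploratory work": [],
    "perceived value of work": [],
}

print(trace_ripples(system, "standup cadence"))
# ['reporting pressure', 'exploratory work', 'perceived value of work']
```

The map itself is the hard part, not the traversal: the exercise of writing down the edges is where the unexpected connections surface.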

You won’t catch everything. That’s fine. The point isn’t to predict every ripple - it’s to go in with eyes open rather than closed, and to build in checkpoints where you look for effects you didn’t plan for. The worst outcomes come not from side effects themselves, but from the refusal to look for them once the intervention is underway.

A useful habit: after any significant change, actively look for what got worse. Not as pessimism, but as maintenance. Something always shifts in an unexpected direction. Finding it early is the difference between a manageable adjustment and a compounding problem.

THE INSIGHT

The line to remember

In a connected system, you never do just one thing. The intervention is the stone; the side effects are the ripples. You can’t throw the stone without making the ripples.

RECOGNITION

When this is in play

You’re looking at intervention side effects when a change that’s working on its primary measure is quietly making something else worse. When people in a different part of the system start complaining after your improvement went live. When a fix produces a new problem that nobody had before the fix existed. When six months after a successful initiative, there’s a new issue and nobody connects it back to the thing that “worked.”
