I own a great car: a Ford S-Max that gave me years of smooth journeys, until recently.
Lately, on several long trips, it started losing power. Imagine climbing mountain passes with your family at 90 km/h, foot to the floor, hoping the engine would respond. Not exactly confidence-inspiring.
The diagnosis at the garage was always the same: “the particulate filter is clogged”. They cleaned it, replaced sensors and changed thermostats. Over and over again. And yet, the problem kept coming back.
What they were fixing were symptoms, not causes.
So I did what engineers are trained to do: I stepped back and looked at the system. I revisited how diesel particulate filters actually regenerate: how fuel is injected into the exhaust, how temperature rises and how cleaning really starts.
That research reminded me of someone from my past: a former colleague at 3M, a tall Dutch director everyone called “Hermann Why”, because whatever the problem was, Hermann would ask “why?” at least five times. No shortcuts. No surface answers. What Hermann practiced is what we now call first-principles thinking: don’t stop at what you observe; trace the causal chain back to where the problem actually originates.
The Ishikawa diagram
In the early 1960s, Japanese factories were becoming incredibly good at execution, but recurring defects still appeared. Kaoru Ishikawa, one of the fathers of modern quality management, realized that teams were too quick to jump to solutions. They treated what they could see, not what actually caused the issue. His answer was disarmingly simple: slow down and map causality.
The Ishikawa diagram was born from a very human frustration: fixing problems that keep coming back. The diagram he introduced forces you to place the problem at the end of a line and then work backwards, asking what could possibly be contributing to it. Not just one cause, but many. Technical, human, procedural, environmental. Layer by layer.
Visually, it looks like a fish skeleton. The problem sits at the head. The main categories of causes branch off like large bones. Deeper causes extend as smaller bones. That is why it became known as the fishbone diagram.
Its power is not in the drawing itself, but in the behavior it enforces: resisting the temptation to fix symptoms and instead building a shared understanding of why something is happening. In other words, it’s a reminder that most persistent problems don’t need faster action, they need better diagnosis.
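To make the structure concrete, here is a minimal sketch of a fishbone analysis expressed as a simple data structure: the problem at the head, cause categories as the large bones, and deeper causes branching off each one. The categories and causes are illustrative, loosely inspired by the car story in this post, not real diagnostic output.

```python
# A fishbone (Ishikawa) diagram as a plain data structure:
# the problem at the "head", categories as "bones", causes as branches.
# All entries below are illustrative assumptions, not real diagnostics.

problem = "DPF keeps clogging"

causes = {
    "Machine": ["fuel vaporizer not firing", "exhaust temperature too low"],
    "Method": ["regeneration cycle never completes", "short urban trips only"],
    "Measurement": ["sensors replaced without checking actual readings"],
    "Material": ["engine oil with high ash content"],
}

def print_fishbone(problem: str, causes: dict[str, list[str]]) -> None:
    """Print the problem at the head, then each category and its causes."""
    print(f"PROBLEM: {problem}")
    for category, items in causes.items():
        print(f"  [{category}]")
        for cause in items:
            print(f"    -> {cause}")

print_fishbone(problem, causes)
```

The drawing matters less than the discipline it imposes: every branch is an explicit hypothesis about causality that can be checked, instead of a symptom to patch.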
Back to my Ford S-Max challenge…
Eventually, the insight became obvious. If the catalyst temperature never rises, the catalyst isn’t the issue. The real problem is the fuel vaporizer, the mechanism that injects fuel into the exhaust to trigger regeneration. Without it, no amount of sensor replacement will solve anything.
Once the why is clear, the how becomes trivial. And this is exactly what happens in Change Management. We obsess over what is visible:
- low adoption
- resistance
- lack of engagement
- poor usage metrics
So we add more dashboards, more surveys, more KPIs, more data. But data without diagnosis is just noise.
Change fails not because we lack information, but because we fail to make sense of it. Because we act on consequences instead of understanding origins. Because we try to change behaviors without understanding what generates them.
When you truly understand the why, the interventions almost design themselves.
So next time someone asks for more data, ask back: for what purpose? Would you take a different decision if you had more information? Very often, the honest answer is no. The challenge is rarely to collect more data. The real challenge is to turn the data we already have into insight. And that, in both engineering and change, is where progress actually starts.
#ChangeManagement #Transformation #ChangePills #Diagnosis #Ishikawa #Causes #Consequences