Correlation Isn't the Problem. You Are.
Everyone knows correlation isn't causation. They teach it in every stats class. People say it at dinner parties to sound smart. Then they walk into a Monday morning meeting and do it anyway. The problem was never confusion. It was always motivation. When the correlation supports what you already believe, questioning it feels like self-sabotage.
A client called me a few years ago, excited. They'd launched a premium loyalty program in January. By March, spending among members was up 25%. The VP of marketing had already built the deck. Could I help them put numbers around it so they could expand it?
I asked one question. What was spending doing among non-members?
Long pause. They didn't know. Nobody had looked.
Turns out spending was up across the entire customer base. The economy was strong, their biggest competitor had just gone through a messy public restructuring, and people were spending more everywhere. Two things happened at the same time and someone decided one caused the other. The VP was not happy with me. The deck was already done.
Ask anyone for the textbook example and they'll hand you ice cream sales and drowning deaths. Full marks. But the ice cream example is easy to see through because you have no stake in it. No budget attached to that conclusion. No slide deck due Friday. The moment you have skin in the game, the part of your brain that aced the quiz goes quiet and the part that needs the project to work starts talking.
Nobody has ever been promoted for saying "those two things might be unrelated."
Here are three reasons smart people confuse correlation with causation, none of which have anything to do with intelligence.
1. The correlation confirms what you already wanted to be true.
I sat in a quarterly review where the marketing team showed they'd increased ad spend by 40% and revenue had gone up the same quarter. Nice arrow on the slide. Cause, effect, done. I asked what quarter it was. Q4. Revenue goes up every Q4 because people buy gifts. It had gone up the previous Q4 too, when they hadn't increased ad spend at all. The room got quiet. Then they moved to the next slide.
When the data agrees with what you already believe, you stop interrogating it. I do this too. I catch myself maybe half the time. The other half, someone else catches me. That's the honest ratio for anyone who tells you they're data-driven.
2. A good story makes the leap invisible.
I watched a consulting firm present a beautiful analysis showing that companies with more women in senior leadership had higher stock returns. The correlation was real. The story was appealing. Everybody in the room wanted it to be true, including me.
But the companies that appointed women to senior roles were already better-managed, more forward-thinking, more profitable before the appointments happened. Good management led to both things. One didn't cause the other. Something underneath caused both.
A plausible narrative is the most dangerous thing you can attach to a correlation. Once the story exists, you're no longer testing whether A causes B. You're explaining how, which is a completely different activity that feels like the same one. I've caught myself doing this mid-sentence in a presentation. The story sounds good, the client is nodding, and somewhere in the back of your mind a small voice says "you don't actually know that." You keep talking.
3. Admitting it's not causal means someone wasted real money.
I worked with a company that had rolled out a wellness program. Six months later, absenteeism was down. The HR team reported the wellness program had reduced sick days. Applause all around.
Here's what else happened. Absenteeism drops every year during the same period because of seasonal patterns. And two months into the program, the company laid off 15% of its workforce. The people left were not calling in sick. They were afraid. Three things changed at once. The wellness program got credit because it had a budget and a champion, and the other two explanations were inconvenient. I tried to bring this up carefully. I was told the report had already been sent to the executive team.
The cost of saying "this might not be causal" is never zero. Someone spent money. Someone's review is tied to the outcome. You might be right. You will not be thanked.
What to do instead.
Three questions I've learned to ask every time someone implies that one thing caused another.
What else changed? My client's loyalty program launched in January. So did a competitor's implosion. So did an improving economy. If you can't name what else was going on, you can't claim A caused B. You can only claim they were in the same room.
What would have happened if you'd done nothing? That loyalty program's members spent 25% more. But so did non-members. If nobody checks the group that didn't get the treatment, the analysis isn't analysis. It's a mood board. There's a sketch of what that check looks like after these questions.
Who benefits from the causal story? If the person presenting the finding is the same person whose budget depends on it being true, apply extra skepticism. Not because they're lying. Because they're human.
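To make the second question concrete, here's a minimal sketch of the check my client skipped. The only figure taken from the story above is the members' 25% lift; the non-member numbers, the group names, and the helper function are all invented for illustration.

```python
# A crude before/after comparison for the loyalty-program story.
# All figures except the members' 25% lift are made up; the point is
# the shape of the check, not the values.

# Average spend per customer before (January) and after (March) launch.
members = {"before": 120.0, "after": 150.0}      # joined the program
non_members = {"before": 100.0, "after": 122.0}  # never joined

def pct_change(group: dict) -> float:
    """Percentage change in average spend from before to after."""
    return (group["after"] - group["before"]) / group["before"] * 100

member_lift = pct_change(members)          # what the deck celebrates
baseline_lift = pct_change(non_members)    # what nobody looked at

print(f"Members:     {member_lift:+.1f}%")
print(f"Non-members: {baseline_lift:+.1f}%")
print(f"Gap the program can plausibly claim: {member_lift - baseline_lift:+.1f} points")
```

Even the remaining gap isn't proof of causation. People who opt into a loyalty program are not a random sample of your customers. But at least the comparison exists, which is more than the deck had.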
The gap between correlation and causation is not a statistics lesson. It's a test of character. It asks whether you're willing to stay uncertain long enough to find out what's actually happening, even when a perfectly good story is sitting right there, asking to be believed.
Most people fail that test. Not because they're dumb. Because they're invested.
The Great Zandini Sees:
Everyone knows correlation isn't causation. Almost nobody behaves like it, especially when the correlation is paying their salary.