Discovery · Falk Gottlob · 5 min read

You're Only Listening to the Survivors

Survivorship bias is quietly wrecking your product decisions. You're building for the users who stayed and ignoring the ones who left. Here's how to fix that.

discovery · cognitive bias · user research · retention

The short version

Survivorship bias is one of the most common mistakes in product management. Abraham Wald's WWII insight: don't armor where the returning planes took hits; armor the spots that came back clean, because planes hit there never made it home. In PM work: you're only hearing from users who stuck around (your most engaged ones), looking at features that survived the gauntlet of your current UX, copying competitors who won (not the ones who tried the same playbook and failed), and celebrating A/B test wins without analyzing the losing variants. The countermeasures: talk to churned users (exit surveys, churn interviews), look at what's not happening (drop-off, non-events), run feature post-mortems on what flopped, include churn drivers in prioritization, frame your roadmap around outcomes. The bombers that came back were interesting. The ones that didn't had the answers.

There's a famous story from World War II. British bombers were coming back full of bullet holes, mostly in the wings and fuselage. The military wanted to add armor to those areas.

A statistician named Abraham Wald said no. The bombers they were looking at were the ones that made it back. The planes hit in the engines and cockpit never returned. Armor the areas without holes, he argued, because that's where the fatal hits landed.

This is survivorship bias: drawing conclusions from the winners while ignoring the losers. And it's one of the most common mistakes I see in product management.

Your most engaged users are the planes that came back. The ones with the fatal hits never made it to your dashboard.

The Wald lesson, applied to product

Where this shows up in PM work

Customer feedback. You talk to your most engaged users. They love the product. They have strong opinions about what to build next. You feel great about your direction.

But you're only hearing from people who stuck around. The ones who churned, who signed up and left after a week, or who evaluated you and picked a competitor have a completely different story. And their story is probably more important for your growth than your power users' wish list.

I ran into this hard at one company. Our NPS was great. Customers who used the product regularly loved it. We felt confident. Then we dug into churn data and realized we were losing 30% of new signups in the first two weeks. The survivors were happy. The ones who didn't survive never told us why, because we never asked them.

Feature prioritization. You look at feature usage data. Feature A gets tons of engagement. Feature B barely gets touched. Easy call: double down on A, deprioritize B.

But maybe Feature B is poorly discoverable. Maybe it's confusing. Maybe it solves a real problem but the onboarding doesn't surface it. You're looking at what survived the gauntlet of your current UX and assuming that's what matters.

Competitor analysis. You study the companies winning in your space. You copy their playbook. But you're only looking at the ones that made it. The companies that tried the same approach and failed have equally valuable lessons, and nobody writes case studies about them.

A/B testing. Your last three tests produced wins. You keep iterating in that direction. But you never went back to analyze why the losing variants lost. Was the hypothesis wrong? Was the execution bad? Was the timing off? Each failed test has information you're ignoring.

How I try to counter this

Talk to churned users. This is the single highest-ROI research activity most teams aren't doing. Set up exit surveys. Do churn interviews. Ask people who left: what were you hoping for? When did it break down? What did you switch to? The answers will make you uncomfortable, which is exactly why they're valuable.

Look at what's not happening. Usage data shows you what people do. But the most important signal is often what they don't do. Which features do people ignore? Where do they drop off in a flow? What pages do they visit once and never return to? I track "non-events" as seriously as events.
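To make that concrete, here's a minimal sketch of how non-event tracking might look, assuming you can export signup times and a flat event log. The user IDs, event names, and the 14-day window are all invented for the example; a real version would run against your analytics warehouse.

```python
from datetime import datetime, timedelta

# Hypothetical exports: signup time per user, plus a flat event log.
signups = {
    "u1": datetime(2024, 3, 1),
    "u2": datetime(2024, 3, 2),
    "u3": datetime(2024, 3, 3),
}
events = [
    ("u1", "created_project", datetime(2024, 3, 1, 14, 0)),
    ("u1", "invited_teammate", datetime(2024, 3, 5)),
    ("u2", "created_project", datetime(2024, 3, 20)),  # outside the window
]

KEY_EVENTS = ["created_project", "invited_teammate"]
WINDOW = timedelta(days=14)

def non_event_rates(signups, events, key_events, window):
    """For each key event, the share of users who never did it within
    `window` of signing up -- the non-event rate."""
    rates = {}
    for name in key_events:
        missed = sum(
            1
            for user, signed_up in signups.items()
            if not any(
                u == user and e == name and signed_up <= ts <= signed_up + window
                for u, e, ts in events
            )
        )
        rates[name] = missed / len(signups)
    return rates

print(non_event_rates(signups, events, KEY_EVENTS, WINDOW))
# -> {'created_project': 0.667, 'invited_teammate': 0.667} (rounded)
```

The point isn't the code, it's the framing: a high non-event rate on a feature you consider core is a signal your top-features report will never show you.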

Analyze your failures. We do post-mortems on outages but not on features that flopped. I started running feature post-mortems: what did we expect? What actually happened? Why did it miss? These conversations surface more product insight than most roadmap planning meetings.

Include churn drivers in your prioritization. When I'm prioritizing work, I include "reasons people leave" right alongside "things engaged users want." If 20% of churned users cite the same problem, that should rank higher than feature request number 47 from your power user council.
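Here's a toy sketch of what that weighting might look like. The feature names, request counts, churn percentages, and the churn weight itself are all invented for illustration, and the weight is a judgment call, not a formula.

```python
# Toy prioritization score: weight each candidate by how often churned users
# cited it, not just by how many engaged users asked for it.
candidates = {
    # name: (requests from engaged users, share of churned users citing it)
    "dark mode":           (47, 0.02),
    "fix slow first sync": (3,  0.22),
    "export to CSV":       (12, 0.05),
}

CHURN_WEIGHT = 300  # judgment call: one churn driver outweighs many wish-list votes

def score(requests, churn_share):
    return requests + CHURN_WEIGHT * churn_share

ranked = sorted(candidates.items(), key=lambda kv: score(*kv[1]), reverse=True)
for name, (req, churn) in ranked:
    print(f"{name:22s} score={score(req, churn):6.1f}  requests={req:3d}  churn={churn:.0%}")
# "fix slow first sync" now outranks "dark mode" despite far fewer requests.
```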

Test multiple things at once. Don't run one A/B test, celebrate the win, and move on. Run parallel experiments. Look at the full range of results. Understand why things fail, not just why things work.

Measure what's missing

The hardest part of survivorship bias is that the data you need isn't in your dashboards. Churned users don't generate events. Failed features don't show up in your "top features" report. Competitors who went under don't appear in your competitive analysis.

You have to go looking for what's missing. That means building processes that capture the invisible data: exit surveys, churn interviews, failure post-mortems, drop-off analysis, non-adoption studies.

Building a roadmap that accounts for the dead

Your roadmap should not just be a list of things your best customers want. It should also account for:

Why people leave. What's the number one reason for churn? Is anything on the roadmap addressing it?

Why people never start. What happens in the first 48 hours? Where do they get stuck? What's the activation bottleneck?

What failed and why. Which shipped features didn't move the needle? What did you learn? Are you applying those lessons?

Frame your roadmap around outcomes, not features. "Reduce first-week churn by 15%" is a better north star than "Build feature X that 10 power users requested."
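If you want a concrete handle on that kind of outcome metric, here's a minimal sketch of computing first-week churn from a hypothetical export of signup and last-seen dates. The field names and sample data are assumptions; the real version would read from your own user table.

```python
from datetime import datetime, timedelta

# Hypothetical export: when each user signed up and when they were last seen.
users = [
    {"id": "u1", "signed_up": datetime(2024, 3, 1), "last_seen": datetime(2024, 3, 2)},
    {"id": "u2", "signed_up": datetime(2024, 3, 1), "last_seen": datetime(2024, 3, 20)},
    {"id": "u3", "signed_up": datetime(2024, 3, 8), "last_seen": datetime(2024, 3, 9)},
]

def first_week_churn(users, as_of):
    """Among users who signed up at least 7 days before `as_of`, the share
    whose last activity fell within their first week."""
    week = timedelta(days=7)
    eligible = [u for u in users if u["signed_up"] + week <= as_of]
    churned = [u for u in eligible if u["last_seen"] <= u["signed_up"] + week]
    return len(churned) / len(eligible) if eligible else 0.0

print(f"{first_week_churn(users, as_of=datetime(2024, 4, 1)):.0%}")  # 67%
```

Track a number like this per cohort and the roadmap debate shifts from "whose feature ships first" to "which bets actually move the metric."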

The bombers that came back were interesting. The ones that didn't come back had the answers.
