
The short version
We went from quarterly customer research to every PM running weekly discovery calls in 30 days. Not by mandate from leadership, but through a system that proved its value in real time. Week 1: I started alone, posting small insights in Slack from six 20-minute calls. Week 2: two PMs asked to join, and I created a simple Google Doc template with six fields. Week 3: a roadmap was redirected mid-design because four PMs had heard the same customer insight. Week 4: all seven PMs were doing weekly discovery as a habit. Culture change doesn't come from mandates. It comes from visible results. Eighteen months later, weekly discovery is so embedded that onboarding includes "here's how you schedule yours."
Our discovery process, if you could call it that, was seasonal. We'd run big research pushes quarterly - book a week of customer calls, compile insights, hand off a 40-slide deck to the roadmap committee, and then not talk to customers again until the next quarter.
It was slow. It was batch. It was designed for a company one-tenth our size.
The problem: by the time insights landed on a roadmap, they were three months old. By the time those roadmap items shipped, they were six months old. By the time we understood whether they worked, they were nine months old.
In nine months, customer needs change. Markets shift. Competitors move.
So I decided to break the quarterly cycle and move to weekly discovery. Not as a mandate from leadership. As a system that proved its value in real time.
This is the story of how that worked.
Week 1: I Started Alone
Monday, January 2. I scheduled my first discovery call - 20 minutes, a small customer I'd worked with before, one simple question: "Walk me through the last task you did on our platform. What was easy? What made you slow down?"
The call took 22 minutes. I typed notes. Nothing revolutionary came out of it, but I did learn something specific: they were constantly switching between two tabs because information they needed wasn't visible in the current view.
I posted the insight in Slack: "Context switching in [Feature X] - three customers mentioned this month. Worth investigating."
Tuesday, I had two more calls scheduled. One revealed a contract renewal was stalled because of a compliance question nobody on their side could answer. One was just a relationship check-in. Small stuff. No product insights.
Wednesday, I did it again. Then Thursday. Then Friday.
By end of week one, I'd done six customer calls - 20 minutes each, scheduled in batches around my calendar - and posted maybe four small insights in Slack.
The response was interesting: nobody commented. But people were reading them. I could see the emoji reactions. And one engineer said, "Hey, that context-switching thing - I wondered if that was a real problem."
Week 2: The Curiosity Spreads
Monday of week two, another PM asked: "Can I join one of your discovery calls?"
By Wednesday, two other PMs were scheduling their own. By Friday, we had six customer discovery calls happening across the team. Still just 20 minutes each, mostly unstructured. But the pattern was starting.
I created a simple template in a Google Doc:
Customer: [Name]
Date: [Date]
Key question: [One thing I wanted to know]
What I learned: [2-3 bullet points]
What surprised me: [Anything that contradicted our assumptions]
Next step: [Do I need to follow up? Talk to someone on the team?]
That template was the first piece of the system. Before, I'd been running discovery calls in my head. Now there was a written artifact I could share.
By end of week two, we had 14 customer discovery calls across four PMs. The insights started showing a pattern: customers wanted better control over [Feature Set X], but not in the way we'd been planning it.
Week 3: The Contradiction
This is where it got serious.
On Monday of week 3, I was in a roadmap planning session. The team was reviewing Q1 priorities. [Feature Set X] improvements were solid, fully scoped, about to start development.
Then I mentioned what four of us had heard in customer calls the previous week: customers wanted a different configuration of [Feature Set X] than we'd designed.
The engineering lead pushed back: "We've already designed this. We talked to sales about it. It's aligned with the business case."
And I said something that mattered: "Yeah, we talked to sales. We didn't talk to the people actually using it daily."
That's a different conversation. Not accusatory. Just different.
So we did something unusual: we brought the discovery insights into the planning meeting. I pulled up the template notes from the previous week. Three different customers, three different companies, same core insight: they wanted to do thing A in context B, but our design forced them to do thing A in context C first.
The design was logical from our perspective. It made the feature simpler, more consistent.
It made the customer experience worse.
We had a choice: ship it as designed and iterate, or spend two more days digging into what the customer-driven configuration actually looked like.
We chose two more days.
By Wednesday, we'd reframed [Feature Set X] based on the discovery patterns. Not a complete redesign - we kept 80% of what was already done. But the workflow flipped.
This saved us six weeks of post-launch fixes and customer complaints. It's also what convinced the rest of the team that weekly discovery wasn't PM theater. It was operational.
By Friday of week 3, all seven PMs were doing weekly discovery calls.
Week 4: It Became Habit
By week four, it wasn't unusual. It was normal. Every Monday and Wednesday in the PM Slack, someone would post: "Got 20 minutes for a discovery call?"
The insights stopped being individual data points and started showing patterns. One PM noticed that three customers had mentioned the same integration pain. Another noticed that the word "templates" was confusing - customers would say "template" and mean four different things depending on context.
We weren't just collecting data. We were building a feedback loop that was actually responsive.
The quarterly research process was still there, but it became the deep dive on top of the continuous signal. Weekly discovery was the system that fed quarterly research.
And because we were talking to customers every week, when we ran a quarterly research project, we knew which questions were worth asking.
What Changed
Here are the concrete outputs from 30 days:
- Roadmap decisions changed direction twice, based on discovery patterns rather than sales requests
- One feature was redesigned mid-development, saving six weeks of post-launch fixes
- Three high-value customer insights surfaced that contradicted our assumptions about how customers were using certain features
- An integration pain point was identified early; we'd been planning a different fix than the one that would actually solve the problem
- Customer language was standardized; we realized our UI used different terminology than customers did when thinking about their work
But the real output was cultural: discovery became part of how we worked, not something we did quarterly.
Why It Worked
The playbook says culture change doesn't come from mandates. It comes from visible results.
I didn't tell anyone to do weekly discovery. I did it, shared what I learned, and other PMs copied the pattern because they saw the ROI. One PM talking about a finding that changed a roadmap decision is more persuasive than a memo from leadership saying "do more customer research."
The system was minimal: 20-minute calls, a template, shared in Slack. No tooling required. No new process. Just consistency.
Week one, I was alone. By week four, it was the default. Not because it was required. Because it worked.
And now, eighteen months later, it's so embedded in how we operate that onboarding new PMs includes "here's how you schedule your weekly discovery calls." Nobody has to be told. It's just what we do.
That's what a playbook should be: not a mandate, but a system that proves its value so clearly that people choose to adopt it.