
In today’s fast-changing markets, traditional certainty-based investment approvals waste capital on untested assumptions, as historic failures like Euro Disney show. By adopting discovery-driven planning (DDP), CFOs and finance teams can allocate capital more efficiently, reduce risk, and learn faster, which makes the approach essential for companies seeking sustainable growth and competitive advantage in an era of rapid innovation.
My 4th article on #entrepreneurialfinance…
Dear finance executive,
Growth investments look more and more like ventures. They venture into unknown territory: new markets, new business models, new operating models.
Yet most firms still evaluate venture investments as if uncertainty were the exception rather than the norm. Many investment evaluations reward precision over truth. Teams present immaculate spreadsheets, confident market narratives, and linear rollouts. Leaders then approve investments that look ready, even though most of the underlying assumptions have never been validated.
That mindset may have worked when markets were slow. Today it destroys capital.
Rita McGrath and Ian MacMillan warned about this decades ago. Euro Disney remains the archetype: Assumptions about length of stay, merchandising, and dining patterns were never tested and did not materialize. The result was over $1B in early losses despite the park hitting its visitor targets.¹
The deeper issue: Ventures are funded based on assumptions pretending to be knowledge. In a world of accelerating competition, shifting technologies, and unpredictable customers, those assumptions multiply. What used to be a planning exercise is now a risk-management exercise.
Discovery-Driven Planning (DDP) gives Finance a rigorous, structured way to evaluate ventures under uncertainty — by forcing ideas to earn their funding through evidence, not optimism.
DDP is fundamentally a method for evaluating ventures by treating every plan as a collection of hypotheses. It replaces prediction with disciplined learning.
DDP’s modern form — now widely used in innovation, R&D, digital transformation, and venture incubation — incorporates three essential ideas:
Only validated assumptions deserve capital.
Funding is released in tranches tied to evidence.
When new information contradicts the plan, the plan changes.
This “planning to learn” mindset is what keeps companies from burning millions on attractive but unrealistic ventures.
Flatiron Health: Pivot only after validating assumptions
Flatiron didn’t pivot from ad-tech into oncology data because the story was compelling. They ran small experiments, tested feasibility, and validated clinician demand. Only once core assumptions held up did they scale — ultimately leading to a $1.9B acquisition by Roche. A classic DDP evaluation: define success, validate cheaply, pivot decisively.
Best Buy: Evaluating digital initiatives through experiments, not forecasts
Best Buy didn’t “approve” transformation projects; it tested them. Price-matching, service expansions, store-in-store concepts, and fulfillment pilots were all evaluated based on measurable learning. Ventures progressed only when evidence supported the next step. Leadership embedded the DDP logic: capital follows proof.
Kloeckner: Digitalization in industrial B2B through phased validation
Instead of launching a massive digital overhaul, Kloeckner evaluated each venture — online ordering, AI pricing, digital services — via small-scale tests. Each pilot had defined assumptions to validate (e.g., “customers will buy steel online if ordering is simplified”). Funding expanded only when assumptions held. This DDP discipline modernized a century-old industrial firm without catastrophic bets.
AGC: R&D funding governed by DDP logic
AGC now evaluates R&D ventures using assumption checkpoints inside its Stage-Gate process. Funding is tied to validation events, not deliverables. Assumptions about technical feasibility and market adoption must be proven before progressing. This effectively treats every R&D initiative as a real-options portfolio — only invest more when the venture earns the right to scale.
Across industries, the shift is clear:
**Ventures aren’t approved, they’re tested.
Capital isn’t allocated, it’s earned.**
Finance’s role is not to predict venture outcomes — it’s to evaluate venture viability. DDP provides a structured sequence for doing exactly that.
Before evaluating potential, define exactly what the venture must deliver financially.
Work backward from required profit and return thresholds.
This immediately filters out ventures that cannot clear the bar for strategic or economic relevance.
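McGrath and MacMillan call this a reverse income statement: start from the profit the venture must deliver and derive the revenue, volumes, and allowable costs that profit implies. A minimal sketch with purely illustrative numbers (the profit threshold, margin, and price are assumptions, not benchmarks):

```python
# Reverse income statement: start from the profit the venture must deliver,
# then derive the revenue, volume, and cost ceiling that profit implies.

required_profit = 30_000_000   # profit needed for the venture to matter strategically (illustrative)
required_margin = 0.15         # minimum acceptable operating margin (assumption to validate)
price_per_unit = 500           # assumed selling price per unit (assumption to validate)

required_revenue = required_profit / required_margin   # 200,000,000
required_units = required_revenue / price_per_unit     # 400,000 units per year
allowable_costs = required_revenue - required_profit   # 170,000,000 cost ceiling

print(f"Required revenue: {required_revenue:,.0f}")
print(f"Required units:   {required_units:,.0f}")
print(f"Allowable costs:  {allowable_costs:,.0f}")
```

Every figure that falls out of this calculation is itself an assumption to be tested, not a fact.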
Every venture model hides dozens of assumptions:
Demand
Pricing
Adoption curves
Conversion
Cost-to-serve
Channel economics
Competitor responses
Operational feasibility
DDP calls for making these assumptions explicit. Recent DDP discussions emphasize the importance of competitor responses, because competitive reaction can erode margins quickly.
This is where most venture evaluations fail: assumptions remain invisible.
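One way to make them visible is a simple assumption register: each assumption is paired with a metric, the cheapest test that produces evidence, and the threshold at which it counts as validated. A minimal sketch with purely illustrative entries (the pilot design, thresholds, and examples are assumptions, not recommendations):

```python
from dataclasses import dataclass

@dataclass
class Assumption:
    statement: str    # what must be true
    metric: str       # how we will measure it
    test: str         # the cheapest experiment that produces evidence
    threshold: str    # the value at which we call the assumption validated
    validated: bool = False

# Illustrative register for a digital B2B venture (all entries are assumptions, not data)
register = [
    Assumption(
        statement="Customers will order online if ordering is simplified",
        metric="share of pilot accounts placing at least one online order",
        test="3-month ordering pilot with 50 existing accounts",
        threshold="at least 30% of pilot accounts",
    ),
    Assumption(
        statement="Premium pricing survives competitor response",
        metric="realized price vs. list price after competitors react",
        test="price test in two regional markets",
        threshold="discount erosion below 5%",
    ),
]

open_items = [a.statement for a in register if not a.validated]
print(f"{len(open_items)} critical assumptions still unvalidated")
```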
The next step is where overconfidence collapses: market sizing, pricing power, competitive dynamics, cost curves, and adoption rates should all be benchmarked against best-in-class data.
A significant number of corporate ventures die in this step — which is exactly the point. Better to invalidate assumptions now than during execution.
From there, the evaluation converts “theoretical viability” into “real-world feasibility.”
If a model needs 40,000 enterprise sales conversations, that operational truth matters more than the initial top-line ambition.
If a digital venture assumes premium pricing, what evidence of differentiation must exist?
McGrath’s recent research reinforces this: DDP requires testing not just demand assumptions but also operating-model assumptions, something startups often ignore.
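The arithmetic behind a reality check like the 40,000-conversations example above is deliberately crude. A back-of-the-envelope sketch with illustrative figures (target revenue, contract value, win rate, and rep capacity are all assumptions that would themselves need validation):

```python
# Back-of-the-envelope operational feasibility: how much sales activity
# does the revenue ambition actually imply?

target_revenue = 200_000_000   # from the reverse income statement (illustrative)
avg_contract_value = 50_000    # assumed annual contract value
win_rate = 0.10                # assumed share of qualified conversations that close
calls_per_rep_year = 200       # assumed conversations one rep can run per year

deals_needed = target_revenue / avg_contract_value        # 4,000 closed deals
conversations_needed = deals_needed / win_rate            # 40,000 qualified conversations
reps_implied = conversations_needed / calls_per_rep_year  # 200 sales reps

print(f"Deals needed:         {deals_needed:,.0f}")
print(f"Conversations needed: {conversations_needed:,.0f}")
print(f"Sales reps implied:   {reps_implied:,.0f}")
```

If the implied sales force is larger than anything the firm has ever built, the operating model, not the market story, is the real constraint.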
In DDP, milestones are not activities. They are assumption tests.
What must be true for the venture to deserve the next tranche of funding?
What evidence will validate or invalidate the key assumptions?
What is the stopping rule if assumptions fail?
Modern DDP practice has strengthened this discipline:
Budget follows evidence.
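In practice this can be as mechanical as a funding gate: the next tranche is released only when the milestone’s assumption tests have passed, and nothing is released when they have not or when the stopping rule fires. A minimal sketch (the milestone, tranche size, and tests are illustrative):

```python
# Milestone gate: release the next funding tranche only if the milestone's
# assumption tests passed; otherwise hold (or stop) and revisit the plan.

def next_tranche(milestone, tranche_amount, stopping_rule_triggered):
    """Return the capital to release after a milestone review."""
    if stopping_rule_triggered:
        return 0               # a critical assumption was invalidated: stop
    if all(test["passed"] for test in milestone["assumption_tests"]):
        return tranche_amount  # evidence supports funding the next step
    return 0                   # hold funding until the evidence exists

milestone = {
    "name": "Ordering pilot with 50 accounts",   # illustrative milestone
    "assumption_tests": [
        {"assumption": "Customers will order online", "passed": True},
        {"assumption": "Cost-to-serve drops with digital ordering", "passed": False},
    ],
}

release = next_tranche(milestone, tranche_amount=2_000_000, stopping_rule_triggered=False)
print(f"Capital released after review: {release:,}")   # 0, because one assumption is still unproven
```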
Practical takeaway: Finance’s strategic advantage in venture evaluation is not superior forecasting. It’s structured skepticism. Our role is to force ideas to earn their next round of funding, not with slideware, but with validated assumptions. The two questions to keep asking: What must be true for this to work, and how will we test it?
Your competitive edge is not predicting the future. It’s converting assumptions into knowledge faster, and acting decisively on what you learn.
All the best,
Sebastian
PS: Related articles…
Businesses need more entrepreneurship: 4 shifts CFOs must lead
Why Investment Approvals Fail - and What CFOs Should Do Instead
What CFOs should Learn from Venture Capital about Capital Allocation
McGrath, R. G. & MacMillan, I. C. (1995) Discovery-Driven Planning. Harvard Business Review, July–August.
McGrath, R. G. & MacMillan, I. C. (1999) Discovery-Driven Planning: Turning Conventional Planning on its Head. DeepCanyon / HP e-publishing.
Gallo, A. (2017) A Refresher on Discovery-Driven Planning. Harvard Business Review, February.