How to Measure Marketing Effectiveness: A Scoreboards vs Film Room Framework
Quick Answer: How to Measure Marketing Effectiveness
Effective marketing measurement requires separating two activities: evaluating campaigns (film room) and evaluating teams (scoreboard). Use Marketing Mix Modeling for budget allocation, attribution for tactical optimization, and incrementality testing for validation. According to Patrick Gilbert in *Never Always, Never Never*, these tools break down when treated like scoreboards rather than learning instruments. Focus on business outcomes like revenue growth and market share, not just short-term metrics. Build measurement systems that prioritize effectiveness over accountability theater.
Why Most Marketing Measurement Fails
Marketing measurement today suffers from a fundamental confusion between two different activities. We've taken tools designed for learning and improvement—Marketing Mix Modeling, attribution, and incrementality testing—and turned them into courtrooms where teams fight over credit and blame. As Patrick Gilbert argues in *Never Always, Never Never*, this transforms powerful analytical instruments into sources of dysfunction.
The core problem isn't the tools themselves, but how we use them. When an agency's survival depends on producing short-term, attributable ROAS, its incentives drift toward whatever is easiest to measure and claim credit for, even when that's not what's best for the business long-term. This creates a measurement system that looks precise but optimizes for the wrong outcomes.
Modern marketing organizations are obsessed with measurement, but they've misunderstood what measurement is for—and built incentive systems around that misunderstanding.
The Scoreboards vs Film Room Framework
The solution requires separating two distinct activities. Film room work is about reviewing performance to learn and optimize. You're analyzing what worked, what didn't, and what to change next. The mindset is collaborative and analytical, focused on improvement rather than judgment.
Scoreboard work evaluates teams and partners based on business outcomes over time: revenue growth, profit, market share, retention, and brand health. This includes qualitative factors that don't fit in dashboards but determine success: strategic thinking, reliability, decision-making quality, and trust.
When you treat film room tools like scoreboards, the game gets distorted. Teams start optimizing for measurement rather than results. The focus shifts from building something durable to padding statistics that look good in reports.
The Three Pillars of Marketing Measurement
Effective measurement relies on three complementary tools, each designed for different questions and time horizons. Marketing Mix Modeling (MMM) provides the strategic, top-down view for budget allocation across major channels. It's your planning tool for quarters and years, not daily optimization.
Attribution works at the tactical level, helping optimize within channels you've already chosen. Which creative performs best? Which audiences convert most efficiently? Which keywords drive quality traffic? Attribution excels at these execution questions but struggles with strategic ones.
Incrementality testing answers the scientific question attribution can't: 'What would have happened without this campaign?' Through controlled experiments, it separates correlation from causation and validates whether marketing truly drives incremental behavior.
"Each of these three tools gives you a different view of reality, answers a different kind of question, and becomes dangerous when you force it to answer questions it was never meant to answer."
— Patrick Gilbert, *Never Always, Never Never*
Why All Models Are Wrong (But Some Are Useful)
The statistician George Box famously observed that 'all models are wrong, but some are useful.' No model can fully capture the complexity of human behavior, market dynamics, competition, and timing that influence marketing outcomes. Every measurement approach is a simplification—a useful fiction that approximates reality.
This matters because modern marketing measurement tools are probabilistic, not deterministic. They estimate likelihoods and work with confidence intervals rather than providing perfect answers. MMM, attribution, and incrementality testing all fall into this category. They're decision aids, not crystal balls.
The right question is never 'Is this model true?' but 'Is this model useful for the decision I'm trying to make?' This perspective prevents organizations from demanding false precision and allows them to make better decisions with imperfect information.
Building Trust Through Measurement
True accountability requires trust, not just data. The most effective marketing organizations aren't those that can explain every outcome in a spreadsheet—they're the ones that can operate with some uncertainty and still move forward together. This means creating measurement systems that encourage collaboration rather than competition.
Trust is built through transparent communication about strategy, clear definition of shared goals, and honest acknowledgment of what can and can't be measured perfectly. When teams and vendors share objectives rather than fighting over attribution, marketing becomes more effective and measurement becomes more meaningful.
Steps
Separate Scoreboards from Film Rooms
Define two distinct measurement purposes: evaluating campaigns (film room) and evaluating teams (scoreboard). Film room activities focus on learning and optimization using MMM, attribution, and incrementality. Scoreboard activities evaluate business health through revenue growth, profit, market share, and team performance over time.
Set Up Your Marketing Mix Model for Strategic Allocation
Use MMM as your 30,000-foot view for budget allocation across major channels. Treat it as a planning tool that sets direction over quarters or years, not a real-time optimization dashboard. Remember that MMM reflects historical performance and may miss future potential in underinvested channels.
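To make the shape of an MMM concrete, here is a minimal sketch of the core idea on synthetic data, with two invented channels. It only illustrates the mechanics—regress revenue on adstocked spend—while production models add saturation curves, seasonality, and often Bayesian estimation.

```python
# A minimal MMM sketch on synthetic weekly data. Channel names, decay rate,
# and coefficients are invented for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
weeks = 104  # two years of weekly observations

def adstock(spend, decay=0.5):
    """Carry a fraction of each week's effect over into the next week."""
    out = np.zeros_like(spend)
    for t in range(len(spend)):
        out[t] = spend[t] + (decay * out[t - 1] if t > 0 else 0)
    return out

# Synthetic spend for two channels and the revenue they generate
search = rng.uniform(10_000, 50_000, weeks)
social = rng.uniform(5_000, 30_000, weeks)
revenue = (80_000 + 1.4 * adstock(search) + 0.6 * adstock(social)
           + rng.normal(0, 15_000, weeks))

# Fit the model and read off channel-level contribution estimates
X = sm.add_constant(np.column_stack([adstock(search), adstock(social)]))
fit = sm.OLS(revenue, X).fit()
print(fit.params)      # baseline, search coefficient, social coefficient
print(fit.conf_int())  # ranges, not point-estimate "truth"
```

Reading `conf_int()` rather than just the coefficients reflects the probabilistic mindset discussed above: the model outputs estimates with uncertainty, not ground truth.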
Deploy Attribution for Tactical Optimization
Use attribution as your on-the-ground execution tool for optimizing within established channels. Focus on creative performance, audience selection, keyword optimization, and placement decisions. Accept its blind spots around offline influence, cross-device behavior, and brand effects rather than forcing it to answer strategic questions.
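As a toy illustration of why the attribution rule itself matters, the sketch below scores the same conversion paths under a last-touch rule and a linear rule. The channel names and paths are hypothetical.

```python
# Two common attribution rules over invented conversion paths, showing how
# the choice of rule changes which tactic looks best.
from collections import defaultdict

paths = [
    ["search_brand", "email"],
    ["social_video", "search_brand"],
    ["social_video", "display", "search_brand"],
    ["email"],
]

last_touch = defaultdict(float)
linear = defaultdict(float)
for path in paths:
    last_touch[path[-1]] += 1.0           # all credit to the final touchpoint
    for touch in path:
        linear[touch] += 1.0 / len(path)  # credit split evenly across touches

print(dict(last_touch))  # search_brand dominates
print(dict(linear))      # social_video earns meaningful credit
```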
Run Incrementality Tests for Validation
Design holdout tests to answer 'What would have happened without this campaign?' Run multiple tests over time rather than treating single results as definitive truth. Use incrementality to validate big bets and settle strategic arguments, not for fine-grained tactical decisions.
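A minimal readout of such a holdout test might look like the sketch below, assuming you have already split users into a treated group (saw the campaign) and a held-out control group. The counts are hypothetical.

```python
# Measure incremental lift between treated and held-out groups, then check
# whether the difference is plausibly real. Counts are invented.
from statsmodels.stats.proportion import proportions_ztest

treated_users, treated_conversions = 100_000, 2_300
control_users, control_conversions = 100_000, 2_000

lift = treated_conversions / treated_users - control_conversions / control_users
stat, p_value = proportions_ztest(
    count=[treated_conversions, control_conversions],
    nobs=[treated_users, control_users],
)
print(f"incremental conversion rate: {lift:.4%}, p-value: {p_value:.3f}")
```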
Build a Measurement Triangle System
Integrate all three tools as complements, not competitors. Let MMM set allocation guardrails, incrementality pressure-test assumptions, and attribution provide fast tactical signals. Each tool makes the others better when used for its intended purpose within the proper sequence.
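One common pattern for making the tools reinforce each other is to calibrate fast attribution signals with the ratio measured in a periodic incrementality test. A sketch with hypothetical numbers:

```python
# Use a measured incrementality ratio to discount platform-reported
# attribution between tests. All figures are invented for illustration.
platform_reported_conversions = 1_000   # what attribution claims
incremental_conversions = 620           # what a holdout test measured
calibration = incremental_conversions / platform_reported_conversions  # 0.62

# Apply the multiplier to daily attribution signals until the next test
daily_attributed = 140
print(f"calibrated estimate: {daily_attributed * calibration:.0f} conversions")
```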
Focus Business Evaluation on Long-Term Outcomes
Evaluate marketing teams and agencies based on sustained business health: revenue growth, profit margins, market share gains, customer retention, and brand demand. Include qualitative factors like strategic thinking, proactivity, reliability, and decision-making quality that determine long-term success.
Accept That Measurement Is Probabilistic, Not Deterministic
Embrace that all models are approximations, not perfect explanations. Work with confidence intervals and likelihood estimates rather than demanding false precision. Use models as decision aids while maintaining judgment about what matters but can't be perfectly measured.
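In practice, that means reporting ranges instead of single numbers. A small sketch, assuming hypothetical test results:

```python
# Report a measured lift as a 95% confidence interval rather than a point
# estimate. The rates and sample size are invented.
import math

treated_rate, control_rate, n = 0.023, 0.020, 100_000
lift = treated_rate - control_rate
# Standard error of a difference in independent proportions
se = math.sqrt(treated_rate * (1 - treated_rate) / n
               + control_rate * (1 - control_rate) / n)
low, high = lift - 1.96 * se, lift + 1.96 * se
print(f"lift: {lift:.3%} (95% CI {low:.3%} to {high:.3%})")
```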
Align Incentives Around Shared Goals
Create measurement frameworks that encourage collaboration rather than competition between teams and vendors. Define shared business objectives that prevent attribution gaming and credit-claiming. Build trust through transparent communication about strategy, milestones, and adaptation plans rather than dashboard management.
Frequently Asked Questions
What's the difference between scoreboards and film room in marketing measurement?
Film room activities focus on learning and optimization—analyzing campaign performance to understand what worked and what to change. Scoreboards evaluate team and partner performance based on long-term business outcomes like revenue growth, profit, and market share. Mixing these purposes creates dysfunction.
When should I use Marketing Mix Modeling vs attribution?
Use MMM for strategic budget allocation across major channels over quarters or years. Use attribution for tactical optimization within channels—creative testing, audience selection, keyword performance. MMM is your planning tool; attribution is your execution tool.
How do I run effective incrementality tests?
Design controlled experiments where one group sees your marketing and another doesn't, then measure the difference. Run multiple tests over time rather than treating single results as definitive. Use incrementality to validate big strategic bets, not fine-grained tactical decisions.
Why do my attribution numbers conflict with my MMM results?
Attribution and MMM measure different things using different methodologies. Attribution tracks touchpoint-level interactions but misses offline influence and brand effects. MMM captures top-down impact but moves slowly. They're complementary tools, not competing sources of truth.
What business metrics should I use to evaluate marketing effectiveness?
Focus on sustained outcomes: revenue growth, profit margins, market share gains, customer retention, and brand health metrics. Include qualitative factors like strategic thinking, reliability, and decision-making quality that determine long-term success but don't fit in dashboards.
How do I handle measurement uncertainty in marketing decisions?
Embrace probabilistic thinking—work with confidence intervals and likelihood estimates rather than demanding false precision. Use models as decision aids while maintaining judgment about what matters but can't be perfectly measured. All models are approximations, not perfect explanations.
What's wrong with optimizing for ROAS targets?
ROAS optimization encourages short-term efficiency over long-term effectiveness. According to research by Les Binet and Peter Field, campaigns focused on ROI targets tend to boost apparent returns by reducing investment, often at the cost of sustainable growth and market share.
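For illustration with hypothetical numbers: a $1M campaign that returns $3M in revenue posts a 3x ROAS and $2M in contribution. Cut spend to $400K so that only the easiest conversions remain, and a $1.6M return produces a better-looking 4x ROAS while contribution falls to $1.2M. The ratio improved; the business shrank.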
How do I prevent measurement from creating team conflicts?
Align incentives around shared business goals rather than individual channel performance. Create measurement frameworks that encourage collaboration rather than credit-claiming. Build trust through transparent strategy communication and honest acknowledgment of each tool's limitations.
From the Book
Chapter 21 introduces the scoreboards vs film room framework, showing how to use MMM, attribution, and incrementality testing as learning tools rather than weapons in internal battles over credit and blame.
Read more in Chapter 21 of *Never Always, Never Never*.
Related Reading
Why the Old Playbook Is Broken
The platform tricks and optimization shortcuts that built most digital marketing careers have been commoditized. Here's what actually changed and what to do about it.
Mental Availability: The Marketing Concept Most Digital Marketers Ignore
Mental availability is the single most important driver of brand growth. Most digital marketers have never heard of it. Here's why that matters.