Scoreboards vs. Film Room: A Framework for Marketing Measurement
Quick Answer
The scoreboards vs. film room framework separates two distinct activities in marketing measurement. The film room is for evaluating campaigns and tactics using MMM, attribution, and incrementality testing. The purpose is learning and optimization, not judgment. The scoreboard is for evaluating teams and partners through business health over time: revenue growth, profit, market share, and qualitative factors like strategic thinking and trust. As Patrick Gilbert argues in Never Always, Never Never, when measurement tools designed for the film room are treated as scoreboards, incentives distort. Teams optimize for what is easiest to measure and claim credit for, rather than what is best for the business over the long run.
When the Film Room Becomes the Scoreboard
Marketing organizations are obsessed with measurement. Patrick Gilbert argues in Never Always, Never Never that the problem is not measurement itself but what we think measurement is for. Marketing Mix Modeling, attribution, and incrementality testing are powerful tools for learning. They help you review the tape: understand what happened, why it happened, and what to change next. Used correctly, they make your marketing program smarter over time.

But these tools break down when you treat them as a final score, a clean, definitive verdict on whether your team or agency is good. When the film room becomes the scoreboard, the game gets distorted. The incentives shift from winning games to padding statistics. This is where much of the industry's dysfunction comes from. When an agency's survival depends on producing short-term, attributable ROAS, its incentives drift. The work tilts toward whatever is easiest to measure and easiest to claim credit for, even when that is not what is best for the business over the long run.

This is not a failure of intelligence. It is a failure of framing. We took tools designed for analysis and improvement and turned them into a courtroom. The scoreboards vs. film room framework puts those tools back in the right place.
The Marketing Measurement Triangle
Each of the three measurement tools gives you a different view of reality, answers a different kind of question, and becomes dangerous when forced to answer questions it was never meant to answer.

Marketing Mix Modeling is the 30,000-foot view. It looks at your business from high altitude and asks: when spend goes up in certain places, does the business tend to grow? MMM helps you decide how to allocate a finite pool of money across major channels and levers over a quarter or a year. But it moves slowly. It cannot tell you which specific creative is driving results, and it cannot act as a real-time optimization dashboard.

Attribution is the on-the-ground view. It lives at the touchpoint level, assigning credit for a conversion across digital interactions: clicks, impressions, site visits, retargeting. That makes it an execution tool, useful for optimizing within the system you have already chosen. The problem is that attribution overvalues channels that are easiest to track, especially bottom-funnel tactics like Google Search and retargeting.

Incrementality testing is the scientific view. It answers the question attribution cannot: what would have happened if we did not do this? You hold out a control group, compare outcomes, and measure the difference. It gives leadership confidence that marketing is driving behavior that would not have happened anyway. But incrementality is still probabilistic, not deterministic. Running the same test again could produce a materially different outcome.
Think Like an NFL General Manager
Patrick Gilbert draws a detailed analogy between marketing measurement and NFL roster construction to make the framework tangible. In the 2024 season, Bills running back James Cook scored 16 rushing touchdowns, tied for the most in franchise history. If you only looked at the box score, you would assume he was the most valuable offensive player. This is exactly how attribution works in marketing. The channel that finishes the drive gets the credit.

But James Cook's contract represented just 1.2% of the salary cap, while quarterback Josh Allen accounted for more than 13%. Allen shaped outcomes long before the ball crossed the goal line. And sitting between Allen and Cook were three offensive linemen whose individual cap hits were more than double Cook's. They never scored touchdowns. Most fans would not recognize their names. But without them, the offense would not function.

Attribution has the same blind spot. It struggles to value the work that creates conditions for success rather than claiming success outright. Brand campaigns, upper-funnel media, and creative that changes perception rarely score in attribution models. But they quietly shape outcomes everywhere else.

The Bills also let their leading touchdown receiver, Mack Hollins, walk in free agency while extending Khalil Shakir, who led the team in targets and receptions. The offense was roughly twice as productive when Shakir was on the field. That is an incrementality judgment: measuring what changes because a player is active, not just who scored.
No single model can do everything. MMM, incrementality, and attribution form a layered measurement system. Each tool makes the others better when used for the right purpose.
How It Works
Define the Film Room (Campaign Evaluation)
The film room is where MMM, attribution, and incrementality testing belong. The purpose is learning and optimization. You are reviewing tape, trying to understand what worked and what to change next. The mindset is analytical and collaborative, not punitive. You are not looking for someone to blame. You are looking for better odds next time.
Define the Scoreboard (Team Evaluation)
Evaluate teams and partners through business health over time: revenue growth, profit, market share, retention, and brand demand. Include qualitative factors that do not fit into models: strategic thinking, proactivity, reliability, decision-making, and trust. These traits determine whether you are building something durable.
Assign Each Tool Its Proper Role
Use MMM for budget allocation decisions across channels and quarters. Use attribution for tactical optimization within channels: which creative, which audience, which keyword. Use incrementality to validate big bets and settle real arguments about whether something is driving incremental value.
Stop Using Film Room Tools as Scoreboards
When you evaluate an agency or team through attribution dashboards, you create incentives to optimize for measurable, claimable results rather than business health. This is the root cause of most dysfunction in marketing measurement. Separate the tools from the judgment.
Build Earned Confidence
Replace the illusion of control with something better: confidence in the plan, confidence in the feedback loops, and confidence that the people involved are playing for the win rather than the stats. Accept that all models are wrong, but some are useful. The right question is never whether a model is true, but whether it is useful for the decision you are trying to make.
Frequently Asked Questions
What is the scoreboards vs. film room framework?
It separates two distinct activities in marketing measurement. The film room is for evaluating campaigns and tactics using MMM, attribution, and incrementality. The purpose is learning. The scoreboard is for evaluating teams and partners through business health metrics and qualitative factors. When film room tools are used as scoreboards, incentives distort.
What is the marketing measurement triangle?
The triangle consists of three complementary tools. Marketing Mix Modeling provides the 30,000-foot view for allocation decisions. Attribution provides the on-the-ground view for tactical optimization. Incrementality testing provides the scientific view for validating whether something drives incremental value. Each tool answers different questions.
Why is attribution alone not enough?
Attribution overvalues channels that are easiest to track, especially bottom-funnel tactics like Google Search and retargeting. It struggles with offline influence, cross-device behavior, long consideration cycles, and the compounding value of brand building. Using attribution as the sole basis for budget decisions systematically under-invests in upper-funnel work.
How does incrementality testing work?
You hold out a control group that does not see the marketing, compare it to a group that does, and measure the difference. It is the closest marketing gets to a true experiment. But incrementality is probabilistic. Running the exact same test again could produce a different outcome. Results are influenced by timing, creative, market conditions, and randomness.
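The holdout comparison described above can be sketched in a few lines. The function and the numbers below are hypothetical illustrations, not from the book; the z-score is a standard pooled two-proportion test, used here only as a rough significance check on the measured lift.

```python
from math import sqrt

def incremental_lift(test_conv, test_n, control_conv, control_n):
    """Estimate incremental lift from a holdout test.

    test_*: the group exposed to the marketing.
    control_*: the holdout group that did not see it.
    Returns (absolute_lift, relative_lift, z_score).
    """
    p_test = test_conv / test_n
    p_ctrl = control_conv / control_n
    abs_lift = p_test - p_ctrl
    rel_lift = abs_lift / p_ctrl if p_ctrl else float("inf")
    # Pooled standard error for the difference in conversion rates
    p_pool = (test_conv + control_conv) / (test_n + control_n)
    se = sqrt(p_pool * (1 - p_pool) * (1 / test_n + 1 / control_n))
    z = abs_lift / se if se else 0.0
    return abs_lift, rel_lift, z

# Hypothetical numbers: 2.4% conversion with ads vs. 2.0% in the holdout
abs_lift, rel_lift, z = incremental_lift(2400, 100000, 1000, 50000)
print(f"absolute lift: {abs_lift:.4f}, relative lift: {rel_lift:.1%}, z: {z:.2f}")
# prints: absolute lift: 0.0040, relative lift: 20.0%, z: 4.91
```

Note that even a large z-score only says the difference was unlikely to be noise in this one test; as the answer above warns, timing, creative, and market conditions can move the result the next time you run it.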
How do I evaluate my marketing team or agency?
Through business health over time, not through dashboards. Revenue growth, profit, market share, retention, and brand demand are the scoreboard. Add qualitative factors: strategic thinking, proactivity, reliability, decision-making quality, and trust. These traits matter more than any single attribution metric.
What does 'all models are wrong but some are useful' mean for marketing?
No model can fully explain a complex system. Human behavior, markets, competition, and timing do not fit cleanly into equations. The right question is not whether a model is true but whether it is useful for the decision you need to make. MMM, incrementality, and attribution are all approximations of reality that help you make better decisions.

From the Book
Chapter 21 uses the 2024 Buffalo Bills season to explain the marketing measurement triangle. James Cook's touchdowns, Josh Allen's quarterbacking, and the offensive line's invisible contributions show why box scores and attribution dashboards never tell the full story.
Related Reading
The 60/40 Rule: How to Split Your Marketing Budget Between Brand and Performance
Learn how to apply Binet and Field's 60/40 brand-to-performance marketing budget split. Understand the research, why most digital brands get it wrong, and how to implement it.
Marketing Strategy in Five Steps: How to Move from Tactics to Strategic Thinking
Learn a five-step framework for building real marketing strategy. Stop confusing tactics for strategy and start making decisions that compound over time.
Brand vs Performance Marketing: Why the Split is Killing Your Growth
The false division between brand and performance marketing is weakening both. Learn why integrated campaigns outperform siloed approaches.
Marketing Mix Modeling: The 30,000-Foot View of Marketing ROI
Marketing mix modeling (MMM) is a top-down statistical approach to measuring marketing effectiveness across channels. Learn how MMM works for budget allocation.
Incrementality Testing: Measuring True Marketing Impact
Learn incrementality testing marketing methods to measure true causal impact vs. correlation. Move beyond attribution to understand real marketing effectiveness.