Most measurement frameworks fail because they were never backed by a contract. Not a technology contract. An organizational one: a signed agreement between analytics, marketing, and leadership about what gets measured, who owns the insight, and what happens when the numbers land.
Key Takeaways
- Frameworks fail at the organizational layer, not the technical one: the missing piece is a signed pre-agreement, not a better dashboard.
- Every metric needs three components before it reaches a leader's screen: an insight, a recommended action, and a predicted business impact.
- Data silos survive because they're political territory, not technical problems: breaking them requires authority, not tools.
- Building this contract is harder than building the dashboard because it demands leadership commitment before anyone knows what the data will say.
Your analytics team has the tools. They’ve built the dashboards. They’ve implemented attribution models, media mix models, incrementality tests. They’ve hired analysts who can find patterns that would impress a data scientist. And between 60% and 75% of marketing organizations will tell you the measurement systems still aren’t delivering the speed, accuracy, or trust the business needs (1. IAB/BWG Global, 2026).
The gap is organizational.
Somewhere between the dashboard and the decision, insight dies. Delivery comes too late for the planning window it was supposed to inform. The people who receive it have opinions but no authority to act. Quarterly reviews become the graveyard: everyone nods at the slides, nobody changes the budget. The frameworks look credible in the leadership deck. They fall apart the moment leadership is supposed to act on what they show.
Avinash Kaushik, Google’s former Digital Marketing Evangelist and the architect of the Digital Marketing and Measurement Model, has been naming this failure for over a decade. His prescription is blunt: if your leadership team hasn’t signed a measurement contract, you’re messing around with data (2. Kaushik, n.d.).
The Contract Nobody Wrote
The contract starts with five questions your leadership team should be able to answer. Kaushik calls it the DMMM: Objectives, Goals, KPIs, Targets, Segments. Each step requires one clear answer from one clear owner, with active sign-off from senior management.
The output is a pre-alignment contract: an explicit agreement about why your digital presence exists, what strategies serve that purpose, what one critical metric measures each strategy’s performance, what target separates success from failure, and which audience segments matter most for business outcomes.
Five questions. Simple framework. And most organizations have never completed it. They skip the contract and jump straight to the dashboard. They instrument everything, measure what’s easy to capture, and then wonder why the C-suite treats the reports as background noise.
The breakdown happens at the point of delivery. Every piece of analysis landing on a leader’s desk should deliver three things: an Insight (the non-obvious pattern in the data), a recommended Action, and the predicted Business Impact if that action is taken. Kaushik calls this the IAbI model. Without all three, you’ve handed leadership a report they’ll acknowledge and file. Behavior stays the same.
Most analytics teams stop at the report. The ambitious ones stop at the insight. Almost none deliver the recommended action and the predicted impact in the same artifact.
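The IAbI standard lends itself to a simple gate: a deliverable ships only when all three parts are filled in. A minimal sketch in Python — the field names and the `is_complete` check are illustrative, not Kaushik's own formalization:

```python
from dataclasses import dataclass

@dataclass
class Deliverable:
    """An analysis artifact headed for a leader's desk."""
    insight: str          # the non-obvious pattern in the data
    action: str           # the specific recommended action
    business_impact: str  # predicted impact if the action is taken

def is_complete(d: Deliverable) -> bool:
    """IAbI gate: all three components must be present, not just the insight."""
    return all(part.strip() for part in (d.insight, d.action, d.business_impact))

report = Deliverable(
    insight="Branded search converts 3x better for the 25-34 segment",
    action="",           # no recommended action yet
    business_impact="",  # no predicted impact yet
)
# An insight without an action and impact is a report leadership will file.
assert not is_complete(report)
```

The point of the gate isn't the code; it's that "insight-only" artifacts become visibly incomplete instead of quietly acceptable.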
Why It Stays Broken
Jim Sterne founded the Marketing Analytics Summit and has spent three decades shaping how the analytics profession thinks about its own role. He puts the structural problem in five words: data silos are political, not technical (3. Sterne, 2021). The integration engineering is solvable. Most organizations have solved it, or could solve it by next quarter. But the VP of Marketing and the VP of Sales each treat their data as personal leverage. The CFO wants a different definition of ROI than the CMO. Governance requires authority changes that most analytics teams don’t have the organizational standing to negotiate, let alone force.
AI isn’t solving this. Organizations that report successful AI initiatives invest up to four times more, as a percentage of revenue, in data quality, governance, people readiness, and change management than organizations seeing poor AI outcomes (4. Gartner, 2026). That spend goes into exactly the infrastructure the measurement contract is supposed to create.
The pattern repeats across industries and stack sizes: analytics teams can build the insight. But they can’t force the meeting where the budget gets reallocated. The planning calendar won’t rewrite itself so analysis arrives before the decision window closes. And nobody is volunteering to mediate the argument between the CMO and the CTO about who owns the data dictionary.
What Goes in the Contract
The measurement contract requires four agreements most organizations have been avoiding.
Decision rights. For every KPI in the framework, one person owns the authority to act on what the data says. Not “informed.” Not “consulted.” Owns the decision and the budget attached to it.
Insight SLAs. Analysis arrives within the planning window where it can change the next decision, not the post-mortem of the last one. If quarterly insights land after the quarterly budget is locked, you’ve built an expensive archive.
Accountability rituals. A recurring cadence where the insight owner presents the IAbI (insight, recommended action, predicted business impact) and the decision owner responds with a commitment: act, defer with a stated reason, or reject with evidence. No nodding and moving on.
Scope boundaries. Kaushik’s DMMM forces this with five questions, each requiring one answer, signed off by leadership. The scope contract prevents the team from instrumenting every touchpoint and analyzing nothing that connects to a business outcome. If a metric can’t trace back to one of those five answers, drop it.
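The four agreements above can be linted mechanically once the contract exists. A hypothetical sketch, assuming the contract is kept as a flat list of KPI entries — the field names, owner titles, and strategy labels are invented for illustration:

```python
# Illustrative answers to the DMMM's "what strategies?" question.
DMMM_STRATEGIES = {"acquire", "convert", "retain"}

# Hypothetical contract: each KPI names one decision owner and the
# agreed strategy it traces back to.
contract = [
    {"kpi": "cost_per_incremental_conversion", "owner": "VP Performance", "strategy": "acquire"},
    {"kpi": "social_followers", "owner": None, "strategy": "brand_awareness"},
]

def violations(entries, strategies):
    """Flag KPIs with no decision owner or no trace to a contracted strategy."""
    out = []
    for e in entries:
        if not e["owner"]:
            out.append((e["kpi"], "no decision owner"))
        if e["strategy"] not in strategies:
            out.append((e["kpi"], "no trace to a contracted strategy: drop it"))
    return out
```

Run against the sample contract, `social_followers` fails both checks: nobody owns the decision it informs, and it doesn't map to any strategy leadership signed off on.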
Technically, none of this is hard. Politically, every piece of it is expensive. The measurement contract asks leadership to commit to acting on what the data says before they know what the data will say. That’s the trade-off most organizations aren’t willing to make. And it’s why they’d rather build another dashboard.
Frequently Asked Questions
What is a measurement contract in marketing?
A signed pre-agreement between analytics, marketing, and leadership covering what gets measured, who owns each insight, and what happens when the numbers land.

What is the IAbI framework?
Kaushik’s standard that every piece of analysis delivers three things together: an Insight, a recommended Action, and the predicted Business Impact of taking that action.

Why do marketing measurement frameworks fail?
They fail at the organizational layer, not the technical one: insights arrive after decisions are locked, land with people who lack authority to act, and never trigger budget changes.

How does AI affect marketing measurement?
It doesn’t close the organizational gap. Organizations with successful AI initiatives invest up to four times more in data quality, governance, and people readiness, the same foundations the measurement contract creates.

What should a measurement contract include?
Four agreements: decision rights for every KPI, insight SLAs tied to planning windows, accountability rituals for acting on insights, and scope boundaries anchored in the DMMM’s five answers.
References
- IAB/BWG Global. (2026). State of Data 2026: The AI-Powered Measurement Transformation. Interactive Advertising Bureau. https://www.iab.com/events/modernizing-mmm-attribution-incrementality-ai/
- Kaushik, A. (n.d.). Five Key Elements For A Big Analytics Driven Business Impact. kaushik.net. https://www.kaushik.net/avinash/elements-for-big-digital-analytics-driven-business-success/
- Sterne, J. (2021). Jim Sterne: Data Silos are Political - Not a Technical Problem. eMarketing Association. https://www.emarketingassociation.com/2021/08/jim-sterne-data-silos-are-political-not-a-technical-problem/
- Gartner. (2026). Gartner Says Organizations with Successful AI Initiatives Invest Up to Four Times More in Data and Analytics Foundations. https://www.gartner.com/en/newsroom/press-releases/2026-04-16-gartner-says-organizations-with-successful-ai-initiatives-invest-up-to-four-times-more-in-data-and-analytics-foundations

