Why Your ROAS Number Is Probably Wrong

The Number Everyone Trusts (and Shouldn’t)

Return on ad spend is the metric that marketing teams live and die by. It’s the first number in the monthly report, the headline in the agency review, and the figure that determines whether budgets grow or shrink.

There’s just one problem. The ROAS number most teams are using is almost certainly wrong.

Not wrong as in slightly off. Wrong as in fundamentally misleading. The gap between what platforms report and what’s actually happening can be enormous, and the consequences of trusting the wrong number ripple through every budget decision a marketing team makes.

Three Industries, Three Blind Spots

The QSR Brand That Could Only See 10%

A national QSR franchise with 37 stores was running campaigns across Meta, Google, and several other digital channels. Their platform dashboards painted a detailed picture of online conversions, click-through rates, and attributed revenue.

The problem? 70% of their sales happened in-store. Customers saw an ad, drove to a location, and paid at the counter. No click. No pixel. No attribution.

When they actually measured, using marketing mix modelling (MMM) to connect spend to total sales across all channels, the picture was radically different from what the platforms had reported. Platform attribution was capturing roughly 10% of total transactions. The other 90% of the story was invisible.

The true ROAS turned out to be 4.6x, far higher than any platform was reporting. The marketing was working dramatically better than anyone realised, but the measurement framework was too narrow to see it.
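The arithmetic behind that gap is simple. A minimal sketch, using hypothetical spend figures loosely based on the QSR example above (the 10% capture rate and 4.6x true ROAS come from the article; the dollar amounts are invented for illustration):

```python
# Illustrative sketch: how a low attribution-capture rate distorts ROAS.
# All dollar figures are hypothetical.

def blended_roas(total_revenue: float, total_spend: float) -> float:
    """Total revenue (online + in-store) divided by total ad spend."""
    return total_revenue / total_spend

ad_spend = 1_000_000           # total monthly spend across channels
platform_attributed = 460_000  # revenue the platforms can "see" (clicks, pixels)
capture_rate = 0.10            # share of transactions platforms actually track

# Platform dashboards report only what they can attribute:
platform_roas = platform_attributed / ad_spend  # 0.46x -- looks terrible

# Scaling by the capture rate approximates total marketing-driven revenue:
estimated_total_revenue = platform_attributed / capture_rate
true_roas = blended_roas(estimated_total_revenue, ad_spend)  # 4.6x

print(f"Platform ROAS: {platform_roas:.2f}x, estimated true ROAS: {true_roas:.2f}x")
```

The point of the sketch is the ratio, not the inputs: when attribution sees only a tenth of transactions, reported ROAS is off by a factor of ten before any other distortion kicks in.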

The D2C Brand Where Seven-Day Windows Made No Sense

A direct-to-consumer brand selling premium home furnishings was wrestling with a different version of the same problem. Their products, high-quality rugs and textiles, weren’t impulse purchases. Customers typically researched for six to 18 months before buying.

Their marketing consultant put it bluntly: “Seven-day click attribution is weird, frankly, for a product like this.”

Think about what seven-day click attribution actually means. A customer clicks an ad on Monday. If they don’t purchase by the following Monday, the platform records zero attributed revenue from that click. For a brand selling products that take months of consideration, the vast majority of eventual purchases fall outside the attribution window entirely.

The platform ROAS looked mediocre. But the actual impact of marketing on long-term purchase behaviour was invisible to the attribution model. Every campaign report was systematically undercounting the true return because the measurement window didn’t match the buying cycle.
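To see how severe the window mismatch can be, here is a rough simulation. The consideration-time distribution is invented purely for illustration (a lognormal with a median around 90 days, consistent with a six-to-18-month research cycle), not data from the brand above:

```python
# Rough sketch: how much revenue a 7-day window misses when the buying
# cycle is long. The distribution of days-to-purchase is hypothetical.
import random

random.seed(42)

WINDOW_DAYS = 7

# Hypothetical days from first click to purchase, skewed toward months:
# exp(4.5) is roughly 90 days at the median.
days_to_purchase = [random.lognormvariate(4.5, 1.0) for _ in range(10_000)]

inside_window = sum(d <= WINDOW_DAYS for d in days_to_purchase)
captured_share = inside_window / len(days_to_purchase)

print(f"Share of purchases credited by a {WINDOW_DAYS}-day window: {captured_share:.1%}")
```

Under these assumed parameters, well under 5% of eventual purchases land inside the window. The exact figure depends entirely on the distribution you assume, but the shape of the problem doesn't: the longer the consideration cycle, the more the window systematically zeroes out real returns.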

The B2B SaaS Company Drowning in Signals

A B2B SaaS company had a different problem again. They weren’t short on data. They were drowning in it.

Prospects were engaging across dozens of touchpoints: webinars, whitepapers, email sequences, LinkedIn ads, organic search, partner referrals, conference appearances, and product demos. The marketing team had “200 companies saying the same thing” about pain points, but no reliable way to measure which channels were actually driving qualified leads versus which ones were just generating noise.

Multi-touch attribution models assigned fractional credit across touchpoints, but the numbers shifted dramatically depending on which model they used. First-touch attribution told one story. Last-touch told another. Linear attribution split the difference and satisfied nobody.

The fundamental issue wasn’t the attribution model. It was that touchpoint-level attribution was the wrong framework for understanding how marketing drives B2B pipeline. The buying journey is too long, too non-linear, and involves too many unmeasured interactions (word of mouth, internal conversations, competitor comparisons) for any click-based model to capture accurately.

Why Platforms Overcount (and Undercount)

Platform-reported ROAS has two systematic problems that pull in opposite directions.

Overcounting through claimed conversions. Digital platforms are incentivised to take credit for as many conversions as possible. If a customer was going to buy anyway and happened to click an ad along the way, the platform claims that sale. This inflates ROAS for channels that intercept existing demand rather than creating new demand.

Undercounting through narrow windows. At the same time, platforms can only track what they can see. Offline conversions, cross-device journeys, long consideration cycles, and word-of-mouth effects are all invisible. For businesses with significant offline revenue or long sales cycles, this undercounting can be massive.

The net effect depends on your business model. For the QSR brand with 70% offline sales, the undercounting far outweighed the overcounting, and true ROAS was much higher than reported. For a pure-play ecommerce brand with short purchase cycles, the overcounting might dominate, and true incremental ROAS could be lower than what the dashboard shows.

The Incrementality Gap

The most important concept in marketing measurement is incrementality. It’s the difference between “this customer converted after seeing our ad” and “this customer converted because of our ad.”

That distinction is everything. A customer who searches your brand name and clicks your paid search ad was probably going to find you anyway. The ad didn’t create that demand. It intercepted it. The incremental value of that click is close to zero, even though the platform records it as a fully attributed conversion.

Conversely, a social media ad that introduces your brand to someone who had never heard of you might plant a seed that leads to a purchase six months later. The incremental value is high, but if the purchase happens outside the attribution window or in a physical store, the platform records nothing.

How to Get Closer to the Real Number

Match Your Measurement to Your Business

If most of your revenue is offline, you need a measurement framework that accounts for offline sales. Platform attribution isn’t wrong for what it measures. It’s just incomplete. MMM bridges the gap by connecting total marketing spend to total revenue, regardless of where the transaction occurs.

Extend Your Time Horizon

If your product has a long consideration cycle, seven-day or even 28-day attribution windows will systematically undercount your returns. Look at longer time horizons. Use holdout tests to measure the true impact of marketing over months, not days.
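A holdout test reduces incrementality to one subtraction: revenue in markets exposed to ads minus revenue in matched markets where ads were switched off. A minimal sketch with hypothetical figures:

```python
# Minimal sketch of incrementality from a holdout test: compare revenue in
# markets that saw ads against matched markets that didn't. All figures
# are hypothetical.

def incremental_roas(exposed_revenue: float,
                     holdout_revenue: float,
                     ad_spend: float) -> float:
    """ROAS based only on revenue the ads caused (the lift over holdout)."""
    lift = exposed_revenue - holdout_revenue
    return lift / ad_spend

exposed_revenue = 1_200_000  # revenue in markets with ads running
holdout_revenue = 1_000_000  # revenue in comparable markets with ads off
ad_spend = 100_000

# Platform attribution might claim credit for much of the 1.2M; the holdout
# says only the 200k lift is incremental:
print(f"Incremental ROAS: "
      f"{incremental_roas(exposed_revenue, holdout_revenue, ad_spend):.1f}x")
```

In this invented scenario the incremental ROAS is 2.0x, however much revenue the dashboards attribute. The design choice that matters is the comparison group: without a credible holdout, every "lift" number is just attribution wearing a different hat.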

Question Self-Reported Metrics

Every platform has an incentive to show you that your money is well spent on their platform. Treat platform-reported ROAS as one input, not the answer. Cross-reference with independent measurement to validate or challenge those numbers.

Separate Demand Creation From Demand Capture

Not all marketing spend does the same job. Brand awareness campaigns create demand. Branded search campaigns capture it. Measuring them with the same ROAS framework treats them as equivalent when they’re fundamentally different activities with different time horizons and different return profiles.

The Takeaway

The ROAS number in your dashboard is a starting point, not a destination. Across QSR, D2C, and B2B SaaS, the gap between platform-reported returns and true incrementality is where the most important strategic insights live. The brands that close this measurement gap don’t just report better numbers. They make better decisions about where to invest, where to cut, and where the real growth opportunities are hiding in plain sight.
