Case studies · production deployments
Real accounts, real numbers, real timelines
From −8% ROAS drift to +27% lift in 90 days
A003 · Hybrid DTC + B2B SaaS · 90-day window
The largest account in the cohort. The pre-state was eight months of Smart Bidding underperformance against historical baselines; the model couldn’t adapt to a Performance Max migration that shifted the conversion distribution. Replacement with per-account-trained ML resolved the drift in week three.
How a SaaS account replaced Smart Bidding without breaking pipeline
A002 · Mid-market SaaS · 90-day window
The hardest sell internally: a SaaS revenue team that had bonded to Smart Bidding’s reporting. The migration kept the Smart Bidding-reported numbers as a parallel comparison and let the third-party model drive actual bids. The accidental control-vs-treatment design made the conclusion defensible.
Apparel DTC at $28K/mo: a smaller win, a clearer signal
A001 · Apparel DTC · 90-day window
The smallest account in the cohort and the test of whether the third-party advantage holds below $50K/mo. It does, but more modestly. The lift here was almost entirely attributable to the model correctly handling a 25%+ return rate that Smart Bidding wasn’t weighing.
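The return-rate weighting can be sketched as a simple value correction applied before conversion value reaches the bidder. This is an illustrative sketch, not the production model; the function name and flat 25% rate are assumptions for the example:

```python
def return_adjusted_value(gross_revenue: float, return_rate: float) -> float:
    """Discount a conversion's gross revenue by the expected return rate
    before it feeds the bidding model. Hypothetical helper, not the
    production implementation."""
    if not 0.0 <= return_rate < 1.0:
        raise ValueError("return_rate must be in [0, 1)")
    return gross_revenue * (1.0 - return_rate)

# A $100 apparel order at a 25% return rate is worth $75 to the bidder,
# not the $100 an unweighted optimizer would chase.
print(return_adjusted_value(100.0, 0.25))  # 75.0
```

In practice the rate would be estimated per SKU or category rather than flat, but the principle is the same: bids follow the revenue that survives returns.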
The case where Smart Bidding actually won
A004 · Fintech leadgen · 90-day window
Published in the spirit of telling the truth. The third-party model underperformed Google’s Smart Bidding on this fintech account, primarily because the conversion volume was below the threshold for the model to train meaningfully on the account’s own data. The account moved back to Smart Bidding after week eight.
A 22% lift driven entirely by margin-awareness
A005 · Home goods DTC · 75-day window
The most counter-intuitive case in the set: reported ROAS actually went down after switching, while true contribution-margin ROAS went up materially. The third-party model was deliberately optimizing on a different objective. The CFO loved it; the marketing director needed a 30-minute explainer.
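The divergence between the two objectives is easy to show with a worked example. The figures below are hypothetical, chosen only to illustrate the mechanism, not taken from the case:

```python
def reported_roas(revenue: float, spend: float) -> float:
    """ROAS on gross revenue, the number ad platforms report."""
    return revenue / spend

def margin_roas(revenue: float, cogs: float, spend: float) -> float:
    """ROAS on contribution margin (revenue minus cost of goods sold)."""
    return (revenue - cogs) / spend

# Shifting spend toward lower-revenue, higher-margin products can lower
# reported ROAS while raising margin ROAS at the same time.
before = (reported_roas(50_000, 10_000), margin_roas(50_000, 35_000, 10_000))
after = (reported_roas(45_000, 10_000), margin_roas(45_000, 27_000, 10_000))
print(before)  # (5.0, 1.5)
print(after)   # (4.5, 1.8)
```

Reported ROAS falls from 5.0 to 4.5 while margin ROAS rises from 1.5 to 1.8: the 30-minute explainer in one print statement.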
What broke during the cutover (and what we learned)
A006 · Services multi-channel · 90-day window
A clean win on the headline number, but week one was rough. The cutover dropped lead volume by 31% before the model stabilized. This case study is mostly about how to communicate the expected drawdown to clients before it happens — a process artifact more than a tactical lesson.
Failure modes observed
What didn’t work, where, and why. Documenting failure publicly is part of the case-study standard — if every case is a win, the archive is marketing, not measurement.
Sub-$25K/mo accounts
Conversion volume below the threshold for per-account model training. Google’s portfolio-trained Smart Bidding outperforms third-party models in this segment about three times out of four.
Highly seasonal accounts
Models trained on the past 90 days can lag a sharp seasonal shift. Two accounts in the cohort required manual override during a holiday surge that the model hadn’t seen previously.
Cutover week drawdowns
The model’s exploration phase costs 7–14 days of underperformance. Clients who weren’t prepared for this regularly considered the migration a failure before the model had time to stabilize.
Misaligned conversion events
The most common cause of a failed migration. If the conversion event the model is optimizing for isn’t the event the business actually values, the model produces a ROAS lift on a number nobody cares about.
Methodology
Each case is a 75–90 day controlled deployment on a live client account. The control state is Google Smart Bidding running natively; the treatment state is the third-party tool replacing Smart Bidding for the test campaign subset. ROAS is measured as weekly observations over the window, with a two-sample t-test for significance. Per-case detail lives on each individual case study page.
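The weekly-observation comparison can be sketched with a Welch's two-sample t statistic (no equal-variance assumption between arms). The arrays below are hypothetical 13-week ROAS series for illustration, not figures from any case:

```python
from math import sqrt
from statistics import mean, variance

def welch_t(a: list[float], b: list[float]) -> float:
    """Welch's two-sample t statistic for unequal variances."""
    va, vb = variance(a), variance(b)
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

# Hypothetical weekly ROAS over a 90-day (13-week) window.
control = [3.1, 2.9, 3.0, 3.2, 2.8, 3.0, 3.1, 2.9, 3.0, 3.1, 2.8, 3.0, 2.9]
treatment = [3.0, 3.2, 3.4, 3.5, 3.6, 3.7, 3.5, 3.8, 3.6, 3.7, 3.8, 3.9, 3.7]

print(f"t = {welch_t(treatment, control):.2f}")
```

For the p-value itself, `scipy.stats.ttest_ind(treatment, control, equal_var=False)` computes the same statistic plus the significance level against the t distribution.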
Anonymization removes client identity but preserves vertical, spend tier, and outcome. Where the third-party tool was Groas.ai, the engagement is disclosed in methodology per disclosure policy. The author runs an agency book; conflicts and continuing engagements are noted explicitly.