The A/B Tests That Actually Drive Revenue
(Not Just More Opens)
Hey there,
Michael Rathman from AdSumo Digital here.
If you've been around email marketing for any length of time, you've probably seen agencies bragging about their latest A/B test results:
"Our subject line test increased open rates by 12%!" "We got 3X more clicks with this new CTA button!"
But here's what they're not telling you: most A/B tests don't actually impact revenue.
They're the marketing equivalent of rearranging deck chairs on the Titanic. Lots of activity, zero impact on the destination.
The Actionable Tip of the Day
When running A/B tests, always ask: "Will this change move dollars, or just metrics?" If you can't draw a clear line to revenue, it's probably not worth testing.
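To make that concrete, here's a minimal sketch of scoring a test on revenue per recipient instead of opens or clicks, using a bootstrap confidence interval for the difference between variants. Everything below is hypothetical: the arrays, list sizes, and order values are made-up placeholders standing in for a real export from your ESP.

```python
# A minimal, hypothetical sketch: judge an A/B test by revenue per
# recipient, not opens. All numbers below are made up for illustration.
import numpy as np

rng = np.random.default_rng(seed=42)

def bootstrap_diff_ci(a: np.ndarray, b: np.ndarray,
                      n_boot: int = 10_000, alpha: float = 0.05):
    """Bootstrap CI for the difference in mean revenue per recipient
    (variant B minus variant A)."""
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        diffs[i] = (rng.choice(b, size=b.size, replace=True).mean()
                    - rng.choice(a, size=a.size, replace=True).mean())
    return np.quantile(diffs, [alpha / 2, 1 - alpha / 2])

# Per-recipient revenue: mostly zeros (no purchase), plus order values
# for the buyers. These fake arrays stand in for a real ESP export.
variant_a = np.concatenate([np.zeros(9_800), rng.gamma(2.0, 40.0, 200)])
variant_b = np.concatenate([np.zeros(9_750), rng.gamma(2.0, 42.0, 250)])

lo, hi = bootstrap_diff_ci(variant_a, variant_b)
print(f"Variant A: ${variant_a.mean():.3f} per recipient")
print(f"Variant B: ${variant_b.mean():.3f} per recipient")
print(f"95% CI for B minus A: [${lo:.3f}, ${hi:.3f}] per recipient")
# If that interval excludes zero, the winner moved dollars, not metrics.
```

The design choice that matters: every recipient's zero counts in the average, so the metric folds conversion rate and order value into one revenue number instead of rewarding clicks that never turn into purchases.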
Top Links You Should Check Out This Week
🔍 Email Marketing: How We Generated $265K in Monthly Email Revenue (A Real Client Case Study)
📊 Analytics: How To Design High-Converting Emails With Figma For Klaviyo
🚀 Strategy: The exact 3-step reactivation playbook we used to bring 127K subs back to life
🧠 Deep Dive: How to actually generate six figures in monthly campaign revenue
After managing retention for dozens of 7- and 8-figure ecommerce brands, I've noticed a consistent pattern:
Marketers test what's easy to implement (subject lines, send times)
They optimize for opens and clicks rather than sales
They rarely connect test results to actual dollars generated
This approach is fundamentally broken.
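Connecting a result to dollars doesn't have to be complicated, either. Here's a hypothetical back-of-envelope calculation (every input below is a made-up placeholder, not a client number):

```python
# Hypothetical back-of-envelope: turn a test's per-recipient lift into
# projected monthly dollars. All inputs are made-up placeholders.
lift_per_recipient = 0.014   # winner beat control by $0.014 per recipient
list_size = 180_000          # sendable subscribers
campaigns_per_month = 12     # sends at this volume each month

projected_monthly_lift = lift_per_recipient * list_size * campaigns_per_month
print(f"Projected monthly lift: ${projected_monthly_lift:,.0f}")  # $30,240
```

A subject-line test that only lifts opens puts a zero in that first variable, which is exactly the point.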
We just wrapped January for our client accounts, and I thought I'd share some of the A/B tests that actually moved revenue, not just vanity metrics.
These are real tests, real brands, and real numbers (with the names redacted).