The A/B Tests That Actually Drive Revenue

(Not Just More Opens)

Hey there,

Michael Rathman from AdSumo Digital here.

If you've been around email marketing for any length of time, you've probably seen agencies bragging about their latest A/B test results:

"Our subject line test increased open rates by 12%!" "We got 3X more clicks with this new CTA button!"

But here's what they're not telling you: most A/B tests don't actually impact revenue.

They're the marketing equivalent of rearranging deck chairs on the Titanic. Lots of activity, zero impact on the destination.

The Actionable Tip of the Day

When running A/B tests, always ask: "Will this change move dollars, or just metrics?" If you can't draw a clear line to revenue, it's probably not worth testing.

After managing retention for dozens of 7- and 8-figure ecommerce brands, I've noticed a consistent pattern:

  • Marketers test what's easy to implement (subject lines, send times)

  • They optimize for opens and clicks rather than sales

  • They rarely connect test results to actual dollars generated

This approach is fundamentally broken.
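
To make "connecting a test to dollars" concrete, here's a minimal sketch of the math: compare revenue per recipient across variants instead of open or click rates. The variant names and numbers below are made up for illustration; in practice you'd pull recipient counts and attributed revenue from your own ESP's reporting.

```python
# Minimal sketch: judge an email A/B test by revenue per recipient,
# not opens or clicks. All figures here are hypothetical.

variants = {
    # recipients sent to, and revenue attributed to the send (illustrative)
    "A (control)": {"recipients": 25_000, "revenue": 18_400.00},
    "B (new offer)": {"recipients": 25_000, "revenue": 21_150.00},
}

# Revenue per recipient (RPR) for each variant
for name, v in variants.items():
    rpr = v["revenue"] / v["recipients"]
    print(f"{name}: ${rpr:.3f} per recipient")

# Relative lift of B over A -- the number that actually moves dollars
rpr_a = variants["A (control)"]["revenue"] / variants["A (control)"]["recipients"]
rpr_b = variants["B (new offer)"]["revenue"] / variants["B (new offer)"]["recipients"]
print(f"Lift: {(rpr_b / rpr_a - 1):.1%}")
```

If a test wins on clicks but the revenue-per-recipient lift is flat, you haven't learned anything worth shipping.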

We just wrapped January for our client accounts, and I thought I'd share some of the A/B tests that actually moved revenue, not just vanity metrics.

These are real tests, real brands, and real numbers (with brand names redacted).

