How to A/B Test Your Brand's Emails for Maximum Impact

From Good to Great

Hey, Michael here from AdSumo Digital.

We’re back with another edition of The DTC Letters.

Today, we’re going to cover:

  1. Why A/B testing matters—even after you've found what works

  2. How to A/B test your flows for maximum results

  3. Key elements to test in your campaigns

  4. Common mistakes to avoid when testing

Let’s jump in!

Why A/B Testing Is Still Essential After You've Nailed the 80/20

You’ve locked in the 80/20—those core elements of your email strategy that drive the majority of your results.

You’re seeing great revenue gains, but that’s not where you stop.

This is where A/B testing comes in to optimize and scale even further.

Even when you think you've nailed it, A/B testing is the only way to continually improve.

Why? Because trends shift, customer behaviors evolve, and what worked yesterday might not hit the same tomorrow.

Through consistent A/B testing, you can squeeze out extra performance from your campaigns, boosting conversions, revenue, and even lifetime customer value.

How to A/B Test Your Flows for Maximum Impact

Flows are the automated backbone of your email marketing, running 24/7 to nurture and convert. But are they optimized to their full potential? A/B testing can take them from solid to exceptional.

Here’s what’s actually worth testing in flows:

  • Offers: Do your customers prefer 10% off, free shipping, or a gift with purchase? Test these variations to see what resonates most. We’ve found that % off usually wins because it’s easy for customers to understand, but brands with consumable products might see better results with a free gift.

  • Timing: Test how quickly you send follow-up emails. Sending the first email of your abandoned cart flow within the first hour can drastically improve results. Don’t be afraid to be more aggressive with high-intent customers—more often than not, they appreciate the reminder.

  • Subject Lines: Short vs. long? Formal vs. informal? Personalization vs. no personalization? Each of these can drastically affect open rates. From our testing, short, informal, and personalized subject lines tend to outperform others.

  • Content: Long-form vs. short-form? High design vs. minimalist? Direct-response copy vs. branded? The answer may vary by brand, but what we’ve found is that a mix of both direct response and branded copy performs best.

What to Test in Your Email Campaigns

Campaigns are different from flows because they’re more dynamic and often reflect current promotions or brand updates.

Instead of focusing on individual winners, you want to look for trends over time.

Here’s what you should be testing:

  • Send Times: Start by testing drastically different times—like 9 AM vs. 5 PM. Once you’ve got a clear winner, refine further by testing smaller variations like 5:30 PM vs. 7 PM. Over time, you’ll find a sweet spot.

  • Subject Lines: Short vs. long, formal vs. informal, emoji vs. no emoji—run these tests regularly. We’ve seen that informal subject lines with emojis often outperform formal ones, but always validate with your own data.

  • Copy Style: Direct copy vs. indirect copy (e.g., more conversational), short vs. long, character-filled vs. straightforward. Find out which tone connects better with your audience.

  • Volume: Once your metrics (open rates >35%, click rates >0.5%, unsubscribe rates <0.2%) are healthy, don’t be afraid to scale up. Start by sending 2 campaigns per week to engaged lists of 15k+ and increase frequency as needed.
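The health thresholds above can be expressed as a quick pre-scaling check. This is a minimal sketch; the threshold values come straight from this letter, while the function name and example numbers are ours:

```python
def ready_to_scale(open_rate: float, click_rate: float, unsub_rate: float) -> bool:
    """Return True if list metrics clear the health thresholds
    from this letter: opens >35%, clicks >0.5%, unsubscribes <0.2%."""
    return open_rate > 0.35 and click_rate > 0.005 and unsub_rate < 0.002

# Hypothetical list: 42% opens, 0.8% clicks, 0.1% unsubscribes -> safe to scale
print(ready_to_scale(0.42, 0.008, 0.001))  # True
```

If any one metric is below threshold, fix that before adding send volume.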

Common A/B Testing Mistakes to Avoid

  • Testing Minor Variations: Testing small differences—like a turquoise button vs. a teal button—won’t give you statistically significant data. Make sure your tests are big enough to yield clear insights.

  • Not Running Tests Long Enough: Don’t jump to conclusions too early. Give your tests enough time to collect meaningful data so your results are statistically valid.

  • Testing Too Many Variables at Once: Focus on one variable at a time. If you’re testing both subject lines and content in the same email, you won’t know which element made the impact.

  • Ignoring Context: Sometimes a test may show a winner in one email but not in another. For example, we found free shipping beat 10% off in a pop-up test, yet the revenue per recipient was lower with free shipping. Always look at the full picture.
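"Running tests long enough" really means collecting enough data for statistical significance. One standard way to check that for two email variants is a two-proportion z-test. A minimal sketch using only the Python standard library (the conversion counts below are made up for illustration):

```python
import math

def ab_significance(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test: compare conversion counts conv_a/n_a
    vs conv_b/n_b. Returns (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Standard normal CDF via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 50/1000 vs. 65/1000 conversions: looks like B wins, but is it real?
z, p = ab_significance(50, 1000, 65, 1000)
print(p > 0.05)  # True -- not significant yet, keep the test running
```

A common convention is to call a winner only when the p-value drops below 0.05; until then, the "winner" may just be noise.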

Actionable Tip of The Day

Testing Your Offers in Flows

Start by A/B testing the offers in your flows—this can have a huge impact on your conversions.

Different offers resonate with different audiences, and figuring out which works best for your brand can skyrocket your flow performance. Whether it's 10% off, free shipping, or a free gift, the right offer will optimize your flows for maximum results.

Start by testing offers in your welcome flow. Run a test with a % off vs. free gift, or free shipping vs. $ off.

Evaluate based on conversions, revenue per recipient, and average order value to determine the winner.
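Those three evaluation metrics are easy to compute per variant. A minimal sketch with hypothetical numbers (all figures below are invented for illustration, not benchmarks):

```python
def offer_metrics(recipients: int, orders: int, revenue: float) -> dict:
    """Per-variant scorecard: conversion rate, revenue per recipient, AOV."""
    return {
        "conversion_rate": orders / recipients,
        "revenue_per_recipient": revenue / recipients,
        "avg_order_value": revenue / orders,
    }

# Hypothetical welcome-flow test: 10% off vs. free gift
pct_off = offer_metrics(recipients=5000, orders=150, revenue=9000.0)
free_gift = offer_metrics(recipients=5000, orders=180, revenue=8100.0)

# Free gift converts better, but % off wins on revenue per recipient
print(free_gift["conversion_rate"] > pct_off["conversion_rate"])          # True
print(pct_off["revenue_per_recipient"] > free_gift["revenue_per_recipient"])  # True
```

Note how the two metrics can disagree, which is exactly the "ignoring context" trap above: judge on revenue per recipient, not conversions alone.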

Reply to this email with any questions or content that you want covered in future newsletters.

Best,

Michael

PS - If you run an Ecom brand and want to have a 15 minute strategy session with me, book a call here »

Are You Liking The DTC Letters So Far?

Be honest...no gaslighting
