#3 - A/B Testing on Campaigns

You know what rhymes with three?

Me.

And what better way to start off our third edition than with a self-introduction from our third writer, Mira Lastari.

Good morning/afternoon/evening to ya, and welcome to our third edition of “The Weekly Scoop w/ Chronos”.

Mira, the stage is yours.

Hello!

Mira here, fellow Potterhead and recent skiing enthusiast 👋

There’s just something special about hitting the slopes… the rush of cold mountain air, jaw-dropping views, and the crunch of snow under my skis. I never took myself for an adrenaline junkie, but I’m kinda sad I didn’t start sooner.

I’ve been a Senior Retention Marketing Manager at Chronos for nearly three years now, and it’s been fascinating watching the eCommerce landscape evolve over time.

Without further ado…

A/B Testing on Email Campaigns

A quick brain exercise for you:
How do you personally approach A/B testing on email campaigns?

I’ve seen brands and agencies test a gamut of things - from headlines and creatives to tone of voice, button color, and CTA text.

All of which are worth testing, except that they often test everything at once, or run tests that are statistically inconclusive.

A/B testing shouldn’t be an ‘Oh, I feel like testing this today’ kind of decision. You need to approach it with a clear hypothesis and expected outcome in mind.

Your first step is to identify which metrics need to be optimized and tested.

For example, if your open rates are suffering, you might want to look into testing your subject lines, the time of day you’re sending your email, the segment you’re pushing your campaigns to, etc.

I typically run one to four hypotheses per month, and test each one on at least three different campaigns.

I then pull it all into a spreadsheet for easy tracking, and do the comparison there. From time to time, if I’m working with a smaller sample size, I also run the numbers through Neil’s calculator to check whether my results are statistically significant.
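
For the curious, the math a calculator like that runs under the hood is a two-proportion z-test. Here’s a minimal sketch in Python; the function name and the conversion counts in the demo are mine, purely for illustration.

```python
from math import sqrt
from statistics import NormalDist

def ab_significance(conv_a: int, sends_a: int, conv_b: int, sends_b: int) -> float:
    """Two-sided p-value for the difference in conversion rates
    between variants A and B (two-proportion z-test)."""
    rate_a, rate_b = conv_a / sends_a, conv_b / sends_b
    # Pool both variants to estimate the rate under the null
    # hypothesis that there is no real difference between them.
    pooled = (conv_a + conv_b) / (sends_a + sends_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z_score = (rate_a - rate_b) / std_err
    return 2 * (1 - NormalDist().cdf(abs(z_score)))

# Hypothetical demo: subject line A converted 120 of 4,000 sends,
# subject line B converted 155 of 4,000 sends.
p_value = ab_significance(120, 4_000, 155, 4_000)
print(f"p-value: {p_value:.4f}")  # ~0.03, below 0.05 -> treat the lift as real
```

Anything above your significance threshold (0.05 is the usual default) means the ‘winner’ could just be noise, which is exactly when a bigger sample or a rerun is warranted.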

You need to keep an open mind, because what you believe to be true might turn out to be false.

For example, we ran an A/B test with the hypothesis that emails with recipes would have a higher click rate and conversion rate compared to emails without recipes.

We tested this with 2 different segments:
Segment 1: Profiles with an active monthly subscription plan
Segment 2: Profiles without an active monthly subscription plan

The results:
Segment 1: Higher conversion rate on emails without recipes
Segment 2: Higher conversion rate on emails without recipes

See why we test now? 🤷
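
This is also where a significance check earns its keep. Plugging one segment’s numbers into the ab_significance sketch above looks like this; the counts are hypothetical, since I can’t share client data.

```python
# Hypothetical Segment 1 counts, reusing ab_significance() from the sketch above:
p = ab_significance(88, 5_000,   # with recipes: 88 conversions / 5,000 sends
                    121, 5_000)  # without recipes: 121 conversions / 5,000 sends
print(f"p-value: {p:.4f}")       # ~0.02 -> the "without recipes" lift holds up
```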

Good A/B Tests For Flows

1) New Customer Acquisition

  • Discount/No discount on pop-up

  • $ vs % OFF

  • Generic vs dynamic code

2) Cart Recovery

  • Text vs HTML (we’ve found text based works better for our clients)

  • With discount vs no discount

  • Dynamic block placed as hero vs body content

3) Browse Recovery

  • Different journey for profiles with vs without placed order histories

  • Subject line: Statement vs question

  • With vs without dynamic block

4) New Customer Nurture

  • Additional discount right after placed order vs confirmation email

  • Sitewide offer vs bundle/special offer

  • Review with vs without offer

Quick & Dirty A/B Testing Checklist

We use this for every brand we have under our wing; feel free to use it as a launching pad.

Does your brand convert better with…

✅ Klaviyo dynamic product blocks or custom product block images
✅ GIF or static image
✅ GIF or static CTA
✅ Model photo or product photo (especially relevant for beauty brands)
✅ Without price or slashed-out pricing (especially relevant for brands selling high-value items)
✅ SPU or MPU
✅ Best sending time (using Klaviyo’s send time testing feature)
✅ $ or % discount

And that’s a wrap!

I could keep going, but I also want to make sure the info stays digestible.

Hope you found this episode helpful 🙂 

If you have any other ways you’re running your A/B tests, send them my way (I’d love to read about them).

Cheers.

Mira Lastari