The Big Webinar Case Study

What happens when you split test EVERYTHING in a page?

Not one button, not a subtle iteration of the headline – the whole thing.

If split-testing normally calls for a tweezers-and-magnifying-glass approach… what would happen if you took a wrecking ball to it?

A round of sugary energy drinks and many crackings of knuckles later, we were ready to find out.

Many moons ago, our fearless founder Andy was starring in a Q&A webinar, and we had a nice shiny signup page ready to go.

So, of course, we made another one, and made them fight. (Don’t worry – Andy wasn’t involved.)

Methodology

Simple, really:

  1. Create a new signup page and change literally everything about it.*
  2. Throw some gooey-in-the-centre warm traffic at both pages, and see what happens.

* We changed everything but the time, date and signup link to the webinar. That would be silly.
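If you want to run something similar, the traffic split itself is usually done with deterministic bucketing, so a returning visitor always lands on the same page. Here's a minimal sketch (the function name and the use of an MD5 hash are our illustration, not how any particular platform actually does it):

```python
import hashlib

def assign_variant(visitor_id: str, variants=("apollo", "rocky")) -> str:
    """Deterministically bucket a visitor into one of the page variants.

    Hashing the visitor ID gives a stable, roughly even split without
    needing to store any per-visitor state.
    """
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same visitor always gets the same page on repeat visits.
print(assign_variant("visitor-42"))
```

Random assignment per page view works too, but stable bucketing stops one person from seeing both contenders and muddying your numbers.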

If our other case studies were nutritious, intellectual green smoothies, this split test would be a dirty beefburger oozing with plastic cheese: everything on it. No holds barred.

Let’s meet our contenders:

The Sign Up Pages

Page 1 (let’s call him Apollo) was a tried-and-tested approach: colourful background, plain text, centre justification, and a little cartoon Andy.

What’s important to note about this page is that it’s on-brand: we’d done Q&A webinars before, and all the signup pages looked like Apollo, from the colours to the layout and copy.

Page 2 (Rocky), once we’d hacked it about, looked quite different.

In the background, a man walks alone on a salt flat, searching for answers. A bold headline simply asks: Questions? The audience is encouraged not to face them alone.

With a pared-down aesthetic, simple sign up form and shorter copy, this page looked, felt and read totally different. And, if you’ve been paying attention, you’ll know background video converts at a much higher rate than a static image – we proved it.

But would it work for us this time?

The Results

…Were very interesting.

In a shocking turn of events, not a lot happened.

Which is actually a huge deal.

Here are the numbers in their raw, crunchy form:

Apollo: 85 hits, 66 conversions (78%)

Rocky: 74 hits, 50 conversions (68%)

After making a ritualistic sacrifice to the gods of Excel, our data priest decreed this was not significant at the 95% level.
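For the curious, here's roughly what that ritualistic sacrifice boils down to. This is a sketch assuming a standard two-sided two-proportion z-test on the raw counts above – the post doesn't say exactly which test was run, so treat it as one reasonable way to check:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test on raw conversion counts."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pool the two samples to estimate the shared conversion rate
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal CDF via the error function, so no SciPy is needed
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Apollo: 66 conversions from 85 hits; Rocky: 50 from 74
z, p = two_proportion_z(66, 85, 50, 74)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With samples this small, a 10-point gap in conversion rate sits well within the range you'd expect from chance alone.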

But heck, look at those numbers. What on earth is going on?

Two very important things we must remember: number one, everything on the page changed. No image, no font, no copy stayed the same – apart from the form fields, and the time and date of the webinar. These were two very different contenders.

Number two, our traffic was red hot.

A Q&A webinar isn’t everyone’s cup of tea, but for us here at Convertri and our users, it’s just the Darjeeling to get them out of bed in the morning. Questions are answered, banter is had, we exchange a few laughs, and it’s generally a really nice day out for everyone.

But if you don’t like sales funnels, fast pages, banter or tea? You’d be better off giving this one a miss.

So, we only promoted the webinar to people we knew would love to come – everyone likes to think they have warm traffic, but ours was nicely toasted.

This means, to some degree, they were more than likely to be ready to sign up before they even clicked on the page: the colours, layouts, fonts, copy… none of it actually mattered.

The second possibility is… the two pages were simply evenly matched. Like either side of a chocolate Digestive: neither side is prettier than the other, and the whole thing is still a delicious biscuit.

And what was with that 10-point difference in conversion rates? Did our audience simply resonate more with the page that looked more like the signup pages for past Convertri webinars? Did we go too off-brand – even though we’d warmed up our traffic enough that it only dropped 5%?

And why did the background video not work this time, when it caused a pretty huge leap in conversions on a previous split test?

We will never know.

But we do know this: split testing every variable at once probably isn’t the best idea in the world.

What’s Next?

Overall, for a dramatic split test with very un-dramatic results, many lessons were learned: testing every variable makes the data murky, traffic temperature is a force to be reckoned with, and video isn’t always the best course of action.

As always, if you’re cooking up your own split tests and experience frightening, startling or mildly tea-splurting results – we’d love to hear from you. Have you split tested your pages? What results did you get? Leave a note in the comments!