Split Down the Middle: 5 Reasons Why Attest Beats Facebook's A/B (Split) Testing Feature

November 06, 2018 - 7 minute read

An A/B test, otherwise known as a split test, is fast becoming a vital step in the marketing process for today’s digital marketers. You (hopefully) wouldn’t launch a new advert on Facebook without a killer headline, but unless you’re testing the headline, how do you know it’ll stand out, won’t alienate your audience, and will drive sales?

While it’s near-impossible to pinpoint the value of an A/B test until it’s completed (you don’t know, until you know, after all), it’s universally accepted that optimising adverts to drive up Return on Investment (ROI) is a seriously good use of marketers’ time.

Testing improves ROI by indicating the creative elements - across the full advert, from body text to headline, image or video content to colour scheme - that resonate most with the consumers most likely to buy the advertised product. Marketing teams can harness this information to optimise current and future campaigns.

Facebook’s split testing function was launched in November 2017, to allow those advertising on the platform to A/B test some of an advert’s elements, and subsequently push the best-performing adverts further with target audiences. Facebook allows brands to test creatives, delivery optimisation, audience and placement, while giving the option to show adverts to target consumers based on location, age, gender and other demographic details.

For the 93% of social marketers who use Facebook paid adverts regularly, this addition to the advertising process has helped ensure their adverts are designed to be effective. But with so many brands utilising Facebook adverts, Facebook’s own split testing facility isn’t the best option for the rich insight that will truly set your adverts ahead of competitors’.

If your brand is utilising Facebook to advertise to consumers, read on to discover the added value Attest’s scalable intelligence platform can offer to ensure your social media adverts are performing at their best.


There’s no second chance to make a first impression

Facebook split test adverts, even in the early development stage in which A/B testing is most useful, are live. Real consumers, interacting with Facebook in their real lives, are seeing your adverts for up to 14 days. After the handful of days it takes to reach statistical significance, the under-performing campaigns are scaled back or cut entirely, and notes can be taken to help improve future campaigns.
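To make "reaching statistical significance" concrete: the standard way to compare two ad variants' click rates is a two-proportion z-test. The sketch below uses only hypothetical numbers (the click and view counts are invented for illustration) and the textbook formula, not Facebook's or Attest's actual internal method.

```python
import math

def two_proportion_z_test(clicks_a, views_a, clicks_b, views_b):
    """Two-sided z-test: do variants A and B have different click rates?"""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled click rate under the null hypothesis of no real difference
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: variant A got 120 clicks from 4,000 views,
# variant B got 90 clicks from 4,000 views.
z, p = two_proportion_z_test(120, 4000, 90, 4000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these made-up numbers the difference is significant at the 5% level; with smaller samples the same observed rates would not be, which is why live tests have to keep spending impressions until enough data accumulates.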

But you’ve still made a first impression on thousands of consumers with underperforming campaigns, and wasted those opportunities. Those Facebook consumers don’t know you’re working on improving your adverts; they think they’re witnessing the fruits of hours of optimisation labour.

With Attest, on the other hand, respondents have opted in to actively help refine your creatives. The responses are less native, but far more considered, honest and less random.

Another risk when optimising live adverts, especially on a social platform like Facebook, is the very thing your brand is likely aiming for: virality. Though the chances are perhaps slim, depending on the nature of your content, one advert can (for any number of reasons, unfortunately including very negative reasons) explode in likes and shares as it goes viral. While your key metrics, such as Cost Per Acquisition, might not jump up if it’s not the best-performing advert, your sub-optimal advert will be getting in front of plenty of people.

Dove learnt this lesson the hard way with a Facebook ad test which was deemed by consumers to be racist. Had they pre-tested this advert before setting it live, they might have avoided this negative reaction.

Unfortunately, you’ve wasted your vital first impression on a less-than-ideal advert. Working on improvements prior to launching in front of your key Facebook audience, with the Attest survey platform, can seriously help your brand save face.


The right consumers

A/B testing audience segments means spending marketing budget on reaching consumers who might have no interest in your brand or product, but the potential discovery of a new niche can make the investment worthwhile.

However, if you are making the effort to speak to people who aren’t intending to buy, you should at least maximise the opportunity. By keeping the audience reach wide with an Attest survey, you can still allow for unexpected pockets of interest; further, you can route those uninterested in your product towards questions that probe why they feel that way, and perhaps learn how to attract them in the future.

If, on the other hand, you already have a defined audience, Attest can collect statistically significant results from these consumers in a shorter turnaround time than Facebook can offer. By only offering non-overlapping audiences, Facebook divides your (potentially already small) target market into even smaller, statistically insignificant, audiences. While exclusive audiences are an option using Attest, if your target market is small you can still achieve statistically robust results by showing multiple creatives to the same audience and asking them to directly compare and contrast them.
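The cost of splitting a small market into non-overlapping groups can be seen in the margin of error on each variant's measured click rate. This is a minimal sketch with an invented audience size and click rate, using the standard 95% confidence interval formula for a proportion:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error on an observed rate p measured from n people."""
    return z * math.sqrt(p * (1 - p) / n)

audience = 2000      # hypothetical total reachable target market
click_rate = 0.05    # hypothetical observed click rate

# One shared audience vs. splitting into non-overlapping halves/quarters
for variants in (1, 2, 4):
    n = audience // variants
    moe = margin_of_error(click_rate, n)
    print(f"{variants} variant(s): n = {n}, rate = 5.0% +/- {moe * 100:.1f}%")
```

Quartering the audience roughly doubles the uncertainty on each variant, which is why showing all creatives to one shared audience can yield usable results where exclusive splits cannot.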

With Attest, you can send your surveys to exactly the consumers you wish to speak to, choosing from up to 16 demographic filters. Additionally, you can utilise qualifying questions to screen in consumers with certain behaviours and attitudes that can’t be highlighted demographically - for instance, consumers who holiday at least once a year, or those who prefer dark chocolate to milk chocolate.


Stability, significance and sustainability

During the relatively young lifetime of Facebook’s split testing feature, the algorithms used have already shifted dramatically. The social media platform uses algorithms to measure chosen metrics, and pushes the adverts that are performing best, while those under-performing fall further behind by receiving fewer and fewer eyeballs. Some marketers say that this can lead to the under-testing of some variants of the advert, dragging down the statistical significance of the results, or forcing a longer wait in order to reach significance.

Shifting algorithms essentially move the goalposts, making comparison between different independent tests, necessary to understand whether new adverts are moving in the right direction, at best difficult and at worst false.

If your brand used Facebook’s split testing to create the best version of campaign X in 2017, and now want to test campaign Y, but the algorithms have changed, can you be sure that equal metrics really reflect equal sentiments in consumers?

“Fine”, you’re thinking, “just test the new adverts alongside the old ones!” No such luck. Your 2017 adverts are likely fatigued, so results showing your 2018 adverts performing level with them actually mean the new adverts are weaker than the old ones were at their peak; they’ve only matched the fatigued level of success, not the peak of success.

With Attest’s scalable intelligence platform, we rely on genuine consumer data, with questions, audiences and quotas chosen by the surveying brand, not algorithms. This ensures the stability and comparability of results over time, and ensures you can reach statistical significance in record time. Attest surveys can be easily duplicated and re-run, so you can iterate at speed.


Thinking beyond the newsfeed

With a user base larger than the population of China, it’s no surprise Facebook is a go-to marketing channel for brands from around the world. But it’s not the only channel on which your brand should be advertising.

A multi-touch-point approach to media buying can be the key to encouraging consumers through to sale by repeatedly popping into their minds. As such, limiting your A/B tests to Facebook will optimise your adverts for just one channel of the many in use.

Consumer intelligence surveys using Attest can be set up to screen in consumers who interact with your adverts at any, or many, of your key touch-points, and allow you to cross-tab results to dig deeper into the elements that work across platforms, not just within one. This contributes to a more cohesive, recognisable brand identity for each consumer exposed to your marketing strategy.


Which advert succeeds and why

Facebook’s own guidelines advise testing one element at a time, to understand which change is responsible for moving the needle. This is because split testing on Facebook is limited to answering ‘which is the best advert?’; it can’t answer why it’s the best. With Attest’s free text question type - along with many other question types to choose from - brands can dig far deeper into the many elements of each advert, and uncover useful insight.

With Facebook there’s no answer to which advert is objectively successful, only which performs better than the others in the ad set. Advert Y might receive more clicks than advert X, but perhaps advert Z would receive twice as many clicks still. With a free text option, such as Attest offers, you can see whether the sentiments surrounding the best-performing advert are genuinely positive, or whether there is still room for improvement.

The richness of the data and insight achievable through the Attest platform far out-strips that of the Facebook split testing facility. Take this example:

Your travel brand is advertising summer holidays. You’re testing an advert featuring a sun-drenched Mallorcan beach, versus an image of downtown Barcelona. Your Facebook split test might indicate that you should opt for the Mallorcan imagery, since it receives statistically higher click rates in the 4 days of testing. What Facebook can’t tell you, and what you can’t hope to learn without further understanding the reasons why consumers are clicking your advert, is that your target audience has been captivated by the Love Island live final, filmed on, you guessed it, Mallorca, a few days earlier. Had you had the option to ask why that image was preferred, you’d have been alerted to the presence of a skewing, temporary trend.

The depth of understanding gifted by consumer intelligence, over Facebook’s split testing, gives genuine learnings to be applied to future campaign development. Without knowing why an advert performs well, it can be near-impossible to recreate the success in future instances. And should unexpected results appear, you can easily follow up with the respondents who offered interesting answers, to get a clear view of their reasoning.


The next steps

Running A/B tests with Attest provides you with a wealth of data, more quickly and economically than running them exclusively with Facebook.

Further explore exactly how you can go about split testing with Attest in this article detailing 5 creative tests you could embark on today.

Alternatively, get in touch with a member of the team to begin your journey towards more thoroughly testing, understanding and optimising your adverts.
