Facebook A/B Testing - How to Do it Right
A/B testing, also known as split testing, is a powerful tool for optimizing your Facebook presence. You compare two ad versions, "A" and "B", to see which one converts more leads. From there, you can keep iterating to refine and perfect your campaigns.
But why do so many Facebook advertisers either get it wrong or ignore it entirely?
There are several reasons.
Sometimes, advertisers get frustrated when they don't see an immediate payoff. Other times, they ignore statistical significance, declaring a winning variant prematurely or misattributing success to an ineffective element.
In this Facebook A/B testing guide, we’ll explore the best practices and strategies to ensure accurate and effective A/B testing on the platform.
- Why advertisers should test their Facebook ads
- What Facebook ad components can you test?
- 3 levels of Facebook A/B Testing
- Which Facebook metrics should you track?
- How to analyze your Facebook A/B testing results
- How much should you spend on A/B testing
- Step-by-step guide to A/B testing your Facebook ads
- 11 mistakes you need to avoid when A/B testing
Why You Need to Incorporate Facebook A/B Testing
Thinking about A/B testing? Then you probably have a good idea of who your audience is. But let's face it - unless you've got psychic powers, you won't know exactly what resonates with them.
That's where experimentation or Facebook A/B testing comes in.
Believe it or not, the tiniest tweak like changing your text overlay’s position or its color could give your ad a big performance boost. It's all in the details!
Facebook A/B testing essentially helps you weed out ineffective ad components, maximizing your Facebook ad spend. You'll see which ads work and which ones you should ditch.
Additionally, split testing helps to remove personal opinions or "gut feelings" out of the equation. Even industry experts have their own biases, believing that they know what's best for a given campaign. With A/B testing, you can disprove these assumptions without relying on your own intuition or personal preference.
Below are the main reasons why you should incorporate Facebook A/B testing:
- Informed Decisions: Because A/B testing is data-driven, your decision-making becomes fact-based.
- Efficient Budgeting: It helps maximize your ad spend by identifying what works best.
- Enhanced Engagement: By understanding your audience, you create more effective, engaging ads.
- Competitive Advantage: With deeper insights, you stay ahead of competitors who rely on intuition alone.
- Continuous Improvement: A/B testing is a process of constant learning and refining.
What Facebook Ad Components You Should A/B Test
The beauty of Facebook A/B testing is that you can test virtually any ad component. You can experiment with different variations of the same component, e.g., changing an image's color, or the overlay text's size, position, or style.
Below are some of the components you can A/B test for your ads:
- Ad formats, e.g., carousel, single image, video
- Call to action, e.g., 'Learn More', 'Order Today'
- Ad placement
- Delivery optimization
- Targeting criteria
What’s more, you can test multiple ad components in one go by creating multiple ad sets. For example, you can test different images, headlines and calls to action in a single experiment.
Once your Facebook ads are submitted, reviewed and approved, you’ll get the results in a few days. However, Facebook may start optimizing your ad delivery, ensuring ads that receive the highest CTRs get the most visibility. The problem is that Facebook may do this too quickly, not giving you time to collect enough data.
3 Levels of Facebook A/B Testing
When performing A/B testing, Facebook advertisers can choose from three levels of testing.
Campaign level: This involves testing different objectives, ad sets and budgets to find out what works best for each campaign.
Ad Set level: Now we delve into more granular testing. This involves comparing target audiences, placements and delivery optimization strategies to see which performs best.
Ad level: This is where we can test the intricacies of an ad. This includes testing different images, headlines, descriptions and calls to action.
Each level presents different opportunities for optimization, so it’s important to understand them all. Let's go through each to understand how to make the most out of your Facebook A/B tests.
Campaign Level Testing
Different campaign objectives provide different ad templates.
For example, the Sales objective lets you select specific conversion locations (e.g., website, calls, or app) and ad formats such as carousel, single image, video, and collection. Lead generation, meanwhile, offers instant forms and the option for leads to message you through Instagram.
Campaign objectives also determine bidding options and delivery optimization. If your goal is ‘conversions’, then Facebook can use automated bid strategies and delivery optimization options like lowest cost, cost per result or target cost per acquisition.
By testing different campaign objectives, you can work out which is most effective for the goals of your own Facebook campaigns.
Ad set level testing
Now we start looking into more granular testing.
Advertisers can now test campaign elements, such as:
- Audience: Different audience targeting, including demographics, interests and behaviors.
- Placements: Where your ad is shown, e.g., Facebook, Messenger, Instagram and Meta Audience Network.
- Advantage campaign budget: An option that lets Facebook allocate more funds to your best performing ad sets.
- Delivery optimization: How Facebook should optimize delivery, e.g., based on impressions, clicks or conversions.
When testing ad sets, your goal is to see which placements, audiences and optimization strategies work best for each ad set. As such, be sure to keep your ads' creative and messaging consistent across each ad set.
You can also split test the 'Advantage campaign budget' feature to see whether letting Facebook auto-allocate more budget to your best performing ad sets improves overall performance. The idea here is that if an ad set is doing well, it might do even better with more funds funneled its way.
Ad level testing
The last step is ad-level testing.
Here, we get into the nitty gritty of the creative elements of our ads.
Advertisers can test different creative configurations, such as:
- Image or video - see which resonates with your audience and leads to more clicks or conversions.
- Headline - try different headlines on the same ad.
- Call to action - see what call to action works best.
- Copy - use different copy in each ad.
When testing ads, the number one rule is this: only test variables at the ad level, not at the campaign or ad set level. Your goal is to isolate your ad's creative elements and see which performs better. It's also recommended to test just one or two variables at a time so you can clearly attribute any performance differences.
Facebook also gives you the Advantage+ placements option, where it chooses placements most likely to work best. You can also try testing placements manually if you believe most of your target audience is primarily using a specific platform, like Instagram over Facebook, or vice versa.
What Facebook Metrics to Track When Split Testing
Once your A/B test is underway, there are some important Facebook metrics to track during the process. These include:
- Impressions: The number of times your ad is shown.
- Reach: The number of people who saw your ad at least once.
- Link clicks: The number of clicks on your ads’ links and calls to action.
- Cost per click (CPC): How much you pay for each link click.
- Conversion rate: The percentage of people who completed an action after clicking, like downloading an app or signing up for a newsletter.
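As a quick illustration of how these metrics relate to the raw numbers Facebook reports, here's a minimal sketch. All the counts below are made up for the example:

```python
# Hypothetical raw numbers for one ad variation.
impressions = 10_000   # times the ad was shown
link_clicks = 250      # clicks on the ad's links
spend = 125.00         # total ad spend in dollars
conversions = 20       # completed actions (e.g., newsletter sign-ups)

ctr = link_clicks / impressions              # click-through rate
cpc = spend / link_clicks                    # cost per click
conversion_rate = conversions / link_clicks  # conversions per click

print(f"CTR: {ctr:.2%}")                          # CTR: 2.50%
print(f"CPC: ${cpc:.2f}")                         # CPC: $0.50
print(f"Conversion rate: {conversion_rate:.2%}")  # Conversion rate: 8.00%
```

Tracking the same handful of ratios for every variation makes side-by-side comparison much easier later.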
Once the Facebook algorithm kicks in (around 24 hours), you can start to evaluate each variable’s performance. However, you'll still need to wait longer to collect enough statistically significant results. Facebook also takes a little time to optimize its delivery, also known as its learning phase, so you'll need to wait even longer for the best results.
In addition to tracking these metrics, it's essential to stay informed about successful ad strategies and gain inspiration from industry trends. One valuable resource for advertisers is the Facebook Ad Library. By exploring the Ad Library, which provides a vast collection of ads from various advertisers, you can gather insights, analyze competitor strategies, and refine your own campaigns for optimal performance.
How to Analyze A/B Test Results
When it comes to A/B testing, the more data you collect, the better. However, there isn't a hard-and-fast rule for how much data you need before drawing conclusions.
Ideally, you should wait until the difference between the two variations is statistically significant. Significance depends on both the size of the gap and the amount of data collected: a large gap, say one variation with an average click-through rate (CTR) of 2.5% and another at 10%, will reach significance far sooner than a marginal one.
Additionally, a good rule of thumb is to wait until each variation receives at least 300-500 ad clicks and at least 10,000 impressions. You'll then have an adequate amount of data to make an informed decision about which ad variation is most effective.
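If you'd rather not eyeball significance, you can run a standard two-proportion z-test on the click counts. This is a generic statistical sketch, not a Facebook feature, and the numbers below are invented for illustration:

```python
from math import sqrt, erf

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Z-test for the difference between two CTRs (proportions)."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variation A: 2.5% CTR, variation B: 10% CTR, 2,000 impressions each.
z, p = two_proportion_z(50, 2000, 200, 2000)
print(f"z = {z:.2f}, p = {p:.6f}")  # a gap this large is highly significant
```

As a rule of thumb, |z| above 1.96 (p below 0.05) means the difference is unlikely to be random noise.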
CTRs don’t tell the whole story
Unless you're running a brand awareness campaign, which measures performance by impressions, you should also look at post-click engagement metrics. Remember that clicks aren't the same as conversions, and CTR alone may not be enough to judge an ad variation's performance.
If your goal is sales, find out how much each variation has earned you in terms of revenue. From there, you can calculate your cost per sale by dividing the amount paid for your ads by the total number of sales.
If you're looking to optimize for conversions or leads rather than revenue, look at the cost-per-acquisition (CPA). It's the amount you pay for each conversion and is an important metric to consider when optimizing your ad campaigns.
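Cost per sale and CPA are both simple ratios of spend to outcomes. Here's a sketch with made-up numbers for one variation:

```python
# Hypothetical results for one ad variation.
ad_spend = 400.00   # total paid for the ads
revenue = 1_200.00  # revenue attributed to this variation
sales = 40          # purchases
conversions = 80    # leads / sign-ups

cost_per_sale = ad_spend / sales    # spend divided by sales
cpa = ad_spend / conversions        # cost per acquisition (per lead)
roas = revenue / ad_spend           # related metric: return on ad spend

print(cost_per_sale, cpa, roas)  # 10.0 5.0 3.0
```

Computing these per variation lets you compare what each version actually costs you per outcome, not just per click.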
Determining the Right Budget for Your A/B tests
When planning out the budget for your A/B test, it's tricky to know how much you should spend. You don't want to overspend and blow your budget, but you still need enough data to make informed decisions.
If you're stuck for ideas, follow this tried-and-tested method for budgeting your A/B tests:
Let's say you've created two ads with different ad creative: Advert "A" and Advert "B". Then do the following:
- Calculate your average cost per conversion. You can do this by checking the average conversion cost from your previous campaigns.
- Determine how many conversions are adequate for further analysis, e.g., 500.
- Multiply your average cost per conversion by the number of conversions needed for adequate analysis.
So for example:
Average conversion cost: $1.00
Number of conversions for further analysis: 500
$1.00 x 500 = $500
Your budget should be $500.00 for each ad.
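The calculation above is easy to wrap in a small helper so you can rerun it with your own numbers (the function name is our own, for illustration):

```python
def ab_test_budget(avg_cost_per_conversion, target_conversions, num_variants=2):
    """Budget needed per ad, plus the total across all variants."""
    per_ad = avg_cost_per_conversion * target_conversions
    return per_ad, per_ad * num_variants

# $1.00 average conversion cost, 500 conversions per variation, ads A and B.
per_ad, total = ab_test_budget(1.00, 500)
print(per_ad, total)  # 500.0 per ad, 1000.0 total for the two-ad test
```

Note that the total for the whole test is the per-ad figure times the number of variations, so adding a third variant raises the overall budget accordingly.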
Finally, remember to allocate a certain amount of your budget for post-testing analysis and optimization. Once your tests are concluded, it takes time to properly dissect and analyze the data. Dedicating a portion of your budget to this step will ensure you make informed decisions based on reliable data.
How to Start A/B Testing Your Facebook Ads
Ok, let's get down to business. A/B testing requires a few simple steps that you will need to repeat for both of the ads you are testing:
- Go to Facebook Ads Manager
Log in to your Facebook account and click on 'Ads Manager'. You'll then be directed to your campaigns page.
You'll need an actively running campaign before you can proceed to A/B testing. However, this can be done during the campaign creation process. Toggle the “Create A/B Test” switch at the campaign level and go on to publish it. Facebook will then ask you to create an identical campaign for the second ad you are testing.
If you're not creating a new ad, follow these steps:
- Select A/B test in your dashboard
If you have a published campaign you wish to test, go to your campaigns page/dashboard. Choose the campaign you want to test and click on A/B testing in your toolbar.
- Press "Get Started" and choose your test goals
After you click A/B testing, a popup window will appear. Press 'Get Started' to begin.
- Choose your A/B testing campaign and variables
You can now choose to either duplicate an existing ad or compare it with up to 4 other campaigns. If you choose a duplicate, pick the variable and key metric you want to test; if you choose a comparison, you'll be asked to select the campaigns. Variables include:
- Creative - messaging, imagery, and videos
- Audience - age/gender targeting, interests, etc.
- Placement - let Facebook automatically distribute your ad or manually set the placements.
Users can really hone in on their preferred key metric as Facebook provides a wide range of options. Some of these include: Cost per result, Cost per click, and Cost per 1,000 Account Center accounts reached. You can even go deeper and analyze metrics like Cost per Page like or follower, cost per add to cart and so on.
Once you've chosen the variable and key metric, press 'Duplicate Ad Set'. You can then alter your duplicated ad's details, such as budget, schedule and other settings. Just ensure you don't change too much, as it can disrupt the test's accuracy.
Now you’re ready to begin your A/B testing!
Common Pitfalls to Avoid when Testing Your Facebook Ads
Before you dive right into your A/B testing, there are a few pitfalls you should avoid.
Let's go through the most important ones:
Mistake 1 - Winging it with your hypothesis
When you're setting up an A/B test, it's like setting up a science experiment. You need a solid question you're trying to answer. Just randomly picking a question without thinking about your end goal is like trying to hit a bullseye with a blindfold on.
Let's say, for instance, a lot of people comment when you post cute dog pics. So, a good question for your A/B test could be:
"If we include cute dog pics in our ads, will more people click on it?"
This question makes sense because you can actually check it with your A/B test. If more people click on the ad with a cute dog pic, you'll know your hypothesis was right.
Now, here's a bad question:
"If we change the font of our ad text, will it get more clicks?"
I mean, sure, some people might dig Comic Sans over Arial, but is that really gonna make them click? Doubt it. Stick with questions that are relevant to what you're trying to achieve. And the more specific, the better.
Mistake 2 - Changing too many things at once
Running a Facebook ad test isn't the time to throw everything at the wall to see what sticks. Change and test one thing at a time to get clear results.
Let's say you have a Facebook ad with a t-shirt image, a "Summer Sale - 50% Off!" headline, and ad copy that says, "Revamp your wardrobe with our trendy clothes!"
Now, you want to make the ad perform better. You create a duplicate ad, and you change the t-shirt to a dress, the headline to "Buy Now, Look Fabulous Later!", and the ad copy to "Complete your look with our stylish clothes!"
Suddenly, your second ad does better - great! But was it the image, the headline, or the ad copy that made people click? You don't know.
That's why you should always test one variable at a time. In this scenario, you could first test the image, then the headline, and lastly, the ad copy. That way you'd know exactly which part of your ad made people click.
It's slower, but you'll know exactly what's working and what's not. This way, you can make informed decisions for future ad campaigns.
Mistake 3 - Not timing your test right
When you're conducting a Facebook ad test, there's an optimal amount of time to test each variable. It should be long enough for the ad to reach a sizable audience and short enough so you can get results quickly. Think of it as baking a cake - not enough time and it's still gooey, too much time and it's burnt.
Facebook recommends running your tests for a minimum of 7 days but no more than 30 days. If your test is too short, you'll have insufficient data to draw any meaningful conclusions. Furthermore, it does take some time to get leads or sales, so you don't want to cut it too short.
If it's too long, the cost of running a single test may be too high. This is especially prevalent if you're running multiple tests simultaneously and you're running out of budget.
Mistake 4 - Incorrect Facebook Campaign Structure
When orchestrating a Facebook ad test, you need to carefully consider your campaign's structure.
There are two prevalent strategies: either incorporate all your ad variations into one ad set or segregate each variation into individual ad sets.
If you adopt the first strategy, Facebook's algorithm will commence an automatic optimization process, picking favorites by allocating more resources to the top performers. This certainly saves time as you don't have to manually readjust your budget, but it distorts the results and stops all ads from reaching their full potential.
We recommend opting for the second strategy. By assigning each ad variation its own independent ad set, you can scrutinize the standalone performance of each variation. This strategy provides a more transparent, unbiased view of your results, contributing to more effective future campaign optimizations.
Mistake 5 - Not getting enough data to draw meaningful conclusions
We've already mentioned that you need a certain number of conversions or leads to make an informed decision, but the point is worth stressing once more.
You must accrue enough data before concluding your test. If you end the test too soon, you may not have sufficient information and you may end up with misleading results.
If you're comparing metrics such as ad clicks, aim to obtain a minimum of 300-500 ad clicks per variation. And for impressions, wait until you receive at least 10,000.
Additionally, you want to wait until you see a meaningful difference in performance between variations. If, for example, one ad has a clickthrough rate of 6% and the other 5%, that gap may simply be noise until you've collected a large sample, so it's not worth making a decision on it early.
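You can estimate up front how much data a small gap like 5% vs 6% would need before it could plausibly reach significance, using the standard sample-size formula for comparing two proportions. This sketch assumes 95% confidence and 80% power (the usual defaults):

```python
from math import ceil

def sample_size_per_variant(p1, p2, z_alpha=1.96, z_beta=0.8416):
    """Approximate impressions needed per variation to detect p1 vs p2 CTRs."""
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

n = sample_size_per_variant(0.05, 0.06)
print(n)  # a 1-point CTR gap needs roughly 8,000+ impressions per variation
```

In other words, the smaller the difference you're trying to detect, the more data each variation needs before you can call a winner with any confidence.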
Once you've gained enough data, analyze your results and use them to inform any future A/B tests.
Mistake 6 - Ignoring your audience
Sometimes, we get so focused on the ad itself that we forget about who we're showing it to. Remember, everyone is going to react to your ad in a slightly different way.
When conducting A/B testing, don't be afraid to experiment with different audiences. For example, you can have two ads that target the same region, but one targets men and the other women. You'll quickly find that an ad that flops with one group soars with another.
Mistake 7 - Overlooking ad placement
We all have our unique habits when exploring Facebook. Some of us like to scroll through our newsfeed, while others like to prioritize reels and stories.
You must be mindful of where your ads are being posted, as different placements have a noticeable impact on performance. Experiment with different placements - News Feed, right column, Stories, etc. - and see which spot gets your ad the most love. It's like choosing a location for your storefront, some spots just get more foot traffic.
Mistake 8 - Forgetting about mobile vs desktop
Today's digital world is increasingly mobile, but don't write off desktop users just yet. Run your tests for both desktop and mobile users. You might find surprising differences in how each group interacts with your ad. Remember, people use different devices in different contexts, and that can affect how they respond to your ad.
If you run a website, you can check your Google Analytics stats to get an idea of how much traffic is on mobile vs desktop. Use this data to inform your A/B tests and make sure you're reaching both desktop and mobile users.
Mistake 9 - Neglecting the importance of timing
When it comes to running ads, timing can be everything. You need to test different times of the day and days of the week to find out when your audience is most responsive. Just like your favorite restaurant has rush hours, your ads will have their own 'peak times' too.
Mistake 10 - Skipping over the ad frequency
There's a fine line between reminding your audience about your product and spamming them. That's why you need to test different ad frequencies.
Show your ad too little, and you're not creating a memorable brand impression. Show it too much, and you're annoying your audience.
Run tests to see how different frequencies affect user engagement. You can then begin to understand the optimal frequency to reach your audience without overreaching.
Mistake 11 - Giving up after a single A/B test
Let's be real, nobody hits a home run on their first swing. According to WordStream's Facebook Ads benchmarks, the average conversion rate for paid Facebook ads across all industries is about 9.21%, so anything around 10% or higher means you're doing better than most.
If you're not hitting that mark, however, don't panic. The whole point of A/B testing is to iterate and refine your approach. So if you don't hit the mark on your first test, that's ok. Keep refining your tests, and you should eventually find a strategy that works for your audience.
Create the Best Possible Ad Creative with MagicBrief
With our Chrome extension, you can effortlessly save competitor ads from Facebook, Instagram, and TikTok. You can then analyze, compare, and draw inspiration from these ads to create your own compelling campaigns.
Our Ad Library takes it a step further. It categorizes your saved ads by industry, giving you an organized and streamlined view for analysis.
So what are you waiting for?
Register today and try our 14-day free trial! You can dive into our features and see firsthand how we can transform your ad creation process.