Published on 7/31/2025

Solved: Meta Ads Structure for New Product Creative Testing


Hi there,

So we're launching a new product on our e-commerce store and we're trying to figure out the best setup in Meta Ads for testing different creatives. Right now I have about 7 videos and 3 photos that I want to test. What kind of structure would you recommend to make sure the test is done well?

I have a campaign in the works, but it's not launched yet:

  • CBO set at €100
  • 2 adsets, BROAD (18+, both genders, manual targeting)
  • 1 adset with all videos (7 video ads)
  • 1 adset with all photos (3 image ads)

Would you say this setup will work, or should I go with ABO instead? Like, should each creative have its own adset at €10/15 a day just for testing? And also, could a CBO with 2 adsets for images and videos interfere with one another?


Hi there,

Thanks for reaching out!

Happy to give you some initial thoughts and guidance on your Meta ads structure for creative testing. It's a really common question, and getting it right from the start can save you a lot of money and time. It's good that you're putting so much thought into the testing phase before you even launch; that already puts you ahead of many others.

Your current idea for a structure is a decent starting point, but I think we can refine it quite a bit to get you much clearer, more reliable data on which of your 7 videos and 3 images are actually the winners. Let's walk through it.

I'd say you need a more granular testing setup...

Right, so your proposed campaign: CBO at €100 with two adsets, one for all videos and one for all images, both targeting broad.

The main issue here is how CBO (Campaign Budget Optimisation) works. When you set the budget at the campaign level, Meta's algorithm will try to get you the most results for the lowest cost. It's smart, but in a testing scenario its efficiency can actually work against you. What will likely happen is that Meta will quickly decide which of your two adsets (video or image) it 'likes' more and start funnelling the majority of the €100 budget into that one. The other adset will barely get any spend, so you won't get a fair test between videos and images as formats.

It gets worse inside the adsets themselves. Within the adset that gets the budget (let's say it's the video one), the same thing will happen. Meta will pick one or two of your 7 videos that it thinks will perform best and give them most of the spend. The other five videos might get a few impressions here and there, but not enough to give you any real idea if they could have been winners. You'll end the test thinking 'Video X is the best', when actually Video Y just never got a proper chance to be seen.

You also mentioned a potential conflict between the adsets. It's not so much a conflict as a competition for the budget, and CBO is designed to pick a winner very quickly. For scaling, this is brilliant. For testing, it's a bit of a nightmare, because you need to force the budget to be distributed evenly to get clean data. You need to know not just which creative wins, but why and by how much. The proposed CBO structure just won't give you that clarity.

Basically, you'll end up with skewed results that are based on Meta's initial, rapid-fire optimisation rather than a true, controlled test of each individual creative asset. We need to isolate the variables, and your current structure lumps too many variables together.
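To make that skew concrete, here's a toy simulation in Python. It is not Meta's actual allocation logic, just made-up numbers that mimic a winner-take-most split, contrasted with the even split that ABO forces:

```python
# Toy simulation (NOT Meta's real algorithm): contrast a CBO-style
# "winner-take-most" budget split with ABO's forced even split.
# All numbers are invented purely for illustration.
import random

random.seed(42)
creatives = [f"video_{i}" for i in range(1, 8)] + [f"image_{i}" for i in range(1, 4)]
daily_budget = 100.0  # EUR

# Pretend the algorithm forms a noisy early opinion of each creative,
# then weights spend heavily toward the apparent leaders.
early_scores = {c: random.uniform(0.5, 1.5) for c in creatives}
total = sum(s ** 6 for s in early_scores.values())  # exaggerate the skew
cbo_spend = {c: daily_budget * (early_scores[c] ** 6) / total for c in creatives}

abo_spend = {c: daily_budget / len(creatives) for c in creatives}

for c in creatives:
    print(f"{c:9s}  CBO: €{cbo_spend[c]:5.2f}   ABO: €{abo_spend[c]:5.2f}")
# Under the CBO-style split, most creatives end the day with a euro or
# two of spend -- far too little to judge them on. Under ABO, every
# creative gets a clean €10.
```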

You'll need to think about ABO vs CBO for testing...

This leads us to the ABO (Ad Set Budget Optimisation) vs CBO debate. As a general rule of thumb that I've seen work time and time again: use ABO for testing, and CBO for scaling.

With ABO, you set the budget at the ad set level. This gives you complete control over how much you spend on testing each element. It forces an even spend, which is exactly what we want in a test. It might seem less efficient on the surface, and your overall CPA might be higher during the test phase, but the quality of the data you get is far superior and will save you money in the long run.

So, how should we structure this? There are a couple of ways to approach it, depending on how rigorous you want to be.

Method 1: The Adset-Level Test

This is a solid middle-ground. You'd create one campaign using ABO.

  • Campaign: [PRODUCT] - Creative Test - ABO

Inside this campaign, you'd create your adsets. For a new product, we need to test audiences as well as creatives. The best creative in the world will fail if you show it to the wrong people. So let's pick 2 or 3 distinct audiences to start with. These should be ToFu (Top of Funnel) audiences based on interests. For an e-commerce product, think about magazines your audience reads, brands they like, influencers they follow, etc. Let's say we pick 'Audience A' and 'Audience B'.

Your structure would look like this:

  • Adset 1: Audience A - €20/day budget (ABO)
    • Ad 1: Image 1
    • Ad 2: Image 2
    • Ad 3: Image 3
    • Ad 4: Video 1
    • ...and so on for all 10 creatives.
  • Adset 2: Audience B - €20/day budget (ABO)
    • Ad 1: Image 1
    • Ad 2: Image 2
    • ...and so on for all 10 creatives.

In this setup, each audience is tested with all creatives. You'll start to see patterns. Maybe videos work best with Audience A, but images are better for Audience B. Or maybe Creative X is a clear winner across both audiences. It's a good approach, but it still has a slight version of the CBO problem: within each adset, Meta will still favour certain creatives over others. It's better than your original plan, but not perfect.
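If it helps to see Method 1 as a data structure before building it in Ads Manager, here's a plain-Python sketch of the layout. The names, budgets, and creative filenames are placeholders mirroring the outline above:

```python
# A minimal sketch of the Method 1 layout as plain data -- names,
# budgets, and filenames are placeholders, not real assets.
creatives = [f"video_{i}.mp4" for i in range(1, 8)] + [f"image_{i}.jpg" for i in range(1, 4)]

campaign = {
    "name": "[PRODUCT] - Creative Test - ABO",
    "budget_optimisation": "ABO",
    "adsets": [
        {
            "name": f"{audience} - Creative Test",
            "daily_budget_eur": 20,
            "audience": audience,
            "ads": list(creatives),  # every audience sees all 10 creatives
        }
        for audience in ("Audience A", "Audience B")
    ],
}

for adset in campaign["adsets"]:
    print(f"{adset['name']} -> {len(adset['ads'])} ads at €{adset['daily_budget_eur']}/day")
```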

Method 2: The Pure Creative Test (My Recommended Approach)

This is the most scientifically sound way to do it. It's more work to set up, but the data is squeaky clean. This structure isolates every single creative so there's no doubt about what's working.

The structure is one creative per adset. Yes, that means 10 adsets.

  • Campaign: [PRODUCT] - Creative Test - ABO
  • Adset 1: Creative 1 (Video) - €10/day budget - Targeting Audience A
    • Ad 1: Video 1
  • Adset 2: Creative 2 (Video) - €10/day budget - Targeting Audience A
    • Ad 1: Video 2
  • ...and so on for all 10 creatives, each in their own adset, all with the same budget and same audience.

Why is this better? Because you've fixed the budget and the audience, the *only* variable left is the creative itself. If Adset 1 gets a €5 CPA and Adset 2 gets a €20 CPA, you can be confident that Video 1 is roughly four times more cost-effective than Video 2 with this audience. There's no guesswork. You're not relying on Meta's algorithm to distribute spend within an adset; you're forcing an even split. That gives you evidence you can act on.

Your total budget would be 10 adsets x €10/day = €100/day, the same as your proposed CBO. But the results will be a world apart in terms of clarity. After you've run this test on Audience A, you can duplicate the campaign and run it again on Audience B to see if the results hold up. I remember working with a women's apparel brand, and using this exact method, we achieved a 691% return simply by being this rigorous with our initial creative testing. We found that one particular style of user-generated video outperformed studio-shot images by a factor of 5, something we would never have discovered with a lumped-together test.
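For anyone who'd rather build this via the API than click it together in Ads Manager, here's a minimal sketch using Meta's facebook-business Python SDK. The token, account ID, pixel ID, and targeting spec are placeholders, the field names reflect the Marketing API at the time of writing and may change, and you'd still create one ad inside each adset afterwards:

```python
# Minimal sketch of Method 2 via the facebook-business Python SDK.
# Everything in angle brackets is a placeholder. Everything is created
# PAUSED so nothing spends until you launch it deliberately.
from facebook_business.api import FacebookAdsApi
from facebook_business.adobjects.adaccount import AdAccount

FacebookAdsApi.init(access_token="<ACCESS_TOKEN>")
account = AdAccount("act_<AD_ACCOUNT_ID>")

campaign = account.create_campaign(params={
    "name": "[PRODUCT] - Creative Test - ABO",
    "objective": "OUTCOME_SALES",
    "special_ad_categories": [],
    "status": "PAUSED",
})

creatives = [f"Video {i}" for i in range(1, 8)] + [f"Image {i}" for i in range(1, 4)]
audience_a = {"geo_locations": {"countries": ["DE"]}, "age_min": 18}  # placeholder spec

for name in creatives:
    account.create_ad_set(params={
        "name": f"{name} - Audience A",
        "campaign_id": campaign.get_id(),
        "daily_budget": 1000,  # minor currency units: 1000 = €10.00/day
        "billing_event": "IMPRESSIONS",
        "optimization_goal": "OFFSITE_CONVERSIONS",
        "promoted_object": {"pixel_id": "<PIXEL_ID>", "custom_event_type": "PURCHASE"},
        "targeting": audience_a,  # identical audience in every adset
        "status": "PAUSED",
    })
# Next step (not shown): create exactly one ad per adset from its creative.
```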

We'll need to look at your audience targeting...

As I mentioned, the creative is only half the battle. You could have the best ad in the world, but if you show it to people who have zero interest in your product, you'll get zero sales. You've put 'BROAD' in your plan, which can work for accounts with a huge amount of pixel data and a mature product, but for a new product launch, it's like throwing darts in the dark. You're relying on Meta to find your customer from scratch, which is expensive and slow.

You need to give the algorithm a much stronger starting signal. This means using detailed targeting based on interests, behaviours, and demographics. This is your Top of Funnel (ToFu) strategy.

Think deeply about your ideal customer. Don't just think about age and gender.

  • What websites do they visit?
  • What influencers do they follow on Instagram?
  • What brands (even competitors) do they buy from?
  • What magazines or blogs do they read?
  • What tools or software do they use?

The key is to pick interests that are *specific* to your audience. A common mistake is picking interests that are too broad. For instance, if you're selling high-end coffee beans, targeting the interest "Coffee" is a bad idea. Millions of people who just drink instant coffee fall into that category. You'd be better off targeting interests like "James Hoffmann", "Aeropress", "Speciality Coffee Association", or high-end coffee machine brands. These interests are far more likely to contain your actual target customers and to exclude those who aren't.

For your initial test, I'd suggest creating 2-3 distinct audience 'hypotheses'.

  • Audience A (Competitor-based): A group of interests based on your direct competitors.
  • Audience B (Interest-based): A group of interests based on related hobbies, magazines, influencers, etc.
  • Audience C (Demographic/Behaviour-based): A more layered audience, maybe combining an interest with a behaviour like 'Engaged Shoppers'.
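In Marketing API terms, an audience hypothesis like 'Audience A' is just a targeting spec. Here's a sketch of what that could look like; the country is an assumption, and the interest IDs are placeholders you'd resolve with Meta's Targeting Search before using:

```python
# Illustrative targeting spec for "Audience A (Competitor-based)".
# Interest IDs are placeholders -- look up real IDs via Meta's
# Targeting Search; the country is only an example.
audience_a_targeting = {
    "geo_locations": {"countries": ["DE"]},
    "age_min": 18,  # matches the question's "18+, both genders" -- no gender filter
    "flexible_spec": [
        {
            "interests": [
                {"id": "<COMPETITOR_BRAND_1_ID>", "name": "Competitor Brand 1"},
                {"id": "<COMPETITOR_BRAND_2_ID>", "name": "Competitor Brand 2"},
            ]
        }
    ],
}
```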

You'd then run your 'Pure Creative Test' (Method 2) on Audience A first. Once you have your winning creatives, you can then test them against Audience B and C to see if you can beat the results. This structured approach to audience testing is just as important as the creative test.

Later on, once your pixel has gathered data (you want at least 100 purchase events, but honestly more like 500-1000 to be really effective), you can move into the more advanced stuff like Lookalike Audiences and Retargeting (MoFu/BoFu). A 1% Lookalike of your 'Purchasers' list will almost always outperform any interest-based audience you can build. But you have to earn the data to build that first. You start with interests.
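As a quick sketch of that 'earn the data first' rule of thumb (the thresholds are the ones from the paragraph above; the function itself is just an illustration):

```python
# Encodes the pixel-maturity rule of thumb from the paragraph above.
def lookalike_readiness(purchase_events: int) -> str:
    if purchase_events < 100:
        return "Stick to interest-based ToFu audiences."
    if purchase_events < 500:
        return "You can seed a 1% Lookalike, but expect it to be rough."
    return "Pixel is well seeded -- build a 1% Lookalike of 'Purchasers'."

print(lookalike_readiness(340))  # -> "You can seed a 1% Lookalike, ..."
```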

You probably should focus on the right metrics...

So, you're running your test. How do you decide what's a 'winner'? It's easy to get lost in all the columns in Ads Manager. For an e-commerce store, you need to focus on the metrics that actually lead to money.

Here's how I'd analyse the results, in order of importance:

  1. Return On Ad Spend (ROAS): This is the king. If you spend €10 and make €40 in revenue, your ROAS is 4x. This is the ultimate measure of success. The creative/adset with the highest ROAS wins. Period.
  2. Cost Per Purchase (CPA): If ROAS data isn't available or you don't have enough purchases yet, CPA is your next best metric. How much does it cost you to get one sale? Lower is better.
  3. Cost per Add to Cart / Initiate Checkout: In the very early days of a test (first 24-48 hours), you might not have many purchases. These are your leading indicators. If one creative is getting Add to Carts for €2 and another is costing €10, it's a strong sign the first one is going to be your winner.
  4. Click-Through Rate (CTR - Link): This tells you how compelling your ad is. A high CTR (above 1% is okay, above 2% is good for ToFu) means your image/video and headline are grabbing attention. If a creative has a very low CTR, it's probably dead on arrival, no matter how good the offer is. It's failing at its first job.
  5. Cost Per Click (CPC - Link): This is closely related to CTR. A high CTR usually leads to a lower CPC. It's a good measure of how 'relevant' Meta thinks your ad is to the audience.
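To keep those definitions unambiguous, here's the arithmetic with made-up numbers for a single adset:

```python
# The five key metrics computed from raw adset numbers.
# All figures are invented, purely to pin down the definitions.
spend = 30.00          # EUR spent on the adset
revenue = 120.00       # EUR of purchase conversion value attributed
purchases = 3
link_clicks = 45
impressions = 2500

roas = revenue / spend                 # 4.0x -> spend €1, get €4 back
cpa = spend / purchases                # €10.00 per purchase
ctr = link_clicks / impressions * 100  # 1.80% link CTR
cpc = spend / link_clicks              # €0.67 per link click

print(f"ROAS {roas:.1f}x | CPA €{cpa:.2f} | CTR {ctr:.2f}% | CPC €{cpc:.2f}")
```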

Don't get bogged down by vanity metrics like reach, impressions, or 'post engagement'. They don't pay the bills. Focus on the actions that happen on your website.

You need a rule for when to kill a failing adset in your test. A good rule of thumb is to let it spend at least your target CPA. If your product is €50 and you're aiming for a €25 CPA, let each adset spend at least €25-€30 before you make a call. If it hasn't gotten a single purchase by then (or even an Add to Cart), it's probably not going to work. Turn it off and let the budget go to the other contenders. Some people use 2x or 3x target CPA as their cutoff; it depends how much risk you're willing to take. But definitely don't turn things off after just a few euros of spend.
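Here's one way to encode that kill rule. The 1.2x spend threshold is an example; raise the multiple to 2-3x if you want to give creatives more room:

```python
# One possible encoding of the kill rule above -- thresholds are
# examples, tune them to your own risk tolerance.
def should_kill(spend: float, purchases: int, add_to_carts: int,
                target_cpa: float, multiple: float = 1.2) -> bool:
    """Pause an adset that has spent past its threshold with nothing to show."""
    if spend < target_cpa * multiple:
        return False  # hasn't spent enough yet to judge fairly
    if purchases > 0:
        return False  # it's converting; judge it on CPA/ROAS instead
    return add_to_carts == 0  # spent the threshold with no signal at all

# Example: €32 spent, no purchases, no add-to-carts, €25 target CPA -> kill.
print(should_kill(spend=32.0, purchases=0, add_to_carts=0, target_cpa=25.0))  # True
```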

I've detailed my main recommendations for you below:

This is a lot to take in, I know. So here's a summary of the actionable strategy I'd recommend for your new product launch creative test. I remember a cleaning-products client where we used a similar approach to find a creative that drove a 633% return.

Phase 1: Pure Creative Test (Duration: 3-5 days)

Campaign setup:
  • Campaign Type: Sales
  • Budget Optimisation: ABO (Ad Set Budget)
  • Total Budget: €100/day

Structure:
  • 10 separate adsets (1 for each creative).
  • Each adset gets a €10/day budget.
  • All adsets target the exact same audience (e.g., "Audience A - Competitors").

Rationale: Isolates the creative as the only variable. Guarantees each creative gets a fair test budget, providing clean, undeniable data on performance.

Key metrics to judge by: ROAS and Cost Per Purchase (CPA) as primary; Cost per Add to Cart and Link CTR as secondary.

Action: After 3-5 days, identify the top 2-3 winning creatives based on these metrics.

Phase 2: Audience Test (Duration: 3-5 days)

Campaign setup:
  • Campaign Type: Sales
  • Budget Optimisation: ABO (Ad Set Budget)
  • Total Budget: €60-€90/day

Structure:
  • Create 2-3 new adsets.
  • Each adset targets a different new audience (e.g., "Audience B - Interests", "Audience C - Demographics").
  • Inside each adset, place your 2-3 winning creatives from Phase 1.

Rationale: Pits your winning creatives against new audiences to find the most profitable creative/audience combination.

Key metrics to judge by: ROAS and Cost Per Purchase (CPA).

Action: Identify the single best combination of creative + audience. This is your "golden adset".

Phase 3: Scaling (Ongoing)

Campaign setup:
  • Campaign Type: Sales
  • Budget Optimisation: CBO (Campaign Budget)
  • Total Budget: Start with €50-€100/day, increase slowly.

Structure:
  • Create a new CBO campaign.
  • Create one adset inside it using your winning audience.
  • Place your top 2-3 winning creatives in this adset.

Rationale: You've done the testing. Now you hand the reins back to Meta's algorithm to efficiently scale what you've proven to work, finding the cheapest conversions possible.

Key metrics to judge by: ROAS.

Action: Monitor ROAS closely. Increase budget by 20% every 2-3 days as long as ROAS remains above your target.
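To put that Phase 3 cadence in numbers, here's where a €100/day budget lands if you apply a 20% increase every 3 days and ROAS holds above target the whole way (figures purely illustrative):

```python
# The Phase 3 scaling cadence as plain arithmetic: +20% every ~3 days,
# continuing only while ROAS stays above target. Illustrative numbers.
budget = 100.0  # EUR/day starting CBO budget
for step in range(1, 6):
    budget *= 1.20
    print(f"After increase {step} (~day {step * 3}): €{budget:.2f}/day")
# Five 20% bumps compound to ~€249/day -- roughly 2.5x in about two weeks.
```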

As you can see, a proper testing framework is a multi-stage process. It's methodical and requires patience, but it's the foundation for building a truly profitable advertising account. Just throwing things at the wall with a broad CBO campaign is a recipe for wasted ad spend and confusing results. This way, you build from a position of strength, knowing exactly what works and why.

This process is time-consuming and requires a fair bit of experience to analyse the results correctly and not make knee-jerk reactions. Getting it wrong can be costly, not just in ad spend but in the missed opportunity of failing to identify a creative that could have been a huge winner for your business.

This is often where expert help can make a significant difference. An experienced eye can help you set up these tests correctly, interpret the data without emotion, and make the right decisions to scale your campaigns profitably. We've run these kinds of tests for countless clients across dozens of niches, from eCommerce and SaaS to course creators, and that experience helps us get to the winning formula much faster.

If you'd like to chat through this in more detail and have us take a look at your specific plans, we're happy to offer a free initial consultation. It's a no-obligation call where we can give you some more tailored advice and you can get a better sense of how we work.

Hope this helps you get started on the right foot!

Regards,

Team @ Lukas Holschuh


