Published on 11/25/2025 | Staff Pick

Solved: Add New Creatives to Advantage+ Campaign?


I've been running my ads on an Advantage+ campaign for a month, and steady results are now coming in. I want to add new creatives to my account, but I'm worried about re-entering the learning phase. Should I make a new campaign with the new creatives instead of putting them in my current campaign? What do you advise?


Hi there,

Thanks for reaching out! Happy to give you some of my initial thoughts on your Advantage+ question. It's a really common situation to be in – you've got something that's working well and you're rightly cautious about breaking it. The short answer is you should definitely avoid touching your main campaign.

The way Advantage+ (ASC) works means that making significant changes, like adding a bunch of new creatives, will almost certainly reset the learning phase and could easily wreck the stable performance you're seeing. Instead, the approach I'd take, and what we do for our clients, is to build a systematic, separate testing framework. This lets you find your next winning ads without risking the revenue from your main campaign. I'll walk you through how to do that below.

TL;DR

  • Do NOT add new creatives directly to your working Advantage+ campaign. It will reset the learning phase and you risk destroying its current stable performance.
  • Set up a separate "Creative Testing" campaign. This should be a standard conversions campaign using Ad Set Budget Optimisation (ABO) to isolate variables and give each new creative a fair test.
  • Use this testing campaign to identify statistically proven "winners" based on metrics like ROAS and CPA before even thinking about introducing them to your main campaign.
  • The most important piece of advice is to treat your main scaling campaign (ASC) as sacred. Your goal is to feed it only proven winners from your testing environment, not use it as a test bed itself.
  • I've included a breakdown of the testing structure below, as well as a simple benchmark comparison to help you analyse and compare creative performance.

The Advantage+ Campaign is a Black Box - Don't Shake It

First off, it’s great that you've got an Advantage+ campaign that's out of learning and delivering steady results. That's the goal, and a lot of advertisers struggle to get there. The temptation to tweak and 'optimise' is always strong, but with ASC, it's a temptation you need to resist.

You need to think of a successful ASC campaign as a finely tuned engine. The algorithm has processed a huge amount of data about your customers, your products, and your existing creatives. It has found a specific 'pocket' of performance – a combination of audiences, placements, and bidding strategies that works. When you add new creatives, you're not just adding more options; you're forcing the entire system to re-evaluate everything from scratch. You're telling the algorithm, "That stable, profitable model you built? Forget it. Here's a load of new, unproven stuff to figure out."

This is what triggers the learning phase to reset. It's not just a status message; it’s a period of high volatility and increased spend where the algorithm is essentially gambling with your money to find a new stable path. Sometimes it finds a better one, but more often than not, especially when the original campaign was already working well, performance gets worse. You can easily see your Cost Per Acquisition (CPA) double overnight and your Return on Ad Spend (ROAS) get cut in half. I've seen it happen countless times in accounts we've taken over. An advertiser has a 'golden' campaign, gets impatient, throws in a dozen new ads, and kills it dead.

The core principle here is stability. Your main, money-making campaign's job is to scale predictably. Its job is not to test new ideas. You need a different place for that, a sandpit where you can experiment without knocking over the castle you’ve already built.

You'll Need a Separate "Challenger" Testing Framework

So, if you can't touch your main campaign, how do you test new creatives? You build a separate, dedicated testing environment. This is what we call a "Challenger" or "Creative Testing" campaign. Its only job is to put new ad creatives through their paces in a controlled way to find the next winner that can eventually be graduated to your main scaling campaign.

This approach completely de-risks the process. Your main ASC campaign keeps running, bringing in steady sales. Meanwhile, on the side, your testing campaign is methodically working to find a creative that can outperform your current best ads. For instance, I remember working with a women's apparel brand that was hesitant to touch their main campaign. By implementing a separate testing framework, we were able to identify new winning creatives that ultimately helped drive their return on ad spend to 691%. You get clean, reliable data because you are isolating the variable you want to test: the creative itself.

Think about it like a Formula 1 team. They have their main race car, which is optimised to the absolute limit for winning the current race. They don't try out a brand-new, untested engine design during the Grand Prix on Sunday. That would be insane. Instead, they have a separate test track and a test car where they run new parts through rigorous trials. Only once a new part has proven, with data, that it's faster and more reliable does it earn a place on the main race car. Your advertising should operate under the exact same logic.

1. Main Scaling Campaign: your existing Advantage+ campaign. Do not edit this. It continues to run and generate stable sales.

2. Creative Testing Campaign: a new, separate ABO conversions campaign with a smaller budget (e.g. 10-20% of main spend), structured as one ad set per creative (Ad Set 1 / Creative A, Ad Set 2 / Creative B, Ad Set 3 / Creative C).

3. Analyse & Identify Winner: after enough data, one creative proves to have the best ROAS/CPA. Let's say it's Creative B.

4. Graduate the Winner: turn off the testing campaign. Add ONLY the winning Creative B into a duplicated version of your main ASC campaign to scale it.

The structure above illustrates the recommended parallel setup: your main ASC campaign runs untouched for stability, while a separate testing campaign identifies new winning creatives in a controlled environment.

You Need to Structure the Test Properly

Setting up the testing campaign correctly is vital for getting clear results. A poorly structured test is just as bad as not testing at all, because you can't trust the data. Here’s a simple but effective way to set it up:

  • Campaign Objective: Always choose 'Sales' (or 'Leads' if that's your goal). You must test using the same objective as your main campaign. Testing with a 'Traffic' or 'Engagement' objective tells you nothing about a creative's ability to actually convert a customer. It's a classic mistake that wastes a lot of money.

  • Budgeting: Use Ad Set Budget Optimisation (ABO), not Campaign Budget Optimisation (CBO). This is a point of contention for some, but for pure creative testing, ABO is superior. It allows you to set a specific daily budget for each ad set, ensuring that every creative you're testing gets a fair amount of spend to prove itself. With CBO, Meta's algorithm will quickly shift the budget to the ad it *thinks* will win based on early signals, which can often kill a potentially great ad before it has a chance to gather enough data.

  • Ad Set Structure: The golden rule is one creative per ad set. If you put multiple creatives in the same ad set, you're back to the CBO problem – Facebook will pick an early favourite and the other ads will barely get any impressions. By isolating each new creative in its own ad set, you force an equal test. Let's say you have three new video ads to test. You would create three ad sets, each with the exact same targeting and budget, but with a different one of the new videos in each.

  • Audience Targeting: In your testing campaign, the audience should not be the variable. The creative is. Therefore, you should use a single, reliable, and broad audience for all ad sets in the test. A proven lookalike audience (like 1% of Purchasers) or a broad targeting stack that has worked for you in the past is perfect. The goal is to keep the audience consistent so that any difference in performance between the ad sets can be attributed directly to the creative.

  • Budget Allocation: Your testing budget doesn't need to be huge. A good rule of thumb is to allocate around 10-20% of your main campaign's daily spend to testing. The key is to set the daily budget for each ad set high enough to get at least one or two conversions per day, based on your average CPA. If your CPA is £30, a £10/day budget per ad set won't give you meaningful data quickly. You might need to set it to £40-£50/day to get the fast feedback you need.

By following this structure, you create a scientific experiment. The only significant difference between each ad set is the ad creative itself. This means when you look at the results, you can be highly confident that the winning ad set contains the winning creative.
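To make the structure concrete, here's a minimal sketch in plain Python (illustrative data structures only, not Meta's Marketing API) showing how the rules above fit together: Sales objective, ABO, one creative per ad set, the same audience everywhere, and a per-ad-set daily budget derived from your target CPA. The creative names, audience label, and the 1.5 conversions/day assumption are all hypothetical.

```python
# Illustrative sketch of the testing campaign layout described above.
# Assumption: each ad set's budget should cover ~1-2 conversions per day
# at your target CPA, so results arrive quickly.

def daily_budget_per_ad_set(target_cpa: float, conversions_per_day: float = 1.5) -> float:
    """Budget high enough to expect the given number of conversions per day."""
    return round(target_cpa * conversions_per_day, 2)

def build_testing_campaign(creatives, audience, target_cpa):
    """One ad set per creative, identical audience and budget (ABO)."""
    budget = daily_budget_per_ad_set(target_cpa)
    return {
        "name": "Creative Testing",
        "objective": "Sales",        # same objective as the main campaign
        "budget_type": "ABO",        # budget set per ad set, not per campaign
        "ad_sets": [
            {
                "name": f"Test - {creative}",
                "audience": audience,        # same audience in every ad set
                "daily_budget_gbp": budget,  # same budget in every ad set
                "creatives": [creative],     # exactly one creative per ad set
            }
            for creative in creatives
        ],
    }

campaign = build_testing_campaign(
    creatives=["Video A", "Video B", "Video C"],
    audience="1% Purchaser Lookalike",
    target_cpa=30.0,
)
for ad_set in campaign["ad_sets"]:
    print(ad_set["name"], ad_set["daily_budget_gbp"])
```

With a £30 target CPA this allocates £45/day per ad set, which matches the rule of thumb above: enough spend per creative for one to two expected conversions a day.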

Analyse the Data, Not Your Gut Feeling

Once your test is running, the next challenge is knowing how to interpret the results and when to make a decision. It's easy to get impatient, but this is where the biggest gains are made. For one B2B software client, a methodical testing process like this was key to reducing their cost per user acquisition from a staggering £100 all the way down to just £7. It's easy to declare a winner after just one day, but this often leads to poor choices based on statistical noise.

Here’s what to look for and how long to wait:

  1. Give it Time: Let each ad set run until it has spent at least 1-2x your target CPA. If your target Cost Per Purchase is £25, don't even look at the data until each ad set has spent £25-£50. This ensures you're moving beyond initial luck and getting a more stable picture of performance. Ideally, you want to wait 3-7 days to smooth out any daily fluctuations in performance.

  2. The Primary Metric is ROAS (or CPA): While it's tempting to focus on vanity metrics like Click-Through Rate (CTR) or Cost Per Click (CPC), they don't pay the bills. The ultimate decider is efficiency. Which creative is generating sales at the lowest cost? That's your winner. A creative might have a lower CTR but a much higher conversion rate on the website, making it far more profitable. Always prioritise the bottom-of-funnel metrics.

  3. Look for Statistical Significance: A creative with 2 sales at a £10 CPA isn't necessarily better than one with 1 sale at a £15 CPA. The data set is too small. You need to see a clear and sustained trend. The winning creative should be consistently outperforming the others on your key metric over several days.
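The three waiting rules above can be encoded as a simple decision helper. This is a sketch under the article's own assumptions (judge an ad set only after it has spent at least 1-2x your target CPA, require a handful of conversions before trusting the CPA, then rank by CPA); the specific thresholds are illustrative, not official Meta guidance.

```python
# Sketch of the "when to call a winner" logic described above.
# Assumptions: an ad set qualifies only after spending >= min_spend_multiple
# times the target CPA with a minimum number of conversions; the winner is
# the qualifying ad set with the lowest CPA.

def evaluate_test(results, target_cpa, min_spend_multiple=1.5, min_conversions=3):
    """results: list of dicts with 'name', 'spend', 'conversions'."""
    qualified = []
    for r in results:
        if r["spend"] < target_cpa * min_spend_multiple:
            continue  # not enough spend yet: still statistical noise
        if r["conversions"] < min_conversions:
            continue  # too few conversions to trust the CPA
        cpa = r["spend"] / r["conversions"]
        qualified.append((cpa, r["name"]))
    if not qualified:
        return None  # no ad set qualifies yet: keep the test running
    qualified.sort()
    return qualified[0][1]  # lowest CPA among qualified ad sets

results = [
    {"name": "Creative A", "spend": 60.0, "conversions": 2},  # too few conversions
    {"name": "Creative B", "spend": 75.0, "conversions": 5},  # CPA £15
    {"name": "Creative C", "spend": 80.0, "conversions": 4},  # CPA £20
]
print(evaluate_test(results, target_cpa=25.0))  # Creative B
```

Note that Creative A is excluded despite a plausible CPA: with only two conversions, its number could easily be luck, which is exactly the small-sample trap described above.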

To help with this, you can use a simple comparison framework. Look at the performance of your existing ads in your main ASC campaign – what's your average ROAS and CPA there? That's your benchmark. A new creative is only a "winner" if it can confidently beat that benchmark in your testing campaign.

As a simple comparison:

  • Control creative (your benchmark): 3.00x ROAS
  • Test creative: 4.00x ROAS

A new ad is only a 'winner' if it can consistently deliver a better ROAS than your benchmark over a fair test.

You'll need a process for scaling the winners

Okay, so you've run your test and you've found a clear winner. A new creative that's delivering a 4.0x ROAS while your old ads are averaging 3.0x. Now what? How do you get this new ad working at scale without, again, breaking your main campaign?

You still need to be cautious. My preferred method is to duplicate your existing, successful Advantage+ campaign. In this new, duplicated campaign, you would turn off all the old creatives and add *only* your newly proven winner (or maybe your top 2-3 all-time winners including the new one). You then launch this new "Champion" ASC campaign with a similar budget to the original.

Now you have two campaigns running side-by-side: the "Incumbent" (your original, untouched campaign) and the "Challenger" (the new one with the winning creative). Let them run for a few days. Very often, the new campaign with just the lean, proven creative will outperform the older one. If it does, you can begin to scale the budget up on the new one while scaling it down on the old one, eventually phasing the original campaign out completely. This ensures a smooth transition and continuous improvement without any catastrophic drops in performance.
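If it helps to see the handover as explicit logic, here is a rough sketch of the gradual budget shift between Incumbent and Challenger. The 20% daily step and the simple ROAS comparison are illustrative assumptions you'd tune to your own risk tolerance, not fixed rules.

```python
# Sketch of the gradual budget migration described above.
# Assumption: per review cycle, move a fixed fraction (default 20%) of the
# incumbent's budget to the challenger, but only while the challenger's
# ROAS is still ahead.

def shift_budget(incumbent_budget, challenger_budget,
                 incumbent_roas, challenger_roas, step=0.20):
    """Move `step` of the incumbent's budget to the challenger if it's winning."""
    if challenger_roas <= incumbent_roas:
        return incumbent_budget, challenger_budget  # hold; don't rush the switch
    moved = round(incumbent_budget * step, 2)
    return incumbent_budget - moved, challenger_budget + moved

inc, cha = 100.0, 100.0
for _cycle in range(3):  # three review cycles where the challenger keeps winning
    inc, cha = shift_budget(inc, cha, incumbent_roas=3.0, challenger_roas=4.0)
print(round(inc, 2), round(cha, 2))  # 51.2 148.8
```

Because the step is a fraction of the remaining incumbent budget, the shift naturally decelerates, which keeps the transition smooth and gives you time to spot any drop in the challenger's performance before the old campaign is fully phased out.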

This whole process might sound like a lot of work compared to just clicking "edit" and adding the new ad to your existing campaign. And it is. But it's a system. And systems are what allow you to scale your ad spend reliably and profitably. It turns creative refreshment from a risky gamble into a predictable process of improvement. This is the difference between amateur 'ad boosting' and professional media buying.

This is the main advice I have for you:

To put it all together, here is a simple, actionable plan you can follow. This is the exact process we'd use for a client in your position.

Step 1: Isolate
Action: Do not edit your existing, working Advantage+ campaign. Treat it as a sacred, money-making machine that should not be disturbed while it's performing well.
Why: To protect your primary source of revenue and avoid resetting the learning phase, which could cause a major and unpredictable drop in performance.

Step 2: Create
Action: Launch a new, separate "Creative Testing" campaign. Set the objective to Sales and use Ad Set Budget Optimisation (ABO). Allocate 10-20% of your main campaign spend to it.
Why: To create a controlled 'sandpit' environment where new ideas can be tested without risking your main campaign's stability.

Step 3: Test
Action: Inside the testing campaign, create one ad set for each new creative you want to test (one creative per ad set). Use the same broad/proven audience for all ad sets.
Why: This isolates the creative as the only variable, ensuring that any performance difference is due to the ad itself, giving you clean, reliable data.

Step 4: Analyse
Action: Let the test run until each ad set has spent at least 1-2x your target CPA. Identify the winner based on the best ROAS or CPA, not on vanity metrics like CTR.
Why: To make data-driven decisions rather than guessing. You need enough spend to ensure the results are statistically significant and not just luck.

Step 5: Graduate
Action: Duplicate your original ASC campaign. In the duplicated version, pause all old creatives and add ONLY the proven winner from your test. Launch this as a "Challenger".
Why: This is the safest way to introduce a new creative to the powerful ASC algorithm, giving it a clean slate with a proven ad to scale and minimising disruption.

Step 6: Scale
Action: Monitor the "Challenger" against the "Incumbent" campaign. If the new campaign outperforms the old one, gradually shift budget to it until it becomes your new primary scaling campaign.
Why: To ensure a smooth transition of budget and scale up what's working best, creating a cycle of continuous, data-backed improvement.

This might seem like a complex process, but once you've done it once, it becomes second nature. It's the foundation of scalable and sustainable advertising on Meta. Having a reliable system for testing and scaling creatives is arguably the most valuable asset an advertiser can have.

Managing this process of continuous testing, analysis, and scaling is a full-time job, and it's where expert help can make a significant difference. A seasoned eye can spot winning trends faster, structure tests more efficiently, and knows from experience which creative angles are most likely to work for your market. This systematic approach is precisely how we manage client accounts to deliver consistent growth, taking the guesswork out of creative strategy.

If you'd like to chat through your account in more detail and see how a framework like this could be implemented for your specific business, we offer a completely free, no-obligation initial consultation call. We can review your setup together and identify some immediate opportunities.

Hope this helps!

Regards,

Team @ Lukas Holschuh


