Hi Matt,
Thanks for reaching out!
I've read through your situation, and it's a classic one, honestly. You've had a brilliant start (500 conversions in your first month is fantastic), but now you've hit the first major wall that trips everyone up: trying to optimise and test inside the same campaign. It's a common mistake, so don't feel bad about it. The way you've described your CBO campaign putting all its budget into one ad is exactly what it's designed to do, even if it feels like it's not working properly.
I can give you some initial thoughts on this. The solution isn't just about shuffling ads between campaigns; it's about fundamentally changing how you think about testing, scaling, and the very structure of your ad account. Let's get this sorted.
We'll need to look at why your CBO is behaving this way...
Alright, first things first. Let's bust a myth. Your Campaign Budget Optimisation (CBO) campaign isn't broken, and it's not "telling you the new ads are bad" after only a day or two. It's doing precisely what you commanded it to do, which is to get you the most conversions for your £60 budget based on the data it has.
Think of it like this: your old video ad has a history. It's run before, it's got clicks, it's got conversions. In Meta's eyes, it has a proven track record. Your four new creatives? They are complete unknowns. They have zero data, zero history, zero proof that they can perform. So when you put them all in a CBO cage fight, the algorithm looks at the grizzled veteran (your old video) and the four brand new fighters and says, "Right, my job is to win the fight (get conversions) for the lowest cost. I'm putting my money on the guy who's won before."
This is why on Day 1, it pumped £58 into the old video. It was leaning on its historical data. The 15 conversions it got reinforced that decision. On Day 2, when performance dropped to 5 conversions, the algorithm was likely still in the 'learning phase' with the new creatives and didn't have enough data to justify shifting significant budget to them yet. It saw the old ad's performance dip, but it still saw it as a safer bet than the complete unknowns. It takes more than a few pence of spend for the algorithm to gather any meaningful data.
The core problem here is you're using an optimisation tool (CBO) as a testing tool. That's like using a race car to learn how to drive. CBO is built to scale winners, not to fairly test unproven ideas. You're asking it to do two completely different jobs at once, and it will always default to the job of optimisation. It will never give your new creatives a 'fair' go, because 'fairness' isn't in its programming. Efficiency is. This is probably the single biggest misunderstanding I see in accounts I audit: people believe CBO is for testing, and it is not.
So, to directly answer your question: "Why is Meta just putting all the budget in the old video?" Because you told it to. By putting it in a CBO, you told it to find the cheapest conversions, and it decided the ad with a history was the most likely candidate. It's not a judgement on your new creatives; it's just a reflection of data scarcity.
I'd say you need a dedicated, proper testing framework...
So how do we fix this? We stop asking the race car to be a driving school car. You need to separate the job of testing from the job of scaling. This is non-negotiable if you want to grow your account without tearing your hair out.
To answer your other question: "Should I put the new creatives in a separate campaign and test them WITHOUT the old ad video inside?" Yes. Absolutely, one hundred percent yes. This is the only way to get a clean, unbiased read on whether your new stuff actually works.
Here’s how you build a proper testing environment. We'll use Ad Set Budget Optimisation (ABO) for this, not CBO. With ABO, you set the budget at the ad set level, which forces Meta to spend that money within that specific ad set, giving your new creatives the airtime they need to prove themselves.
Here's what a simple, effective testing structure looks like:
| Campaign Level | Ad Set Level | Ad Level |
|---|---|---|
| Campaign: Creative Testing (ABO). Objective: Conversions | Ad Set 1: Broad Test. Budget: £15/day (ABO). Audience: your best broad audience | Creative 1 (New Image A), Creative 2 (New Image B), Creative 3 (New Image C), Creative 4 (New Video) |
| Campaign: Creative Testing (ABO). Objective: Conversions | Ad Set 2: Targeting Test. Budget: £15/day (ABO). Audience: your best interest-based audience | Creative 1 (New Image A), Creative 2 (New Image B), Creative 3 (New Image C), Creative 4 (New Video) |
With this setup, you are forcing Meta to spend £15 per day testing your creatives on your broad audience, and another £15 per day testing them on your targeted audience. You're giving them a controlled environment and a fixed budget to work with. After 3-5 days, or once each ad set has spent enough (a rule of thumb is at least 2-3x your target cost per conversion), you'll have real data. You can then look at the results and confidently say "Okay, Creative B is the clear winner here" or "This video works great on a broad audience but not on my targeted one."
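As a sanity check on how long a test like this needs to run before you judge it, here's a rough sketch of the maths. The £15/day budget and the £20 target cost per conversion are placeholder numbers, not figures from your account:

```python
# Rough sketch: how long an ABO test ad set needs before you judge it, using the
# "spend at least 2-3x your target cost per conversion" rule of thumb.
# The £15/day budget and £20 target CPA below are placeholders.
import math

def days_needed(daily_budget: float, target_cpa: float, spend_multiple: float = 3) -> int:
    """Days until an ad set has spent enough to be judged fairly."""
    required_spend = target_cpa * spend_multiple
    return math.ceil(required_spend / daily_budget)

print(days_needed(daily_budget=15, target_cpa=20))  # -> 4 days of spend at the 3x rule
```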
You analyse the results at the ad level within each ad set. The creatives that get good results (low CPA, high CTR) are your new winners. The ones that don't, you turn off. It's that simple. Once you've found a new winning creative, you can then move it into a *separate* scaling campaign, which is where CBO comes back into play.
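If you export the ad-level results to a spreadsheet, a small script can do the sorting for you. This is only a sketch: the file name, CSV column names and thresholds are my assumptions, not anything from Meta's reporting, so adjust them to match your own export and targets:

```python
# Sketch: flag winning creatives from an exported ad-level report.
# The file name, column names and thresholds are assumptions - adjust to your export.
import csv

TARGET_CPA = 20.0   # target cost per conversion, in £
MIN_CTR = 0.01      # 1% click-through rate as a rough "the creative grabs attention" bar

with open("ad_level_results.csv", newline="") as f:
    for row in csv.DictReader(f):
        spend = float(row["spend"])
        conversions = int(row["conversions"])
        ctr = int(row["clicks"]) / max(int(row["impressions"]), 1)
        cpa = spend / conversions if conversions else float("inf")
        if cpa <= TARGET_CPA and ctr >= MIN_CTR:
            verdict = "winner - move to the scaling campaign"
        elif ctr >= MIN_CTR:
            verdict = "promising hook, but no cheap conversions yet"
        else:
            verdict = "turn off"
        print(f"{row['ad_name']}: CPA £{cpa:.2f}, CTR {ctr:.1%} -> {verdict}")
```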
You should probably rethink your whole audience strategy...
This brings me to a bigger point. You mentioned you have a "broad ad" and a "targeting ad". This suggests you're thinking about audiences in buckets, which is good, but we can make it much more systematic. The most successful ad accounts don't just have random ad sets for different audiences; they structure their campaigns around a marketing funnel.
Think about it in three stages:
1. Top of Funnel (ToFu): These are cold audiences. People who have never heard of you. This is your Broad audience and your Interest-based targeting ad sets. Their only job is to bring new people to your website or landing page.
2. Middle of Funnel (MoFu): These are warm audiences. People who have shown some interest but haven't taken a key action yet. This could be people who have visited your website, watched 50% of your video ad, or engaged with your Facebook page, but haven't added to cart or purchased.
3. Bottom of Funnel (BoFu): These are hot audiences. People who are on the verge of converting. They've added a product to their cart or initiated checkout but didn't complete the purchase. These are your most valuable prospects.
Right now, it sounds like you're lumping all this together. A much more powerful approach is to have separate campaigns for each stage of the funnel. This lets you talk to each group differently. You wouldn't say the same thing to a complete stranger (ToFu) as you would to someone about to hand you their credit card details (BoFu), right? Your ads should reflect that.
Here's a more advanced structure you should be aiming for as you grow:
| Campaign (Objective) | Ad Sets (Audiences) | Purpose |
|---|---|---|
| Campaign 1: ToFu - Prospecting (CBO). Objective: Conversions (e.g., Purchase) | Ad Set 1: Broad; Ad Set 2: Interest Group A; Ad Set 3: Lookalike (Purchasers) 1%; Ad Set 4: Lookalike (Add to Cart) 1% | Find new customers. Use CBO to automatically scale the best-performing cold audiences. You'd put your proven, winning creatives in here. |
| Campaign 2: MoFu/BoFu - Retargeting (ABO). Objective: Conversions (e.g., Purchase) | Ad Set 1: Website Visitors (7 Days, excl. purchasers); Ad Set 2: Video Viewers (50%, 14 Days); Ad Set 3: Add to Cart (3 Days) | Bring back interested people to finish their purchase. Use ABO for controlled spend on these smaller, high-intent audiences. Ads here might have a special offer or address common objections. |
This structure gives you total control. Your CBO campaign does what it's best at: finding new customers at scale. Your ABO retargeting campaign does what *it's* best at: precisely targeting small, warm audiences with specific messages to get them over the line. And your separate testing campaign (from the previous section) constantly feeds new winning creatives and audience ideas into this machine.
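If it helps to see the whole machine in one place, here's the same structure written out as plain data. It's purely illustrative; the names, budgets and audiences are placeholders, not a template you have to copy exactly:

```python
# Sketch of the full account structure as plain data - names, budgets and
# audiences are illustrative placeholders.
ACCOUNT_STRUCTURE = {
    "Creative Testing (ABO)": {
        "budget": "set per ad set",
        "ad_sets": {
            "Broad Test": {"daily_budget_gbp": 15, "audience": "broad"},
            "Targeting Test": {"daily_budget_gbp": 15, "audience": "interest-based"},
        },
        "ads": "new, unproven creatives only",
    },
    "ToFu - Prospecting (CBO)": {
        "budget": "set at campaign level",
        "ad_sets": ["Broad", "Interest Group A",
                    "Lookalike (Purchasers) 1%", "Lookalike (Add to Cart) 1%"],
        "ads": "proven winners promoted from the testing campaign",
    },
    "MoFu/BoFu - Retargeting (ABO)": {
        "budget": "set per ad set",
        "ad_sets": ["Website Visitors (7 days, excl. purchasers)",
                    "Video Viewers (50%, 14 days)", "Add to Cart (3 days)"],
        "ads": "offers, testimonials, objection handling",
    },
}

print(list(ACCOUNT_STRUCTURE))  # three campaigns, each with one clear job
```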
You'll need to understand what makes a winning creative...
Since you're testing new creatives, it's also worth thinking about *what* you're testing and *how* you measure success. It's not just about the final conversion number, especially in the first few days.
When you're in that ABO testing campaign, look at leading indicators. Is a new image ad getting a really high Click-Through Rate (CTR)? That tells you the image and headline are grabbing attention, even if it hasn't got a conversion yet. That's a great sign. Is a new video getting a high 3-second view rate and a high "hook" rate (the percentage of people who watch past the first 3 seconds)? That tells you your intro is compelling. These metrics give you clues about what's working before the conversion data fully matures.
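To make those two leading indicators concrete, here's how I'd calculate them from the raw numbers. The figures are made up purely for illustration:

```python
# Sketch: the two leading indicators mentioned above, with made-up numbers.
impressions = 10_000
link_clicks = 150
three_sec_views = 2_500  # people who watched past the first 3 seconds of the video

ctr = link_clicks / impressions            # click-through rate: is the creative earning clicks?
hook_rate = three_sec_views / impressions  # hook rate: is the intro stopping the scroll?

print(f"CTR: {ctr:.1%}")              # 1.5%
print(f"Hook rate: {hook_rate:.1%}")  # 25.0%
```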
And when you make new ads, don't just throw different pictures up. Test different angles. Your copy is just as important. A simple framework we use is Problem-Agitate-Solve.
Let's pretend you sell a high-quality, comfortable office chair.
- Problem: "Still using that cheap dining chair for your home office? Is your back paying the price after an 8-hour day?"
- Agitate: "That nagging lower back pain isn't just annoying. It kills your focus, ruins your evenings, and makes you dread sitting down to work."
- Solve: "Our ergonomic chair is engineered for all-day comfort and support. Stop suffering and start focusing on what matters. Get yours with free next-day delivery."
See how that's much more powerful than "Buy our office chair, it's great"? It connects with a real pain point. You should test different problems, different angles, and different hooks. A good creative isn't just a pretty picture; it's a message that resonates deeply with the audience's problems or desires.
We'll need to look at scaling what works...
Finally, you asked about duplicating the video that's driving conversions: "Is that too much overlapping with the same audience?"
Yes, it is. Don't do it. This is a common but outdated tactic called "ad set duplication" that people used in an attempt to "reset" the algorithm or escape the learning phase. These days, it mostly just creates problems. You'll end up with multiple ad sets with the same ad, targeting the same audience, bidding against each other in the auction. This drives up your costs and makes your results messy and difficult to analyse. It's called audience overlap, and it's a silent account killer.
The modern, correct way to scale is in two directions:
1. Vertical Scaling: This is the simplest. You have a winning ad set in your CBO scaling campaign that's performing well. You simply increase the campaign budget. Do it slowly, maybe 20-30% every 2-3 days, to avoid shocking the algorithm and re-entering the learning phase. You're just giving your proven winner more fuel (there's a quick sketch below of how that compounds).
2. Horizontal Scaling: This is where you take your proven, winning creative (the ad itself) and test it on new, completely different audiences. Found a video that your "Broad" audience loves? Great. Now put it in your testing campaign and see if a Lookalike audience of your past customers loves it too. Or an interest audience based on your competitors. You're expanding your reach by finding new pockets of people who respond to your best message.
Scaling isn't about duplicating what works; it's about giving your winners more budget (vertically) or more new audiences to play with (horizontally). This is a much more stable and sustainable way to grow your ad spend and your business.
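To give you a feel for how quickly the vertical route compounds, here's a quick sketch starting from your current £60/day. The 20% step and 3-day interval are the rule-of-thumb figures from above, not hard limits:

```python
# Sketch: vertical scaling - raise the campaign budget ~20% every 3 days.
# Starting point is the current £60/day; step size and interval are rules of thumb.
budget = 60.0
for day in range(0, 31, 3):
    print(f"Day {day:>2}: £{budget:.2f}/day")
    budget *= 1.20  # small, spaced-out increases avoid re-triggering the learning phase
```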
This is the main advice I have for you:
I know that's a lot to take in. You've gone from a simple setup to a full-funnel, multi-campaign strategy. To make it clearer, here are the key, actionable steps I'd recommend you take, based on everything we've discussed.
| Recommendation | Actionable Step | Why It's Important |
|---|---|---|
| 1. Separate Testing & Scaling | Create a brand new campaign specifically for testing new creatives and audiences. Pause your current CBO test. | Stops CBO from ignoring your new ads and allows for a fair, controlled test to identify real winners. |
| 2. Use ABO for Testing | Set up your new testing campaign using Ad Set Budget Optimisation (ABO). Give each ad set its own daily budget. | Forces Meta to spend a minimum amount on each test, guaranteeing your new ads get enough data to be evaluated properly. |
| 3. Stop Duplicating Ads | Delete any duplicated ad sets targeting the same audience. Do not duplicate your winning video. | Prevents internal auction competition (audience overlap) which drives up costs and confuses the algorithm. |
| 4. Build a Scaling Campaign | Create a separate, long-term CBO campaign. Only place your *proven winning creatives* from your testing campaign into this one. | Lets CBO do its job properly: scaling your best ads by automatically allocating budget to the best-performing ad sets inside it. |
| 5. Implement Funnel-Based Retargeting | Create a third campaign for retargeting (MoFu/BoFu audiences like Website Visitors and Add to Carts). Use ABO for this. | Allows you to speak to your warmest audiences with specific, high-conversion messages (e.g., testimonials, discount codes) to get them over the finish line. |
Managing all of this – the constant testing, analysing metrics, moving creatives between campaigns, and managing different funnels – is a full-time job. It’s the difference between having 'ads that run' and having a predictable, scalable customer acquisition machine. It takes a lot of time, effort, and expertise to get right. We have worked with clients who came to us with exactly your problem, and by implementing this kind of robust structure, we've managed to achieve some significant results. For one software client, we reduced their cost per acquisition from £100 down to just £7. For another, we generated over 5,000 trials at just $7 each.
It's not about magic tricks; it's about applying a rigorous, proven process. If you'd like to go through your account and have a chat about how a strategy like this could be implemented specifically for your business, we offer a free, no-obligation initial consultation. We can take a proper look at your setup and give you a clear roadmap for growth.
Hope that helps clear things up!
Regards,
Team @ Lukas Holschuh