Hi there,
Thanks for reaching out! I had a look over your question about testing your video ads, and it's a great one. It's something a lot of advertisers get wrong, especially because the platforms have changed so much in the last few years.
Honestly, the old way of doing things – one ad, one headline, one bit of copy, and then duplicating it to change one thing – is pretty much a relic of the past. It’s slow, it’s inefficient, and it actually fights against how the ad algorithms work today. You're right to question it.
I'm happy to give you some initial thoughts and a bit of a guide on how we approach this. It's a shift in mindset, really. Instead of you trying to manually find the one "perfect ad," the goal now is to feed the machine the right ingredients and let it do the heavy lifting to find the perfect combination for each individual person seeing your ad. It's a much more powerful and scalable way to work.
So, let's get into it.
TL;DR:
- Your current testing method of one variation per ad is outdated, expensive, and works against the modern ad algorithms. You need to stop doing this immediately.
- You should be using Dynamic Creative Optimisation (DCO) to test all your videos, headlines, and text copy within a single ad unit. This is non-negotiable for efficient testing.
- The goal isn't to find a single "winning ad" anymore. It's to identify winning *components* (the best video, the best headline, the best copy) that you can then reuse and iterate on.
- Simplify your campaign structure. Use a single Campaign Budget Optimisation (CBO) campaign with one broad ad set to give the algorithm maximum data and flexibility to find conversions.
- This letter includes a detailed breakdown of the DCO process, a flowchart visualising the modern testing framework, and an interactive calculator to help you figure out your Customer Lifetime Value (LTV), which is the most important metric you're probably not tracking.
We'll need to look at why your old testing method is just burning your cash...
Right, let's be blunt. The way you've been testing, by creating separate ads for each individual variation, is probably the single biggest reason you're finding it hard to scale or get clear results. It feels logical, like a proper scientific experiment, but it's a complete misunderstanding of how platforms like Meta work in 2024.
Think about it. For each ad you launch, the algorithm has to go through a "learning phase". It needs to serve the ad enough times to gather data on who clicks, who converts, and who just scrolls by. When you have dozens of separate ads, you're splitting your budget into tiny little pockets. Each ad gets a little sprinkle of cash, but none of them get enough to exit the learning phase quickly, or even at all. This means the algorithm never gets enough data to properly optimise, so your costs stay high and performance is all over the place. You're essentially forcing the platform to make decisions based on tiny, statistically insignificant data sets.
You mentioned not being able to 'glean the winning copy'. With this old method, you're right. Let's say Ad A (Video 1 + Headline 1) gets 5 sales and Ad B (Video 1 + Headline 2) gets 3 sales. Is Headline 1 the winner? Maybe. Or maybe Ad A just got lucky and was shown to a slightly better pocket of the audience at the right time. You'd need to spend a huge amount of money to know for sure, far more than is practical for most businesses. It's a slow, expensive, and unreliable way to learn anything meaningful.
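To put a rough number on that "maybe", here's a quick back-of-envelope check in plain Python, using the hypothetical 5-vs-3 figures from the example above. It asks: if the two ads were genuinely identical, how often would pure luck produce a split at least this lopsided?

```python
from math import comb

# Hypothetical figures from the example: 8 total sales,
# Ad A got 5 and Ad B got 3. If the ads were truly equal,
# each sale is a coin flip (p = 0.5) between them.
n, k = 8, 5

# Probability that one specific ad gets at least 5 of the 8 sales by chance.
p_at_least_5 = sum(comb(n, i) for i in range(k, n + 1)) / 2**n
print(f"P(Ad A gets >= {k} of {n} sales by luck alone): {p_at_least_5:.2f}")
# Roughly a 1-in-3 chance in that direction alone -- nowhere near the
# ~0.05 threshold you'd want before crowning Headline 1 the winner.
```

In other words, a 5-3 result is entirely consistent with the two headlines being equally good, which is exactly why the old method can't tell you anything at realistic budgets.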
The fundamental flaw is that you're assuming there is *one* perfect ad that works for everyone. That's just not true. A 21-year-old student in Manchester might respond to a completely different headline than a 45-year-old business owner in London, even if they're both in the same "interest" audience. The old method can't account for this. It tries to find a single winner, whereas the modern approach lets the algorithm act like a master salesperson, tailoring the pitch on the fly for every single person it talks to. Your job isn't to be that salesperson anymore; your job is to give them the best raw materials to work with.
I'd say you should fully embrace Dynamic Creative...
This is the solution to all the problems we just talked about. Dynamic Creative Optimisation (DCO) is a feature built directly into platforms like Meta that completely changes the testing game. Instead of creating dozens of individual ads, you create *one* flexible ad unit.
Here’s how it works:
-> You provide the system with a pool of creative 'assets'. In your case, this would be your 3 videos.
-> You also provide a pool of text assets. Let's say 5 different headlines and 5 different blocks of primary text.
-> You might also provide a couple of different Call-to-Action buttons, like "Learn More" and "Sign Up".
You load all of this into a single ad in your ad set. When you launch the campaign, the algorithm gets to work. It starts automatically mixing and matching your assets, creating hundreds of different combinations on the fly. It might show Video 1 with Headline 3 and Text 5 to one person, and then show Video 2 with Headline 1 and Text 1 to the next person.
It then watches obsessively. It sees which combinations get the most clicks, the most engagement, and most importantly, the most conversions. It learns at a speed no human ever could. It figures out that people who like your competitor's page respond best to Video 3, but only when paired with your most direct, benefit-driven headline. It learns that your retargeting audience converts better with a headline that mentions a specific pain point. It does all of this automatically.
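To make "hundreds of combinations" concrete, here's a tiny sketch of the combination space DCO explores. The asset counts match your setup from above; the two CTA buttons are my assumption from the earlier example.

```python
from itertools import product

# Asset pools from the setup described above.
# The two CTA buttons are an illustrative assumption.
videos = [f"Video {i}" for i in range(1, 4)]        # 3 videos
headlines = [f"Headline {i}" for i in range(1, 6)]  # 5 headlines
texts = [f"Text {i}" for i in range(1, 6)]          # 5 primary texts
ctas = ["Learn More", "Sign Up"]                    # 2 CTA buttons

# Every distinct ad variation the system could assemble.
combos = list(product(videos, headlines, texts, ctas))
print(len(combos))  # 3 * 5 * 5 * 2 = 150 variations
print(combos[0])    # ('Video 1', 'Headline 1', 'Text 1', 'Learn More')
```

One hundred and fifty variations from just fifteen assets and two buttons. Testing those as individual ads would be impossible; inside one Dynamic Creative ad, the algorithm explores all of them on a single consolidated budget.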
This approach has massive advantages:
1. Efficiency: Your entire test budget is consolidated. It's all working together inside one ad unit, which means the learning phase is completed much, much faster. The algorithm gets the data it needs to start optimising properly in days, not weeks.
2. Performance: Because the algorithm is tailoring the ad creative to the individual, your overall results are almost always better. You get a lower cost per conversion because the system is constantly finding the most persuasive combination for every little pocket of your audience.
3. Real Insights: This is the most important bit, and it directly addresses your original concern. You absolutely *can* glean the winning copy. In fact, you get much clearer insights than with the old method. Inside the Ads Manager, you can use the "Breakdown" feature and select "By Dynamic Creative Asset". This will show you a report with a row for every single video, headline, and text you provided. You can see the spend, impressions, clicks, and conversions attributed to each individual component. You'll quickly see that one video is getting the lion's share of the budget and driving most of the sales. You'll see that two of your headlines are clear winners, while the other three are duds. This isn't guesswork; it's clear data showing you which *ingredients* are the most powerful.
This is the modern way to test. It's about creating a system, not just a single ad. Below is a simple flowchart of how this process works in practice.
Step 1: Input Assets
Upload 3 videos, 5 headlines, 5 primary texts into one Dynamic Creative Ad.
Step 2: Algorithm Mixes
The platform automatically creates hundreds of ad variations by combining your assets.
Step 3: Serve & Learn
Ads are shown to your target audience. The algorithm gathers performance data on every combination.
Step 4: Optimise & Report
The system allocates more budget to winning combinations and provides a breakdown of the best-performing individual assets.
You probably should rethink your campaign structure for testing...
Running a solid Dynamic Creative test isn't just about the ad itself; it's about putting that ad into the right campaign structure. If your campaign setup is fragmented and complex, you'll still be starving the algorithm of the data it needs to work its magic. Again, simplicity is your best friend here.
Forget the old advice about creating dozens of tiny ad sets, each with a single niche interest. This is another outdated tactic that just fragments your budget and data. When I take on a new client, one of the first things we often do is consolidate their entire account structure. We've seen this alone make a huge difference. For one of our eLearning clients, simplifying their structure and implementing a robust testing framework was part of how we helped them achieve a 447% Return On Ad Spend in just the first week.
Here’s the structure I’d recommend you start with:
1. One Single CBO Campaign: Use Campaign Budget Optimisation (CBO). This means you set the budget at the campaign level, not the ad set level. Why? Because it lets the algorithm decide where to spend the money. If it sees one ad set is getting cheaper conversions than another, it will automatically shift the budget to the winner. You're removing yourself as the bottleneck and letting the machine make the smartest financial decisions in real-time.
2. One or Two Broad Ad Sets: Inside your CBO campaign, start with just one or two ad sets. For a testing phase, your main ad set should be your "Prospecting" audience. This is where you target new people who haven't heard of you before. Instead of layering dozens of tiny interests, go broad. Group related interests together. For example, if you sell productivity software, you might have one large ad set that includes interests like 'Asana', 'Trello', 'Monday.com', 'Productivity', and 'Project Management'. Don't be afraid of large audience sizes; you want to give the algorithm plenty of room to find customers. A second ad set could be for Retargeting (e.g., website visitors from the last 30 days), but only if you have enough traffic to justify it. For now, let's focus on prospecting.
3. Your One Dynamic Creative Ad: Inside that one broad prospecting ad set, you place your single Dynamic Creative ad containing all your assets (3 videos, 5 headlines, 5 texts).
That's it. Your entire testing setup is one campaign, one ad set, and one ad. This structure gives the algorithm the maximum possible amount of data, budget, and freedom. It has a large audience to explore and a flexible ad to play with. This is the fastest and most effective way to find out what works. Once you've run this for a week or two and used the "Breakdown" report to identify your winning video and your top 2-3 headlines and text variations, you can then take those winning *components* and launch a new "Scaling" campaign, confident that you're leading with your strongest creative.
You'll need a better way to measure success...
Okay, so you're running your tests more efficiently. But what are you actually optimising for? You mentioned a high ROAS, which is great, but ROAS on its own can be a vanity metric. A 10x ROAS on £5 of spend is nice, but it's only £50 in revenue. A 3x ROAS on £1,000 of spend is £3,000 in revenue. To make truly smart decisions about your ad spend, you need to know the one metric that matters more than any other: Customer Lifetime Value (LTV).
The question isn't "How cheap can I get a sale?". The real question is "How much can I afford to *pay* to acquire a customer and still be highly profitable?". Knowing your LTV answers this question and gives you the confidence to invest properly in your advertising.
Your LTV is the total amount of profit you can expect to make from a single customer over the entire course of their relationship with your business. When you know that number, you can stop panicking about day-to-day fluctuations in your ad costs. I've worked with B2B SaaS clients where a lead costs over $100, which sounds terrifying. But when we calculated their LTV was over $15,000, that $100 CPL suddenly looked like an incredible bargain. They were printing money.
Calculating it isn't as hard as it sounds. You just need three numbers:
1. Average Revenue Per Account (ARPA): How much does a typical customer spend with you per month (or year, if it's a one-off purchase business model)?
2. Gross Margin %: What's your profit margin on that revenue after accounting for the cost of goods sold?
3. Monthly Churn Rate: What percentage of your customers do you lose each month?
The formula is: LTV = (ARPA * Gross Margin %) / Monthly Churn Rate
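If you'd rather sanity-check the numbers yourself before touching the calculator, the formula is trivial to script. A minimal sketch, with the margin and churn expressed as decimal fractions; the figures in the example are purely illustrative, not your real data:

```python
def ltv(arpa_per_month: float, gross_margin: float, monthly_churn: float) -> float:
    """Customer Lifetime Value = (ARPA * Gross Margin %) / Monthly Churn Rate.

    arpa_per_month: average monthly revenue per customer, e.g. 50.0 for £50
    gross_margin:   a fraction, e.g. 0.80 for an 80% margin
    monthly_churn:  a fraction, e.g. 0.05 for losing 5% of customers a month
    """
    return (arpa_per_month * gross_margin) / monthly_churn

# Illustrative example: a £50/month product at 80% margin,
# losing 5% of customers each month.
print(f"LTV: £{ltv(50, 0.80, 0.05):,.2f}")  # LTV: £800.00
```

Notice how sensitive the result is to churn: halve the churn to 2.5% and the LTV doubles to £1,600. Retention improvements often do more for how much you can afford to spend on ads than any tweak to the ads themselves.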
To make this easier, I've built a simple interactive calculator for you below. Play around with the sliders to see how small changes in your business metrics can have a massive impact on the value of each customer, and therefore how much you can afford to spend to get a new one.
Interactive Customer Lifetime Value (LTV) Calculator
We'll need to look at your messaging components...
Now that you have the right structure and measurement in place, we can talk about the actual *content* of your ads. Having a great testing system is useless if the ingredients you're putting into it are bland and uninspired. Your videos are from a trusted messenger, which is a fantastic start – that builds instant authority. But the headlines and text you test alongside them need to be just as powerful.
Don't just test slight variations of the same idea. You need to test fundamentally different psychological angles. Your goal is to find the core message that truly resonates with your ideal customer's deepest pains and desires. Most businesses write copy that describes what their product *is*. You need to write copy that describes what your product *does* for the customer. It's about transformation.
Here are a couple of powerful copywriting frameworks we use for our clients that you should test:
1. Problem-Agitate-Solve (PAS): This is a classic for a reason. It's incredibly effective for audiences who are aware they have a problem but haven't found a solution yet.
-> Problem: Start by calling out the exact pain point they're feeling. Be specific. "Struggling to get consistent leads from your ads?"
-> Agitate: Pour a little salt in the wound. Describe the negative consequences of that problem. "Are you tired of wasting money on campaigns that start strong and then fizzle out, leaving you back at square one?"
-> Solve: Introduce your product or service as the clear, simple solution. "Our 3-step video ad framework turns unpredictable results into a reliable sales machine. Watch the short video to see how."
2. Before-After-Bridge (BAB): This framework is fantastic for selling a desirable outcome or a new opportunity.
-> Before: Paint a picture of their world right now, with all its frustrations. "Your ad account looks like a battlefield. Dozens of confusing campaigns, duplicated ad sets, and no clear idea of what's actually working."
-> After: Paint a picture of the dream outcome. "Imagine logging in and seeing one, single campaign, driving profitable sales day in, day out, automatically."
-> Bridge: Position your offer as the bridge that gets them from the 'Before' state to the 'After' state. "This video reveals the simple campaign structure that makes it possible."
When you're writing your 5 headlines and 5 primary texts to load into your Dynamic Creative ad, don't just write 5 versions of the same thing. Write one using the PAS framework. Write another using BAB. Write one that's purely benefit-driven. Write another that uses a surprising statistic. Write one that asks a provocative question. This gives the algorithm a truly diverse set of messages to test, increasing the odds that it will find a powerful combination that really works.
Here's an example of how you could structure these different angles in a table to keep your testing organised.
| Copy Angle | Example Headline | Example Primary Text |
|---|---|---|
| Problem-Agitate-Solve | Ad Spend Leaking Money? | Tired of campaigns that don't deliver? You're not alone. Most businesses waste 50% of their ad budget on the wrong audience. This is how you fix it... |
| Before-After-Bridge | From Unpredictable to Profitable | Imagine knowing every £1 you put into ads would bring back £5. That's not a dream. Our framework is the bridge to get you there. |
| Direct Benefit | Slash Your Acquisition Costs | For one client, we reduced their cost to acquire a new user from £100 to just £7. Watch this short video from [Messenger Name] to see the simple principles we applied. |
| Social Proof / Authority | The Framework Behind $115k in Sales | This is the exact testing strategy we used to help an online course creator generate $115k in revenue in just six weeks. And you can implement it in the next 10 minutes. |
| Question / Intrigue | Is Your Ad Agency Hiding This? | There's one simple setting inside Ads Manager that can instantly improve performance, but most people don't know about it. Are you one of them? |
I've detailed my main recommendations for you below:
I know that's a lot to take in. It's a fundamental shift from the old way of doing things. To make it a bit more concrete, here is a step-by-step plan that summarises everything we've talked about. This is the exact process we'd use if we were taking over your account today.
| Phase | Action Item | Why It's Important |
|---|---|---|
| Phase 1: Foundation | Calculate Your LTV: Use the calculator and your business data to find your Customer Lifetime Value and target Cost Per Acquisition. | This gives you a clear, data-backed budget for your advertising and tells you what a 'good' result actually is. Without it, you're flying blind. |
| Phase 2: Structure | Build a CBO Campaign: Create ONE new campaign with Campaign Budget Optimisation turned ON. Inside, create ONE broad ad set. | This consolidates your budget and data, giving the algorithm maximum power and flexibility to find the cheapest conversions for you. |
| Phase 3: Creative | Launch a Dynamic Creative Ad: Inside your ad set, create ONE ad and enable the "Dynamic Creative" option. Load all your assets: 3 videos, 5 unique headlines, 5 unique primary texts. | This is the engine of your testing. It lets the system do the heavy lifting of finding the best-performing combinations of your creative assets. |
| Phase 4: Analysis | Wait & Analyse with Breakdown: Let the campaign run for at least 7 days (or until you have ~50 conversions). Then, use the 'Breakdown' -> 'By Dynamic Creative Asset' report. | This is how you get your answers. The data will clearly show you which video, which headlines, and which copy blocks are your top performers. |
| Phase 5: Iterate & Scale | Launch a "Winners" Campaign: Take your single best video and your top 2 headlines/texts. Create a NEW campaign with this proven combination to scale your ad spend confidently. | You're no longer guessing. You're scaling with creative that has been proven by real-world data to be the most effective, which massively de-risks increasing your budget. |
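The Phase 4 analysis can be done by eye in Ads Manager, but if you export the breakdown report it's also easy to rank assets programmatically. A rough sketch of the idea, assuming an export with asset name, spend, and conversions per row; the column layout and all figures here are made up for illustration, not from a real report:

```python
# Hypothetical rows from a 'By Dynamic Creative Asset' style export:
# (asset_name, spend, conversions). All figures are illustrative.
rows = [
    ("Video 1",    120.0,  4),
    ("Video 2",    310.0, 18),
    ("Video 3",     70.0,  2),
    ("Headline 1", 260.0, 15),
    ("Headline 2", 140.0,  5),
]

def cpa(row):
    """Cost per acquisition; assets with zero conversions rank last."""
    _, spend, conversions = row
    return spend / conversions if conversions else float("inf")

# Rank assets by CPA, cheapest acquisitions first.
ranked = sorted(rows, key=cpa)
for name, spend, conv in ranked:
    print(f"{name}: CPA £{spend / conv:.2f} ({conv} conversions)")
```

In practice you'd compare like with like (videos against videos, headlines against headlines), but the principle is the same: the assets with the lowest CPA and the most volume are the "winners" you carry into the Phase 5 scaling campaign.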
This systematic approach takes the guesswork and emotion out of advertising. It turns it into a process. You're not hoping for a lucky break with a 'magic' ad; you're building a machine that consistently identifies and capitalises on what works.
This is obviously just a starting point, and there are many more layers of complexity we could get into, from audience strategy to pixel optimisation and funnel design. But getting your core testing methodology right is the most important first step, and it's where most people go wrong.
Running this process effectively requires a good deal of experience – knowing how to interpret the breakdown data, when to kill an underperforming asset, and how to structure the next iteration of tests. It's a continuous cycle of learning and optimising. This is, of course, where having an expert partner can make a significant difference, helping you navigate the complexities and get to profitability much faster.
I hope this detailed breakdown has been genuinely helpful for you. If you'd like to chat through your specific situation and have us take a look at your ad account, we offer a completely free, no-obligation initial strategy session. It's a chance for us to properly dig in and give you some more tailored advice.
Regards,
Team @ Lukas Holschuh