Hi there,
Thanks for your enquiry! It's a really common point of confusion, especially with all the conflicting advice out there. It's good that you're questioning it, because that advice you heard about picking creatives with the highest CPM/CPC is, to be blunt, completely backwards and will likely lead you to burn through your budget.
I'm happy to give you some of my thoughts on this. It really comes down to focusing on the metrics that actually drive business results, not vanity metrics or misleading indicators. Let's clear this up for you.
TL;DR:
- The advice to choose a "winning creative" based on the highest CPM or CPC is wrong. These are cost metrics, not performance indicators. You want the lowest cost for the desired action.
- CPM (Cost Per 1,000 Impressions) is an auction outcome, influenced by your audience's competitiveness, not a direct measure of your creative's quality. A high CPM can simply mean you're targeting a valuable (and expensive) audience.
- The only true measure of a "winning creative" is its ability to generate your desired outcome—sales, leads, signups—at the lowest possible Cost Per Action (CPA) or highest Return On Ad Spend (ROAS).
- Stop using Engagement campaigns to test creatives for conversion goals. You're training the algorithm to find people who like and share, not people who buy. Test your creatives in a conversion campaign from day one.
- This letter includes a diagnostic flowchart to help you analyse creative performance and a fully functional Creative KPI Calculator to help you measure the metrics that truly matter.
Let's dismantle that 'highest CPM' advice... it's mostly nonsense.
I want to tackle this head-on because it’s a piece of "guru" advice that sounds clever but falls apart under any real-world scrutiny. Your intuition is spot on: why would you want to pay *more* for something if you can pay less? You wouldn't.
Your understanding that CPM fluctuates based on supply and demand for ad placements is exactly right. Think of CPM as the 'rent' you pay for the digital real estate to show your ad to 1,000 people. That rent is determined by a few key factors:
- Audience competition: Are you trying to target high-value B2B decision-makers or a very popular interest group? If so, lots of other advertisers are too, which drives up the auction price (and your CPM). This isn't a bad thing; it just means you're fishing in a well-stocked (and crowded) pond.
- Seasonality: Trying to advertise during Black Friday or Christmas? Your CPM will skyrocket because every retailer on the planet is bidding for attention.
- Ad quality & relevance: This is where it gets nuanced. Meta does give a slight edge to ads that are getting good engagement, which can *lower* your CPM a bit. But this effect is minor compared to the auction pressure from your audience choice.
So, a high CPM doesn't indicate a "winning creative." It most often indicates you're targeting a competitive, and potentially very valuable, audience. A low CPM could mean you've found an untapped niche, or it could mean you're targeting an audience nobody else wants because they don't buy anything. On its own, CPM tells you almost nothing about whether your ad creative is any good. Judging a creative by its CPM is like judging a car's performance by the price of petrol in the area. The two are related, but one is not a reliable indicator of the other's quality.
The same logic applies to CPC (Cost Per Click). CPC is a result of your CPM and your Click-Through Rate (CTR). A high CPC is not a sign of a winner; it's a sign you're paying a lot for each click, which is generally something you want to avoid.
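That relationship is mechanical: if CPM is the cost per 1,000 impressions and CTR is the fraction of impressions that click, then CPC = CPM / (1,000 × CTR). A quick sketch with illustrative numbers (not from any real account) shows why a high CPC usually signals a weak creative rather than a winning one:

```python
def cpc(cpm: float, ctr: float) -> float:
    """Cost per click implied by a given CPM and click-through rate.

    cost = (impressions / 1000) * cpm and clicks = impressions * ctr,
    so cpc = cost / clicks = cpm / (1000 * ctr).
    """
    return cpm / (1000 * ctr)

# Two hypothetical ads paying the same £12 CPM in the same auction:
print(f"Strong creative (2.0% CTR): £{cpc(12.0, 0.02):.2f} per click")   # £0.60
print(f"Weak creative   (0.5% CTR): £{cpc(12.0, 0.005):.2f} per click")  # £2.40
```

Same audience, same 'rent' for the impressions — the creative with the weaker CTR simply pays four times more per click. The CPC went up because the creative got *worse*, not better.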
You're using the wrong tool: why testing in an engagement campaign is a fundamental error.
This is probably the most important bit of advice I can give you. When you run a campaign with the objective set to "Engagement," you are giving Meta's algorithm a very specific, and very literal, instruction: "Go and find me the people inside my target audience who are most likely to like, comment, or share this post, for the lowest possible cost."
The algorithm is incredibly good at its job. It will dutifully ignore the people who might actually buy your product and instead seek out the 'click-happy' users. These users' attention is cheap precisely because they don't convert, so other advertisers aren't bidding as much for them. You are actively paying Meta to find you an audience of non-customers.
I've seen this in so many accounts I've audited. A creative gets thousands of likes in an engagement test, the advertiser gets excited, and then they move it to a conversion campaign where it completely bombs. Why? Because the creative was optimised to appeal to people who engage, not people who purchase. The two groups are often very, very different.
The only way to find a creative that drives conversions is to test it in a campaign that optimises for conversions. You need to tell the algorithm from the start, "My goal is a purchase" or "My goal is a lead." The system will then show your ads to the subset of users within your audience that its data suggests are most likely to actually take that action. The creative that performs best with *that* high-intent audience is your true winner.
So, what should you actually be looking at? A better framework for analysing performance.
Instead of getting tangled up in misleading metrics, you need a clear hierarchy of what to look at. Think of it like a doctor diagnosing a patient. You start with the most important vital sign, and only then do you look at secondary indicators to understand the 'why'.
Your #1 Metric: The Outcome.
This is your North Star. It's either your Cost Per Action (CPA) or your Return On Ad Spend (ROAS). How much did you pay to get one sale? How much revenue did you get back for every pound you spent? Nothing else matters more than this. A creative could have a terrible CTR and a high CPC, but if it's bringing in sales at a profitable CPA, it's a winner. End of story.
Your Diagnostic Metrics: The 'Why'.
If your CPA is too high or your ROAS is too low, you then look at these secondary metrics to figure out what's broken.
- Click-Through Rate (CTR): This measures what percentage of people who see your ad actually click on it. It's a great indicator of how well your creative is grabbing attention and resonating with your audience. A low CTR often means your image/video or your headline isn't compelling enough.
- Conversion Rate (CVR): This measures what percentage of people who *click* your ad go on to convert on your website. This isolates the performance of your landing page. If you have a high CTR but a low CVR, it suggests your ad is making a promise that your landing page isn't keeping.
This diagnostic process is crucial. You can't just look at one number in isolation. The relationship between them tells the full story. I've mapped this out for you in a flowchart to make it clearer.
1. Launch your creative test in a conversion campaign.
2. Analyse CPA / ROAS for each creative.
   - CPA on target (or ROAS profitable)? Scale the winning creative.
   - CPA too high? Diagnose the issue: check the CTR.
     - CTR healthy but CPA still high? Your CVR is likely low. Improve page copy, offer, or UX.
     - CTR low? Your ad isn't stopping the scroll. Test new visuals, hooks, or copy.
Here's a better way to structure your creative tests.
So, how do you put this into practice? You need a simple, repeatable structure for testing that gives the algorithm the best chance to find winners. Forget complex setups with dozens of ad sets. Simplicity is your friend.
I usually recommend a structure like this for new accounts or new tests:
| Level | Setup & Settings | Purpose |
|---|---|---|
| Campaign | Objective: Conversions (e.g., Sales or Leads). Budget: CBO (Campaign Budget Optimisation) enabled. | Sets the overall goal for the algorithm and allows Meta to automatically allocate budget to the best-performing audience. |
| Ad Set 1 | Audience: Broad targeting OR your best interest-based audience. | Tests your creatives against a cold audience to find what resonates with new potential customers. |
| Ad Set 2 | Audience: Retargeting (e.g., website visitors in the last 30 days). | Tests your creatives against a warm audience that already knows you. Different messaging might be needed here. |
| Ads (within each Ad Set) | Ad 1: Video creative. Ad 2: Carousel creative. Ad 3: Static image creative. Ad 4: User-generated content (UGC) style creative. | This is where you place your 3-5 different creative 'hypotheses'. Make them meaningfully different from each other. |
With this setup, you let the campaign run until each ad has had a fair chance to perform (I usually suggest letting it spend at least 2x your target CPA per ad). Then, you go in and analyse the results using the flowchart logic. Turn off the clear losers, and if you have a clear winner, you can either let it keep running or move it into its own "scaling" campaign. The key is you are making all your decisions based on the CPA or ROAS, not the CPM.
To help you with the maths and get comfortable with these KPIs, I've built a small calculator. You can plug in the numbers directly from your Ads Manager to see all the key metrics in one place and start to build an intuition for how they relate to each other.
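Here's a minimal sketch of that calculator in Python, so you can see exactly how the metrics are derived. The input numbers below are made up for illustration; in practice you'd plug in your own spend, impressions, clicks, conversions, and revenue from Ads Manager.

```python
def creative_kpis(spend, impressions, clicks, conversions, revenue=0.0):
    """Derive the key metrics for one creative from raw Ads Manager numbers."""
    return {
        "CPM":  spend / impressions * 1000 if impressions else None,  # cost per 1,000 impressions
        "CPC":  spend / clicks if clicks else None,                   # cost per click
        "CTR":  clicks / impressions if impressions else None,        # click-through rate
        "CVR":  conversions / clicks if clicks else None,             # post-click conversion rate
        "CPA":  spend / conversions if conversions else None,         # cost per action -- your North Star
        "ROAS": revenue / spend if spend else None,                   # return on ad spend
    }

# Hypothetical creative: £500 spend, 40,000 impressions, 800 clicks,
# 20 sales worth £1,500 in total revenue.
kpis = creative_kpis(spend=500, impressions=40_000, clicks=800,
                     conversions=20, revenue=1_500)
for name, value in kpis.items():
    print(f"{name}: {value:.3f}" if value is not None else f"{name}: n/a")
```

Running those example numbers gives a £25 CPA and a 3.0 ROAS — and notice that the £12.50 CPM and £0.63 CPC tell you nothing about whether that's profitable. Only the CPA/ROAS answers that.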
I've detailed my main recommendations for you below:
To wrap this all up, here is the simple, actionable advice I have for you. If you follow this framework, you'll be far ahead of most people starting out and will avoid wasting money chasing the wrong metrics.
| Principle | Actionable Advice |
|---|---|
| Focus on the Right Objective | Never use an 'Engagement' campaign to test creatives intended for sales or leads. Always test within a 'Conversion' campaign optimised for your true business goal. This is non-negotiable. |
| Identify Your North Star KPI | Your success is measured by Cost Per Action (CPA) or Return On Ad Spend (ROAS). This is the first and last metric you should look at to decide if a creative is a "winner". |
| Use Other Metrics for Diagnosis Only | Treat CPM and CPC as indicators of auction cost, not creative quality. Use CTR and CVR to diagnose *why* a creative's CPA is good or bad, following the flowchart logic. |
| Test Systematically | Use a simple, repeatable testing structure (like the one outlined above). Test 3-5 distinct creative ideas at a time and give them enough budget to prove themselves before making a decision. |
Getting this right isn't just about setting up an ad; it's about understanding the machine you're operating, speaking its language, and making decisions based on data that actually impacts your bottom line. It's a skill, and like any skill, it takes time to develop. Many business owners find that navigating these nuances and optimising campaigns effectively takes up too much of their time and focus.
That's where having an expert partner can make a huge difference. We spend all day, every day inside ad accounts, running these tests, analysing the data, and turning those insights into profitable campaigns for our clients. It allows them to focus on running their business, confident that their ad spend is being managed for maximum results.
If you'd like to have a chat and walk through your own specific situation, we offer a completely free, no-obligation initial consultation. We can take a look at what you're planning and give you some direct, actionable feedback. Feel free to get in touch if that sounds helpful.
Hope this helps!
Regards,
Team @ Lukas Holschuh