TL;DR
- Your app install tracking problem isn't just a technical glitch; it's likely a symptom of misaligned campaign objectives and a leaky user journey. Simply "fixing" the numbers won't fix the underlying issue of acquiring valuable users.
- Stop optimising for "App Installs". You're telling Meta to find you the cheapest installs, not the best users. You need to switch to App Event Optimisation (AEO) to target users who will actually complete valuable actions like registering or subscribing.
- The discrepancy between Meta's data and your app store's data is normal. It's caused by different attribution models (e.g., view-through vs. click-through) and the impact of iOS14+. You need to learn to analyse the trends, not chase perfect 1:1 matching.
- You must map and analyse your entire funnel from ad click to a valuable in-app action. We've included an interactive App Funnel Drop-off Calculator in this letter to help you pinpoint exactly where you're losing potential customers.
- A proper technical setup using both the Meta SDK and the App Events API (server-to-server) is non-negotiable for getting the most reliable data possible in a post-iOS14 world. We've included a flowchart to illustrate how this works.
Hi there,
Thanks for reaching out!
I've had a look over your situation with tracking app installs from your Meta ads. It's a really common headache, so you're definitely not alone. A lot of people get bogged down in the data discrepancies and assume the platform is just broken, but the truth is usually a bit more complicated and often points to some bigger opportunities for your campaigns.
So, I'm happy to give you some initial thoughts and guidance based on what we've seen work for our app clients. The key is to shift your thinking from just "tracking installs" to "acquiring valuable users". Once you do that, a lot of these data issues become less of a problem and more of a signpost telling you where to optimise.
Let's debunk the 'tracking is just broken' myth...
First things first, let's get the technical side sorted. Tbh, when people say Meta's tracking is "off", what they usually mean is their setup isn't robust enough for how mobile advertising works today, especially with Apple's privacy changes.
You can't just rely on one method. You need a combination of tools to get the clearest possible picture. The three main pillars are:
1. The Meta SDK (Software Development Kit): This is a bit of code you integrate directly into your app. It's the primary way your app communicates with Meta, sending data about in-app events (like 'install', 'registration completed', 'purchase', etc.) directly from the user's device. It’s pretty reliable for Android and for iOS users who opt-in to tracking.
2. The App Events API (Conversions API or CAPI): This is a server-to-server connection. Instead of the data coming from the user's phone, it comes directly from your server to Meta's server. Why is this so important? Because it's not affected by things like ad blockers or iOS privacy settings in the same way the SDK is. It's your backup and your source of truth. It helps fill in the gaps, especially for iOS users. A lot of businesses neglect this, and it's a costly mistake.
3. Apple's SKAdNetwork (SKAN): For iOS 14.5+ users who opt out of tracking, this is Apple's own privacy-friendly framework. It sends install attribution data back to ad networks like Meta, but it's aggregated, delayed, and limited. You don't get rich, user-level data. You get a signal that says "an install happened that can be attributed to this campaign," but not much more. It's clunky, but it's what we have to work with for a large chunk of iOS traffic.
When you have all three working together, you create a more resilient tracking system. The SDK catches what it can, the App Events API fills in the blanks, and SKAN gives you a baseline for opted-out iOS users. If you're only using the SDK, you're flying half-blind. The data discrepancy you're seeing isn't an error; it's the bit of the iceberg that's underwater, the part your current setup can't see.
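To make the server-to-server idea a bit more concrete, here's a minimal Python sketch of the kind of payload the App Events API expects for a custom app event. The endpoint, field names, and event name here follow Meta's Graph API conventions but are illustrative assumptions; verify them against the current App Events API reference before shipping anything.

```python
import json
import time

# Hypothetical endpoint, based on Meta's Graph API conventions.
GRAPH_URL = "https://graph.facebook.com/v19.0/{app_id}/activities"

def build_app_event_payload(event_name, advertiser_id, tracking_enabled=True):
    """Build the POST body for a single server-side custom app event."""
    custom_event = {
        "_eventName": event_name,      # e.g. "fb_mobile_complete_registration"
        "_logTime": int(time.time()),  # Unix timestamp when the event occurred
    }
    return {
        "event": "CUSTOM_APP_EVENTS",
        "advertiser_id": advertiser_id,  # device advertising ID (IDFA/GAID)
        "advertiser_tracking_enabled": 1 if tracking_enabled else 0,
        "application_tracking_enabled": 1 if tracking_enabled else 0,
        "custom_events": json.dumps([custom_event]),
    }

# Example: your server logs a completed registration and forwards it to Meta.
payload = build_app_event_payload(
    "fb_mobile_complete_registration",
    advertiser_id="00000000-0000-0000-0000-000000000000",  # placeholder ID
)
```

Your server would POST this payload (with an access token) to the Graph API; because it never touches the user's browser or device settings, it survives ad blockers and most iOS privacy restrictions.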
To make it clearer, here is how the data flows through these systems. You can see how having both the SDK and the API creates two separate paths for data to reach Meta, making it much more reliable.
[Flowchart: the Meta SDK sends data from the device (iOS/Android) to the Meta Platform, while the App Events API sends data from your server, giving two independent paths.]
You're probably looking at the wrong numbers...
Even with a perfect technical setup, the number of installs Meta reports will almost never match what you see in App Store Connect or the Google Play Console. This drives founders crazy, but it's completely normal. You need to stop trying to make the numbers match perfectly and start understanding why they're different.
The main culprit is attribution: Meta and the app stores simply measure installs differently.
- Meta Ads Manager uses a default 7-day click, 1-day view attribution window. This means if someone clicks your ad and installs within 7 days, OR sees your ad (without clicking) and installs within 1 day, Meta counts it as a conversion. They are trying to capture the full influence of your ad.
- App Store Connect / Google Play Console typically use a last-click model over a longer period (e.g., 30 days). They only care about the final touchpoint that led the user to the store. They have no idea if a user saw your Meta ad a day before.
So, you have a situation where Meta might claim an install because someone saw your video ad yesterday, then searched for your app today and installed. Meta says, "Our ad influenced that!" The App Store says, "That was an organic search install." Both are technically correct from their own perspectives.
The key isn't to reconcile every single install. It's to look at the trends. Is Meta reporting a steady increase in installs while the app store shows a flat line? Then you might have a problem. Are both trending up together, even if the absolute numbers are off by 20-30%? That's what success looks like. The discrepancy is just the cost of doing business in a multi-channel world.
Here’s a typical view of what this discrepancy looks like over a week. Notice how the shape of the trend is similar, but the numbers are consistently different. Meta's view-through attribution often inflates the numbers compared to the store's stricter click-based model.
I'd say your campaign objective is part of the problem...
Now, let's get to the real meat of the issue. You could have the most perfectly accurate tracking in the world, but if you're telling Meta to chase the wrong goal, you're just getting perfect data on a failing strategy. I'd wager you're running "App Install" campaigns (MAI). This is one of the biggest and most common mistakes I see.
When you set your campaign objective to "App Installs," you give Meta's algorithm a very simple command: "Find me the cheapest possible people who will install this app."
And the algorithm, being the ruthlessly efficient machine it is, does exactly that. It finds users who are prone to downloading apps but may never open them. They are not in demand by other advertisers for more valuable actions (like making a purchase), so their attention is cheap. You are actively paying Meta to find you the lowest-quality audience for your product.
The solution is to change your objective. You need to run App Event Optimisation (AEO) campaigns. With AEO, you tell Meta to optimise for a specific action that happens after the install. This could be:
- CompleteRegistration
- StartTrial
- Subscribe
- AddToCart (for e-commerce apps)
- Purchase
By doing this, you're now asking the algorithm a much better question: "Find me people who are not just likely to install, but also likely to complete a registration." This is a completely different audience. They're more expensive to reach, yes, but they're infinitely more valuable to your business.
I remember one of our clients, an e-learning app, was getting installs for under £1. They were thrilled. But their user activation rate was terrible. Hardly anyone completed the first module. We switched their campaigns from MAI to AEO, optimising for 'ModuleComplete'. The Cost Per Install shot up to £3. But the cost to get a user to complete the first module dropped by over 70%. Their business started growing properly for the first time. That's the difference.
Look at this comparison. It's a simplified model, but it shows the typical trade-off you see when shifting from MAI to AEO.
| Metric | Campaign Type: App Installs (MAI) | Campaign Type: App Event Optimisation (AEO) |
|---|---|---|
| Primary Goal | Lowest Cost Per Install | Lowest Cost Per Key In-App Action |
| Ad Spend | £1,000 | £1,000 |
| Reported Installs | 1,000 | 400 |
| Cost Per Install (CPI) | £1.00 | £2.50 |
| Users Who Register | 100 (10% activation rate) | 200 (50% activation rate) |
| Effective Cost Per Registration | £10.00 | £5.00 |
The MAI campaign looks better if you only look at the Cost Per Install. But the AEO campaign is twice as efficient at achieving the actual business goal: getting active, registered users.
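The arithmetic behind that table is simple enough to sanity-check with your own numbers:

```python
def effective_cost_per_registration(spend, installs, activation_rate):
    """True cost of a registered user: spend divided by registrations."""
    registrations = installs * activation_rate
    return spend / registrations

# Numbers from the table above
mai = effective_cost_per_registration(1000, 1000, 0.10)  # £10.00 per registration
aeo = effective_cost_per_registration(1000, 400, 0.50)   # £5.00 per registration
```

Swap in your own spend, install count, and activation rate; if your effective cost per registration on MAI is higher than your AEO estimate, the "cheap" installs are costing you.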
We'll need to look at your entire user journey, not just the click...
Fixing your tracking and campaign objectives is a big step. But the real gains come from looking at the entire process from the user's perspective. Your ad campaigns don't exist in a vacuum. They're just the front door. If the rest of the house is a mess, it doesn't matter how many people you get through the door.
You need to map out and measure every single step of the funnel:
Ad Click → App Store Page View → App Install → App Open → Onboarding Completed → Registration → First Key Action
A drop-off at any of these stages is costing you money. The problem you describe as a "tracking discrepancy" could actually be a massive leak in your funnel that you're just not seeing.
- Big drop from Clicks to Page Views? Your ad creative might be setting the wrong expectations, or your deep link might be broken. People click, but they never even make it to the app store.
- Big drop from Page Views to Installs? This is a classic app store optimisation (ASO) problem. Your app icon, screenshots, video preview, description, or reviews are putting people off. They get to your store page and think, "nah."
- Big drop from Installs to Opens/Registrations? This is an onboarding issue. The first-time user experience is confusing, slow, buggy, or asks for too much information upfront. You've got them to install the app, and you're fumbling the introduction.
You need to diagnose where your biggest leak is and focus all your energy there first. To help, I've put together a little interactive calculator. Plug in your own numbers from your analytics and see where your funnel is breaking down. This will be far more valuable than trying to match Meta's install numbers to the app store.
[Interactive calculator: Your Funnel Performance]
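If you'd rather run the numbers yourself, the calculator's core logic is roughly this. The stage counts below are made-up example figures, so substitute your own analytics data:

```python
# Funnel stages in order, with illustrative example counts
funnel = [
    ("Ad Click", 10000),
    ("App Store Page View", 7000),
    ("App Install", 2800),
    ("App Open", 2100),
    ("Onboarding Completed", 1200),
    ("Registration", 600),
]

def biggest_dropoff(stages):
    """Return (from_stage, to_stage, retention) for the worst step."""
    worst = None
    for (name_a, count_a), (name_b, count_b) in zip(stages, stages[1:]):
        retention = count_b / count_a
        if worst is None or retention < worst[2]:
            worst = (name_a, name_b, retention)
    return worst

from_stage, to_stage, retention = biggest_dropoff(funnel)
print(f"Biggest leak: {from_stage} -> {to_stage} ({retention:.0%} retained)")
```

With these example numbers, the worst step is App Store Page View to App Install (40% retained), which would point you at an ASO problem rather than an ad problem.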
You'll need a solid testing structure to find what actually works...
Once your tracking is solid, your objective is right, and your funnel is plugged, then it's time to talk about the ads themselves. You need a methodical way to test audiences and creatives to constantly improve performance. Just "boosting posts" or running one ad set won't cut it.
For audiences, we usually prioritise them in an order that reflects user intent. For an app, it would look something like this:
- Retargeting (BoFu - Bottom of Funnel): People who installed but didn't register, or registered but didn't subscribe. These are your warmest leads. You need dedicated campaigns to bring them back.
- Lookalike Audiences (ToFu - Top of Funnel): This is where the scale comes from. Create lookalikes based on your most valuable users. Don't just make a lookalike of "all installs". Create one from a source list of your paying subscribers, or users who have been active for 30+ days. This tells Meta to find more people like your best customers, not just any customer. This is a subtle but hugely powerful shift. We've seen this alone cut acquisition costs in half for clients.
- Detailed Targeting (ToFu): This is where you test interests and behaviours. Think about what other apps, tools, or media your ideal customer uses. Target those. But this should always be tested against your high-quality lookalikes.
For creative, you've got to be relentless. We've worked on campaigns for SaaS apps where we saw UGC (User-Generated Content) style videos outperform slick, polished studio productions by a mile. People respond to authenticity. Test different hooks in the first 3 seconds, different calls to action, different formats (video vs. image vs. carousel). One of our software clients saw a drop from a £100 CPA to just £7 after we implemented a rigorous testing structure for both their audiences and their ad creative. The initial problem looked like "bad leads from Google", but the solution was a complete overhaul of their testing methodology on Meta.
The goal is to build a system where you are always learning and always have a new "champion" ad ready to go when your current best-performer starts to fatigue. Without that system, any success you have will be short-lived.
This is the main advice I have for you:
To pull all of this together, here’s a summary of the steps I’d recommend you take. This isn't a quick fix, but a systematic approach to building a scalable and profitable user acquisition engine for your app.
| Problem Area | Recommended Action | Expected Outcome |
|---|---|---|
| 1. Incomplete Tracking | Implement both the Meta SDK and the App Events API (server-to-server). Ensure all key in-app events (registration, purchase, etc.) are configured to fire correctly. | More reliable and complete data, enabling better optimisation and filling the gaps left by iOS14+. |
| 2. Data Misinterpretation | Stop trying to perfectly reconcile Meta and App Store numbers. Instead, analyse directional trends and accept a 20-30% variance as normal due to different attribution models. | Reduced frustration and a clearer understanding of overall campaign performance and impact. |
| 3. Wrong Campaign Objective | Pause "App Install" (MAI) campaigns. Shift budget to "App Event Optimisation" (AEO) campaigns, optimising for a valuable post-install action like 'Registration'. | Higher quality user acquisition. Your Cost Per Install will rise, but your cost for a valuable user will fall significantly. |
| 4. Leaky User Funnel | Use analytics to map your full funnel (Click -> Store View -> Install -> Open -> Register). Identify the single biggest drop-off point and focus on fixing it (e.g., improve app store page, simplify onboarding). | Improved overall conversion rate, making every ad pound you spend more effective and lowering your true customer acquisition cost. |
| 5. Ad-hoc Testing | Create a structured testing plan. Prioritise Lookalike audiences of your best users (e.g., subscribers) and systematically test different creative angles (especially UGC-style video). | Consistent performance improvements and the ability to scale your campaigns without performance degrading. |
As you can probably see, solving this goes quite a bit deeper than just fixing a tracking number. It involves a strategic shift in how you approach user acquisition, touching on your tech setup, your campaign strategy, your user experience, and your creative process.
Getting all these pieces right and keeping them optimised takes a lot of time and very specific expertise. It's not something you can just set and forget. This is why many app developers decide to work with a specialist agency. We've been through this process with dozens of apps, like the one we took to 45k+ signups, and we've developed the systems to diagnose these issues quickly and implement the solutions effectively.
If you'd like to have a chat about your specific situation, we offer a free, no-obligation 20-minute strategy session where we can look at your ad account and app funnel together and give you some more tailored advice. It might be a good next step to get some clarity on how to move forward.
Hope this helps!
Regards,
Team @ Lukas Holschuh