A recurring theme I hear among paid social pros is the measurement discrepancy between Facebook and Google Analytics.
People seem to have a really hard time matching up what’s happening with campaigns on Facebook and subsequent engagement, as measured by their analytics platform of choice (including – but not limited to – Google Analytics).
Some people go as far as claiming that this is systematic click fraud perpetrated by Facebook. This is highly unlikely, for all kinds of obvious reasons. However, there is a clear problem.
The good news is that you can do something about it.
Third party tools use different measurement techniques, so aligning two platforms perfectly is pretty much impossible. But you can get much closer.
If your data looks way off, it’s probably down to at least one of the following reasons. Check and double-check each one to see if you can narrow the measurement gap.
Do let me know if there are other reasons for the discrepancies in the comments section below, or if any of these things help you.
1) Your measurement window is screwy
By default, Facebook attributes a conversion if it happens within one day of someone viewing your ad, or within 28 days of someone clicking it. If this isn’t what you want to track, then you should change your settings.
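To see how much the window choice can move your numbers, here’s a minimal sketch (entirely hypothetical data and helper names, not Facebook’s actual logic) that counts the same conversions under two different window settings:

```python
from datetime import datetime, timedelta

def count_conversions(touches, conversions, click_window_days=28, view_window_days=1):
    """Count conversions that fall inside Facebook-style attribution windows.

    touches: list of (timestamp, kind) where kind is 'click' or 'view'
    conversions: list of conversion timestamps
    """
    counted = 0
    for conv in conversions:
        for ts, kind in touches:
            window = timedelta(days=click_window_days if kind == 'click' else view_window_days)
            if ts <= conv <= ts + window:
                counted += 1
                break  # attribute each conversion at most once
    return counted

touches = [(datetime(2016, 3, 1), 'click'), (datetime(2016, 3, 10), 'view')]
conversions = [datetime(2016, 3, 12), datetime(2016, 3, 20)]

# Default-style 28-day click / 1-day view window
print(count_conversions(touches, conversions))  # → 2
# Tighten the click window to one day and both conversions vanish
print(count_conversions(touches, conversions, click_window_days=1))  # → 0
```

Same touches, same conversions, very different totals – which is exactly the kind of gap you’ll see between two platforms using different windows.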
2) You’re measuring conversions on a last click basis
This isn’t how Facebook does it, so you’re measuring different things.
Facebook says:
“Third-party platforms capture conversions that resulted from a direct referral. Even if you created a campaign using URL parameters to track the link used in your ad, third parties identify those last-clicks, or linear conversions, where the user clicked an ad and immediately converted.”
Facebook aside, it’s definitely time to start attributing performance across channels, rather than focusing on the last click in the user journey.
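The difference between the two models is easy to show with a toy example. This sketch (hypothetical journey, illustrative helper names) splits credit for one conversion under last-click versus a simple linear model:

```python
from collections import Counter

# A hypothetical user journey: the channels touched before converting
journey = ['facebook', 'email', 'organic']

def last_click(journey):
    """Last-click attribution: the final touchpoint gets 100% of the credit."""
    return {journey[-1]: 1.0}

def linear(journey):
    """Linear attribution: credit is split evenly across all touchpoints."""
    n = len(journey)
    return {channel: count / n for channel, count in Counter(journey).items()}

print(last_click(journey))  # → {'organic': 1.0} – Facebook gets nothing
print(linear(journey))      # each channel gets a third of the credit
```

A last-click report credits organic search with the whole conversion, while Facebook (which counts any qualifying ad touch within its window) claims it too – so the two reports disagree by design, not by error.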
3) You have pixel implementation issues
Perhaps your tech setup isn’t working properly? Facebook points to the following possible issues and tests to try:
- Check whether your raw pixel fire counts match across platforms.
- If you’re using conditional firing, expect a small discrepancy.
- Check the currency, decimal places and other variables in your purchase event codes.
- Use the Pixel Helper tool to check for duplicate pixel fires.
- Check that you’re not “pixel piggybacking on existing tag managers”.
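As a rough stand-in for what the Pixel Helper checks, here’s a sketch (hypothetical HTML, illustrative function name) that scans a page’s source for duplicate `fbq('init', ...)` calls – the pixel base code being pasted twice is a classic cause of double-counted events:

```python
import re

def find_pixel_inits(html):
    """Return the pixel IDs passed to fbq('init', ...) in a page's source.

    The same ID appearing more than once suggests the base code is
    duplicated, which can lead to events firing twice.
    """
    return re.findall(r"fbq\(\s*['\"]init['\"]\s*,\s*['\"](\d+)['\"]", html)

html = """
<script>fbq('init', '1234567890');</script>
<script>fbq('init', '1234567890');</script>  <!-- pasted twice by mistake -->
"""

print(find_pixel_inits(html))  # → ['1234567890', '1234567890'] – duplicate base code
```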
4) You’re comparing clicks with apples
Total ‘clicks’ as reported by Facebook includes a whole bunch of stuff that you may not be aware of. You might think a click is all about the link in the ad, but you’d be wrong.
For example, alongside the links in your ads, Facebook clicks include things like expanding descriptions, clicking to read comments, page likes, post likes, comments and shares. There are others too.
As such, focusing on clicks is perhaps not the best measure of direct engagement and campaign performance.
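A quick bit of arithmetic (with made-up numbers) shows how much the click definition can inflate an apparent click-through rate:

```python
def ctr(clicks, impressions):
    """Click-through rate as a fraction of impressions."""
    return clicks / impressions

impressions = 10_000
all_clicks = 300    # links + likes + comments + shares + expands (hypothetical)
link_clicks = 120   # clicks on the ad's actual link only (hypothetical)

print(f"All-clicks CTR: {ctr(all_clicks, impressions):.2%}")   # → 3.00%
print(f"Link-click CTR: {ctr(link_clicks, impressions):.2%}")  # → 1.20%
```

Only the link clicks ever stand a chance of showing up in your analytics platform, so comparing the all-clicks figure against analytics sessions guarantees a gap.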
5) You’ve got cross-device tracking issues
Conversions on mobile devices aren’t always tracked by cookie-based analytics platforms. Mobile apps and browsers often refuse to cooperate with one another when it comes to cookies.
Facebook provides ‘cross-device reports’ that help improve visibility in this area, but there are always likely to be some issues.
6) You’ve got Google (or other) Analytics issues
It’s always worth sense checking your analytics setup whenever discrepancies show up. Is everything working as it should be?
Sometimes a simple website change that nobody told you about can lead to a savage skewing of your numbers. Advertising is occasionally the canary in the coal mine.
7) You’re not tracking your links properly
You’re probably already using Google’s URL builder to help track campaigns. If you’re not, then take a look at how it works.
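For reference, this is roughly what the URL builder produces – a sketch using only the standard library, with hypothetical parameter values:

```python
from urllib.parse import urlencode

def build_campaign_url(base_url, source, medium, campaign):
    """Append Google Analytics UTM parameters to a landing-page URL."""
    params = {
        'utm_source': source,      # where the traffic comes from
        'utm_medium': medium,      # the marketing medium
        'utm_campaign': campaign,  # the campaign name
    }
    return f"{base_url}?{urlencode(params)}"

url = build_campaign_url('https://example.com/offer',
                         source='facebook', medium='paidsocial',
                         campaign='spring_sale')
print(url)
# → https://example.com/offer?utm_source=facebook&utm_medium=paidsocial&utm_campaign=spring_sale
```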
It’s worth adding that I share these butt-ugly links on a daily basis on channels they’re not attributed to, so this is also an imperfect solution.
For example, people post links on Twitter that are often appended with 100+ characters of tracking code. This is all very well, until you see ‘medium=email’ or some other non-social campaign identifier in there.
The starting point in the above example appears to be a daily email newsletter. The end point, as far as my click is concerned, was Twitter.
As such I’m increasingly questioning how these links are actually attributed. We need to understand how these links are shared, and to stop overstating the performance of email (and similar ‘campaign’ channels).
Supersized doses of salt are needed when tracking performance in this manner. The smartest people in the room take a wider view on attribution.
8) You’ve got a slow-ass website
If your site takes an age to power up then you’re going to lose people between the ad click and the page loading. The longer it takes, the more people will bail out.
If your analytics tags don’t fire immediately then there’s going to be a reporting discrepancy.
In addition, some users might click on an ad twice if all they see is a blank loading page the first time around (and then press the back button, and try again). Double trouble.
9) Your users are disabling JavaScript en masse
Ok, so they’re probably not, but this will account for some of the measurement gap.
A couple of years ago the UK’s Government Digital Service (GDS) found that 1.1% of web users weren’t running JavaScript, whether by accident or design. Since most analytics platforms rely on JavaScript, this could account for a small discrepancy.
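Some rough arithmetic (hypothetical click volume, GDS figure as quoted above) puts a number on that gap:

```python
ad_clicks = 50_000   # clicks the ad platform reports (hypothetical)
no_js_rate = 0.011   # GDS figure: ~1.1% of users not running JavaScript

# Clicks a JavaScript-based analytics tag will simply never see
invisible_to_analytics = ad_clicks * no_js_rate
print(int(invisible_to_analytics))  # → 550
```

A 1.1% shortfall won’t explain a huge discrepancy on its own, but it does mean the two platforms can never match exactly.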
10) Your time settings are not in sync
You might want to check that Facebook and your analytics platform are both set to the same time zone and reporting period.
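A mismatched time zone quietly shifts conversions across the day (and month) boundary. This sketch (hypothetical timestamps; `zoneinfo` requires Python 3.9+) shows one conversion landing in different months depending on the zone:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

# Hypothetical: ad account set to US Pacific time, analytics set to UTC
conversion_pacific = datetime(2016, 3, 31, 23, 30,
                              tzinfo=ZoneInfo('America/Los_Angeles'))
conversion_utc = conversion_pacific.astimezone(timezone.utc)

print(conversion_pacific.date())  # 2016-03-31 – counted in March by one platform
print(conversion_utc.date())      # 2016-04-01 – counted in April by the other
```

Nothing is broken here; the same event is simply bucketed into different reporting periods by the two tools.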
11) You’re using filters
Analytics tools allow you to ignore certain data. Make sure that you’re not excluding a large chunk of people who engage with your advertising, whether that’s on Facebook or elsewhere.
What else is worth testing? Any other comments or suggestions?
via Search Engine Watch http://ift.tt/1TnEDpB