Advanced analytics techniques to measure PPC performance

I got the opportunity to present at SMX Advanced 2024 on a topic I live and breathe every week with my agency’s clients: advanced PPC analytics and measurement.

I’ve been in digital marketing for over 20 years, and in my (rather biased) opinion, intelligent analytics have never been more of a differentiating factor in marketing campaigns.

Privacy-focused regulations have made measurement more complicated, and AI has stripped marketers of many of the manual levers they once used to uplevel performance within channels. Meanwhile, the proliferation of channel options means that seeing the whole picture is imperative.

With that said, if you didn’t catch the presentation or don’t have time to watch the recording, here are the five main takeaways:

  • Consent Mode v2 puts even more control in the hands of users.
  • The shift from observed to modeled data is at hand.
  • Back-end data implementation is crucial.
  • It’s time to embrace non-cookie measurement options.
  • Proxy metrics can fill gaps when data is scarce.

1. Consent Mode v2 puts even more control in users’ hands

In 2012, enforcement of the ePrivacy Directive (a European Union cookie law) began requiring websites to obtain consent from visitors before storing or accessing information on their devices. That was the precursor to 2018’s watershed GDPR and, to a lesser extent, CCPA privacy regulations.

In 2020, Google introduced Consent Mode v1, which allowed website owners to adjust the behavior of their Google tags based on users’ consent status and, therefore, comply with GDPR and CCPA.

What new wrinkles does Consent Mode v2 bring? Essentially, the update requires that end users be told how to revoke consent to ads personalization, adds two new consent signals (ad_user_data and ad_personalization), and enables anonymous tracking.

Users get more control of their personal data, including the ability to amend preferences. Google gets anonymous visitor information without using cookies or other tracking information, instead using so-called “cookieless pings” for more accurate data modeling.

2. The shift from observed to modeled data

In plain speak for advertisers:

You won’t be able to track what individual users are doing, so you’d better prepare to zoom out and work with a big-picture perspective through conversion modeling.

Part of that initiative is implementing Consent Mode v2 because you’ll have less accurate reporting from limited conversion tracking without it. You’ll also have fewer and thinner audience insights, which can impact audience segmentation and targeting.

Conversion modeling, based on those “cookieless pings,” requires a threshold of 700 ad clicks over a seven-day period per country and domain grouping.

Its function is to estimate how many of the unconsented clicks lead to conversions. It’s not perfect, but it does a good job of patching the gap in overall campaign conversion data, even if non-consenting users aren’t tracked individually.

In the example below, the advertiser has a consent rate of 50% but sees only a 19% drop in conversions (12 out of 62) and an 18% uplift in reported conversions from conversion modeling.
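To make those figures concrete, here’s a quick back-of-the-envelope reading of that example (my interpretation of the numbers, with the arithmetic spelled out; illustrative only):

```python
# Back-of-the-envelope reading of the example above (illustrative only).
full_consent_conversions = 62       # conversions if every user consented
observed = 50                       # conversions observed from the ~50% who consent

lost = full_consent_conversions - observed   # 12 conversions go unobserved
drop = lost / full_consent_conversions       # ~19% drop without modeling

modeled_uplift = 0.18                        # uplift from conversion modeling
reported = observed * (1 + modeled_uplift)   # ~59 conversions after modeling

print(f"Drop without modeling: {drop:.0%}")                    # -> 19%
print(f"Conversions reported with modeling: {reported:.0f}")   # -> 59
```

In other words, modeling recovers most, though not all, of the conversions lost to non-consent.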

[Slide: The shift from observed to modeled data]

The upshot is that if you implement both Consent Mode v2 and conversion modeling (the mechanics of those are too lengthy to cover here), you’ll stay compliant and mitigate much of the resulting data loss.

Dig deeper: 4 ways to check your website’s Google consent mode setup


3. Back-end data steps to the fore

Once user data gets into your CRM, it’s first-party data that you own and control – and, as you’ve probably heard ad nauseam in the last 18 months or so, its value in today’s privacy-first landscape is through the roof.

Why? You can use it for lookalike targeting and remarketing, and you can feed it back into the major ad platforms’ bidding and targeting algorithms to train them to find your best users.

Whatever your vertical, your CRM should be set up to capture data that allows you to segment your users into buckets with different values. 

For ecommerce, that can be according to LTV; for B2B and lead gen, that can be by stages of qualification: MQL, SQL, Opps, Closed-won. 

Segmenting that data allows you to feed specific segments into the algorithms, which is handy when you don’t have a ton of data density.

For instance, if you don’t have enough Opps over a specific time period to effectively train the algorithms, combine SQLs and Opps to hit the volume you need while keeping user quality high.
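As a minimal sketch of that idea, assuming a hypothetical CRM export with a stage column (the field names and the volume threshold below are made up for illustration):

```python
import pandas as pd

# Hypothetical CRM export; column names are assumptions for illustration.
crm = pd.DataFrame({
    "email": ["a@x.com", "b@y.com", "c@z.com", "d@w.com"],
    "stage": ["MQL", "SQL", "Opp", "Closed-won"],
})

MIN_SEGMENT_SIZE = 1_000  # assumed volume needed to train the algorithms

# Ideally, train on Opps and Closed-won only.
opps = crm[crm["stage"].isin(["Opp", "Closed-won"])]

if len(opps) >= MIN_SEGMENT_SIZE:
    segment = opps  # enough data density: keep the highest-quality bucket
else:
    # Not enough Opps: blend in SQLs to hit volume while keeping quality high.
    segment = crm[crm["stage"].isin(["SQL", "Opp", "Closed-won"])]

segment.to_csv("audience_upload.csv", index=False)  # feed back to the ad platform
```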

4. It’s time to embrace non-cookie measurement options

Yes, Google scrapped its plans to deprecate third-party cookies in Chrome. But cookie tracking is inherently flawed: users opt out, it can’t follow people across browsers, and so on.

Two non-cookie measurement options I frequently use to analyze client campaigns are: 

  • Geo-lift testing.
  • Media mix modeling.

Geo-lift testing has one big con (it only works for a single channel at a time), but has many pros:

  • It does not rely on cookies or even ad clicks (because impressions carry value as well).
  • It is available across most platforms (GA4, Shopify, Salesforce, etc.).
  • You can use it to run experiments in specific geos (DMAs, states, countries).
  • Google’s CausalImpact R package can be used to detect any effect in the test group over time (see the simplified sketch after this list).
  • It helps measure the incremental results (sessions, new users, purchases, leads, revenue) your marketing dollars drive.
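To show the mechanics, here’s a stripped-down difference-in-differences readout of a geo-lift test; a real analysis would use CausalImpact’s Bayesian structural time-series model instead, and all the data below is invented:

```python
import pandas as pd

# Invented daily revenue for test geos (ads running) vs. control geos (held out).
df = pd.DataFrame({
    "period":  ["pre"] * 4 + ["post"] * 4,
    "test":    [100, 102, 98, 101, 115, 118, 117, 120],
    "control": [100, 101, 99, 100, 101, 100, 102, 101],
})

pre, post = df[df["period"] == "pre"], df[df["period"] == "post"]

# Difference-in-differences: change in test geos minus change in control geos.
lift = (post["test"].mean() - pre["test"].mean()) \
     - (post["control"].mean() - pre["control"].mean())

print(f"Estimated incremental revenue per day: {lift:.2f}")  # -> 16.25
```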

Media mix modeling (MMM), on the other hand, measures historical, holistic channel contributions to help advertisers adjust their budget allocation across channels to achieve better overall performance.

Like geo-lift testing, it has a healthy list of pros:

  • It does not rely on cookies or even ad clicks.
  • It only requires channel spend and aggregated revenue from your back-end data sources (GA4, Shopify, Salesforce, etc.).
  • It allows advertisers to estimate the contribution of every single channel.
  • Advertisers can use Meta’s Robyn to account for seasonality and latency.

MMM is a bit more complicated than geo-lift testing, though, and carries the following cons (a bare-bones sketch of the core idea follows the list):

  • It requires 2 years of data.
  • It requires some fine-tuning.
  • Its implementation relies on R or Python knowledge.
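To give a feel for the approach, here’s the simplest possible version: a plain linear regression of weekly revenue on channel spend. Robyn layers adstock, saturation curves, seasonality and validation on top of this core idea, and all the numbers below are invented:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented weekly aggregates: spend per channel (columns) and total revenue.
spend = np.array([
    # search, social, video  (weekly spend, $)
    [5000, 3000, 1000],
    [5200, 2800, 1200],
    [4800, 3100,  900],
    [6000, 3500, 1500],
    [5500, 3200, 1100],
    [5800, 2900, 1300],
])
revenue = np.array([42000, 41500, 40000, 48000, 44000, 45000])

model = LinearRegression().fit(spend, revenue)

# Each coefficient estimates incremental revenue per extra dollar of spend.
for channel, coef in zip(["search", "social", "video"], model.coef_):
    print(f"{channel}: ~${coef:.2f} revenue per $1 of spend")
```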

Even with cookies still on the table, both of these measurement methods have consistently produced solid performance and efficiency gains for our clients.

Dig deeper: How to evolve your PPC measurement strategy for a privacy-first future

5. When data is scarce, use proxy metrics

Because we’re adjusting the way we look at data, I’d like to discuss proxy metrics: “soft” metrics that indicate strong engagement or are strong predictors of meaningful actions.

Because they occur earlier in the customer journey, there are more of them. Each one carries less value than a down-funnel engagement, but if you understand that ratio, you can use proxy metrics with decent precision.

For instance, let’s say you don’t have enough form fills to feed into Google’s bidding algorithm. You can do some analysis to understand that roughly one in four users who view the page with that form end up converting into a lead. 

If you know the average value of those leads, you can use the ratio of pageviews to form fills to calculate the average value of a pageview (a quick worked example follows the list). Then, you would:

  • Create a proxy metric in GTM.
  • Bring the proxy metric into Google Ads, hardcode the nominal value, and collect data for 2-4 weeks.
  • Make the proxy metric a primary conversion action so it is used for bid optimization.
  • Measure impact.
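Here’s the math from the form-fill example as a tiny sketch (all numbers hypothetical):

```python
# Hypothetical numbers for the form-fill example above.
pageviews = 400          # users who viewed the page containing the form
leads = 100              # form fills: roughly 1 in 4 viewers convert
avg_lead_value = 200.0   # average value of a lead, in $

view_to_lead_rate = leads / pageviews                 # 0.25
pageview_value = avg_lead_value * view_to_lead_rate   # $50.00

print(f"Hardcode ~${pageview_value:.2f} as the proxy metric's value in Google Ads")
```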

Fluency with proxy metric usage is a great way to keep your campaign analysis nimble despite inconsistent data density.

Dig deeper: 5 outdated marketing KPIs to toss and what to reference instead

Advanced analytics strategies for modern PPC campaigns

Here’s a big silver lining: most of what we just discussed will be relevant no matter what happens with cookies. 

My strong recommendation is to roll up your sleeves and get familiar with all of these initiatives now because every single one of them has already produced significant competitive advantages for our clients.

WATCH: Advanced analytics techniques to measure PPC performance

Here’s my full presentation from SMX Advanced:

