Guide · 28 min read

How to Audit Your Conversion Tracking with Claude or ChatGPT

Adspirer Team

Summary

Yes — Claude and ChatGPT can audit your conversion tracking setup via Adspirer and surface misconfigurations, double-firing pixels, attribution window mismatches, and broken conversion events. The audit pulls live data directly from Google Ads and Meta APIs, compares what’s configured against what’s actually recording, and flags specific discrepancies with conversion action names and event counts attached. It cannot access your GTM container, site code, or server-side tag setup directly — but it catches the majority of real-world tracking problems without ever touching your website.

Something is wrong with your reporting and you can’t figure out what.

Your ROAS went from 4.1 to 6.8 in two weeks but your revenue didn’t move. Or it’s the opposite: ROAS tanked but sales seem fine, and your ad platform is telling you conversions dropped while your CRM says they’re up. Maybe your Meta pixel events are spiking and you can’t explain why. Maybe you launched a new campaign last month, it shows zero conversions, and you’re not sure if it’s the campaign or the tracking.

These are the real r/PPC moments — the ones where you’ve checked everything obvious and still can’t find the disconnect.

The problem is almost always in the tracking setup, and diagnosing it usually means bouncing between Google Ads, Meta Events Manager, GA4, and your tag manager, trying to triangulate across platforms that use different attribution models and measure different things. A process that should take 20 minutes ends up taking two hours — if you find the answer at all.

This guide shows you how to run that diagnostic in a single conversation with Claude or ChatGPT. You’ll connect your ad accounts through Adspirer, run six specific audit prompts, and get back a clear picture of what your tracking setup is actually recording — and what it isn’t.

Info

Adspirer connects Claude and ChatGPT directly to your Google Ads, Meta, LinkedIn, and TikTok accounts — no API keys, no developer setup. Connect your accounts in two minutes and start auditing immediately.

Connect your ad accounts free →


The 5 Most Common Conversion Tracking Problems

These five issues account for the vast majority of “my reporting looks wrong but I don’t know why” situations. The table maps each problem to its symptom, the platform where it lives, and what an AI audit can surface from API data.

Before the table, it’s worth understanding that most tracking problems fall into one of two categories. Configuration problems mean the tracking is set up wrong: micro-conversions counted as purchases, the wrong attribution window, view-through conversions inflating ROAS. Implementation problems mean the tracking was configured correctly but the tag isn’t firing: the URL changed, the code was removed, a trigger broke.

The six audit prompts below surface both categories. The fixes are different: configuration problems are usually fixable within the ad platform (and often from Claude directly), while implementation problems require touching the site or tag manager.

| Problem | Symptom | Platform | What Claude Can Surface |
| --- | --- | --- | --- |
| Tag firing on wrong page | Conversion action shows high volume but doesn’t match actual sales | Google Ads | Conversion action name, type, last conversion date, and conversion count — lets you compare against expected volume and spot the broken one |
| Meta Pixel double-firing | Event counts 2–3x higher than expected; CPA looks artificially low | Meta | Event fire counts per day over a date range; events firing at anomalously high rates relative to what a normal funnel would produce |
| Attribution window mismatch | Google Ads and Meta report wildly different conversion numbers for the same period | Both | Attribution model and window settings per conversion action; surfaces the Google 30-day click vs Meta 7-day click discrepancy explicitly |
| Missing UTMs breaking source tracking | GA4 shows most traffic as Direct; ad platform conversions don’t reconcile with GA4 goals | Google Ads / Meta | Confirms whether auto-tagging is enabled (Google) and whether UTM parameters are present in Meta URL tracking templates |
| GCLID not passing to CRM | Google Ads offline conversion imports failing or empty; offline conversions show zero | Google Ads | Whether offline conversion imports are configured, last successful import date, and conversion action status for offline import actions |
Warning

The attribution window trap — read this before comparing platforms: Google Ads default attribution is 30-day click, 1-day view. Meta default is 7-day click, 1-day view. On a 14-day report window, Google is counting conversions from clicks up to 30 days old while Meta is only counting clicks from the past 7 days. This single difference explains the majority of cross-platform discrepancies you’ll encounter — and it’s not a misconfiguration, it’s a fundamental architectural difference in how each platform counts. Both numbers can be entirely correct under their own model and still look nothing alike. You cannot reconcile Google Ads and Meta conversion numbers directly. The audit prompts below surface each platform’s attribution settings so you know exactly what you’re comparing.
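The window difference is easy to see with a toy calculation. The sketch below (hypothetical click/conversion dates, click windows only — real platforms also apply view-through windows and deduplication) counts the same underlying events under Google’s 30-day and Meta’s 7-day click windows:

```python
from datetime import date, timedelta

def attributed_conversions(conversions, click_window_days):
    """Count conversions whose originating click falls inside the window.

    `conversions` is a list of (click_date, conversion_date) pairs; a
    conversion is attributed if the click happened within
    `click_window_days` before the conversion. Illustrative only.
    """
    return sum(
        1 for click, conv in conversions
        if timedelta(0) <= conv - click <= timedelta(days=click_window_days)
    )

# The same underlying events, viewed through each platform's default window
events = [
    (date(2024, 5, 1), date(2024, 5, 3)),   # 2-day lag: counted by both
    (date(2024, 5, 1), date(2024, 5, 10)),  # 9-day lag: Google only
    (date(2024, 4, 10), date(2024, 5, 5)),  # 25-day lag: Google only
]

print(attributed_conversions(events, 30))  # 3 under Google's 30-day click
print(attributed_conversions(events, 7))   # 1 under Meta's 7-day click
```

Identical behavior, triple the conversion count on one platform — with no misconfiguration anywhere.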


Run the Audit

The process takes about 15 minutes the first time. After that, re-running specific prompts when something looks off takes under five minutes.

Connect Your Ad Accounts to Adspirer

Sign up at adspirer.ai and connect your Google Ads and Meta accounts via OAuth. If you’re also running LinkedIn or TikTok, connect those too — the LinkedIn audit prompt is included below. No API keys or developer access required. The OAuth connection is the same flow as any ad platform integration — you’ll be prompted to grant read and write access to your account.

Add Adspirer to Claude or ChatGPT

Claude: Go to Customize → Connectors, click Add custom connector, and enter https://mcp.adspirer.com/mcp. Claude discovers all available tools automatically. See the full Claude setup guide.

ChatGPT: Open ChatGPT → Explore GPTs, search for Adspirer, and install the plugin. Authenticate when prompted. See the full ChatGPT setup guide.

Verify the Connection

Start a new conversation and run:

Check my connected ad platforms and show me a summary of active campaigns and conversion actions configured in Google Ads.

You should see your account name, campaign count, and a list of conversion actions with their status. If you manage multiple accounts and Claude asks which one to use, specify by name or account ID.

Run the Six Audit Prompts

Use the prompts in the next section. Each one targets a different tracking failure mode. You don’t need to run all six — start with the ones that match your symptoms. If you can’t explain a ROAS spike, start with prompt 3. If your Meta events look inflated, start with prompt 2. If a new campaign shows zero conversions, start with prompt 1.

Interpret the Results

Claude returns the API data interpreted in plain language: conversion actions with zero conversions in 30 days despite active spend are flagged, events firing at anomalous rates are called out, mismatched attribution windows are surfaced. Cross-reference what Claude returns against what you see in the ad platform UI — they should match, and any gap is worth investigating.

Make the Fixes

Most fixes happen outside Claude — in Google Tag Manager, your site code, or the ad platform settings. Claude can tell you what’s wrong and what the correct fix is; it can’t edit your website or GTM container. For changes that live in the ad platform itself — disabling a micro-conversion from the Conversions column, changing an attribution model, adjusting a Meta campaign’s attribution window — Claude can make those changes directly with your confirmation.


The 6 Audit Prompts

Run these in a conversation where Adspirer is connected. Replace account names, platforms, or date ranges as needed for your situation.

1. Google Ads — Zero-Conversion Actions

Check my Google Ads conversion actions. Are any counting zero conversions in the last 30 days despite active campaigns? List conversion action name, type, and last conversion date.

Also flag any conversion actions with very low conversion counts that seem inconsistent with my campaign volume — I want to know if something looks like it’s tracking but might be broken or misfiring.

This is the most direct signal of a broken tag. A conversion action that’s configured, enabled, and attached to a campaign with real spend — but shows zero or near-zero conversions — almost always means the tag isn’t firing, is firing on the wrong page, or wasn’t deployed after a site update. Claude returns a list sorted by last conversion date so stale conversion actions are immediately visible.

A common scenario: a site update moved the order confirmation page from /checkout/complete to /order/success. The Google tag trigger still targets the old URL pattern. The conversion action shows status “Active” in Google Ads because the tag was verified once historically — but it hasn’t fired in six weeks.

Another: a developer added conversion tracking to a staging environment for testing. The test conversions registered against the real conversion action. Now the last recorded conversion date looks recent, but the volume is wrong.
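The staleness check behind this prompt can be sketched in a few lines. The record shape here is hypothetical (not the actual Google Ads API response): an action is suspicious when it is enabled, attached to real spend, and has recorded nothing recently.

```python
from datetime import date

def flag_stale_actions(actions, today, stale_days=30):
    """Flag enabled conversion actions with active spend but no recent
    conversions -- simplified record shape, illustrative only."""
    flagged = []
    for a in actions:
        stale = (a["last_conversion"] is None
                 or (today - a["last_conversion"]).days > stale_days)
        if a["status"] == "ENABLED" and a["spend_30d"] > 0 and stale:
            flagged.append(a["name"])
    return flagged

actions = [
    {"name": "Purchase", "status": "ENABLED", "spend_30d": 4200.0,
     "last_conversion": date(2024, 3, 20)},  # ~six weeks stale
    {"name": "Lead Form", "status": "ENABLED", "spend_30d": 900.0,
     "last_conversion": date(2024, 5, 1)},   # healthy
]

print(flag_stale_actions(actions, today=date(2024, 5, 2)))  # ['Purchase']
```

The “Active” status alone proves nothing; last conversion date against spend is the signal.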

2. Meta Pixel — Double-Firing Check

Show me my Meta pixel events for the last 7 days. Which events are firing most? Are any events firing at unexpectedly high rates that might indicate double-firing?

For each event type, show me the total fire count for the period and the average per day. Flag any event where the daily rate seems unusually high relative to what you’d expect from a typical conversion funnel — for example, Purchase events firing more frequently than ViewContent or AddToCart events.

Double-firing is common after site migrations, theme updates, or when both Meta Pixel base code and CAPI (Conversions API) are sending the same event without deduplication configured. The giveaway is a Purchase event count that’s 2–4x your actual order volume, or any event firing more frequently than events earlier in the funnel. Claude surfaces event-by-event counts; the pattern of which events are anomalously high points you to the problem.

One note: a Purchase event firing at 2x your order count doesn’t automatically mean double-firing — some businesses have legitimate re-fires (subscription renewals, multi-item orders tracked separately). Context matters, which is why comparing to actual order volume is the confirmation step.

The funnel comparison is often more revealing than absolute counts: if your Purchase event fires more times than your AddToCart event in the same period, that’s definitionally wrong and signals double-firing regardless of your actual order count.
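That funnel-order check is mechanical enough to sketch. With hypothetical 7-day counts, any downstream event outnumbering an upstream one gets flagged:

```python
# Funnel stages ordered top to bottom; a downstream event firing more
# often than an upstream stage is a structural red flag for double-firing.
FUNNEL_ORDER = ["ViewContent", "AddToCart", "InitiateCheckout", "Purchase"]

def funnel_violations(counts):
    """Return (downstream, upstream) pairs where a later funnel stage
    recorded more fires than an earlier one."""
    violations = []
    for i, later in enumerate(FUNNEL_ORDER):
        for earlier in FUNNEL_ORDER[:i]:
            if counts.get(later, 0) > counts.get(earlier, 0):
                violations.append((later, earlier))
    return violations

# Hypothetical 7-day event counts: Purchase outnumbering AddToCart
counts = {"ViewContent": 5200, "AddToCart": 310,
          "InitiateCheckout": 290, "Purchase": 640}
print(funnel_violations(counts))
# [('Purchase', 'AddToCart'), ('Purchase', 'InitiateCheckout')]
```

A clean funnel returns an empty list; anything else is worth comparing against actual order volume.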

3. Google Ads — Inflated ROAS from Micro-Conversions

My Google Ads ROAS is 4.2 but I don’t see a matching revenue lift. Can you check if any conversion actions are set to “All conversions” vs “Key conversions” — and whether any low-value micro-conversions are inflating the ROAS number?

Show me each conversion action, its category (purchase, lead, page view, phone call, etc.), its assigned value, and whether it’s included in the “Conversions” column vs “All conversions” only.

This catches a specific configuration problem: conversion actions categorized as page views, button clicks, or phone call starts accidentally included in the main “Conversions” column that Smart Bidding optimizes toward. When micro-conversions with $0 or nominal values mix with your primary purchase conversions, ROAS becomes meaningless. A $5 “Add to Cart” event counted alongside $150 purchases will inflate reported ROAS while actual revenue stays flat — and Smart Bidding will optimize toward the cheap micro-conversions, compounding the problem.
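The inflation is plain arithmetic. With hypothetical figures ($10,000 spend, 200 real orders at $150, and 4,000 “$5” Add to Cart events leaking into the Conversions column):

```python
def roas(conversions, spend):
    """ROAS = total conversion value / spend, over whatever conversion
    actions are included in the Conversions column."""
    return sum(value * count for value, count in conversions) / spend

spend = 10_000.0
purchases = [(150.0, 200)]  # 200 real orders at $150
micro = [(5.0, 4_000)]      # 4,000 "$5" Add to Cart events

print(roas(purchases, spend))          # 3.0 -- true ROAS
print(roas(purchases + micro, spend))  # 5.0 -- inflated by micro-conversions
```

Revenue didn’t move, but reported ROAS jumped 67% purely from the column configuration.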

The fix is simple: move micro-conversions from “Conversions” to “All conversions” only. Smart Bidding will ignore them, your ROAS calculation becomes accurate, and you can still track them in the “All conversions” column for informational purposes.

Worth noting: this is one of the few tracking fixes Claude can apply directly. Once it identifies which conversion actions are miscategorized, you can ask it to update the “Include in Conversions” setting on each one without touching the Google Ads UI.

4. Meta — Conversion Discrepancy Check

Pull my Meta ad conversion data for the last 14 days. Compare the attributed conversions against what my campaigns are reporting as total conversions. Flag any significant discrepancies.

Break it down by campaign, and also tell me what attribution window is currently set for each campaign. I want to understand if different campaigns are using different attribution windows — that would make comparing their conversion numbers misleading even within the same ad account.

Meta allows setting different attribution windows per campaign, which is a problem when you’re comparing performance across campaigns in the same account. A campaign on 7-day click attribution and one on 1-day click attribution aren’t comparable, even within Meta’s own reporting interface. The one with the longer window will almost always show more attributed conversions — not because it’s performing better, but because it’s counting a longer historical tail of clicks.

This prompt surfaces mismatched attribution windows across your campaigns so you can standardize them — typically to 7-day click, 1-day view for most businesses.

The deeper issue is that Meta’s default reporting in Ads Manager aggregates all campaigns together, so if you have a mix of 1-day and 7-day attribution campaigns, the account-level conversion number is a meaningless blend of two different measurement standards. Standardizing attribution windows is one of the most underrated data hygiene fixes in Meta Ads management.

5. Google Ads — View-Through Conversion Inflation

Are any of my Google Ads campaigns using view-through conversions that might be inflating reported ROAS? Show me conversion actions by type and attribution model.

Specifically: are any conversion actions using data-driven attribution or last-click attribution that might differ from my primary analysis model? And are view-through conversions contributing to the main Conversions column for any campaigns?

View-through conversions (VTC) record a conversion whenever someone saw your ad, didn’t click it, but later converted through any channel. Display and YouTube campaigns enable VTC by default. When VTCs are included in your Conversions column, your ROAS calculation includes credit for people who may have converted entirely due to organic search, a referral link, or a Meta ad — your display impression just happened to serve to them at some point in a 30-day window.

This is a structural over-attribution issue baked into Google’s defaults. It’s not fraud, but it significantly inflates apparent ROAS for Display and YouTube campaigns. The audit surfaces whether it’s happening and to what degree.
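The “to what degree” question is a simple split. With hypothetical figures for a Display campaign:

```python
def roas_split(click_value, view_through_value, spend):
    """Report ROAS with and without view-through conversion credit."""
    with_vtc = (click_value + view_through_value) / spend
    without_vtc = click_value / spend
    return with_vtc, without_vtc

# Hypothetical Display campaign: $8,000 spend, $12,000 click-attributed
# revenue, another $14,000 credited via view-through
with_vtc, without_vtc = roas_split(12_000, 14_000, 8_000)
print(with_vtc, without_vtc)  # 3.25 1.5
```

A 3.25 reported ROAS that drops to 1.5 without view-through credit is a very different investment case, which is why surfacing the split matters before making budget decisions.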

If view-through conversions are a meaningful share of your reported conversions, you have two options: exclude VTCs from the Conversions column entirely (cleanest for reporting), or leave them in but add a VTC-only breakdown column so you can see the split. The right answer depends on whether you believe your Display campaigns genuinely influence conversions — some businesses have real lift from brand awareness campaigns, and excluding VTCs entirely would undervalue them.

6. LinkedIn — Insight Tag and Conversion Events

I recently launched on LinkedIn. Can you check if my LinkedIn Insight Tag is firing correctly and show me which conversion events are configured vs which are actually recording conversions?

List each LinkedIn conversion event, its type, the last conversion recorded, and total conversions in the last 30 days. Flag any events that are configured but showing zero conversions.

LinkedIn conversion tracking is more frequently broken than Google or Meta because it’s set up less often and tested less rigorously. The Insight Tag is a separate JavaScript snippet from your Meta Pixel and Google tag — it gets missed during site migrations or CMS updates.

This prompt checks whether configured conversion events are actually recording anything, which is the fastest confirmation that the Insight Tag is live and correctly placed. Zero conversions across all events on a campaign that’s been running three weeks is a strong signal the Insight Tag isn’t firing.

LinkedIn also supports URL-based conversions (fire on a specific page URL) and event-based conversions (fire when a specific event is triggered). URL-based ones break exactly like Google’s tag trigger problem — a URL change that doesn’t get reflected in the conversion event configuration.


Google vs Meta: How Conversion Tracking Works Differently

The audit experience differs between platforms because the tracking architectures are genuinely different. Understanding those differences helps you interpret what Claude returns and know when a discrepancy is a problem versus expected behavior.

How Google Ads conversion tracking works:

Google conversion actions are created in the Google Ads interface and deployed via one of three methods: Google Tag (gtag.js), Google Tag Manager, or the Google Ads conversion import (for offline conversions from CRM data). Each conversion action has its own tag snippet that must be separately deployed.

What “broken” looks like on Google Ads:

  • Conversion action shows status “Unverified” — tag was created but never recorded a conversion
  • Conversion action shows zero conversions for 7+ days despite active campaigns sending traffic to the tracked page
  • Conversion value is a fixed value or $0 when you expect dynamic revenue values from ecommerce
  • “All conversions” column is significantly higher than “Conversions” — micro-conversions mixed in

How Claude interprets the data:

Claude pulls your conversion action list from the Google Ads API, which includes status, conversion category, attribution model, value settings, and recent conversion counts. It cannot access your Google Tag Manager container or verify that the tag code is correctly placed on your site — that requires GTM access or a browser-based debugging tool like Google Tag Assistant.

What it can definitively tell you: which conversion actions are recording, which aren’t, what attribution models they’re using, and what values they’re assigning. That’s usually enough to narrow the problem to a specific conversion action and confirm the root cause with Tag Assistant.

Common fix pattern: Claude flags a Purchase conversion action with zero conversions for 45 days. Tag Assistant reveals the tag isn’t firing on the confirmation page. Root cause: the page URL structure changed after a site migration and the tag trigger no longer matches. Fix: update the trigger URL pattern in GTM to match the new URL.

One thing to know about Google Ads conversion reporting: there’s typically a 3–4 hour delay in conversion data, sometimes up to 24 hours for same-day conversions. If you just deployed a fix and are checking whether conversions are now recording, wait at least a few hours before concluding the fix didn’t work.

How Meta conversion tracking works:

Meta tracking has two layers: the Meta Pixel (browser-side JavaScript) and the Conversions API (server-side, if configured). Both can send the same events — deduplication between them relies on matching eventID values. If deduplication isn’t configured, events counted by both the Pixel and CAPI result in double-counting.

Meta’s attribution model also differs from last-click. By default, it credits a conversion to an ad if the user clicked the ad within 7 days OR viewed the ad within 1 day of converting. This is broader than most GA4 or CRM attribution models, which is why Meta typically reports more attributed conversions than other sources do.

What “broken” looks like on Meta:

  • Purchase events in Events Manager at 2–3x actual order count — double-firing from Pixel + CAPI without deduplication
  • Purchase events 50–70% lower than actual orders — Pixel not firing on all checkout paths
  • Event data flagged as “Not matched” in Events Manager — missing customer information parameters that enable matching
  • Zero conversions for specific ad sets despite confirmed traffic to the landing page

How Claude interprets the data:

Claude pulls event data from the Meta Marketing API, which shows event counts by day and basic quality signals. It cannot access the Events Manager diagnostic tools, the pixel test events panel, or view the actual event payloads being sent. For a Pixel that’s firing but sending malformed data — missing value, wrong currency, incorrect content_ids — Events Manager’s test events panel is the right tool.

What Claude can surface: event volume over time, anomalously high event counts suggesting double-firing, and attribution window settings per campaign. For most double-firing diagnoses and zero-conversion investigations, that’s the right starting point.

Common fix pattern: Claude reports 340 Purchase events in a 7-day window. Actual orders for the week: 165. Claude flags the Purchase event rate as anomalously high. Root cause: Pixel and Conversions API both configured, but eventID deduplication not set up. Fix: add matching eventID to both the Pixel fbq() call and the CAPI event payload.
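The deduplication rule itself can be sketched like this — a simplified model of Meta’s described behavior (events sharing an event name and event ID count once), not its actual implementation, which also applies a time window and other matching keys:

```python
def dedupe_events(pixel_events, capi_events):
    """Merge browser (Pixel) and server (CAPI) events: events sharing
    the same (event_name, event_id) count once. Events with no
    event_id cannot be matched and are counted as-is. Simplified sketch."""
    seen, merged = set(), []
    for ev in pixel_events + capi_events:
        key = (ev["event_name"], ev.get("event_id"))
        if ev.get("event_id") is not None and key in seen:
            continue  # duplicate of an already-counted event
        seen.add(key)
        merged.append(ev)
    return merged

# One order sent from both sides WITH a shared event_id, one sent from
# both sides WITHOUT one (so Meta can't tell they're the same purchase)
pixel = [{"event_name": "Purchase", "event_id": "order_123"},
         {"event_name": "Purchase", "event_id": None}]
capi = [{"event_name": "Purchase", "event_id": "order_123"},
        {"event_name": "Purchase", "event_id": None}]

print(len(dedupe_events(pixel, capi)))
# 3: order_123 counted once, the unmatched pair counted twice
```

Two real purchases, three counted events — exactly the inflation pattern the audit flags.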

One additional note: Meta’s event data in the API has a reporting delay too, typically 1–3 hours for pixel events and up to 24 hours for some CAPI events. If you just deployed a fix, let it run overnight before treating a continued spike as confirmation that the fix didn’t work.


What Claude Can and Can’t Check

Being explicit about this builds trust and saves time. The API data Claude accesses catches most real-world tracking problems, but a subset of issues require direct site or tag manager access to diagnose. Know the boundary before you start.

| Can Check (via Ad Platform API) | Cannot Check |
| --- | --- |
| Which conversion actions are configured and their status | Whether tag code is correctly placed on your site |
| Which conversion actions have zero or anomalously low conversions | Whether GTM triggers are firing as expected |
| Attribution model and window per conversion action | Whether custom parameters (revenue, order ID) are being passed correctly |
| Meta pixel event counts and daily rates | Whether server-side CAPI events contain correct customer match data |
| Attribution window settings per Meta campaign | Whether a Pixel fires on all steps of a multi-step checkout |
| Whether LinkedIn Insight Tag conversion events are recording | Actual event payloads or tag debug logs |
| Whether Google auto-tagging is enabled (GCLID) | Whether GCLID is being captured and stored in your CRM fields |
| View-through conversion settings and their contribution | Cross-device conversion paths |
| Conversion value settings (fixed vs dynamic) | Whether a URL trigger pattern matches the actual confirmation page URL |

The practical workflow: use Claude’s audit to identify which conversion action or pixel event is the problem, then use Google Tag Assistant, Meta Events Manager’s test events panel, or GTM preview mode to confirm the root cause at the implementation level.

Think of it as two stages: Claude narrows the problem to a specific conversion action or event, cutting diagnostic time from hours to minutes. Tag-level debugging tools confirm the implementation detail. You rarely need both without the first stage pointing you toward the right one.

It also means you can skip debugging platforms where the tracking looks healthy. If prompt 1 shows all your Google conversion actions are recording normally, you don’t need to open Tag Assistant. Time is better spent investigating the one platform where the audit flagged a problem.


After the Audit: What to Do with What You Find

Running the audit gets you a list of specific problems with conversion action names and numbers attached. Knowing what to do next depends on where the problem lives.

The general rule: if the problem is in the ad platform settings (attribution window, conversion count setting, micro-conversion in Conversions column), Claude can fix it directly. If the problem is in the implementation (tag not firing, URL trigger wrong, Pixel code missing from a page), the fix happens outside Claude — in GTM, in your site code, or in your CAPI server configuration.

If Claude finds zero conversions on a Google Ads conversion action:

  1. Open Google Tag Assistant and simulate the conversion action (navigate to the confirmation page, submit a test form, etc.)
  2. Check whether the tag fires — Tag Assistant shows each tag that fires on the page in real time
  3. If the tag doesn’t fire, the trigger URL is wrong or the tag wasn’t deployed after the last GTM publish. Fix in GTM and republish.
  4. If the tag fires in Tag Assistant but conversions still show zero in Google Ads after 24 hours, check whether auto-tagging is enabled at the account level. Without auto-tagging, Google Ads can’t match click data to conversions for most campaign types.
  5. Also verify the conversion action is included in the campaign’s conversion goal settings — a correctly firing tag on a campaign that targets a different conversion goal won’t show conversions.

If Claude finds a Meta Pixel event with anomalously high counts:

  1. Check your actual order volume against the Purchase event count. If orders were 180 and Purchase events were 340, that’s likely double-firing.
  2. Go to Meta Events Manager → Diagnostics. Look for duplicate events flagged there — Meta’s own deduplication check will flag events it detects as likely duplicates.
  3. Check whether both Pixel and CAPI are configured. If yes, verify that eventID is being passed and is identical in both the Pixel call and the CAPI payload.
  4. If only Pixel is configured, check whether the Pixel base code appears twice on the page — common after theme updates or when a third-party tool adds its own Pixel embed.
  5. Also check: is your checkout a multi-step flow where the same page is revisited? Some checkout flows reload the confirmation page, which can re-fire the Pixel on refresh if the tag doesn’t check for duplicate fires.

If Claude finds mismatched attribution windows across Meta campaigns:

  1. Go to the campaign settings for each affected campaign in Meta Ads Manager
  2. Under “Attribution setting,” standardize to a single window — typically 7-day click, 1-day view for most businesses
  3. Note: changing attribution windows affects reported conversion counts going forward. Historical data looks different after the change — expect a temporary apparent performance drop if you’re moving from a longer to a shorter window.
  4. Document the change date so you don’t compare pre-change and post-change periods as if they’re apples-to-apples.

If Claude flags micro-conversions inflating Google Ads ROAS:

  1. In Google Ads, go to Tools → Measurement → Conversions
  2. Click the micro-conversion action (page view, button click, etc.) and change “Include in Conversions” from “Yes” to “No”
  3. The action stays in your “All conversions” column so you can still track it — it just won’t affect your ROAS calculation or Smart Bidding optimization

Alternatively, ask Claude directly: “Set the [conversion action name] to not be included in Conversions.” Claude can apply the setting change in Google Ads with your confirmation — no need to navigate the UI.


When Tracking Breaks: The Most Common Triggers

Conversion tracking problems don’t usually come from nowhere. If you’re trying to pinpoint when something changed, these are the events most likely to have caused it.

| Trigger | What Breaks | What to Check First |
| --- | --- | --- |
| Site migration or redesign | Tag triggers built on URL patterns stop matching new URLs | Google Ads conversion action status; run prompt 1 |
| CMS or theme update | Pixel or tag code stripped from page templates | Meta Purchase event count vs actual orders; run prompt 2 |
| Checkout flow redesign | Conversion page URL or DOM structure changes; tags stop firing | Google Ads last conversion date on purchase action; run prompt 1 |
| Added CAPI without deduplication | Meta purchase events double-counted | Meta event rates; run prompt 2 |
| Campaign attribution window change | Historical conversion comparisons break | Meta attribution window per campaign; run prompt 4 |
| New campaign launched | New campaign conversions attached to wrong conversion action | Google conversion action breakdown; run prompt 1 |
| Smart Bidding strategy change | Bidding shifts toward a different conversion action | Conversion action “Include in Conversions” settings; run prompt 3 |

If you can’t narrow down when your tracking changed, look at the conversion volume trend in Google Ads or Meta Events Manager. The date where the line drops (or spikes) is usually the week the triggering event happened — and knowing the date makes diagnosing the cause much faster.

Cross-referencing that date against your deployment history or changelog (even a simple Slack message like “we pushed the new checkout today”) usually gets you to the root cause in minutes. Most tracking problems are post-deployment problems in disguise.
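Finding that date in a daily conversion series is a crude change-point check. A minimal sketch, with hypothetical daily Purchase counts around a bad deployment:

```python
def largest_drop(daily_conversions):
    """Return the date with the largest day-over-day conversion drop --
    a crude change-point check for 'when did tracking break?'."""
    days = sorted(daily_conversions)
    drops = [(daily_conversions[days[i - 1]] - daily_conversions[days[i]], days[i])
             for i in range(1, len(days))]
    return max(drops)[1]

# Hypothetical daily Purchase conversions around a bad deployment
series = {"2024-05-01": 42, "2024-05-02": 45, "2024-05-03": 44,
          "2024-05-04": 6, "2024-05-05": 5}
print(largest_drop(series))  # 2024-05-04
```

That date is what you cross-reference against your deployment history or changelog.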


Keep Tracking Clean: A Recurring Audit Cadence

A one-time audit catches the current problems. A monthly audit cadence catches new ones before they compound.

The prompts above are quick enough to run monthly — 15 minutes to check all six. The most valuable ones to revisit regularly are:

  • Prompt 1 (zero-conversion actions) — Run after any site update or deployment
  • Prompt 2 (Meta double-firing) — Run monthly; Pixel double-firing can appear after theme updates
  • Prompt 3 (micro-conversion inflation) — Run after any Google Ads account restructuring
  • Prompt 5 (view-through inflation) — Run whenever you launch or pause Display or YouTube campaigns

The LinkedIn prompt (prompt 6) is worth running after any site deployment, since the Insight Tag is the most frequently broken tag across all four platforms.

You don’t need to run all six every month. Build a rotation: run prompts 1 and 2 monthly, prompts 3 and 5 quarterly, and prompts 4 and 6 whenever you add a new campaign type or platform.

A useful discipline: run prompt 1 immediately after any significant site deployment, before reviewing campaign performance. Confirmation page URLs are the most common casualty of site updates, and catching a broken Google tag on deployment day beats discovering it three weeks later when you’re trying to explain a conversion drop.

If you want to automate this, Adspirer supports scheduled agent tasks. You can configure a weekly check that runs prompt 1 and prompt 2 automatically and alerts you if anything looks broken — without you having to remember to run it. See the agent skills documentation for details on setting up recurring audits.


FAQ

My Google Ads ROAS went up but revenue didn't change. What's usually happening?

Three scenarios cover most of this. First, micro-conversions got included in the Conversions column — a page view or button click event with a $0 value (or a nominal value like $1) mixing with real purchase conversions makes ROAS calculations misleading. Second, a conversion action got reconfigured to count “Every conversion” instead of “One conversion per click,” so each order is now counting as 2–3 conversions. Third, Smart Bidding shifted budget toward campaigns with high view-through conversion credit — the ROAS looks great because the attribution model is generous, not because campaigns are performing better. Prompt 3 above surfaces the first two; prompt 5 covers the view-through case. The fix for the first two is in the conversion action settings. The third requires a judgment call about whether you want to include view-through attribution in your primary metric.

Can Claude fix my tracking setup, or just identify problems?

Claude can fix ad account-level configuration directly — changing a conversion action’s attribution model, removing a micro-conversion from the Conversions column, updating a conversion window, or adjusting Meta campaign attribution settings. All changes require your explicit confirmation before applying. What Claude cannot do is edit your website, Google Tag Manager container, Meta Pixel implementation, or server-side tag configuration — those changes happen outside the ad platform APIs. The typical workflow: Claude identifies which conversion action is misconfigured and describes the correct setting, you confirm the change, Claude applies it. Then you verify the fix by checking conversion reporting or using Tag Assistant.

Google Ads and Meta will never show the same conversion number. Is that normal?

Yes — and the gap is usually larger than advertisers expect. The default attribution windows alone explain most of it: Google counts conversions from clicks up to 30 days old and views up to 1 day old; Meta counts clicks up to 7 days old and views up to 1 day old. Over any 14-day reporting window, Google is pulling in a longer historical tail of clicks. Beyond attribution windows, each platform only counts conversions where its own tracking fires — Google needs its conversion tag to fire, Meta needs its Pixel to fire. A customer who clicked both a Google ad and a Meta ad before converting may show up as a conversion in both platforms. The right question isn’t “why don’t the numbers match” — they never will. The right question is “are the trends moving in the same direction?” If one platform shows a 30% conversion increase after a campaign change and the other shows a decrease, that’s the signal worth investigating.
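The window effect alone is easy to demonstrate. This sketch counts the same ten hypothetical orders under a 30-day and a 7-day click-through window — made-up timestamps, and far simpler than either platform's real attribution modeling:

```javascript
// Days between ad click and conversion for ten hypothetical orders.
const daysFromClickToConversion = [0, 1, 2, 3, 5, 8, 12, 15, 21, 28];

// Count conversions a platform would attribute given its click window.
const countWithin = (windowDays) =>
  daysFromClickToConversion.filter((d) => d <= windowDays).length;

console.log(countWithin(30)); // Google-style 30-day click window: 10
console.log(countWithin(7)); // Meta-style 7-day click window: 5
```

Same orders, same clicks — one platform reports double the conversions of the other, before any pixel or modeling differences even enter the picture.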

I have both Meta Pixel and CAPI set up. Why are my purchase events doubling?

This is a deduplication configuration issue, not a problem with having both. Meta Pixel and Conversions API are designed to work together — browser-side and server-side — but deduplication requires passing a matching eventID parameter in both the Pixel event and the CAPI event payload. If eventID is absent from either side, Meta counts them as two separate events. The fix is adding a unique identifier (your order ID, or a UUID generated at the time of the conversion) to both the Pixel fbq('track', 'Purchase', {...}, {eventID: 'order_123'}) call and the CAPI event_id field. Meta matches events with identical eventID values and counts only one. Allow 24–48 hours after the fix for event counts to normalize.
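The matching contract can be sketched in a few lines. This is a simplified model of the deduplication behavior — Meta's actual system also matches on other signals — but the eventID rule it illustrates is the one you configure:

```javascript
// Simplified sketch of Pixel + CAPI deduplication: events sharing the same
// event name and event ID are counted once; events without an ID never dedupe.
const browserEvents = [
  { name: "Purchase", eventId: "order_123" }, // fbq('track', 'Purchase', {...}, {eventID: 'order_123'})
  { name: "Purchase", eventId: "order_124" },
];
const serverEvents = [
  { name: "Purchase", eventId: "order_123" }, // CAPI payload with event_id: 'order_123' → deduped
  { name: "Purchase", eventId: null }, // order_124's CAPI event missing event_id → double-counted
];

const seen = new Set();
let counted = 0;
for (const e of [...browserEvents, ...serverEvents]) {
  // An event with no ID gets a unique key, so it can never match anything.
  const key = e.eventId ? `${e.name}:${e.eventId}` : Symbol("no-id");
  if (!seen.has(key)) {
    seen.add(key);
    counted += 1;
  }
}

console.log(counted); // 3 recorded events for 2 real orders
```

Two real orders produce three recorded Purchase events: order_123 dedupes correctly, order_124 counts twice because its server-side event arrived without an event_id.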

Does this audit work if I'm running conversion tracking through GA4 imported into Google Ads?

Partially. If you’ve set up GA4 conversion events and linked GA4 to Google Ads, those events appear as conversion actions in Google Ads. The audit prompts can check whether those conversion actions are recording, what attribution model they’re using, and whether they’re contributing to your Conversions column. What the audit can’t verify is whether the underlying GA4 events are firing correctly — that’s a GA4 and GTM question. If GA4-imported conversion actions show zero conversions in the audit, the diagnostic path is: confirm the GA4 event is recording in GA4 directly, then confirm the GA4 to Google Ads property link is active, then confirm the event is marked as a conversion in GA4. Claude can help you walk through the diagnostic even if it can’t access your GA4 property.


Conclusion

Broken conversion tracking is one of those problems that compounds quietly. You don’t notice ROAS inflation from micro-conversions until you’ve been making budget decisions based on it for three months. You don’t catch the double-firing Pixel until you notice your CPA looks suspiciously good relative to actual sales. By then, the data you’ve accumulated is unreliable and historical comparisons are meaningless.

The audit prompts above take about 15 minutes the first time and surface the most common problems without requiring GTM access, site inspection, or manual investigation across four separate platform UIs. What you get back is a list of specific conversion actions and pixel events with concrete data about what they’re recording — not generic warnings.

The things Claude can’t check — your GTM container, your site code, whether a tag trigger is correctly configured — are a smaller set of the overall problem space than most people expect. For the majority of tracking issues, the API data is enough to identify the specific problem and point to the fix.

One more thing worth saying: clean conversion tracking isn’t just a reporting hygiene problem. Smart Bidding and Meta’s Advantage+ systems use your conversion signals to optimize. When those signals are wrong — micro-conversions mixed with purchases, double-fired events, attribution inflated by view-throughs — the algorithm optimizes toward the wrong thing. The fix shows up in your actual performance, not just your dashboard.

Start with prompt 1 if Google Ads looks wrong. Start with prompt 2 if Meta looks inflated. Start with prompt 3 if ROAS looks disconnected from revenue. The six prompts cover the full diagnostic surface — you just need to know which symptom you’re chasing.

Most advertisers run the full set once, fix what Claude flags, then check back quarterly or after major site changes. That cadence is enough to keep tracking clean and keep your reporting trustworthy.


Connect your ad accounts and run your first tracking audit in under 15 minutes. Adspirer is free to start — no credit card, no setup beyond connecting your accounts.

Start your free audit →


