    March 2026 · 8 min read

    Conversions vs Conversion by Time - The Column That's Silently Distorting Your Decisions

    Google Ads gives you two ways to count conversions. One attributes the conversion to the date the click happened. The other attributes it to the date the conversion happened. They tell completely different stories - and most brands don't know which one they're reading.

    Two Columns, Two Realities

    Open any Google Ads report and you'll see "Conversions." Add a column called "Conversions (by conv. time)" and you'll see a different number. Sometimes slightly different. Sometimes wildly different.

    Here's the distinction that matters:

    • "Conversions" - attributed to the date of the click that led to the conversion
    • "Conversions (by conv. time)" - attributed to the date the conversion actually occurred

    A customer clicks your ad on March 1st. They browse, compare, sleep on it. They buy on March 5th. The "Conversions" column credits March 1st. The "by conv. time" column credits March 5th. Same sale. Different date. Different story.
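The two counting rules can be sketched in a few lines. This is a minimal illustration with hypothetical sales data, not Google's implementation: each sale is a (click date, conversion date) pair, and the two columns are just two different groupings of the same events.

```python
from collections import Counter
from datetime import date

# Hypothetical sales: (click_date, conversion_date) pairs.
sales = [
    (date(2026, 3, 1), date(2026, 3, 5)),  # clicked Mar 1st, bought Mar 5th
    (date(2026, 3, 1), date(2026, 3, 1)),  # same-day purchase
    (date(2026, 3, 2), date(2026, 3, 5)),  # another delayed purchase
]

# "Conversions" column: credit the date of the click.
by_click = Counter(click for click, _ in sales)

# "Conversions (by conv. time)" column: credit the date of the purchase.
by_conv = Counter(conv for _, conv in sales)

print(by_click[date(2026, 3, 1)])  # 2 conversions credited to Mar 1st
print(by_conv[date(2026, 3, 5)])   # 2 conversions credited to Mar 5th
```

Both groupings sum to the same total; they only disagree about which date gets the credit.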

    Conversions: Click-Date Attribution

    The standard "Conversions" column is click-date attribution. It answers the question: "How valuable was the traffic I drove on this date?"

    This is the column Smart Bidding optimises against. When Google's algorithm evaluates whether a bid was effective, it looks at the click and all the conversions that eventually resulted from it - regardless of when they happened.

    Click-date attribution is powerful for understanding campaign effectiveness. But it has a significant drawback: recent days are always incomplete. If your average purchase cycle is 5 days, then the last 5 days in your Conversions column are systematically understated. Conversions from those clicks simply haven't happened yet.

    This is why checking "yesterday's ROAS" in the Conversions column is meaningless. You're looking at a fraction of the data. It's like reading a half-written exam paper and concluding the student failed. The data staleness problem compounds this - even conversions that have happened take 24-72 hours to appear.

    By Conversion Time: Calendar-Date Attribution

    "Conversions (by conv. time)" answers a different question: "How many conversions happened on this date?" Regardless of when the triggering click occurred.

    This column aligns with your ecommerce platform. If Shopify says you had 50 orders on Tuesday, the "by conv. time" column should broadly agree (modelling, attribution windows, and tracking gaps aside). It matches cash flow. It matches your P&L.

    For finance teams, this is the column that makes sense. Revenue hit the books on Tuesday, so Tuesday is when it counts. No back-attribution to clicks from last week. No mysterious future-dated conversions appearing on historical dates.

    But here's the catch: this column tells you nothing about why those conversions happened. It doesn't help you evaluate which campaigns drove performance. A great converting day might be the result of clicks from five different days across three different campaigns. The calendar view flattens all of that into a single number.

    Why the Numbers Disagree

    The gap between these two columns depends on your purchase consideration window - the time between click and conversion. For impulse purchases (fast fashion, snacks, low-price accessories), the gap is small. Click and buy happen on the same day.

    For considered purchases (furniture, electronics, premium fashion), the gap is significant. A 7-14 day purchase cycle means Monday's clicks convert next week. The two columns can show completely different trend lines for the same period.

    The disagreement is most extreme in three scenarios:

    • Promotional events: Heavy traffic on Black Friday, conversions spread across the following week. Click-date shows Friday as incredible; conversion-date shows a gradual tail.
    • Campaign launches: New campaigns generate clicks immediately but conversions lag. Click-date looks terrible early on; conversion-date normalises over time.
    • Budget changes: Increasing spend drives more clicks today but conversions arrive later. Click-date ROAS tanks temporarily; conversion-date stays stable.

    If you don't understand which column you're reading, every one of these scenarios triggers a wrong decision. You either celebrate too early, panic too soon, or reverse a change that was actually working.

    Which Column Matters - And When

    Neither column is "right." They serve different purposes:

    Use "Conversions" (click-date) for:

    • Evaluating campaign and ad group effectiveness
    • Understanding which traffic is converting
    • Bid strategy assessment (this is what Smart Bidding sees)
    • Historical performance analysis (settled data only - 7+ days old)

    Use "Conversions (by conv. time)" for:

    • Finance reconciliation and P&L alignment
    • Comparing Google Ads data against Shopify/GA4
    • Daily revenue tracking and cash-flow visibility
    • Board-level reporting where dates must match transactions

    The mistake most brands make is using click-date data for finance reporting (creating discrepancies with Shopify) or using conversion-date data for campaign optimisation (which tells you nothing about what caused the conversions). Each column has a job. Use them for their intended purpose.

    The Bidding Implications

    Smart Bidding uses click-date conversions. It evaluates each auction by asking: "If I bid £X on this click, what's the probability it leads to a conversion?" It doesn't care when that conversion happens - only if it happens within the conversion window.

    This creates an important implication: your conversion window setting directly affects which conversions Smart Bidding "sees." A 7-day window and a 30-day window produce different conversion counts for the same clicks. The algorithm optimises differently under each setting.

    If your actual purchase cycle is 14 days but your conversion window is set to 7, Smart Bidding is systematically undervaluing your traffic. It sees fewer conversions per click than actually occur, which leads to lower bids, which leads to less traffic, which creates a downward spiral.

    Conversely, a 90-day window on a product with a 3-day purchase cycle inflates the attribution window unnecessarily and may credit clicks that had no real influence. The conversion window should match your actual purchase behaviour - and that varies by product category. If you sell both impulse buys and considered purchases, a P&L-aligned account structure lets you set different windows for different segments.
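The window effect is easy to see with made-up numbers. The sketch below assumes a hypothetical product with a 14-day purchase cycle and counts how many conversions land inside two different window settings:

```python
# Hypothetical click-to-conversion lags (in days) for one product line
# with a roughly 14-day purchase cycle.
lags = [2, 5, 9, 13, 16, 21]

def conversions_seen(window_days: int) -> int:
    """Count conversions that fall inside the attribution window."""
    return sum(1 for lag in lags if lag <= window_days)

print(conversions_seen(7))   # 7-day window misses most of the cycle
print(conversions_seen(30))  # 30-day window captures all six sales
```

A 7-day window sees only two of the six conversions here, so Smart Bidding would value these clicks at a third of their true worth.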

    Fixing Your Reporting Cadence

    The practical solution isn't choosing one column over the other. It's building a reporting cadence that uses each appropriately:

    • Daily cash-flow check: Use "by conv. time" to track actual daily revenue against forecast. Share this with finance.
    • Weekly performance review: Use "Conversions" (click-date) but only look at data 7+ days old. This gives settled click-date attribution for campaign-level decisions.
    • Monthly reporting: Include both columns. Show finance the "by conv. time" view for P&L alignment. Show the marketing team the "Conversions" view for campaign assessment. Explain the difference in a footnote - it prevents the "our numbers don't match" conversation.
    • Real-time panic prevention: Never look at either column for the last 72 hours and make a decision. The data isn't settled. The data lag problem makes both columns unreliable for recent periods.
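The "only look at settled data" rule from the weekly review can be enforced mechanically. A minimal sketch, assuming a simple list of daily report rows (the field names are illustrative, not a Google Ads API schema):

```python
from datetime import date, timedelta

def settled_rows(rows, today, settle_days=7):
    """Keep only rows old enough for click-date conversions to have landed."""
    cutoff = today - timedelta(days=settle_days)
    return [r for r in rows if r["date"] <= cutoff]

rows = [
    {"date": date(2026, 3, 1), "conversions": 40},  # settled
    {"date": date(2026, 3, 9), "conversions": 12},  # still filling in
]

# Only the March 1st row survives the 7-day settling filter.
print(settled_rows(rows, today=date(2026, 3, 10)))
```

Building the filter into the report itself removes the temptation to react to numbers that are still moving.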

    The most common failure mode: an ecommerce director checks yesterday's "Conversions" column, sees low numbers, and fires off an email asking why performance dropped. But those numbers will look completely different by Friday once delayed conversions populate. The intervention triggered by Monday's panic is worse than doing nothing. Report what matters - not what's newest.

    Next Steps

    If your marketing team and finance team are looking at different conversion columns - or worse, neither knows which one they're reading - you have a reporting alignment problem that's driving bad decisions. Fix the framework before you optimise the campaigns.
