Google Search Console vs. Google Analytics – Why Clicks Don’t Match Sessions



Several members of the SEO community have called the Search Analytics report in Google Search Console (formerly Google Webmaster Tools) inaccurate, largely because the count of clicks does not match the count of sessions in Google Analytics for the source Google and medium organic.

Rather than “inaccurate”, it’s really just a matter of trying to compare two completely different things. This article aims to cover how both are measured and a number of additional potential differences that need to be adjusted for when comparing these reports.

Google Search Console Search Analytics Report

Different Metrics

Google Analytics uses last non-direct attribution.

The use of last non-direct attribution in assigning sessions to a source and medium means that the reported number of Google organic sessions actually equals the true number of Google organic sessions plus the number of direct sessions that followed it. (A direct session is any session where Google did not identify the source, including sessions where the user visited the site directly without coming from another site.)

For example, if a user visited LunaMetrics 5 times – the first time through Google, and the last 4 visits via bookmark or typing in the URL in the web browser bar or visiting from an application like Outlook – GA attributes all 5 sessions to Google; however, Google Search Console will only report 1 click. This article explains GA’s attribution of direct visits in more detail.

This last non-direct attribution can cause dramatic differences between GA sessions and Search Console clicks. If your site has a large proportion of direct sessions and non-new sessions, there’s a good chance GA is inflating true organic traffic by quite a bit.
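As a rough illustration of how this plays out, here is a toy Python model of last non-direct attribution. This is a simplification for illustration only (it ignores the campaign timeout, for instance), not GA's actual processing:

```python
# Toy model of GA's last non-direct attribution vs. Search Console's click count.
# Illustrative sketch only -- not Google's actual implementation.

def attribute_sessions(sessions):
    """Assign each session a source using last non-direct attribution.

    `sessions` is a time-ordered list of raw sources, where "direct" means
    GA could not identify a referrer (bookmark, typed URL, app, etc.).
    """
    attributed = []
    last_known = "direct"           # stays direct until a source is known
    for source in sessions:
        if source != "direct":
            last_known = source     # a known source resets the attribution
        attributed.append(last_known)
    return attributed

# The example from the article: one Google organic visit, then four direct visits.
visits = ["google / organic", "direct", "direct", "direct", "direct"]
attributed = attribute_sessions(visits)

ga_organic_sessions = attributed.count("google / organic")  # GA reports 5
gsc_clicks = visits.count("google / organic")               # Search Console reports 1
```

The gap between `ga_organic_sessions` and `gsc_clicks` is exactly the inflation described above: every direct session that follows a known organic session inherits the organic attribution.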

The metrics clicks and sessions are substantially different by definition.

Clicks – “Count of clicks from a Google search results page that landed the user on your property.”

Sessions – Visits initiated by a user, counted when an HTML page loads the Google Analytics JavaScript snippet, adjusted for session timeouts.

That’s a mouthful. Let’s break it down.

Importantly, the Google Analytics JavaScript code snippet (or a script executing it) must be on the page to be counted in Google Analytics. That leads to three situations where Google Analytics won’t count a Search Console click as a session.

  1. Non-HTML pages like PDFs are not counted in Google Analytics (but are counted as Google Search Console clicks).
  2. The few users who do not have JavaScript enabled will go unnoticed by Google Analytics.
  3. The few users that click on a search result but bounce off the site before the GA snippet can be loaded will also go unnoticed by GA. (This number may be higher if the code snippet takes an unusually long time to execute – if it is at the bottom of the source code, for example.)

Also, session timeouts can moderately inflate the number of GA sessions over the number of clicks. As explained in this help article, Google Analytics ends a session and starts a new session after any one of the following three conditions are met:

  1. After 30 minutes of inactivity (followed by a resumption of activity)
  2. At midnight
  3. If a user arrives via one campaign, leaves, and then comes back via a different campaign.
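To make the three rules concrete, here is a toy sessionization sketch in Python. The timestamps and campaign names are invented, and this is an illustration of the rules above, not GA's actual processing:

```python
from datetime import datetime, timedelta

def count_sessions(hits, timeout=timedelta(minutes=30)):
    """Count sessions from a time-ordered list of (datetime, campaign) hits,
    applying GA's three documented session-break rules."""
    sessions = 0
    prev_time = prev_campaign = None
    for time, campaign in hits:
        new_session = (
            prev_time is None
            or time - prev_time > timeout          # rule 1: 30 min of inactivity
            or time.date() != prev_time.date()     # rule 2: midnight rollover
            or campaign != prev_campaign           # rule 3: campaign change
        )
        if new_session:
            sessions += 1
        prev_time, prev_campaign = time, campaign
    return sessions

# Hypothetical hits: four pageviews that GA would split into three sessions.
hits = [
    (datetime(2016, 8, 1, 10, 0), "google / organic"),
    (datetime(2016, 8, 1, 10, 10), "google / organic"),   # same session
    (datetime(2016, 8, 1, 11, 0), "google / organic"),    # >30 min gap: new session
    (datetime(2016, 8, 1, 11, 5), "email / newsletter"),  # campaign change: new session
]
```

Here `count_sessions(hits)` returns 3 even though there is only one Search Console click in the picture, which is how timeouts inflate GA sessions relative to clicks.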

Possibly Different Sources and Destinations

If you want to compare Google Analytics to Search Console data, make sure you are looking at the same traffic source to the same place.

Are you looking at the same Google traffic?

Google Search Console displays separate click counts for three different Search Types – Web, Image, and Video. Web is selected by default. Thus, if your site receives a lot of image traffic, the number of clicks in the default Search Console screen (for Web) will be artificially low compared to Google Analytics sessions for Google organic traffic. To know the total Google organic clicks in Search Console, you need to add Web, Image, and Video clicks together.
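A quick sketch of that totaling step, with hypothetical per-type numbers (in practice they would come from the Search Analytics report, one Search Type at a time):

```python
# Hypothetical per-search-type click counts, as reported separately by the
# Search Console UI (the default view shows Web only).
clicks_by_type = {"web": 1200, "image": 450, "video": 30}

def total_organic_clicks(clicks):
    """Total Google organic clicks = Web + Image + Video."""
    return sum(clicks.values())

total = total_organic_clicks(clicks_by_type)  # 1680, vs. 1200 in the default view
```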

Filtering can moderately deflate Google Analytics traffic. If you have Google Analytics filters set, you may be filtering out some traffic that Search Console tracks. Additionally, in a Search Console help article, Google states that, “Some processing of our source data might cause these stats to differ from stats listed in other sources (for example, to eliminate duplicates and visits from robots). However, these changes should not be significant.”

Are you looking at traffic to the same place?

It’s easy to think Search Console and GA are reporting traffic to the same pages when they really aren’t.

It is worth mentioning again that GA does not report activity on non-HTML pages like PDFs and Word docs but that Search Console does.

Also, the hostnames often differ between Search Console and GA. Google Analytics shows pages from every subdomain and hostname that carries the Google Analytics snippet (including scraped or proxied copies of your content), whereas Google Search Console only shows one specific subdomain per account. To make sure you're looking at traffic to the same pages in Search Console and GA, you must match the GA hostname to the Search Console property. If your site has multiple subdomains, this is especially important.
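One way to line the two up on the GA side is to restrict sessions to the hostname that matches the Search Console property. The rows below are hypothetical GA export data, included only to illustrate the filter:

```python
# Hypothetical GA rows broken out by hostname.  Note the scraped/proxied
# copy of the site, which GA counts but Search Console does not.
ga_rows = [
    {"hostname": "www.example.com", "sessions": 500},
    {"hostname": "blog.example.com", "sessions": 120},
    {"hostname": "translate.googleusercontent.com", "sessions": 15},
]

def sessions_for_hostname(rows, hostname):
    """Sum GA sessions for a single hostname, to match one GSC property."""
    return sum(r["sessions"] for r in rows if r["hostname"] == hostname)

www_sessions = sessions_for_hostname(ga_rows, "www.example.com")  # 500, not 635
```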

Finally, an individual Search Console account only has data for one protocol (HTTP or HTTPS). If your site serves both HTTP and HTTPS and you want to look at traffic to your entire site in Search Console, you need Search Console accounts for both the HTTP and the HTTPS versions (as well as every subdomain), and you'd need to add up the clicks reported in each account.
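In sketch form, the site-wide total is just the sum over every verified property. The property names and click counts here are hypothetical:

```python
# Hypothetical clicks per verified Search Console property
# (one per protocol x subdomain combination).
property_clicks = {
    "http://example.com/": 300,
    "https://example.com/": 2100,
    "http://blog.example.com/": 150,
    "https://blog.example.com/": 90,
}

site_total = sum(property_clicks.values())  # 2640 -- no single property shows this
```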

Reid Bandremer is a former LunaMetrician and contributor to our blog.

  • Jackson Lo

    Great article Reid. It’s also believed that Search Console just doesn’t report every data point of a site, especially when you’re getting into the millions of pages.

  • Antonio Araya

Good article. Also, on many occasions you can't see all the data because your site has 2 or more subdomains, or you have canonicalization issues: no 301 between the www and non-www versions, HTTP and HTTPS, etc., and you haven't selected the final version in Search Console. So I always try to verify all the versions of the website and canonicalize them into just one, to gather all the data in one place in SC.

    • Reid Bandremer

Agreed. I think not verifying all versions of your site is the #1 Search Console mistake.

  • Simon

    Hi Reid. This is great, but I think it omits some key points. Yes, if a user initially finds a site via organic and then bookmarks it and visits it directly the day after, GA will attribute all sessions to organic whereas GSC will attribute just 1 click, so GA may appear to have the higher figure. But what happens to counter this is that a user will typically engage in a lot of search activity all at one time when looking for a particular answer/product, often visiting the same site multiple times. This will appear as multiple clicks on GSC but only 1 session on GA, so the GSC number will be higher. This is a more likely scenario than the bookmarking one.

    We see GA Google organic sessions as about 1.5 x GSC clicks (web + images + video) and it’s extremely unlikely the difference is due to direct sessions being counted as organic. Given that all other data in GSC is without doubt *completely* inaccurate/wrong/outdated (backlinks, mobile usability, HTML improvements), do you not think it likely that the GSC search analytics is often just wrong too?

    • Reid Bandremer


Yes, I did not explicitly point out that if a user visits a site via Google organic, leaves, and comes back via Google organic within a 30-minute timeframe, there is no session timeout and GA counts it as only 1 session.

Your scenario is certainly not an uncommon one, and may be particularly common for research-intensive search situations. However, I do not think this scenario is more likely than GA attributing direct as organic.

Keep in mind that direct / (none) traffic is anything without a referrer value in the cookie, that the default campaign timeout is 6 months, and that GA never attributes a session as direct / (none) if there is a previous known referrer value within those 6 months. So organic traffic can also include up to 6 months' worth of visits from bookmarks, typing the site into the browser, email, PDFs, Word docs, software and apps, going from an HTTPS page to an HTTP page, etc.

      But of course, there are many different scenarios I’ve seen. And I think that is the more important point – the main reason(s) for a discrepancy in the totals of GSC clicks vs. GA sessions really depends on the specific situation.

And no, I do not think it is fair to attribute the discrepancy to GSC being *just wrong*. It is important to note that it is different from GA in specific ways. Every data reporting tool has technical limitations.

      And every report in GSC has specific technical limitations and caveats as well. I do not agree that “all other data in GSC is without doubt *completely* inaccurate/wrong/outdated” – that is a strong statement. Most of them are accurate enough to provide substantial value – but there are definitely some I trust and understand more than others.

A major reason (and a valid one) for lack of trust in GSC click reporting is that GSC does not document every aspect of its reporting mechanisms (or documents them poorly) – especially compared to the wealth of literature about the technical specifics of GA reporting.

      • Simon

Hi Reid, many thanks for the detailed response. The scenario I've described is common for ecommerce sites. Users will search for a particular item, click on a result, review the price/delivery time/store trustworthiness, then click back to the Google results, try another store, and so on, often jumping to the same store multiple times as they narrow down their choice. We see this happening all the time. Hence, 1 GA session, multiple GSC clicks.

        Our other GSC data really is completely inaccurate/wrong/outdated. Calling these issues ‘technical limitations and caveats’ is a very diplomatic way of talking round serious problems with the tool. Our backlinks profile includes huge numbers of links that haven’t existed for 6 – 12 months and omits hundreds of domains that should be there. Our mobile usability report has been stuck for months flagging around 50 ‘invalid’ pages, all of which are fully mobile usable when you do the live Google test. Our HTML improvements page lists duplicate titles that are actually the same page where the URL has been shortened (all versions 301 to the single correct URL) because it takes months to interpret the 301, etc. And then there are the bugs like the sudden drop in index count and missing 18th July.

        I disagree that most reports are accurate enough to provide substantial value. When there are this many issues with the tool and the data, it’s no longer possible to trust it. You suggest that Google could help clarify by providing further documentation, but are you sure?

        “PLEASE NOTE: This inbound links report may be up to 12 months out of date. Many links listed do not exist any more. Hundreds of domains that may be affecting your organic rankings are also excluded.”

  • Simon

    Hi Reid, thank you again for the detailed response. I still really struggle to trust GSC data, so I can’t find it as valuable as you. I wish I could.

    What I don’t understand (and perhaps you do?) is that GSC must get its data from somewhere. Let’s just consider backlinks for now. Google bots crawl the web and store all the backlink (and other) data. This data is presumably then used by GSC because it’s extremely unlikely that GSC has its own inferior separate crawl bot. So the question is… if GSC has Google’s backlink data available to it, how come it shows such inaccurate reports with hundreds of domains missing and many links shown that no longer exist?

    • Reid Bandremer

I wish I could answer that. What happens between data collection and report publishing is a mystery to us all. Are they using sampling and infrequent updates to conserve their own bandwidth, or do they purposely withhold some info, and to what extent do they do this? I don't know, but the answer appears to differ for different reports.

The Links to Your Site report should be viewed as a sample set only, imo (as should many other reports). They don't claim the backlink data is 100% complete – they refer to "sample" links, and they also explicitly state in the help article, "Not all links to your site may be listed. This is normal."

      • Simon

        Thanks Reid. Much appreciated.

  • Richard Mountainjoy

Thanks for your guys' old article, it was very useful. I couldn't understand the difference between strict and appeared search in the exported csv. I couldn't comment there, but thanks anyway.

  • BMCInternetMarketing

I really like the Search Console features, but in my opinion the data is really beta.
    Example data from one of our sites: the summary on top says total clicks 54, total views 1023, average CTR 5.28%, average position 8.1. And now the strange thing: the overview below only specifies ONE click, while Analytics says 60 clicks.

    The 54 clicks measured by GSC compared to 60 clicks in Analytics makes sense. But GSC only specifying one click? I really do not understand that at all…

    Is somebody experiencing the same kind of problem? I experience this with several sites.


  • Mark Watson


    I’m seeing lately more traffic in Webmaster tools than in Analytics and this goes up to 25% on a daily basis.

    Does somebody know what might be causing this and is this something to worry about?



  • tekTutorialsHub

    Excellent Article. Thanks for Sharing

  • Kanika

    Hi Reid,

    I’m seeing more traffic in Google Webmaster or Search Console tools than in Google Analytics. The difference is of around 30%-50%.

    Can you please help me to know what might be causing this and is this something to worry about?

  • Paolo Albera

Thanks a lot for this useful insight about the difference between GA and GSC traffic values. A very informative piece of content. Bookmarked!

  • Sean Juan

I have always wondered how certain search terms could have 1,000s of impressions while averaging a position on page 35. What human goes anywhere near page 35? My theory is that rank checkers are causing these crazy stats.

  • Anil Patel

    Ohh, finally I got the answer of my question running in my mind for this big difference. Thanks a lot Reid for this wonderful article.

  • gravymatt

Man, I just lost 2 minutes even before I started reading the article ’cause I recognized the hero image from Mortal Kombat 1 as the secret boss level where you fight Reptile – NICE!!!

