Google Limits Search Referrals

I have just been analysing the stats in Google Webmaster Tools for one of my sites and spotted a very suspicious trend.

Between 4 September and 4 October 2010 one site received exactly 2400 Google search referrals on 16 separate days, even though the number of search impressions varied each time. It also received exactly 1900 search referrals (clicks) on 10 days, and on the remaining days it received 1600 clicks. On the highest search impression day the site appeared 27,100 times in the Google results pages but got 2900 clicks. On the highest 2400-click day it had 22,200 impressions and on the lowest 2400-click day it had 18,100 impressions. Below this number of impressions, clicks dropped to 1900.
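
If you want to look for the same pattern in your own account, a few lines of Python will do it. This is just a sketch: it assumes you have exported your daily figures to a file called clicks.csv with date and clicks columns (the file name and column names are my own invention, not anything Webmaster Tools gives you by default). A genuinely random month should produce almost no exact repeats:

    import csv
    from collections import Counter

    # Tally how often each exact daily click total occurs
    with open("clicks.csv", newline="") as f:
        totals = Counter(int(row["clicks"]) for row in csv.DictReader(f))

    # Daily totals that repeat are the suspicious ones
    for clicks, days in totals.most_common():
        if days > 1:
            print(f"{clicks} clicks on {days} days")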

There are only four possible explanations for this:

  1. Webmaster Tools provides false information
  2. Webmaster Tools rounds everything up or down
  3. Humans are incredibly predictable, always stopping at 1600, 1900 or 2400 visits per day
  4. Search referrals are capped by Google

The third option is implausible, as I get traffic from all over the world. So unless I am at the centre of a Truman Show style web development reality TV show, I cannot believe that is the reason.

Could Webmaster Tools just be wrong? Does it round everything up or down to the nearest 500 to give a rough idea of search volume? Maybe, but then what would be the point of that? The idea is to help you analyse your traffic so you can learn why people are coming to your site, what they want and how to attract more of them. Rounding up or down in lots of 500 does nothing to help with any of that.

Actually, it is not lots of 500; the pattern seen in my reports is:

  • 1600 Google search clicks
  • 1900 Google search clicks
  • 2400 Google search clicks

Maybe each band increases by 200 more than the previous difference (a sequence with constant second differences, to give it its proper name; I was never good at stats). Maybe if I can up my game Google will then send 3100 visitors to the site? If so, the next daily Google search click limits would be (a short sketch generating the sequence follows the list):

  • 3100
  • 4000
  • 5100
  • 6400
  • 7900
  • 9600
  • 11500
  • 13600
  • 15900
  • 18400
  • 21100
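
For what it is worth, the rule is simple enough to check in code. Here is a short Python sketch that generates the whole band list; the 1600 starting point and the first gap of 300 are simply the values observed above, so treat it as a guess rather than anything official. Incidentally, the same rule collapses to the closed form (n² + 15) × 100:

    # Each gap between caps grows by 200, starting from the
    # observed 1600 -> 1900 gap of 300.
    def bands(start=1600, first_gap=300, step=200, count=14):
        value, gap = start, first_gap
        yield value
        for _ in range(count - 1):
            value += gap
            gap += step
            yield value

    # Equivalently, the nth band is (n*n + 15) * 100.
    print(list(bands()))
    # [1600, 1900, 2400, 3100, 4000, 5100, 6400, 7900, 9600,
    #  11500, 13600, 15900, 18400, 21100]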

The easiest way to determine whether this is the case is for other people to check their Webmaster Tools reports for September and see if any of these daily figures appear (Your Site on the Web > Search queries).
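
And if you do export your figures, checking them against the predicted bands is one more short step (this assumes the same made-up clicks.csv as the earlier sketch):

    import csv

    # The observed bands plus the predicted ones from the list above
    BANDS = {1600, 1900, 2400, 3100, 4000, 5100, 6400, 7900, 9600,
             11500, 13600, 15900, 18400, 21100}

    with open("clicks.csv", newline="") as f:
        for row in csv.DictReader(f):
            if int(row["clicks"]) in BANDS:
                print(row["date"], row["clicks"], "<- on a predicted band")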

So the final option, that search referrals are capped, that for some sites Google will send a fixed number of visitors to your domain and then turn the tap off, seems to be the only explanation left.

What does this mean? Well, if this is true, then it may be pointless trying to win more traffic from Google with a few small changes to your site. Maybe the cap is based on the number of pages you have, the quality score of your content, or the PageRank of the site. Whatever it is, Google do not appear to be telling us anything about it.

If anyone has any insights into this trend please let me know.

6 Comments on “Google Limits Search Referrals”

  1. I am now starting to think that option 2 is the case – everything in Google Webmaster Tools is rounded. But why do they not say this? There is a help page on the data but no mention of rounding the figures.

  2. #4

    On a relatively new and unestablished website, Google runs you through its algorithms and spits out a fixed number of times it will allow people to see your website in the SERPs.

    This is one of the reasons (for instance) that Google Webmaster Tools tells you that you rank top 10 for some infinitely difficult keyword, but shows < 10 for clicks. It shows < 10 for clicks because it may have only allowed you 20 impressions of that keyword, so you never had the opportunity to get more than 10 clicks.

    On unestablished websites, Google throttles traffic. It determines a fixed number of times to show your website and then people eat up your impressions.

    In this way, people who run Google bot scrapers harm webmasters with new or not-yet-trusted websites: their bot (which has absolutely no intention of clicking your link or visiting your website) burns an impression whenever your site turns up in the search results it is scraping.

    This has many implications. For instance, I can start a new website and shoot ONLY for long-tail traffic, making a point of not targeting any difficult keyword term, and I will get more traffic. Or I could shoot directly for a difficult keyword that many people scrape Google to monitor (such as "real estate"), and if I end up ranking at, say, #473 for "real estate New York", 99.9% of my impressions will be burned up by bots before any real people see my website to click it.

    If a webmaster works their butt off, they can get their impression cap removed and be allowed unlimited traffic — i.e. no matter how many times a search is performed, they keep ranking top 10 for a particular keyword. This only happens with certain trust factors in place, i.e. a good set of trusted backlinks, steady content produced over time, steady backlinks over time.

  3. i have the same suspicion that google definitely caps it. comparing the performance of different domains, i have come to think that this is yet another of google’s tricks to make you pay for ppc ads…..
    so, why not simply create several domains with unique content and benefit from a multiple search referral allowance?

  4. Yeah, but then the counter-argument is that if you build a higher quality site then in time you will be rewarded by moving up to the next level, which will bring much more traffic, i.e. a much higher cap, than you would get with 10 sites on the same subject. And optimising and managing 10 sites has to be harder than putting all your energy into one really good one?

  5. true, i’d like to add one more reason: domain diversification.
    on the stock market we wouldn’t put all our money in just one stock, we would diversify our portfolio across many stocks to reduce risk. naturally this implies more effort, more work, and higher transaction and maintenance costs. I think the same thing applies online, too. if one domain gets ditched for whatever reason, or doesn’t perform as well, it won’t hurt the pocket if there are a couple of sister domains.

    then, i also noticed the domain keyword profile that google generates from the textual content of the entire domain. if you have a lot of topics covered on a single domain, you’re basically at a disadvantage. splitting it up into several domains makes each domain look more ‘specialized’ because of google’s simplistic detection method. specializing in one topic per domain may in turn improve rankings. article directories that try to cover all topics in the world should therefore find it difficult to rank well…..but then, there’s wikipedia….explain that one 😉

    but as you say, it’s much more work for sure to work on several domains simultaneously.

  6. Well, that is true. It is a subject I have been trying to tackle recently. Of course, there are more ways to diversify. You can start to build a reputation offline, using traditional marketing and advertising to promote products and services, rather than relying only on organic search.
