Over the last couple of years Google has made some radical changes to the way it ranks webpages in its search index. Many of these changes have resulted in previously popular and successful websites being dumped overnight.
Attack of the Pandas
In early 2011 the Panda update was rolled out. This decimated sites that had too much low quality content. Thin pages, repetitive subjects, duplicated and re-used articles were slam dunked into the search abyss. Recovery was nigh on impossible for some people (although thanks to some excellent advice from a Google+ acquaintance I recovered my main business site). Many people simply abandoned their sites though and started working on new ones. The solution: Remove thin, crappy and duplicate content.
March of the Penguins
Then in 2012 Google presented us with its Penguin. Quack quack! No, that’s ducks, isn’t it? Anyway, where Panda was all about the website (a.k.a. on-page optimisation / spam), Penguin was, apparently, largely about off-site SEO – i.e. links. This gave birth to a new (and totally unproven) idea in SEO – Negative SEO. Suddenly everyone who had lost rankings post-Penguin was blaming low quality links.
Personally, I still think that this is a load of bull. Why? Some people, mostly SEOs building traffic to content sites, used cheap SEO tactics in the past. Those low quality links have now been discredited. The result is that websites drop in rankings. But this is not a penalty (IMHO), it is a re-adjustment. You cannot rank a site (so easily) using only low quality (and easily obtainable) links. The solution – better quality links are needed now, from authority sites.
A bit later this year there was the poorly named page layout update. This one smacked sites that had slapped adverts all over the top half of the page (above the fold), enticing readers to click before they had even seen the content. The SEOs who had survived Panda and Penguin thought they were in the clear and, for a while, were raking in all the cash from adverts in the top niches. But then those sites got whacked too.
Exact Match Dunked
Then, yesterday, we had another Google update. This one downgraded all websites that had relied on an exact match domain (EMD) to rank well. Many SEOs commented that these should have been the first to go. I suspect Google chose to knock them on the head last, to see whether the Panda, Penguin and MFA (made-for-AdSense) changes had knocked them out already.
N.B. Do not confuse with EMF, like I did. Pressing play is optional. However, a musical interlude is recommended at this stage in the article.
The result, of course, was that some EMD sites had been doing even better (unlike EMF, who never really did much after Schubert Dip, although they have been performing at some festivals recently). This update seems to have hit some people really hard, though. Many web development firms who dabble in SEO have built websites for their customers on exact match domains rather than brand domains. For (made up) example, Mr. Frank Shufflebottom’s haberdashery business would be put on www.besthatsinlondon.com (not actually registered at time of going to virtual press) rather than ShufflebottomHaberdashery.com, which, although a far better name, is not one that any sane person would search for (or even make up).
Now, today, Google have announced that they have just published an update to their Webmaster Guidelines. We could therefore leap to the conclusion that Google are done fiddling with the search algorithm for the time being. For lack of anything better to do right now, I shall indeed leap to that conclusion. So, let’s take a little time to read…
Webmaster Guidelines – Best practices to help Google find, crawl, and index your site
Google really do not care for pretty URLs. Their Webmaster Guidelines are here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
I highlighted “bin” for no reason other than it appeals to my childish sense of humour.
The page starts with an introduction from Michael Wyszomierski. He works at Google (you can tell by the huge Google logo on his T-shirt and the white background in the video). He helps me, but does not know me. (My 5-year-old just came in the room and said “who’s that?” while pointing at Michael Wyszomierski on the screen. I told him that he works for Google. My son then asked me if he helps me, I said “yes”. He then asked if I know him, I replied “no”. Thought I had better explain that!)
In the video Michael talks about syndicated content, thin affiliates and doorway pages. These are areas that have really been targeted in the last year or so.
“A violation of our webmaster guidelines can negatively impact your performance in our search results, and in some cases can even result in removal from our search results.” – Michael Wyszomierski
I should put on my serious voice now. It really is a great read – a vital read, you could argue. So read it. The new and improved webmaster guidelines cover:
- Design and content guidelines
- Technical guidelines
- Quality guidelines
I have skimmed over the page, and really cannot see anything blindingly obvious that has changed. I think that there is just better clarification now. For example, one piece of advice under Quality guidelines is to avoid “Participating in affiliate programs without adding sufficient value”. It then links to a page that goes into more detail on this topic.
Pierre Far (a Googler who speaks on Google+) made some comments too:
“And we’ve added guidelines for rich snippets!!
The key sentences from the blog post are: ‘The main message of our quality guidelines hasn’t changed: Focus on the user. However, we’ve added more guidance and examples of behavior that you should avoid in order to keep your site in good standing with Google’s search results.’”
Another tip that I do not remember reading before (although maybe because it was obvious) is “Preventing and removing user-generated spam on your site”. So, spammy comments on blogs, in forums or on wiki sites – get rid of them; they can harm your site. Linking out to bad neighbourhoods has been bad for SEO for as long as I have been involved (which is since 2006), so nothing dramatically new – but automated spam commenting using computer programs is something newer; at least, it seemed to become more prevalent in the last few years.
Why are you still reading this? You should be reading the Google page now!