Google Panda comes of age

At the start of this month (June 2015), Google announced that a revision of its Panda algorithm was due to be rolled out “over the coming weeks”. Announcements such as these leave the SEO community with bated breath—and rightly so, since an algorithm update can truly shake up the visibility of brands online.

Panda is a site-wide algorithm, which means that if Google decides to penalise your website, your visibility as a whole will take a hit. Your only hope of recovery comes at the set points of subsequent refreshes, which can mean suffering a significant dip in traffic for months or even a year. It’s not unusual for smaller businesses to cease trading following an algorithmic penalty.

By contrast, a manual penalty is applied on a page-by-page basis and can be lifted as soon as you tidy up your act and submit a reconsideration request to Google. This is far easier to resolve and move past, hence the quaking in the SEO crowd’s proverbial boots at the announcement of an algorithm update.

In the case of the forthcoming Panda roll-out, we’re advised that this is a “data refresh”: a cyclical exercise in which Google reassesses sites that were previously penalised and have since made efforts to improve their content quality.

Alongside the above, a fresh crawl of the Internet will catch out new and previously undetected websites with “thin” and otherwise spammy content.

What is “thin” content? In a nutshell, this term describes text and images that exist merely to try to influence rankings without offering any real value to the end user.

Our advice here is to write all of your website copy with your audience in mind. Even if one keyword is shown to drive an impressive number of online searches, avoid stuffing your on-page copy and your meta data with it over and over.

Write with a rich lexicon and natural synonyms, just as you would speak and write elsewhere. Google is now sufficiently evolved to appreciate and reward written flair.

The above also applies to all that goes on “behind the scenes” of your website, within the meta data. Keep this honest and clear, and, again, avoid keyword stuffing. If your website sells red dresses, then do use your meta fields and your primary headings (H1s) to say so—but don’t try to rig the system by claiming you sell blue dresses in order to capture susceptible traffic. You’ll only get your wrists slapped, and ever more quickly.
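For illustration, here is a minimal sketch of what honest meta data and headings might look like for that red-dress example (the shop name, title and copy below are purely hypothetical):

```html
<!-- Hypothetical category page: the title, description and H1 all describe
     what the page actually sells (red dresses), with no keyword stuffing
     and no misleading "blue dresses" bait. -->
<head>
  <title>Red Dresses | Example Boutique</title>
  <meta name="description"
        content="Shop our range of red dresses, from evening gowns to casual summer styles, with free UK delivery.">
</head>
<body>
  <h1>Red Dresses</h1>
  <!-- Supporting copy can use natural variations (scarlet, crimson, ruby)
       rather than repeating the exact keyword again and again. -->
</body>
```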

What’s more, the algorithm refresh will also look to catch out and demote the overall visibility of websites riddled with spelling and grammar errors, since such copy can be read as spam generated by a robot rather than a human.

We also advise you to look out for innocent mistakes that can see your website hit with a penalty. Internal duplicate content can land you in hot water, and e-commerce sites in particular will need to sort out their robots.txt files to account for delivery and returns information that is potentially repeated across hundreds of pages.
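By way of illustration, a robots.txt along these lines could keep crawlers away from duplicated delivery-and-returns boilerplate; the paths here are purely hypothetical and will differ from site to site:

```
# Hypothetical robots.txt sketch: keep crawlers away from a printable
# delivery-and-returns page and a tab variation that repeat the same
# information already shown on hundreds of product pages.
User-agent: *
Disallow: /delivery-and-returns/print/
Disallow: /*?tab=delivery
```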

Similarly, CMS setups that automatically generate duplicate URLs need to be remedied, and canonical tags used correctly, so that www.website.com, http://website.com and www.website.com/ all resolve to one URL only (ideally the first permutation).
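In practice, that usually means adding a canonical tag to each page template; a minimal sketch, assuming the www version is your preferred URL:

```html
<!-- Whichever protocol, subdomain or trailing-slash variation serves the
     page, this tag points search engines at a single preferred URL. -->
<link rel="canonical" href="http://www.website.com/">
```

Ideally this is paired with server-level 301 redirects, so visitors and crawlers are sent to the preferred version rather than ever landing on the duplicates.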

Remember that, intelligent as it is, the algorithm still operates on a yes/no basis, so it will take all of the above as spam, when the reality is that, where technical setup is concerned, most webmasters are simply unaware of their digital footprint.

Finally, if you’re still in the era of content farming and article spinning, we advise you to make a sharp exit before the panda gets you.

Want more information? Contact us here
