When animals stop being cute: a guide to Google algorithm updates

Google tends to name its algorithms after endearing animals—belying the brutal nature of their impact on website traffic and search engine visibility.

Before we dive into what each one does, it’s worth defining what an algorithm is: simply a set of computer-based rules.
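To make "a set of rules" concrete, here is a toy sketch in Python. The signals, weights and page data are invented for illustration only and bear no relation to how Google actually scores pages:

```python
# Toy illustration of an "algorithm" as a set of weighted rules.
# Every signal name and weight here is an invented assumption,
# not anything Google has published.
def score(page):
    rules = [
        (2.0, page["unique_content"]),     # reward original copy
        (1.5, page["answers_questions"]),  # reward user-focused content
        (1.0, page["quality_backlinks"]),  # reward reputable links
    ]
    return sum(weight * signal for weight, signal in rules)

pages = [
    {"name": "thin-page", "unique_content": 0.2,
     "answers_questions": 0.1, "quality_backlinks": 0.3},
    {"name": "rich-page", "unique_content": 0.9,
     "answers_questions": 0.8, "quality_backlinks": 0.7},
]

# Order the pages by their rule-based score, best first.
ranked = sorted(pages, key=score, reverse=True)
print([p["name"] for p in ranked])  # rich-page ranks first
```

The real algorithm evaluates vastly more signals, but the principle is the same: rules in, ordering out.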

The Google algorithm is immensely intricate and is designed to ensure that search results provide a great user experience. When search engines were first created, early search marketers were able to easily find ways to make the search engine think that their client’s site was the one that should rank well. As a result of constant attempts to “game” the system, the algorithm has now evolved into a set of rules that comes scarily close to human thought.


Panda focuses on content quality. New iterations of this algorithm are rolled out monthly, with Google only announcing significant changes. Recovery from a Panda hit may take one or two refreshes, since Google needs that time to crawl all of a website’s pages. Ensure that page copy is unique, adds value for the end user and is authoritative about its products, subject or industry.


The Penguin update zones in on the quality of a website’s backlink profile. A protocol for analysing and removing “toxic” backlinks is explored later on, along with the right way to secure new, high-quality backlinks.


Whereas Panda and Penguin could be considered component refreshes of the algorithm, Hummingbird overhauled the algorithm as a whole in a bid to better understand human thought patterns, analysing search queries more intelligently to yield more relevant results. To perform well on the SERPs in light of this significant change, the emphasis should be on content that answers users’ questions, rather than content designed solely for SEO purposes.


In September 2014, the Pigeon roll-out shook up the way results are served for local SEO. Most notably, “hyperlocalisation” saw the search radius for queries shrink significantly, emphasising the need to maintain Google My Business listings for every store branch of a given brand.


Website content should be unique, engaging and written to please users—rather than search engine robots.

In practice, this means producing content that is rich in synonyms and answers users’ questions, rather than simply stating the obvious.

Disreputable backlinks should be cleansed from the profile.

New backlinks should be earned only from relevant, high-quality sources, through the distribution of meaningful and unique content.
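The backlink-cleansing step above can be sketched as a simple triage. This is a hypothetical illustration: the `spam_score` field and the 0.6 threshold are invented assumptions, not values from any real SEO tool:

```python
# Hypothetical sketch of flagging "toxic" backlinks for review before removal.
# The spam_score values and the 0.6 threshold are illustrative assumptions.
backlinks = [
    {"domain": "industry-journal.example", "spam_score": 0.1, "relevant": True},
    {"domain": "link-farm.example",        "spam_score": 0.9, "relevant": False},
]

def is_toxic(link, threshold=0.6):
    # Treat high spam scores or off-topic sources as candidates for removal.
    return link["spam_score"] >= threshold or not link["relevant"]

to_remove = [link["domain"] for link in backlinks if is_toxic(link)]
print(to_remove)  # ['link-farm.example']
```

In reality this judgement is made link by link with human review; the point is simply that cleansing means applying consistent quality criteria across the whole profile.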

Want more information? Contact us here

About the Author