The Google ranking algorithm is constantly changing. Multiple times a day, Google introduces small tweaks, but every now and then, they implement larger changes that more dramatically alter the SERPs. It can certainly be hard for SEO professionals to keep up with all these changes, so this post will give a brief overview of some of the most important changes to the Google ranking algorithm over the past few years. For a more in-depth look at these and some of the more minor algorithm updates, Moz provides the most up-to-date history of changes.

First things first, it’s important to understand how SEOs uncover these algorithm updates. Google doesn’t always announce when changes are coming, nor do they confirm when an update has occurred. Instead, SEOs across the globe come together to discuss changes in the SERPs on forums, social media, and news sites. By combining their knowledge and data, SEOs can collectively determine how volatile the rankings are at any given point in time. Highly volatile rankings (rankings showing significant changes) point towards some form of update to Google’s ranking algorithm. Luckily, you don’t always have to keep up with all the industry chatter. Sources like SEMrush provide helpful tools that give you an idea of how volatile the SERPs are each day, both overall and for a variety of specific verticals.
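To make that idea concrete, here is a minimal Python sketch of how a volatility score can be derived from daily rank tracking: compare each tracked keyword’s position today against yesterday and average the absolute movement. The keyword data and any notion of a “high” score below are purely illustrative, not how SEMrush or any other vendor actually computes its index.

```python
# Toy illustration of how rank-volatility trackers work: compare each
# tracked keyword's position today against yesterday and average the
# absolute movement. All keyword data below is made up.

def volatility_score(yesterday: dict[str, int], today: dict[str, int]) -> float:
    """Average absolute change in ranking position across tracked keywords."""
    moves = [abs(today[kw] - yesterday[kw]) for kw in yesterday if kw in today]
    return sum(moves) / len(moves) if moves else 0.0

yesterday = {"chicago cubs tickets": 4, "buy concert tickets": 7, "seo tools": 2}
today = {"chicago cubs tickets": 9, "buy concert tickets": 3, "seo tools": 2}

print(f"volatility: {volatility_score(yesterday, today):.1f}")
# Large average swings across many keywords hint that an update is rolling out.
```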

1. Panda

When: February 24, 2011

Why:

Google wanted to weed out sites that were heavy in obtrusive ads, sites that contained a large amount of thin content with no real value to searchers, and sites that engaged in “keyword stuffing”. The Panda update was one of the first algorithm updates to start to seriously tackle the issue of “webspam” that cluttered the SERPs.

Effects:

The Panda algorithm update applied a new “quality score” to all sites in Google’s index. Ultimately, it penalized sites with low-value content and rewarded sites with quality content.

This quality score began as what Google called a “filter”, but it was eventually rolled into the core ranking algorithm in January 2016. Now, sites penalized by Panda for a low quality score see those effects much more quickly; conversely, changes to your site can help you recover from those penalties much faster than before.
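Google has never published how the Panda quality score is calculated, but a toy heuristic helps illustrate the kind of signal it targeted. The sketch below flags pages with very little body copy as potentially “thin”; the 300-word threshold is an arbitrary assumption for illustration.

```python
# A deliberately crude, hypothetical "thin content" check -- NOT Google's
# actual Panda scoring, which has never been published. It flags pages
# whose body copy falls below a minimum word count.

MIN_WORDS = 300  # arbitrary illustrative threshold

def is_thin(page_text: str, min_words: int = MIN_WORDS) -> bool:
    """Flag a page as potentially thin if its body copy is very short."""
    return len(page_text.split()) < min_words

pages = {
    "/guide-to-local-seo": "word " * 1200,          # long-form content
    "/doorway-page-37": "buy cheap widgets " * 20,  # 60 words of filler
}
for url, text in pages.items():
    print(url, "->", "thin" if is_thin(text) else "ok")
```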


2. Penguin

When: April 24, 2012

Why:

Google wanted to devalue “webspam”. As most SEOs know, one of the key ranking factors in Google’s algorithm is a site’s backlink profile. Before Penguin, a large number of sites would buy links from other sites or participate in “link networks”: groups of sites linking to each other in large volume to artificially inflate one another’s backlink profiles.

Effects:

Sites that were participating in link networks or buying links saw an immediate decrease in their rankings, with some getting a complete site-wide penalty that removed them from the SERPs entirely. In the post-Penguin SEO landscape, SEOs now must come up with new and creative ways to obtain legitimate backlinks from reputable sites, rather than simply attempting to achieve as many backlinks from as many sources as possible.
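One signal a post-Penguin link audit often examines is anchor-text concentration: organic backlink profiles tend to be dominated by branded or bare-URL anchors, while purchased ones repeat the same money keyword. The sketch below is a hypothetical illustration of that check; the backlink data and the 40% threshold are invented.

```python
# One hypothetical signal in a post-Penguin link audit: anchor-text
# concentration. The backlink data and the 40% threshold are invented.

from collections import Counter

def top_anchor_share(anchors: list[str]) -> tuple[str, float]:
    """Return the most common anchor text and its share of all backlinks."""
    anchor, count = Counter(anchors).most_common(1)[0]
    return anchor, count / len(anchors)

backlinks = ["cheap widgets"] * 70 + ["example.com"] * 20 + ["click here"] * 10
anchor, share = top_anchor_share(backlinks)
if share > 0.40:
    print(f"'{anchor}' makes up {share:.0%} of anchors -- likely manipulated")
```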


3. Hummingbird

When: August 22, 2013

Why:

Prior to Hummingbird, Google’s search algorithm was highly focused on the individual terms in a search query. It’s no surprise that one of the main tools SEOs use to track progress is how a site ranks for a given list of keywords; this practice stems from the early days of SEO. Hummingbird changed this significantly. Rather than looking at the individual words in a query, the updated algorithm began interpreting the meaning of the query as a whole, rather than as a combination of its individual parts. In other words, this update was one of the first major pushes towards semantic search.

Effects:

The post-Hummingbird SERPs are much more targeted towards a user’s intent. For example, whereas a search for “buying tickets to a chicago cubs game” would previously return the Chicago Cubs homepage as the number one result, Hummingbird adjusted the SERPs to direct searchers to the specific page on the Cubs site for buying tickets. In addition, this update greatly increased the frequency of Knowledge Graph results appearing in the SERPs.
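A toy example makes the shift clearer. The snippet below contrasts raw term matching, which can be gamed by repeating query words, with interpreting the whole phrase’s intent first. The pages and the intent rule are invented for illustration; real semantic search relies on far richer language models.

```python
# Toy contrast between term matching and whole-query interpretation.
# Pages and the intent rule are invented; real semantic search relies on
# far richer language models.

QUERY = "buying tickets to a chicago cubs game"

pages = {
    "cubs homepage": "chicago cubs chicago cubs chicago cubs news schedule",
    "cubs tickets page": "buy chicago cubs game tickets pricing",
}

def term_score(query: str, doc: str) -> int:
    """Raw term frequency: how often the query's words appear in the doc."""
    words = doc.split()
    return sum(words.count(w) for w in set(query.split()))

for name, doc in pages.items():
    print(name, term_score(QUERY, doc))  # the keyword-heavy homepage wins

# Interpreting the query as a whole: "buying ... tickets" signals a
# transactional intent, so only pages that actually sell tickets qualify.
if "buying" in QUERY and "tickets" in QUERY:
    best = max(pages, key=lambda p: ("buy" in pages[p]) + ("tickets" in pages[p]))
    print("intent-aware pick:", best)  # -> cubs tickets page
```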


4. Pigeon

When: July 24, 2014

Why:

Before Pigeon, Google’s core ranking algorithm and their ranking algorithm for the local pack were relatively distinct. Pigeon brought the local algorithm and the core algorithm closer together by changing the local algorithm to include more of the ranking factors associated with the core algorithm.

Effects:

Pigeon really shook up the local SERPs. Businesses with a stronger overall web presence and more polished websites saw their rankings increase in local results, whereas previously they may have suffered due to a lack of proximity to the searcher. In addition, before Pigeon the local pack would sometimes show up to 7 results; post-Pigeon, the frequency of these 7-packs dropped dramatically. These changes made local search more competitive and continued Google’s trend toward rewarding the sites that provide the most real value to searchers.
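The sketch below illustrates the general idea with a hypothetical local-ranking blend: each business is scored on both proximity to the searcher and a stand-in for core-style site quality, so a stronger site slightly farther away can outrank a weak site next door. The weights and data are invented, not Google’s.

```python
# Hypothetical local-ranking blend in the spirit of Pigeon: score each
# business on proximity AND core-style site quality, not proximity alone.
# Weights and data are invented for illustration.

from dataclasses import dataclass

@dataclass
class Business:
    name: str
    distance_km: float   # distance from the searcher
    site_quality: float  # stand-in for core ranking signals, 0..1

def local_score(b: Business, w_quality: float = 0.6) -> float:
    proximity = 1 / (1 + b.distance_km)  # closer -> higher, in 0..1
    return w_quality * b.site_quality + (1 - w_quality) * proximity

businesses = [
    Business("Nearby Shop (weak site)", distance_km=0.5, site_quality=0.2),
    Business("Farther Shop (strong site)", distance_km=3.0, site_quality=0.9),
]
for b in sorted(businesses, key=local_score, reverse=True):
    print(f"{b.name}: {local_score(b):.2f}")  # the stronger site ranks first
```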


5. Mobile (aka Mobilegeddon)

When: April 21, 2015

Why:

With more and more users searching from mobile devices like phones and tablets, it was becoming apparent that web design as a whole was changing. Sites began developing dedicated mobile versions, and responsive web design took hold across the Internet. To ensure that the SERPs were providing results as accessible on mobile as on desktop, Google changed its algorithm to reward sites that were mobile-friendly.

Effects:

Sites that were mobile-friendly, whether through a dedicated mobile version or a responsively designed desktop site, were heavily rewarded, while sites that were difficult to access or slow to load on mobile devices dropped in rank. In the end, the Mobile update (or “Mobilegeddon”, as many people called it) brought more mobile-friendly results to the SERPs and increased the amount of app content displayed, allowing mobile users to click directly through to apps.
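A full mobile-friendliness audit covers tap targets, font sizes, and load speed, but one small, scriptable slice of it is checking whether a page declares a responsive viewport meta tag. The sketch below shows that single check; it is a rough heuristic, not Google’s mobile-friendly test.

```python
# One small, scriptable slice of a mobile-friendliness audit: does the page
# declare a responsive viewport meta tag? A rough heuristic only -- Google's
# test also weighed tap targets, font sizes, mobile load speed, and more.

import re
import requests  # third-party: pip install requests

def has_viewport_tag(url: str) -> bool:
    html = requests.get(url, timeout=10).text
    return bool(re.search(r'<meta[^>]+name=["\']viewport["\']', html, re.I))

print(has_viewport_tag("https://example.com"))
```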


6. RankBrain

When: October 26, 2015

Why:

To continue to increase the quality of their SERPs, Google decided to implement machine learning to help guide the development of their core algorithm.

Effects:

The effects of RankBrain are far-reaching and complex, even to most SEO professionals. Search Engine Land sums up how RankBrain affects the SERPs concisely, explaining:

The methods Google already uses to refine queries generally all flow back to some human being somewhere doing work, either having created stemming lists or synonym lists or making database connections between things. Sure, there’s some automation involved. But largely, it depends on human work.

The problem is that Google processes three billion searches per day. In 2007, Google said that 20 percent to 25 percent of those queries had never been seen before. In 2013, it brought that number down to 15 percent, which was used again in yesterday’s Bloomberg article and which Google reconfirmed to us. But 15 percent of three billion is still a huge number of queries never entered by any human searcher — 450 million per day.

Among those can be complex, multi-word queries, also called “long-tail” queries. RankBrain is designed to help better interpret those queries and effectively translate them, behind the scenes in a way, to find the best pages for the searcher.
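The underlying idea can be sketched in a few lines: represent queries as vectors so that a never-before-seen long-tail query can be matched to the closest known query by meaning rather than by exact words. The hand-made three-dimensional “word vectors” below are purely illustrative; production systems learn embeddings from enormous corpora.

```python
# Toy sketch of matching a never-seen query by meaning: embed queries as
# vectors and find the closest known query. The hand-made 3-dimensional
# "word vectors" are purely illustrative.

import math

# Hypothetical dimensions: (sports-ness, commerce-ness, location-ness)
VECTORS = {
    "cubs": (0.9, 0.1, 0.3),
    "tickets": (0.2, 0.9, 0.0),
    "buy": (0.0, 1.0, 0.0),
    "chicago": (0.3, 0.1, 0.9),
    "weather": (0.0, 0.0, 0.8),
}

def embed(query: str) -> tuple:
    """Average the vectors of known words to represent the whole query."""
    vecs = [VECTORS[w] for w in query.split() if w in VECTORS]
    return tuple(sum(dim) / len(vecs) for dim in zip(*vecs))

def cosine(a: tuple, b: tuple) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

known = ["buy cubs tickets", "chicago weather"]
unseen = "how do i purchase tickets for the cubs"  # phrasing never seen before
best = max(known, key=lambda q: cosine(embed(q), embed(unseen)))
print(best)  # -> "buy cubs tickets"
```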


7. Fred

When: March 8, 2017

Why:

Recent years have seen a spike in affiliate marketing. As a result, many sites were producing thin content composed mainly of large numbers of affiliate links, or content that was heavily ad-centered. A large portion of these sites were in direct violation of Google’s webmaster guidelines.

Effects:

Sites whose main purpose was driving ad revenue, rather than providing valuable content, were heavily penalized, with some sources reporting drops in organic traffic of as much as 90% for affected sites. The Fred update is just the latest step in Google’s continued pursuit of ridding the SERPs of valueless content.
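A content audit in the spirit of Fred might compare how much of a page consists of affiliate links versus actual copy. The sketch below is a hypothetical heuristic; the regex, sample HTML, and cutoff are invented, and Google has not published Fred’s actual criteria.

```python
# Hypothetical post-Fred content audit: how much of a page is affiliate
# links versus actual copy? The regex, sample HTML, and cutoff are invented;
# Google has not published Fred's actual criteria.

import re

def affiliate_link_ratio(html: str) -> float:
    """Affiliate-looking links per 100 words of text (rough heuristic)."""
    links = re.findall(r'href="[^"]*(?:affiliate|aff_id|tag=)[^"]*"', html)
    words = re.sub(r"<[^>]+>", " ", html).split()
    return 100 * len(links) / max(len(words), 1)

page = '<p>Top widgets!</p>' + '<a href="https://shop.example/?tag=aff123">buy</a>' * 15
ratio = affiliate_link_ratio(page)
print(f"{ratio:.1f} affiliate links per 100 words")
if ratio > 5:  # arbitrary illustrative cutoff
    print("flag for review: ad-centered thin content")
```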


Jordan Stella
Customer Success Manager at UpCity

Jordan J Stella is a digital marketing professional whose passion is to help brands find their voice, tell their story, and connect with real people. As Customer Success Manager for UpCity, Jordan helps clients integrate our Agency Growth Engine into their existing workflows and processes.