The Evolution of Google’s Algorithms

The History of Google’s Algorithms 

Google modifies its search algorithms thousands of times annually. Generally, these adjustments are minor, but on occasion, significant changes are revealed early to allow web developers to adjust their sites beforehand.

To prepare for and benefit from any Google update, it’s best to review Google’s documentation on SEO. Generally, if the content and experience your website offers are high-quality, you won’t need to worry much about any given update.

This article will help you gain familiarity with the major algorithm updates Google has rolled out in the past 10 years, how to prepare for the Core Web Vitals update rolling out in June 2021, and how to monitor future updates as they launch throughout the remainder of this year and beyond.


Google utilises more than 200 different ranking signals and a complex system of rules to decide which pages on the web are most relevant to a given search query. This information is used to organise the search results in terms of relevance, ensuring that users find what they’re looking for efficiently.


In its early years, Google made far fewer updates to its algorithm than it does now – thousands of large and small changes each year. Most of these are minor adjustments that neither webmasters nor users will notice. Every so often, however, Google announces a significant upgrade; these are referred to as core updates because they change a critical part of the algorithm and can substantially affect SEO and user experience.


Here is a list of the major Google algorithm updates over the past 10 years and a short summary of what each meant to webmasters and users.

Caffeine (2010)

Panda (2011)

Penguin (2012)

Venice (2012)

Pirate (2012)

Hummingbird (2013)

Pigeon (2014)

HTTPS/SSL (2014)

Mobilegeddon (2015)

RankBrain (2015)

Possum (2016)

Intrusive Interstitials Update (2017)

Mobile Page Speed Update (2018)

Medic (2018)

BERT (2019)


CAFFEINE (2010)

In 2010, Google released the Caffeine indexing system to provide faster search results for users and content publishers.

Caffeine enabled Google to crawl and store data more effectively. Google said it “gave web searches 50% fresher results than our prior index, and included the largest ever compilation of web content.”

To keep up with today’s fast-paced internet, Google announced that website speed would become a factor in their ranking algorithm. This adjustment was made to ensure the web continues providing users with the best possible experience.


The Caffeine index update didn’t introduce any new ranking criteria, but it was part of Google’s plan to make the internet faster, with desktop site speed later becoming a factor in website rankings. Emphasis was placed on website performance.

Caffeine allowed Google to index more content over the web, thereby giving users a better browsing experience.


PANDA (2011)

As part of its effort to provide users with high-quality content, Google launched the Panda update. It penalised websites that used keyword stuffing and content farms, improving search results by removing low-quality pages from top listings.

SEO professionals had to modify their techniques as black hat SEO strategies such as keyword stuffing had ceased to be effective. To stay competitive, they needed to ensure that the content they provided benefited readers and gave accurate answers to searches.


PENGUIN (2012)

Google’s Penguin algorithm update targeted web spam, such as deceptive link-building tactics. Google referred to it as the “webspam algorithm update.”

With this update, the algorithm could tell whether the links pointing to a given page were legitimate or spam.

The Penguin update specifically targeted tricks used in black hat SEO. Building deceptive links stopped working, leading to decreased organic traffic and rankings for sites with many low-quality links.

If you don’t follow Google’s Webmaster Guidelines, your website could be penalised in rankings after Penguin.

VENICE (2012)

The Venice algorithm update boosted search results for products and services based on their proximity to the searcher. As a result, users could expect results from within, or very close to, their own city or town. Queries such as “fitness centre near me” or “fitness centre Dublin” are good examples. After the update, the 10 blue links in search results were more likely to include local options.

This change substantially affected local SEO, especially for small businesses. They could now compete against larger enterprises for high-volume local search terms.

PIRATE (2012)

To protect copyright holders, Google implemented the Pirate algorithm update, which vets sites reported for DMCA violations and prevents them from ranking high in search results.

If Google discovers your website is violating the DMCA, it will be heavily demoted in organic search results.


HUMMINGBIRD (2013)

Google revolutionised its search engine with the Hummingbird algorithm, designed to interpret what searchers wanted based on their queries. This advancement allowed a better understanding of user intent and of which pages were most relevant to a search.

The Hummingbird update made it more straightforward to find the most relevant pages to answer a user’s query. It motivated search professionals to examine the actual intent behind specific keywords and optimise for that.

PIGEON (2014)

The Pigeon Update focused on making local search better. It aimed to bridge the gap between Google’s local and web algorithms to produce improved ranking results that take physical closeness into account.

The Pigeon update revolutionised local search, enabling small businesses to target a local audience with ease. By following standard SEO practices — e.g. earning backlinks and developing domain authority — small businesses could boost their odds of appearing in search results.

The Pigeon update changed how local businesses need to optimise: they should contextualise their content, earn local reviews, and use structured data markup to appear in rich results when local searches are conducted.
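As one illustration of structured data markup, the sketch below generates a minimal schema.org LocalBusiness JSON-LD object in Python. The business name, address, and phone number are made-up placeholders; the output is what would go inside a `<script type="application/ld+json">` tag on the page.

```python
import json

# Hypothetical business details - replace with your own.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Dublin Fitness Centre",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Dublin",
        "addressCountry": "IE",
    },
    "telephone": "+353-1-555-0100",
}

# Serialise to JSON-LD for embedding in the page's HTML.
json_ld = json.dumps(local_business, indent=2)
print(json_ld)
```

Google’s Rich Results Test can then be used to confirm the page is eligible for rich results.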

HTTPS/SSL (2014)

Google launched its “HTTPS everywhere” initiative and incentivised websites to switch from HTTP to HTTPS by installing SSL certificates on their servers. This made the internet more secure, giving users access to trustworthy, safe websites from their search results.

With this update, Google gave priority to websites served over HTTPS by using it as a ranking factor, raising user expectations for trust and security in organic search results.


MOBILEGEDDON (2015)

In April 2015, Google rolled out its “mobile-friendly” update, nicknamed Mobilegeddon. This was the first significant change to advance Google’s “mobile-first” ambitions. To be considered mobile-friendly, a page has to meet particular visual criteria; specifically, its layout must suit a handheld (smartphone) display.

Having a mobile-friendly version of your webpage will help it to appear higher in search engine results. Note: this does not apply to whole websites, just individual pages.


RANKBRAIN (2015)

Google’s RankBrain update uses machine learning to surface the best search results for any query. It considers signals such as the searcher’s location and the words of the query to understand user intent and deliver the most relevant results.

This update has changed the ranking criteria based on the query. Optimising typical elements is insufficient; your optimisation methods should prioritise what best satisfies a user’s needs.

POSSUM (2016)

The Possum update enhanced local search results. Before this change, businesses located just outside a city’s borders struggled to rank for keywords related to that city. The update addressed this problem and also made the searcher’s physical location and the wording of local search terms more significant.

The update increased filtering based on address and location, added more diversity to results, and made it harder for spam to achieve high rankings.


INTRUSIVE INTERSTITIALS UPDATE (2017)

In 2017, Google changed its algorithm to penalise intrusive interstitials – large pop-ups that cover a big part of the page and impede users from accessing the content they need.

Google made the update to reduce spam and improve the search experience. Content with intrusive interstitials may still rank highly, however, if it effectively satisfies the user’s search.

Because of this change, you can be penalised in SEO rankings if you use intrusive pop-ups.


MOBILE PAGE SPEED UPDATE (2018)

In January 2018, Google announced a “speed update” to its algorithm, prioritising websites with faster page loading speeds in mobile search results. This update was part of both the company’s broader effort to speed up the internet and its “mobile-first” objective.

Google also announced that PageSpeed Insights would reflect data from the Chrome User Experience Report to give a more precise representation of a page’s real-world performance.

The Mobile Page Speed Update made website loading speed a factor in mobile search rankings. Faster websites rose in the rankings, while slower sites fell. SEOs should work with software developers to improve mobile performance.
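Page-speed data like this can be fetched from the public PageSpeed Insights v5 API. The sketch below only builds the request URL (no network call is made); `https://example.com/` is a placeholder, and fetching the resulting URL returns JSON with Lighthouse scores and Chrome User Experience Report field data.

```python
from urllib.parse import urlencode

# Public PageSpeed Insights v5 endpoint.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_request(page_url: str, strategy: str = "mobile") -> str:
    """Build a PageSpeed Insights v5 request URL for the given page.

    strategy is "mobile" or "desktop". Fetching the returned URL
    (e.g. with urllib.request.urlopen) yields the JSON report.
    """
    params = urlencode({"url": page_url, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{params}"

# Example: construct a mobile-strategy request for a placeholder page.
request_url = build_psi_request("https://example.com/", strategy="mobile")
print(request_url)
```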

MEDIC (2018)

The Medic update improved Google’s ability to evaluate expertise, authoritativeness, and trustworthiness (E-A-T) in online content.

Many different aspects affect the E-A-T assessment. For instance, short reviews, few backlinks, thin content, infrequent website updates, and incomplete business or author profiles are all factors that might push your page down in results if left unaddressed.


BERT (2019)

The BERT algorithm (Bidirectional Encoder Representations from Transformers) is an advanced natural language processing system built on machine learning. It allows the search engine to comprehend the intent of words in a sentence much as a human would. With BERT in place, Google can better determine which content is most appropriate for a user’s query, appreciating nuances and recognising subtleties between similar words given their context.

How does it work? To illustrate, when you say “he’s the GOAT,” you don’t mean he’s a farm animal. You’re implying that he is “the greatest of all time.” With Google’s natural language processing technology, Google can distinguish between these two meanings.

Using BERT, SEO copywriting focuses less on exact keywords. Instead, it places more importance on natural language related to those keywords. When writing content about a topic, making sure it is accurate with sources cited enables Google’s algorithm to identify the quality of your work.


This is a compilation of the significant Google algorithm changes from the past two years and their effects on search engine results.


FEATURED SNIPPETS (2020)

Google’s Featured Snippets update removes duplicate URLs from search results pages that contain a featured snippet. If your website holds the featured snippet for a query, its regular organic listing no longer also appears on the first page of results for that query.

There is a genuine concern that sites which previously held both the featured snippet and a first-page organic listing will experience significant drops in clicks for those queries.


PASSAGE RANKING (2021)

Google’s Passage Ranking update enabled Google to rank individual passages from a web page in its search results. The motivation behind this update was that queries with very specific intent were hard to satisfy because the answers were often buried deep within a page’s content – like a needle in a haystack. The solution was for Google to consider the context and meaning of specific passages when deciding which search results to prioritise.

On-page SEO gains more value as Google can now select and evaluate parts of your content independently of the rest. Because of this, header tags positioned further down the page are now essential, as are long-tail and specific keywords. Writing long-form content is also rewarded, since Google can analyse it passage by passage.


PRODUCT REVIEWS (2021)

The Product Reviews update rewards content featuring comprehensive research and analysis from experienced reviewers. Before this, review content with shallow information and boilerplate formatting could rank highly in Google searches.

Google will not punish websites with substandard reviews per se. Instead, it will rank more thorough review content above them. In its documentation on this update, Google provides a list of questions to ask when writing reviews – for instance, whether the review displays expert knowledge, shows the product in use, compares it with other products, and covers the major decision-making or shopping factors.


Google has recently included three new Core Web Vitals metrics as part of its Page Experience ranking factor: Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift.


This new algorithm lowers the rankings of websites that are slow to load their most important, first-visible content, that respond slowly when users press buttons and links, or that appear unstable on screen – such as when an ad shifts content while loading.


Google is launching a new algorithm update in June 2021 to optimise the user loading experience, which it dubs “page experience.” This Core Update will introduce new ranking signals called Core Web Vitals, made up of LCP (largest contentful paint), FID (first input delay) and CLS (cumulative layout shift).

As part of Google’s goal to make the internet faster, websites that do well on these three metrics get a boost in their rankings. However, page experience is only one factor among many.
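To make the targets concrete, Google publishes “good” / “needs improvement” / “poor” thresholds for each metric: 2.5 s and 4 s for LCP, 100 ms and 300 ms for FID, and 0.1 and 0.25 for CLS. A minimal sketch of a classifier using those published thresholds:

```python
# Published Core Web Vitals thresholds: (good upper bound, poor lower bound).
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "FID": (100, 300),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def classify(metric: str, value: float) -> str:
    """Return Google's rating band for a Core Web Vitals measurement."""
    good_max, poor_min = THRESHOLDS[metric]
    if value <= good_max:
        return "good"
    if value <= poor_min:
        return "needs improvement"
    return "poor"

print(classify("LCP", 2.1))   # a 2.1 s LCP is "good"
print(classify("CLS", 0.3))   # a 0.3 CLS is "poor"
```

Field data for these metrics comes from the Chrome User Experience Report and is surfaced in tools like PageSpeed Insights.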


You can take several steps to enhance your core web vitals. Each metric requires different approaches and a bit of troubleshooting.

We suggest using Huckabuy Page Speed software to automate the process of optimising your site’s speed. A Content Delivery Network (CDN) can also be beneficial for website rendering. To learn more about improving loading metrics, follow the links above.


To stay informed about Google’s algorithm updates, read their Chromium and Developers blog, follow the @googlesearchc Twitter account, or check out Search Engine Journal.



At a high level, Google’s search algorithms are composed of multiple algorithms. These search algorithms factor in query meaning, webpage relevance, content quality, usability, and context and settings to deliver optimal results. The proportion of weight given to each category is contingent on the query type.


The “Page Experience” update, the newest core algorithm, is scheduled to launch in June 2021. This update will negatively impact websites with poor user experiences based on loading speed, responsiveness, visual stability, security and safety measures, mobile friendliness, and advertising policies.


Google makes thousands of modifications to its algorithm over a year. Most of these changes are minor and go undetected. Google occasionally reveals significant algorithm updates and lets webmasters know what changes to anticipate.


Google is striving to make the world’s information available and valuable. They are working hard to refine their algorithms to give users the best search results possible. Every update to the algorithm is a move closer to their mission.


To stay ahead of Google algorithm updates, create good content on a well-structured website with top-notch external and internal links. As an additional step, monitor Google’s primary search news outlets such as the Chromium blog, the Google Developers blog, and the Twitter account @googlesearchc.