Understanding the Google Algorithm

Since posting this article, Google has moved further toward becoming the ‘answer machine’. Read about current developments around ‘(not provided)’ keyword data and the Hummingbird algorithm.

One of the key reasons Google now dominates the search engine market is the unique functionality of the original version of its algorithm. While many popular search engines of the time used very keyword-focused algorithms, Google’s patented PageRank algorithm looked at human-generated links. The logic was that any site being linked to by other strong and important sites would be worth visiting; in essence, it was an attempt to make the results more naturally driven.
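The link-counting idea behind PageRank can be sketched as a simple iterative calculation. This is a toy illustration over a hypothetical four-page graph, not Google’s actual implementation, which runs at vastly larger scale with many refinements: each page shares its score among the pages it links to, so a page that strong pages link to becomes strong itself.

```python
DAMPING = 0.85  # probability a surfer follows a link rather than jumping to a random page

def pagerank(links, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # every page keeps a small base score...
        new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
        # ...and passes the rest of its score along its outgoing links
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = DAMPING * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical four-page web: 'hub' is linked to by everyone,
# so it ends up with the highest score.
graph = {
    "a": ["hub"],
    "b": ["hub"],
    "c": ["hub", "a"],
    "hub": ["a"],
}
scores = pagerank(graph)
```

Running this, the heavily linked-to page (‘hub’) outranks the others even though nothing about its own content was examined, which is exactly the ‘naturally driven’ signal described above.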

Since then the Google algorithm has changed significantly. However, it is important for any business’s online strategy not only to understand how the Google algorithm works but also why Google has written it to work that way.

Boston to Penguin – the Google algorithm journey so far

The first named update to its algorithm that Google released was in February 2003. Boston, named after the city in which it was announced, was, like many of the early updates, a combination of algorithm changes and major index refreshes – what was called the Google Dance – and is notable mainly because it was the first to be named.

From there Google followed the hurricane naming convention for its algorithm updates – alphabetical, alternating male and female names. Then, after some time, came a much more significant update named Florida. This Google algorithm update created a massive stir in the SEO and online business world: many companies completely lost their rankings, and panic ensued. Targeting low-value SEO tactics of the time, such as keyword stuffing and other deceptive on-page tactics, Florida reconfigured the entire SEO game, forcing the industry to develop much more intelligent and refined strategies.


Google continued to update its algorithm, making changes that forced SEO strategy to keep reshaping and evolving. Like Florida, however, these updates focussed mostly on bad and spammy SEO practices, such as low-quality links and duplicate content. Then, in February 2011, Google released the algorithm update it named Panda. What was significant about Panda was that it didn’t only target spammy sites but also ‘thin’ sites that weren’t very useful to a user, as well as low-quality and scraped content. Google claimed that the Panda update affected 12% of all search results.

Such a major shake-up – based not just on eliminating spam and black-hat SEO practices but on the perceived usefulness of a site to a user – tells any business a lot about how Google wants its algorithm to function. The sort of SEO that Google rewards comes from sites that show the algorithm they are strong and important. According to Rand Fishkin, among the constantly changing ranking factors in the Google algorithm, the trust or authority of the host domain is the fastest-growing factor, and he considered it more important than anchor text, keywords and links. If Google wants your site to be strong and useful to the user, one of the biggest favours you can do yourself when building it is to make it strong and useful to the user.

Where does Google’s algorithm go from here?

So what direction is Google’s algorithm likely to take from here? An interview in the Wall Street Journal with a top Google search executive, Amit Singhal, signals that Google intends to incorporate semantic technology so that searches can interpret the actual meanings of words. Rand Fishkin recently did a video blog about Google’s current use of co-citation, where it ranks sites for terms not based on any on-page content but on a wider association of terms across the net. All of this points towards a smarter Google algorithm that is trying to understand what the user wants from their search terms.

And what does this mean for businesses trying to establish an online presence? When it comes to the Google algorithm it pays to be proactive, and everything currently points towards making it easy for Google’s algorithm to work out what your business does – not just through the keywords you target and the anchor text you use, but through structured markup that makes it easier for Google to see what your content is really about, and through content in general that uses the terms and language Google is likely to associate with your industry.
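As one illustration of the structured markup mentioned above, a page can embed a schema.org snippet describing the business. This is a minimal sketch using the JSON-LD format and the schema.org `LocalBusiness` type; the name and address values are placeholders to be replaced with your own details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business Pty Ltd",
  "url": "https://www.example.com.au",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Melbourne",
    "addressRegion": "VIC",
    "postalCode": "3000",
    "addressCountry": "AU"
  }
}
</script>
```

Markup like this doesn’t change what a visitor sees, but it spells out to the algorithm, in a vocabulary it already understands, exactly what the page is about.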

As the Google algorithm gets more nuanced, so must your business’s online SEO strategy. The days of easy and quick fixes have long been a thing of the past, and while the way forward is somewhat unknown, it will likely be anchored by strong site structure and quality content.

Contact roi.com.au today to learn more about how our quality SEO services, including optimised copywriting, can help your business grow.

