Important! – Respected Google Webmaster Trends Analyst John Mueller recently provided clarification and valuable insight into how websites can recover from Google algorithmic penalties.
The full post by John Mueller can be viewed here or at the end of this post, but one of the most important statements he made was the following, where he validated the frustration of businesses that have been struggling to recover from Penguin:
“… we do realize that it would be great if we could speed the refresh-cycle of some of these algorithms up a bit, and I know the team is working on that.
I know it can be frustrating to not see changes after spending a lot of time to improve things. In the meantime, I’d really recommend – as above – not focusing on any specific aspect of an algorithm, and instead making sure that your site is (or becomes) the absolute best of its kind by far.”
If you have experienced a Google ranking drop over the past three years, you have probably heard that it may be due to:
A) A manual link penalty. We have previously discussed manual link penalties.
B) An algorithmic penalty, such as Google Penguin.
C) Every other SEO theory out in the marketplace.
For roi.com.au, our experience has been that manual link penalties are relatively straightforward to fix; in fact, we have had a 100% success rate with recovery from this type of penalty.
Algorithmic penalties have been much harder and the recovery results have been varied. This is why we found John Mueller’s insight very interesting, as it validated our experiences and the ranking recovery results we have achieved for our clients.
One of the views in the SEO industry, as reported by Barry Schwartz yesterday evening, was:
A) If you have a Google algorithmic penalty issue you need to wait for the next algorithmic refresh for your rankings to return
B) It has been 339 days since the last Penguin refresh, so there are a lot of Australian websites in SEO pain right now
“Google Confirms That Recovering From Penguin Requires An Algorithm Refresh” – Original Source SEO Round Table
I must say I read John Mueller’s post a little differently…
A) My interpretation of this post is that in theory you have to wait, but in practice, if you invest in cleaning up your link profile and focus on providing a great customer and user experience, your site’s rankings can improve steadily without having to wait for the next algorithmic update
B) John also specifically addressed web spam issues, of which unnatural links are among the most common, saying that if you invest in cleaning up your link profile you will be rewarded in both the short term and the long term
C) Continually invest in making your website better and you will be rewarded with more leads and better rankings
Here’s what you need to do:
Google has said that it uses over 200 ranking factors, so you now have 200 reasons to act on the steps below, which I suggest every Australian website and business owner follow:
A) Invest in cleaning up your link profile – give your website’s backyard a makeover, as it most probably needs one if you have been online for more than three years
B) Rewrite content for “customer intent” not keywords
C) Make your content engaging by using images, videos, natural language, and proactive questions and answers – this will result in higher time on site and better rankings
D) Build site authority by spending time writing thought-provoking newsletters that can be published on a blog or shared through social media
E) If you are a service business and have nothing to blog about, invest in customer reviews and positive after service engagement
F) Make your website mobile ready – mobile accounts for a large share of search traffic, and Google favours mobile-ready sites in its rankings
Read the full post of John Mueller below:
Let me try a longer answer :-)… In theory: If a site is affected by any specific algorithm or its data, and it fixes the issue that led to that situation, then the algorithm and/or its data must be refreshed in order to see those changes.
Sometimes those changes aren’t immediately visible even after a refresh, that’s normal too. In practice, a site is never in a void alone with just a single algorithm.
We use over 200 factors in crawling, indexing, and ranking.
While there are some cases where a site is strongly affected by a single algorithm, that doesn’t mean that it won’t see any changes until that algorithm or its data is refreshed.
For example, if a site is strongly affected by a web-spam algorithm, and you resolve all of those web-spam issues and work to make your site fantastic, you’re likely to see changes in search even before that algorithm or its data is refreshed.
Some of those effects might be directly related to the changes you made (other algorithms finding that your site is really much better), some of them might be more indirect (users loving your updated site and recommending it to others).
So yes, in a theoretical void of just your site and a single algorithm (and of course such a void doesn’t really exist!), you’d need to wait for the algorithm and/or its data to refresh to see any changes based on the new situation.
In practice, however, things are much more involved, and improvements that you make (especially significant ones) are likely to have visible effects even outside of that single algorithm.
One part that helps to keep in mind here is that you shouldn’t be focusing on individual factors of individual algorithms, it makes much more sense to focus on your site overall — cleaning up individual issues, but not assuming that these are the only aspects worth working on.
All that said, we do realize that it would be great if we could speed the refresh-cycle of some of these algorithms up a bit, and I know the team is working on that.
I know it can be frustrating to not see changes after spending a lot of time to improve things. In the meantime, I’d really recommend – as above – not focusing on any specific aspect of an algorithm, and instead making sure that your site is (or becomes) the absolute best of its kind by far.