Anyone who is anyone in the SEO world has heard about Google’s algorithmic updates, which have set the quality standard for websites that wish to rank in its top results. If you are new to the world of search engine optimization, there is no need to worry. What you need to know is that Google uses over 200 parameters to rank websites, and in this Google Update guide we will cover the most important ones.
The term “algorithm” refers to a logic-based, step-by-step procedure that solves a particular problem. Google’s search algorithms were created to better sort relevant web pages and improve the user experience. Four major algorithm updates have been released to date: Panda, Penguin, Hummingbird and Pigeon (in chronological order).
Of course, there have been countless other updates, each with the power to impact your site’s ranking, which you can check out in our infographic below (this succinct timeline presents the most important algorithm updates and their refreshes).
Why is a Google Update Important?
The year 2015 has been rather quiet in terms of Google algorithm updates. Except for the Unnamed, Mobilegeddon and Quality updates, there haven’t been a lot of exciting things happening. This is rather strange considering that over 30 updates rolled out in 2013 and 2014. Back when Panda was released, the SEO world was in a major upheaval: people didn’t understand what was happening with their sites’ traffic and had no idea how to fix it.
Even though major algorithm refreshes caused waves of panic among SEOs, they were also eagerly awaited, because without a refresh it is impossible for a site to truly recover. In a recent article, SearchEngineLand mentioned that Google will probably roll out new updates for the Panda and Penguin algorithms.
On the heels of this news, we decided to create an in-depth guide that will help you solve any issues that may lead to a Panda or Penguin penalty, as well as help you recover from them. So grab a coffee and sit tight; it’s going to be a long ride.
Nobody can accurately predict what Google is planning to do next, nor guess the secret formula that will bring a website into the top three positions of the SERPs. However, by analysing behavioral patterns and quality filters from Google, we have managed to get a better idea of what we should be doing, and that is to provide quality. The main reason behind Google’s algorithmic updates is its commitment to providing relevant results for users.
Some Brief History:
The first Google algorithmic update was issued in February 2003. Shortly after, in February 2004, the Brandy update, which brought the term Latent Semantic Indexing into the SEO vocabulary, was released. Since then, more and more updates focused on keyword relevance, link profiles and accuracy have been issued. The bottom line is that algorithm changes have transformed the way we experience the web, with hugely positive effects for the user. A Google update may be considered a disaster by certain SEOs, but these updates help the search engine evolve and deliver better results.
Before we go into the specifics, let’s take a look at a few practices that will surely increase your rankings in SERPs:
- In-depth content: the phrase ‘content is king’ was coined more than 5 years ago and it is still valid today. As a matter of fact, content has become more important than ever, especially since Google announced that it will give higher consideration to in-depth (2,000+ word) articles that actually help the user. Before Panda was released, many websites managed to grab the top positions of SERPs with thin articles. Not anymore.
- Anchor Text Distribution: a few years ago, Google didn’t really care about anchor text. However, the search engine has evolved and now tries to rank only sites with natural link profiles. This means that brand-name anchors shouldn’t make up more than 17% of your anchor text (according to Google). A diverse anchor text distribution suggests that links were created naturally.
- Freshness: you might not have known this, but Google now gives websites a score based on the freshness of their articles. The longer you wait between posts, the lower that score will be. In addition, Google tends to rank fresh content better than stale content (revisions made to older articles will also increase your rankings).
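As a minimal sketch of the anchor-text point above, here is how you might check a backlink export for an anchor that dominates the profile. The anchor list, and the idea of applying the 17% figure as a per-anchor ceiling, are purely illustrative assumptions, not how Google itself evaluates links:

```python
from collections import Counter

# Hypothetical anchor texts exported from a backlink tool (illustrative data)
anchors = [
    "brandname", "brandname", "brandname", "click here",
    "cheap shoes", "https://example.com", "best shoe guide",
    "read more", "example.com", "nice post",
]

counts = Counter(anchors)
total = len(anchors)
distribution = {a: round(100 * c / total, 1) for a, c in counts.items()}

# Flag any single anchor exceeding the 17% threshold mentioned above
overused = [a for a, pct in distribution.items() if pct > 17]
print("Distribution:", distribution)
print("Overused anchors:", overused)
```

In this toy profile, “brandname” accounts for 30% of the anchors, so it is the only one flagged; a real audit would run this over the full export from your link tool.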
What is the Google Panda Update?
The first Panda Google update was released in February 2011. Its primary goal was to prevent sites with low-quality content from ranking in top SERPs. Since its initial release, the Panda algorithm update has received several updates (26 to be exact) and it will probably receive more in the future.
In other words, those who have escaped its claws before will probably not be as lucky next time. The Google Panda update revolutionized the world of SEO and gave keyword research, targeting and content a whole new meaning.
- Panda 1.0: was specifically aimed towards content farms (sites that simply rehash existing content without adding value to the web). This Google update affected 12% of queries in the United States.
- Panda 2.0: Shortly after the first Panda release, the 2.0 version rolled out and affected 2% of queries in the U.S. The major difference was that this Google update was also directed to international queries.
- Panda 2.4: a few more minor updates were made, but Panda 2.4 was the one to make a huge splash, affecting approximately 6-9% of queries. According to Michael Whitaker, the update’s goal was to fix site conversion rates and engagement.
- Panda 4.0: this was another major Google update to the algorithm. According to experts, it announced the arrival of a “next generation of Pandas”. It was mainly directed towards the sites that were dominating Google’s top 10 positions.
Examples of Traffic Drops after the Panda Google Update
How to Recover from a Google Panda Update
… and how to protect your site from a Panda Penalty
Despite the fact that Panda significantly improved searches, it caused serious headaches for confused webmasters. With the Panda 4.1 Google update, which was released in September 2014, many sites lost their rankings. What many webmasters didn’t know about Panda is that it is a DOMAIN-LEVEL PENALTY. In other words, if your content is penalized, your entire domain will suffer from that penalty.
As you can probably imagine, the best way to get rid of a Panda penalty is by removing or fixing low-quality content.
Not sure what caused your drop in traffic? You can analyze your traffic drops and use our infographic below to see whether they occurred at the same time as a Panda update. You can analyze traffic drops with Google Analytics, SEMrush or other SEO tools.
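That check can be sketched in a few lines: compare week-over-week organic sessions against known update dates and flag any week where a sharp drop overlaps one. The session numbers and the 20% threshold below are purely illustrative; in practice you would feed in your own Analytics export:

```python
from datetime import date

# Hypothetical weekly organic sessions (e.g. from a Google Analytics export)
weekly_sessions = {
    date(2014, 9, 8): 5200,
    date(2014, 9, 15): 5100,
    date(2014, 9, 22): 5050,
    date(2014, 9, 29): 3100,  # sharp week-over-week drop
    date(2014, 10, 6): 3000,
}
update_dates = {"Panda 4.1": date(2014, 9, 25)}  # rollout began late September 2014

weeks = sorted(weekly_sessions)
flagged = []
for prev, cur in zip(weeks, weeks[1:]):
    drop = 1 - weekly_sessions[cur] / weekly_sessions[prev]
    for name, day in update_dates.items():
        # A >20% drop in the same week as a known update is worth investigating
        if drop > 0.20 and prev <= day <= cur:
            flagged.append((cur, name, round(drop, 2)))

print(flagged)
```

A timing overlap is only circumstantial evidence, of course, but it tells you which update’s checklist to work through first.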
Sites Vulnerable to Google Panda Penalties:
- websites with low-quality content or content farms.
- websites with bad SEO structure, duplicate content, ridiculous amounts of advertisements or irrelevant indexed pages.
- websites with bad grammar, high loading time and overly optimized content.
- websites with too many broken links and low CTR from search engines (as a result of poorly optimized meta descriptions and titles).
The good news is that the broken links problem can be resolved easily with a few redirects. But what about the other issues?
Google Panda Penalty Recovery Tips:
With these things out of the way, let’s focus on actionable strategies that will help you lift the Panda penalty, or bulletproof your website against imminent penalties. One thing you should be aware of from the start is that, even if you manage to solve all the problems, your site might not get 100% of its traffic back.
#1 Create Quality Content
This is probably the most important thing that you need to do in order to please the all-mighty Panda. By quality content I am referring to compelling, useful and fresh content complete with relevant keywords and SEO tweaks.
Choose a topic that you want to write about, but before you start writing, try to find the “content gap” that we were talking about in a previous article. Your goal isn’t to simply write on a subject that was already covered. You have to be original by adding your personal opinion, extra tips, alternative content etc.
Make sure you include your focus keyword in the meta description, title, article body (several times), URL and image alt attributes (you can check this with a WordPress plugin such as Yoast SEO or All in One SEO Pack). A good post without a focus keyword may receive likes from followers, but new readers will never be able to find it unless you make it possible for them to do so.
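A plugin automates this, but the underlying check is simple enough to sketch. The page fields and focus keyword below are hypothetical, and this is only a rough stand-in for what an SEO plugin actually verifies:

```python
# Hypothetical page fields; an SEO plugin runs similar checks automatically
page = {
    "title": "How to Recover from a Panda Penalty",
    "meta_description": "A practical guide to diagnosing a Panda penalty and recovering from it.",
    "url": "/panda-penalty-recovery/",
    "body": "The Panda penalty hits thin content at the domain level...",
    "image_alts": ["panda penalty traffic drop chart"],
}
keyword = "panda penalty"

def contains(text: str, kw: str) -> bool:
    # Normalize hyphens so URL slugs like /panda-penalty/ still match
    return kw.lower() in text.lower().replace("-", " ")

checks = {
    "title": contains(page["title"], keyword),
    "meta_description": contains(page["meta_description"], keyword),
    "url": contains(page["url"], keyword),
    "body": contains(page["body"], keyword),
    "image_alts": any(contains(alt, keyword) for alt in page["image_alts"]),
}
missing = [field for field, ok in checks.items() if not ok]
print("Keyword missing in:", missing if missing else "none – all fields covered")
```

Any field that turns up in `missing` is a quick on-page fix before you worry about anything more exotic.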
The best thing about in-depth, high-quality articles is that they can rank for multiple relevant keywords at the same time without being considered unnatural by the search engine. Also, LSI keywords can easily be included in longer posts. If you don’t believe that long articles rank higher on search engines, simply conduct a search for a relevant long-tail keyword and count the number of words in each top 10 post.
Another thing that you should take into account is article format. If it isn’t user-friendly and easy to read it won’t rank well. So what can you do about this?
Well, you should start by throwing in some bolds and italics, keywords in your heading tags, reference links etc. If you want to take it one step further you could also add some extra images, or even create your own infographics. Podcasts, videos, content visualizations etc. are also considered ‘content’. The more diverse your page is, the likelier it is that it will rank well.
We could talk about content optimization and creation until tomorrow and not cover everything. Luckily for us, there are plenty of bloggers who already shared their tips on content creation. You see, Panda is also about not trying to reinvent the wheel. If the information is already out there, why not use it?
Here are some incredibly useful sources for in-depth content creation:
#2 Remove or Fix Low Quality Content
Arguably the hardest thing to fix is bad content that has already been published. With new content it’s all about adding more quality to a future post, but what do you do when you have 500+ bad articles that have already been indexed? You can’t simply forget about them, because the longer you wait, the harder it will be for your site to recover, and the more traffic you will lose. There are two ways to handle low-quality content: you stop indexing it or you improve it. Improving content will probably take more time than writing a new post, so unless the article is really worth it, you might as well try to de-index it.
What is considered low-quality content?
Of course, there is no definitive guide that can tell you what bad content is, but there are a few things you should consider:
- Is your content long enough to be relevant?
- Is it targeting relevant keywords?
- Is it useful for your audience?
- Does it come from a trustworthy source?
- Is it well-researched?
How You Can Remove Low-Quality Content
There are several solutions that can be employed to remove bad content from a website. Here is what I usually do:
- From my Google Analytics tab, I check the traffic and performance of each URL. Those that receive fewer than 20 visits in 3 months are definitely not converting or helping visitors. I check the article to see why it isn’t ranking. If it’s because keywords are missing, I simply update it and add relevant keywords. Alternatively, if it is not worth the effort, I de-index it from Google.
- A high bounce rate is extremely bad for a website. According to Google, an acceptable bounce rate hovers at around 55-65%. If I have posts with a bounce rate above 85%, I try to fix them so that readers spend more time on them.
- Although in-depth content is preferred, it is impossible to write 2,000+ word articles all the time. However, that doesn’t mean your posts can’t be at least 1,000 words long. With the Screaming Frog crawler I identify all the posts of 500 words or fewer and try to improve them.
- Another thing that might be negatively influencing your content is poorly written or nonexistent meta descriptions and tags. Upon identifying these issues, I fill in all the missing meta descriptions according to SEO best practices.
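The four checks above can be rolled into one pass over a merged export. Everything here is a sketch under assumptions: the CSV columns, thresholds and verdict logic are illustrative, and a real export from Analytics or a crawler will use different column names:

```python
import csv
import io

# Hypothetical merged export (Analytics + crawler data); real columns will differ
raw = """url,visits_90d,bounce_rate,word_count,meta_description
/good-guide/,900,0.55,2400,A complete guide to recovering from Panda
/thin-post/,12,0.91,350,
/no-meta/,300,0.70,1500,
/ok-post/,140,0.62,1100,A decent description
"""

to_deindex, to_fix = [], []
for row in csv.DictReader(io.StringIO(raw)):
    thin = int(row["word_count"]) <= 500
    low_traffic = int(row["visits_90d"]) < 20       # under 20 visits in 3 months
    high_bounce = float(row["bounce_rate"]) > 0.85  # above the 85% mark
    no_meta = not row["meta_description"].strip()
    if low_traffic and thin:
        to_deindex.append(row["url"])  # not worth the rewrite effort
    elif thin or high_bounce or no_meta:
        to_fix.append(row["url"])      # improve content or metadata instead

print("De-index:", to_deindex)
print("Fix:", to_fix)
```

On this toy data, the thin post that also gets no traffic lands in the de-index pile, while the page that merely lacks a meta description is queued for a fix.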
#3 Use an SEO-friendly WordPress Theme
Most WordPress themes are beautiful, but not all of them are SEO-friendly. There are a few things you should check before installing a new theme. For example, do you know what a Genesis or Thesis theme is? Premium themes may be a tad expensive, but they are definitely worth the investment. If you are having trouble choosing an appropriate one, stay tuned for our next post. If you want to comply with the Panda Google update, you will want to invest in an SEO-friendly theme.
#4 Remove Duplicate Content
Duplicate content is a big no-no. You don’t have to be an expert to understand how severely duplicate content can affect your website’s ranking. Also, duplicate content doesn’t only refer to articles: you must ensure that your site’s structure and on-site SEO aren’t considered duplicate either. And if you find other websites that have duplicated your content, you should report them to Google.
You can check for duplicate content with Grammarly or Copyscape. There are also free online plagiarism checkers.
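Dedicated tools do this at scale, but the core idea behind a duplicate-content check can be sketched with word shingles and Jaccard similarity. This is a common textbook simplification, not how any of the tools above actually work, and the sample texts are made up:

```python
def shingles(text: str, k: int = 4) -> set:
    """Return the set of k-word sequences in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa or sb else 0.0

original = "google panda targets thin content across the entire domain"
copied = "google panda targets thin content but penguin targets links"
print(round(similarity(original, copied), 2))  # a high score suggests duplication
```

Two identical texts score 1.0; partially copied text scores somewhere in between, and where you set the alarm threshold is a judgment call.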
#5 External Links & Social Media Promotion
Social signals and high-quality backlinks prove that your article is truly valuable. You should spend as much time promoting your content as you spend writing it. There are many social networks (Facebook, Twitter, Pinterest, Google+, to name a few) where you can engage with readers. The more social signals you have, the better your content will rank.
As far as back-linking goes, you should ensure that your link profile is natural. You can find more information about link-building in our previous tutorial.
There are several other things that you can do to protect your site from a Google Penalty. Here’s a short checklist of strategies worth considering. Don’t forget, the best way to protect your site from major algorithm Google updates is by providing quality.
Google has a very useful tool that will help you diagnose the problems on your site. It also offers a lot of information regarding Panda and Penguin penalties. This is a must-read!
What is the Google Penguin Update?
Google Penguin is responsible for the second most common type of penalty. This algorithm was created to combat websites with aggressive link-building strategies. The first version was released in April 2012 and affected many queries in the United States. What Penguin does is analyze unnatural link patterns and aggressive, short-term back-link building strategies. These ‘artificial’ link profiles are severely punished by Google (usually resulting in a complete traffic loss).
Matt Cutts said that link profiles shouldn’t just LOOK natural, they should BE natural. Of course, that’s nearly impossible. You know it, we know it, and so does Matt Cutts. Nobody will ever link to your site if it isn’t already authoritative, and you can’t have an authoritative site without links. So how can you escape this vicious cycle and comply with the Penguin Google update rules?
Instead of trying to buy links, make exchanges or get involved in mass syndication, you can simply ask. If you create content that is truly compelling and interesting, you can ask other webmasters or site owners to showcase it on their pages. But more on this later.
How can you Identify a Penguin Penalty?
Unlike the Panda algorithm, which punishes the entire website in case of unnatural behavior, the Penguin update only targets the suspicious keywords or groups of keywords (a page-level penalty) that you are targeting with your links. Here’s what you should look for:
- Specific keywords that recorded a steep drop in rankings (you can check this in SEMrush, from the New/Lost Keywords tab).
- Pages with specific keywords that have been de-indexed from the search engine.
- A message in Google Webmaster Tools that points out “unnatural linking” on your website.
Google Penguin Penalty Recovery Tips:
The good news is that it is easier to recover from a Penguin penalty than from a Panda penalty. Often, the best solution is to completely de-index the offending pages on your site. Keep in mind that you will not see immediate results with Penguin recovery. Also, it will not help you at all to submit a reconsideration request in Google Webmaster Tools right away. You have to wait until the algorithm sees that all the bad links and pages have been removed.
Here’s what you need to do to recover from a Penguin update:
- Use a link tool (like Ahrefs, Open Site Explorer, Google Webmaster tools etc.) to create a database with your links.
- Organize your links according to different categories (B for blogs, C for comments etc.)
- Sadly, the next step is to manually check if they are good or bad for your site. If they are bad you have to take the contact details from the page and ask the webmaster to remove the link.
- You have to be polite, yet persistent in your endeavor. Some webmasters might not answer at all, others might choose to ignore you. No matter the case, you need proof that you actually tried to remove these links.
- If they refuse to grant your request you can start using the Google Disavow Tool to manually disavow your bad links. Once you clean up all these links you can submit a reconsideration request.
- Here comes the hard part. Your reconsideration request must contain thorough documentation proving that you actually went to great lengths to remove negative links.
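The last two steps of the list above can be sketched as follows. The audited links, category codes and verdicts are hypothetical, but the output follows Google’s documented disavow file format: one URL or `domain:` entry per line, with `#` for comment lines:

```python
from urllib.parse import urlparse

# Hypothetical audited links: (category, URL, verdict after manual review)
links = [
    ("B", "https://goodblog.example/review", "keep"),
    ("C", "http://spamfarm.example/page1", "remove"),
    ("C", "http://spamfarm.example/page2", "remove"),
    ("D", "http://linkdir.example/listing", "remove"),
]

bad_urls = [url for _, url, verdict in links if verdict == "remove"]
bad_domains = sorted({urlparse(u).netloc for u in bad_urls})

# Disavow whole domains when every link from them is bad
lines = ["# Links we asked webmasters to remove, without success"]
lines += [f"domain:{d}" for d in bad_domains]
disavow_txt = "\n".join(lines)
print(disavow_txt)
```

The resulting text is what you would upload to the Disavow Tool; keep the raw audit spreadsheet too, since your reconsideration request needs it as proof of the removal attempts.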
What is the Google Hummingbird Update?
In my opinion, Hummingbird was an incredible algorithm update which significantly improved the way we experience the web. Before it, people could only find results for pages that contained the exact same keyword in their content. With Hummingbird that changed.
This algorithm is capable of analyzing each word on a page, as well as its meaning. For example, if I type in “Where can I find computer fans?”, the algorithm will surface pages that contain relevant information on the subject. Hummingbird also introduced the concept of conversational search.
“The goal is that pages matching the meaning do better, rather than pages matching just a few words.” – SearchEngineLand
What is the Pigeon Google Update?
Another Google update, released on July 24, 2014, is Pigeon. This brand-new algorithm was designed to provide users with relevant local search results. In other words, the Google engine is now capable of pulling up results based on your location and area-specific keywords. So if you’re looking for “cheap shoes in Oklahoma”, it will pull up store websites for Oklahoma companies.
Considering that more and more people are using Google from their mobile phones, and searching for area-specific information, the Pigeon Google update is extremely useful. This also opened up plenty of opportunities for local businesses that couldn’t even dream of ranking over big brands that were completely dominating SERPs.