Google Penguin Update Explained

At EverSpark, we believe in helping businesses understand SEO. That’s why we created this overview for one of the most talked-about SEO developments of all time—Google Penguin.

Penguin is the code name for an update to Google’s algorithm that was released in 2012. It focused on the number and quality of links leading to websites, and was the first major crackdown on spammy links on the Internet.

Whether you want to handle SEO yourself or just understand what your SEO team is doing, here is your complete guide to Google’s algorithm, the Penguin update, and our prediction for what Google is planning next.

The Evolution of Google’s Algorithm 

Every search engine needs a method for determining which results to give for a search term and in what order. Because of the vast amount of data involved, this system is automated with a complex algorithm weighing the relevance of each web page. But the key variables in this algorithm have changed over time.

In the early days of Google, as far back as 1998, the algorithm focused mainly on keywords. For early SEO companies, that meant “keyword stuffing” was a simple, effective tactic. Keyword stuffing simply involved using variations of your keyword on the page as often as possible.

That changed in 2003. Keyword gluttony had led to clunky, unnatural copy, which Google didn’t want to reward. Google engineers changed the algorithm’s focus to links: the more people who linked to a site, the more valuable it apparently was, so links became the new gold standard.

Unfortunately, this change led to questionable SEO practices of its own, namely “link spamming.” Link spamming meant hiring people to place as many links to your site as possible, on as many sites as possible. The easiest way to do this was simply to leave comments on as many blogs as you could, with a keyword-heavy link in each comment. It didn’t even matter if the blogs were relevant to your niche.

Clearly, that couldn’t last. Eventually the rise of link spamming led to Google’s most revolutionary update of all: Penguin. Penguin was the first update to focus on link quality. Under the new algorithm, Google evaluated all of the links leading to a site—and compared the anchor text on each one. If the link profile didn’t look natural, the site was penalized or filtered down in the search results.

Problem solved, right?

Negative SEO

Penguin succeeded at penalizing sites using spammy SEO. But it had a crucial weakness, one which saboteurs quickly learned to exploit: if Google was going to hand out penalties for spammy links, why not just create spammy links to your competitor?

Thus began the era of “negative SEO,” or sabotaging rival businesses’ search rankings. Negative SEO had a six-month run, between April 2012 when Penguin was released and October 2012 when Google came up with a solution, known as the Google Disavow Tool.

The Disavow Tool lets you alert Google to links to your site that you don’t want to be associated with. In theory, that lets you halt negative SEO and even repair the damage from earlier, cruder SEO efforts. But it takes several steps of legwork:

  • First you have to find all the links that point to your site, using tools like Google Webmaster Tools, AHREFS, Majestic SEO and MOZ Open Site Explorer;
  • You then have to decide which links are good and which ones are bad;
  • You have to at least attempt to contact the site owners and ask them to take down the links voluntarily (and document the whole process);
  • Only then can you submit a disavow request for the links, and you usually have to submit it several times to get a penalty removed.
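The disavow request itself is submitted as a plain text file uploaded through the Disavow Tool. A minimal example looks like this (the domains and URLs here are hypothetical, purely for illustration):

```
# Contacted the owner of spam-directory.example on 3/1/2014
# asking for link removal; no response received.
domain:spam-directory.example

# Disavow a single page rather than an entire domain:
http://blog-comments.example/post-123
```

Lines beginning with # are comments (useful for documenting your outreach), domain: entries disavow every link from that domain, and bare URLs disavow individual pages.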

Disavow and Big Data

It’s not an easy process, but it’s a worthwhile one to eliminate penalties and get on Google’s good side. But you’re not the only site doing this—tens of thousands of businesses have submitted detailed disavow requests for millions of links.

That’s a lot of data. And data is something Google does well.

At EverSpark, we don’t believe that Google will sit on that data forever without using it. By now they’ve essentially built a massive database showing all the sites that business owners report as spammy or fraudulent. If spam links turn up on the same sites over and over again, Google can detect that pattern.

That’s why we believe that the next major update to Google will be a refresh of the original Penguin update, one that bypasses individual disavow requests and handles spammy sites algorithmically. The biggest offenders could be penalized automatically so they pass less PageRank and influence to the sites they link to, essentially making their links irrelevant. We call it Penguin Refresh.
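To see why devaluing a spammy site’s outbound links matters, it helps to picture how PageRank-style link weighting works in the first place. Here is a minimal, illustrative sketch—not Google’s actual implementation, and the toy link graph is invented—showing how each page passes a share of its rank to the pages it links to:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. `links` maps each page to the pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page with no outlinks: spread its rank evenly.
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                # Each page divides its rank among everything it links to.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# Hypothetical site graph: "spam" links in but nothing links back to it.
graph = {
    "home": ["blog", "about"],
    "blog": ["home"],
    "about": ["home"],
    "spam": ["home"],
}
scores = pagerank(graph)
```

In this toy model, the "spam" page earns almost no rank of its own (nothing links to it), so the link it passes to "home" is worth little. An update that penalizes known spam sites would push their passed-along value even closer to zero, which is exactly the effect described above.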

Getting Ready

Is your site ready for the next evolution of SEO? EverSpark is here to help. We offer a free consultation to discuss your SEO needs and the solutions that will work for you. Get your consultation today.