While many SEOs and businesses religiously follow the Official Google Blog, Google Webmaster Central, and Google Inside Search to keep up with algorithmic changes to the search engine's ranking system, few actually know how those changes are made. Google recently posted a video in an effort to illuminate that very process, and after watching closely, we think there are several things worth paying attention to.
Google Lets Us “Look Under the Hood”
Even if it's just a small change to the algorithm, Google goes through a rigorous scientific and collaborative process (across multiple departments: analysis, engineering, etc.) to decide what will be best for the user experience. Here's the video, if you want to watch for yourself. Otherwise, here's our recap:
The video begins with this astounding statement: "Every year, Google implements over 500 improvements to its search algorithms." That means the algorithm is being tweaked once or twice a day in some small way. The video also explains that the search algorithm is composed of hundreds of signals; in an effort to provide good results for users, Google works to combine these signals in the best possible way.
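If you're curious where the "once or twice a day" cadence comes from, it falls straight out of the figure quoted in the video:

```python
# Back-of-the-envelope check of the figure quoted in the video:
# 500+ improvements per year, spread over 365 days.
improvements_per_year = 500
changes_per_day = improvements_per_year / 365
print(f"{changes_per_day:.2f}")  # prints 1.37 -- i.e., one to two tweaks a day
```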
Google highlights collaboration and scientific processes as the foundation of its problem solving when it comes to search algorithm changes.
Each change is evaluated and made in the following way:
1st Step: The first thing to do is come up with an idea. The idea comes from "a set of motivating searches" that are not performing as well as Google would like.
2nd Step: "Rank engineers" brainstorm hypotheses as to why, and work out what data could be integrated into the algorithm to improve the situation.
3rd Step: The ideas go through scientific evaluation: people referred to as "raters" (who are not Google employees but are trained by Google) compare the old and new results side by side and decide which set is higher-quality and/or more relevant.
4th Step: "The Sandbox" is where Google tests live experiments on real users by directing a small slice of traffic to the experimental results. Google ran more than 20,000 such experiments in 2010!
5th Step: Search analysts – who, according to Quantitative Analyst Sangeeta Das, "provide an informed, data-driven decision and present an unbiased view" – collect and "roll up" the data and present it at a "launch decision meeting," where decisions about potential algorithmic changes are made. If the data indicate that the change will be a positive one for Google's users, it is launched. According to Google Fellow Amit Singhal, "When you align Google's interests with users' interests as we have aligned, good things happen."
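The final launch call described in steps 3–5 can be sketched as a simple decision rule. To be clear, everything below – the function name, the vote encoding, the thresholds – is our own illustration of the idea, not Google's actual code:

```python
def launch_decision_meeting(rater_preferences, sandbox_lift):
    """Illustrative sketch of the launch call (all names/thresholds hypothetical).

    rater_preferences: list of +1 (new results better) / -1 (old better) votes
                       from side-by-side comparisons (step 3)
    sandbox_lift:      measured change in a user-happiness metric from the
                       live-traffic sandbox experiment (steps 4-5)
    """
    raters_agree = sum(rater_preferences) > 0   # raters prefer the new results
    users_better_off = sandbox_lift > 0         # live data shows improvement
    # Launch only when both lines of evidence point the same way.
    return raters_agree and users_better_off

# A change raters favor that also improved the sandbox metric gets launched:
print(launch_decision_meeting([+1, +1, -1], sandbox_lift=0.02))  # True
# A change raters dislike does not, no matter the sandbox numbers:
print(launch_decision_meeting([+1, -1, -1], sandbox_lift=0.02))  # False
```

The point of the sketch is the alignment Singhal describes: a change ships only when both rater judgment and live user data agree it helps.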
Speaking of the Google Algorithm…
The idea of an upcoming algorithm change might give some the shivers – after all, many are still working to recover from the Panda update and its follow-up iterations (the last one we know of ran a few weeks ago, with an expansion to nearly every language released earlier this month).
One thing these follow-ups have consistently targeted is the – to some, incredibly frustrating – scraper. Many webmasters complain that their sites have been scraped, and that the scraper sites actually rank higher than their own pages containing the original content.
Search Engine Land reports on the following tweet by Google’s head of the webspam team, Matt Cutts, calling it a signal of an upcoming algorithm change:
“Scrapers getting you down? Tell us about blog scrapers you see. We need datapoints for testing.”
Included in the tweet is a link to a form noting that Google may use this data to "test and improve our algorithms." On the form, users and webmasters can report queries with scraping issues, along with the URLs of both the scraper page and the original page that was scraped.
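In other words, each submission boils down to three data points. Modeled as a simple record (the field names here are our own, not Google's):

```python
from dataclasses import dataclass

@dataclass
class ScraperReport:
    """One submission to the scraper-report form (field names illustrative)."""
    query: str         # the search where the scraper outranks the original
    scraper_url: str   # URL of the page carrying the copied content
    original_url: str  # URL of the page that was actually scraped

# Hypothetical example submission:
report = ScraperReport(
    query="example search phrase",
    scraper_url="http://scraper-site.example/post",
    original_url="http://original-blog.example/post",
)
print(report.query)
```

Collected at scale, records like these give the rank engineers exactly the kind of "motivating searches" the Inside Search video says every algorithm change starts from.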
Though part of the intention behind the Panda update was to punish scraper sites, it wasn't entirely effective on that front, as it mostly took aim at content farms. Even Panda 2.2, which specifically targeted scraper sites, didn't really fix the problem (especially when it came to blog content). So, hopefully the data Google collects about scrapers will mean that those who can identify the problem can also contribute to the solution (finally).
How the Inside Search Video Indicates Google Might Proceed to Change its Algorithm to Eliminate Scrapers Once and For All
Based on the Inside Search video described at the beginning of this post, here is what the process for altering the algorithm might look like when it comes to scrapers:
It seems as if Matt Cutts is gathering the "motivating searches" for a change to the algorithm. Next, rank engineers will form hypotheses about what could be integrated into the algorithm to root out scrapers once and for all, and raters will evaluate candidate changes side by side. Then users will be sent to the "sandbox" (likely without even knowing it), search analysts will roll up the data about what works, and the change that most benefits users will launch (hopefully sooner rather than later).
Our Atlanta SEO company follows Google very closely, especially when it comes to algorithmic changes. Because we not only like to learn but also like to share our knowledge, we blog daily about what's going on in the SEO, Google, and interactive marketing worlds. For more information about developing your site's organic SEO strategy, or for advice on questions specific to your business or website, contact our Atlanta SEO company at 770-481-1766.