Can Subdomains Help Me Recover from Panda?

Read the full Wall Street Journal story here: Site Claims to Loosen Google ‘Death Grip’

Hey everyone, it’s Jason Hennessey from Everspark Interactive, and today we’re going to be addressing a recent news story that is causing some buzz in the search engine optimization community.

Yesterday, the Wall Street Journal published a story on one website’s claim that it has been able to recover from Google’s Panda Update by dividing its large site among several subdomains. The five-year-old site, HubPages.com, was hit hard by the update because it relied heavily on Google for its traffic. The content publishing website’s CEO, Paul Edmonson, guest blogged on TechCrunch.com about his post-Panda plight on May 5th, admitting that his open publishing platform furnishes both high-quality and some low-quality content, and adding that YouTube does the same. In that guest post, however, he described his shock at discovering that his entire domain was being punished by Google for the low-quality work of certain content contributors. Back in May, Edmonson was having no luck reaching out to Google for help or an explanation (surprise, surprise); recently, though, the HubPages CEO’s luck changed. With a great deal of hard work, Edmonson has triumphed over Panda and claims that HubPages has been slowly moving up in the rankings since he began making one significant change. Since this is the first real solution that has been presented to the Panda conundrum, I thought I’d comment on whether it could be a viable option for sites that have suffered greatly since Google’s February algorithmic change.

HubPages.com’s mix of high-quality and low-quality content is probably the reason for its 50 percent drop in rankings, not to mention the millions of dollars in lost revenue, post-Panda. Though the site made its standards stricter and cleaned up its act, it still saw no positive results. Then Edmonson came up with the idea of creating subdomains, giving each content author on the site his or her own subdomain. This way, Google could distinguish between the high-quality written content and the low-quality material. After finally receiving confirmation from Google’s own Matt Cutts that this was a good idea, Edmonson began testing the method in late June and claims to have already seen pre-Panda traffic levels on some higher-quality subdomains. Encouraged, HubPages began rolling the technique out to every author on its site this week.

Essentially, by separating the good content from the bad, the site has given Google the opportunity to reward some of the subdomains and eliminate the others that contribute very little to what Google wants to be a content-farm-free internet. Either Google is reassessing all of the content, or the revamped HubPages has simply avoided the penalty; either way, this is the first time a large website has been able to escape Panda’s “Death Grip.” Google is treating each subdomain as if it were a separate website, which gives site owners the opportunity to bounce back by separating what they know to be low-quality content from what is high quality.
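To make the mechanics a bit more concrete, here is a minimal sketch, in Python, of how a site might map per-author subdirectory URLs onto per-author subdomains and then 301-redirect the old addresses. The example.com domain, the /author/<name>/ path layout, and the subdomain_url helper are illustrative assumptions, not HubPages’ actual URL scheme or code.

```python
from typing import Optional
from urllib.parse import urlsplit, urlunsplit

ROOT_DOMAIN = "example.com"  # illustrative root domain, not HubPages' real setup


def subdomain_url(url: str) -> Optional[str]:
    """Map an /author/<name>/... URL onto that author's own subdomain."""
    parts = urlsplit(url)
    segments = [s for s in parts.path.split("/") if s]
    if len(segments) < 2 or segments[0] != "author":
        return None  # not an author page; leave it on the main domain
    author, rest = segments[1], segments[2:]
    new_host = f"{author}.{ROOT_DOMAIN}"
    new_path = "/" + "/".join(rest)
    return urlunsplit((parts.scheme, new_host, new_path, parts.query, parts.fragment))


if __name__ == "__main__":
    old = "https://example.com/author/jane/how-to-grow-herbs"
    new = subdomain_url(old)
    # In a real migration the server would answer the old URL with an
    # HTTP 301 pointing at the new one, so the page ends up indexed only
    # under the author's subdomain rather than under the main domain.
    print(f"301 {old} -> {new}")
```

The point of the permanent redirect is that each author’s pages end up living under a hostname Google can evaluate on its own, so one author’s thin content no longer drags down everyone else’s rankings.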

Can Subdomains Help Me Recover from Panda?

SEOBook’s CEO Aaron Wall recently blogged about this new development, commenting on the irony that as soon as Edmonson pointed out in his guest post that Google’s own YouTube wasn’t being punished for low-quality content while other sites (including HubPages.com) were, Matt Cutts quickly responded to his queries and confirmed that subdomains were the way to go. According to Aaron, the irony runs deeper; as he puts it, “everything that is now ‘the right solution’ is the exact opposite of the ‘best practices’ from last year.” He is, of course, referring to Google’s announcement in March of last year that webmasters should let the search engine do its job and not worry about certain pages causing a site to be penalized for duplicate or bad content (because, as Google claimed, the crawlers will recognize what is good and what is bad). Now, site owners are supposed to create subdomains so that they are not penalized for bad content on their pages, because the crawlers are not going to do that job for them. The algorithm change did come after that advice, but the least Google could do is update its guidance. Now, maybe it will.

This new development could help a lot of sites like HubPages, where so much content is being created and published that quality is hard to keep track of, overcome the frustration of making all the changes the update requires and still seeing no movement in their rankings. As this possible solution to Panda’s devastating effects develops, we will keep you updated. Check our blog for daily commentary on everything going on in the search engine world. Thanks for watching, and we’ll see you next time!