For those websites still trying to recover from the Google Panda update, the news that Google quietly unleashed the Panda 2.2 algorithm update earlier this week is both frightening and encouraging. Could it help the sites that were hurt by Panda’s release? Or could it hurt those sites that managed to slide by unnoticed during the first onslaught of plummeting rankings?
At an SMX Advanced conference session a few weeks ago, Matt Cutts, head of Google’s webspam team, spoke briefly (during a Q&A with Danny Sullivan) about what could be in store for us with the newest version of Panda.
Cutts noted that a few tweaks may help some sites that were wrongly affected by the original roll-out of the algorithm change. However, if you make changes now to improve your site for Panda, your ranking will not change automatically. Instead, any improvement should show up the next time Google runs its periodic Panda assessment (some say this happens every few weeks, though there is no official word from Google on the schedule).
One important problem Panda 2.2 targets is that, in some instances, sites that re-publish content have been ranking higher than the sites that originally published it. This improved scraper-detection system is certainly a welcome change for many frustrated webmasters.
Cutts assured the audience that Google plans to keep changing and updating Panda in the future, so from what we can tell, this is only one step in a long progression of improvements to the algorithm.