Mountain View, California – Google Penguin 2.1 made its debut on Friday, Matt Cutts, head of the search engine’s webspam team, announced via Twitter. The latest version of the algorithm is designed to go deeper into sites containing webspam; previous iterations analyzed only the home page.
Penguin 2.1 will affect approximately 1 percent of all queries, according to Cutts. The update is part of a continuing effort to filter out low-quality sites that both Google and Bing do not want to channel traffic to through their search portals.
Penguin 2.1 and Panda
Along with Penguin 2.1, a slight improvement over 2.0, Panda 5 also went live. The original Panda hit the web in February 2011 and was aimed directly at “low-quality sites” or “thin sites.” The first version of Penguin followed in April 2012, targeting sites participating in link schemes.
The ongoing goal of the search engines is to deliver the best possible results for search queries. To that end, Google in particular has publicized its largest algorithmic changes as a means of discouraging webmasters from publishing low-quality content or using black-hat SEO techniques.
A New Generation of Algorithms
Though Google has stated that it uses over 200 signals to determine organic search ranking, most of those signals remain undisclosed. Recently, the search engine incorporated a new kind of signal designed specifically to handle more complex queries. The addition to Google’s core search engine is named Hummingbird, and rather than simply parsing keywords from a query, it attempts to interpret the search in context. Along with Panda and Penguin, search is evolving toward what’s known as the semantic web.
“It’s a brand new generation of algorithms. The previous iteration of Penguin would essentially only look at the home page of a site. The newer generation of Penguin goes much deeper and has a really big impact in certain small areas,” Cutts said in May of this year.