At first I thought that figuring out exactly what Penguin was all about would be as simple as putting together a child’s puzzle. In the past, all I had to do was run a few reports, analyze the data, and the puzzle came together before my eyes.
Penguin is a completely different beast. There are many, many pieces overlapping one another, some hidden completely underneath the others.
But the full picture is starting to emerge before my eyes.
I believe Penguin has everything to do with shareholders, ratios, raters, analytics, social media, usability, and anchor text.
By ratios I mean a myriad of comparisons between high- and low-quality links: contextual versus irrelevant, free versus manufactured. I’ve seen sites that rank better despite having many low-quality, spammy links, but the ratio of low-quality to high-quality links keeps them under the radar. Sites with extremely high anchor-text density for money terms are still ranking well, but those same sites also have fairly active social media profiles, and their keywords aren’t expensive enough to matter.
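To make the ratio idea concrete, here is a minimal sketch. The quality labels, the sample profile, and the 0.6 "radar" threshold are all my own illustrative assumptions; nobody outside Google knows how (or whether) such a ratio is actually computed.

```python
# Hypothetical sketch: judging a backlink profile by its low/high quality mix.
# The labels and the 0.6 threshold below are illustrative assumptions only.

def quality_ratio(links):
    """Return the fraction of links in a profile judged low quality."""
    low = sum(1 for link in links if link["quality"] == "low")
    return low / len(links)

profile = [
    {"url": "blog-comment-spam.example", "quality": "low"},
    {"url": "industry-news.example",     "quality": "high"},
    {"url": "partner-site.example",      "quality": "high"},
    {"url": "directory.example",         "quality": "low"},
    {"url": "editorial-link.example",    "quality": "high"},
]

ratio = quality_ratio(profile)
print(ratio)        # 0.4
print(ratio > 0.6)  # False -- still "under the radar" in this toy model
```

The point is simply that a site can carry plenty of spammy links and still look fine, as long as the proportion stays low.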
What do you think happened to all of those businesses that got decimated? You guessed right: they ran straight to Google AdWords and increased their spend to compensate for the loss of organic traffic. And guess what else I’ve noticed? The industries where there was vast movement were the ones with expensive money terms. The terms that didn’t change much with Penguin were the ones that don’t make much money for Google. And Google can easily track this using AdWords CPC data.
Many speculate that anchor-density over-optimization and manufactured anchor-text profiles were the culprits behind Penguin’s ranking losses. And yes, many of the sites that lost rankings did have very high anchor density for money terms. But at the same time, I have found MANY sites that are still ranking with incredibly high anchor density. Why are some sites still doing well while others got destroyed?
I believe that to compensate for turning down the dial on anchor text, Google turned up the dial on two other signals: social media and analytics. Instead of using links as votes, they are now using people’s behavior directly as votes, analyzed through the data in webmasters’ analytics accounts. Google encourages webmasters to focus on creating good sites with good content, but how can an algorithm determine that? By relying on the user data in a site’s analytics account: bounce rate, time spent on site, conversions, social votes. This data is direct user feedback, and it’s all very easy to track through Google Analytics, which a large portion of the web uses.
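If those signals really were blended into a score, it might look something like the sketch below. To be clear, the weights, caps, and the very idea of a single "engagement score" are hypothetical illustrations of the argument above, not anything Google has disclosed.

```python
# Hypothetical illustration only: the weights and caps are invented to show
# how bounce rate, time on site, and social votes COULD combine into one
# engagement number. No one outside Google knows the real formula, or
# whether Analytics data feeds rankings at all.

def engagement_score(bounce_rate, avg_time_on_site, social_votes):
    """Blend the user signals mentioned above into a single 0-1 number.

    bounce_rate:       0.0-1.0, lower is better
    avg_time_on_site:  seconds per visit
    social_votes:      raw count of shares/likes
    """
    time_component = min(avg_time_on_site / 300.0, 1.0)  # cap at 5 minutes
    social_component = min(social_votes / 100.0, 1.0)    # cap at 100 votes
    return round(0.4 * (1 - bounce_rate)
                 + 0.4 * time_component
                 + 0.2 * social_component, 3)

# A "sticky" site versus a thin, bounce-heavy one:
print(engagement_score(0.35, 240, 80))  # 0.74
print(engagement_score(0.85, 20, 2))    # 0.091
```

Under any weighting like this, a site that keeps visitors around and earns shares would comfortably outscore one that people abandon in seconds.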
I also saw sites with active social profiles surviving the Penguin slap, while other sites with similar backlink profiles but no social presence drowned. A couple of days after Penguin, Google publicly announced social-signal tracking in Google Analytics. Coincidence? Maybe, but I choose to believe the two are related. Additionally, Google recently purchased PostRank, which tracked social media analytics. Coincidence that Penguin is now live, PostRank has shut down, and Google Analytics publicly announces better social media tracking? Nah.
In a previous post I talked about Google openly using human raters. What does this account for? Inconsistencies in the data. Why are sites with high anchor density and low-quality link profiles still doing well? They simply haven’t been caught and/or rated yet. I believe human raters played a big role in who got hit and who didn’t.
Based on the many factors mentioned above, usability is now more important than ever. Having a usable site that keeps people engaged is now an essential ranking signal. Building quality content that keeps people coming back and gets them to stay on the site longer can now make the difference between a top 5 and top 20 spot in the rankings.
How do we put all of this together, and move forward with SEO in the post-Penguin world?