The Google ‘Panda’ Update
Over the past 12 months Google has declared war on the forms of Search Engine Optimisation that had become the norm for how websites achieve high rankings: pages ranking for keywords they shouldn't, or appearing far too often in the search results pages despite low-quality content.
These practices had degraded the quality of results so much that Google implemented some major changes to its algorithm to combat web spam. There was also the 'Venice' update, which already seems forgotten, but for now we'll just look at the Panda update, which has had the bigger impact on SEO.
In February 2011 Google launched the 'Panda' update. Every year Google releases a major update that has the SEO community in a spin, but this one was different: it affected up to 12% of global searches.
The Panda update is algorithmic, so it's an instant change to the way results are returned. It's partly based on scalable machine learning, so it evaluates pages more like a person would rather than like a rigid program or formula. In the past SEO was mostly about the technicalities; now it's more about 'Inbound Marketing' with a technical layer.
Over the past few months there have been over a dozen data refreshes, some seemingly every other week. Panda is now on version 3.7, with 3.8 scheduled for around the 28th of July 2012.
The 'Panda' update really targets the content of a web page, assessing how useful it is to the end user.
Often an SEO would create a page laced with keywords, and with a few other changes it wouldn't be too difficult to make that page rank for the desired terms. That is one reason we've seen so many web designers claim to specialise in Search Engine Optimisation, but there's a big difference between understanding best practice for web standards in design and understanding SEO practice across the web for multiple brands.
Google is now looking at the content of your website to understand how it meets the user's needs. One of the signals they use is the reading level of the site, which can indicate how well the content has been written and the standard it reaches.
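Reading level can be estimated with a classic readability formula such as Flesch Reading Ease, where higher scores mean easier text. The sketch below is purely illustrative: the crude vowel-group syllable counter is an assumption for brevity, and this is not how Google actually computes reading level.

```python
import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels;
    # every word counts as at least one syllable.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text):
    """Approximate Flesch Reading Ease: higher = easier to read."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

print(flesch_reading_ease("The cat sat on the mat. It was happy."))
print(flesch_reading_ease("Incomprehensibly convoluted prognostications materialise unexpectedly."))
```

Simple, short-sentence copy scores high; dense, polysyllabic copy scores low (even negative with this rough syllable counter), which is the kind of signal a reading-level classifier can pick up on.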
The entire website is being measured, so it's not enough just to think about one page and one range of keywords. Site authority is becoming ever more important, as seen in some of the author-rank changes; those are separate from Panda, but they show how Google is looking more towards the user experience (whilst pushing its social network to be more integrated within search).
They're able to pull data from other products, such as Android devices, Gmail and Chrome, which helps feed some of the machine learning.
One signal they can use to gauge quality is a website's bounce rate. Although they have said they don't use Google Analytics data as part of the algorithm, they can still measure how long a person spends on a web page after leaving the results page. They do measure click-through rate from the results, which is shown in Google Webmaster Tools.
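To make "bounce rate" concrete, here's a minimal sketch that computes it from a hypothetical page-view log. The log format, and defining a "bounce" as a session with only one page view, are assumptions for illustration; this is not Google's actual measurement pipeline.

```python
from collections import defaultdict

# Hypothetical page-view log: (visitor_id, timestamp_seconds, url)
views = [
    ("a", 0, "/landing"), ("a", 45, "/pricing"),
    ("b", 10, "/landing"),
    ("c", 20, "/landing"), ("c", 30, "/blog"), ("c", 200, "/contact"),
]

# Group page views into per-visitor sessions.
sessions = defaultdict(list)
for visitor, ts, url in views:
    sessions[visitor].append((ts, url))

# A "bounce" here is a session containing a single page view.
bounces = sum(1 for hits in sessions.values() if len(hits) == 1)
bounce_rate = bounces / len(sessions)
print(f"bounce rate: {bounce_rate:.0%}")  # 1 of 3 sessions bounced -> 33%
```

A high bounce rate can suggest the page didn't satisfy the visitor, but as the article notes below, it can equally mean the page answered the query instantly.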
The new Google +1 button is another indicator (limited on its own in terms of what it shows), and they have years of data to incorporate alongside the other factors when trying to gauge the standard of content on a web page.
If they are sending people to a site that ranks highly but doesn't give them what they are looking for, then it's a poor reflection on their search engine, so they remain firmly committed to ensuring that searchers get useful results.
That said, some web pages may have a high return-to-results rate simply because they answer the query immediately, for example by displaying a telephone number or a direct answer to the question asked.
So the task for us now is really to create compelling content that engages people, although engagement is often subjective. An old SEO method was to use article directories to 'syndicate' your content across the web with links back to your website, which had a really positive effect. This update seeks to find out whether your content is unique or duplicated elsewhere, so be careful where it goes!