Finally, after what seems like years of writing about how inflexible Google’s actions against websites have been, we see a ray of light at the end of the tunnel. Just this week they have introduced a new option in Webmaster Tools which shows you whether they are taking any manual action against your site.
The tool stems from a problem that Google has created and is now trying to address; namely, that you as a site owner are wholly responsible for the sites that link to you. When you look at it in these terms it seems incredible that you should be held responsible for everyone who links to you; there is absolutely no parallel to this in normal life. Imagine a political party being held responsible for everyone who supports it, or a football team for everyone who cheers it on. It sounds ridiculous when you put it in those terms. But Google being Google, they have decided that you are responsible for the links to your site, whether you know about them or not.
Of course, under normal circumstances this shouldn’t be a problem. But because Google counts the links to your site and uses them as part of its algorithm to determine just how popular and valuable your site is, manipulating those links has become the ‘stock in trade’ of spammers.
Now, most websites won’t be affected by this, as the vast majority of website owners have never, and would never, try to manipulate the rankings by manipulating their backlinks. The problem is that the small minority who have done this over the years has led Google to implement a ‘speed bumps’ solution. Much like a busy road on which a handful of people speed, putting speed bumps in the road slows everyone down, penalising the good drivers as well as the bad.
As soon as we were made aware of this new tool, our first job was to check every one of our client sites to make sure that there were no issues that needed addressing. In the majority of cases we found this:
However, in just one case we found a different type of warning. This was slightly confusing, as this particular site is for a small local business which has never tried to buy links, relying instead on good on-page optimisation and good digital marketing to build its reputation.
The warning looked like this:
The interesting thing about this new tool is that Google seem to have accepted that, as a site owner, you can’t be held completely responsible for all the links back to your site. In this instance they have recognised that there are some links, pointing to some pages, which are deemed to be ‘unnatural, artificial, deceptive or manipulative’. This particular warning shows that Google have shifted the responsibility back to the sites that link to you, and are simply advising you that some links may be bad.
Of course, the next job was to pull a full backlink profile and analyse it to see what could be causing the problem. Given that this site had never attempted any link building, we were surprised to find over 700 links to it. When we dug in, the majority came from sites which no longer existed; the sites and pages were simply not there. Of those that were still live, some carried big bright warnings saying that ‘XYZ site has been hurting your rankings and we have therefore closed the site’. It seems that the client’s domain had been scraped at some point and added to a cheap directory database, and from there others had simply scraped that data, leading to lots of links from very poor sites.
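The first pass of this kind of analysis is easy to automate. As a rough sketch (the example URLs and function name here are illustrative, not anything exported from Webmaster Tools), you can group a backlink list by linking domain to see where the links are concentrated before checking whether those domains still exist:

```python
from collections import Counter
from urllib.parse import urlparse

def links_per_domain(backlink_urls):
    """Count backlinks per linking domain from a list of URLs,
    e.g. a list exported from a backlink report."""
    counts = Counter()
    for url in backlink_urls:
        host = urlparse(url).netloc.lower()
        # Treat www.example.com and example.com as the same domain.
        if host.startswith("www."):
            host = host[4:]
        if host:
            counts[host] += 1
    return counts

# Made-up example backlinks:
backlinks = [
    "http://www.cheap-directory.example/listing/123",
    "http://cheap-directory.example/listing/456",
    "http://dead-site.example/some-page",
]
for domain, count in links_per_domain(backlinks).most_common():
    print(domain, count)
```

Domains with many links from a directory you never submitted to are the obvious first candidates to investigate.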
The solution was to create a list of all the domains we considered to be bad (most of them) and upload a ‘disavow links’ file. We have now asked Google for reconsideration and expect it to be granted fairly quickly.
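For reference, the disavow file itself is just a plain text list: Google accepts individual URLs, or whole domains prefixed with `domain:`, with `#` marking comment lines. A short sketch with made-up domains (not the actual file we uploaded):

```
# Directories that scraped the client's details
domain:cheap-directory.example
domain:scraped-links.example
# A single bad page, rather than a whole site
http://spam.example/links/page1.html
```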
So the new functionality is interesting and useful, but more importantly it marks a shift away from Google’s previously draconian approach towards a more user-friendly one. The next step, of course, will be for a message to appear in Webmaster Tools when these actions are taken (it doesn’t currently, but no doubt this is on the development list), and it would be nice to be given some idea of the type of link that Google considers to be bad. This last point is something Google promised some time ago but has not yet delivered, and it is essential for this type of work. If they expect site owners to police their own backlinks (and whether they should is another matter entirely), Google must give them some idea of what to look for. Simply telling people their links are bad, but not which ones, where they are from, or indeed why they are bad, is no use to site owners.
So hats off to Google for this latest development; it’s a good step forward. And, unusually, we can sign this piece off without being critical of Google. Is this a sign that things are changing for the better?