I got an email from an SEO expert yesterday. Not that this is a particularly noteworthy bit of news, but I got called out on one of the naughtier SEO tricks I used to rank the Gorilla Marketing website. I’m not condoning the practice of ‘black hat SEO’ at all – but there were one or two ‘grey hat’ things I may have done to help the website along the way, which subsequently meant I needed to hide my backlinks.
Why Hide Your Backlinks?
This one shouldn’t be too hard to figure out. The SEO world is dog eat dog, especially when you are ranking for SEO-related keywords. By not taking these precautions, you are in effect handing a competitor a blueprint for copying what you have done. Some people also have PBNs (private blog networks) or paid-for links that they want to keep away from prying eyes, for obvious reasons.
Understanding How Tools Find Your Links
Just like Google, the major tools out there have their own crawlers that find web pages and follow links to new web pages. The process continues until they have saved every linked-to page on the internet, along with information on who each page links out to. Unlike Google, though, which uses an algorithm to sort this information into the results you see when you search for something, SEO tools make this information available to the public. This, in effect, gives anyone a blueprint to duplicate your efforts – which obviously won’t do.
Hiding Your Links
There are a few ways to do this, and it all depends on the type of link pointing back to your site. If the link is on your own blog network, then you want to make sure the tools don’t find the website at all. In this scenario you own the website that is linking back to you, so you have control over the robots.txt file – the file that bots check before they crawl a website to see whether they are allowed on it. Most guides out there will give you a list of bots to disallow, but you will never catch them all. For this reason, I choose to disallow everything and only let the good bots in. Here is what you put in the robots.txt file:
User-Agent: *
Disallow: /

User-Agent: Googlebot
Allow: /

User-Agent: Googlebot-Mobile
Allow: /

User-Agent: Googlebot-Image
Allow: /

User-Agent: Mediapartners-Google
Allow: /

User-Agent: Adsbot-Google
Allow: /

User-Agent: Slurp
Allow: /

User-Agent: msnbot
Allow: /

User-Agent: msnbot-media
Allow: /

User-Agent: Teoma
Allow: /
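One thing worth flagging: a couple of the names on that list are getting on a bit (msnbot is Bing’s old crawler and Slurp is Yahoo’s). Depending on which search engines you actually care about, you might also want to let the newer tokens in – which engines matter to you is your call, but an addition could look like this:

User-Agent: bingbot
Allow: /

User-Agent: DuckDuckBot
Allow: /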
What I would also recommend is going into the .htaccess file and making sure these bots are sent somewhere other than the sites on your network. It isn’t as easy to disallow everything and only allow a select few this way, so here we only send the bots of the major SEO tools to another site.
RewriteEngine On
RewriteBase /

# Bounce the major SEO tool crawlers off to Google
RewriteCond %{HTTP_USER_AGENT} .*AhrefsBot.* [OR]
RewriteCond %{HTTP_USER_AGENT} .*SemrushBot.* [OR]
RewriteCond %{HTTP_USER_AGENT} .*MJ12Bot.* [OR]
RewriteCond %{HTTP_USER_AGENT} .*RogerBot.*
RewriteRule ^(.*)$ http://www.google.com/ [L,R=301]

# Block crawler IP ranges as well (Apache 2.2 syntax; the range below is only an example)
Order Allow,Deny
Allow from all
Deny from 216.123.8.0/8
Deny from ....
Shout out to Craig Campbell SEO for the code used in the .htaccess file. Once you have done that, the sites on your network linking back to your main site will only be found by the bots of the search engines. This is only half the battle though, and not everyone (myself included) ranks with a blog network.
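If you would rather not bounce those crawlers over to Google, a slightly quieter option is to refuse them outright. This is just a sketch under the same Apache setup as above; the [F] flag simply answers with a 403 Forbidden:

RewriteEngine On

# Alternative: refuse the SEO tool crawlers instead of redirecting them
RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|SemrushBot|MJ12Bot|RogerBot) [NC]
RewriteRule ^ - [F,L]

Either way, the tool’s crawler never gets to see the pages on your network, which is all that matters here.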
Hiding Links From Sites You Don’t Own
I can’t really remember how I stumbled across this method, or I would give them a link. See, some SEOs have some really good sources linking back to them, and we don’t want to give that information away to our competitors. Unfortunately, there is nothing you can do about links you attract naturally, but if you are putting your link on other websites with the intention of ranking, then you can hide them. It does require a small sacrifice in ‘link juice’, but it is well worth it.
UPDATE: Hooray! As of 1st Aug 2016, you don’t lose any juice through this method! Moz Article
The first thing you want to do is create a new sub-domain on the website you are ranking (a keyword sub-domain, perhaps?). Just as you edit the robots.txt and .htaccess files on a PBN site, you do exactly the same for this sub-domain. You then 301 redirect the newly created sub-domain to the page you are trying to rank. Now you build links to this sub-domain instead of directly to your web page, and as such no one will be able to find them. A sketch of what the sub-domain’s .htaccess might look like is just below.
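To make that concrete, here is a minimal sketch of the sub-domain’s .htaccess, assuming a hypothetical sub-domain links.example.com pointing at example.com/target-page/ – swap in your own domain and money page:

# .htaccess on the hypothetical sub-domain links.example.com
RewriteEngine On
RewriteBase /

# Same trick as on a PBN site: bounce the SEO tool crawlers away
RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|SemrushBot|MJ12Bot|RogerBot) [NC]
RewriteRule ^ http://www.google.com/ [L,R=301]

# Keep the sub-domain’s robots.txt reachable rather than redirecting it
RewriteCond %{REQUEST_URI} !^/robots\.txt$
# Everything else gets 301 redirected to the page you are trying to rank
RewriteRule ^(.*)$ https://example.com/target-page/ [L,R=301]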
In case you didn’t notice the best part, what you also have here with this sub-domain is a ‘tap’. Any link juice from links you build to it can very easily be switched on and off by deleting or moving the 301. Now, I’m not trying to put any ideas in anyone’s head here – but I’ll leave you with this: if a sub-domain gets penalised, the root domain remains unaffected. Have fun!
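As a rough illustration of the ‘tap’ (same hypothetical names as above), turning the juice off is as simple as commenting the redirect out, or pointing it at a throwaway page:

# Tap on: links built to the sub-domain flow through to the money page
RewriteRule ^(.*)$ https://example.com/target-page/ [L,R=301]

# Tap off: comment the rule out and the flow stops
# RewriteRule ^(.*)$ https://example.com/target-page/ [L,R=301]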