It’s time for another SEO Basics post, and this time the topic is getting more out of link building efforts.
Link building continues to be an important part of marketing and optimizing web sites, yet web marketers can often get distracted by quantity goals rather than quality. Link building efforts for search engine optimization purposes rely on clean links that can be crawled by search engine bots. But what’s a “clean, crawlable link”? It’s one that is not blocked by a robots noindex/nofollow meta tag, a JavaScript redirect, a robots.txt rule, or a rel=”nofollow” attribute.
There are hundreds of ways to attract and acquire links. If link requests, article submissions or other high-labor, low-impact tactics are used, then it’s important to make sure the links acquired are good for both users and search engines.
Here are a few things to check for:
1. To check the robots meta tag, look at the page’s source code. If there is no robots meta tag at all, that’s fine.
If there is a robots meta tag and it looks like this:
&lt;meta name=”robots” content=”index, follow”&gt;
that’s good – although it’s not really necessary on the part of the webmaster, since “index, follow” is the default behavior.
If the robots meta tag looks like this:
&lt;meta name=”robots” content=”noindex, nofollow”&gt;
OR
&lt;meta name=”robots” content=”index, nofollow”&gt;
that’s not good as far as SEO benefit from the links is concerned.
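The robots meta tag check above is easy to script. Here is a minimal sketch using only Python’s standard library; the HTML string is a stand-in for a fetched page’s source, not a real site.

```python
# Sketch: detect a page's robots meta tag with Python's standard library.
# The sample HTML below is a hypothetical stand-in for a fetched page.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag on the page."""
    def __init__(self):
        super().__init__()
        self.robots_content = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_content = attrs.get("content", "").lower()

html = '<html><head><meta name="robots" content="index, nofollow"></head></html>'
parser = RobotsMetaParser()
parser.feed(html)

if parser.robots_content and "nofollow" in parser.robots_content:
    print("Links on this page pass no SEO benefit:", parser.robots_content)
```

If `robots_content` stays `None`, the page has no robots meta tag, which (as noted above) is fine.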
2. To see if links on a desired page are redirected with JavaScript, put your cursor over the link and look at the URL that appears in the status bar at the bottom of the browser. If it shows the correct link URL, then in most cases it’s OK.
However, the status bar can be faked with JavaScript, so if you can copy the displayed URL, you may want to run the link through a tool like Rex Swain’s HTTP Viewer, which will show you whether the link redirects, what type of redirect it is, and where it leads.
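A redirect check like the one Rex Swain’s tool performs can also be sketched with the standard library. The helper below interprets the status codes such a check returns; the `urllib` opener shows how each hop could be logged on a live URL (left commented out, since the URL would be your own).

```python
# Sketch: interpreting a link's redirect chain. describe_status() maps an
# HTTP status code to its rough meaning for link value; RedirectLogger shows
# how to surface each hop when checking a live URL (hypothetical usage).
import urllib.request

def describe_status(code: int) -> str:
    """Interpret an HTTP status code seen while following a link."""
    if code == 200:
        return "direct response - no redirect"
    if code in (301, 308):
        return "permanent redirect - points engines at the new target"
    if code in (302, 303, 307):
        return "temporary redirect - treated less reliably by engines"
    return "check manually"

class RedirectLogger(urllib.request.HTTPRedirectHandler):
    """Prints each redirect hop instead of following it silently."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        print(f"{code}: {describe_status(code)} -> {newurl}")
        return super().redirect_request(req, fp, code, msg, headers, newurl)

opener = urllib.request.build_opener(RedirectLogger())
# opener.open("http://www.example.com/some-link")  # uncomment for a live check

print(describe_status(301))
```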
3. To see if robots.txt is blocking search engine spiders from crawling links on a page, add “/robots.txt” to the root URL of the site in question.
For example:
http://www.articleblast.com/robots.txt
Here you will see:
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /editor/
Disallow: /help/
Disallow: /images/
Disallow: /includes/
Disallow: /language/
Disallow: /mambots/
Disallow: /media/
Disallow: /modules/
Disallow: /templates/
Disallow: /installation/
What the “Disallow” instruction above does is tell search engine spiders not to crawl the designated directories. In the case of the article sharing site above, if any articles are located in one of these directories, then the links within those articles to client sites are no good for SEO benefit. However, readers can still click on the links and arrive at the intended destination.
4. To see if there is a nofollow attribute, view the page source, or right-click on the link in a browser like MSIE or Firefox and click “Properties”. Look for an attribute that reads:
rel=”nofollow”
If that’s there, the link is no good for SEO.
If it’s not there at all, or has another value besides “nofollow”, then it’s probably OK.
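Rather than inspecting links one by one, a short script can sort every link on a page into followed and nofollowed lists. This is a sketch with a hypothetical two-link snippet standing in for a real page:

```python
# Sketch: split a page's links into followed and nofollowed, using the
# standard library's HTMLParser. The sample snippet is a hypothetical page.
from html.parser import HTMLParser

class NofollowFinder(HTMLParser):
    """Collects hrefs, separating rel="nofollow" links from the rest."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel = (attrs.get("rel") or "").lower()
        if "nofollow" in rel.split():
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

finder = NofollowFinder()
finder.feed('<a href="/a">A</a> <a rel="nofollow" href="/b">B</a>')
print("Followed:", finder.followed)      # ['/a']
print("Nofollowed:", finder.nofollowed)  # ['/b']
```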
In most cases you only have to do this check once on any page of a particular site. The reason sites do any of these four things is to hoard or “sculpt” their site’s PageRank; in the case of blogs, it’s to discourage comment spam.
Now here’s the rub: an experienced SEO professional will never do the above 4 steps manually. In almost all cases, they will either develop tools to automatically check for “clean links” or they won’t bother at all. It’s an important question to ask when working with an SEO consultant. Just because a client site’s inbound link count goes from 500 to 5,000 links doesn’t mean all of those links are created equal. In other words, quantity is nowhere near a complete measurement of a link building effort to improve search engine rankings.
Link building based on forms or submissions is very difficult to scale, and the links are often nofollowed once it is discovered that SEOs are populating the site with their content. Link building as a result of creating and promoting content worth linking to is high value, high impact and very scalable. However, if it is important to check link sources to determine their value for SEO benefit, you can use the 4 steps above, create your own script to check them automatically, or work with an SEO company that has its own.
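To illustrate what such a script might look like, here is a hedged sketch that combines three of the checks above (robots meta tag, rel=”nofollow”, and robots.txt) into one pass over a page. All of the inputs — the page HTML, the robots.txt rules, and the URLs — are hypothetical stand-ins for fetched data; the JavaScript redirect check would still need an HTTP tool.

```python
# Sketch of a combined "clean link" checker: given a page's HTML, the site's
# robots.txt rules, and a target link, report whether the link passes value.
# All names and URLs here are hypothetical examples, not real sites.
from html.parser import HTMLParser
from urllib.robotparser import RobotFileParser

class LinkAudit(HTMLParser):
    """Records the page's robots meta content and the target link's rel."""
    def __init__(self, target_href):
        super().__init__()
        self.target = target_href
        self.meta_robots = ""
        self.link_rel = None  # stays None until the target link is found

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.meta_robots = (attrs.get("content") or "").lower()
        elif tag == "a" and attrs.get("href") == self.target:
            self.link_rel = (attrs.get("rel") or "").lower()

def link_is_clean(html, target_href, robots_rules, page_url):
    audit = LinkAudit(target_href)
    audit.feed(html)
    rp = RobotFileParser()
    rp.parse(robots_rules.splitlines())
    return (audit.link_rel is not None            # link actually exists
            and "nofollow" not in audit.link_rel  # not individually nofollowed
            and "nofollow" not in audit.meta_robots  # page-wide links followed
            and rp.can_fetch("*", page_url))      # page itself is crawlable

page = ('<meta name="robots" content="index, follow">'
        '<a href="http://client.example/">anchor</a>')
print(link_is_clean(page, "http://client.example/",
                    "User-agent: *\nDisallow: /cache/",
                    "http://host.example/articles/post.html"))  # True
```

Run against a crawl of your link sources, a checker like this separates links that can actually pass value from ones that only look good in a raw count.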