I’ve speculated about this in my head for quite some time, and now I’ve finally observed it working very well in a white-hat dominated SERP: link validation is an immensely important part of link building, and of SEO in general, these days.
In a nutshell, high-volume or even highly powerful but low-quality links can’t improve rankings as much as they could six months or a year ago, unless they can be validated. In other words, what I’ve seen working very well right now is a combination of high-volume, low-quality links and low-volume, superb-quality links.
In a recent post, I discussed Google’s much-neglected validation procedure (of authority, links, reactions, etc.) and whether co-occurrences could help Google validate the authority of sites. This time, though, the focus is entirely on link building, because in the end, neither co-occurrences nor social signals on their own get a site ranked; links do!
Let me tell you exactly what I noticed recently. I saw a site ranking 2nd for a moderately competitive search term (“abc_host review”) in the web hosting niche. Well, nothing wrong with that! What was unusual was another site, with the exact same content and author info (via Google Authorship), ranking at the end of the first page.
Upon closer inspection, I found that the content from the main site ranking 2nd had been duplicated and republished on two other websites, both of which also ranked on their own within the first few pages (a failure on Google’s part to recognize duplicate content, I thought at first). Little did I know what was really going on.
So, first of all, I put the main domain into Ahrefs’ URL field and hit the ‘search links’ button. What followed didn’t amaze me, but it certainly surprised me.
The page (yes, only that page) had hundreds of low-quality links pointing to it. Among them, I noticed two things:
- A dofollow link from a Moz user-profile.
- Links from the websites that had cloned some of the main domain’s content.
Now, the latter may look like a good old link exchange (you give me a link, I give you a link), but it’s more interesting than that. The two domains that had duplicated and republished the content had some decent link metrics of their own to show off; one of them was a PR1 site with a Domain Authority of 24. So, by linking back to the main site from those relatively powerful domains hosting its duplicated content, whoever was behind the site was tricking Google into believing these were legitimate republications, which naturally contain links back to the original post.
The former is a more formidable achievement. The site owner additionally added the domains used to republish the content to the Moz user profile’s site list. So Moz was sending ranking firepower, as well as trust signals, to all of the domains that actually made the main site matter.
I’ve seen hundreds of link profiles containing similar kinds of high-volume, low-quality links, but most of those sites don’t rank well. In fact, this link profile was so bad that at first I thought someone had run a negative SEO campaign against that specific page. Of course, it could be that the webmaster had recovered easily, as negative SEO recoveries aren’t very rare these days, but even at first glance it somehow seemed to me that the site had never really been hit in the first place. Afterwards, the discovery of the Moz link (links, to be honest, since the links to the two other sites still passed some indirect value to the main site) made the picture clear to me.
Conclusion
This pretty much confirmed my own thinking that even spammy links, when coupled with trusted authority links, do help improve rankings in Google, and massively at that. I’m sure the Moz link by itself wouldn’t have made the page rank that high if the spammy links hadn’t supplied the actual power. The Moz link merely worked as a validator of all that power.
In the past, I had tested that spammy links work very well directly on super-high-quality authoritative sites, because those sites already have a very high domain-wide authority (not to be confused with Moz’s Domain Authority metric) that works as the validator. So, if you publish something great on a very popular site and it ranks at, say, #7, pushing it to the #1 spot won’t be that hard, even using low-quality links in good quantity. But this recent discovery has led me to a very interesting conclusion: adding more and more power to your site, or to a specific page, through either high-volume or high-PR link building is useless unless you can present Google with believable validation.
So, what’s your take on this? Have you noticed something similar in other types of SERPs?
Image Courtesy: Watch Dogs