You have perhaps been told that getting links from other sites in your website's niche is generally a good idea. Perhaps you have even received quite a few email requests to this effect.
Most of us SEO professionals know what this term means, but for the less geeky it simply means getting other site owners to link to your site from theirs.
Traditionally, this was done by exchanging links between sites. At first the search engines had no problem with this, but they soon began to devalue these reciprocal efforts – not just because of the potential for spam, but because real link spam was actually taking place in this area.
The word "spam" may bring canned ham to mind for some, but in internet lingo it means the junk email that floods your inbox and makes it difficult to find the real mail.
For example, some marketers send unsolicited commercial email to millions of unsuspecting users. Others created fake websites and pages with links back to their own commercial products, which created a need for Google and the other search engines to develop objective quality scores for the relationships between linking sites. In those early days, the search engines were not looking at things such as whether reciprocally linking sites were owned by the same group, or whether five linking websites sat on the same machine. Yet for anyone trying to slow down those who fill our inboxes with junk, that data is rather important for determining exactly who the spammers are.
It is said that successful backlinking depends very heavily on the keywords one chooses – and traditionally, this is where most linking efforts have failed.
Why? Because we have no idea exactly how others will link to our online assets – and, in direct contradiction to what you might read around the net, therein lies a big part of the problem.
Secondly, a casual reader is not likely to be an expert on niche market keywords, so you will most logically try to pick the keywords with the most traffic. Would this be the correct thing to do? A brand new website, even after being indexed by MSN or the other major search engines, typically has no chance of ranking for its chosen keywords for many months, if not years.
So, possibly a waste of time, right?
But there is yet one more major problem. The PageRank of a new article is N/A or, after indexing, typically zero – on a scale where zero is worst and 10 is best. Some may argue that a new page with an N/A or 0 rank has a freshness quotient that can help it, but in most search engines that zero, as evidence of a lack of credibility, will most likely work against it.
There are exceptions: if the newly created page sits on a highly popular Web 2.0 social network property such as Squidoo, Craigslist, Bebo or Scribd, to name a few, then it won't be penalized as much just because its current PageRank or credibility level appears to be zero.
We suspect these exceptions work because new pages on foundation sites – those with a credibility level of 5 or above – are thought to inherently acquire some of the PageRank, or PageTrust, of the site they rest on.
Technical babble galore – so the question really is: what's a girl to do?
Google would say: go back to fundamentals, focus on content, and be innovative. They would even strongly recommend that you create "link-bait" that makes others want to link to you. I like both the phrase and the thought – if you have any idea what this link-bait thing means. Ignoring Google's advice is always done at your own peril; however, I urge you to examine the issue more deeply. Do you really have the 6–18 months it takes to consistently create new articles almost daily, and to put out such a ferocious amount of intensely likable content in one spot that people would socially bookmark that page on your site? If the answer is no, then you understand why most of us will never intentionally create link-bait.
There have to be ways around this. What should one do?