Devolution Back To META Tags
May 2nd, 2007 Ryan Jones
Back in the beginning of search engines there were META tags. By today’s standards the 90’s search engines were piss-poor at determining the relevance of web pages, so webmasters relied on META keywords and descriptions to tell the search engines what their pages were about.
Since the “relevance” of a website was determined by the webmaster, it didn’t take long for any popular search to be deluged with porn and auto loan websites. Something needed to be done.
Google jumped in with PageRank and hasn’t looked back since. Clearly all search engines are thousands of times better than they were in the 1990’s. Unfortunately, that doesn’t stop people from trying to game them.
Whenever SEOs figure out a way to game the engines, the engines usually fight back. Not too long ago all the search engines got together and came up with the rel=nofollow attribute – sometimes called the link condom. rel=nofollow gave webmasters a way to say “hey, I’m linking to this but I don’t support it; so don’t give it any link juice.”
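In markup it’s just an attribute on the anchor tag (example.com here is a stand-in for any URL you’d link to):

```html
<!-- A normal link counts as a "vote" for the target page: -->
<a href="http://example.com/">a site I vouch for</a>

<!-- Adding rel="nofollow" tells the engines not to pass any link juice: -->
<a href="http://example.com/" rel="nofollow">a site I link to but don't endorse</a>
```

The two links look identical to a human reader; only the crawler treats them differently.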
As far as the web was concerned it was the first step away from “relevance by majority” back to “tell me what your site is about.” It was also the first time that search engines like Google encouraged webmasters to do something for the search engines, not the users.
The Google Webmaster Guidelines constantly preach to “Make pages for users, not for search engines.” Clearly rel=nofollow doesn’t help the user viewing the web page. It was a step away from this guideline.
Today Yahoo took another step away from this guideline by introducing robots-nocontent. robots-nocontent is a CSS class that webmasters are supposed to use to tell Yahoo that a chunk of content isn’t important to the search engine – only to the user.
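In practice you slap the class on any element whose contents you want Yahoo’s crawler to skip when indexing the page. Something like this (the footer text is just an illustration):

```html
<!-- Yahoo's crawler is told to ignore this block when ranking the page: -->
<div class="robots-nocontent">
  Copyright 2007 Example Corp. | Privacy Policy | Terms of Use
</div>

<!-- Everything outside the tagged element is indexed normally: -->
<div>
  The actual article text goes here.
</div>
```

Note that the class does nothing visually – it exists purely as a signal to one search engine, which is exactly the problem.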
To me, it seems like we’re devolving back into the “tell me what your site is about” theory of the 1990’s META tags. Is the future of search going to rely on the past of search? What do you think?
What’s worse is that I can see this CSS class causing lots of problems. For one, it forces the webmaster to spend time making changes that won’t help the user but could potentially hurt his own rankings. I can’t see many webmasters doing that.
Additionally, what happens when a webmaster leaves out a closing tag or improperly nests his HTML tags? Does it cause Yahoo to think that everything after the robots-nocontent tag isn’t important? Can a webmaster accidentally tell Yahoo his whole site isn’t relevant?
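To make the worry concrete, here’s a hypothetical page with a missing closing tag – how Yahoo’s parser actually recovers from this isn’t documented, which is the point:

```html
<!-- The </div> that should close the nocontent block is missing: -->
<div class="robots-nocontent">
  Sidebar ads and navigation links
<!-- Browsers will render the article fine, but a crawler that scopes
     robots-nocontent by element nesting may treat everything below
     as part of the "ignore me" region: -->
<div>
  The entire article body.
</div>
```

A browser quietly repairs markup like this for display, but a crawler deciding what to index has to guess where the tagged region ends – and the webmaster has no way to see what it guessed.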
Entry Filed under: Main
1 Comment
1. Elias | May 3rd, 2007 at 12:29 pm
I think you’re right – it would be one thing if usage such as this evolved naturally, but when one or more search engines attempt to artificially put it into place, they’re basically giving the playbook to the other team.
rel=nofollow is less egregious, since it negates your vote for someone else. I can’t offhand think of a way that could be abused, at least not as easily as the “porn-goes-here” class Y! suggests (what would a negative link farm do?).