By now you’ve probably noticed the links in the side menu, so you may already know that noslang.com is one of my websites. I’ve been dedicating weekends to working on it and my other sites. Well, I’ve been slacking – big time!
I usually check the user-submitted words every weekend… but I haven’t done so in about three weeks. I’ve just been busy: I was in Texas a couple of weekends ago, did some FeedButton updates last weekend, and so on.
So I just logged in, and boom! 900 words waiting to be approved. This is going to take some time.
If you’re not familiar with my approval process, it goes like this:
1.) Check the word to see if it makes sense
2.) Make sure everything is spelled correctly (I don’t fix misspellings – there’s really no time)
3.) Google for the word.
If it makes sense, is spelled correctly, and exists on at least one other forum or website, then I add it. If not, I kill it. That’s part of what makes NoSlang the best internet slang dictionary on the internet (and the only internet slang book, too!). While other sites try to build a large database by paying per submitted word, NoSlang actually takes hours upon hours of human evaluation. It’s starting to get to be too much.
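For the curious, here’s a rough sketch of what pre-screening the first two checks could look like in code. To be clear, this is not how NoSlang actually works behind the scenes – the file names and dictionary path are just assumptions for illustration. The “makes sense” and Google checks still need a human, so the script only prints a search link for each surviving candidate:

# Hypothetical pre-screening helper -- NOT the real NoSlang pipeline.
# Assumes submissions.txt holds "word<TAB>definition" lines and that a
# plain-text word list exists at /usr/share/dict/words.
from urllib.parse import quote_plus

def load_dictionary(path="/usr/share/dict/words"):
    with open(path) as f:
        return {line.strip().lower() for line in f}

def definition_spelled_ok(definition, dictionary):
    # Rule 2: every word in the definition should be a real, correctly
    # spelled word (punctuation stripped before the lookup).
    words = [w.strip(".,!?;:'\"").lower() for w in definition.split()]
    return all(w in dictionary for w in words if w)

def screen(path="submissions.txt"):
    dictionary = load_dictionary()
    with open(path) as f:
        for line in f:
            word, _, definition = line.rstrip("\n").partition("\t")
            if not definition or not definition_spelled_ok(definition, dictionary):
                print(f"KILL  {word}: definition fails the spelling check")
            else:
                # Rules 1 and 3 still need a human: does it make sense,
                # and does it appear on at least one other site?
                print(f"CHECK {word}: https://www.google.com/search?q={quote_plus(word)}")

if __name__ == "__main__":
    screen()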
Does anybody else have a few hours they could donate to this for free? I wouldn’t think anybody would – but it can’t hurt to ask, right?
January 21st, 2007
While more and more companies are entering a paid search battle to get more traffic, it seems that newspapers are trying to do just the opposite. This is especially the case in Belgium. First there was the court case that Google lost, in which a Belgian court ruled that Google couldn’t index the country’s newspapers. Now, as TechDirt points out, it looks like they’re going after Yahoo.
In an effort to help out, I’m now going to tell all newspapers how to get their sites de-listed from ALL search engines without having to hire a lawyer.
Step 1: Create a new text file called robots.txt. The easiest way to do this is to click Start in the lower-left corner, then click Run and type in “notepad robots.txt”. A Windows prompt will come up asking whether you’d like to create the file. Select Yes.
Step 2: Enter the following text:
User-agent: *
Disallow: /
Don’t forget to save this file.
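Incidentally, if you only wanted to block Google and Yahoo specifically (and leave other crawlers alone), you could target their user agents – Googlebot and Slurp – instead:

User-agent: Googlebot
Disallow: /

User-agent: Slurp
Disallow: /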
Step 3: Upload this file to your server. You may have to ask somebody on your tech team for help with this. Another way is to go to Start, then Run, and this time type in ftp://Username:[email protected] (of course, replace the username, password, and site name with your actual details). Then you can just drag and drop the robots.txt file into the window. Congratulations! You’ve now blocked Google AND Yahoo from visiting your website, and it didn’t even cost you any retainer fees.
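And if you’d rather script the upload than drag and drop, a few lines of Python’s standard ftplib module will do the same job. The hostname and credentials below are placeholders, naturally:

# Minimal upload sketch using Python's standard ftplib.
from ftplib import FTP

ftp = FTP("yoursite.com")           # your web server's FTP host (placeholder)
ftp.login("Username", "Password")   # your real FTP credentials
with open("robots.txt", "rb") as f:
    # Upload the file -- make sure your FTP login lands in the web root,
    # since robots.txt must sit at the top level of the site.
    ftp.storbinary("STOR robots.txt", f)
ftp.quit()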
Of course, you’ve also cut off millions of potential visitors, which raises the question: why are you even putting your articles online if you don’t want anybody to see them?
January 21st, 2007