Seriously, I wish everybody would stop panicking and get over it. It’s just the flu. It responds to Tamiflu – and the only people dying are those with no access to health care, or the elderly who are already in bad condition.
The swine flu isn’t going to be the next plague – no matter how much fearmongering the news does.
By now I’m sure everybody is familiar with the “disgusting Dominos video” featuring 2 moronic Dominos employees putting cheese up their noses and onto a sandwich. If you haven’t seen it, you can see it here:
The video quickly spread all over the internet and received more than a million views in its first 24 hours online. Since then, Dominos has done a pretty awesome job of cleaning up the mess. They managed to do almost all of the right things, though a few of them came too late.
Shortly after firing the employees and pressing charges against them, Dominos’ president issued the following response:
The crucial lesson here is that although it seems like he’s reading a script, Dominos issued their response using the same medium that the original video aired on: YouTube. Instead of simply putting out a corporate press release and filing it away in an obscure portion of their website, Dominos joined the conversation already in progress. That’s how you have to deal with social media.
Shortly after the incident, Dominos set up a Twitter account and started answering questions on it. I can say firsthand that they answered a LOT of questions and responded to a LOT of people. When I tweeted at them, they tweeted right back within 5 minutes with a great response.
The question I have to ask, though, is where was their Twitter presence prior to this incident? If your company isn’t on Twitter now, let this serve as a wake-up call that you should be. Had Dominos been there in the first place, they could have set the PR machine in motion before this meme even got popular. It will also be interesting to see if they continue to use this new Twitter account to actively monitor and engage the community. I hope they do.
The lack of a Twitter account also brings up other issues. If you do a search for Dominos, you’ll notice that most of the results on Google are about this recent video incident. From a reputation management point of view, that’s bad. Had they had a Twitter, Facebook, Myspace, and other social networking presence, those profiles would be showing up in the search results instead of all the bad PR about these employees. It’s a great side effect of having active social networking profiles.
If the Dominos incident can teach us anything, it’s the importance of establishing an early and dominant position on the social networks. Brands simply have to involve themselves in the active discussions concerning them. Not only that, but they can’t just use these accounts to push corporate marketing messages. Customers want people with a voice; people like Matt Cutts of Google or Scott Monty of Ford or Robert Scoble of Rackspace or Jason Calacanis of whatever the hell he’s working on now. Having a social networking presence doesn’t do any good unless it has a personal style to it as well.
Overall, I think Dominos did a great job of handling this incident – and the failure of their stock price to fall today shows that investors think so too. It may be a while before people get over the mental image and order again (at least not a sandwich), but because of their quick and effective communication I don’t think they’re going to suffer long term. Hopefully your company can learn from this in case you ever have to deal with this type of situation.
There’s an interesting discussion developing in the SEO community about whether or not SEO people should have intimate knowledge of HTML.
The common analogy being thrown around is that HTML is to SEO what anatomy is to a doctor. For the most part, I agree with this statement.
As I talk with more and more people involved in SEO, I’m always shocked at how few of them actually know how to program. Not only should a good SEO know HTML, but they should also be familiar with .htaccess, robots.txt, PHP, ASP, JavaScript, Flash, and even some basic graphic design.
The reason is simple: all of these factors affect SEO, and the more in-depth knowledge you have of them, the better prepared you’ll be to improve a site’s ranking.
The biggest advantage comes from knowing the problems your other teams are going to face and being able to anticipate the objections they might raise. When your coding team says “we can’t do that,” it helps to be able to offer suggestions. You’d be surprised how many development teams aren’t familiar with Apache mod_rewrite directives or PHP 301 redirects.
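For instance, here’s a minimal sketch of a 301 redirect done in PHP (the old and new URLs are made up for illustration – this goes at the top of the old page, before any output is sent):

<?php
// Hypothetical example: permanently redirect an old URL to its new home.
header("HTTP/1.1 301 Moved Permanently");
header("Location: https://www.example.com/new-page/");
exit;
?>

The .htaccess equivalent is a single mod_rewrite RewriteRule with the [R=301,L] flag.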
It’s important to not only know about these concepts, but to know how they’re implemented as well. That’s one strategy that’s benefited me greatly in my career. Knowing how to write the code myself has helped me not only prioritize projects, but also cut through delays from outsourced programming teams.
Making a recommendation to replace images with text just isn’t as well thought out unless you actually know the work required to do so. You can’t truly SEO a site to its full potential unless you’re able to look at the complete picture and see every factor of the site.
It’s almost a sick paradoxical joke to say SEOs need to have technical knowledge though, as oftentimes technical people don’t make the best SEOs. A good SEO also needs to be familiar with sales and marketing – as they’re going to spend a lot of time in those areas as well. Much of the job requires presenting, talking, and generally convincing people that you’re right. Salesmanship is clearly a very valuable SEO skill – not just in acquiring business, but in getting client and developer sign-off too.
It’s not crucial that you be a l33t code monkey to be an SEO, but if you’re one of those who don’t think it’s important to know HTML, you should probably start looking for a new career – your SEO one isn’t going to go very far.
In fact, the only thing this study linking Facebook use to lower grades shows is that correlation does not imply causation.
What’s most likely at work here is a simple path of logic that got misconstrued.
Kids who study less get worse grades
Kids who study less have more free time
Kids with more free time use Facebook more
It’s not that Facebook is preventing them from studying at all – it’s more likely that after they made the decision not to study, they started killing time on a fun site like Facebook.
I’m sure if you did another study, you’d find the same correlation with TV, being in a band, or anything else kids deem fun.
If anything, the problem here is that kids aren’t spending enough time studying. It has nothing to do with Facebook or any other activity.
Got into a brief discussion today about whether certain pages should only be linked to with rel=nofollow, excluded in the robots.txt file, and/or given a meta noindex tag.
Is going with all 3 overkill? Which ones are necessary? And what’s the easiest best practice to implement for a site with over 800 pages?
First of all, we should clarify what exactly these three choices are.
rel=nofollow tells a search engine not to follow that particular link. It not only prevents PageRank from flowing through the link, but it also keeps the page from being indexed IF the search spider doesn’t find any other links to it.
Robots.txt exclusions tell a search spider not to crawl or access that particular page.
META NoIndex tells a search engine not to list that page in its search results.
These may all sound similar, but there are some very subtle differences here. To help understand these differences, it’s best to understand what types of pages we’d likely apply them to.
Examples of pages you don’t want indexed or crawled include:
Search results pages
Thank you pages
Error pages
Steps in a signup process
Any other page you wouldn’t want a user to start on or see out of context
Basically, if (by some odd fate of chance) a user searches for something and comes upon my “thank you, you have been unsubscribed from my newsletter” page, that user is going to be lost to me. Additionally, they’re going to be confused as hell about the content of the page. Did it really unsubscribe them from something?
The old school way of preventing this was simply to list the page in robots.txt so that the spiders couldn’t crawl it – but that alone isn’t enough. Looking at our list above, robots.txt only says not to crawl a page. It doesn’t say anything about listing it in the search results; and that’s exactly what happens. If somebody else links to a page that’s forbidden in your robots.txt file, search engines may still show that page’s URL in their results pages. They won’t have any information about it, but it will still be possible for users to click the link.
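For reference, a robots.txt exclusion is just a Disallow line or two (the paths here are made up for illustration):

User-agent: *
Disallow: /unsubscribe-thanks.html
Disallow: /search/

Remember, this only tells spiders not to crawl those URLs – it doesn’t keep them out of the results pages.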
The other problem is that suddenly all of your form action & result pages are listed in robots.txt. This can provide valuable information to attackers and other people interested in compromising your website. For that reason, I prefer not to use robots.txt for this task.
Going with rel=nofollow instead eliminates that tell-tale list of pages in robots.txt, but it’s also not very effective at keeping pages out of the search results. The problem with rel=nofollow is that it just doesn’t scale. I can tell the search engines not to follow my link, but what about somebody else who links to that page? I can’t count on that not happening, and I certainly can’t count on them putting the nofollow attribute on their link either.
That’s where the Meta NoIndex tag comes in. No matter how the spider ends up on the page or who linked to it, the NoIndex tag will always be there to tell the search engines not to index the page. Additionally, search spiders will still crawl the page and follow any links on it, which can be useful to people trying to manually shape their PageRank flow.
For those of you curious, the tag looks like this:
<META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW">
So what do I do?
I use a two-fold method. First, I make sure to put a meta noindex tag on any page I don’t want indexed. Second, I always make sure to put a rel=nofollow attribute on any links to that page from my own site. This way, I keep my PageRank flowing how I want it and prevent my confirmation pages from being listed in search engines.
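As a rough sketch (using a made-up thank-you page URL), the two pieces look like this. On the page itself:

<META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW">

And on any internal link pointing to it:

<a href="/unsubscribe-thanks.html" rel="nofollow">unsubscribe</a>

The meta tag keeps the page out of the index no matter who links to it, and the nofollow on my own links keeps my PageRank flowing where I want it.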