Archive for June 9th, 2010
The SEO Vets Take All Comers panel with Danny, Rae, Bruce Clay, Alex Bennert, Vanessa Fox, Greg Boser, Todd Friesen, and Stephan Spencer is starting soon. I’ve got my seat in the front row and I’ll be liveblogging soon.
Screw it – Ustream: http://ustre.am/iWi6
Phone died during the Ustream. I’m going to pick up the liveblogging here.
Rae has lost her voice. One of the panelists said it’s an early xmas gift for the rest of us.
Bruce and Vanessa are talking about news headlines. Don’t use “giant wave” in the headline if you’re talking about a Tsunami. Bruce says they can’t correct after the fact on CNN – so they have to do it during creation.
Vanessa is talking about crawling again. Whoever asked this question didn’t go to the last architecture session. See my notes below for the answer here, I’m not retyping it.
Add &start=990 to a Google results URL to see the end of the results; &filter=0 will show the omitted results (e.g. http://www.google.com/search?q=example&start=990&filter=0).
Don’t worry about bounce rate affecting SEO – worry about bounce rate affecting conversions.
Since Cutts won’t give a straight answer, a good strategy is to ask him questions and judge his physical reaction.
Matt’s taking notes about changing the “omitted results” list’s name to “a list of crap.” I like that idea: “Would you like to see a list of crap from this website?”
Many SEOs here don’t believe official Google answers about crawlability. That’s shocking – why would Google lie about how they crawl the web? There’s no motive. Google wants everything crawlable and findable – it gives them a better search engine.
Should you ever stop link building? The answer: if you owned a brick and mortar store would you ever stop trying to get customers?
Nobody admits to buying links, yet lots of people do it. If you get penalized, you have to clean it up and file a re-inclusion request.
Not sure why there’s so much paranoia over linked networks of sites or paid links. You have to get pretty obsessive to get banned. In most cases, I don’t think many SEOs have to worry about links in bad places.
Google recently penalized Google Japan for buying links. If Google is willing to penalize themselves, they may penalize you. Google’s really good at finding links.
So what happens if your competitor buys links for your site? Vanessa says it’s unlikely that you’d get penalized in this case, as bans usually result from multiple factors.
Don’t buy links, just give out free phones in return for links.
Are Facebook and the Open Graph going to kill Google? No – but should you diversify into social media? Yes.
Rae says she’ll take any links, she doesn’t care where they come from. I’m in the same boat. One of the biggest traffic drivers to NoSlang.com is a nofollowed link on somebody’s site. Sometimes we lose track of why we needed links in the first place.
Don’t go crazy with universal search. Do you really want a video to rank higher than the link to your site where people can buy something? Most videos don’t give viewers a link to complete a conversion after watching; the regular listings that were ranking there do.
Alex Bennert says to email her if WSJ mentions your site but doesn’t link to it. Good to know, they’ve done that to noslang.com in the past.
June 9th, 2010
The live blog of “Build it Better: site architecture for the Advanced SEO” will start momentarily. Vanessa Fox, Adam Audette, Maile Ohye, Lori Ulloa and Brian Ussery will be speaking. Stay tuned here and refresh for my thoughts, insights, & recap.
Vanessa and Maile must have a lot of clout here, as there’s a Facebook session going on across the hall and the site architecture room is standing room only right now.
Lots of the room has complicated problems – not sure how many are related to search though.
Maile is up first. She’s using the Google Store as her example. It has 158 products but 380,000 URLs indexed. How does that happen?
First point: protocol and domain case insensitivity. http://www.example.com and HTTP://WWW.EXAMPLE.COM can be treated as different URLs.
She advocates a consistent URL structure to reduce duplication and facilitate accurate indexing. Suggestion: keep everything lowercase.
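A rough sketch of the kind of normalization she means – my own example, not something from the session – lowercasing the scheme and host so equivalent URLs collapse to one form:

    from urllib.parse import urlsplit, urlunsplit

    def normalize_url(url):
        """Lowercase the scheme and host so equivalent URLs collapse to one form."""
        parts = urlsplit(url)
        return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                           parts.path, parts.query, parts.fragment))

    print(normalize_url("HTTP://WWW.EXAMPLE.COM/Product?id=42"))
    # -> http://www.example.com/Product?id=42  (path case left alone; paths can be case-sensitive)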
301s and rel=canonical can mean slower crawling. Google crawls URLs returning 3xx and 4xx status codes less often than those returning 200. If your site is down for maintenance, return a 503 so Google knows the outage is temporary and your crawl rate doesn’t suffer.
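If you’re wondering what that maintenance case looks like in practice, here’s a minimal sketch with Python’s standard library (the handler, port, and Retry-After value are my own illustration, not from the session):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class MaintenanceHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # 503 tells crawlers the outage is temporary; Retry-After hints when to come back.
            self.send_response(503)
            self.send_header("Retry-After", "3600")
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"Down for maintenance, back soon.")

    if __name__ == "__main__":
        HTTPServer(("", 8080), MaintenanceHandler).serve_forever()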
Don’t be like the Google Store: use standard encodings and normal &key=value query parameters. No crazy formats in place of key/value pairs.
Google crawl prioritization.
Indexing priorities: URLs with updated content, and new URLs likely to have unique/important content.
XML sitemap information is also used (a small sitemap sketch follows after these notes).
Ability to load the site (uptime, load, etc) also comes into play.
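Since sitemaps feed into that prioritization, here’s a minimal sketch of generating one with Python’s standard library (the URLs and dates are made up):

    import xml.etree.ElementTree as ET

    def build_sitemap(pages):
        """Build a minimal XML sitemap from (loc, lastmod) pairs."""
        urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
        for loc, lastmod in pages:
            url = ET.SubElement(urlset, "url")
            ET.SubElement(url, "loc").text = loc
            ET.SubElement(url, "lastmod").text = lastmod
        return ET.tostring(urlset, encoding="unicode")

    print(build_sitemap([("http://www.example.com/", "2010-06-09"),
                         ("http://www.example.com/products", "2010-06-01")]))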
To increase Googlebot visits:
Strengthen the indexing signals above (links, uniqueness, freshness).
Use the proper response codes.
Keep pages closer to the homepage. More clicks away = less frequent indexing.
Use standard encodings.
Prevent the crawling of unnecessary content.
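On that last point, a small sketch of how a disallow rule keeps crawlers out of pages you don’t need indexed, checked with Python’s robotparser (the robots.txt rules and URLs are hypothetical):

    from urllib.robotparser import RobotFileParser

    rules = ["User-agent: *", "Disallow: /search", "Disallow: /cart"]

    rp = RobotFileParser()
    rp.parse(rules)

    print(rp.can_fetch("Googlebot", "http://www.example.com/search?q=shoes"))    # False
    print(rp.can_fetch("Googlebot", "http://www.example.com/products/widget"))   # True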
Improve “long tail” content. Be aware that we as webmasters call it long tail, but to users it’s the content they want.
Seek out and destroy duplicate content using rel=canonical and 301s. Google can find duplicates for you in Webmaster Tools.
Include microformats to enhance results with rich snippets. This gives the ability to include reviews, recipes, people, events, etc. Her example is the hRecipe format.
Create video sitemaps/mRSS feeds (only Google supports these currently; BingHOO says they’re working on it).
Adam Audette is up now and he’s shilling Vanessa’s book, Marketing in the Age of Google.
First, make the best user experience, then leverage that for SEO. That’s good advice.
He’s using Amazon as his example. Talking about the top and left nav and how they make search prominent.
Sweet, they’re giving out Jane and Robot stickers. Gotta get me one of those.
Google shows fewer images in search results depending on your screen resolution.
Adding dimensions to your images in addition to alt attributes can help browsers and search engines, and increase page speed.
Use .png files if possible. Much smaller than .gif or .jpg.
Use EXIF data, tags, geo information, and whatnot.
Looks like we’ve killed twitter too.
Things you can test:
1. Pages indexed – everybody knows how to do a site: query.
2. Canonicalization (www vs. non-www) – see the sketch after this list.
3. In-links using Yahoo Site Explorer (there was a plea to BingHOO to keep this tool alive).
4. Sitemaps – blah blah blah.
5. Site speed – I’m so tired of hearing about this; it’s not a big factor at all.
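For item 2, a quick sketch of checking whether the www and non-www versions collapse to a single canonical host (example.com is just a placeholder here):

    from urllib.request import urlopen

    for url in ("http://example.com/", "http://www.example.com/"):
        resp = urlopen(url)
        # urlopen follows redirects, so geturl() shows where each variant finally lands.
        print(url, "->", resp.geturl(), resp.status)
    # If both variants end at the same URL (ideally via a 301), canonicalization is in place.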
Maile says that Google still doesn’t want search results in their search results. Disallow your search pages. Category pages, however, are still very welcome.
Maile also said that Google uses toolbar page load times, not Googlebot crawl times, for site speed. Good to know, but I still don’t like people obsessing so much over speed.
text-indent: -999px is not a safe technique to use instead of alt text.
PubSubHubbub is an open protocol that lets you push content to search engines rather than waiting for them to crawl you. It’s not yet incorporated into Google’s pipeline.
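The publisher half of the protocol is just a form-encoded ping to a hub. A minimal sketch – the hub and feed URLs below are placeholders for illustration:

    from urllib.parse import urlencode
    from urllib.request import urlopen

    hub = "http://pubsubhubbub.appspot.com/"  # placeholder hub URL
    data = urlencode({"hub.mode": "publish",
                      "hub.url": "http://www.example.com/feed.xml"}).encode()

    resp = urlopen(hub, data)  # POST; the hub acknowledges the ping if it accepts it
    print(resp.status)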
June 9th, 2010
I’ll be live blogging the “So You Want to Test SEO” panel at 10:30 Pacific time. Check back here for live updates. This should be a good session.
Actionable, testing, etc. These aren’t words you normally hear when people talk about SEO. So glad to hear them. Several people in the room admit to testing SEO. One guy admits that he’s perfect and doesn’t need to test his SEO.
Conrad Saam, a director at Avvo, is speaking now.
He’s talking about statistical sampling and the term “statistically relevant” – I feel this is something that many SEOs fail at.
The average person in this room has 1 breast and 1 testicle. A good example of how averages can be misleading.
He’s now talking about sampling, sample size, variability, and confidence intervals. Also the difference between continuous and binary tests. This is very similar to my college statistics class.
His example of “bad analysis” looks awfully similar to some of the stuff I’ve seen on many SEO blogs. It would have been really easy for him to use an actual example from somebody’s blog.
Bad analysis: showing average rank change in Google relative to a control.
Good analysis: do a type 2 t-test (two-sample, equal variance). Excel can do that – a Python sketch follows below.
It’s all about the sample size when doing continuous testing
http://abtester.com/calculator – good resource for calculating confidence.
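A sketch of that t-test in Python instead of Excel, with made-up visit numbers (scipy’s ttest_ind with equal_var=True is the same two-sample, equal-variance test as Excel’s type 2):

    from scipy import stats

    # Hypothetical daily organic visits for control pages vs. pages that got the change.
    control = [120, 135, 110, 140, 125, 130, 118, 127]
    treated = [150, 160, 145, 158, 149, 162, 141, 155]

    t_stat, p_value = stats.ttest_ind(control, treated, equal_var=True)
    print("t = %.2f, p = %.4f" % (t_stat, p_value))
    # A small p-value suggests the difference is unlikely to be chance alone --
    # assuming a big enough sample and properly isolated variables.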
Common mistakes:
Seasonality.
Non-representative samples.
Non-bell-curve distributions.
Not isolating variables. This one is huge in SEO, as there are over 200 variables considered.
Eww… he’s talking about the Google sandbox. He just lost some cred with me, as I don’t believe in a Google sandbox – but it does make his point about testing SEO: we can’t be 100% sure our changes actually caused the results.
Next up, John Andrews
An SEO wants:
to rank better
robustness
to avoid penalties and protect against competition.
As an agency, one wants actionable data to help make the case for those goals.
Claims (that need to be tested):
PR sculpting does/doesn’t work.
Title tags should be 165 characters
Only the first link on a page counts.
There is no -30 penalty.
John says that authors of studies and blog posts place more value on the claims and not so much on the data. The difference between marketing and science is that marketers tell stories and make claims – scientists deal with data.
Problems with SEO studies:
Remarkable claims get the most attention.
Studies are funded by sponsors who have something to gain.
There’s virtually no peer review.
Success is based on attention not validity.
“citations” are just links – and not as valid as real citations.
Note to self: buy a copy of “The Manga Guide to Statistics.”
So how can we contribute?
Science is slow, boring, and not easy.
Most experiments don’t produce significant results.
Scientists learn by making mistakes.
As SEOs, we’re stat checkers. We’re too busy seeing how much we just made and how many visitors we just got to deal with experiments. That’s so true.
Tips: Publish your data without making claims. Be complete and transparent. Say “this is what I did and this is what I saw” and people will email you, cite you, or repeat your experiment. Invite discussion about your test.
A good example of this was Rand’s .org vs. .com test, where he didn’t account for Wikipedia bias and also didn’t allow for the fact that most .com domains were brand names (which he excluded).
When it comes to SEO testing, just say what you saw. Let the data tell the story and let others come up with the same analysis that you did. That’s science. Publishing claims is often just a push for attention. Man, that’s so true.
Next up, Jordan LeBaron.
Don’t trust Matt Cutts, test your own shit. Different things work in different situations.
Plan. Execute. Monitor. Share. Maintain Consistency.
Next up, Branko Rihtman – a molecular biologist who runs seo-scientist.com.
Define question, gather info, form hypothesis, experiment, analyze and interpret data, publish results, retest. That’s the scientific method.
Choose your testing grounds: don’t use real keywords or gibberish, use nonsensical phrases made of real words (like “translational remedy” or “bacon polenta”).
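A throwaway sketch for generating that kind of test keyword – two-word phrases built from real words that nobody actually searches for (the word lists are mine):

    import random

    ADJECTIVES = ["translational", "crumpled", "verdant", "oblong", "perforated"]
    NOUNS = ["remedy", "polenta", "ledger", "trellis", "gramophone"]

    def test_keywords(n=5, seed=42):
        """Generate n two-word nonsense phrases from real words for ranking experiments."""
        rng = random.Random(seed)
        return ["%s %s" % (rng.choice(ADJECTIVES), rng.choice(NOUNS)) for _ in range(n)]

    print(test_keywords())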
How to interpret data:
Does the conclusion agree with expectations? Does it have an alternative explanation? Does it agree with other existing data? Bounce the findings off somebody. Don’t present your conclusions as definitive.
Statistical analysis is hard. Get help from somebody who knows statistics. Understand correlation and causation, understand significance. Don’t rely on averages alone (see the sketch below).
Avoid personal bias. Don’t report what you want to see or what you thought you saw, report what you actually saw.
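On the “don’t rely on averages” point, a tiny sketch of how one outlier drags the mean while the median stays put (the visit counts are invented):

    from statistics import mean, median

    # Referral visits from ten links; one viral outlier skews the picture.
    visits = [12, 9, 15, 11, 8, 14, 10, 13, 9, 4000]

    print("mean:  ", mean(visits))    # 410.1 -- dominated by the single outlier
    print("median:", median(visits))  # 11.5  -- closer to the typical link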
You can learn a lot from buying Branko a Beck’s. It’s a known fact that scientists can’t hold their alcohol.
June 9th, 2010