Tuesday, January 17, 2012

The Enterprise SEO Guide To Response Codes

Response codes impact every page, image and file on your website.

A visiting search engine bot figures out what to do based on those codes. Incorrect response codes can cause:

  • Indexation problems;
  • Duplicate content;
  • Site performance problems;
  • All manner of other site higgledy-piggledy.

Enterprise SEO is all about big, site-wide wins.

Response codes are just that: They’re easy to set up. They have a broad impact. Seems like a slam-dunk to me.

And yet, when I checked 1,000+ large sites—’large’ meaning ‘more than 5,000 pages’—only 30% got their response codes right.

Thirty. Percent.

With that, I dust off my response code tutorials, and write this quick guide to response codes for enterprise website developers, SEOs and anyone else who will listen:


The Big Three Response Codes


There are three response codes you want to know the most about:
  • 404. Page not found. If a file simply doesn’t exist, your server should deliver a 404 status. You can use a 410 response if you want Googlebot to retry the bad URL less frequently.
  • 301. Page permanently moved. If you’ve permanently removed one URL and replaced it with another, use a 301.
  • 302. Page temporarily moved. If you’ve removed something and will be putting it back, use a 302.

There are others: 200 means ‘OK’. Hopefully, you’ve got that one squared away.
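
Not sure what your own server actually sends back? You can check in a few lines of code. Here's a minimal PHP sketch using curl; the URL is a placeholder, so swap in one of your own pages:

    <?php
    // Check what status code a URL actually returns.
    // The URL below is a placeholder; use one of your own pages.
    $url = 'http://www.example.com/some-page';

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request: headers only
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't print the response
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    // curl doesn't follow redirects by default, so a 301 or 302
    // shows up as itself, not as the destination page's code.
    echo $url . ' returned HTTP ' . $status . "\n";
    ?>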


Page Not Found Responses


Most important: If a browser or bot visits your site and attempts to load a file that does not exist, it should get a 404 response.

404 is how a server says “Uh, that file isn’t here.” It’s not a bad thing. It’s the right answer when someone clicks a broken link, or a page is just gone.

You can get tricky with redirection if you want to try to preserve link authority of a deleted page. But the default answer for a missing page should be 404.


The problem: Many sites deliver a 302 temporary redirect, a 301 permanent redirect or, even worse, a 200 ‘OK’ response instead. This leads to massive site duplication and terrible crawl efficiency. Visiting bots spend their time crawling worthless content.

Possible causes and solutions:
  • .NET loves to take over control of 40x errors, replacing them with a 302 redirect to a friendly error page. That’s nice. But totally wrong. Turn off .NET’s 404 handling and let IIS take over, instead. You can still have a friendly error page.
  • A misguided developer may have thought that redirecting all ‘not found’ errors to your home page helps users. It doesn’t. It’s totally confusing, like going into a revolving door and coming out at some random location. Provide a friendly 404 page that explains something went wrong and provides options.
  • Someone may have set up a redirect page that uses a JavaScript or meta refresh to then reroute visitors to a ‘best guess’ page. See the previous item—same problem.
  • If your site’s on PHP, it may be using header('location: /'); die();. Send a real 404 with header("HTTP/1.1 404 Not Found"); instead (there’s a sketch after this list).
  • Your site just delivers a 200 ‘OK’ code no matter what. I have no idea why you’d do this, but I’ve seen 100-200 sites that do. Change it.
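
If you're hand-rolling the fix in PHP, the right pattern looks something like this minimal sketch (the error-page path is a placeholder; adjust it for your site):

    <?php
    // A missing page should get a real 404 status plus a friendly page.
    // The include path is a placeholder; point it at your own error page.
    header('HTTP/1.1 404 Not Found');  // the status most broken setups skip
    include $_SERVER['DOCUMENT_ROOT'] . '/errors/404.php';
    exit;
    ?>

Users still see a helpful page; bots get the honest 404.
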
A 410 response is OK, too. It causes Googlebot to remove a URL from the index more quickly, and to retry the URL less often. You can read up on 4xx codes, and just about every other status code, on the W3C's Status Code Definitions page.

What Kind Of Redirection?

Redirects are a powerful SEO tool. They let you consolidate authority in the right places. But you have to do ‘em right.

A 301 code tells a visiting bot or browser that the page it’s requesting is gone, forever, and gives the URL of the replacement page. Use this to consolidate authority and resolve basic canonicalization issues.

A visiting search bot will transfer some of the authority of the old URL to the new one. It will also eventually stop visiting the old URL, replacing it with the new one.

A 302 code tells a visiting bot or browser the page it’s loading is gone, but only temporarily. A visiting bot will keep returning to the old URL, checking to see if the page is back.


The problem: As near as I can tell, large sites randomly mix 302 and 301. They lose authority in some cases, and force bots to crawl permanently removed content again and again.

Possible causes and solutions:
  • IIS 6 and earlier didn’t have a nice, clear button that said “Make this a 301 redirect”. Instead, you have to check “A permanent redirect for this resource”. By default, that box is unchecked, so the default behavior is a 302 temporary redirect.
  • You’re writing redirection into your Web application, but you left out the status code. Some servers are configured to default to a 302 temporary redirect if you don’t set the status code to 301 explicitly. There’s a PHP sketch below.
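
PHP is an easy place to trip over that second item: header('Location: ...') on its own sends a 302, because PHP fills in the status for you. A minimal sketch of the fix, with placeholder URLs:

    <?php
    // This alone sends a 302, because PHP defaults the status
    // when it sees a Location header:
    //   header('Location: http://www.example.com/new-page');

    // For a permanent move, set the 301 status explicitly.
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://www.example.com/new-page'); // placeholder URL
    // Equivalent: header('Location: http://www.example.com/new-page', true, 301);
    exit;
    ?>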

It’s Not That Hard

No matter how complex the server infrastructure, getting the big three response codes (404, 301, 302) right makes for site-wide wins. If you’re running a big site, look to your response codes. Get ‘em right. It’ll boost SEO, performance and user experience.

Wednesday, January 11, 2012

7 Quick Tips to Keep Your SEO On Track in 2012

Once again, we've rounded the corner on a new year, and many of us find ourselves re-evaluating our performance across many of life's facets, whether it's how much we exercise or how well we eat. Just as we continually yearn to better ourselves, we have to take a moment and reevaluate what we're doing to keep achieving organic search visibility.

Taking a look at your overall SEO strategy from time to time is a great way to slow down, breathe, and see if you're still on track for SEO success. Any long-standing effort in life deserves an evaluation from time to time.

No SEO success is realized without solid benchmarks. It's important to continually compare progress to past dates to assess improvement.

A new year provides a great point to review year-over-year data and get a big picture of SEO success without the seasonality and other factors that can mar short-term analysis. Beyond assessing numbers and percentages, it's also a great time to review overall strategy, consider where SEO is going, and adjust accordingly. Plus, it's nice to get away from all those numbers every once in a while!

It’s a New Year…Time for a Review of Your SEO Campaign!


[Image: a "Mad Max"-style car]


So, you've spent the last year building that "sports car" of a site. In actuality, many of you have been trying to keep abreast of the fast-moving world of SEO and find yourselves looking at something more closely resembling a "Mad Max" vehicle. For those of you who aren't Mel Gibson fans, I'm alluding to the fact that you've added to the site little by little as new SEO opportunities and trends emerged, and you now find yourself looking at a pieced-together site that isn't so pretty.

I must admit that I'm guilty of this from time to time. I'll take a look back and see that we've added a link, a page, a content snippet, etc. here and there, and when I look at the big picture I have multiple links to the same page on a given site page, over-usage of keywords, or, even worse, a lack of intended keyword focus on a page. You've been so busy monitoring the day-to-day worries of rankings, traffic, 404s, 301s, duplicate content, and on and on that you haven't doubled back to see what the compilation of your team's efforts is portraying.

So, how do we review our SEO strategy and ensure we stay on track?

1. Site Mission and KPIs

Revisit the mission of the site/company, as well as the KPIs for the site. We know where we want to go with the site; are we still on track?

2. Review Annotations in Google Analytics Timelines

Hopefully you're extremely organized, have annotated all changes and implementation dates for SEO initiatives, and don't have to fish through email for hours.

3. Look at Your Link Profile

Utilize a tool such as Open Site Explorer and review your overall anchor text counts. Have you gone hog wild in the last year with non-branded keyword anchor text and forgotten about branded linking?

4. Assess Top Keywords

Review Google Webmaster Tools and assess the top keywords and their variations found on your site. This helps provide a holistic view of what Google understands your site content to represent.

5. Review Internal Page Links

Stay in Google Webmaster Tools and review your most heavily linked internal pages. You might be surprised to find that you have unknowingly added certain page links across the site (in navigation, etc.) over the last year, and these now show as more important than other key site pages.

6. Are You Ranking for the Right Keywords?

Now, head over to your site. Review your targeted keywords, the pages you intend to rank for them, and the pages that actually rank for those terms. That is, if they still rank.

Ensure that if you've added additional content, whether it's text or images, it still helps to support the keyword theme of the page. Have you added internal links into the copy whose anchor text may be too similar to the respective page's keyword theme, confusing search engines? In other words, don't link from the bicycles page to a product page with the anchor "bicycles."

7. W3C Validation

You've had a year of designers/developers making continual changes to the site. If you haven't been monitoring this continually, run a W3C validation to ensure your code is still clean, as well as page load tests to identify any page elements dragging down load time.
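
If you'd rather script the check than paste URLs into the validator by hand, the W3C markup validator reports its verdict in an X-W3C-Validator-Status response header (Valid, Invalid or Abort). Here's a rough PHP sketch, assuming the validator still exposes that header; the page URL is a placeholder:

    <?php
    // Ask the W3C validator to check a page, then read its verdict
    // from the X-W3C-Validator-Status response header.
    $page  = 'http://www.example.com/'; // placeholder: the page to validate
    $check = 'http://validator.w3.org/check?uri=' . urlencode($page);

    $ch = curl_init($check);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_HEADER, true); // keep response headers in the output
    $response = curl_exec($ch);
    curl_close($ch);

    if (preg_match('/X-W3C-Validator-Status:\s*(\w+)/i', $response, $m)) {
        echo $page . ' markup is: ' . $m[1] . "\n"; // Valid, Invalid or Abort
    }
    ?>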

Summary

Granted, this quick spot check of your SEO strategy doesn't ensure SEO domination in the new year, but you may be surprised how a 30,000-foot view can reveal that too many small steps to the left or right have left you far off the path in your journey toward SEO success. Now, hopefully you've cleared your head of any SEO insecurities and have more time to get back to your other New Year's resolutions!

Source: http://searchenginewatch.com/article/2136668/7-Quick-Tips-to-Keep-Your-SEO-On-Track-in-2012