Wednesday, December 28, 2011

Google+ Passes 62 Million Users, Estimated To Hit 400 Million By End Of 2012


A flurry of activity has circled Google+ as of late, and the effects are beginning to show. TV commercials, enhancements and cross-promotion on Google products have helped Google+ flourish in the month of December. Paul Allen, the self-proclaimed “Google+ unofficial statistician,” has pegged the Google+ user base at 62 million as of December 27th.


This is a lift of 12 million users within the month of December, and has Google+ poised for its largest month-to-month volume increase since October. It means that nearly a quarter of all Google+ users (24%) joined in the month of December. Google+ is now growing by approximately 625,000 users a day.

Google+’s growth in 2011:

  • July 13 – 10 million
  • August 1 – 20.5 million
  • September 1 – 24.7 million
  • October 1 – 38 million
  • November 1 – 43 million
  • December 1 – 50 million

Allen also has some staggering estimates for the continued growth of Google+. At the current growth rate, Google+ would reach 293 million users by the end of 2012, but Allen is even more bullish: he expects the acceleration to continue and predicts Google+ will end 2012 with over 400 million users.

                                                   Image Courtesy of Paul Allen

If these numbers are remotely accurate, 2012 should be quite prosperous for Google’s newest foray into social media. All information and Google+ stats courtesy of Paul Allen.

Source: http://marketingland.com/google-passes-62-million-users-estimated-to-hit-400-million-by-end-of-2012-2193

Saturday, December 17, 2011

Delicious Gets A Mini-Makeover – Updates To Design, Usability & Media

The popular bookmarking site Delicious rolled out a new look this week. A major redesign of Delicious launched in September; this update builds on those changes and is a byproduct of user feedback.

Delicious was repurchased by its original founders in April, and the team has been working on major feature upgrades since the transition. The new enhancements are geared to make content discovery easier as well as improve the look of the site.

Cleaner Design

While the September redesign gave Delicious the visual push it required, some elements were still outdated. The new look keeps things clean, consistent and modern.



Delicious stated the following about the new design:
Giving Delicious an interface flexible enough to add new features moving forward was a strong consideration. It required us to take into account overarching themes like how content types get presented (stacks, links, actions, and so on) to minute details like a consistent button style.

Better Discovery With Stacks

The largest improvement to Delicious this year has overwhelmingly been the addition of user-created stacks. These let users group (or stack) similar content together in an easy-to-consume format. This week’s improvements make browsing stacks and finding information even easier, and a new navigation bar simplifies browsing by topic:


A new change has also been implemented for users who are browsing stacks. More focus has been placed on images and video as the new layout resembles a newspaper format:



The Delicious team has been hard at work lately, with more to come soon. The lead designer and senior software engineer released the following statement about the recent changes:
Delicious has been a work in progress since the beta re-launch, with our attention primarily on completing the migration from Yahoo!. We’re focused now on innovating Delicious to empower web discovery, and actively building the team and dedicating the resources to make it happen.
So stay tuned for more improvements, enhancements & marketing opportunities from the revived Delicious team.

Source: http://marketingland.com/delicious-gets-a-mini-makeover-updates-to-design-usability-media-1602

Friday, December 2, 2011

Do As I Say, Not As I Do: A Look At Search Engines & SEO Best Practices

Now that the holidays are upon us, we all probably could use some cheering up. So I thought I’d have some fun with our favorite search engines: Google, Yahoo, Bing, YouTube, and Blekko.

At Nine By Blue, I have been developing software that automatically checks sites for technical SEO best practices. Normally we run it on our clients’ sites to quickly check for issues and monitor them for any future problems.

But I was curious to see what I would find if I pointed the software at some typical pages on the search engines’ sites and then compared their implementations with the technical SEO best practices that we typically recommend.

Below is a list of some of the issues I found, in no particular order.

Disclaimer #1: This list is intended to point out how difficult it is to fully optimize a site for SEO, especially large-scale enterprise sites. I’m not claiming that I could have done any better, even if I had full control of these sites.

Disclaimer #2: Yes, I’m aware of Google’s SEO report card, but I have never read it because it is too long. Also, I didn’t want to be influenced by it.

Use a Link Rel=Canonical Tag On The Homepage

Most of the sites that I reviewed had many different URLs that lead to the home page. This can be because of tracking parameters (e.g., http://www.site.com/?ref=affiliate1), default file names (e.g., http://www.site.com/index.php), or even duplicate subdomains (http://www1.site.com/).

Because of this, I always recommend putting a link rel=canonical tag on the home page. This ensures that links to these different home page URLs all get counted as pointing to the same URL. I also recommend adding this tag for any other pages that might have similar issues.

I was surprised to find that Bing was the only site with a proper link rel=canonical tag on the home page.
YouTube also has a link rel=canonical tag, but it points to an improper relative URL, “/”, instead of the full URL “http://www.youtube.com/”.
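As a rough illustration of the kind of check involved, here is a minimal Python sketch (not the author's actual software) that extracts a page's link rel=canonical tag and flags a missing tag or a relative href like YouTube's:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.href = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical" and self.href is None:
            self.href = a.get("href")

def check_canonical(html):
    """Return (href, problem); problem is None only for a full absolute URL."""
    finder = CanonicalFinder()
    finder.feed(html)
    if finder.href is None:
        return None, "missing link rel=canonical tag"
    if not urlparse(finder.href).netloc:
        return finder.href, "relative canonical URL; use the full absolute URL"
    return finder.href, None
```

For example, `check_canonical('<link rel="canonical" href="/">')` flags the relative “/” described above.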

Avoid Duplicate Subdomains & 301 Redirect Them To The Main Subdomain

With a few exceptions, I have been able to find a duplicate copy of the sites that I review.

I have a list of typical subdomains — like www1, dev, api, m, etc. — that will generally turn up a copy of the site. Other duplicate copies of a site can be found at the IP address (e.g., http://192.168.1.1/ instead of http://www.site.com/) and by probing DNS for additional hostnames or domains.

These duplicate subdomains or duplicate sites have a negative effect on SEO because they make the search engines crawl multiple copies of your site just to get one copy. It can also cause links intended for a particular page to be spread out among multiple copies, reducing the page’s authority.

The best way to fix this is to use a permanent (301) redirect to the canonical subdomain’s version of each URL. If that isn’t possible, then a link rel=canonical tag pointing to the canonical subdomain’s page will work almost as well.
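A hypothetical sketch of that redirect logic (the hostnames here are illustrative, not any real site's configuration): given a request URL on a known duplicate subdomain, compute the canonical URL to 301 to:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical duplicate subdomains and canonical host for an example site.
DUPLICATE_HOSTS = {"www1.example.com", "www2.example.com", "dev.example.com"}
CANONICAL_HOST = "www.example.com"

def canonical_redirect(url):
    """Return the canonical-host URL this request should 301 to,
    or None if the host is already canonical (or unknown)."""
    parts = urlsplit(url)
    if parts.netloc.lower() not in DUPLICATE_HOSTS:
        return None
    return urlunsplit((parts.scheme, CANONICAL_HOST, parts.path, parts.query, parts.fragment))
```

The server would respond `301 Moved Permanently` with the computed URL in the Location header; `canonical_redirect("http://www1.example.com/about?x=1")` yields "http://www.example.com/about?x=1".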

For example, an entire duplicate copy of Bing.com is available at http://www1.bing.com/. Compounding this is the fact that the page has a link rel=canonical tag also pointing to http://www1.bing.com/ and all the links on the page point to www1 as well.

Other subdomains, such as www2 through www5 and www01, all properly redirect to www.bing.com with a 301.

Blekko has an old, pre-launch copy of its site at http://api.blekko.com/. (Here is their old executive page.) Fortunately, this subdomain has a robots.txt file that prevents it from being crawled. But these pages, like the old executive page at http://api.blekko.com/mgmt.html, are also available at http://dev.blekko.com/mgmt.html and on the main subdomain at http://blekko.com/mgmt.html.

It would be better to 301 redirect these URLs to the current management page at http://blekko.com/ws/+/management than to leave multiple copies of them on different subdomains.
YouTube redirects its duplicate subdomains www1 through www5 to www.youtube.com, which is in line with best practices. Unfortunately, it uses a 302 (temporary) redirect rather than the recommended 301 (permanent) redirect.

Use Permanent Redirects From https: URLs To http: URLs If They Don’t Require SSL

Another type of duplicate copy of a site that I usually find is the SSL/https version of the site. https is appropriate for pages that require security, like a login page or a page for editing a user profile, but for pages that don’t require security, it is a source of duplicate content causing crawl inefficiency and link diffusion.
The recommended solution for this is to redirect pages from https to http whenever possible.
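Sketched in Python (the SSL-only path list is a made-up example): a router decision that sends https requests back to http with a 301 unless the page actually needs SSL:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical paths that legitimately require SSL and must stay on https.
SSL_REQUIRED_PREFIXES = ("/login", "/account")

def http_redirect_for(url):
    """Return the http:// URL an https request should 301 to,
    or None if the page requires SSL or is already plain http."""
    parts = urlsplit(url)
    if parts.scheme != "https" or parts.path.startswith(SSL_REQUIRED_PREFIXES):
        return None
    return urlunsplit(("http", parts.netloc, parts.path, parts.query, parts.fragment))
```

So a request for https://www.example.com/support would be redirected to its http twin, while https://www.example.com/login is left alone.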

Our software detected duplicate https copies of most pages, including Microsoft’s help pages, the YouTube about pages, Google’s corporate page, and even the Google webmaster guidelines.

The duplicate content issue with the Google webmaster guidelines page (and the other Google help pages) is compounded by a link rel=canonical tag that points to either the http or https version of the URL, depending on which URL is requested.

It is important to make sure that the link rel=canonical tag always points to the intended canonical version of the page, so be careful when dynamically generating this element.

A request for https://www.bing.com/ results in a security warning (shown below) due to a mismatched SSL certificate. This is common for sites using Akamai for global server load balancing.

It even pops up for https://www.whitehouse.gov/. I’m not aware of a way to get around this issue, though I would love to talk with someone at Akamai about it.

Use A Robots.txt File To Prevent URLs From Being Crawled

Sites generally have certain types of pages that they don’t want search engines to index. This could be because these pages are unlikely to convert or aren’t a good experience for users to land on, like a “create an account” or “leave a comment” page. Or it could be because the page is not intended for Web browsers, like an XML response to an API call.

Bing’s search API calls, which are made to URLs starting with http://api.bing.com/ or http://api.bing.net/, can be crawled by spiders according to the robots.txt file. This can be devastating to crawl efficiency because search engines will continue to crawl these XML results even though they are useless to browsers.
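A rule set like the following would close that hole; the paths are invented for illustration, and Python's standard-library parser can verify the behavior:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that keeps XML/JSON API responses out of the crawl.
robots_txt = """\
User-agent: *
Disallow: /xml/
Disallow: /json/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# API responses are blocked; normal pages remain crawlable.
print(parser.can_fetch("*", "http://api.example.com/xml/search?q=test"))  # False
print(parser.can_fetch("*", "http://api.example.com/about"))              # True
```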

A search on Google for [site:api.bing.net OR site:api.bing.com] currently returns about 260 results, but based on analysis I have done on clients’ Web access log files, many times more URLs than these have likely been crawled and rejected.

Use ALT Attributes In Images

Images should always be given alternate text via the ALT attribute (not TITLE or NAME as I have seen on some sites). This is good for accessibility issues like screen readers, and it provides additional context about a page to search engines.
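A quick way to audit this is to scan a page for img tags without alternate text; a minimal sketch using Python's standard-library parser:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collects the src of every <img> lacking a non-empty alt attribute.
    (Purely decorative images may legitimately use alt=""; treat hits as
    candidates for review, not automatic errors.)"""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if not a.get("alt"):
                self.missing.append(a.get("src", "(no src)"))

def images_missing_alt(html):
    audit = AltAudit()
    audit.feed(html)
    return audit.missing
```

Running this over a profile page would surface exactly the kind of missing-alt portrait images described below.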

Though many images on the pages that were checked had appropriate alternate text, I couldn’t help but notice that Duane Forrester’s image on his profile page didn’t. But he is in good company because Larry, Sergey, Eric, and the rest of the Google executive team don’t either.

Avoid Use Of Rel=Nofollow Attributes On Links To “sculpt PageRank”

A rel=nofollow attribute on a link tells search engines not to consider the link as part of its link graph. Occasionally, I will review a site that attempts to use this fact to control the way that PageRank “flows” through a site.

This technique is generally considered ineffective and even counterproductive, and I always recommend against it. (There are still valid uses for rel=nofollow attributes on internal links, such as links to pages that are excluded from being crawled by robots.txt.)

None of the search engine pages I checked were using rel=nofollow attributes in this way with the exception of the YouTube home page.

In the image below, nofollowed links are highlighted in red. Links to the most-viewed and top-favorited videos are shown to search engines, but links to the general music, entertainment, and sports videos are not.

Return Response Codes Directly

A URL that doesn’t lead to a valid page should return a 404 (page not found) response code directly.
If an invalid URL is sent to Bing’s community blog site, it will redirect to a 404 page. Here is the chain:

  1. The URL http://www.bing.com/community/b/nopagehere.aspx returns a 302 (temporary) redirect to
  2. the URL http://www.bing.com/community/error-notfound.aspx?aspxerrorpath=/community/b/nopagehere.asp, which returns a 404 (page not found) response.

The recommended best practice would be for the first URL to return a 404 directly. If that isn’t possible, then the redirect should be changed to a 301 (permanent) redirect.
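The audit logic reduces to checking the redirect chain; here is a small sketch (the chain format is my own invention, not Bing's):

```python
def audit_error_handling(chain):
    """chain: list of (status_code, url) hops for a request to an invalid
    URL, ending with the final response. Returns a list of issue strings."""
    issues = []
    if chain[-1][0] != 404:
        issues.append("final response is %d; an invalid URL should return 404" % chain[-1][0])
    for status, url in chain[:-1]:
        if status == 302:
            issues.append("302 (temporary) redirect at %s; return 404 directly, "
                          "or at least use a 301" % url)
    return issues

# The Bing community blog chain described above:
print(audit_error_handling([
    (302, "http://www.bing.com/community/b/nopagehere.aspx"),
    (404, "http://www.bing.com/community/error-notfound.aspx"),
]))
```

A well-behaved site produces an empty issue list: the very first response to an invalid URL is the 404 itself.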

Yahoo’s corporate information pages do something interesting when they get an invalid URL.

A request to http://info.yahoo.com/center/us/yahoo/anypage.html, which is not a valid URL, correctly returns a 404 (page not found) response.

But the 404 page contains an old school meta refresh with a time of one second that redirects to http://info.yahoo.com/center/us/yahoo/.

A 301 redirect to this page is the recommended way to handle these types of invalid URLs.

Support If-Modified-Since/Last-Modified Conditional GETs

I am a big fan of using cache control headers to increase crawl efficiency and decrease page load time. (My article on this topic is here.)

I found it interesting that, out of all the URLs that were checked, only a few Google URLs supported If-Modified-Since requests and none of them supported If-None-Match.
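A sketch of the server-side decision behind If-Modified-Since support (If-None-Match works the same way but compares ETags rather than dates):

```python
from email.utils import parsedate_to_datetime

def status_for(last_modified, if_modified_since=None):
    """Return 304 if the client's cached copy (or a crawler's) is still
    fresh, else 200. Both arguments are HTTP-date strings."""
    if if_modified_since:
        changed_at = parsedate_to_datetime(last_modified)
        cached_at = parsedate_to_datetime(if_modified_since)
        if changed_at <= cached_at:
            return 304  # Not Modified: no body sent, nothing to re-crawl
    return 200
```

A crawler that last fetched the page on December 9 gets a 304 for a resource unchanged since December 5, saving the full response body each visit.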

Periodically Check Your DNS Configuration

As part of a site review, I like to use online resources like http://intodns.com/ and http://robtex.com/ to check the DNS configuration.

DNS is an important part of technical SEO because if something breaks with DNS, then the site will go down and it isn’t going to get crawled. Fortunately, this rarely happens.

However, I have reviewed sites that had their crawling affected by DNS changes. And I have reviewed several large sites that had their DNS servers on the same subnet, essentially creating a single point of failure for their entire business.

As expected, all the search engines had no serious DNS issues. I was surprised to see that two of them had recursion enabled on their name servers because in some rare instances that can be a security risk.
My recommended best practice is to run these types of checks at least once a quarter.

Conclusion

These are a few of the issues that turned up that I either commonly see or think are important. There were others, but they were relatively minor or subtle things like short titles, duplicate/missing meta descriptions, missing headers, and too many static resources per page.

Normally, I would have access to Web access log files and webmaster tools, which allows our software to check a lot more things.

I hope this gives you some ideas for things to check on your own site. And when you do find something, remember that even the search engines have their own technical SEO issues from time to time.

Source: http://searchengineland.com/do-as-i-say-not-as-i-do-a-look-at-search-engines-seo-best-practices-102698

Friday, November 18, 2011

The Art of the Follow-Up Post

Classic linkbait campaigns follow a “throw it up against the wall and see if it sticks” model. Which raises the question: what do you do when it does stick? Let’s say you’ve written a solid piece of linkbait, and you’re ranking on page one for a topical head or near-head term. So what do you do next?

Well, one way to think about the situation is that you have a temporary landing page with strong topical authority.

Particularly given the latest algorithm update, your odds of a short-term success are a lot better—and the price of resting on your laurels is that much higher. But here’s the good news: Google is still ranking topic pages, not just news stories. (Just Google any celebrity’s name and see.)
So, how do you keep your page fresh?

Spotting Opportunities With Google Suggest

Let’s say you’ve written a solid, well-researched, and attention-grabbing piece focused on a great keyword. Fantastic. Google can tell you exactly what to write next: type your keyword into Google, hit space, and scan the suggestions—every one of them is a follow-up waiting to happen.

These don’t have to be nearly as high-quality as the original; users would rather find (and Google would rather rank) a 300-word piece whose headline matches a long-tail query than an 800-word piece whose headline is missing half of the words in the query.
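One mechanical way to harvest those follow-ups is to probe the suggest endpoint with your keyword plus each trailing letter. A hypothetical sketch that just builds the probe URLs (the endpoint is unofficial and subject to change):

```python
import string
from urllib.parse import urlencode

def suggest_probe_urls(keyword):
    """Build one query URL per trailing letter a-z for Google's unofficial
    suggest endpoint; fetching each enumerates long-tail follow-up topics."""
    base = "http://suggestqueries.google.com/complete/search"
    return [base + "?" + urlencode({"client": "firefox", "q": "%s %s" % (keyword, letter)})
            for letter in string.ascii_lowercase]
```

Each response lists the live autocomplete suggestions for that prefix; suggestions with no matching page on your site are open follow-up headlines.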

It gets better: you can also test out related topics and see which search suggestions are missing. Google Suggest’s dirty little secret is that it leans on new content, not just search query data.

If people aren’t searching for a simple variant of an emerging keyword, you can write about that variant and literally shift searcher behavior.

The autocomplete fun doesn’t stop there, either. It’s actually creating brands. Lyrics site Rap Genius is one of the search suggestions for “rap,” and Yahoo!’s “Primetime in No Time” is a top suggestion for the query “Primetime.” And this goes full circle: the top suggestion for “Primetime” is “Primetime lyrics,” and the top result is from—wait for it—Rap Genius!

Recaps & Context

The problem with reporting news is that you eventually run out of it: no story keeps happening forever. When things slow down, there are two easy angles you can take:
  • Recaps: sum up a few other pieces on the same topic.
  • Context: run a recap, but of a related topic or story. The definition of “context” in online news is ridiculously broad: if it’s directly related, it’s defensible; if it’s indirectly related, it’s merely a clever parallel. (This is the argument behind, e.g., “[X] Marketing Lessons from [Zombies / Lady Gaga / Rick Perry / My Plumber]” articles.)
As long as it’s original content, on topic, and accurately timestamped, it’s going to look like a continuously updated story. And it’s helpful to readers, too: people who search for a topic right when it gets hot are more likely to be subject-matter experts; the amateurs get active later in the news cycle. So it’s safe to be repetitious, if you’re saying similar things to different audiences.

Twitter-Based Resuscitation

Here’s a trick perfected by Business Insider, champions of fast-twitch news reporting. If you have a story that’s really taking off, post it to Twitter again—this time, swap your headline for a pull quote. If it’s a great story, it’s going to stay quotable; the better the story, the more times you can get away with this.
Compare:
I always wondered what a “McRib” was made of. Now I know http://read.bi/uOow21
to
THE McRIB: A “restructured meat product containing a mixture of tripe, heart, and scalded stomach” http://read.bi/uOow21
Same article, obviously. One headline is straight to the point; the other is a little gross but very good at grabbing attention. (And it doesn’t hurt that McRibs are in the news.) A quick hit like this can revive a flagging article. Don’t abuse it—but you’ll be surprised at how much you can get away with.

After Google’s freshness update, we’re doing News SEO whether we like it or not. Google is pushing for a faster, more reactive Web, with a SERP that looks more like a news feed or a Twitter stream. And part of that means keeping the story moving, not just reporting it once.

Fortunately, they’ve equipped us with the tools to do that exceptionally well.
Image from Flickr user Patrick Doheny, used under creative commons.
Source: http://searchengineland.com/the-art-of-the-follow-up-post-in-news-article-publishing-100804

Tuesday, November 8, 2011

9 Common Ways To Bork Your Local Rankings In Google

It’s not surprising that small businesses make mistakes in Google Places when setting up and claiming their profiles. It can be confusing and the guidelines even change over time. So, here’s a list of some common mistakes to avoid.

This isn’t the first time I’ve written a “what not to do” article (see What NOT To Do On Local Business Websites). But it’s worthwhile to emphasize some of the things I still see local businesses doing wrong in Google Places, since some of the more common stuff results in needless frustration and delays.

Messing Up Your Google Places Rankings - Image copyright Chris Silver Smith, 2011.


Nine Common Ways To Bork Your Local Rankings In Google

Again, do not try these at home!

1. Use a post office box for your address
I know it doesn’t make sense – this should be alright to do for businesses which do not have physical addresses, and you may even find some competitors doing it, but Google Places doesn’t like it. If you register a new listing with a P.O. box, you can expect it won’t rank for many primary keyword combinations. (For background on this subject, read about Google Places and businesses without addresses.)

So, find a street address to use for your business. Use your home address (often not ideal for privacy/security reasons), or partner with another business that will allow you to share their street address, or contract with a company that provides mail service with a local address.

2. Add directions into your street address
Including directions in the street address field (ex: “on corner with Elm Street”) can result in your map location being messed up and/or can cause Google difficulty in linking information from other business directories for your listing.

Either leave the directions up to Google’s automated map features, or include the helpful directions in the description field, if you absolutely must.

3. Tell Google not to display your address
This often goes hand-in-hand with businesses that use P.O. box addresses, but not always. What’s confusing about this is that Google Places provides this as an option, but they neglect to tell you that it may royally affect your ability to rank. The reason is that they prefer to show business locations on the map, and their algorithm is instantly dubious of any business that obscures its office location.

So, if you’ve traditionally used a P.O. box and are thinking of switching to your home address in combination with not displaying it, then think again. Okay, theoretically, you might be able to develop enough credibility with Google Places to overcome whatever governors they have on rankings for address-obscured companies.

But in practice, this is such an uphill battle, with no information or feedback from Google about your status, that you might as well avoid the beating at the beginning and simply not toggle your address display off.

4. Use product names and place names in the business category field
It’s confounding that these are free-form, and it’s silly that Google doesn’t merely warn you if they detect a place-name in this data field for your Place page. But what Google wants here is just the business type, such as “Accountant”, “Florist”, “Attorney”, or “Electronics Shop”.

Do not put the names of products here (generally), nor your city names, even when combined with the category name. Google really hates this and it might even get you dinged!

5. Use a call tracking number as your business’s phone number
There are folks that have a fetish for statistical data who like to argue with me over this one, but there continues to be a pretty good consensus among those of us who are expert consultants for local SEO as to our stance on the matter.

Using an alternate phone number makes it harder for Google to match up your data from multiple sources across the local ecosystem, which can reduce your ability to rank.

For most small, local businesses, rankings and performance in search results ought to trump the desire to have tracking to see where your phone calls originate. Performance is a necessity, and analytics in this case is a comparative nice-to-have!

Google has come out and officially stated not to use tracking numbers, too: “Types of phone numbers that should not be included are: call tracking numbers and phone numbers that are not specific to a business location.”

6. Post some shill reviews in Google Maps
Getting your employees to help you in posting positive reviews for your business, and/or posting negative reviews about your competition, could result in your listing getting flagged by users and automated algorithms.

People can often sense that a review may be false, and this can result in them stating their suspicion outright in their own review under your listing, for all to see, or they may report the listing to Google.
Either way, any juice you got from those reviews might get revoked along with anything else you’ve touched in Google. False reviews are against the law, too, so stay away from this dishonest, bad practice. Instead, harness the power of reviews in acceptable, positive ways.

7. Make radical changes to your business name, address or phone
Changing your address or business name in Google Places is highly risky to the stability of your rankings. Google canonicalization algorithms may struggle to match up your data from across the Web afterwards, and it could even cause your listing to get flagged as potentially compromised or as an attempt to manipulate.

Expect a few weeks of disruption to your rankings at minimum, assuming you can change all the various citation references out there to match. If you can’t get them to mostly sync up consistently, then expect long-term ranking impact and perhaps ongoing problems with duplicate listings as well.
If you occupy a really great ranking spot, you might consider leaving it as-is.

8. Add lots of fictional office listings in each city all over your metro area
Once you’ve poisoned the entire pond, the negative effects will eventually come back to roost with the rankings of your real, original location!

You may think you can add listings all over without Google detecting it, but your competitors will “helpfully” flag each listing and tell Google that you’re not really there. Expect to have your faux listings tank in the rankings and they’ll take your real, original listing with them.

9. Ignore that your map pinpoint location is completely off
You may be an ADD, multi-tasking, stressed-out small business owner, but this is something you’d better pay attention to, or it can irritate potential customers, reduce your walk-in traffic, and even get your listing erroneously flagged as out-of-business before you realize it.

So, check your map location and use the tools to correct it if you’re significantly off.

Perhaps I shouldn’t be publishing this list. After all, these items result in loads of work for those of us in local search marketing. However, untangling borked business listings is more difficult than setting up a fresh, new business profile completely from scratch.

So, avoid these bad practices so that you can spend more energy on further promotion efforts, rather than trying to correct something that’s been borked!

Source: http://searchengineland.com/9-common-ways-to-bork-your-local-rankings-in-google-99336

Tuesday, November 1, 2011

Infographic: The Top Three US Search Engines

How big is Google compared to competitors Bing and Yahoo — and can they catch it? One metric is the number of searches each search engine handles in a given month.

Search Engine Journal created a nice infographic charting these figures over the past few years. It highlights Bing’s climb, mostly at Yahoo’s expense:


Want the infographic for yourself? You’ll find it here: Comparison Of The Top Three Search Engines: Bing+Yahoo > Google?

Remember, it’s possible for a search engine to see its share of searches drop while the actual number of searches — the search volume — rises. Also, month-to-month changes often mean little. You want to see a trend emerge over time.
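The share-versus-volume point is just arithmetic; a tiny illustration with made-up numbers:

```python
# Hypothetical two months: the engine's query volume grows while the
# overall market grows faster, so its share still falls.
month1_total, month1_engine = 17.0e9, 3.0e9
month2_total, month2_engine = 20.0e9, 3.2e9

share1 = month1_engine / month1_total  # ~17.6%
share2 = month2_engine / month2_total  # 16.0%

assert month2_engine > month1_engine   # volume rose...
assert share2 < share1                 # ...while share dropped
```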

Source: http://searchengineland.com/infographic-the-top-three-us-search-engines-99036

Tuesday, October 18, 2011

Infographic: Why Content For SEO?

How does content help with SEO efforts? The folks at Brafton have produced a “Why Content For SEO” infographic with lots of stats and information about the topic that you might find interesting:


Source: http://searchengineland.com/infographic-why-content-for-seo-96834

Wednesday, September 14, 2011

Google Search Share Plateaus, BingHoo Gains, AOL Drops

The comScore search market share numbers for August are out. What they show is Google seeming to hit a kind of plateau. Over the past year it seems to be bumping up against a market share ceiling of around 65-66 percent. By contrast Yahoo and Bing gained slightly and now have a combined 31 percent of the US search market.

Ask held steady at 3 percent and AOL appears to be continuing its long, slow decline. By the end of the year AOL search should be at or below 1 percent of the overall market.


Google query volume and its share are flat, though mobile, which is growing rapidly, is not included in these figures. Yesterday research firm IDC predicted that by 2015 more people would access the internet via mobile devices than PCs.

That trend disproportionately favors Google over its immediate rivals because Google has a much larger share of mobile browser-based search than it does on the PC. If PC search query volumes grow overall so will Google. For now, however, there doesn’t seem to be much more growth available in terms of market share. Mobile is a different story and will continue to be an important growth driver for Google.


According to comScore there were a total of roughly 17 billion search queries in August across the five largest search engines. We can estimate that roughly 3.4 billion of those search queries are local or tied in some way to location. This is based on extrapolating from Google’s “20% of searches are related to location” formula.

We can also crudely estimate that Google sees somewhere between 1.6 and 2.1 billion additional mobile queries a month in the US.

Monday, September 12, 2011

Justifying the Value of SEO

Most people don’t set fair expectations for a search engine optimization (SEO) effort. I was reminded of this while speaking with a prospect, who asked what kind of a return on investment (ROI) he should expect from his SEO engagement.

“I’m expecting exponential growth, something like 20X the traffic that I’m currently getting,” he told me. “If we can get a number 1 ranking for this one keyword, that should be enough to get us there, don’t you think?”

One Top Ranking Isn’t Enough

Your goal can’t be to rank number 1 for one keyword. That’s not a goal.

What happens if you get the top ranking for your keyword and something happens, such as a major algorithm tweak by Google? You’ve then lost your ranking for that one keyword. What then?

Though everyone has one of those keywords that they salivate over, a solid, long-lasting presence in the search engines is one in which your visibility is balanced across a number of keywords.

A “goal” should be increasing traffic and – at the end of the day – growing your business (more leads, more sales, and ROI).

SEO vs. Paid Search

Many people find it easy to budget for paid search. They understand the basic premise:

  • Spend $1 per click.
  • Set a budget of $10,000.
  • Get 10,000 clicks for keywords I want to “rank” for.
But what if you could potentially get 20,000 clicks by investing that same $10,000 in SEO rather than PPC advertising? Wouldn’t that be an even better deal?

To be fair, the above example is an over-simplification only intended to make a point. This 20,000 might represent a 10 percent increase in “good” traffic – meaning traffic that’s relevant, converts into a lead or sale, or at least shows some quality measurements (e.g., time on site).

After telling this to the prospect, he paused. “I’ve never really thought about it that way.”

SEO ROI: No Guarantees?

Don’t get me wrong. I understand that SEO is very different than PPC.

With SEO, there are no guarantees. There is the chance that, for whatever reason, you will never realize a solid ROI from SEO. Some of these reasons might include:

  • You hired the wrong people/firm for SEO.
  • Your IT team can’t do what’s necessary to fix things that will lead to better rankings/traffic.
  • You can’t / won’t create content which will lead to rankings/traffic. 
  • Expectations were out of whack with reality.
  • There’s no search volume for the keywords you’re interested in targeting – no amount of number 1 rankings could ever equal ROI.
SEO can be high risk, high reward. When I say “high risk,” I’m not talking about the kind of high risk associated with the possibility of being banned/penalized in the search engines for such tactics as hacking, cloaking, spamming, etc.

My point is that, even given that you work within the search engine’s guidelines, there truly are no guarantees because we don’t own the search engines. Search engines are a third party we have zero control over.

A “Good” ROI on SEO

If you’re investing $10,000 per month in an SEO effort (be it in staff costs or with an agency), you need to get a sense as to what a “good ROI” looks like.

Perhaps you’re one of many who have noticed that cost per click in paid search is getting higher and higher for the keywords that you’ve been targeting. Perhaps it’s gotten to a point where it’s challenging to make a case that the money spent is worth it?

Let’s say that you sell a widget for $100, and you net 30 percent from each sale ($30). If your average cost per click is $1 and you convert 1 percent of clicks into a sale, that’s $100 invested in paid search for 100 clicks to get one $100 sale, on which you netted $30.

Unless you care about the branding value (which I would argue folks should consider, at least a little bit, when they’re looking at the valuation of PPC and SEO), that’s not a good ROI. In fact, that’s no ROI. That’s a loss.

What would you need your investment to be for this to pay off? Let’s do some math, this time assuming a healthier conversion rate: out of every 100 visitors, we convert 10 percent into a sale in which we make $30. We would need 3,000 visitors to get 300 sales. Those 300 sales would be worth $30,000 (300 x $100 each), and we would net 30 percent of this ($9,000).

Now we have something to work from.

Do we feel that we could put $7,000/month worth of resources (money, time, etc.) into an SEO effort to help achieve the goal of gaining 3,000 visitors? Or perhaps the conversion rate is way off and it’s more like 5 percent?

Perhaps we need 6,000 visitors? Are we willing to fund this “at a loss” (during the initial months of research, etc.) in order to hopefully realize the potential ROI for the months thereafter?
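The arithmetic above is easy to wrap in a few lines. Here’s a minimal sketch; the function name and the numbers are just this article’s widget example, not a standard formula:

```python
# Break-even math from the widget example: $100 price, 30% margin,
# and a monthly net target of $9,000 (i.e., $30,000 in sales).
def visitors_needed(net_target, price, margin, conversion_rate):
    """How many visitors we need to hit `net_target` dollars of net profit."""
    net_per_sale = price * margin              # $30 net per sale
    sales_needed = net_target / net_per_sale   # 300 sales
    return sales_needed / conversion_rate      # visitors required

print(visitors_needed(9000, 100, 0.30, 0.10))  # 3,000 visitors at 10% conversion
print(visitors_needed(9000, 100, 0.30, 0.05))  # 6,000 visitors at 5% conversion
```

If the monthly cost of driving those visitors through SEO stays under the $9,000 you’d net, the effort pays for itself.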

Tools to Help Determine the Value of SEO


Once you’ve mapped out how (and whether) SEO can drive ROI, you can begin to discuss how much value and opportunity there may be, and what the risks and rewards might be.

I’ve mentioned these tools before, but to get a sense of the potential value of an SEO effort, I would recommend SEMRush and SpyFu Recon. There are certainly many other providers that can help with an opportunity assessment; I welcome readers to comment below and share the tools you use and why you like them.

Wednesday, August 10, 2011

How Students Use Technology [INFOGRAPHIC]

It’s clear that today’s students rely heavily on electronic devices even when they’re not incorporated in the classroom. In one survey of college students, 38% said they couldn’t even go 10 minutes without switching on some sort of electronic device.

But how students are using their devices, how technology is affecting their educational experience, and what effect it has on their well-being are questions that are harder to answer. In the infographic below, online higher education database Onlineeducation.net has summed up some of the existing research on these points.

Via: OnlineEducation.net

Source: http://mashable.com/2011/08/10/students-technology-infographic/

Monday, July 25, 2011

Identifying Quick Wins

When I ask a new client what they are looking for, the answer almost always involves either the phrase “low hanging fruit” or “quick wins”. It makes sense, too: we all want efforts that have a low cost (in resources or money) and a significant impact. Unfortunately, quick wins tend to be harder to find than high-cost initiatives like revamping your entire site architecture.


I went through some old reports and came up with the following list of items to check that can provide quick wins:

404 Errors

While 404 errors aren’t inherently bad, they can become a problem if there is a lot of link equity associated with these pages. Sometimes you can end up with several 404 error pages receiving a significant amount of links. This is frequently due to things like site redesigns (and not employing proper 301 redirects) or killing off old product inventory.

To fix this, 301 redirect the 404 pages that have link equity to the most similar live page. If there isn’t a similar page, the decision becomes less black and white. In that circumstance, I like to redirect to a relevant parent or category page.
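On Apache, this is usually a one-liner per dead URL. A hedged sketch (the paths here are hypothetical examples, not from any real site):

```apache
# 301 dead URLs that still have inbound links to the closest live page
Redirect 301 /old-product.html /products/new-product.html
# No close match? Fall back to the relevant parent/category page:
Redirect 301 /discontinued-widget.html /widgets/
```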

302 Redirects

These are bad. We all know that 302s don’t pass link equity but 301s do. The good news is that these usually aren’t your fault, and you can look pretty good if you come in, fix a bunch of 302s, and consolidate the link equity, causing an increase in traffic. Easy.

Pages blocked by robots.txt

Are you orphaning link equity and preventing it from flowing to other pages? A lot of sites receive links to their privacy, TOS, and about pages but have them blocked by robots.txt. This stops that link equity from flowing out to other pages; it’s stuck at a page Google can’t access. Bad news.

To fix this, remove the page from your robots.txt file and have it link to important, relevant pages. If you want to keep these pages out of the index, you can add a meta robots “noindex” tag instead.
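As a sketch: delete the Disallow line from robots.txt so crawlers can reach the page, then add a meta robots tag to the page itself if you still want it out of the index. Note that “follow” keeps the link equity flowing out:

```html
<!-- In the <head> of the privacy/TOS/about page: crawlable and followable,
     but kept out of the index -->
<meta name="robots" content="noindex, follow">
```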

Text in an image

A problem I saw quite frequently when working with small businesses was that many of their pages were built largely with images. The reason was usually that they hired the cheapest web designer they could find (or their friend’s kid “does web design” and made them a site). The solution is simple, though not always elegant for a small business to implement: get that text out of the images.

Having content stuck in Flash is another common iteration of this.

XML sitemaps

If your client has poor indexation but isn’t submitting an XML sitemap, simply submitting a sitemap to Google Webmaster Tools can have a significant impact on indexation. It’s important to understand that XML sitemaps aren’t a replacement or solution for having poor site architecture.

While XML sitemaps help with indexation, they don’t pass link equity so you may be able to get pages indexed but they won’t rank (unless they are very long tail oriented pages). The fact that these pages aren’t in the index is a sign that you currently don’t have an optimal site architecture, so you should really look into improving your site architecture instead of just creating and submitting your XML sitemap in GWMT.
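For reference, a minimal sitemap file is just a list of URLs in the sitemaps.org schema; the URLs below are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-07-25</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/products/</loc>
  </url>
</urlset>
```

Save it at the site root as sitemap.xml and submit it through GWMT.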

Meta robots noindex/nofollow

You should check to see if important pages (or sometimes all pages) are being kept out of the index. Sometimes a noindex tag will get carried over from a dev environment and will be overlooked or the tag might be mistakenly put into the code. Regardless of the reason, it happens. Check to make sure the tag is being applied appropriately.

Further, you should check to make sure the nofollow tag is applied appropriately. I’ve seen things like all internal links get nofollowed during a code push. Again, while it may not be common, check for it.
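Checking a handful of templates by hand works, but this is also easy to script. Here’s a minimal sketch using only Python’s standard library (not a tool from this article; fetch each page’s HTML however you like and pass it in):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives += [d.strip().lower()
                                for d in (attrs.get("content") or "").split(",")
                                if d.strip()]

def robots_directives(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives

page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(robots_directives(page))  # ['noindex', 'nofollow'] -- flag this page!
```

Run it across your important URLs and flag any page that unexpectedly reports “noindex” or “nofollow”.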

A great way to monitor this (as well as other significant errors) is through the SEOmoz Campaign app, which flags warnings for exactly these issues.

Internal anchor text

Is your content being linked well? If you aren’t going to link to your own content with great keyword phrases, who will? Notice how Songsterr links to AC/DC tabs with their song name while Ultimate Guitar links to AC/DC songs with the song name + tab. Make sure you are using an optimal keyword phrase to link to your content/product.

Local businesses targeting national terms

I have seen a lot of local businesses targeting national phrases when they should really be targeting local phrases. Think about a local bike shop. Local bike shops should be targeting localized keywords like “Santa Barbara bike shop” rather than simply “bike shop” or the name of the shop.

Canonicalization

Dispersion of link equity can be a significant problem for some sites, and consolidating it can have a significant impact, depending on the severity of the dispersion. Make sure that the site is canonicalized not only with the canonical link tag but also with 301s. Use 301s to:

  • Establish the www or non www version of the site as canonical
  • Enforce a trailing slash (or not, just enforce one)
  • Redirect extensions of index files to the root (/index.html > /)
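All three rules can be enforced in one place. A hedged .htaccess sketch for Apache with mod_rewrite enabled (swap in your own domain, and flip rule 1 if non-www is your canonical):

```apache
RewriteEngine On

# 1. Force the www version of the site
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# 2. Enforce a trailing slash on URLs that aren't real files
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]

# 3. Redirect extensions of index files to the root (/index.html > /)
RewriteRule ^index\.html$ / [R=301,L]
```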

Sunday, July 24, 2011

How To Trick Black Hatters Into Building Links To Your Site

So you are intrigued by black hat SEO, but too scared to try it? Good; if you tried it, the only thing you would experience is pain.

But, there is another way to get black hat SEO to benefit your site while protecting it from the Google ban stick. As you already know, there is a real problem with content being stolen and used around the Web. It infuriates most site owners, but it makes me very happy ;)

So, let’s set a little trap for the black hatters and watch as links come pouring into our site.

Black Hat Content

Black hat SEO has many faces and has become as complex and challenging as white hat SEO. Most black hatters spend most of their time focused on link building and figuring out ways to game the system. They are not natural content creators.

Actually, creating content is the bane of most black hatters’ existence, so they steal it from all types of sources. The smart ones know where to find content that has not been indexed yet, like sites that do not take advantage of pinging. You are not one of those webmasters, right?

You, on the other hand, have one unique advantage over black hatters: you love to create good content. Content that is legible and popular. Content that is loved by the search engines and regularly indexed.

Even if you use ping, when black hatters steal your content they are betting on their domain and link authority to push their page above your page in the SERPS. If it doesn’t work, oh well, your page of content is only one of thousands that they have automatically scraped and stolen.

Black Hat Automation

Black hat SEO is about one thing: automation. Black hatters look for ways to automate their processes. The more successful automated processes they employ, the more successful they are.

To scrape content, a black hatter needs to be able to automatically locate pages related to his niche through other sources. RSS feeds make it extremely easy to steal content, since it arrives in a standard format built for RSS readers.

Here is how the process goes down typically:
  1. Black hatter builds a site in a specific niche, like “Digital Cameras”
  2. He scrapes the Google Blog search for the most recent posts related to “Digital Cameras”
  3. He scrapes the sites featured in the blog search and steals the content, posting it to his WordPress blog
  4. He automatically gathers inbound links to these pages at a heavy rate, usually from a large link network that he has set up, scraping content to do so.
  5. He repeats the process over and over again until Google finally bans his domain, by which point he has hopefully made his money from the affiliate offers or ads on his site

Black hatters hope they can overcome white hat sites with sheer volume of links and content. If it didn’t work, there would not be so many black hat SEOs.

Backlink Placement

The key to making your stolen content work for you is to include backlinks to your site in your posts. These can be manual links or related-post links at the end of each post. Most black hat SEOs do not remove the links inside a post; it is less likely that they will be reported for spamming if they leave the links intact.

Plus, removing the links requires more processing time and is not in their best interest to do so. This leaves you with a great inbound link opportunity.

Using a plugin like RSS Footer gives you the ability to add a link to your original post and a link to your home page with your chosen anchor text. Both of these tactics are extremely effective. Linking to the original post helps ensure that your post will outrank the content the black hatter has stolen.

Getting Your Content Stolen

It is a good idea to provide the full version of your post in your RSS feed if you are using RSS Footer. This makes your site easier to scrape. The more times your site is scraped and your content is stolen, the more backlinks you will acquire.

Make sure you can get to your feed by going to /feed/. If you can’t, add a 301 redirect to your site sending visitors and scrapers to your true feed location.

Promote your RSS feed to as many aggregators as possible. Every time you publish a new post, you want the content scrapers to grab your page and post it on their site.

Make sure to ping Google Blog Search, as this is one of the favorite places for many black hatters to find new content sources. Ping everyone, everywhere.

Advanced Technique

Alright, this technique is slightly gray hat. You have been warned.

You need to create a script that monitors what is hot in the online universe. You can pull Twitter’s trending topics or any other social media tracking stats.

You can bet that if something goes hot on Twitter or other social media outlets, black hatters are moving rapidly to capture whatever search traffic they can. What do the black hatters need? That’s right, your content.

So, we want to trick them into linking to your content over and over again. Here is what you do:

  1. Create a script that monitors what’s hot
  2. Create a pinging script to ping blog aggregators (I would leave Google out)
  3. Create a scraper to grab generic post titles from a blog aggregator
  4. Once you have those three scripts, you combine them into one script that looks at what is hot, creates random post titles, adds your post with internal links, and pings blog aggregators.

So, you ping the same post with the same links back to your main page over and over again with a different title. Make sure to include a canonical URL in the head section of your page in case Google finds the page.
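The canonical tag itself is one line in the page’s head section (the URL here is a hypothetical placeholder):

```html
<link rel="canonical" href="http://www.example.com/original-post/">
```

Even if Google crawls several title/URL variants of the post, it should consolidate them onto the URL you declare here.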

Your site on “Digital Cameras” republishes the same content over and over again under a different title and a slightly different URL. The black hatters are monitoring the same social media signals telling them what is hot. Their script detects, from one of the blog aggregators, that your site has a recent post on the subject, based on the title.

They grab your content and post it on their site. While all the black hatters fight to rank for the topic, you sit back and collect as many links as you want.

You could literally gather thousands of links in a very short period of time. Remember, you don’t want to overdo this technique. But if you are stuck a few pages back, you may want to give it a try.

Saturday, July 23, 2011

33 Things You Can Do To Grow Your Business Today…

Every day you are given the gift of time (OK, so there is never enough of it, I hear ya, but all you can do is make the most out of what you do have). So what are you going to do during that time?

There are so many demands on your time, your To Do list is growing rapidly and there is always a fire to put out. Again, I hear ya. BUT what happens is you become too reactive and not proactive enough. We all spend so much time reacting to things, we never get to being proactive about the important things – the things that will help grow a business.

So let’s set aside some time each day and pick 3 things, from the list of 33 below, that will help grow our businesses.

There is no magic pill or fairy dust that produces traffic. But there are a bunch of activities you can engage in that will produce traffic. There ARE 33 things below to choose from today. Tomorrow, pick 3 more. The next day, pick another 3. You get the idea. Daily action puts you a step closer to your goals.

So what are the 3 things you could do? Let me give you some ideas:

1. Check on your exposure in the search engines – there is usually something you can do to boost your SEO efforts. Add content, check for broken links, look at web stats and Webmaster Tools and identify problem areas, build links, ask your SEO firm how you can help (if you’ve outsourced the work) etc.

2. Get active on Facebook. Some ideas:
  • a. If you don’t have a Fan Page – get one. (Need help with that? Email me and I’ll get you a quote).
  • b. Invite new friends and fans so you can network and connect.
  • c. Post to your Fan page and engage fans.
  • d. Look into a contest to grow your fan base.
  • e. Make sure your Blog, Facebook, Twitter and site are all working well together for maximum benefit.

3. Get active on Twitter:
  • a. If you don’t have an account yet, get one started.
  • b. Write tweets and engage your followers.
  • c. Connect with industry leaders.
  • d. Share content.
  • e. Build followers.

4. Build content for your site or Blog. Make it interesting, engaging and be sure you include calls to action.

5. Write a newsletter or email offer for your list.

6. Check your Blog’s backend and be sure it’s properly configured with the right plugins to get maximum benefit.

7. Check your web stats and find areas you can improve your site to increase your bottom line.

8. Check for 404 errors and clean up your site. Not as fun, but important nonetheless.

9. Review Webmaster Tools and see what you can learn to improve the optimization of your site. (Notice I mentioned it in the first suggestion and then slid it in there again? It was worth repeating.)

10. Review content and make sure there are no errors, make sure your copy is clean and tight and prepared to sell your site visitors when they enter your site.

11. Test your shopping cart to make sure everything works and is user friendly.

12. Invite Guest Bloggers to add some interest, different perspectives and fresh content to your Blog.

13. Check on your PPC campaigns and see what you can improve.

14. Set up split tests on landing pages to improve results.

15. Find opportunities for you to be a Guest Blogger.

16. Make sure your product feed is working and up to date.

17. Look into mobile search and mobile marketing.

18. Make sure your local listings are all accurate, up to date and optimized for maximum results and exposure.


20. Follow up with leads and prospects (follow-up is a key area where most businesses fail).

21. Identify opportunities that you have been too busy to take advantage of and get on it!

22. Get a press release out there.

23. Write an article for syndication (just know the rules and guidelines so this doesn’t end up working against you)

24. Get a video online and promote it like crazy.

25. Check out your competitors and see what you need to do to beat them out.

26. Build links. (Yep, another repeat but again worthy of the repeat)

27. Check out Google+ (ask for an invite if you don’t have one)

28. Add “Like” and “+1” buttons to your site and Blog.

29. Network and comment on Blogs and forums. No spamming allowed. Offer real insight and contribution to the conversation.

30. Plan a webinar or conference call to present to prospects.

31. Create a Whitepaper that people can get after they opt-in and promote via social media to build your list.

32. If you prepare written proposals for clients, look at your template and make sure it’s clean, easy to understand, full of benefits and error free. Your proposal is going to be what either excites them to talk further about doing business with you or it turns them off because it doesn’t look professional or grab their attention.

33. Come up with your own list of items if you think I forgot something. Tack it up somewhere in your office so you see it every day, and then be sure every single day you pick 3 and take ACTION.

Friday, July 22, 2011

Google Focusing In, Retiring Labs & Other Projects

There’s little doubt that Google has long employed a “strategy of everything,” attempting to be all things to all people – at least on the internet. However, Larry Page (co-founder and, since April, CEO) has spearheaded a new tactic of increased focus and more concrete prioritization. As part of that effort, many Google projects have been retired. The newest casualty is Google Labs.

The Retirement of Google Labs

Google Labs, a site that demonstrates new Google project ideas (and works as the first beta/vetting opportunity), has been a major part of Google since 2006. However, this product is being retired alongside a large number of APIs and several additional side-projects. “While we’ve learned a huge amount by launching very early prototypes in Labs, we believe that greater focus is crucial if we’re to make the most of the extraordinary opportunities ahead,” explained Bill Coughran, Google’s SVP for Research and Systems Infrastructure.

The labs won’t all disappear (although, yes, many will). As stated by Coughran, for some labs “we’ll incorporate Labs products and technologies into different product areas.” Additionally, several mobile labs will continue to be available for download on the Android market.

Page’s Focus

Larry Page commented on the company’s new, more focused strategy. “While much of that work has not yet become visible externally, I am very happy with our progress here. Focus and prioritization are crucial given our amazing opportunities,” he stated. “Indeed, I see more opportunities for Google today than ever before.”

Part of the prioritization process involved shutting down services like Google Health, Google PowerMeter, and a long lineup of APIs. The remaining products are being streamlined and simplified, according to Page. Certainly the release of Google+ has shown a lot more branding unity and cross-integration, and if Page’s remarks and the current trends are any indication, we can expect a lot more of this in the future.

5 Smart Ideas On How To Use Video For Your Blog

If you own or manage a blog site, then most likely multimedia is on the list of things to do as part of your content marketing strategy. But did you know that video is often an overlooked part of an overall content strategy?

Use Video On Your Blog

This puzzles me, because video is consumed at a staggering rate: YouTube alone serves 2 billion video views per day, per this infographic on Mashable (and there are other video streaming sites too). That’s an insane statistic, and it gets more insane when you consider that video sharing via social media is also very high. So it makes sense to add multimedia to your blog, and to do it in the form of video. Let’s take a look at five ways to use video on your blog site.

1. Reviews – one of the best ways to explain an idea, service or product is to have it come alive via video. You can do it yourself or hire someone to film and produce a high-quality video that reviews your concept like nothing else can. Keep it short (under 2 minutes) and make it fun, interesting or controversial.

2. Customer Testimonials – are your blog readers, customers or visitors praising you left and right? Capture it on video and let your new audience see the praise. Video testimonials are all the rage on major shopping sites such as Amazon. Why? Because people love to watch video, and what better way to showcase what someone has said about you, your company or your service than with a live representation.

3. Interviews – is there someone in your blog niche that you wish to interview? Perhaps a fellow blog owner who is releasing a shiny new product your readers should know about? Whatever the case, make sure video is at the forefront of whatever or whomever you seek to interview. This can be a super simple setup where you meet with the intended person and ask a series of questions live on the scene, or you can send your questions over email to the interviewee and have him or her record the answers and send them your way.

4. Contests – who doesn’t love a fun contest with an awesome giveaway? Many blog readers and visitors will do just about anything to win the big prize, including submitting wacky videos of themselves in all sorts of situations. For your contest, make video the point of entry: in order to win, the contestant must submit a video he or she made at home. This could be a great way to attract traffic to your blog, not to mention keep visitors on the site longer, since everyone will want to watch all the videos the others have submitted. The more fun everyone is having, the better your chances of making it a successful video contest. Go big!

5. Product Showcase – you can describe a product via text and do it justice; however, words cannot describe, show and tell quite like a video can. Wouldn’t you agree? Why do you think commercials are so effective at getting people to a store to try out the latest toy, electronic or gadget? Just think of the “Old Spice man” and his uber-successful videos. He managed to revive a franchise my dad was into, a brand no one had thought or talked about in the last decade. Now it’s become a household name again. That’s the power of video at work.

So aside from the fact that using video on your blog is good for traffic, it’s also good for any of the five ideas mentioned above. Be creative, be daring, be controversial, be inspiring. Just don’t sidestep it any longer; try using video on your blog today.

Thursday, July 14, 2011

How to Conduct a Quality Score Audit

You know Google’s got this “quality score” (QS) thing – and you know it matters. It matters because AdWords uses this 1 to 10 scale to rate how good and relevant your ad is compared to others, and the more relevant AdWords thinks your ad is, the more it’ll get shown and the less you’ll get charged for each click.
First, let’s quickly review…

What factors into the quality score calculation

  • Clickthrough rate (CTR)
  • Ad history
  • Landing page relevance
  • Landing page speed
This is a simplistic view; in reality, there are lots of factors that influence the quality score calculation and you can read more about them in this more comprehensive post.

What’s most critical is this: CTR is the single biggest determinant of QS. If you have fantastic CTR – assuming your landing page loads normally – you’ll usually have great QS.

How to improve quality score

Here’s a quick guide to get you started.
  • Separate the good from the bad. Start with the hatchet; then we’ll use the scalpel. Set up a filter in AdWords Editor for all keywords with QS < 7. Cut all of those keywords and put them into a separate campaign (or campaigns).

The traditional school of thought around QS is that, although it’s only reported in the AdWords interface at the keyword-level, there’s actually adgroup-level and campaign-level QS too. The higher the overall QS of your adgroups and campaigns, the better.

You also want to isolate the QS < 7 keywords in order to troubleshoot them.

  • Figure out why the QS is poor. Bad QS indicates poor relevance. Luckily, Google helps you troubleshoot this, at least in broad strokes, if you mouse over the little speech bubble in the “Status” column when looking at the “Keywords” view.


It can also be useful to try the diagnostic tool (the “Ad Preview and Diagnosis” link in AdWords). Sometimes, though, there are no clear issues with the QS; it’s just that the keyword/ad combo isn’t all that fantastic when compared against others for the search term. The thing to do in that case is, well, see the first point below.

Here are a few common reasons your QS sucks:
  • Your keywords need to be tightly related to the ads in your adgroup. If your keywords are thematically all over the place, your overall CTR will come down and impact QS.
  • The destination page where you’re sending people who click your ad isn’t clearly relevant to the content of your ad. If your ad is for flowers, that landing page better be about flowers.
  • Your ad copy doesn’t contain the words in the keyword phrase you’re bidding on. This comes back to having small, tightly thematic adgroups. Your ads need to say to users, “You searched for pretty red flowers, and I’m all about pretty red flowers.”
  • Your landing page loads too slowly. It isn’t super common that your page loads so slowly that Google dings you with respect to quality score.
  • This is a controversial one, but Google account teams have confirmed it to me in the past: you’re simply not bidding enough. If you’re in a competitive space and your bid is leaving you with a poor average position (5-8, let’s say), your position results in a poor CTR and that poor CTR impacts your QS. Harsh but true.

The only solution here is to create single-keyword adgroups, A/B test ads that are super relevant, and raise your bid. If you achieve a decent position and start to get that CTR up, developing good CTR history will help you and can eventually make the term less expensive.

  • Raise your CTR. Sounds easy enough. How?
    • Make your ad copy laser-targeted. How to write good AdWords ad copy is a whole other beast, but at minimum, know that you need to say what you’re offering in the ad headline, match it to the specific words the user searched, repeat those words throughout the ad (but don’t overdo it – the ad needs to read like it makes sense or you’ll lose credibility), and capitalize the first letter of each word, excluding prepositions, conjunctions, etc.
    • Try this exercise. Google one of your top keywords and take a look at what competitors’ ad copy focuses on. I did exactly that with a search to buy herbal tea online.

What do you notice about each of the ads? The first one emphasizes health benefits. The second one touts a testimonial (I’m not sure who Sir Jason Winters is, but if I’m ignorant, just let me know in the comments). The third promotes its tea as “award winning” and offers a coupon code. The fourth ad is all about low prices and free shipping.

Now, go into your own AdWords account and create 3 ads tied to a few of your highest-volume keywords. Make sure they’re related closely enough to share an adgroup. Focus each of the 3 ads on a completely different benefit. One can be price-oriented with a coupon code; another can talk about all the great press you’ve gotten on CNN and the BBC.

Measure their respective CTRs and get a sense for what your customers care about most. Then, kill the 2 underperformers and create 2 new ads focused on the winning benefit, using different wording. Test tweaks to description line 2, for instance. This is how you refine ad copy to lock down a great CTR and a solid quality score.
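Don’t call a winner off a handful of clicks, though. One quick way to sanity-check whether a CTR gap is real or just noise is a two-proportion z-test; this isn’t from the article, just a common statistical check, and the numbers below are made up:

```python
from math import sqrt

def ctr_gap_is_significant(clicks_a, imps_a, clicks_b, imps_b, z_crit=1.96):
    """True if the CTR difference is unlikely to be random noise (~95% level)."""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    return abs(p_a - p_b) / se > z_crit

# Ad A: 50 clicks / 1,000 impressions (5% CTR) vs. Ad B: 30 / 1,000 (3% CTR)
print(ctr_gap_is_significant(50, 1000, 30, 1000))  # True -- likely a real winner
# Same CTRs on a tenth of the data: not enough evidence yet
print(ctr_gap_is_significant(5, 100, 3, 100))      # False
```

If the test comes back False, let the ads keep running before killing the “losers.”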

The bottom line

While there’s no one-size-fits-all QS solution, and no one knows exactly how the algorithm works, we do know you’re typically rewarded with good QS when you build strong account history with highly relevant ads that point to good landing pages.

If you adhere to the guidelines from the “layman’s 30-minute SEM audit,” develop single-keyword adgroups, and maximize traffic coming from exact-match keywords, that often covers half the battle when it comes to quality score optimization.

Source: http://www.searchenginejournal.com/how-to-conduct-a-quality-score-audit/30899/

Wednesday, July 13, 2011

Top 15 Free Things Every E-Commerce Website Should Do After Launching

Here is a list of the top 15 things every e-commerce owner/marketer should do after launching their new site. I am not including anything like keyword research, competitive analysis, on-site SEO, creating unique/keyword rich content, or anything else that is generally done before a site is launched. With that being said, here are the top 15…

1. Create a sitemap.xml file (and robots.txt file)

- This will help get your newly created site indexed by the search engines. You can link to the sitemap directly from your website, or upload it using Google Webmaster Tools, Bing Webmaster Central, and Yahoo Site Explorer. There are a number of sites online that will create a free sitemap for you (as long as your site is under a certain number of pages).
*Note – you may also want to set up a robots.txt file to keep search engines from crawling unwanted pages on your site. This is used to eliminate duplicate content, preserve “crawl space” (as I like to call it), and keep unwanted pages out of the index (such as your shopping cart page).
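For reference, a minimal sitemap.xml and robots.txt might look like the following; example.com and the paths are placeholders, so adjust the URLs and disallowed directories to match your own store:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/products/blue-widget</loc>
  </url>
</urlset>
```

```
# robots.txt — keep cart/checkout pages out of the crawl,
# and point crawlers at the sitemap
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Sitemap: http://www.example.com/sitemap.xml
```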

2. Set-up product feeds

- Setting up well-optimized product feeds can be tedious at first, but it is well worth the effort. The data feed can then be submitted for free to various places such as Google, MSN Shopping, TheFind, Price.com, myTriggers.com, & PriceSCAN. There are also paid services that operate on CPC and percentage-based pricing structures. Data feeds will earn you more traffic and, in turn, more sales. There is no good reason not to submit one to all of the free outlets.
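As an illustration, a bare-bones tab-separated feed might look like this. Treat it only as a sketch: each shopping engine publishes its own list of required attributes, and the SKU, URLs, and prices here are placeholders.

```
id	title	description	link	price	image_link	condition	availability
SKU-001	Blue Widget	A durable blue widget.	http://www.example.com/products/blue-widget	19.99 USD	http://www.example.com/img/blue-widget.jpg	new	in stock
SKU-002	Red Widget	Our best-selling red widget.	http://www.example.com/products/red-widget	24.99 USD	http://www.example.com/img/red-widget.jpg	new	in stock
```

One well-structured feed file can usually be adapted for several destinations with minor column changes, which is why the up-front effort pays off.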

3. Submit to DMOZ and other free niche directories

- The Open Directory Project, or DMOZ, is one of the best free links you can receive. Every new site (e-commerce or otherwise) should submit to DMOZ upon launch. Due to the incredibly large number of submissions, it may take a while (months) for your site to be included. Be patient; high-quality sites are usually included eventually. If you operate a micro site or an affiliate site – well, try your luck somewhere else.
- There are usually also a number of niche directories you will be able to submit your e-commerce site to. If your site is well designed and visually appealing, submit it to CSS galleries and design directories. If you sell Bar Mitzvah cards, try submitting to Jewish directories. Get creative with your searches and you should find some easy, free, and relevant directories well worth your time and effort.
*Note – Don’t forget to submit to free local directories!

4. Sign up for Google Analytics

- Google Analytics is a no-brainer for any website, e-commerce or not. It is completely free to use and is a very capable analytics program – sufficient for most sites online. All you need is a Google account; place the tracking code on your site and you are good to go.
*Note – Do not forget to set up goals and conversion funnels. Also remember to filter out your own IP address and the IP address of anyone else who works on the site. Lastly, you may want to take advantage of one of the newer features and enable page load time tracking.
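For reference, this is the asynchronous ga.js tracking snippet Google Analytics provided at the time of writing; it goes just before the closing head tag on every page, and UA-XXXXX-X is a placeholder you replace with your own web property ID:

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);  // placeholder property ID
  _gaq.push(['_trackPageview']);

  (function() {
    // Load ga.js asynchronously so it doesn't block page rendering
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```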

5. 301 Redirect your various homepage URLs to be consistent

- While this may seem like a small tweak, it can actually be the difference between your site ranking on the 1st page or the 4th. If your site URL is http://www.example.com but you also resolve http://example.com (along with other variations like /index.htm), you are creating duplicate content and spreading your link juice across multiple URLs. It’s standard practice to use a 301 redirect to consolidate all of these pages to one canonical URL. Which version you choose is up to you.
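For example, on an Apache server you could consolidate the non-www and /index.htm variants with a few lines of .htaccess. This sketch assumes mod_rewrite is enabled and uses example.com as a placeholder domain:

```
# .htaccess — force the www version of the domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Collapse /index.htm onto the homepage
RewriteRule ^index\.htm$ http://www.example.com/ [R=301,L]
```

If you prefer the non-www version, simply flip the condition and target. Other servers (IIS, nginx) have their own equivalents of a permanent redirect.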

6. Sign up for Google Webmaster Tools & Webmaster Central

- This step is incredibly simple, and it provides a variety of useful data points that will help you manage your site more effectively. You will be able to track backlinks, organic search clicks/impressions/CTR, average site load time, HTML errors, your most important keywords, and much more. All you have to do is register your site and verify it using one of the available methods – I suggest verifying through a meta tag or by uploading a file to your server. Simple, yet extremely useful. You may also wish to set up an account with Bing Webmaster Central – Microsoft’s version of Google Webmaster Tools.
*Note – Don’t forget to set your preferred domain, as well as your geographic targets.

7. Send out a press release

- Sending out a press release is another must. A simple introduction of your company, what you sell, what makes you unique, etc. is all it takes. While you can pay a lot of money to have a press release distributed, there are also a large number of sites that will distribute it for free (and some even let you include backlinks). Some of the most popular are PR.com, PRlog.org, OpenPR.com, Free-Press-Release.com, 1888PressRelease.com, i-Newswire.com, 24-7PressRelease.com, TheOpenPress.com, submissionsvalley.com, PRUrgent.com, pressreleaseprint.com, PressMethod.com, & of course eCommWire.com.

8. Start a free blog

- Blogging is a great way to engage your audience, establish yourself as an authority in your industry, network, and build quality backlinks. Use a free platform like Blogger or WordPress and you can have a company blog started in less than 30 minutes. You can also pay to have the blog hosted on your domain – either way, it’s up to you.
*Note – Don’t forget to set up an RSS feed for your blog!

9. Create social media accounts

- Nowadays it seems every business in every industry has social media accounts. While not every industry truly needs to be on Facebook, every e-commerce website should be. Create social media accounts on the big 3 – Facebook, Twitter, & LinkedIn. You may also want to create accounts on YouTube, Quora, Flickr, MySpace, Scribd, or any number of other social media sites related to your niche. For example, a retailer of baby products would want to make an account on CafeMom. Use your best judgment to determine which accounts are worth your time, and which aren’t. There’s no point in creating an account if you aren’t going to participate in the community and keep your information current.

10. Create social bookmarking accounts and start bookmarking your most important pages

- Social bookmarking is a great way to gain traffic and backlinks. While the links are not as authoritative as other sources, they will get your site indexed more quickly, and they are a great starting point for any e-commerce site that has just launched. The major players are Reddit, Digg, Delicious, Folkd, Fark, Diigo, Slashdot, & StumbleUpon. Wists, Kadooble, Fancy, & Nuji are also great social shopping sites that allow you to bookmark your products and gain traffic/backlinks. There are thousands more social bookmarking sites, but unless you have unlimited time and resources to dedicate to this, I suggest you stick with the big ones and actively participate. It’s better to have a few social bookmarking accounts that carry weight than hundreds that are neglected.

11. Sign up for HARO

- HARO (or Help a Reporter Out) is one of the best – if not the best – places online for free PR. It’s a completely free, subscription-based service consisting of three daily emails loaded with queries from various media outlets. It’s made up of journalists and bloggers looking for expert opinions, stories, and products. It’s a no-brainer for any e-commerce site. You have the opportunity to earn media coverage across a wide variety of outlets – from the largest to the relatively small.
*Note – There are similar, yet smaller, services that virtually mimic HARO – Reporter Connection, Pitch Rate, & Flacklist (think Facebook meets HARO), just to name a few. They are all free and definitely worth your time.

12. Set up Google Alerts to monitor brand mentions (& competitors)

- Setting up Google Alerts is as simple as can be. Visit google.com/alerts and enter the terms you want to be notified about whenever they appear in newly indexed web pages. This is great for monitoring brand mentions – not just your own, but also your competitors’ – as well as your most important keywords. You may find new opportunities or inspiration relatively quickly. Best of all, it’s free, and you don’t even need a Gmail account to use it.

13. Take advantage of $75 in free Google AdWords coupons

- There are many reasons for an e-commerce site to try Google AdWords at least once. It can give you a better idea of the actual search volume of specific keywords, show you which keywords your site performs best for, and let you test different ad copy, promotions, landing pages, and marketing messages – and you can get a free $75 credit when starting a new account! There’s no good reason not to give it a shot…

14. Sign up for and/or verify your local search pages

- If you are an e-commerce site with an actual brick-and-mortar storefront, you should claim and verify your local search pages. Google Places, Yelp, Yellowpages, CitySearch, SuperPages, InsiderPages, Merchant Circle, Bing Local, and Yahoo Local are the first places to optimize for. Local search is a completely different animal…but claiming and verifying your place pages is the first step to gaining additional, local traffic.

15. Inform all of your contacts about your new site

- This one seems like a no-brainer, but you would be surprised how many people simply forget to use their existing contacts to help spread the word about their new business. You can use email, phone calls, or traditional mail.
*Note – Don’t forget to link to your website and any relevant social accounts in all of your email signatures from launch date onward.

These last tactics came close…but didn’t make the cut – either because they were too advanced for the average e-commerce site owner, not a large enough priority right after launch, take too long to become effective, require too many resources, or are not strictly necessary for running a successful e-commerce site. Many other ways to market a new e-commerce site were also left out, simply because they cost money.
  • Register for an affiliate program (a free one)
  • Blog comments and Q&A site participation
  • Download free SEO toolbars for Firefox
  • Article marketing
  • Guest Blogging
  • Link request emails
  • Competitor link theft
  • Giveaways, donations and product reviews
  • Link bait
  • Forum participation
  • Submissions to general directories
  • Creation of a press list and contact of all the bloggers/journalists in your niche
Hopefully all of you e-commerce people out there have already knocked most of the items off this list…and if you haven’t – what are you waiting for? The sooner you get going, the sooner you start increasing traffic and making more money.