2010-07-27

gShift Labs - Product Review

Last week I had the privilege of visiting the offices of gShift Labs in Barrie, Ontario (ideally located downtown, right across the road from the main beach), where I was given a guided tour of their optimization tool by Chris Adams. gShift has positioned this tool for (and along the way coined the phrase) “web presence optimization”.

Now, I've personally known Chris and Krista (President of gShift Labs) since their earlier days with a start-up known as CGK, which developed the first SEO-friendly CMS, "Hot Banana". While their involvement with Hot Banana is behind them, they've taken their understanding of the needs of the SEO community and developed a new tool for the evolving industry. What makes this optimization tool different from others (such as Raven Tools or SEOmoz tools) is that it is multi-focused. gShift’s web presence optimizer isn’t just for helping you optimize a site from an SEO (search engine optimization) perspective, though it has components to help you do that. It goes beyond SEO and provides information on how social networking efforts are impacting website traffic (this requires integration with Google Analytics). Is your organization heavily invested in Twitter? gShift helps you track the impact of tweets through retweets and includes a component that integrates with your bit.ly account. In essence, gShift has (through APIs) integrated their product with most of the popular analytics tools for measuring social media. While their current set of tools for measuring social media may not be as powerful as the single-purpose Radian6 or Trackur, it does bring together multiple tools to provide one-stop reporting.

gShift also provides recommendations on where you need to focus your optimization efforts, be it a new press release, a blog post, or on-site factors (better title tags, h1 tags, etc.). You can even use its built-in scheduling feature to create a list of optimization "to-dos" and then have them appear as annotations on the various report screens as they are completed.

Over the next few weeks gShift will be releasing new features to further help site optimizers. These include a content submission utility to help submit your blog posts to social bookmarking sites (such as Digg) and Twitter, as well as an integration with WordPress.

The one thing that really separates gShift from other tools is its reporting. While most optimization tools generate great reports for the technical team, those reports are too technical for senior management, forcing people to cut and paste various components into formal senior-level reports. gShift’s reports are designed to distill the complex data used by the technical team into easy-to-understand management reports that any VP can follow without hand-holding.

Overall this looks like a great product. I was also introduced to a series of planned upgrades roughly scheduled for the next 6-12 months. While I can say WOW and that I can’t wait for them, I’m under an NDA and can't talk about them at this time.

While I don't have any screen shots at this time, if you're interested in seeing this product for yourself, be sure to visit their site and sign up for one of their webinars (http://www.gshiftlabs.com/).

2010-07-13

SEO & Stop Words - Writing for Search

Over the years, while teaching various Search Engine Optimization (SEO) courses, I've been asked many questions about writing SEO copy, or, as I prefer to call it, writing for search.

One thing I always stress is that you write first for your human users (your real audience) and second for the search engines. This is where an internal battle usually starts about how it should only be for the human users and never for the search engines. To which I reply: fine, but without the search engines your human audience is going to be a lot smaller. Once we get over this hurdle, our course conversation usually turns to the issue of duplicate content.

Yes, duplicate content: how all copy on a website/blog has to be original, and what that truly means. I try to explain that Google strips out all the code and looks at just the words in the copy. It then compares them to other copy in its index and has to decide whether the content is original or duplicate and, if duplicated, which version to include in its index. Not a simple task.
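To make the idea concrete, here is a toy sketch in Python of how that kind of comparison can work: strip the markup, keep only the words, and compare overlapping word sequences ("shingles"). This is just an illustration of the concept, not Google's actual algorithm, and the sample pages are made up.

import re

def extract_words(html: str) -> list[str]:
    """Strip tags and return the lowercase words in the copy."""
    text = re.sub(r"<[^>]+>", " ", html)          # crude tag removal
    return re.findall(r"[a-z']+", text.lower())   # keep just the words

def shingles(words: list[str], n: int = 3) -> set[tuple[str, ...]]:
    """Break the word stream into overlapping n-word sequences."""
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(html_a: str, html_b: str) -> float:
    """Jaccard similarity: 1.0 = identical copy, 0.0 = nothing shared."""
    a, b = shingles(extract_words(html_a)), shingles(extract_words(html_b))
    if not (a or b):
        return 0.0
    return len(a & b) / len(a | b)

page1 = "<h1>Widgets</h1><p>Our widgets are the best on the market.</p>"
page2 = "<div>Our widgets are the best on the market, bar none.</div>"
print(f"similarity: {similarity(page1, page2):.2f}")  # high score = likely duplicate

Anything scoring near 1.0 would be a strong duplicate candidate; the real systems are far more sophisticated, but the shape of the problem is the same.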

While most of us think duplicate content only occurs when articles are syndicated or when you incorporate RSS feed content, etc., there is another, legitimate reason why you might end up with duplicate content and why Google shouldn't punish you for it (though they might). This happens with big organizations that run country- or region-specific websites. For legal, marketing, or simply management reasons, it is very possible for an organization to run multiple unique sites (unique URLs): for example, several English-language sites targeted specifically at the USA, Canada, the UK, Australia, New Zealand, etc. (the same, of course, is possible for other languages).

The reality is that when developing sites in this configuration, and to reduce costs, internal documents (white papers, corporate profiles, product pages, etc.) are frequently just duplicated. So how can you avoid the possible SEO penalties of duplicate content? By tweaking the content of these pages. Yes, you can and should adjust the words to reflect the cultural and linguistic subtleties of each region, but that usually won't be enough. You should consider at least rewriting the introductory paragraph and a few lines throughout the article. Always take into account the surrounding text on the page (sidebars & navigation): is it unique to each site? Of all the words on the page, about 20% should be unique to avoid possible duplication penalties.
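If you want a rough way to sanity-check that 20% rule of thumb between two versions of a page, a quick word-level comparison will do. The percent_unique helper below is purely hypothetical, a back-of-the-envelope check, not anything the search engines publish.

def percent_unique(words_a: list[str], words_b: list[str]) -> float:
    """Share of words on page A that never appear on page B."""
    vocab_b = set(words_b)
    unique = [w for w in words_a if w not in vocab_b]
    return 100.0 * len(unique) / len(words_a) if words_a else 0.0

us_page = "our widget ships free anywhere in the united states".split()
ca_page = "our widget ships free anywhere in canada and its territories".split()
print(f"{percent_unique(ca_page, us_page):.0f}% of the words are unique")  # 40%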

One technique that many people try, and fail with, is merely changing some of the connective words, for example changing "also" to "furthermore". These simple changes don't affect the meaning of the content and make life easier. Unfortunately, the algorithms of search engines like Google and Bing are programmed to ignore these words, along with basic words such as "a", "but", "the", etc. (For a detailed list of stop words see: http://www.link-assistant.com/seo-stop-words.html.)
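You can see why this fails by running the two versions through a stop-word filter yourself. The stop list below is a tiny sample (see the link above for a fuller one), but it makes the point: once the connectives are stripped, the "reworded" copy is identical to the original.

STOP_WORDS = {"a", "also", "an", "and", "but", "furthermore", "in",
              "is", "of", "the", "to"}

def strip_stop_words(text: str) -> list[str]:
    """Return the words a stop-word-aware engine would actually keep."""
    return [w for w in text.lower().split() if w not in STOP_WORDS]

original = "The widget is fast and also easy to install"
reworded = "The widget is fast and furthermore easy to install"
print(strip_stop_words(original) == strip_stop_words(reworded))  # True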

So, if you have to re-purpose content for a legitimate reason, then re-purpose it, but budget a little time to make a few reasonable changes that will keep the content reasonably unique.

2010-07-06

Prep for the Bing bot

Last week Microsoft announced that it is going to unleash a new search engine bot, called "Bingbot", within the next few months (currently scheduled for Oct. 1): http://www.lbi.co.uk/blog/bing-to-launch-bingbot/.

For the many who use JavaScript-based web analytics tracking tools (i.e. Google Analytics, Omniture, Webtrends OnDemand, etc.), this new bot will go unnoticed and have no impact on your web analytics reports. However, for those out there who are using traditional server access logs (web logs), it's time to prepare.

By now you should have a bot filter that removes most non-human (bot) traffic from your primary analytics reports. Don't wait until the new bot is out there to update your list; update it now.

The new bot's name is "msnbot/2.0b" and it will replace the old "msnbot". Consider this an addition to your list, not a replacement. If Microsoft follows its old pattern when renaming bots, or the behaviour of other search engines, there will be at least a couple of days of overlap where your site could be visited by both. Keeping both names in your filter means your reports will remain unaffected when the new bot comes into play.
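For those filtering raw access logs, a minimal sketch of that kind of bot filter might look like the following. The log file name and the bot list are assumptions; substitute your own log path and whatever signatures your reports already exclude.

# Hypothetical bot filter for raw access logs; adjust to your own setup.
BOT_SIGNATURES = [
    "msnbot",      # matches both the old "msnbot" and the new "msnbot/2.0b"
    "googlebot",
    "slurp",       # Yahoo!'s crawler
]

def is_bot(log_line: str) -> bool:
    """True if the log line's user-agent mentions a known bot."""
    line = log_line.lower()
    return any(sig in line for sig in BOT_SIGNATURES)

with open("access.log") as log:        # assumed log file name
    human_hits = [line for line in log if not is_bot(line)]
print(f"{len(human_hits)} human (non-bot) requests kept")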