Cookie consent law is breathtakingly stupid

For years I've written about why the public shouldn't be afraid of cookies, except for the calories in the ones people eat. Despite my efforts, fear has continued to grow, in part spurred on by anti-virus software companies that flag tracking cookies.

What most people don't get is that effective ecommerce web sites need cookies to operate. Web analytics software needs cookies to create more accurate statistics of user behavior, which helps continually improve a site's usability and refine marketing campaigns to better convey the message to the target audience.

So I'm shocked, but not surprised, that the EU is passing a law that will effectively ban the use of cookies on EU-based web sites without specific user permission.

For all the details on this new law and its stupid impacts, please read http://www.techradar.com/news/internet/cookie-consent-law-is-breathtakingly-stupid-648740 at TechRadar.com.


Tracking Phone Calls in WebTrends

About 8 weeks ago I was contacted by Mongoose Metrics.

The purpose of this contact was twofold. First, they wanted to demonstrate their product. Mongoose Metrics is a web/marketing analytics tool for tracking on-line phone calls and attributing them back to the referring on-line marketing campaign.

The product works simply: by adding some JavaScript to your page and a bit of tagging. The result is that each visitor to your site sees a different toll-free number. This number is recorded in a cookie along with information about the referrer. Visitors who come directly to your site see the default number.

The cool thing happens when a web site visitor decides to call because they need more information or want to place the order over the telephone. The software then very cleanly captures all the referrer information and displays it to you in an easy-to-read report format. With a bit of advance set-up you can even break it down by different marketing programs.
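The flow described above can be sketched in a few lines. To be clear, this is my own illustration, not Mongoose's actual code: the function names and the toll-free number pool are hypothetical, and the dictionary stands in for the cookie/session store a real service would use.

```python
# Hypothetical sketch of the number-swapping and call-attribution flow.
# Names and the number pool are illustrations, not Mongoose's API.

DEFAULT_NUMBER = "1-800-555-0100"

# A pool of toll-free numbers, each temporarily bound to one visit.
NUMBER_POOL = ["1-800-555-0101", "1-800-555-0102", "1-800-555-0103"]

def assign_number(referrer, assignments):
    """Pick a toll-free number for this visit and remember the referrer.

    `assignments` maps number -> referrer info, playing the role of the
    cookie/session store a real service would use.
    """
    if not referrer:                   # direct visit: show default number
        return DEFAULT_NUMBER
    for number in NUMBER_POOL:
        if number not in assignments:  # first free number in the pool
            assignments[number] = referrer
            return number
    return DEFAULT_NUMBER              # pool exhausted: fall back

def attribute_call(dialed_number, assignments):
    """When a call comes in, look up which referrer drove it."""
    return assignments.get(dialed_number, "unknown / direct")
```

So a visitor arriving from a Twitter link sees (say) the first pool number, and when that number is dialed the report can show Twitter as the source.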

Now the exciting part and the second reason Mongoose contacted me. They wanted to see if it was possible to integrate their data automatically into WebTrends. After a bit of trial and error, I developed the framework for the integration and Mongoose quickly developed the code on their end to make it possible.

So what does it mean to the average WebTrends web analytics team? With a bit of effort and the creation of some WebTrends Custom Reports, you can now know which campaigns (and any variable within your campaign translation file) drove people to call in. This conversion point is no longer a mystery.

During our tests we were able to successfully monitor calls generated by a Twitter post (tweet), a link in a Facebook status, and traffic from organic and paid search, including which keyword phrases generated the calls.

The method was successfully tested with both WebTrends OnDemand and a locally installed version of WebTrends (you must be using the SDC feature). Beyond passing a conversion point, the configuration also allows you to assign a dollar value to each call. While this value is fixed in your configuration right now, plans are being made to make the value of a call arbitrary and under the control of the person who took the call. This feature is still under development and no timeline has been provided. However, even with an average call value assigned, better-quality ROI calculations can be made to better optimize your campaigns.

Below are some sample reports from WebTrends:

Paths to and from the call

Orders By Campaign ID (in this case the order is a phone call)

We are still conducting tests, but I expect this feature to be available to the general public before the end of the year. If you'd like more information, please let me know and I'll be glad to make the appropriate introductions.


Getting Ready for Pubcon

It's hard to believe it's been a year since the last PubCon marketing conference in Las Vegas. To me this is one of the best organized and informative conferences of the year and I start looking forward to it as soon as the last one is finished.

What makes this conference so great? Is it the location, the quality of the speakers, or something else? As for location, unlike many others, I'm not a fan of Las Vegas for a conference. Yes, this year hotel deals are plentiful, but all those neon lights give me a headache.

So it must be the great quality of speakers at the event. Yes, over 3 days of presentations the sheer number of speakers can be overwhelming. I personally will be speaking during two sessions, and the quality of the speakers is as good as anywhere. While you might find all of us at one conference, most of the speakers also speak at other quality events throughout the year. So what is the big deal?

What I see in the halls between sessions is something you don't usually see: people talking to other people, exchanging ideas, making friends and network connections. It's through these new friends and connections that the benefit of the conference continues to grow throughout the year. I attribute this loose, casual, friendly environment to the conference organizers and their ability to get everyone in the right mood.

So with just two weeks before the opening keynote, if you haven't booked your ticket and registered, time is running out. To those of you who know you'll be there, be sure to tap me on the shoulder in Vegas and say hi.


Best Search Engine for Canada

According to Hitwise, as reported by Search Engine Watch today, the best-quality search engine in Canada is Bing, despite being ranked 6th in popularity.


The primary reason given is that Bing is doing the best job of localizing search results. Through our own tests at K'nechtology, we've seen very little difference in search quality from our Toronto office. So is Hitwise's data wrong, are our tests wrong, or has Hitwise simply assumed something about result quality based on user behavior?

If you read the article you'll see it's based on how many searches a person does before stopping. This assumes that the person found what they wanted when they stopped. Here is where I believe they've made some wrong assumptions.

1. Bing is new and lots of people are checking it out. This means "let me do a search on Bing for something I search on a lot and let's see what the results are." After an initial search or two, when they don't see a significant difference between Bing and Google, they return to Google as their primary search engine. Hence Bing is ranked number 6 in terms of popularity.

2. While Bing may have a slight advantage in localized search, from this article it still doesn't appear that Canadians are doing long query strings; they're still sticking to 1-2 word search queries. If a query includes a geo-location, then perhaps Bing is giving better results, but the study did not analyze how many people include a city or province in their search. From the various client data that we've analyzed, even when a site ranks extremely well for an organic search phrase that includes a geo-location, the vast majority of organic search traffic comes from phrases that don't include one. So in my opinion this localization benefit of Bing is marginal at best.
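The check I'm describing is easy to run yourself if you can export your organic keyword report: what share of phrases actually contain a geo-location? Here's a rough sketch; the phrase list and the set of geo terms are made-up illustrations, not real client data.

```python
# Rough sketch: what fraction of organic search phrases include a
# geo-location term? GEO_TERMS here is a tiny illustrative set; a real
# analysis would use a full list of cities and provinces.

GEO_TERMS = {"toronto", "ontario", "vancouver", "montreal", "canada"}

def geo_share(phrases):
    """Return the fraction of search phrases containing a geo term."""
    if not phrases:
        return 0.0
    with_geo = sum(
        1 for p in phrases
        if any(word in GEO_TERMS for word in p.lower().split())
    )
    return with_geo / len(phrases)
```

If that fraction comes back small, as it has in the client data I've seen, then a localization edge matters for only a sliver of your organic traffic.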

3. Are people simply giving Bing the once-over, doing a single or secondary search and saying "so what, I'll stick to what I've been using"?

Ultimately, it is going to be up to users to try Bing, see if they like the results, and then switch their default search engine. For that to happen, Bing has to start promoting itself here in Canada. In the US, I've seen TV commercials, heard radio ads and seen print ads. In Canada, next to nothing, except at some industry trade shows. Unfortunately, this holds true for all the major search engines.

Canada's population is comparable to that of California, yet from my observations the amount of money spent by the major search engines here in Canada to promote themselves is a minor fraction of the California spend. Attention all search engines: if you want to take a bite out of Google's market share here in Canada, then start promoting yourselves with something more than word of mouth.


WebTrends 9

Two weeks back, WebTrends announced version 9.0. As with all recent upgrades, this version is only being released to users of WebTrends OnDemand. The good news is they expect to release version 9 for the installed version by late this quarter or early October.

This approach is both good and bad for users of the software version. The bad part is obvious: we don't get our hands on this latest version, which holds so much promise and should make all of our reporting that much more complete.

On the other hand, when it is released, we can be assured that it will be relatively bug free and stable.

My clients really do appreciate the reduced frequency of minor updates, as they can concentrate on the numbers and not a parade of upgrades and patches (unlike other products). I, on the other hand, feel that in many instances users of the software version are being punished, as they do not always get some of the improved reporting. It is one thing not to release the a, b, c builds of the same version, but not to release intra-version releases (i.e. 8.6) makes me wonder. My best guess is that WebTrends is really trying to get all users to migrate to the on-demand version. While I don't have any issues with it and have several clients using it, for many it is simply not an option based on corporate policies and legal requirements about where data is stored.

Either way, I'm looking forward to WebTrends 9 full release.

PubCon and more

If you've been following my near-daily tweets (http://twitter.com/aknecht), you'll know that I'm speaking again this year at PubCon. PubCon, which will run from November 10 - 13 in Las Vegas, is by far the best networking industry conference I've ever attended. The combination of numerous sessions and what seems an endless number of nightly events makes it ideal for making industry friends.

PubCon is first and foremost a search marketing conference put on by the people who bring us WebmasterWorld. This year, from what I've heard, the response to the conference has been fantastic despite the economy. By the end of July (the end of the first tier of early-bird registration) they were well ahead of previous years. The number of people submitting proposals to speak at the conference has also increased, making the work of the conference organizers that much harder. Only over the past few days have some sessions been announced. I'm still waiting to hear what I'll be speaking on. I'm hoping to be on one of the Web Analytics sessions, but might just end up in an SEO and Design session like I have been for the past two years. With some luck, perhaps all the sessions will be announced by the end of the month.

If you've thought about this conference in the past this may be your best opportunity. Hotel costs in Las Vegas are the lowest they've been in years. This year I'm staying at the same hotel as last year and the room rate is 40% lower.

Beyond PubCon, I've also been booked to speak at a Social Network/Web Marketing workshop series in my home city of Toronto called "Executing Social Media". This event is scheduled to take place on October 28th and 29th. The details are still being finalized, and I expect their web site and registration system to be ready within the next week or two. More on this event as the details (location, registration, price) are announced.

Either way, if you're at either of these events, please tap me on the shoulder and say hi.


AT Internet Training & Certification

Last month my company K'nechtology Inc. became the first Canadian partner for AT Internet. AT Internet provides a high-end web analytics tool which competes in the market space currently occupied by such giants as Omniture Site Catalyst and WebTrends.

For the past 4-5 weeks I've been giving it a trial run and loving it. Today I started a 4-day training course that dives deep into the product, as part of becoming certified as an expert on it. The course will run until Thursday, after which my firm and I will be able to provide product support for AT Internet's web analytics product as well as become an authorized reseller.

I'll be posting screen shots of some of the cooler things it does in a day or two. For right now, consider that this product can report how far down a page your visitors scroll (especially useful for analyzing longer pages to see if people read to the bottom), and can provide a five-point image called a "radar graph" which displays 5 KPIs on a single image for easy segmentation analysis.

More on this product over the next couple of days.


Search Engine Strategies - Toronto

For the past two days, I was totally engrossed in the Search Engine Strategies (SES) Toronto show. My hat goes off to this year's conference organizers at Incisive, who truly came through, taking all the negative criticism from the past couple of years and turning it around.

The change from the Toronto convention centre to the Sheraton Centre was by far the biggest improvement. No longer were attendees and speakers subjected to the dungeon-like bottom floor of the convention centre; instead we were treated to a much warmer environment, even if it was all virtually on the bottom floor as well. Perhaps it was just the lighting and the cushy carpeted floors, but it just worked better, and there were no (or at least I couldn't find any) black holes where your cellphone wouldn't work.

As usual, the quality of speakers was fantastic, and Andrew Goodman, who chaired this event, did an excellent job developing the session topics. Granted, many of the sessions were very basic, but there were a few sessions, and frequently at least one panelist, that took it to the next level.

One change I did notice this year was the background of the attendees. In past years there were many do-it-yourself business types at the event trying to figure out how to do SEO and PPC on their own. This time around there were the usual independent SEO consultants and some agency types, but many more people from medium to large businesses who are being tasked with implementing SEO or paid search campaigns.

But no conference takes place without a few criticisms. The biggest one I heard, and agreed with, was the location of one of the 3 session rooms. It was far away (about a 5-minute walk) from the main area. Attendees and/or speakers not paying attention to which room they needed to go to were frequently seen rushing to these rooms or getting there late. The distance from the main conference area to the speakers room had the same problem. While this was an inconvenience for most speakers, on the positive side it forced many of the speakers who usually disappear into the speakers room to remain in the conference area, opening up the lines of communication with all the attendees.

There are numerous blog posts and tweets on the session specifics, so I won't take up any more of your time. Simply search on SES Toronto for the blog posts, or #sesto on Twitter, for more information.


An Interesting Project

Since late February I've been involved in an interesting SEO project with a branch of the British Government. In essence, I've been mentoring teams at 3 different institutions on their SEO efforts as part of a pilot project.

The project has the following steps:

1. Basic SEO Training
2. A technical review of their web sites along with recommendations (conducted by me)
3. Implementation of any technical changes (implemented by their developers)
4. Identification and recommendation of specific keyword phrases (joint effort)
5. Either the modification of existing pages or the creation of new content in support of the keyword phrases.
6. Monitoring of page ranking with on-going recommendations
7. Monitoring of impact on web traffic until the end of June

A couple of interesting things have come out of this project so far. First off, at the time of the keyword research, when the amount of time involved became clear, one of the three institutions dropped out of the pilot project, leaving behind 2 participants: CETLD and Archive Hub.

Fortunately, the remaining two organizations have given it their all and results have started to appear within 1 or 2 weeks of making the various adjustments.

In one case, CETLD chose low-volume keyword phrases, but ones that were high quality. These included the phrases "Learning Spaces" and "student centred learning". In both cases, CETLD started ranking in the top 100 within days of launching new content and has slowly been climbing up the SERPs.

In the case of Archive Hub, it was decided to go after some very high-volume and very competitive phrases. These include "history of railways", "history of fairgrounds" and "history of textiles". Once again, a wide variety of changes were required, but as with CETLD, they are slowly climbing the SERPs.

The biggest issue we've faced on this project so far has been time. Different civic holidays on both sides of the Atlantic, plus the usual paperwork involved in getting anything done in a large organization, have taken their toll and made getting the optimized pages published a little more difficult than expected.

What is interesting with both projects is measuring the impact on web site traffic. Both sites have very seasonal traffic trends, plus specific topics are being marketed externally, which might be having an impact on the total amount of search traffic they are targeting. The findings will make for an interesting write-up next month.

SES Toronto

In just over two weeks it will be time again for this year's Search Engine Strategies (SES) conference here in Toronto (June 8-10). Despite the economy, I expect a good turnout for this event.

If you're contemplating going but are a bit unsure, keep this in mind: based on this year's agenda, the majority of the talks will be aimed at people just getting involved in search engine marketing through to those at the intermediate level.

There are a lot of great speakers lined up for this event, so I do hope to see you there.

My Apologies

I'm using Blogger as my blogging software, and while I haven't posted anything for about 6 weeks, I just noticed that several of my last posts, from late February until early April, are all missing.

It really isn't practical for me to dig through my files and resubmit them, but I'll be keeping a closer eye on these posts to ensure whatever happened doesn't happen again.


WebTrends Releases New Tag Builder

WebTrends announced today the release of a second-generation tag builder. The tag builder is used to generate customized SDC (Smart Data Collector) JavaScript to be added to web pages for JavaScript-based web analytics, for both WebTrends OnDemand and the installed version of WebTrends.

Details of the changes can be found at http://blog.webtrends.com/2009/02/24/new-verison-of-tag-builder-available/

Of note, you no longer have to tag Google AdWords ads as paid search, as WebTrends can now use the Google-generated campaign ID, assuming you have "Autotagging" enabled for your Google AdWords campaigns.
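For those wondering what "using the Google-generated campaign ID" looks like in practice: with auto-tagging enabled, AdWords appends a gclid query parameter to your landing URLs, and that parameter is what an analytics tool can key off instead of manual campaign tags. A minimal sketch of detecting it (the function name is my own, not a WebTrends API):

```python
# Sketch: detect Google's auto-tagging parameter (gclid) on a landing
# URL. With auto-tagging on, AdWords clicks arrive with this parameter
# appended, so no manual paid-search tagging is needed.

from urllib.parse import urlparse, parse_qs

def is_autotagged_adwords_click(landing_url):
    """True if the landing URL carries Google's auto-tagging parameter."""
    params = parse_qs(urlparse(landing_url).query)
    return "gclid" in params
```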

I'll be testing the new tag builder within the next few days and will post any problems I find.


Shortened URLs - TinyURL

With the growing acceptance of Twitter into mainstream use, the use of services to shorten lengthy URLs is growing. There are many of these free services out there on the web. Popular ones include:
The question becomes: how do these services affect your web analytics reports? So let's start with the basics.

First off, if someone is using a Twitter application (for example, TweetDeck), anyone clicking on a full and proper link would appear as direct traffic. The only way to properly identify these users is to use your web analytics campaign tracking feature.

If someone were to click on a shortened URL on a web page, how the referrer is handled varies by both the shortening service (most services do, however, work the same way) and your web analytics software.

For example, using TinyURL to generate a link to my company web site www.knechtology.com, the shortened URL would appear as http://tinyurl.com/dfjood and would generate the following sequence of hits:

GET 301 Redirect to http://www.knechtology.com/ http://tinyurl.com/dfjood
GET 200 text/html; charset=ISO-8859-1 http://www.knechtology.com/

The first hit might appear in TinyURL's web log files, or possibly not, depending on how they handle the 301 redirect. Regardless, nothing is recorded in my web site's log files until the second hit.

Now the big question is: will this hit appear with a referrer of tinyurl.com or not? For that you need to check your log files. In my experience, at best it will appear as a referrer from the shortening service. That's why, if I'm posting a link on Twitter back to my web site, I always like to add a campaign ID to the URL.

I personally don't care which Tweet generated the traffic, so here at K'nechtology we have set up a standard one. If you want to distinguish your Tweets, you'll need to generate unique campaign IDs for each post.
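Tagging the URL before you shorten it is mechanical enough to script. Here's one way to do it; WT.mc_id is WebTrends' campaign-ID query parameter, while the campaign value itself is a made-up example.

```python
# Append a campaign ID query parameter to a URL before handing it to a
# shortening service. WT.mc_id is WebTrends' campaign parameter; other
# analytics tools use their own parameter names.

from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def add_campaign_id(url, campaign_id, param="WT.mc_id"):
    """Return `url` with a campaign ID appended to its query string."""
    parts = urlparse(url)
    query = parse_qsl(parts.query)      # keep any existing parameters
    query.append((param, campaign_id))
    return urlunparse(parts._replace(query=urlencode(query)))
```

The tagged URL is what you feed to TinyURL; that way, even when the referrer is lost or shows up as the shortening service, the campaign ID survives into your reports.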


WebTrends Update

WebTrends has released an update for those of us who have the software version. This update includes the latest search engine list, now including Twitter, and an updated browser list.

It is simple to download and fast to upgrade.

To download or for more information go to:



SMX Analytics

As I mentioned before, starting on March 31, 2009, SMX will be holding its first SMX Analytics conference in Toronto, Canada. At this time, I'm thrilled to announce that I'll be speaking at this event on March 31 during the "Analyzing & Converting Organic Search Traffic" session.

This promises to be a great event, and remember that eMetrics will be on at the same time in Toronto as well. A great chance to take in two different web analytics conferences.


Twitter Imposters - Watch Out

Over the past few days I've received about 10 follows from various people on Twitter. I found it strange that, when reviewing their past Tweets, all had the same single tweet. I was getting suspicious, and then today I got a follow from someone I knew (and followed), but the Twitter ID looked different.

Upon checking out the Twitter account, I knew it wasn't him. So I decided I was going to investigate and write a detailed post about it. During my investigation I came across this blog post, which covers the problem in great detail: http://news.cnet.com/8301-13515_3-10146731-26.html.

So be aware that like all good technology Twitter is being exploited by those who want to trick us to make a buck.


K'nechtology or Connectology or Konnectology

How do people spell your company name when they have only heard the name and have no idea how to spell it? Believe it or not, this is a very common problem relating to search. For example, my company's name is K'nechtology, which is pronounced Connectology.

On Tuesday, during a series of 10 radio interviews, the name of my company was given out, and only once was the spelling given. This means that anyone who wanted to check out the company's web site for more information on the company or on myself had to guess at the spelling as they used Google or any other search engine, and would most likely first spell it "connectology". While you might think this won't happen to you or your firm, try asking a stranger to spell your company name or product. I've seen this problem on many occasions.

Here are some solutions that we've come up with over the years:
  1. Register multiple domains and set-up micro web sites that explain the situation and forward the user to the proper web site (hard to do for us when connectology.com is already taken by a legitimate company in the UK);
  2. Write a humorous web page that pokes fun at the various spellings of your company or product name;
  3. Leverage your corporate or personal blog (just like this post) and write a post about the situation;
So if you heard me on CBC radio on Tuesday, and are looking for "connectology" or "Konnectology" or "alan connect" or "allan connect" or "allen connect" or "alen connect" or even possibly "alan Konnect", take comfort that you are in the right place, and you can find my company's web site at www.knechtology.com.


Google Virus - Good Luck For Me

For the past couple of days, a virus that I'll call the "Google virus" has been slowly spreading. The impact of this virus is that it hijacks your Google search results and redirects you to a spam site.

Fortunately, I didn't get hit by this virus, but a call from CBC radio here in Toronto alerted me to it. For a brief moment it looked like CBC Toronto was being infected. The good news was they weren't; they had just tripped a red flag at Google by having a large number of employees conduct lots of searches in a very short time period. Google just wanted to verify that the searches were indeed coming from humans and not machines, so users were prompted with a CAPTCHA screen. You know, those screens where you have to read a bunch of garbled letters and type them in.

So what's the good news about this virus? Well, I did my research, and as a result of helping CBC radio out, I'm being interviewed today across Canada on different local CBC radio stations about Google and viruses. I do need to tip my hat to Twitter, as it allowed me to isolate the virus and the details of what was going on.