Over the years a number of solutions have been developed. Some of these worked reasonably well and included using third-party tracking tools or creating a redirection page or script. Either option required a middle layer that users would be directed to first and that would then redirect them to the external site. This approach, while it worked, created a point of failure and potentially a bad user experience.
With the introduction of JavaScript-based web analytics tracking (WebTrends SDC, Google Analytics, Omniture, etc.), it became possible to tag the HTML code (specifically the "a href" tag) with an onclick or onmousedown function that executes the tracking JavaScript file and passes a series of variables to the data collection server. While this method works, it requires additional web site programming, which generally means IT resources and testing.
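To make the manual approach concrete, here is a minimal sketch of this kind of link tagging. The collector hostname, the `dcsuri` parameter name and the function names are illustrative assumptions for this sketch, not the actual WebTrends API.

```javascript
// Build the tracking request URL; kept as a pure function for clarity.
// The collector hostname and "dcsuri" parameter are hypothetical.
function buildTrackingUrl(clickedUrl) {
  return 'https://collector.example.com/dcs.gif?dcsuri=' +
    encodeURIComponent('Offsite:' + clickedUrl);
}

// Fire the request via the classic tracking-pixel technique: the
// browser requests the image URL and the collection server logs it.
function trackOffsiteClick(clickedUrl) {
  new Image().src = buildTrackingUrl(clickedUrl);
}
```

Each external link then has to be tagged by hand, e.g. `<a href="http://external.example.com/" onmousedown="trackOffsiteClick(this.href)">...</a>`, which is exactly the per-link programming effort described above.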
With their new SDC JavaScript (released earlier this year), WebTrends has come up with a way to simplify this whole process. When building your SDC JavaScript using the WebTrends Tag Builder, simply select "Off-Site Links" as an option under the "Event Tracking" tab and generate your SDC script.
Once the script is uploaded, everything works perfectly, and clicks on off-site links will appear in the Top Pages report with the title extension "Offsite:URL".
At last, something made simple. And please remember, if you need help with your WebTrends configuration, feel free to contact my company, as that is one of our specialties.
This is a question that I'm frequently asked by users of various web analytics products, so I thought I'd post my response for others.
Direct traffic (as reported by WebTrends) comes from the following sources:
1. People entering the web site URL directly
2. People who bookmarked the web site and then clicked on the bookmark
3. People clicking on a link in an email
4. Other possible sources for links that are not part of a web site
5. People who choose to hide the referring web site information (a setting now available in some browsers)
The case where the web site is listed as its own referrer is a bit harder to explain, but here is my explanation:
Every page view has a referrer, so when someone browses from one page to another within the site, the site is itself the referrer.
Normally, these should not appear in a referrer report (many analytics products strip this data out), but WebTrends does show this data under certain circumstances. What self-referring traffic generally means is that someone on the site whose visit session had timed out (they stayed on a page for more than 30 minutes) then clicked to a new page. A new visit was therefore initiated, and that visit's referrer is now the site itself. There are a couple of other possible causes, but this is the easiest one to explain and understand.
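The timeout rule can be sketched in a few lines. This is an illustrative model of sessionization, not WebTrends' actual code; only the 30-minute window comes from the explanation above.

```javascript
// Illustrative sessionization rule: a gap of more than 30 minutes
// between two page views starts a new visit, and that new visit's
// referrer is whatever page linked to it -- for an internal click,
// that is a page on the same site, producing a "self-referral".
var SESSION_TIMEOUT_MS = 30 * 60 * 1000;

function classifyPageView(prevView, currentView) {
  var isNewVisit = !prevView ||
    (currentView.time - prevView.time) > SESSION_TIMEOUT_MS;
  return {
    newVisit: isNewVisit,
    // Only a visit's first page view sets the visit referrer.
    visitReferrer: isNewVisit ? currentView.referrer : null
  };
}
```

For example, a page view 45 minutes after the previous one, reached via an internal link, starts a new visit whose referrer is the site's own page.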
This is all handled through what is called "Google SearchWiki". So now you have the ability to move specific results up or down and even delete them. On top of that, you can leave comments next to a listing for the whole world to see, forever.
Now this might seem like a good idea, and it is part of what I mentioned in a previous post on why search ranking is going to become meaningless. So is that day here now? Is Google on the right track? Will it last?
Let's tackle these questions one at a time.
1. No, the day of search rankings being meaningless isn't here yet. Despite the ability of users to customize their results, how many will? How many even know they can, and why would they want to? Sure, you might want to delete your competitor from your results, but then how would you know what they are up to?
2. Is Google on the right track? That's hard to say. The feature most certainly has a cool factor to it, but what happens if I accidentally delete a listing? How do I get it back? How many users have to delete a listing or move it down in their results before Google adjusts the natural listing? Are listings now susceptible to attack by larger competitors who can have all their staff delete specific listings? Of course, we SEOs could use it to fake search results to show what a good job we did. Just kidding, of course.
3. Will this last? I don't think it will be going away too soon, but I can't really see people jumping on board. How many of us search on the same topics time and time again, such that we would really want to customize our listings? There are a few subjects that I do conduct regular searches on when looking for new content. Personally, I would have preferred it if Google had simply allowed me to sort the results by publishing date.
The last option, adding comments, does seem like a good idea. Let's see: I like what I found at the other end of the listing, so I go back to the search again and write a favorable comment. Who has time for that? The site at the other end doesn't match what I want, so I hit the back button immediately and add a nasty comment (somewhat likely). Now there are going to be hundreds if not thousands of comments next to something I want to click on; do I have time to read all the nasty stuff people have left? Hmmmm.
So let's give it some time and let's see if it will last.
All those who know me know that I don't jump quickly on the latest and greatest Internet gadgets. This is not to say I am slow to adopt good ideas; I just need someone to show me the practical side of them for business. And this brings me to my Twitter experiment.
If you have never heard of Twitter (www.twitter.com), you are not alone. Yet this simple tool is quickly gaining a foothold among teenagers, socially active techies and most top-end bloggers. For those who don't know what it is, here is the simplest definition: it is a micro-blogging tool that limits your posts to a mere 140 characters. As with any social networking tool, people follow your Twitter posts (Tweets) and you can reciprocate by following theirs. My reluctance toward Twitter was more about what I had to Tweet about and who would want to follow my Tweets.
Two weeks ago I was in the UK at a conference, and several people asked me what my Twitter ID was, to which I had to answer, "I don't tweet". I was greeted with looks of disappointment. So upon my return to Toronto, I quickly set up my Twitter account (@aknecht - http://twitter.com/aknecht/) and started my experiment. Here's how it went.
Step 1 - I linked my Twitter updates to my Facebook profile status (Friday)
Step 2 - I added Twitterberry to my BlackBerry for easy updates (Sunday)
Step 3 - I started to follow a couple of people related to the PubCon Conference I was about to attend
Step 4 - I started to Tweet (Sunday)
These simple actions soon proved very valuable. Before I knew it, several people (Facebook friends) were following my Tweets. From there, several people attending PubCon found me and started following my Tweets. For the experiment, I committed myself to Tweeting as often as possible.
Twitter Experiment Results
By the time I landed in Las Vegas (Monday afternoon), I quickly discovered that all PubCon-related announcements were no longer going out by email, but by Tweets (at least I had that covered). Next I started to see people Tweeting about dinner: where to go and where to meet. Before I knew it, I had dinner plans with someone I had never met or even had contact with before Twitter. We met up (a Tweetup) at a restaurant at my hotel and found each other. Of course there was some confusion, as someone else who was going to join us had simply described himself as bald with glasses (gee, that could be me). Once we got that settled, the three of us had dinner and a great conversation.
PubCon started the next day, and the conference almost crashed Twitter. Everyone was Tweeting (at least 90% of all attendees, including myself) about what sessions they were attending, issues during the sessions, etc. By the end of the first day of PubCon, I must have Tweeted at least 10-15 times. It seemed as though with each Tweet I got more followers. What a strange occurrence, as just a year earlier at PubCon 2007 almost no one Tweeted.
By day two, several people came up to me and said they enjoyed following my Tweets in Facebook. That night at the hotel bar, I ran into someone who I wasn't sure would recognize me, but of course he did, and said, "Hi Alan, I opened up Facebook today and there you were, all over my home page with your tweets." It was a positive response. The only negative feedback came from my wife, who thought I was abusing Facebook for business and bragging purposes. Remembering the old adage that most negative comments go unsaid, I took my wife's words to heart.
So after a week of Tweeting to my heart's content, my experiment came to an end. Through this experience I learned a few things:
- 1. If you're going to Tweet a lot, don't hook it up to your Facebook account, but encourage your friends to follow you on Twitter if they want
- 2. Get a variety of apps loaded up on your computer to help optimize your Twitter experience (my personal favourite right now is TweetDeck)
- 3. Link your cellphone to Twitter (SMS messaging or through a phone application)
- 4. Tweet responsibly and they will follow
- 5. Talk to other Tweeters about Tweet etiquette, or read the book "Twitter Revolution" by Deborah Micek & Warren Whitlock, which contains lots of excellent tips (I devoured it on the flight home from Las Vegas)
- 6. Start Tweeting
The reality is there is no real business model for Twitter right now, and who knows if they'll survive long term. For now, I equate the Twitter phenomenon to the state the web was in during the mid-1990s: people knew it had potential, but they didn't yet know how to take advantage of it for business. Back then it was in the hands of a few geeks, and look what we turned the web into.
For the socially active party-goers, this is a must-have tool. Start following all your friends and have your friends start following you, and there will never be a need for a phone call again, at least while you're out party hopping. Happy Tweeting!
Once you strip away the advice on how and why you should use h1 tags, why writing good description meta tags matters, and why using clean and valid XHTML is important, it really boils down to this: you need to make the whole SEO process part of the project plan and part of everyone's deliverables.
You can't let the graphic designer get approval on a design that can't be marked up in an SEO-friendly way. Just the same, you can't let the HTML developer, the server management team or the back-end database programmers get away with it either.
So how do you do this? The answer is simple, but the execution is hard. Step 1: get everyone educated on search engine optimization: how it works, the impact of different components on SEO, and why it is important to your organization. Next, make it part of each person's project deliverables. If their deliverable is not SEO compliant, then they haven't finished their job and need to go back and get it right. Of course, this means allowing enough time (and budget, if you are outsourcing) for them to get it right. Just because it looks good or works from a technical point of view doesn't mean it is right. Non-SEO-compliant features must be treated as web site bugs and fixed on a high-priority basis.
1. What's with the indexing of Flash; what works and what doesn't? Matt was able to confirm what I had heard from Adobe last month, though Adobe wasn't 100% sure: Google is only capable of finding links within a Flash file and unrasterized text.
2. Is Google working on OCR for graphics and imaged PDFs? Matt: Google now successfully processes PDFs that were converted to images (you can't select text in these), at least within the limits of OCR, but they are not yet working on applying OCR (optical character recognition) to page images. Matt mentioned that he has asked about this and even suggested it to the team, but so far there is no traction with the Google development team.
We went on to discuss how, even if they got this working, the variety of typefaces would pose a huge issue for OCR and leave it very open to major mistakes.
So for now, as I always state in my presentations, we are still left with the three things search engines value for our SEO projects: "words, words and words".
While the story may sound simple, the video went viral, and before they knew it this small blender company from Omaha was being featured on national news shows and late-night television. The resulting impact on sales was almost through the roof, and the legend of "Will it Blend" was born. Before long, who knows, perhaps "blendtec it" will become a verb along the lines of "google it".
As for the rest of the day, there was a great session on web analytics, despite the rep from Google pushing Google Analytics a bit too much. Between the pitches, however, there was some good info.
My session on organic design went well, and it was nearly standing room only in our vast room. I didn't like the microphone they were using, but it was clear that the attendees were impressed with what all the panelists had to say.
While at PubCon, I attended a session where Bruce Clay was speaking. As part of his address he stated something that I've been saying for years, "A page's rank in the search results is meaningless".
He went on to point out that the search engines are changing how they decide which pages to rank. Most of us already know how they adjust the results based on the user's search location (someone in Las Vegas will get different results than someone in New York), but many may not be aware that they have started monitoring user behavior. This means they adjust the results based on what you as a searcher tend to click on. If you click on more product reviews than commerce sites when searching for products, review pages will trend upward in your search query results.
Bruce feels that these types of changes are going to become even more widespread sometime in the first quarter of 2009. While time will tell, Bruce pointed out that, as a community, SEO and SEM specialists need to stop focusing on rankings in the SERPs and start embracing web analytics. This latter point is what I've been saying, preaching and teaching for years.
As with all conferences, the best things happen between sessions, and that's when Bruce and I got to speak for about 3 minutes on this subject. During our brief conversation, our thoughts and opinions were identical: we as a community need to use web analytics to measure the quantity and quality of the traffic generated by search engines. We both agreed that being number 1 is important for corporate ego, but the truth that needs to be conveyed is what traffic you get for which terms and how it converts on your site.
Chance conversations like this are why I love conferences like PubCon. They give everyone the chance to discuss ideas and concepts with peers and industry leaders, ensuring that we as a community can then transfer this knowledge with confidence to co-workers and clients.
Given those numbers, I realized that Twitter was starting to come of age and I had better start Tweeting. So I set up my Twitter account and loaded my BlackBerry with Twitterberry. I didn't do much with it except link it to my Facebook account, and thought: so what? Who really wants to follow what I'm up to, and how will I take advantage of it? Well, then I headed to PubCon in Las Vegas.
I quickly started following the PubCon Twitter feed, and from there, as they say, the rest is history. All major social activities having anything to do with PubCon were being delivered via Twitter. Several people quickly started following my Tweets, and before I knew it, I had dinner plans with people I had never met before.
It got so crazy that I found myself getting upset that my Twitterberry utility wasn't giving me live feeds like the ones others around me had; that's when I realized how important live feeds are, at least from a social aspect. So will I keep it up once I'm back at home? Only time will tell, but my gut feeling says yes: just another utility grabbing more of my limited time.
Two sessions (beyond my own) really stood out, and both occurred on the second day. The first was SEO and Web 2.0. While the session didn't provide me with much new information, it was the way the information was presented, especially the presentation by Mikkel DeMib Svendsen. I'm a big fan of Mikkel since he has officially gone white hat, and I had to love it when he said, "I'm not that technical".
My other favourite session was the one on local search. Unfortunately it was the last session on the last day, and I could only stay for about half of it, as I had to get to the airport. There was a ton of great information, once again presented in a way even the most non-technical person could understand. For anyone who attended this session, I hope they took good notes so they can apply this way of presenting data to their own clients and/or bosses.
Overall, I found it well worth my while to attend SMX London.
Next stop, is PubCon in Las Vegas starting tomorrow. Look for regular updates.
Of the people I chatted with, most were very familiar with the concepts of SEO and PPC and appeared to be more focused on the advanced tracks.
My session with Offir Cohen, Richard Gregory and Kelly Gillease went very well, with a small but attentive audience. Chris Sherman moderated this panel and did his usual excellent job.
The only complaint I have about this conference is the lack of free wireless Internet access. While the conference venue did offer it, it was at the steep cost of 5 pounds per hour or 20 pounds per day. As such, I am unaware of anyone who opted for the wireless.
I'll provide a more detailed summary once I'm back home in Toronto and before I head off to Las Vegas for PubCon next week.