How to avoid being wrong as a web analyst

Because this discipline is so young, web analysts constantly have to combat the duality of their trade.  On one hand, the web analyst weighs fairly concrete numbers and patterns to formulate recommendations; on the other, they take a stance on the interpretation of those facts and figures.  At what point do we draw the line between insight and speculation, and how can web analysts avoid mistakes and eating crow?

Experts such as Jim Sterne and Avinash Kaushik (check my blogroll) have enough experience to recognize patterns that even a keen eye might miss.  Personally, I think it’s much easier to make recommendations based on insight as a consultant than as an in-house web analytics specialist.

Actionable insights versus unadulterated speculation

Coming up with a list of actionable insights based on visitor data doesn’t have to be brain surgery.  If you know for a fact that traffic used to be X and now it is Y, and you found that it’s because of a change in Z, you can start brainstorming ways to keep changing Z (for better or for worse); a quick sketch of that kind of before/after check follows below.  The web analyst need not take a hard stance on what the raw numbers say about:

  • a shift in customer satisfaction
  • changes in the level of engagement
  • efficacy of marketing efforts

You can’t really take a hard stance on any of those extremely complex measures without additional information from surveys or competitive intelligence tools.  It wouldn’t be fair or accurate to base overarching usability insights on the raw numbers from a web analytics tool such as Google Analytics, Omniture, Webtrends, or Yahoo Web Analytics alone.  So don’t do it!
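
To make the safe side of that line concrete, here is a minimal sketch in Python of the kind of claim the raw numbers do support: a simple before/after comparison of traffic around the change to Z.  The daily visit counts are made up for illustration; substitute an export from your own analytics tool.

```python
import statistics

# Hypothetical daily visit counts exported from an analytics tool,
# split around the date a change (Z) went live.
visits_before = [1210, 1175, 1302, 1254, 1198, 1267, 1231]
visits_after = [1480, 1455, 1523, 1490, 1512, 1467, 1501]

before_avg = statistics.mean(visits_before)
after_avg = statistics.mean(visits_after)
lift = (after_avg - before_avg) / before_avg

# "Traffic used to be X and now it is Y" is a concrete, defensible claim
# that says nothing about satisfaction, engagement, or marketing efficacy.
print(f"Average daily visits before: {before_avg:.0f}")
print(f"Average daily visits after:  {after_avg:.0f}")
print(f"Change: {lift:+.1%}")
```

Everything this script prints can be defended from the data alone; everything in the bullet list above cannot.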

Staying within the realm of testing

Convoluted metrics don’t get anyone closer to the end goal of achieving business objectives.  So rather than speculating about visitor intent, sentiment, and seemingly random behavior in a vacuum, get comfortable with if/then statements.  For example,

IF we change the call to action in traffic drivers on our homepage so that click-through rate doubles, THEN we expect order volume for this product to increase by 25%.

Notice that I didn’t say anything about customers being happier.  Heck, if the folks in marketing are able to increase CTR but the visitors attracted to the new creative don’t buy, or just bounce, customers might get a bit ticked off.  However, the statement implies testing!
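
To show the arithmetic that statement commits you to, here is a minimal sketch in Python with made-up baseline numbers; the assumption baked in is that order volume rises 25% even though the doubled clicks may convert less efficiently than today's.

```python
# Hypothetical baseline numbers; substitute your own funnel data.
homepage_views = 100_000       # monthly views of the traffic driver
baseline_ctr = 0.02            # 2% click-through rate today
click_to_order_rate = 0.05     # 5% of clicks currently become orders

baseline_orders = homepage_views * baseline_ctr * click_to_order_rate

# The hypothesis doubles CTR with a new call to action.  If clicks
# converted at the same rate, orders would double too; the expectation
# is hedged down to a 25% lift because new creative often attracts
# less-qualified clicks.
expected_orders = baseline_orders * 1.25

print(f"Baseline orders per month: {baseline_orders:.0f}")
print(f"Expected orders per month: {expected_orders:.0f}")
```

The gap between “clicks doubled” and “orders up 25%” is exactly where the test earns its keep: if the new clicks convert worse than the old ones, a raw CTR win can still be a net loss.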

Remember Bryan Eisenberg’s motto “Always Be Testing.”  Testing a solid hypothesis is great because it allows the web analyst to forgo speculation by:

  • including actionable insights (we’re changing the call to action)
  • establishing concrete goals (double our CTR)
  • stating an expected change or prediction (25% increase in orders)

And possibly the single greatest element of the statement is the word “expect”.  You can base expectation on a lot of things, including current conversion rates, historical patterns, or even just a segmentation of your existing traffic numbers.
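
For instance, here is one hedged way to derive that “expect” number from nothing more than a segmentation of your existing traffic.  The segment figures below are hypothetical, and the assumption doing the work (that extra visitors convert at their segment's current rate) is precisely what the test will confirm or refute.

```python
# Hypothetical traffic segments with their current conversion rates.
segments = {
    "returning visitors": {"visits": 40_000, "conversion": 0.040},
    "new visitors":       {"visits": 55_000, "conversion": 0.012},
    "email campaign":     {"visits": 5_000,  "conversion": 0.060},
}

current_orders = sum(s["visits"] * s["conversion"] for s in segments.values())

# Suppose the new call to action is expected to double clicks from the
# "new visitors" segment; assume those extra visitors convert at the
# segment's existing rate (the assumption the test will check).
new_seg = segments["new visitors"]
extra_orders = new_seg["visits"] * new_seg["conversion"]

expected_lift = extra_orders / current_orders
print(f"Current orders:      {current_orders:.0f}")
print(f"Expected order lift: {expected_lift:.1%}")
```

With these particular numbers, the segmentation lands within a point of the 25% used in the statement above, which is the point: the prediction is grounded in data you already have, not plucked from thin air.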

Avoid falling into the trap of evaluating the unknown unknowns as good, bad, or ugly.  Start down the path of positive reinforcement, incremental knowledge, and a corporate culture of creativity and testing.
