Web analytics – the “Trinity approach”

During my stint as a content strategist for Verizon Communications, I worked on the “learn” section of the company website, developing content about Verizon’s Internet products. As part of my job, I participated in a weekly sales call with about 30 other folks during which an analytics expert presented a 30+ page report full of graphics and statistics that mostly had to do with the site’s “order flow” – the long series of steps customers had to take to customize and pay for an Internet product after they clicked an “Order” button. It took a lot of explaining and cajoling before another “learn” team member and I convinced the analytics folks to add two content-related metrics to the report – average time spent viewing selected “learn” pages and average number of “learn” pages viewed. We could then at least briefly report that data each week on the sales call before everyone plunged into a lengthy discussion of order-flow metrics.

It was all a great illustration of what I was reading at the time in “Web Analytics” by Avinash Kaushik, Google’s “Analytics Evangelist.” One of Kaushik’s main themes was that many organizations are too “obsessed” (his word) with “conversion rates” (typically sales or orders), which he saw as important “outcomes” but hardly the whole story. “Outcomes,” he emphasized, were just one of three main analytics areas, none of which should be ignored. Here’s a diagram adapted from his book:

[Diagram: Kaushik’s “Trinity approach” – Experience, Behavior, and Outcomes as three interdependent analytics areas]

Adapted from the “Trinity approach” diagram in “Web Analytics” by Avinash Kaushik
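(A quick aside for the technically inclined: the two content metrics mentioned above are simple to compute from raw pageview data. Here’s a minimal sketch in Python – the visit records and the “/learn/” path prefix are made up for illustration, not real Verizon data:)

    from collections import defaultdict

    # Made-up pageview records: (visit_id, page_path, seconds_on_page).
    # In practice these would come from an analytics export or server logs.
    pageviews = [
        ("v1", "/learn/fios-internet", 95),
        ("v1", "/learn/dsl-vs-fiber", 40),
        ("v2", "/learn/fios-internet", 210),
        ("v2", "/order/step-1", 60),
        ("v3", "/learn/dsl-vs-fiber", 15),
    ]

    def content_metrics(pageviews, section="/learn/"):
        """Average time on pages under `section`, plus the average number
        of such pages viewed per visit that touched the section."""
        views = [(v, p, s) for v, p, s in pageviews if p.startswith(section)]
        if not views:
            return 0.0, 0.0
        avg_time = sum(s for _, _, s in views) / len(views)
        per_visit = defaultdict(int)
        for v, _, _ in views:
            per_visit[v] += 1
        avg_pages = sum(per_visit.values()) / len(per_visit)
        return avg_time, avg_pages

    avg_time, avg_pages = content_metrics(pageviews)
    print(f"Average time on 'learn' pages: {avg_time:.0f} seconds")
    print(f"Average 'learn' pages viewed per visit: {avg_pages:.2f}")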

I try to keep Kaushik’s “Trinity approach” diagram in mind whenever analytics is part of my work. One of its crucial points is that all three areas are interdependent (see the circular arrow). For example, you can’t reliably understand the “behavior” data you get from Google Analytics and other analytics programs without knowing what caused the behavior – the “why” of the equation, the user “experience” – and to get at that you have to go to the users themselves and find out, via a survey, a usability test, or other research.

I did that at Verizon too: I conducted a three-day usability test of “learn” content and designed an 11-question online survey that ran for more than six months and yielded a lot of useful data and insights, some of which we were actually able to act on to improve our “learn” content. That definitely doesn’t mean small businesses have to go to those lengths, or spend that kind of money, to get at the “why” of analytics – watching even a few visitors use your site and asking them why they clicked what they clicked, or what they think of your most important pages (e.g., your home page), can work wonders in helping you decide how to improve your content cost-effectively.
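(One more sketch for the technically inclined, showing the kind of simple cross-check the Trinity idea suggests. The page names and scores below are entirely made up – the point is just that pairing behavior data with even a one-question “experience” survey can flag pages that traffic numbers alone never would:)

    # Made-up per-page behavior data (from an analytics program) and
    # experience data (average answer, 1-5, to a one-question on-page
    # survey such as "Did this page answer your question?").
    behavior = {
        "/learn/fios-internet": {"views": 12000, "avg_seconds": 35},
        "/learn/dsl-vs-fiber": {"views": 8000, "avg_seconds": 140},
    }
    survey_score = {
        "/learn/fios-internet": 2.1,
        "/learn/dsl-vs-fiber": 4.3,
    }

    # A heavily viewed page with a low survey score is a rewrite candidate.
    # Behavior data alone can't tell you whether 35 seconds on a page means
    # "found it fast" or "gave up fast"; the survey supplies the "why."
    for page, stats in sorted(behavior.items(), key=lambda kv: -kv[1]["views"]):
        score = survey_score.get(page)
        verdict = "review content" if score is not None and score < 3 else "looks OK"
        print(f"{page}: {stats['views']} views, survey {score} -> {verdict}")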

Related links

Occam’s Razor – Avinash Kaushik’s blog, with great info and insights about analytics and digital marketing in general.