Careful of Complexity

I worked on a site for a major web portal – I’m not going to say which one, but it may or may not share a name with a certain creature from a little book called Gulliver’s Travels – and we had our Omniture analytics code on this site. It got traffic, lots of traffic, page views in the millions per day. Then one day it started getting 20 times the normal amount! Because the company had dozens of sites and only a handful of site managers, nobody noticed this enormous spike in traffic for a week. The fixed number of server calls we had purchased from Omniture for the month was eaten up in under a week. Costs were piling up. Senior managers were freaking out.

As the analytics guy, I put a debugger on the site and frantically traced the problem to the specific products and areas of the site that were madly sending a firehose of image requests to Omniture. I sent that off to the programmers so they could quickly track down the problem in the code.

You know what it wound up being? A single extraneous space at the end of a certain JavaScript call. Just one space cost the company thousands of dollars.
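
For context on why stray tags get expensive so fast: Omniture (now Adobe Analytics) bills against a contracted pool of server calls, and every page-view tag fires an image request that counts toward it. Below is a minimal sketch of a classic SiteCatalyst page-view call – the values are placeholders, and the pattern shown in the comments is only an illustration of how calls multiply, not the actual bug from this story.

    // Minimal sketch of a classic Omniture/SiteCatalyst page-view call.
    // Assumes the standard s_code.js is loaded and the global "s" object exists.
    // The values here are placeholders, not the portal's real setup.
    s.pageName = "portal:home";   // friendly page name for reports
    s.channel  = "home";          // site section
    s.prop1    = "logged-out";    // an example custom traffic variable

    // Each s.t() call sends one image request to Omniture's collection
    // servers, and each of those requests is a billable server call.
    s.t();

    // The easy way to blow the budget is to fire the call from code that
    // runs more than once per page view, e.g. an event handler:
    //
    //   window.addEventListener("scroll", function () {
    //     s.t();   // dozens of billable calls per visit instead of one
    //   });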

Another time an agency built a Flash application for a particular customer acquisition campaign. This piece of Flash had dozens of buttons and screens, video that played inside it, links to downloadable content… it was complicated. But not as complicated as their measurement plan. They wanted to measure everything – and I mean everything – to come up with some hooey “engagement” metric that was supposed to justify the agency’s enormous fees. But the measurement plan and the resulting JavaScript were so complicated that almost nothing actually got measured at the end of the day. The campaign cost tens of thousands of dollars, and we couldn’t tell how many leads it generated.

What’s the moral here? Complexity breeds costs.

Whenever an analytics package or measurement plan is being designed, I like to follow the KISS rule: keep it simple, stupid. On most projects I can limit my plan to three basic questions (I’ll sketch what that looks like right after the list):

  1. What are our “KPIs plus”? “KPIs plus” meaning: what do we use to judge the project a success or a failure, and what additional learnings can we take away for future projects?
  2. What is actionable? What can the data tell us so we can tweak, test and improve ROI?
  3. What is affordable? The more metrics, the more server calls. Watch out, they can come back and bite ya.
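
To make that concrete, here is what a deliberately small measurement plan can look like written down – just a handful of metrics mapped to Omniture variable slots. The KPI names, events, and variables below are invented for this sketch; the point is how short the list is.

    // Illustrative only: a small measurement plan for one campaign, written
    // as a simple mapping. The KPIs, events, and variable slots are made up.
    var measurementPlan = {
      // 1. KPIs plus: the few numbers that decide success or failure
      kpis: {
        leadsGenerated:   "event1",    // form submit = the success event
        costPerLead:      "derived",   // campaign spend / event1, computed in reports
        videoCompletions: "event2"     // the one "plus" learning for future creative
      },
      // 2. Actionable: dimensions we will actually use to tweak and test
      dimensions: {
        trafficSource: "s.campaign",   // which ad or email drove the visit
        landingPage:   "s.pageName"    // which page variant the visitor hit
      },
      // 3. Affordable: every tracked interaction is a billable server call,
      // so anything not listed above simply does not get tagged
      maxServerCallsPerVisit: 3
    };

If a proposed metric doesn’t answer one of those three questions, it doesn’t make the plan – and it doesn’t rack up server calls.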