
What’s it all about… Alfie?

August 24, 2011

It’s easy to get caught up in the details, the tools, the reports and all the other stuff that our jobs entail on a daily basis. I spend a lot of my day helping clients (internal and external) focus on things like objectives and KPIs, but I sometimes have to take a step back and make sure I’m doing the same thing.

We all have to create reports, dig through spreadsheets and learn the inner workings of the tools we use to be good “analysts”, but that simply isn’t enough.

Living in the Nashville area I’m surrounded by some of the greatest musicians (from all genres) alive and I’ve been fortunate enough to get to jam with many of them. I am also fortunate in that I recently began studying bass with Adam Nitti. In my very first lesson we talked about playing every note with “intention”. In other words, treat every note you play as if you are performing to a packed concert hall, even if you are practicing alone. This is one of the marks of a great musician.

So what makes a great analytics person? I believe that intention (i.e. keeping your “eye on the prize”) is one of the critical factors to being successful in analytics. The bottom line is that being good at reporting, data visualization, analysis, tool knowledge, etc., are all prerequisites. Too often we forget that our jobs are about optimization. We will be successful when we know how to make things better, stronger and faster.

Looking at everything we do through the lens of optimization is the key. Approach each task with the intention of learning how to make whatever you are working on more successful and tell “the story” (your reports and presentation) from the perspective of optimization, not data.

Ponder this:

  • How much is a report worth?
  • How much is a 10% lift in conversion worth (you pick the conversion points)?
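To make the second question concrete, here is a back-of-the-envelope calculation. All of the numbers (traffic, conversion rate, order value) are hypothetical placeholders; swap in your own:

```python
# Hypothetical numbers -- plug in your own traffic, conversion rate,
# and average order value.
visits = 100_000            # monthly visits
conversion_rate = 0.02      # 2% baseline conversion rate
avg_order_value = 50.00     # average revenue per conversion

baseline_revenue = visits * conversion_rate * avg_order_value
lifted_revenue = visits * (conversion_rate * 1.10) * avg_order_value

print(f"Baseline revenue: ${baseline_revenue:,.2f}")                    # $100,000.00
print(f"With a 10% lift:  ${lifted_revenue:,.2f}")                      # $110,000.00
print(f"Monthly gain:     ${lifted_revenue - baseline_revenue:,.2f}")   # $10,000.00
```

Even at these modest assumed numbers, the lift is worth $10,000 a month, which puts a price tag on the analysis that found it. The report alone is worth nothing until it drives a change like this.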

Want a raise? Become more valuable to your organization by focusing on making everything you work on more successful. If you do, you’ll become an analytics rock star.

Rock On


Web Analytics Implementer vs. Analyst?

August 4, 2011

Web Analytics Implementer vs. Analyst? The discussion may be getting old, but it is an interesting debate nonetheless and I can no longer keep from chiming in.

Adam Greco started it with his SiteCatalyst Implementation Pet Peeves blog post. Then again, one could argue that Gary Angel started the debate with his rebuttal in which he makes a distinction between an “Analyst’s” approach and an “Implementer’s” approach.

Tim Wilson (my distinguished colleague from Columbus) chimed in with a good summary of the debate (in which both Gary and Adam make good points) and made some interesting points of his own.

My Opinion (and you know what they say about opinions…)

Forget about the whole concept of Implementer vs. Analyst. These are two sides of the same coin. You can’t be good at one without being good at the other.

Back in the days when I was “the client”, I asked my media agency why their “delivered traffic” numbers were so outrageously high. They proceeded to send me white papers detailing why different tools reported different numbers and why they would always be significantly different. That’s a load of Lorenzo Lamas movies.

The real answer was that they had no idea how DART for Advertisers (DFA) reporting worked. At this point you might say, “that doesn’t matter, the numbers should be directionally similar”. Not so in this case. DFA de-duplicates visits on a page-by-page (tag-by-tag) basis, not on a user basis. Because ads had different numbers of calls to action (pointing to differing numbers of landing pages), ads had varying degrees of over-counting. This is hairy, so I’m not going to belabor it now. Of this I am sure: if you don’t know the inputs and how your reporting tools are configured (filters, rules, etc.), you won’t fully understand the output.
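A toy sketch makes the over-counting mechanism clear. This is not DFA’s actual internals, just an illustration of per-tag versus per-user de-duplication with made-up users and pages:

```python
from collections import defaultdict

# Hypothetical click log: (user_id, landing_page). One ad with two calls
# to action can send the same user to two different tagged pages.
clicks = [
    ("u1", "/landing-a"), ("u1", "/landing-b"),  # one user, two tagged pages
    ("u2", "/landing-a"),
    ("u3", "/landing-b"), ("u3", "/landing-b"),  # repeat visit, same page
]

# Per-tag de-duplication (the DFA-style behavior described above):
# each user is counted once *per tagged page*, so u1 contributes twice.
users_per_page = defaultdict(set)
for user, page in clicks:
    users_per_page[page].add(user)
per_tag_total = sum(len(users) for users in users_per_page.values())

# Per-user de-duplication: each user is counted once overall.
per_user_total = len({user for user, _ in clicks})

print(per_tag_total)   # 4
print(per_user_total)  # 3
```

Three actual users become four “visits” in the per-tag view, and the inflation grows with the number of distinct landing pages each ad points at, which is exactly why ads with more calls to action over-counted more.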

Do you want Engelbert Humperdinck Analytics or ZZ Top Analytics? A “Rocking Analytics” system is one that gives you the intelligence you need in a usable, timely and accurate way. A Rocking Analyst has to know what the output means.

Though I don’t share all of Adam’s “Pet Peeves”, most of them are indications that the SiteCatalyst implementation you are working with was not well thought out. This will inevitably make maintenance, analysis and reporting more difficult, less accurate and unnecessarily time consuming. More importantly, these inefficiencies limit the value and usefulness of analytics in your organization and could eventually lead to your “Last Waltz”.

Rock On

Do Your Analytics Rock?

July 6, 2011

Thanks for reading my first Rocking Analytics blog post! In the music business they say “don’t bore us, give us the chorus”. Thus, I’ll avoid a bunch of “how great I am” nonsense. If you really want to know more about me please read the section conveniently labeled “About”. If not, I don’t blame you.

Do Your Analytics Rock?

Asking yourself these questions will help you understand if your analytics have the potential to live up to the John Cougar Mellencamp song and “ROCK in the USA”.

  1. What are the 2 primary things your [insert tactic] (web site, Facebook tab, etc.) is supposed to do?
  2. What is the most important KPI for each of your stated primary objectives?
  3. What are your targets/goals for your most important KPIs?
  4. Are your reporting/analytics tools and processes designed/implemented to support reaching your targets/goals?
  5. Do you spend the majority of your time learning how to hit or exceed your targets?

You get to grade your own test.  If you’re cocked, locked and ready to rock, great. If you hear the accordion player warming up, never fear.

Alignment is the key to success.

If you were listening to a band and all the players were playing different songs, it would probably suck, even if the players were good. The same holds true in analytics: if the Analytics Team, the Business Owner and the Creative Agency are singing different tunes, the results tend to suck.

Getting everyone on the same sheet of music means getting all the players in one conversation and following a process.

A Process:

  1. Define and communicate the Business Objectives (what is this supposed to do?)
  2. Define and communicate the KPIs (how will we know if it did that?)
  3. Set Targets and communicate them (10% improvement, etc.)
  4. Report (actual vs. target)
  5. Analyze and communicate findings (what, where, how can we improve)
  6. Execute against the findings of your analysis
  7. Rinse and repeat

You can learn more about steps 1 and 2 of the above process in my colleague Tim Wilson’s (aka Gilligan on Data) “Pocket Guide to Identifying Great KPIs” blog post.

Rock On,