Close your eyes and imagine that you’re on stage in front of 50,000 screaming fans. You are getting ready to rock their faces off. Your Marshall stack is set to 11, and you’ve applied your eyeliner and half a bottle of Aqua Net to your hair (at least that’s how we did it in the ’80s). You fire up the accordion and start belting out a thrilling version of the “Beer Barrel Polka.”
What? That isn’t the way you expected your “rock ’n’ roll fantasy” to play out?
If you want to rock an audience, it’s often helpful to play a rock song (Weird Al notwithstanding).
If you want your analytics to rock, the data you produce and the analysis you do have to be focused on the desired business outcome. When your measurement and analytics efforts fail, it’s not always because the data is incorrect or the analysis is poor; sometimes it’s because you are providing the right answer to the wrong question.
In his book “The 7 Habits of Highly Effective People,” Stephen Covey talks about “beginning with the end in mind.” This concept is critical in selecting the right metrics to measure. Incorrectly defining the problem is the analytics equivalent of opening a rock show by singing “Roll Out the Barrel.” You might do it flawlessly, but the outcome may not be thousands of screaming fans (though it might involve a different type of screaming).
At this point you may be asking yourself, “Self… how do I figure out what the right question is?”
This is where things can get a little tricky. If you ask the typical stakeholders what metrics they want, they’re probably going to do one of two things:
- Say they don’t know and ask you to tell them
- Ask for the metrics they’re familiar with
In this situation it’s easy to put the blame on the business owner for not knowing what they need. However, I tend to think the problem isn’t that they gave you the wrong answer; it’s that you asked them the wrong question.
A better question to ask might be “what are you trying to understand?” or “what are you trying to accomplish?” As analytic professionals, we should be able to help them find appropriate metrics to answer their questions once the questions have been correctly defined.
There are of course certain questions we simply can’t answer with the tools that we have. If this is the case, we need to explain the situation and look for proxy measures that can give us enough directional information to work with (this may also be an excuse to lobby for those cool new analytics toys we’ve been wanting.)
If you want your analytics to rock, make sure you’re providing the right answer to the right question.
The Huge Insights Report
In most analytics systems the “Huge Insights Report” is found next to the “It Worked” report and the “Why Our Results Didn’t Really Suck” report. These reports are most often needed when project objectives were not clearly defined, KPIs and targets were not set, and best practices were not followed. These reports are invaluable because there’s a meeting with the VP, CEO, Chairman of the Board and someone from an undisclosed “three letter” government agency in 90 minutes and we need to prove (or at least imply) that the project was successful.
Ah… if only the “Huge Insights Report” actually existed. Since it doesn’t, we have to hire the right people, know our tools and spend time searching for “Huge Insights”. While we’re at it, we can forget about gathering the “low hanging fruit”. If there really was any, someone else would have eaten it by now.
What Would Woody Do?
I recently watched a documentary about the famed OSU-UofM football rivalry. The film recounted Woody Hayes falling asleep from exhaustion after watching game film all day.
The typical football game lasts about three hours (including halftime), so why on earth would the COO of the football team (the coach), his assistants and players spend days watching film? The answer is simple: to learn, get better and win. Woody Hayes (love him or hate him) knew what it took to win.
How much time and energy does your organization spend “watching film”, in other words, learning from your analytics tool? Do you have smart senior people in your organization focused on optimizing your marketing dollars?
Until analytics vendors figure out the “Huge Insights Report”, if you want to beat your competition, you might want to ask “What would Woody do?”
It’s easy to get caught up in the details, the tools, the reports and all the other stuff that our jobs entail on a daily basis. I spend a lot of my day helping clients (internal and external) focus on things like objectives and KPIs, but I sometimes have to take a step back and make sure I’m doing the same thing.
We all have to create reports, dig through spreadsheets and learn the inner workings of the tools we use to be good “analysts,” but that simply isn’t enough.
Living in the Nashville area, I’m surrounded by some of the greatest musicians (from all genres) alive, and I’ve been fortunate enough to jam with many of them. I am also fortunate in that I recently began studying bass with Adam Nitti. In my very first lesson we talked about playing every note with “intention.” In other words, treat every note you play as if you are performing to a packed concert hall, even if you are practicing alone. This is one of the marks of a great musician.
So what makes a great analytics person? I believe that intention (i.e., keeping your “eye on the prize”) is one of the critical factors to being successful in analytics. The bottom line is that being good at reporting, data visualization, analysis, tool knowledge, etc. are all prerequisites. Too often we forget that our jobs are about optimization. We will be successful when we know how to make things better, stronger and faster.
Looking at everything we do through the lens of optimization is the key. Approach each task with the intention of learning how to make whatever you are working on more successful, and tell “the story” (your reports and presentations) from the perspective of optimization, not data.
- How much is a report worth?
- How much is a 10% lift in conversion worth (you pick the conversion points)?
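To make the second question concrete, here’s a back-of-the-envelope sketch in Python. Every number in it (monthly visits, baseline conversion rate, average order value) is an assumption for illustration; plug in your own.

```python
# Hypothetical illustration: the dollar value of a relative lift in
# conversion rate. All inputs below are assumed example numbers.

def lift_value(monthly_visits, baseline_rate, avg_order_value, lift=0.10):
    """Incremental monthly revenue from a relative lift in conversion rate."""
    baseline_orders = monthly_visits * baseline_rate
    lifted_orders = baseline_orders * (1 + lift)
    return (lifted_orders - baseline_orders) * avg_order_value

# 100,000 visits/month, 2% conversion, $50 average order value:
print(lift_value(100_000, 0.02, 50.0))  # a 10% lift is worth ~$10,000/month
```

Compare that figure to what a report that nobody acts on is worth, and the point of the two questions above makes itself.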
Want a raise? Become more valuable to your organization by focusing on making everything you work on more successful. If you do, you’ll become an analytics rock star.
Web Analytics Implementer vs. Analyst? The discussion may be getting old, but it is an interesting debate nonetheless, and I can no longer keep from chiming in.
Adam Greco started it with his SiteCatalyst Implementation Pet Peeves blog post. Then again, one could argue that Gary Angel started the debate with his rebuttal in which he makes a distinction between an “Analyst’s” approach and an “Implementer’s” approach.
Tim Wilson (my distinguished colleague from Columbus) chimed in with a good summary of the debate (in which both Gary and Adam make good points) and made some interesting points of his own.
My Opinion (and you know what they say about opinions…)
Forget about the whole concept of Implementer vs. Analyst. These are two sides of the same coin. You can’t be good at one without being good at the other.
Back in the days when I was “the client,” I asked my media agency why their “delivered traffic” numbers were so outrageously high. They proceeded to send me white papers detailing why different tools reported different numbers and why they would always be significantly different. That’s a load of Lorenzo Lamas movies.
The real answer was that they had no idea how DART for Advertisers reporting worked. At this point you might say, “that doesn’t matter, the numbers should be directionally similar.” Not so in this case. DFA de-duplicates visits on a page-by-page (tag) basis, not on a user basis. Because ads had different numbers of calls to action (pointing to differing numbers of landing pages), ads had varying degrees of over-counting. This is hairy, so I’m not going to belabor it now. Of this I am sure: if you don’t know the input and how your reporting tools are configured (filters, rules, etc.), you won’t fully understand the output.
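A toy sketch makes the over-counting visible. This is not DFA’s actual implementation, and the tag-fire data is made up; it just shows how page-level de-duplication counts the same visitor once per page while user-level de-duplication counts them once overall.

```python
# Toy illustration (not DFA's actual logic): one visit firing tags on
# multiple landing pages inside a single reporting window.

# (user_id, page) tag fires -- assumed example data.
tag_fires = [
    ("u1", "/landing-a"),
    ("u1", "/landing-b"),   # same visitor hit a second landing page
    ("u2", "/landing-a"),
    ("u2", "/landing-a"),   # duplicate fire on the same page, removed either way
]

# Page-level dedup keeps one fire per (user, page) pair; user-level keeps
# one per user. The same raw data yields two different "visit" counts.
page_level = len({(user, page) for user, page in tag_fires})   # 3
user_level = len({user for user, _ in tag_fires})              # 2

print(page_level, user_level)  # page-level over-counts: 3 vs. 2
```

The more landing pages an ad points at, the wider that gap gets, which is exactly why the two ads in my story over-counted by different amounts.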
Do you want Engelbert Humperdinck Analytics or ZZ Top Analytics? A “Rocking Analytics” system is one that gives you the intelligence you need in a usable, timely and accurate way. A Rocking Analyst has to know what the output means.
Though I don’t share all of Adam’s “Pet Peeves,” most of them are indications that the SiteCatalyst implementation you are working with was not well thought out. This will inevitably make maintenance, analysis and reporting more difficult, less accurate and unnecessarily time consuming. More importantly, these inefficiencies limit the value and usefulness of analytics in your organization and could eventually lead to your “Last Waltz.”
Thanks for reading my first Rocking Analytics blog post! In the music business they say “don’t bore us, give us the chorus”. Thus, I’ll avoid a bunch of “how great I am” nonsense. If you really want to know more about me please read the section conveniently labeled “About”. If not, I don’t blame you.
Do Your Analytics Rock?
Asking yourself these questions will help you understand if your analytics have the potential to live up to the John Camp-Cougar-Mellon song and “ROCK in the USA.”
- What are the two primary things your [insert tactic] (web site, Facebook tab, etc.) is supposed to do?
- What is the most important KPI for each of your stated primary objectives?
- What are your targets/goals for your most important KPIs?
- Are your reporting/analytics tools and processes designed/implemented to support reaching your targets/goals?
- Do you spend the majority of your time learning how to hit or exceed your targets?
You get to grade your own test. If you’re cocked, locked and ready to rock, great. If you hear the accordion player warming up, never fear.
Alignment is the key to success.
If you were listening to a band and all the players were playing different songs, it would probably suck, even if the players were good. The same holds true in analytics: if the Analytics Team, the Business Owner and the Creative Agency are singing different tunes, the results tend to suck.
Getting everyone on the same sheet of music means getting all the players in one conversation and following a process.
- Define and communicate the Business Objectives (what is this supposed to do?)
- Define and communicate the KPIs (how will we know if it did that?)
- Set Targets and communicate them (10% improvement, etc.)
- Report (actual vs. target)
- Analyze and communicate findings (what, where, how can we improve)
- Execute against the findings of your analysis
- Rinse and repeat
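Steps 1 through 4 of the loop above can be sketched as a simple actual-vs-target report. The objective names, KPIs and numbers here are made-up examples, not a standard; the point is that once objectives, KPIs and targets are written down, the report almost builds itself.

```python
# Minimal sketch of steps 1-4: objectives -> KPIs -> targets -> report.
# All names and numbers below are invented examples.

def actual_vs_target(objectives, actuals):
    """Return (objective, kpi, actual, target, on_target) rows."""
    rows = []
    for name, spec in objectives.items():
        actual = actuals[spec["kpi"]]
        rows.append((name, spec["kpi"], actual, spec["target"],
                     actual >= spec["target"]))
    return rows

objectives = {
    "Generate leads": {"kpi": "form completions", "target": 500},
    "Drive engagement": {"kpi": "repeat visits", "target": 2000},
}
actuals = {"form completions": 430, "repeat visits": 2150}  # from your tool

for name, kpi, actual, target, ok in actual_vs_target(objectives, actuals):
    status = "on target" if ok else "analyze and improve"
    print(f"{name}: {kpi} {actual}/{target} -> {status}")
```

Anything flagged “analyze and improve” feeds steps 5 and 6, and then you rinse and repeat.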
You can learn more about steps 1 and 2 of the above process in my colleague Tim Wilson’s (aka Gilligan on Data) “Pocket Guide to Identifying Great KPIs” blog post.