In the recording studio, isolation can be very important. Isolation techniques keep the guitar amp (volume set to 11) from bleeding into the drum microphones and vice versa. Good isolation gives you greater control and more sound-shaping options during the mixing process.
In analytics, isolation, or segmentation, is critical as well. This is especially true when testing the integrity of your analytics implementation.
This quick tip will help you standardize and simplify your analytics Quality Assurance by isolating the activity of the people testing data integrity from those responsible for other testing in the same environment (development, QA, production, etc.).
The data input process:
- Document a clear set of repeatable actions and specify their order, for example:
  - Go to the XYZ page through a link with a predefined QA tracking code appended (?cid=qa_regression_test)
  - Search for “Antidisestablishmentarianism”
  - Complete the Contact Us form
- Have the testers execute the documented process
  - Note the number of testers, the times at which the procedures were executed, etc.
- Validate that the expected data is returned in your analytics tool(s)
The Tip (using SiteCatalyst nomenclature, though this can be done in any tool with segmentation capabilities):
- Create a segment for “visits” where your specified tracking code (qa_regression_test) was passed in.
- Note: Always validate your data set when first creating a segment.
When you apply your newly created segment, you will have isolated the data passed in by your testers and eliminated the “noise” from anyone else who may have been working in that environment. This technique is particularly valuable when testing in a production environment.
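To make the segment possible, the predefined QA tracking code has to make it into your data in the first place. In most implementations the cid query parameter is captured into a campaign-style variable on the landing page. Here is a minimal, hedged sketch of that capture step in plain JavaScript; the getQueryParam helper is my own stand-in, not Adobe code:

```javascript
// Hypothetical sketch: pull the "cid" tracking code out of a landing-page
// URL so it can be passed to a campaign variable and later used to build
// the "visits where tracking code = qa_regression_test" segment.
function getQueryParam(url, name) {
  var query = url.split("?")[1] || "";
  var pairs = query.split("&");
  for (var i = 0; i < pairs.length; i++) {
    var parts = pairs[i].split("=");
    if (decodeURIComponent(parts[0]) === name) {
      return decodeURIComponent(parts[1] || "");
    }
  }
  return "";
}

// Example: the QA landing link from the documented test script
var landingUrl = "http://www.example.com/xyz?cid=qa_regression_test";
var trackingCode = getQueryParam(landingUrl, "cid");
// In a real implementation you would then hand trackingCode to your
// analytics library (e.g. assign it to s.campaign).
```

If your tag management or plugin setup already grabs cid automatically, you can skip this and go straight to building the segment.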
Want to hear some of my music? Go to: http://www.reverbnation.com/matthewcoen
If you want your analytics systems to rock like guitars, you need to know how to “tune” them. For that reason, I’ll stray from addressing the more philosophical aspects of analytics and get technical.
Many companies use multiple promotions (internal banner ads) on their sites and want to understand their performance.
You can use things like internal tracking codes and path analysis to do this, but these methods can be time consuming, error prone and often don’t work well when you are dynamically serving promos.
When dynamically serving promos, displaying multiple promos on one page or when tactics like carousels are used, the number of impressions a promo gets will significantly affect the number of clicks and successes.
What we need is an automatable, internal “Click Rate” report, or, as the media world calls it, a Click-Through Rate report. “Click through” is misleading, as the fact that a user clicks doesn’t mean they made it through. We’ll save that topic for another time.
To create a Click Rate report, we’ll use a list variable.
There is a lot of confusion related to variable types in Adobe Reporting and Analytics (Omniture, SiteCatalyst).
When it comes to Reports and Analytics, it can be difficult enough to understand the intricacies of s.props and eVars, let alone specialized variables like the list variable. The fact that there is more than one type of list variable further complicates things.
What is a list variable? It’s a variable that allows you to pass in multiple delimited values (separated by a comma for instance) and run reports on each value separately.
In the early days of SiteCatalyst there was only one variable that would accept a list (s.products). Today I’m going to focus on s.list1, s.list2 and s.list3.
List variables persist like eVars (conversion variables), but with one major difference: how they persist.
If you set an eVar on a page and then set it again on a subsequent page, the second value will “overwrite” the value set on the first page. When an event fires, the last value set in the eVar gets credit for that event. This is not the case with the s.list variables: each value set in a list variable persists until its persistence expires.
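The overwrite-versus-accumulate difference can be sketched with a toy model. This is plain JavaScript, not Adobe code, and the promo names are made up; it just illustrates the two persistence behaviors described above:

```javascript
// Toy model contrasting how an eVar and a list variable accumulate values
// across pages. An eVar holds one value (last set wins); a list variable
// keeps every delimited value side by side until each one expires.
function setEVar(state, value) {
  state.eVar = value; // the new value overwrites the previous one
  return state;
}

function setListVar(state, delimitedValues) {
  // each delimited value is added; all of them persist together
  var values = delimitedValues.split(",");
  for (var i = 0; i < values.length; i++) {
    var v = values[i].trim();
    if (state.list.indexOf(v) === -1) state.list.push(v);
  }
  return state;
}

var state = { eVar: "", list: [] };
setEVar(state, "Promo_A");    // page 1
setEVar(state, "Promo_B");    // page 2: Promo_A is gone
setListVar(state, "Promo_A"); // page 1
setListVar(state, "Promo_B"); // page 2: Promo_A still persists
// state.eVar -> "Promo_B"
// state.list -> ["Promo_A", "Promo_B"]
```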
Less talk… more rock.
To set up an internal click rate report, do the following:
Set up the variables
- Have support enable a list variable (s.list1, s.list2 or s.list3).
- You only have three, so use them sparingly.
- NOTE: If you want to change the list variable’s name to something like “Promos”, you’ll have to do it in admin where the menu is customized.
- Have support set the persistence expiration to “on page”.
- This essentially means that the variable does not persist
- Have support set up the delimiter you want to use for that List Var (I like to use a comma)
- Set up an event variable for “Impressions”
- Set up an event variable for “Clicks”
- OPTIONAL: Set up a “Clicked Promo” eVar
Tag your site
- On the page containing the Promos, pass all the promo names to the list variable and set the “Impression” event variable
- s.list1="Promo_Name_1, Promo_Name_2, Promo_Name_3"
- When a promo is clicked, pass the clicked promo name into the list variable and set the “Click” event variable
- s.list1="Promo_Name_2"
- OPTIONAL: s.eVar17="Promo_Name_2"
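The tagging steps above can be sketched in one place. Assumptions to note: the s object below is a bare stand-in for the real AppMeasurement/SiteCatalyst object, event1/event2 and eVar17 are placeholder slot numbers (use whatever your implementation assigns to “Impressions”, “Clicks” and “Clicked Promo”), and s.tl() stands in for the real link-tracking call:

```javascript
// Stand-in for the SiteCatalyst "s" object (placeholder, not Adobe code)
var s = { list1: "", events: "", eVar17: "", tl: function () {} };

// On the page that renders the promos: pass ALL promo names to the
// list variable and fire the "Impressions" event.
function trackPromoImpressions(promoNames) {
  s.list1 = promoNames.join(","); // comma-delimited, per the delimiter set up by support
  s.events = "event1";            // placeholder slot for "Impressions"
}

// When a single promo is clicked: pass ONLY the clicked promo name
// and fire the "Clicks" event.
function trackPromoClick(promoName) {
  s.list1 = promoName;
  s.events = "event2"; // placeholder slot for "Clicks"
  s.eVar17 = promoName; // optional "Clicked Promo" eVar
  s.tl();               // send the link-tracking call
}

trackPromoImpressions(["Promo_Name_1", "Promo_Name_2", "Promo_Name_3"]);
trackPromoClick("Promo_Name_2");
```

This is a sketch of the flow, not drop-in page code; wire it into your actual page-view and onclick handlers.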
Set up the reporting.
- Create a calculated metric called “Click Rate” (or whatever makes sense to you)
- Click Rate = Clicks / Impressions
- Run the ListVar1 (Promos) report
- Select your “Click Rate” calculated metric as your metric.
If you set up a “Clicked Promo” eVar, you will also be able to calculate conversion to other events, as the eVar will persist (through the visit unless otherwise specified).
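To see why the calculated metric matters, here is the Click Rate math applied to some made-up per-promo totals (the numbers are purely illustrative):

```javascript
// Click Rate = Clicks / Impressions, computed per promo.
function clickRate(clicks, impressions) {
  return impressions === 0 ? 0 : clicks / impressions;
}

// Made-up totals for illustration only
var promoTotals = [
  { promo: "Promo_Name_1", impressions: 1000, clicks: 25 },
  { promo: "Promo_Name_2", impressions: 1000, clicks: 60 },
  { promo: "Promo_Name_3", impressions: 400,  clicks: 30 }
];

var report = promoTotals.map(function (row) {
  return { promo: row.promo, clickRate: clickRate(row.clicks, row.impressions) };
});
// Promo_Name_3 wins on rate (7.5%) even though Promo_Name_2 has the most
// raw clicks -- exactly why click counts alone mislead when impressions differ.
```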
The last and most important step is to use the data to optimize your promos.
A basic 12-bar blues tune is in 4/4 and has three chords. This project is more like a jazz tune in 9/8 with three key changes in each phrase. Don’t hesitate to hit me with questions if you have any.
Rock On – Matt Coen
Close your eyes and imagine that you’re on stage in front of 50,000 screaming fans. You are getting ready to rock their faces off. Your Marshall stack is set on 11, you applied your eyeliner and half a bottle of Aquanet to your hair (at least that’s how we did it in the 80s). You fire up the accordion and start belting out a thrilling version of the Beer Barrel Polka.
What? That isn’t the way you expected your “rock ‘n roll fantasy” to play out?
If you want to rock an audience, it’s often helpful to play a rock song (Weird Al notwithstanding).
If you want your analytics to rock, the data you produce and the analysis you do have to be focused on the desired business outcome. When your measurement and analytics efforts fail, it’s not always because the data is incorrect or the analysis is poor, but sometimes because you are providing the right answer to the wrong question.
In his book “The 7 Habits of Highly Effective People,” Stephen Covey talks about “beginning with the end in mind.” This concept is critical in selecting the right metrics to measure. Incorrectly defining the problem is the analytics equivalent of opening a rock show singing “Roll Out the Barrel.” You might do it flawlessly, but the outcome may not be thousands of screaming fans (though it might involve a different type of screaming).
At this point you may be asking yourself “self… how do I figure out what the right question is?”
This is where things can get a little tricky. If you ask the typical stakeholders what metrics they want, they’re probably going to do one of two things:
- Say they don’t know and ask you to tell them
- Ask for the metrics they’re familiar with
In this situation it is easy to put the blame on the business owner for not knowing what they need. However, I tend to think the problem isn’t that they gave you the wrong answer; it’s that you asked them the wrong question.
A better question to ask might be “what are you trying to understand?” or “what are you trying to accomplish?” As analytics professionals, we should be able to help them find appropriate metrics to answer their questions once the questions have been correctly defined.
There are of course certain questions we simply can’t answer with the tools that we have. If this is the case, we need to explain the situation and look for proxy measures that can give us enough directional information to work with (this may also be an excuse to lobby for those cool new analytics toys we’ve been wanting.)
If you want your analytics to rock, make sure you’re providing the right answer to the right question.
The Huge Insight Report:
In most analytics systems the “Huge Insights Report” is found next to the “It Worked” report and the “Why Our Results Didn’t Really Suck” report. These reports are most often needed when project objectives were not clearly defined, KPIs and targets were not set and best practices were not followed. These reports are invaluable because there’s a meeting with the VP, CEO, Chairman of the Board and someone from an undisclosed “Three Letter” government agency in 90 minutes, and we need to prove (or at least imply) that the project was successful.
Ah… if only the “Huge Insights Report” actually existed. Since it doesn’t, we have to hire the right people, know our tools and spend time searching for “Huge Insights”. While we’re at it, we can forget about gathering the “low hanging fruit”. If there really was any, someone else would have eaten it by now.
What Would Woody Do?
I recently watched a documentary about the famed OSU-UofM football rivalry. The film recounted Woody Hayes falling asleep from exhaustion after watching game film all day.
The typical football game lasts about three hours (including halftime), so why on earth would the COO of the football team (the coach), his assistants and players spend days watching film? The answer is simple: to learn, get better and win. Woody Hayes (love him or hate him) knew what it took to win.
How much time and energy does your organization spend “watching film”, in other words, learning from your analytics tool? Do you have smart senior people in your organization focused on optimizing your marketing dollars?
Until analytics vendors figure out the “Huge Insights Report”, if you want to beat your competition, you might want to ask “What would Woody do?”
It’s easy to get caught up in the details, the tools, the reports and all the other stuff that our jobs entail on a daily basis. I spend a lot of my day helping clients (internal and external) focus on things like objectives and KPIs, but I sometimes have to take a step back and make sure I’m doing the same thing.
We all have to create reports, dig through spreadsheets and learn the inner workings of the tools we use to be good “analysts”, but that simply isn’t enough.
Living in the Nashville area, I’m surrounded by some of the greatest musicians (from all genres) alive, and I’ve been fortunate enough to jam with many of them. I am also fortunate in that I recently began studying bass with Adam Nitti. In my very first lesson we talked about playing every note with “intention”. In other words, treat every note you play as if you are performing to a packed concert hall, even if you are practicing alone. This is one of the marks of a great musician.
So what makes a great analytics person? I believe that intention (i.e., keeping your “eye on the prize”) is one of the critical factors to being successful in analytics. The bottom line is that being good at reporting, data visualization, analysis, tool knowledge, etc., are all prerequisites. Too often we forget that our jobs are about optimization. We will be successful when we know how to make things better, stronger and faster.
Looking at everything we do through the lens of optimization is the key. Approach each task with the intention of learning how to make whatever you are working on more successful and tell “the story” (your reports and presentation) from the perspective of optimization, not data.
- How much is a report worth?
- How much is a 10% lift in conversion worth (you pick the conversion points)?
Want a raise? Become more valuable to your organization by focusing on making everything you work on more successful. If you do, you’ll become an analytics rock star.
Web Analytics Implementer vs. Analyst? The discussion may be getting old but it is an interesting debate none the less and I can no longer keep from chiming in.
Adam Greco started it with his SiteCatalyst Implementation Pet Peeves blog post. Then again, one could argue that Gary Angel started the debate with his rebuttal in which he makes a distinction between an “Analyst’s” approach and an “Implementer’s” approach.
Tim Wilson (my distinguished colleague from Columbus) chimed in with a good summary of the debate (in which both Gary and Adam make good points) and made some interesting points of his own.
My Opinion (and you know what they say about opinions…)
Forget about the whole concept of Implementer vs. Analyst. These are two sides of the same coin. You can’t be good at one without being good at the other.
Back in the days when I was “the client”, I asked my media agency why their “delivered traffic” numbers were so outrageously high. They proceeded to send me white papers detailing why different tools reported different numbers and why they would always be significantly different. That’s a load of Lorenzo Lamas movies.
The real answer was that they had no idea how DART for Advertisers reporting worked. At this point you might say “that doesn’t matter, the numbers should be directionally similar”. Not so in this case. DFA de-duplicates visits on a page (tag) by page basis, not on a user basis. Because ads had different numbers of calls to action (pointing to differing numbers of landing pages), ads had varying degrees of over-counting. This is hairy, so I’m not going to belabor it now. Of this I am sure: if you don’t know the input and how your reporting tools are configured (filters, rules, etc.), you won’t fully understand the output.
Do you want Engelbert Humperdinck Analytics or ZZ Top Analytics? A “Rocking Analytics” system is one that gives you the intelligence you need in a usable, timely and accurate way. A Rocking Analyst has to know what the output means.
Though I don’t share all of Adam’s “Pet Peeves”, most of them are indications that the SiteCatalyst implementation you are working with was not well thought out. This will inevitably make maintenance, analysis and reporting more difficult, less accurate and unnecessarily time consuming. More importantly, these inefficiencies limit the value and usefulness of analytics in your organization and could eventually lead to your “Last Waltz”.
Thanks for reading my first Rocking Analytics blog post! In the music business they say “don’t bore us, give us the chorus”. Thus, I’ll avoid a bunch of “how great I am” nonsense. If you really want to know more about me please read the section conveniently labeled “About”. If not, I don’t blame you.
Do Your Analytics Rock?
Asking yourself these questions will help you understand if your analytics have the potential to live up to the John Camp-Cougar-Mellon song and “ROCK in the USA”.
- What are the two primary things your [insert tactic] (web site, Facebook tab, etc.) is supposed to do?
- What is the most important KPI for each of your stated primary objectives?
- What are your targets/goals for your most important KPIs?
- Are your reporting/analytics tools and processes designed/implemented to support reaching your targets/goals?
- Do you spend the majority of your time learning how to hit or exceed your targets?
You get to grade your own test. If you’re cocked, locked and ready to rock, great. If you hear the accordion player warming up, never fear.
Alignment is the key to success.
If you were listening to a band and all the players were playing different songs, it would probably suck, even if the players were good. The same holds true here: if the Analytics Team, the Business Owner and the Creative Agency are singing different tunes, the results tend to suck.
Getting everyone on the same sheet of music means getting all the players in one conversation and following a process.
- Define and communicate the Business Objectives (what is this supposed to do?)
- Define and communicate the KPIs (how will we know if it did that?)
- Set Targets and communicate them (10% improvement, etc.)
- Report (actual vs. target)
- Analyze and communicate findings (what, where, how can we improve)
- Execute against the findings of your analysis
- Rinse and repeat
You can learn more about steps 1 and 2 of the above process in my colleague Tim Wilson’s (aka Gilligan on Data) “Pocket Guide to Identifying Great KPIs” blog post.