If you subscribe to this blog, you have probably heard this 100 times before. It’s a classic case of some stakeholder or HiPPO wanting to measure clicks on every single element on every single page so they can ask every single question that will waste every single minute of every single day until you burn out or quit. I want to write about this to help underscore the importance of considering outcomes and asking the important “W” questions. You know what those are (Who, What, When, Where, Why).
As a consultant, I found it very easy to ask these blunt questions:
“What’s the purpose of this [thing]?”
“What are our goals specific to this project?”
“Who is our audience?”
“Where should the target audience go from this [thing]?”
“Why would those people take this important action?”
“What can or can’t be changed within our framework?”
“When can we make changes?”
Now that we know the objective, we can form hypotheses about user behavior and tag/measure/test/optimize against those hypotheses to address our bigger organizational goals. Consulting in a nutshell (yes, I know I am trivializing it).
One might think it’s easier to ask these questions on the client side. In some cases I’m sure it is; but when you’re not being paid a premium hourly rate to solve a specific business problem, coworkers and bosses don’t immediately see an incentive to seek answers to these fundamental questions. Crazy? Yes… but it’s more common than you think. So how does this tie back to the “Track Everything” mentality? If it were as simple as asking the questions above, getting our answers, and moving on, then stop reading here – problem solved.
For the other 99.9% of you… asking these questions is NOT the issue; it’s just a symptom of a bigger problem. Here are a few causes I’ve seen.
Lack of Education
- Frequent requests from stakeholders to tag everything on a page
- Specific reporting requests without context
People don’t know what they don’t know. If you’re being asked to track literally everything on a page, maybe that person has never been challenged to think about how analytics is used to drive business decisions. A lack of education in the fundamentals of site optimization will make the “track everything” model seem like a smart decision.
“Why NOT track everything?”
Ultimately, excessive tracking breeds excessive reporting, and excessive reporting causes analysis paralysis: so much data that you can’t make decisions. As an analyst, it’s your job to educate stakeholders on how data is leveraged to make decisions. A better understanding of how decisions are made will yield better questions and (inevitably) fewer tags and reports.
Host functional training sessions where you describe how analytics addresses the specific needs of each function, and underscore how it ties back to the larger project development cycle. This isn’t just about the important metrics; it’s also about helping stakeholders understand how decisions can be made by answering key questions.
Unclear Vision and Strategy
- Stakeholders unable to answer key fundamental questions about projects
- Projects initiated based on a “hunch” with unclear goals
It’s important to take a step back to better understand where the business is going and why. When the goals of the business are ambiguous, you have no leverage to push back when stakeholders want you to adopt the “tag everything” approach, and you won’t have the background to understand what should be tagged in the first place.
If the entire company isn’t aligned on the vision and strategy… it’s a leadership issue.
There isn’t a heck of a lot you can do if the whole company isn’t aligned on the vision and strategy. The best you can do is make assumptions about what action the user should take and see if you’re corrected by leadership. Eventually (with enough corrections), a general idea of the vision/strategy should materialize, even if leadership can’t effectively communicate it. I would also be a little concerned about the effectiveness of that leadership team.
If YOU aren’t aligned on the vision or strategy, ask your sponsor or take a day to conduct some independent research. Put all of your newly-acquired knowledge in the context of your measurement strategy and see if there is opportunity to enhance your implementation.
No Executive Sponsorship
- Key recommendations aren’t being prioritized
- Insights and recommendations receive little visibility
There’s a really great article on executive sponsorship that I read from 2009. Despite being 5 years old, it’s still incredibly relevant (in other news, I can’t believe 2009 was 5 years ago). In it, Brent Dykes mentions:
Choosing the right executive may depend on the maturity of your web analytics program.
In almost all businesses, this applies. Having an executive sponsor is your window into the war room – it’s how you know the best levers to pull. A sponsor also takes on the following responsibilities:
Prioritization: In order to be successful, the web analytics program needs to be aligned with the business. The executive sponsor provides crucial direction to the team so it is always in line with the corporate strategy and top priorities.
Protection: The executive sponsor plays an important role in protecting the web analytics team from other conflicting initiatives or corporate politics.
Problem resolution: Using their clout or influence within the organization, the executive sponsor steps in to remove any problems that could impede the success of the program such as resource or budget constraints.
Promotion: The executive sponsor plays a key role in championing the benefits of the program and promoting its successes within the organization, especially among other executives.
The main idea behind leveraging an executive sponsor is being able to tie your decisions as an analyst back to the larger business vision. When you aren’t aligned with the vision of the company, you’re guessing. When you have to guess, there’s an incentive to measure everything, and there’s no protection from folks who aren’t aligned with the company vision. These are the HiPPOs and project owners who want to “understand everything”.
Absence of Process
- Measurement and optimization are reactive instead of proactive
- Inconsistent project communication (projects deployed without analyst’s knowledge)
Before you apply your slick Six Sigma DMAIC model, you have to have the opportunity to apply each step at the right time. That means your analytics program needs to be able to proactively ask the questions listed above. To do that, you have to be fully integrated into some kind of a Discover/Design/Develop/Deploy lifecycle.
Discover: This is where you define the purpose, audience, and objectives.
Design: This is where you build your measurement plan based on the Discover phase; and this is also when you build hypotheses to analyze and identify opportunities for testing or improvement.
Develop: Site tagging implementation and QA. This is where you verify that you’re actually collecting the necessary data.
Deploy (and post-deployment): Analyze, Identify, Control (or Improve).
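To make the Discover/Design link concrete, here’s a minimal sketch of what a measurement plan might look like as data. The objective, hypothesis, and tag names are all hypothetical placeholders, not from any real implementation; the point is simply that every tag traces back to a hypothesis and every hypothesis traces back to a stated objective, which gives you a mechanical way to spot “just in case” tags.

```python
# Hypothetical measurement plan: objective -> hypotheses -> tags.
# All names below are made up for illustration.
measurement_plan = {
    "objective": "Increase newsletter signups from the pricing page",
    "hypotheses": [
        {
            "statement": "Visitors abandon the signup form at the email field",
            "metrics": ["form_starts", "form_completions"],
            "tags": ["pricing_form_start", "pricing_form_submit"],
        },
    ],
}

def untraceable_tags(plan, deployed_tags):
    """Return deployed tags that no hypothesis justifies --
    candidates for removal rather than reporting."""
    justified = {t for h in plan["hypotheses"] for t in h["tags"]}
    return sorted(set(deployed_tags) - justified)

print(untraceable_tags(
    measurement_plan,
    ["pricing_form_start", "footer_logo_click"],
))
# -> ['footer_logo_click']
```

A tag that shows up in this list is exactly the kind of thing a “track everything” request produces: data with no question attached to it.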
You’ll find that organizations that do not have this process implemented are often accompanied by several of the other problems listed above. More specifically – if you’re an analyst or a manager without an executive sponsor, you might be canoeing without a paddle.
If you have an executive sponsor, understand where each touch point is and get a seat at the table. Where do projects and campaigns originate? Propose sitting in as a fly-on-the-wall to listen in on how these projects begin. After a few meetings, begin participating and asking some of those key questions to firmly establish the key objective(s) of the projects. What problems are we trying to solve? What opportunities do we want to exploit? This should drive the measurement plan. From there it should make sense to justify integration with the Design discussion (if it occurs separately). The executive sponsor should also have your back when you push back on ad-hoc requests.
If you don’t have an executive sponsor, start small. Something as simple as a 15-minute status meeting with the project lead and a clearly-defined agenda should be enough to communicate that you’re necessary in these discovery sessions. It may take time, but eventually the questions you ask over and over will become second nature to the stakeholders; they’ll find themselves asking these questions without you. This takes much longer than it would with an executive sponsor, simply because you may not have the leverage to make folks take action quickly.
Finally, you may not even have these touch points. There may not be formal discovery, design, development, or deployment phases. This is the most challenging because you’re building something from scratch. This is often something that needs to be built from the top-down. While you can plant the seed, there likely needs to be a sponsor who builds this process from scratch or you’ll have to wait for the seed to sprout… which could take a while.
So I want to stress that these are just a few of the problems that can contribute to a “tag everything” mindset. There’s certainly overlap between them: a lack of strategy or vision might directly correlate with not having a process, and a lack of process likely correlates with a lack of education in some cases. At the end of the day we have to start with our desired outcome – the end goal. We want clear, agreed-upon goals that will ultimately drive the tagging and measurement strategy. It’s MUCH easier said than done. What’s keeping you from achieving it?