March 2010


23 Mar 2010 02:48 am

There are more mistruths and FUD about web analytics out there than I think is reasonable.

Part of it is fueled by Vendors. What a competitive bunch!

Part of it is fueled by some Consultants. I suppose the rationale is: self-preservation before all else.

Part of it is fueled by a vocal minority genuinely upset that 10 years on we are still not a statistically powered bunch doing complicated analysis that shifts paradigms. They generally feel it is beneath them to use a standard tool, and they push a utopian world that is hard for anyone to accomplish, including themselves, even after spending a minor fortune.

This is sad. Even a little frustrating.

My problem with these mistruths and FUD is that they result in a ton of practitioners and companies making profoundly suboptimal choices, which in turn results in not just much longer slogs but also spectacular career implosions and the entire web analytics industry suffering.

Let's try to change that. If you agree to help I am confident we can accomplish a lot.

Web Analytics, this beautiful child, was born just the other day in the midst of tumultuous times, quite literally, when everything changes every day. This constant evolution means that every time it learns how to do something, the world changes around it, and then it is on to learning new things to stay relevant.

It has simply not had a break to catch a breath and mature.

And I doubt it is going to happen soon. The web is changing too fast. Too many new things are happening too fast and those of us charged with measuring it have to change the wheels while the bicycle is moving at 30 miles per hour (and this bicycle will become a car before we know it – all while it keeps moving, ever faster).

Yet. Yet. Yet, yet, yet, yet…. there is so much we can do.

Now.

This instant.

We can make use of what we have: JavaScript-tag-driven click data, processed in the cloud, provided through a web-based front end that allows you to segment and create meaningful views of the data unique to you.

Even with the tools we have, in the state we have them, we can be smart. In fact, smarter than you would be through any other channel on the planet!

Don't fall for the FUD. See through the mistruths. Don't go down rabbit holes.

The opportunity is too big for you to be distracted.

In this blog post let me share with you some ground truths from my own humble experience. It's a bit of black and white in a world that admittedly has lots of gray.

My hope is that it inspires you. That it helps you focus your precious time and resources. That it results in you making fewer mistakes.

Finally, that it helps you go kick some bottay!

Here are ten web analytics ground truths….

1. If you have more than one clickstream tool, you are going to fail.

Strong words!

It is perfectly ok to date as many people as you want. It is ok to put them in tough situations (just introduce her/him to your parents!). It is ok to go all the way and see if things click.

Once you make up your mind and get married, practice monogamy. Bigamy is vastly overrated.

Here are some reasons:

    ~ It is really, really hard to make sure you have implemented one tool correctly. Not just the JavaScript tags but the ecommerce customizations, the custom variables / sProps / eVars, the unique campaign tags required by each tool (for search and affiliate and email marketing etc. – see the sketch after this list), the internal site search configuration, the insane JavaScript tag updates just to make the darn segmentation work (except in some like Google and Yahoo! Analytics), the… I could keep going.

    You'll be hard pressed to do one right; doing two is like asking King Kong to slap you. Repeatedly.

    ~ It is really hard to get an organization to use one set of numbers (and remember they are not going to be clean or complete, no matter what you do). Why do you think introducing a completely different set of numbers is going to make your life easier?

    Having two tools guarantees you are going to be a data collection, data processing and data reconciliation organization. Why? Because every tool uses its own sweet metrics definitions, cookie rules, session start and end rules, and so much more.

    You'll have no time for data analysis, certainly not for data actioning.

    ~ It is a bit silly to believe you can use one tool for purpose x (say search analysis) and another for purpose y (say everything else).

    When it comes to proving which campaigns are better and which numbers to report to management, what will you do? How will you make sure you are in every meeting where people bitch and fight about getting credit?
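To make the campaign-tagging pain from the first reason above concrete, here is a minimal sketch of what every campaign link starts to look like once two tools are in play. The utm_* parameters are Google Analytics' standard campaign parameters; the second tool's "cid" parameter and its pipe-delimited value are hypothetical stand-ins for whatever your other vendor's implementation happens to require.

```python
# A sketch of tagging one landing page for two clickstream tools at once.
from urllib.parse import urlencode

landing_page = "http://www.example.com/landing"

# Google Analytics campaign parameters (real parameter names).
ga_params = {
    "utm_source": "march-newsletter",
    "utm_medium": "email",
    "utm_campaign": "spring-sale",
}

# Hypothetical campaign parameter for the second tool; the name "cid" and the
# pipe-delimited value are illustrative, not any vendor's actual convention.
second_tool_params = {"cid": "em|spring-sale|march-newsletter"}

# Every campaign link now carries both sets of parameters, and both naming
# schemes have to be kept in sync, forever.
tagged_url = landing_page + "?" + urlencode({**ga_params, **second_tool_params})
print(tagged_url)
```

Now multiply that by every search, affiliate and email campaign you run, every quarter.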

There is nothing magical about the way clickstream data gets collected by any tool. They are all 95% the same.

Date around, find the one you love, marry it, stick with it.

If nothing else convinces you, remember that clickstream data is a small part of the data you'll use to make smart Web Analytics 2.0 decisions. For big success you'll need to have a Multiplicity strategy:

[Figure: the Multiplicity strategy]

So when you step back and realize that at a minimum you'll also have to use one Voice of Customer tool (for qualitative analysis), one Experimentation tool and (if you want to be great) one Competitive Intelligence tool… do you still want to have two clickstream tools?

Likely not.

2. Omniture cannot save you. Only you can save yourself.

There is an absurd belief that if you buy a paid web analytics tool you'll bathe in milk and honey and insights will magically be delivered.

Paid web analytics tools come with clickstream analysis capabilities that are hobbled on two counts:

    1. They come with legacy problems in their code and architecture that make it nearly impossible for you to do anything fast, or even do simple things like on-the-fly advanced segmentation – you constantly need to change the code and know everything you want to analyze up front.

    2. They will never be as powerful as Yahoo! Web Analytics or Google Analytics, because otherwise the paid Vendors could not upsell you: in the case of Omniture, to Discover 2 and Insight. In the case of WebTrends, replace those terms with Marketing Intelligence / Visitor Data Mart, etc. In the case of Coremetrics… well, you know.

This means when you buy a paid web analytics tool you'll be hobbled until you buy the versions of the product that actually do the job you want (and more).

Now if you decide that you don't want hobbled clickstream tools but would rather buy the complete suite on day one this is what you buy:

An 18-month implementation schedule and a 12-month process of redoing things (life changed in 18 months) and no money for Analysts (you've sent $3 mil to your analytics vendor by now), and you, the lone ranger, have in two and a half years barely managed to deliver improvements to reduce bounce rates for top email campaigns.

Was that what you set out to buy?

[Image: all the data you ever wanted, just no insights]

Know what you are buying. Not insights, as alluring as they sound. You are buying an implementation with a possible future promise of some actionable data three years down the road.

Ready to use Google or Yahoo! Analytics today to make 85% of the decisions you need to make after 3 weeks of implementation?

If you are just starting your analytics journey does it not sound reasonable?

Let's flip the coin.

You already have the paid analytics software combos mentioned above.

It is just as absurd to believe that Google Analytics is better than your Omniture Site Catalyst + Genesis + Discover with a dash of Insight. I have to bang my head on the wall when I hear that someone just replaced Omniture Site Catalyst + Discover with Google Analytics.

Why?

You just spent two years implementing them! And you paid three million dollars!!

There is nothing you get with Google Analytics that you did not already have. In fact with Discover you probably have 12 things Google Analytics can't do (that's why you are paying an additional million dollars plus on top of what you are paying for Site Catalyst!).

Google Analytics can't save you if you already have the set up above or CoreMetrics Analytics + Explore or Unica's NetInsight OnDemand + Customer Insight + PredictivInsight!

If you are still failing then the problem is not the tools.

The problem is you. Your organization. Your skills. Your budget allocation priorities. Your silos. Your HiPPO.

Switching to Google Analytics, in the set up above, is not going to help you.

Fix what's actually broken; it's not your WebTrends combo of Analytics 9 + Visitor Data Mart or your CoreMetrics combo of Analytics + Explore + Benchmark + whateverelseweboughtbecauseitsoundedgoodinthesalespitch.

Org. Skills. Structure. Process. Courage.

The only reason to switch to Google Analytics when you have the above is that you can't fix what's broken (org structure, skills, HiPPO). In that case you might as well save the $3 million you are sending to your web analytics vendor.

3. It is faster to fail and learn than to wait for an "industry case study" or find relevancy in an "industry leader white paper".

I met a small group of top companies in London recently. Post my keynote, the feedback I got was: "Your presentation was powerful, you made a compelling case for how we can do the things you have outlined to take advantage of the opportunity. Do you have some relevant case studies you could share with us?"

I let out a quiet scream.

In this day and age I completely fail to grasp the need for "case studies" and "white papers".


In my offline life I looked for case studies because it was very expensive to try something new; you wanted someone else to have failed already. I wanted a white paper so I could convince my HiPPO (Highest Paid Person's Opinion) that some magnificent Thought Leader had pontificated something, so we should do it.

Most case studies were at best from tangential businesses. 100% of the time the companies did not have the priorities our business was currently executing against, nor were they driving towards the same outcomes.

Yet case studies in some sense reduced risk, even if they were simply overblown marketing fluff written by the vendor.

I don't need case studies now, not on the web.

Why?

If someone tells me that vanity URLs are a great way to start measuring multi-channel impact, then I can just try it with 500 times less effort than it would take me to find a case study.

If I go to a conference and hear that doing test and control experiments is a great way to measure cannibalization by paid search links on well-ranked organic keywords, then I can just run a small test myself and see if it works for me.
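To show how small that small test really is, here is a minimal sketch of the read-out, assuming you paused paid brand ads in a set of test markets for two weeks, left them running in comparable control markets, and pulled total brand-keyword visits (paid plus organic) for the period before and during the test. All names and numbers below are made up.

```python
# A sketch of a test/control read on paid-vs-organic cannibalization.

def pct_change(during: float, before: float) -> float:
    """Percent change from the pre-period to the test period."""
    return (during - before) / before * 100

# Total brand-keyword visits (paid + organic), illustrative numbers.
test_before, test_during = 42_000, 39_500        # paid brand ads paused here
control_before, control_during = 40_800, 41_300  # paid brand ads kept running

test_change = pct_change(test_during, test_before)
control_change = pct_change(control_during, control_before)

# If total brand traffic barely drops in the test markets relative to control,
# organic listings were soaking up most of the clicks you were paying for.
print(f"Test markets:    {test_change:+.1f}%")
print(f"Control markets: {control_change:+.1f}%")
print(f"Net effect of pausing paid: {test_change - control_change:+.1f} points")
```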

If you blog that a short on-exit survey or a feedback button is a great way to collect voice of customer, I don't have to be lazy or hyper-paranoid and wait for a convincing case study. Both of those are available for free; I'll just implement one and be my own case study.

Email campaign ideas, content improvement, behavior targeting, testing product prices, hiring a supposedly awesome consultant, using offline calls to action, measuring the impact of television on the web, opening a Twitter account for a B2B business, doing…

Anything you can think of, I can try. Usually for free. Usually with a modest effort. Usually at least as a test.

I can fail or succeed all by myself in my unique circumstances delivering for my unique business goals in my own organization.

Why do I need a case study?

And neither do you.

There is so little risk to actually trying. You don't need no stinking false comfort that something worked for someone else.

Fail faster.

[I realize that for some HiPPOs old habits die hard; they won't even let you run a report without seeing a case study. Update your resume and start looking for another job - because the org you are with will never be as successful as it should be. Meanwhile see if you can convince your HiPPO to run a small test while you look for a case study (and a job).]

4. You are never smart enough not to have a Practitioner Consultant on your side (to constantly help you kick it up a notch).

The field of web analytics (especially 2.0) changes far too much in far too short a time.

That's because the web changes too fast (and vendors that don't update their software to take advantage of these opportunities every quarter will die).

Yet companies, falsely, believe that they can keep pace and do it all with no external help.

That almost never works. Because…

    1. You are far too busy actually reporting and analyzing to keep pace with all the wonderful evolution.

    2. It is cheaper to get someone to answer your question at $60 or $80 or $100 or $150 an hour than spend a week "trying to figure it out".

Hire a Practitioner Consultant (someone who does not just speak at conferences but actually rolls up her sleeves and does the dirty work) on some kind of a retainer, or buy a bank of hours you can cash out, say, during the next six months (or whatever) and get solutions delivered to you. You focus on taking action.

I recommend this blog post: Web Analysis: In-house or Out-sourced or Something Else?

[Figure: consultant / client engagement stages]

In it I describe four stages into which each company fits (in terms of its current analytics evolution) and what you should expect from a consultant in each stage.

This will help you figure out exactly what you might need and hold your consultant accountable.

Here are three additional tips about hiring consultants, from my humble experience:

    1. Compute how long the person has been consulting; call this X. Compute how long the person has actually worked as a practitioner in a real company (hopefully in your industry); call this Y.

    If X > Y, it is possible the consultant might be disconnected from the reality of what it really takes to get businesses to use data (and no, it is not just tool expertise). [This means I have 3.5 yrs left to be a hands-on practitioner consultant!]

    If X >> Y (substantially greater :), avoid.

    2. If you can, try to hire an independent external consultant.

    It is not that the consultants at Omniture or CoreMetrics or WebTrends are sub-standard; they are Absolutely Not. But they do face the dual pressures of selling you more consulting and up-selling products. If you have an independent consultant, they only try to sell you more consulting! :)

    That is the reason I am partial to hiring authorized consultants for Google Analytics (GAACs) and Yahoo! Analytics (YWAACs) or, for Omniture / CoreMetrics / WebTrends, going with someone such as Stratigent or Zaaz.

    Oh and don't forget rule #1 above.

    3. Do a Google search for the Consultant. Read what people say about them. Read what they say about themselves and others. Read how they contribute to the blogosphere, to forums. Form an opinion, then hire.

    If possible hire a nice person. Life is too short to work with jerks, no matter how skilled or knowledgeable they are.

Good consultants will help you stay current, solve problems faster, deliver solutions and not just reports, and allow you to focus on analyzing data and finding insights.

5. Your job is to create happy customers and a healthier bottom-line.

If you think your job is to analyze the "numbers" your career will be limited.

People (you? :) whose job it is to do "the data thing" spend day after day after day in analytics tools producing numbers (if they have time left over from tagging, begging IT, changing tags, turning down vendor up-sells, begging the vendor for more sProps and eVars and ASI slots…).

Numbers with data and tables and graphs and pivots and font sizes and automated PDFs and… a lifetime spent producing numbers.


Here's a major reason why all that effort, the numbers deluge, changes nothing for a company:

    You / me / they never ever bother to actually go to the website.

    Never bother to search for their company and look at the paid and organic results (to find broken things).

    Never bother to sign up for their own email campaigns (to see how much they stink).

    Never bother to buy something on their site and see the torture live.

    Never bother to try and return the product/service purchased via the site (and see how much that stinks).

    Never bother to visit competitor sites and find nice or terrible things (to take advantage of).

    Never bother to do an online usability study (it just costs $20 a pop!!).

    Never bother to….

Look, if you are not going to go out there and feel the heat how do you expect to get the insights you need about where to focus and what to do?

Your web analytics tools only provide you with numbers. Then it's up to you. And you can only begin to focus, prioritize, find stories and fixes and opportunities if you actually immerse yourself in understanding what you are supposed to analyze.

Walk in the customer's shoes so you'll understand how much your site stinks (then find the numbers that help prove that, or not). Email people who have placed orders and ask them about their frustrations. Answer tech support emails for a day.

Every single day ask yourself this question: What amongst the data I have provided today will create happier customers tomorrow?

If you don't have a direct line of sight from your work to happy customers, you are doomed.

Ditto, perhaps even more so, if you are not incessantly focused, every single day, on providing data stories (or "info snacks") that help improve your company's bottom-line. Every day. Wait. I said that already. : )

If you, the "Web Analyst", don't believe that you hold in your hands the power to change your company's existence, then you are either at the wrong company or, more likely, in the wrong job.

6. If you don't kill 25% of your metrics each year, you are doing something wrong.

In ancient times we would hire Accenture or some such august consulting group to come in, spend six months systematically going through the business, and recommend Measurable Success Factors (shorthand: metrics). Those would then be carved into stones, handed to the Good Lord's messenger, and the rest of us would forever follow the commandments unquestioningly, no matter what happened.

While I am exaggerating a bit for effect, most web businesses, if they identify key metrics at all, never go back to revisit and revalidate them.

It should not come as a surprise that after just a few months you find that no one looks at your dashboards, no one can seem to find insights from the data and the company has reverted to "faith based initiatives" rather than "data driven initiatives".

The web changes too fast for us to believe that we can be stationary with 1. our measurement strategies, 2. our what-to-focus-on priorities, and 3. our success measures.


We need to change our measurement strategies as changes occur in:

    1. Marketing strategies (from forums to display to search to social to mobile to…)

    2. Business priorities (no we are not doing ecommerce, we want leads!)

    3. Structure, purpose, audience (oh my!)

    4. Available measurement technologies (ohh…. sentiment analysis!)

    5. Skill set available (wow, we finally got someone who knows what r-squared is?)

    6. HiPPO's bonus measurement metric (you will never succeed unless you are trying to get the person on top promoted or a higher bonus; stay very closely informed as to how they get paid, find insights that solve for that, and you will have eternal love and a data-driven org)

All of the above happens all the time to every website. So why should your reports, dashboard, measurement priorities and "Measurable Success Factors" stay stagnant?

By forcing yourself to have a target for killing metrics you are ensuring that you'll focus on an important activity once a quarter. You'll revisit your assumptions and what's important to the business. You'll be forced to talk to HiPPOs, Marketers and pretty much anyone who currently consumes the output from you/your team.

And that, as Martha would say, is a good thing.

[Allow me to point out that only 50% of the metrics I love exist in clickstream tools - like WebTrends or XiTi or Unica. The other 50%, the ones that help drive key changes to the business, exist in other places. Metrics like: Multi-channel value index. Impression Share. Task Completion Rate. Keep that in mind when you choose metrics to ensure you are not over-leveraged in metrics that don't matter.]

[Bonus Reading: Five Rules for High Impact Web Analytics Dashboards.]

7. A majority of web analytics data warehousing efforts fail. Miserably.

There are few investments as overrated as building a catch-all massive data warehouse to give you the "global cross functional multi channel single view of the customer experience and lifetime value on demand through a business intelligence report powered by an econometric model that takes into account page view probabilities using the Clopper-Pearson binomial confidence interval".

Yet that is exactly how internal data warehousing projects are championed or external cloud based data warehousing solutions are sold by vendors.

As of 2010 I have still spent a lot more years in the traditional data warehousing / business intelligence world than in web analytics. I have personally executed data warehousing projects for web data (in the broadest sense), and they have mostly been miserable failures. [Warning: There is a distinct possibility that I am the problem here!]


Here are some problems you face with web data (when it comes to warehousing):

    1. There is too much granular data! Yes, yes, I have purchased the Netezza appliance; yes, others promise "massively parallel processing data warehousing appliances". The problem is not the hardware or the hardware company, the problem is the amount and type of data (most of it is actually worthless, even if you can get much of it into the warehouse). Things of course get worse when you think of warehousing in traditional software-only solutions.

    2. The data is rarely deep (say about a person), is mostly anonymous (about a person) and full of holes (cookies, scripts off, plugins). This runs counter to the strengths of what data warehousing is able to pull off so well with offline data (years and years of data too).

    3. Warehouses expect logical structures and relationships; you'll be astonished at how little of this exists in your web analytics data (see the reason above).

    4. It is worse than extracting all your teeth with a toothpick to try and get your offline data merged with your online data (even if, and it is a BIG IF, you can get the requisite primary keys).

    5. BI tools stink at answering questions web analytics tools answer with ease (how many people clicked on a link on our home page, how many sessions from the keyword "avinash" came from Google and abandoned products in their cart…). This means trying to replace a WA tool with a Warehouse only results in an organization slowing down further.

    6. Campaigns, tags, links, metadata (if any exists), data relationships, metrics, website URL structures, etc. create a constant demand for changes to the underlying structure of your data warehouse every single day. Yet no DW team is organized to execute on a daily schedule; you'll be lucky to get monthly. None of the aforementioned is a problem for your web analytics tools.

I could keep going on. Please, please, please make sure you don't make a decision to invest millions of dollars (that's what it will take, by the way, for a Fortune 5000 company) based on the promise of data warehousing; look at the reality and apply that filter. It will be humbling.

Oh and before you tell me that you want to build a data warehouse to store history let me point you to this blog post: History Is Overrated. (Atleast For Us, Atleast For Now.) Please give it a quick read and make sure the traps outlined there don't exist in your case.

History, and historical comparisons, beyond the last 13 months are vastly overrated, and almost never worth the cost that data hangs around your neck.

There is always one exception to the rule. :) It can be of some value to take aggregated data about your visitors (especially those that converted) and put it into your corporate data warehouse where all the other data of your company sits. This allows you to do strategic analysis of your web acquisitions in the context of retail, call center, etc.

Not the page-level analysis type (that's tactical!), but rather the cross-channel purchases and returns, etc. (the real strategic kind).

Think really, really hard before you buy the hype of web analytics data warehousing. These projects tend to be expensive multi-year commitments that rarely deliver even nominal value, no matter how much vendors and consultants hype them.

It is possible that you'll be the exception and build the first clickstream data warehouse that delivers positive ROI (against the Total Cost of Ownership). But even if 110% of the signs point to that, first make sure you have aggregated all the marginal gains.

It would be silly not to pick up the high-ROI, low-cost stuff first, right?

8. There is no magic bullet for multi-channel analytics.

The reason you have had a hard time finding a multi-channel (online plus offline) analytics solution is… it does not exist!

And here's the thing, it won't for quite some time. The problem is the missing primary keys, and we won't solve it in the near future.

Yet there are Vendors that blatantly say they provide a "comprehensive integrated multi channel solution" and imply that they can track every interaction across any channel and help you compute "true ROI".

It is a bunch of @#%^*

The best such solutions do is sell you a campaign management solution for your offline marketing activities, with some possibility of running those campaigns (think email) online as well. In the most optimistic scenario what you'll get is the response rate from a mailer (postal) and from an email campaign, because the email campaigns were auto-tagged.

That's it.

They won't help you understand the impact of search on store sales, they won't help you understand the impact of TV on your website (not without massive pain, even after you buy the "comprehensive integrated multichannel solution"), they won't help you… well, a lot of things.

Be wary. Be very, very wary of these people/solutions.

Now make no mistake… measuring multi-channel impact (non-line marketing baby!) is critically important. You *should* do it.

But it is a long hard slog. It requires people, it demands begging many people in your company and agency to cooperate with you, it mandates building custom solutions, it needs lots of creative thinking. There is also a big payoff in the end, just no easy answers.

You'll need a portfolio strategy (from my book Web Analytics: An Hour A Day, Page 235):

[Figure: multichannel marketing value analysis framework]

Here are two blog posts that comprehensively outline why multi-channel analytics is important, what the problems are, and a portfolio of 11 solutions you can deploy:

Updated versions of the strategies outlined in the above posts are in Web Analytics 2.0 (starting on page 368, in case you have the book).

9. Experiment, or die.

Let me beat this dead horse one more time. Sorry.

If you don't have a robust experimentation program in your company you are going to die.

It is just a matter of time.

[I know, I know, it seems like we have been through this so many times, and I also know that secretly you know how critical this is; sadly, others stand in your way.]

In today's world there are so many questions that we can't answer with any degree of certainty (even with petabytes of data!). Here are some such questions…

    1. How much cannibalization happens between paid and organic search for my brand keywords?

    2. What is the online impact of my promotional flyers sent in postal mail?

    3. What is the optimal price I should charge for my product to maximize profits?

    4. Should I go for overwhelming, pungent, or just plain pukey for my home page design?

    5. Should I show an Add To Cart link only for our own ecommerce store, or also for other places on the web where people can buy the exact same product (often cheaper, so people buy a lot more of what they might not have bought at all)?

    6. What is the impact of having a live twitter feed of all mentions on each product page of our website?

    7. Will people from Ireland buy that?

Your imagination is the only limitation in terms of the hypotheses and "I wonder…" ideas that you come up with every day.

Yet Site Catalyst and Unica and Google Analytics and IndexTools stink at answering all of the above questions.


But if only you could answer any one or two of the above, it would dramatically alter how you do business online.

Oh and when I say Experimentation I don't mean testing button sizes (BOO!). I mean doing big important things that matter (every one in the list above, and more).

Start with something simple: try three different layouts of your home page, the product line page and the highest-trafficked landing page. You are on your way to A/B testing. Progress points? 20.
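If it helps, here is a minimal sketch of reading the result of such a test, assuming you have counted visitors and conversions for two of those layouts. The two-proportion z-test below is a standard way to check whether the difference is real; the numbers are made up.

```python
# A sketch of checking whether two page layouts really convert differently.
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Layout A: 180 conversions out of 9,000 visitors; Layout B: 235 out of 9,100.
p_value = ab_test_p_value(conv_a=180, n_a=9_000, conv_b=235, n_b=9_100)
print(f"p-value: {p_value:.4f}")  # well below 0.05 here, so the layouts differ
```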

Next, move to changing two things at one time on your product description pages. That's multivariate testing. Progress points? 25.

Now you are ready for the kind of testing that is life changing: running controlled experiments! [Web Analytics 2.0: Pages 205 - 208.]

Controlled experiments cover most of the questions in the list above. They will help answer the almost unanswerable, from cannibalization to multi-channel impact to brand impact and more. Aim for this.

Hire at least one or two people dedicated to experimentation (not just A/B testing, or Google Website Optimizer / Test&Target) on your team if you are a large company, and part of a person if you are medium-sized.

If you want to truly be data-driven, if you want to crush your competition, if you want to really win on the web, then all roads lead through robust experimentation.

10. The single most effective strategy to win over "stubborn single-minded" HiPPOs is to embarrass them.

Finally, perhaps the bane of our existence: the magnificent HiPPO (the Highest Paid Person's Opinion).

Our beloved HiPPOs bring their entrenched mindsets, loud voices (in terms of power) and performance-review-writing authority to bless our projects or, more likely, stand in the way of progress.

Often HiPPOs don't impede progress / change or crush valid opinions / suggestions out of malice. Sometimes they don't know this interweb thing as well as they should, sometimes they know things have worked a certain way forever and they are reluctant to try new things, and other times they are convinced that they are right (even when they are magnificently wrong).

Net net things are rarely as cute as this…

[Image: cute hippo]

Here is what does not work when it comes to convincing HiPPOs:

    1. Your opinion. Really, no one cares what you or I think (not that high in the organization).

    2. Repeating yourself time and again.

    3. Data puking (though we tend to think of it as data persuasion).

Here is what does work with heavenly precision: embarrassment.

Their embarrassment.

You just have to be nuanced (to ensure you don't make the above three mistakes).

Your two BFFs in the HiPPO's nuanced embarrassment:

    1. Data about your competitors (and your performance against that data set)

    2. The voice of your customers (and your awesomeness or suckiness that shines through that)

I know only a handful of HiPPOs who can sit still when competitors crush them (especially when that is the result of their own opinions being actioned!). I know only a couple of HiPPOs who, once made aware of it, will ignore the pain of customers.

Here are six specific strategies you can use to move even the heaviest of HiPPOs:

    # 1: Implement an Experimentation & Testing Program.

    # 2: Capture Voice of Customer. Surveys, Remote Usability, Etc.

    # 3: Deploy the Benchmarks I Say, Deploy 'em Now!

    # 4: Competitive Intelligence is Your New Best Friend.

    # 5: Hijack a Friendly Website (/ Earn Your Right to be Heard).

    # 6: If All Else Fails. . . . .

Please check out this blog post for additional details and examples for each recommendation: Lack Management Support or Buy-in? Embarrass Them!

Next time you see me don't complain about how your hands are tied and your boss is a pain or how you feel like the loneliest person in the world and no one understands you. Your destiny is in your hands, use the strategies above, go after your HiPPO (respectfully), and make change happen!

EOM. Phew!

If I could summarize the philosophy I have formed from a lifetime of bruises, it would be this…

The only way to succeed in Web Analytics is to: Be agile. Be flexible. Move fast.

Decisions you make today based on data you have right now will have a greater impact on your business than decisions you can make in the future based on solutions you will implement over the next eighteen months with data that will be so perfect it is as if God is speaking to you.

Ok now it's your turn.

What do you think of the ten fundamental truths? Agree with 'em? Vehemently disagree? Got a #11 you would add? Perhaps not just #11 but #11 through 16? :) Please share your thoughts / feedback / criticism / love via comments.

It would be fabulous to hear from you.

[A Small Contest:]

My online learning startup Market Motive is holding a small contest to award a scholarship for a Master Certification course ($3,500) in Web Analytics. The course starts on April 15th. Our goal is to give someone deserving an opportunity to become a Ninja.

If you think you could gain value from a three month structured course (with exams and quizzes!) then please contact me. Here are the rules… please e m a i l me the following…

1. A short (really short) paragraph on why you want the scholarship.
2. Pick a site you love and tell me three things you would change about it, and why.

That's it.

Please fit the whole thing on one page (6-point font automatically disqualified! :)).

Contest close date: March 31st.

Thanks.
[/A Small Contest:]

08 Mar 2010 02:18 am

Data, data everywhere, yet nary an insight in sight.

Is that your web analytics existence?

Don't feel too bad, you share that plight with most citizens of the Web Analytics universe.

The problem? The absolutely astonishing ease with which you can get access to data!

Not to mention the near limitless potential of that data to be added, subtracted, multiplied, and divided to satiate every weird need in the world.

You see just because you can do something does not mean you should do it.

And yet we do.

Like good little Reporting Squirrels we collect and stack metrics as if preparing for an imminent ice age. Rather than being a blessing that stack becomes a burden because we live in times of bright lovely spring and nothing succeeds like being agile and nimble about what we collect, what we give up, and what we deliberately choose to ignore.

The key to true glory is making the right choices.

In this case it's making the right choices about the web metrics we knight and send into battle to come back with insights for our beloved corporation to monetize.

A very simple test can allow you to figure out if the metric you are dutifully reporting (or absolutely in love with) is gold or mud.

It is called the Three Layers of So What test. It was a part of my first book, Web Analytics: An Hour A Day.

What's this lovely test?

Simple really (Occam's razor!):

Ask every web metric you report the question "so what" three times.

Each question provides an answer that in turn raises another question (a "so what" again). If at the third "so what" you don't get a recommendation for an action you should take, you have the wrong metric. Kill it.

[Figure: the Three Layers of So What test]

This brutal recommendation is to force you to confront this reality: If you can't take action, some action (any action!), based on your analysis, why are you reporting data?

The purpose of the "so what" test is to undo the clutter in your life and allow you to focus on only the metrics that will help you take action. All other metrics, the ones that fall into the "nice to know" or "highly recommended" or "I don't know why I am reporting this but it sounds important" camps, need to be sent to the farm to live out the rest of their lives!

Ready to rock it?

Let's check out how you would conduct the "so what" test with a couple of examples.

Key Performance Indicator: Percent of Repeat Visitors.

You run a report and notice a trend for this metric.

Here is how the "so what" test will work:

"The trend of repeat visitors for our website is up month to month."

So what?

"This is fantastic because it shows that we are a more sticky website now."

(At this point a true Analysis Ninja would inquire how that conclusion was arrived at and ask for a definition of sticky, but I digress.)

So what?

"We should do more of xyz to leverage this trend." (Or yxz or zxy – a specific action based on analysis of what caused the trend to go up.)

So what?

If your answer to that last "so what" is: "I don't know… isn't that a good thing… the trend is going up… hmm… I am not sure there is anything we can do… but it is going up right?"

At this point you should cue the sound of money walking out the door.

Bottom-line: This might not be the best KPI for you.

Let me hasten to point out that there are no universal truths in the world (though some religions continue to insist!).

Perhaps when you put your % of Repeat Visitors KPI to the "so what" test you have a glorious action you can take that improves profitability. Rock on! More power to you!


Key Performance Indicator: Top Exit Pages on the Website.

[Before we go on please know that top exit pages is a different measurement than top pages that bounce.]

You have been reporting the top exit pages of your website each month, and to glean more insights you show trends for the last six months.

"These are the top exit pages on our website for the last month."

So what? They don't seem to have changed in six months.

"We should focus on these pages because they are major leakage points in our website."

So what? We have looked at this report for six months and tried to make fixes, and even after that the pages listed here have not dropped off the report.

"If we can stop visitors from leaving the website, we can keep them on our web site."

So what? Doesn't everyone have to exit on some page?

The "so what" test in this case highlights that although this metric seems to be a really good one on paper, in reality it provides no insight that you can use to drive action.

Because of the macro dynamics of this website, the content consumption pattern of visitors does not seem to change over time (this happens when a website does not have high content turnover, unlike, say, a rapidly updating news site), and we should move on to other actionable metrics.

Here the "so what" test not only helps you focus your precious energy on the right metric, it also helps you logically walk through measurement to action.


Key Performance Indicator: Conversion Rate for Top Search Keywords.

In working closely with your search agency, or in-house team, you have produced a spreadsheet that shows the conversion rate for the top search keywords for your website.

"The conversion rate for our top 20 keywords has increased in the last three months by a statistically significant amount."

So what?

"Our pay-per-click (PPC) campaign is having a positive outcome, and we should reallocate funds to these nine keywords that show the most promise."

Okay.

That's it.

No more "so what?"

With just one question, we have a recommendation for action. This indicates that this is a great KPI and we should continue to use it for tracking.

Notice the characteristics of this good KPI:

#1: Although it uses one of the most standard metrics in the universe, conversion rate, it is applied in a very focused way – just the top search keywords. (You can do the top 10 or top 20 or as many "head keywords" as makes sense in your case; just be aware this does not scale to the "mid" or "tail".)

#2: It is pretty clear from the first answer to "so what?" that for this KPI the analyst has segmented the data between organic and PPC. This is the other little secret: no KPI works at an aggregated level to give us insights by itself. Segmentation does that.
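For illustration, here is a minimal sketch of that segmentation, assuming you have exported keyword-level visits and conversions with a medium column (organic vs. cpc). The column names and numbers are invented for the example.

```python
# A sketch of segmenting conversion rate by keyword and medium with pandas.
import pandas as pd

data = pd.DataFrame({
    "keyword":     ["avinash", "avinash", "web analytics", "web analytics"],
    "medium":      ["organic", "cpc",     "organic",        "cpc"],
    "visits":      [12_400,    3_100,     8_900,            2_700],
    "conversions": [310,       124,       178,              94],
})

data["conversion_rate"] = data["conversions"] / data["visits"]

# Conversion rate per keyword, split by organic vs. paid; the aggregate
# number alone would hide the difference between the two segments.
by_segment = data.pivot_table(index="keyword", columns="medium",
                              values="conversion_rate")
print(by_segment.round(3))
```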


Key Performance Indicator: Task Completion Rate.

You are using an on-exit website survey tool like 4Q to measure my most beloved metric in the whole wide world and the universe: task completion rate. (You'll see in a moment why. :)

Here's the conversation…

"Our task completion rate is down five points this month to 58%."

So what?

"Having indexed our performance against that of last quarter, each one percent drop causes a loss of $80,000 in revenue."

So what? I mean, in the name of Thor, what do we do?!

"I have drilled down to the Primary Purpose report and most of the fall is from Visitors who were there to purchase on our website, the most likely cause is the call to action on our landing pages and a reported slowness in response when people add to cart."

Good man. Here's a bonus and let's go fix this problem.

Nice right?

Notice that in this case you have an inkling of the top super absolutely unknown secret of the web analytics world: if you tie important metrics to revenue, that tends to get you action and a god-like status.
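To make that secret concrete, here is a minimal sketch of the arithmetic behind the conversation above, assuming the $80,000-per-point figure from the indexed performance and the five-point drop to 58%.

```python
# A sketch of translating a task completion rate drop into dollars at risk.
REVENUE_PER_POINT = 80_000   # dollars lost per one-point drop, per the index

last_month_rate = 63.0       # percent (the assumed starting point)
this_month_rate = 58.0       # percent, i.e. down five points

points_lost = last_month_rate - this_month_rate
estimated_impact = points_lost * REVENUE_PER_POINT

print(f"Task completion down {points_lost:.0f} points "
      f"puts roughly ${estimated_impact:,.0f} at risk")
```

Five points at $80,000 a point is roughly $400,000 on the table, which is exactly the kind of number that gets a fix prioritized.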

Keep that in mind.

So that's the story of the "so what" test. A simple yet effective way of identifying the metrics that matter.

This strategy is effective with all that we do, but it is particularly effective when it comes to the normal data puke we call the "management dashboard". Apply the "so what" test and you'll make it into a Management Dashboard.

Closing Summary:

Remember, we don't want to have metrics because they are nice to have, and there are tons of those.

We want to have metrics that answer business questions and allow us to take action—do more of something or less of something or at least funnel ideas that we can test and then take action.

The "so what" test is one mechanism for identifying metrics that you should focus on or metrics that you should ditch because although they might work for others, for you they don't pass the "so what" test.

And killing metrics is not such a bad thing. After all this is the process that has been proven to work time and time again:

[Figure: the web analytics metrics lifecycle process]

More here: Web Metrics Demystified.

Ok now it's your turn.

Do you have a test you apply to your web metrics? What are your strategies that have rescued you during times of duress? What do you like about the "so what" test? What don't you like about it? Do you have a metric that magnificently aced the "so what" test?

Please share your comments, feedback and life lessons via comments.

Thanks.

PS:
A couple of other related posts you might find interesting: