November 2008


24 Nov 2008 02:18 am

An Analysis Ninja, let's call him Philip Walford, asked a delightful question. Philip wanted to know if the impact of a faith-based initiative in his company, product demo videos, could actually be measured using data.

Hurray!

Faith is good. Data is better. : )

[And before you flame me: know that I love my religion more than you love yours. Wait. That did not come out right. Let me rephrase that.]

In this Thanksgiving week 2008 post I'll share Philip's question about how to identify the value of video product demos on an ecommerce site, and my answer about involving customers.

Here's Philip. . . .

We are a large retailer with a lot of product on our site. In the past we have invested lots of dollars and time producing things like demo videos for our products, or adding other features and tools to our website to provide more information about a product. Our goal is to inspire customer confidence in their purchase (by giving them as much information as possible).

The question is: what are the KPIs for something like a demo video?

[Image: video product demos]

My recommendation was to measure conversion rate for the segment that views the video. If conversion is higher, then the videos are bringing value. Others in my company have presented the hypothesis that only customers who are already heavily invested in buying the product are likely to click on the video link and are hence "pre-qualified"; that segment would have had a higher conversion rate regardless.

I understand their perspective, but I feel they are reading too much into the situation, and I don't know how to argue the point. There are several directions we could go with this, but I wanted to see if you could share some guidance on this issue.

My answer to Philip. . . .

This is a more complex problem than might be apparent on the surface.

It is also an example where it can be easy to jump into bed with your web analytics tool to get satisfaction but you wake up in the morning feeling. . . . well. . . . less than satisfied.

But before we go there I have to give a ton of credit to Philip and his crew for being skeptical of reading too much into their own opinions or biases.

I firmly believe that people who work for a company rarely (never!) represent customers. They are too close to the company and too different.

Just because I work for Microsoft and use a Zune (yes I do!) does not mean I can be an effective customer representative of Microsoft Zune customers. Company employee opinions rarely reflect those of customers. Do please be aware of that.

So when looking to make decisions, look for data (quant or qual).

I'll present Philip with three solutions / options as he battles the challenge of figuring out if the investment of muchos dineros in creating product videos is worth it (besides the fact that these videos ooze sexiness!).

1) Use ClickTracks (Compute Contextual Influence)

There are two challenges with using clickstream data and the "typical" measure of conversion rate to determine success.

A] You might be looking at a "biased" segment (as the challengers to Philip's recommendation mentioned), i.e. only the highly motivated people.

B] By comparing all people who converted and viewed the video with those who converted and did not see the video you are not comparing fair segments. You are also lumping all the other "convince our visitors to buy" tools into one large bucket. Tools like Comparison Charts and Product Screenshots and Product Information and Customer Reviews and more.

It is quite possible that those other tools might be getting people to convert at a much higher rate, and by lumping them all together you are not being fair.

And of course you'll get a wrong read on the conversion impact of the videos.

So even if you use your web analytics tools (your Google Analytics or Omniture or WebTrends or CoreMetrics or whatever), try to compute "contextual influence" (the value of each feature in the context of the others).

It is actually very hard (damn near impossible) to do this in all those tools (even for the Paid solutions, even after you plunk down half a million dollars for the mandatory Data Warehouse "add on").

ClickTracks is the only tool I know of that can do this out of the box, using its terribly named "funnel report". No data warehouse. No extra tags or variables or sprops or wt_&*#$. In fact, not even much IT; I just need admin access to my tool (not the site, the web analytics tool).

It's easy to use. Create a hierarchy of your website. Add individual pages or groups of pages into each stage (notice I did not say step, because you can jump stages here). Add an outcome (in my case, say, the "Thanks for placing your order" page). Click Calculate.

Boom!

[Image: ClickTracks funnel analysis report]

[You are not supposed to be able to read the analysis, sorry, privacy dictates that.]

There are two things I want you to note.

First, each stage represents a view of the site (and, like a traditional funnel, it shows how many people get in, get out, move on, etc.).

Secondly, note that each box (which represents a page, a group of pages, or a tool – videos, comparisons, reviews, etc.) has a different shade of blue.

What this lovely report does for you is compute "the influence" of each of those pages/tools in driving the ultimate outcome – here, a purchase. The darker the blue, the more "influential" that piece of content. [Influence is defined by the existence of that piece of content in the visitor session, regardless of what path the visitor took, regardless of when the content was seen.]

Ain't that super sweet?

The analysis you see above is for a real ecommerce website. What it proved to us, delightfully, was that the product videos, which we had created at a cost of over one hundred thousand dollars (yellow star above), were the least influential tool we had on our site.

The most influential, the sexy pink star above, was a tool that had cost us $8 to produce – it was a page that compared different versions of the product (information that was handily available in the company).

We used actual customer behavior. We analyzed contextual segments. Ultimately it allowed us to put our precious few resources in the right area.
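
If your own tool won't give you a report like this, the core computation is simple enough to sketch yourself, provided you can export session-level data. Here is a minimal illustration of the definition above (not ClickTracks' actual algorithm; the export format and column names are assumptions):

```python
# A minimal sketch of "influence" as defined above: for each piece of content,
# take every session in which it appeared (regardless of path or order) and
# compare that segment's conversion rate to the site overall.
# The session export below is hypothetical; real exports will look different.
import pandas as pd

sessions = pd.DataFrame([
    {"session_id": 1, "content_seen": {"video", "reviews"},    "converted": 1},
    {"session_id": 2, "content_seen": {"comparison"},          "converted": 1},
    {"session_id": 3, "content_seen": {"video"},               "converted": 0},
    {"session_id": 4, "content_seen": {"comparison", "video"}, "converted": 0},
    {"session_id": 5, "content_seen": {"reviews"},             "converted": 1},
])

overall_cr = sessions["converted"].mean()

for tool in ["video", "comparison", "reviews"]:
    segment = sessions[sessions["content_seen"].apply(lambda seen: tool in seen)]
    conv_rate = segment["converted"].mean()
    print(f"{tool:12s} sessions={len(segment):2d} "
          f"conv rate={conv_rate:.0%} lift vs site={conv_rate - overall_cr:+.0%}")
```

Remember this is still observational data, so the self-selection concern Philip's colleagues raised does not disappear; it is simply a fairer comparison than lumping every persuasion tool into one bucket.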

Of course it is quite likely that everyone who came to the site and did not buy (convert) might have loved the videos and rushed to stores to buy our products (one HiPPO actually said that!). There is no way to prove that using just the web analytics data.

What we did do was prove the impact on online buyers.

As to the HiPPO. . . . read on. . . .

2) Use Surveys (Actively Collect VOC)

When in doubt (or confronted by a HiPPO, remember: don't run), what better way to go than to gather some Voice of Customer? Dare I say the voice of god? :)

Two things I have tried (of many!) that work a lot of the time. Each covers one unique bucket of visitors to your website.

A] Consider sending a simple post-purchase email survey to customers who have purchased on your site, asking them for the key influencers of their purchase.

You could share with them the various tools you have on your site (product information, comparison tools, images, videos, customer reviews, etc.) and simply ask them to rank the tools in order of importance.

Don't ask them to tell you how much they like each one, or to choose the ones they like; they tend to pick them all. :) Just ask them to rank order. Or use a tactic similar to that.

This tells you what works for those who buy.
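
When the responses come back, summarizing them is trivial. Here is a rough sketch (the tool names and responses are invented), assuming each respondent returns the tools ordered from most to least influential:

```python
# Summarize rank-order survey responses: a lower average rank means the tool
# was more influential according to the people who actually bought.
from collections import defaultdict

responses = [
    ["comparison", "reviews", "video", "images"],   # respondent 1
    ["reviews", "comparison", "images", "video"],   # respondent 2
    ["comparison", "images", "reviews", "video"],   # respondent 3
]

ranks = defaultdict(list)
for ranking in responses:
    for position, tool in enumerate(ranking, start=1):
        ranks[tool].append(position)

for tool, positions in sorted(ranks.items(), key=lambda kv: sum(kv[1]) / len(kv[1])):
    print(f"{tool:12s} average rank = {sum(positions) / len(positions):.2f}")
```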

For the 98% that will never convert on your website. . . .

[Image: 4Q and Kampyle surveys]

B] Consider an onsite survey like 4Q (though 4Q can only be customized so much, so perhaps you want to use either your own or one of the big daddy paid survey tools).

This will go to a small random sample of people who are on your site (who may or may not buy). You'll ask them three or four questions about why they were there (primary purpose) and then what tools/features of your website they liked (rank ordered, if your survey company can do that).

That will give you what you want.

Since this can also be thought of as a page level problem, you could use something passive, a page level survey / poll like Kampyle, on your product pages and ask people to quickly rate the various features. There is a Site Content feedback topic in Kampyle which you can customize.

Now you have the most important piece of data you need: your customers'. Few website owners / marketers / HiPPOs can argue with this. Leverage this advantage.

Finally, one last option for you. . . . hopefully one you'll use before you write a check for a hundred grand to create your videos. . . .

3) Use… wait for it….. Testing! (Measure Actual Customer Behavior)

I am sure this does not surprise you. Run an A/B or multivariate test and let your customers help inform you of the value of these features.

For 30% or 40% (or whatever percentage) of your visitors, don't show the product demo videos; for the rest, show them. Then see the impact on the data. Boom (!) you have your answer, without any biased opinions.
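
Once the test has run, checking whether the difference is real (and not just noise) is a standard two-proportion comparison. Your testing tool will report this for you; the sketch below, with invented counts, is only to show there is no magic involved:

```python
# Did hiding the demo videos change conversion? Two-sided two-proportion z-test.
# The counts below are made up; plug in your own.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Version A sees the product demo videos, version B does not (hypothetical numbers).
p_a, p_b, z, p = two_proportion_z(conv_a=412, n_a=9800, conv_b=371, n_b=9750)
print(f"with videos: {p_a:.2%}  without: {p_b:.2%}  z = {z:.2f}  p = {p:.3f}")
```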

[Image: A/B testing tools and features]

It is certainly going to take you a small amount of effort: get Website Optimizer, talk to your IT folks, create a version of the page with no product tour link, etc.

But you are making a very expensive decision for your company, are you not?

And here is the additional benefit of testing. You are free to use any kind of "conversion".

You can measure success as conversions (submit order).

You can measure success (of the test) as the number of people abandoning the product page.

You can measure success as the time people spend on the product page. [There is a very cool piece of JavaScript that does this with Google Website Optimizer; it is especially helpful for rich media / Flash sites. Without a doubt other vendors can do this as well, just ask.]

You can measure success through your survey tool if it is integrated (this is some extra work sadly, but for big bets I recommend it).

You can integrate your analytics tool with your testing tool (say Google Analytics with Website Optimizer) and use other metrics to measure success such as bounce rate or electric shocks etc :).

[For GA and GWO, ROI has integration instructions.]

The bottom line is that you can define success and then let the customers tell you.

That's my answer to Philip.

Sound exciting?

Am I the only one who thinks when you do this kind of analysis you are in a nearly orgasmic state?

Yes, these methods are a small amount of work. But nothing in life worth having is easy. The tools might be free, but that does not eliminate the need to invest your time and effort! :)

And on the positive side, with a recession looming, people who involve customers in making decisions, rather than relying on their own opinions, will win big. The "guessers" will not win big. They might even win small. Or fail.

Plus if you do this you'll be an Analysis Ninja, not a Reporting Squirrel.

Ok now your turn.

Have you tried to analyze features like video demos on your website? Or perhaps other complex features you have launched? What works for you? What totally failed? In my recommendation to Philip, what did I overlook?

Please share your feedback, critiques and hurrays.

17 Nov 2008 01:49 am

collect"Experiment or die, there is no try."

That was my Yoda-inspired call to action last week to a group of international C-level executives. And I meant every word of it.

There is a tendency to think experimentation and testing is optional. Ouch!

I fundamentally believe that is wrong. For a few simple reasons:

# 1 It's Not Expensive!

You can start for free with a superb tool: Google's Website Optimizer. It is packed with enough features that I have no qualms flogging it (even though I work closely with the team!).

If you want to help our economy and pay for your tools then that is absolutely fabulous. Both Offermatica and Optimost are pretty nice options.

[Just don't fall for their bashing of all other vendors, or their silly, false claims of "superiority" in terms of running 19 billion combinations of tests or the bonus feature of helping you into your underwear each morning.

You'll be lucky if you can come up with 5 combinations, and it is not that hard to put on your underwear.

Look for actionable uniqueness. For example I am quite fond of the fact that with Offermatica you can "trigger" tests based on behavior. That is nice, well worth paying for.]

# 2 Six And A Half Minutes. That's it!

Tom has tried this with many, many marketers, and it's so true: if you already have two different pages you want to test, it takes six and a half minutes for you to configure, test (QA) and launch an A/B test.
[Please read that literally, as it is written. You have two pages already. 6.5 minutes to: Configure. QA. Launch.]

You have six and a half minutes, right?

I cannot recommend enough the wisdom of starting with an A/B test.

You will start fast, you will find enough problems in your company, and you can show easy wins.

Aim to get to the thing vendors are selling, MVT, but start with A/B regardless of the tool you use.

# 3 Show 'em You Are Worth It.

There is a lot of pressure on all of us to prove our worth and make significant improvements to our web business.

Clickstream analysis with Omniture or Google Analytics or ClickTracks is all well and good, but testing will get you on the path to having a direct impact faster.

By its nature testing is action oriented, and what better way to show the HiPPOs that you are awesome than by moving the dial on that conversion rate in two weeks?

# 4 Big Bets, Low Risks, Happy Customers.

Very few people appreciate this unique feature of testing: You have an ability to take "controlled risks".

Let's say you want to replace your home page with pictures of naked people, yes, in the quest for engagement. : ) Naked people are risky, even if they are holding strategically placed Buy Now buttons.

So run a test where only 10% of the site traffic sees version B (naked people).

You have just launched something risky, yet you have controlled the risk by reducing exposure of the risky idea.
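
Your testing tool handles this traffic split for you, but the mechanics are worth understanding. A purely illustrative sketch of a sticky 90/10 assignment:

```python
# Deterministically send ~10% of visitors to the risky version B, and make the
# assignment sticky: the same visitor id always lands in the same bucket.
import hashlib

def assign_version(visitor_id: str, risky_share: float = 0.10) -> str:
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % 100          # 0..99, roughly uniform
    return "B (risky)" if bucket < risky_share * 100 else "A (control)"

for vid in ["visitor-001", "visitor-002", "visitor-003"]:
    print(vid, "->", assign_version(vid))
```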

Stress this idea to your bosses: the fact that testing does not mean destroying the business by trying different ideas. You can control the risk you want to take.

# 5 Tags, CMS, Reports & Regressions: All Included!

Pretty much all testing tools are self contained and simple to launch (A/B is brain dead easy, multivariate needs your brain to be awake – that's not hard, is it?), they have all the reporting built in, and the data is not that hard to understand.

So you don't have to worry about integrations with analytics tools, you don't have to worry about rushing to get a PhD in Statistics to interpret results and what not.

You will hear super lame arguments about mathematical purity, or "my factorial design is better than the other guy's", whatever. Ignore them. It will take you a while to hit those kinds of limits. And the nice thing is that by then you'll be smart enough to make up your own mind.

What's important is you start. Do that today. Think of this as dating and not a marriage. You are allowed to make mistakes. You are not going to marry the first guy you run into. Don't take that approach here.

So agree with me? This is attractive? Right?

Think about it this way. If your analytics career is flagging then testing is the Viagra you need to take.

Seriously.

: )

So as my tiny gift here are five experimentation and testing ideas for you. I'll try to go beyond the normal stuff you hear from other sources.

# 1 Fix The Biggest Loser, Landing Page. (& Be Bold.)

Now all that is well and good. But the sad thing is a common mistake people make: they get excited and then go try to test Add To Cart buttons. Or three different hero images on the home page.

That's all well and good. But honestly that's not going to rock your boat. [Remember you are on Viagra!]

For your first test be bold, try something radical, bet big. I know that sounds crazy. But remember you can control risk.

If you start with an A/B test with some substantial difference, then you can show the value of testing faster because you'll get a signal faster, and you'll start the emotional change required to embrace testing across the organization.

My favorite place to start is the Top Landing Pages report (or Top Entry Pages, if that's what your vendor calls it) from your web analytics tool.

Find the biggest loser, the one with the highest bounce rate.
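
If you want to shortlist those losers before you even open the testing tool, the arithmetic is trivial. A sketch over a hypothetical export of entries and bounces per landing page:

```python
# Rank landing pages by bounce rate (bounces / entries) to find the biggest
# loser. The rows below stand in for an export from your analytics tool.
rows = [
    ("/landing/hybrid-cars", 12000, 8400),
    ("/landing/spring-sale",  6500, 2600),
    ("/",                    30000, 9000),
]

for page, entries, bounces in sorted(rows, key=lambda r: r[2] / r[1], reverse=True):
    print(f"{page:25s} entries={entries:6d} bounce rate={bounces / entries:.0%}")
```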

[Image: Volvo hybrid cars landing page]

Click and look at the sites sending traffic to this page, look at the keywords driving traffic to this page. That will give you clues about customer intent (where people come from, and why).

Come up with two different (bold) ways to represent that page and deliver on that customer intent.

Your first A / B / C test.

# 2 Test a Single Page vs. Multi Page Checkout.

One of the highest-impact ways to improve conversion is to reduce cart and checkout abandonment rates. Take money from people who want to give you money!

Some websites have a one page checkout process: Shipping, billing, review and submit.

Some have it on four pages.

I have seen both work, you never know, it really depends on the types of visitors you attract.

[Image: checkout options]

So if you have a single page checkout, why not try the multi page version (if your abandonment rate is high, say more than 20% :). Or vice versa?

I have seen very solid improvements in these tests.

Or here's a bonus. Many shopping cart (or basket, to my British friends) pages have an Apply Coupon Code box. This seems to cause people to open Google and search for codes. So why not move this coupon code box to the Review Before Submit page?

It won't send those who don't have a coupon code looking for one, and by the Review Order page they are way too committed. Those who do have a coupon code can still apply it.

In both these scenarios you are helping your organization find value quickly by touching a high impact area.

And remember, this works for lead submission forms and other such delights.

# 3 Optimize the Number of Ads & Layout of Ads.

Ad supported sites are numerous. And there is so little restraint; the core idea seems to be: let's slap as many ads on the site as we can.

More ads = more clicks = more revenue.

Usually this is never tested.

[Image: vg.no page with its ads]

[I can't read Norwegian so this could be wrong, but I counted a total of 19 ads on this page! Ten above the fold. Important point: American sites are just the same.]

So test the number of ads you should have on a page. It's not that hard. It can be a simple A/B test or a multivariate test.

In a memorable test the client actually reduced the number of ads on the page by 25% and the outcomes improved by 40%. I kid you not, 40%. And guess in which version customers were happier.

There is a built-in assumption here: that you are not simply selling impressions. If you are, then pile on the ads; you are not being held accountable for outcomes, so enjoy the ad party.

Here's a bonus idea.

There are sites where the ad is in the header, it takes up the whole header, and it is the first thing that loads. I have only seen one case where that worked.

[Image: InformationWeek ad in header]

The header takes up 30% of the space above the fold on a 1024 resolution.

So if that is you, why not try a test with the header ad and one without? See which one improves overall conversion / outcomes.

The other bonus idea is to try different ad layouts. Most people have banner blindness for the standard placements, the top of the page and the middle of the content (as in Yahoo news).

Why not try different layouts and formats? If not to see which one works the best then to just annoy your customers? :)

# 4 Test Different Prices / Selling Tactics.

You can of course test different pretty images, but why not try to reinvent your business model using testing?

A company was selling just four products. But the environment got tough, the competitors got competitive. How to fight back? Some "genius" in the company had an idea: "Why don't we give our cheapest product, currently $15, away for free?"

CMO says: Radical idea. CEO says: Are you insane? CFO says: No way!

Now it did present a fundamental challenge; no one likes to give up revenue. And people worried about how successful it would be, what the revenue impact would be, why anyone would buy a non-free version, etc. etc.

Rather than create prediction models (with faulty assumptions!) or give up in the face of HiPPO pressure, the analytics team just launched an A/B test. And they controlled for risk (after all, the CFO did not want to go bankrupt) by running a 95% control and a 5% test version.

[Image: testing product price points]

Perhaps unsurprisingly the free version of the product sold lots of copies.

That was not surprising.

What was surprising was that free helped shift the SKU mix in a statistically significant way, i.e. the presence of free caused more people to buy the more expensive options. Interesting. [In a delightfully revenue impacting way!]
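
For the curious, "statistically significant" here is nothing exotic: a plain test of independence between test group and SKU mix. A sketch with invented counts (scipy used purely for convenience):

```python
# Did the presence of the free product shift which *paid* SKUs people buy?
# Chi-square test of independence on invented purchase counts.
from scipy.stats import chi2_contingency

#            basic   mid   premium
control = [   900,   400,     150]   # purchases by SKU in the 95% control group
test    = [    70,    38,      21]   # paid-SKU purchases in the 5% test group

chi2, p_value, dof, expected = chi2_contingency([control, test])
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
# A small p-value suggests the mix of paid SKUs genuinely changed.
```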

The other positive side effect was to cause lots of new customers to be introduced to the franchise, as they "purchased" the free version. Lovely.

Here are some bonus ideas.

If you give discounts, try 15% off vs. $10 off (people tend to go for the latter! :)).

Try a $25 mail-in rebate vs. a $7 instant rebate (or change the amounts to suit).

You get the idea.

# 5 Test Box Layouts, DVD Covers, Offline Stuff.

Let's say you are launching a new product or a DVD or something similar. You want to figure out what layout might be more appealing to people in stores.

[Image: Thank You For Smoking DVD cover]

You could ask your mom to pick a version she likes.

You could ask your agency to ask a few people.

Or you could launch a test online and see which version is rated highest by your website visitors!

I have done tests for DVD covers and the results were surprising.

Or here's another idea…

You are a multichannel retailer. You sell bikinis. Now you want to sell Accounting Software. Why not try it on your website before you reconfigure your stores?

Or you are Wal-Mart and it is expensive and takes a long time for you to put new products in your stores. That makes it risky to start stocking the "on paper hideous but perhaps weirdly appealing" Zebra Print Occasional Chairs in your store. What if it bombs?

Well why not add it to your site, see if it sells. If it gets 15 positive customer reviews (!!), then you know you have a winner on your hands.

The actual launch process is faster, you can reduce risk, and you don't have to rely on just your company employees (the fashion mavens) to pick the winners and losers.

All done.

I hope you'll find these compelling reasons to start experimenting, and that I have managed to stretch your mind beyond "honey, let's start testing shopping cart buttons".

There is so much you can do. This recession season, buy your CEO the gift that keeps on giving: an experimentation and testing tool.

Here's a summary for you. . . .

Five reasons for online Experimentation & Testing:

    #1 It’s Not Expensive!
    #2 Six And A Half Minutes. That’s it!
    #3 Show ‘em You Are Worth It.
    #4 Big Bets, Low Risks, Happy Customers.
    #5 Tags, CMS, Reports & Regressions: All Included!

Five off the beaten track Experimentation & Testing ideas:

    #1 Fix The Biggest Loser, Landing Page. (& Be Bold.)
    #2 Test a Single Page vs. Multi Page Checkout.
    #3 Optimize the Number of Ads & Layout of Ads.
    #4 Test Different Prices / Selling Tactics.
    #5 Test Box Layouts, DVD Covers, Offline Stuff.

Ok, now it's your turn.

What are the reasons your company is not jumping on the awesome testing bandwagon? If it did, what finally convinced them? If you are doing testing, care to share some of your ideas? Anything off the beaten path you have tried? Any massive failures?

Please share your feedback, insights and stories.

Thank you.
