January 2008


29 Jan 2008 12:51 am

Engagement is a buzz word. It is a quest. It is an altar at which many worship.

Often though, at least online, our hopes are dashed: the efforts expended rarely have adequate ROI, and the hype is followed by a bucket of cold water.

It is not that measuring whether "Visitors" / "Customers" are engaged is in itself an ignoble goal. It is more that our execution efforts in measuring engagement are fatally flawed. So much so that recently I was compelled to pen a warning post: “Engagement” Is Not A Metric, It’s An Excuse.

That post outlined the core issues faced in the quest of measuring Online Engagement. It also outlined a four-step process you should follow when trying to measure engagement.

One of my central thoughts was that it is very difficult, if not near impossible, to measure Engagement, as many others have passionately recommended, based only on your quantitative data (in our case Web Analytics clickstream data).

So it was with great delight that I read Theo's email that helped me understand exactly why I felt that way! :)

When Theo proposed posting it here on Occam's Razor I quickly agreed.

Theo Papadakis is a Marketing Executive at cScape in London. Our paths first crossed when he invited me to contribute a thought piece to accompany the 2nd Annual Customer Engagement Survey Report (and to his credit he did so after having read my post above!). You can download the report; there are nice graphs and some interesting data, and page twenty is particularly worth a look. :)

Here's Theo. . . . . .

__________________________________________________

Measuring Online Engagement: What Role Does Web Analytics Play?

Before we look at what aspects of a customer's online engagement Web Analytics can capture we need to clarify the meaning of the concept of customer engagement.

Definition of Customer Engagement (Engagement Index)

'Engagement' is a word with many meanings (vow, betrothal, involvement etc). For marketing purposes they can be boiled down to a single concept: one-way relation. If x is engaged with y, x is related to y.

The concept of customer engagement only deals with a particular kind of one-way relationship:

  • Subject of engagement: The subject of engagement should not be limited to customers.


    Although 'visitor engagement' is better in that it takes into account non-customer visitors to your website/store, its focus on measuring people's engagement with your brand on your own premises is too restrictive.

    It is important to measure the engagement of customers, prospective customers and detractors with our brand, in every space they engage with it in.

  • Object of engagement:

    The subject's relationship with a brand / company / product / consumption topic.

Now that we have defined what kind of relationships customer engagement deals with let's look at the criteria with which we can refine and classify the ways in which customers engage:

  • Kind: Customers can be positively or negatively engaged with a company/product.

    A more in-depth examination of kind would reveal its content, usually a mixture of emotional states and rational beliefs such as, in the case of positive engagement, sympathy, trust, pride, etc.

  • Degree: The degree of positive or negative engagement lies on a continuum that ranges from low involvement, namely the psychological state of apathy, to high involvement.

    An engaged person is someone with an above average involvement with his or her object of relatedness. (A small sketch pulling these dimensions together follows this list.)
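To make this taxonomy concrete, here is a minimal sketch of how the dimensions above could be represented as a simple data structure. The names and the Kind/Degree encodings are purely illustrative, not part of the survey or any tool:

```python
from dataclasses import dataclass
from enum import Enum


class Kind(Enum):
    """Kind of engagement: the valence of the relationship."""
    POSITIVE = "positive"
    NEGATIVE = "negative"


@dataclass
class EngagementRecord:
    """One subject's engagement with one object, per the definition above."""
    subject: str   # customer, prospective customer, detractor, ...
    obj: str       # brand / company / product / consumption topic
    kind: Kind     # positive or negative (web analytics alone cannot tell us this)
    degree: float  # 0.0 = apathy, higher values = stronger involvement
```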

With the context setting out of the way. . . . .

What aspect of customer engagement can web analytics capture?

Having defined customer engagement we are better able to delimit what web analytics can and cannot tell us about the engagement of our website's visitors.

Let's look at some of the widely used web analytics metrics and understand what aspect of engagement they capture.

Unique Visits: Shows how many people decided to engage with you for the first time by visiting your website.

Frequency of Visit: Frequency must be contextualised within a specific time frame.

A customer who has engaged 10 times with the company in the past 10 years, for example, has a lower degree of engagement than a customer who has engaged 10 times with the company in the last 2 months.

Contextualised 'frequency' can therefore help us to identify the relative degree of our customers' engagement.

Recency of Visit: This metric speaks of the recency of our customers' last engagement.

[Image: Recency of Visit report in Google Analytics]

Jim Novo has proven that it correlates well with degree of engagement. A customer whose last engagement with a brand is more recent than that of another is also likely to be more engaged.

Like frequency it therefore frames our customers' degree of engagement only relatively.
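To illustrate what 'contextualised' frequency and recency look like in practice, here is a minimal sketch; the visit log, the visitor names and the 60-day window are hypothetical, chosen only to mirror the example above:

```python
from datetime import date

# Hypothetical visit log: visitor -> dates of their visits
visit_log = {
    "visitor_a": [date(2006, 3, 1), date(2007, 6, 1)],     # 2 visits, spread over years
    "visitor_b": [date(2007, 12, 20), date(2008, 1, 15)],  # 2 visits, both recent
}

today = date(2008, 1, 29)
window_days = 60  # illustrative time frame for contextualising frequency

for visitor, visits in visit_log.items():
    recency_days = (today - max(visits)).days
    recent_frequency = sum(1 for d in visits if (today - d).days <= window_days)
    print(f"{visitor}: last visit {recency_days} days ago, "
          f"{recent_frequency} visits in the last {window_days} days")
```

Both visitors have the same raw visit count, but only when the counts are placed in a time frame does the difference in their relative degree of engagement show up.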

Depth of Visit: This tells us how many pages long our visitors' journeys through the site were.

[Image: Depth of Visit report in Google Analytics]

Although a deep journey signifies a high degree of engagement, this metric again does not distinguish between kinds of engagement.

Do your visitors passionately disagree with what you are writing about? Are they simply unable to find what they are looking for?

In both of these cases a high degree of engagement may be of a negative kind.

Time Spent on Site: Same story as with depth. Time spent correlates with degree of engagement, but as it does not discriminate between kinds, it may simply be time spent negatively, desperately trying to find the content your visitor is after.

Similarly, most online metrics are only able to capture degree not kind of engagement:

  • Subscribing (feed, email, newsletter)
  • Registering
  • Feedback (comments, complaints, inquiries etc)
  • Rating / tagging / filtering / bookmarking its content
  • User submissions (UGC)
  • Printing or downloading a piece of content
  • Brand index

Degree of Engagement

What comes out of the above discussion is that

…it is impossible to derive the kind (positive/negative) of your visitor's engagement using web analytics alone, and, therefore, that…

…when we are talking about customer engagement in the context of web analytics, we are in fact talking about degree of engagement.

This is not to say that we cannot make inferences and state hypotheses about the kind/content of engagement, based on what we can measure (degree of engagement), nor that these hypotheses are unlikely to be correct.

It is only to say that web analytics alone cannot support or confirm such inferences.

Such inferences, about the kind of engagement, must necessarily be informed by considerations that lie entirely outside the field of web analytics.

Before we begin making inferences on the basis of degree of engagement however let's discuss this metric a bit more.

Following a number of leading Web Analysts, I also believe that a customer's degree of engagement is better calculated as a synthetic metric composed of several basic metrics, rather than as a one-metric solution, e.g. measuring customer engagement by means of 'duration of visit' only. This requires an argument unto itself that will not be pursued here.

The score each of these component metrics takes, however, only makes sense if contextualised. Example: a frequent and recent visitor is 'more engaged' than someone who is not, but is he engaged? If yes, how engaged is he? There is little we can do with relative statements such as this.

In order to make such statements meaningful and operational we need to contextualize the component metrics that constitute a customer's degree of engagement on a high/low continuum, beginning with apathy and proceeding with progressively higher degrees of engagement.

This means that both the lowest (apathy) and the highest degree of engagement need to be defined. The easiest way to do this is to define the average degree of engagement (the average score for several metrics of your choice across your site or based on a competitor-specific or industry-wide benchmark), considering everything that falls short of it as (increasing degrees of) apathy and everything beyond it as (increasing degrees of) engagement. In this way a customer's degree of engagement assumes a non-relative meaning (it remains of course relative to your website's, competitor's or industry's historical performance).


By inserting relative statements such as 'x is more engaged than y if and only if x does z and y does less than z' into a continuum that is based on website / competitor / industry benchmarks, it is possible to provide a reference point which, although relative in itself (historical performance), is sufficiently stable and pertinent to business performance to provide useful insights into visitor behaviour and business/campaign success. (Substitute z with any one, or an aggregate, of the visit metric scores.)
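As a minimal sketch of this contextualisation, here is an illustrative, equally weighted degree-of-engagement score built from a few component metrics and benchmarked against the site average, so that scores below the average read as (degrees of) apathy and scores above it as (degrees of) engagement. The metric names, the equal weighting and the numbers are all hypothetical:

```python
# Hypothetical per-visitor component metrics (all oriented so that higher = more engaged)
visitors = {
    "visitor_a": {"frequency": 2, "depth": 3, "time_on_site": 40},
    "visitor_b": {"frequency": 9, "depth": 12, "time_on_site": 300},
    "visitor_c": {"frequency": 5, "depth": 6, "time_on_site": 120},
}
metrics = ["frequency", "depth", "time_on_site"]

# The reference point: site average for each component metric
# (a competitor-specific or industry-wide benchmark could be used instead)
site_avg = {m: sum(v[m] for v in visitors.values()) / len(visitors) for m in metrics}

def degree_of_engagement(visitor_metrics):
    """Equally weighted ratio of each metric to its benchmark.
    Below 1.0 reads as (a degree of) apathy, above 1.0 as (a degree of) engagement."""
    ratios = [visitor_metrics[m] / site_avg[m] for m in metrics]
    return sum(ratios) / len(ratios)

for name, vm in visitors.items():
    score = degree_of_engagement(vm)
    label = "engaged" if score > 1.0 else "apathetic"
    print(f"{name}: degree = {score:.2f} ({label})")
```

However the score is composed, it still says nothing about the kind of engagement; that remains outside what web analytics can tell us.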

Conclusions

  • No web metric, or combination of metrics, can discriminate between kinds of engagement, i.e. positive vs. negative engagement. This requires primary research.

  • All web metrics can do is discriminate between relative degrees of engagement.

  • Basic metrics can only discriminate between low degrees of engagement.

  • A customer with a high score in his visit metrics may nevertheless feel apathetic towards the brand.

__________________________________________________

That was interesting, was it not? You can now see why I found Theo's article to be so enlightening.

I found the nuance of the kind of engagement and the degree of engagement to be particularly insightful.

The next time I take quantitative data (even if I mash it into a formula of five different metrics and call it a website engagement index!) to my C-level executives, I will first state that what my engagement index measures is Degree of Engagement.

My hope is this will result in a clear understanding of the limits of what the data is saying.

That, I hope, will then lead to questions about measuring the Kind of Engagement (and to exploring qualitative measures).

No false promises made from the data. Progress made by understanding its limits and exposing them. That's awesomeness. IMHO.

My heartfelt thanks to Theo for sharing this article with me, and now with all of us.

It's your turn now.

Agree? Disagree? What's your experience? Please share your perspectives, critique, bouquets and brickbats via comments.

PS:
Couple other related posts:

22 Jan 2008 01:22 am

This week history is on everyone's mind, especially in our little world of web analytics.

Yes, history when it comes to the economy (no one likes a recession!), the war (ok, no one likes this one for sure!), the elections (raise taxes! no, no, cut taxes!) or the Super Bowl (go Giants!!).

But with all that I am positive this week the thing top of mind for many web analytics practitioners is what to do with their historical web analytics data.

Do we switch or do we not switch? What happens to my tags? What about my contract? Where do I log it? Is it time to panic? I have seven years of history, help!!

For the first few questions I am afraid you'll have to answer for yourself.

[Although: While old habits / tools are hard to give up, it "takes about three weeks for a new habit to be hard wired in your brain" (source). That should give you some solace!]

This post tries to take a position on your last worry, historical data and its value.

We have all been brought up to cherish data. To love it, to adore it, to propose marriage, to stick together for better or for worse, to build increasingly vast and complex systems to keep it around and to tap into cloud computing along with a small army of people in your company to keep that data happy.

Most of the time this sounds like a good mindset (/marriage), especially in the traditional world of ERP and CRM systems, but on the web, unfortunately, this can be a deeply suboptimal mindset (/marriage).

My proposal for you is to divest yourself of this mindset of keeping Web Analytics data around forever. If you, or your HiPPOs, have this mindset then a quickie divorce might be greatly helpful.

The über thought to keep in mind is that from the moment you collect it your web analytics data starts to decay and lose value. Oh it is useful on the first day, and the first month, but less so in six months, and click level data is nearly valueless in a year or so.

Why you ask?

  • Your Visitors "change" too much.


    Remember that at the end of the day almost all of us collect anonymous, non-personal data from our Visitors. As they swap browsers, machines and upgrades (if they don't outright blow your cookies away every day!), the data becomes less useful for identifying any usage trends and patterns tied to people.

    This is less a problem in traditional Data Warehouse environments.

  • Your computations change too much.


    You were on third party cookies and then you moved to first party cookies (say yes to this one!!). Most of your visitor stats just became incomparable.

    You shift from logs to tags to tags of a new vendor to tags of your newest vendors and now you are going tagless! You are now comparing a bowl of chopped apples to fruit salad.

    Vendors and practitioners have changed the basic formulas for measuring the core stats every so often. They rarely reprocess history (too hard!), making it hard to provide continuity.

  • Your systems change too much.


    At the end of the day three things are captured by your analytics tool. The Referrer. The page URL. The Cookie.

    As you evolve your web site platform, from Interwoven to ATG, or move it around (hosts / servers), or add or remove functionality like internal search or recommendation engines or multivariate testing or behavior targeting or other such things, it usually impacts all three of those critical things that make up your data.

    The resulting impact can make your data disjointed.

    And I am not even touching changes from static html to dynamic html to personalized content to flash to flex to ajax to RIAs (Rich Internet Applications) etc etc (all of which again impact the three pieces you collect).

  • Your website changes too much.


    This is perhaps the biggest thing that most of us don't reconcile with. Most websites are on the Yahoo! "paradigm" and not the Google "paradigm". . .

[Image: Yahoo! and Google home page evolution]

      . . . and that is quite ok (though the image above might suggest otherwise).

      Your home page three months ago is not your home page now (is it? hopefully not!). You killed your product line pages last year, opting for product detail pages for SEO reasons. There was no PayPal last week. Maybe 2007 was your first year with the support and ecommerce sites merged into one.

      In the last six months you have learned so much about your business, about your data, about your visitors, about how fast you are being left behind (or are ahead of everyone else!). . . and your web presence has changed accordingly.

      Every change above changes the data you have and what value you can get from it three months from now (when you will have changed even more). It is important not to forget this.

  • Your people change too much.


    Sad as it might be the hardest people to find now are web people. Not just great web analysts, which we know are scarce (!), but web people in general. Front end, back end, middle tier, thin, overweight, rich, poor, newbies, experienced, all kinds are hard to find.

    As people come and people go their actions have a subtle but important impact on all aspects of your data ecosystem.

You have a few years of historical web analytics data. Give me the benefit of the doubt, for just five minutes, and think about the above five items with a kind-of-sort-of open mind. Would you still keep terabytes of data from two years ago around?

The pace of change on the web is tremendous (pages, sites, business rules, experiences, applications, data capture, what's right and what's wrong, what's doable). In this hyper-fast environment all the detailed data, perhaps you'll agree, is not very useful because it has decayed too much.

[Image: 'opportunity' in Chinese]

It's the decay that is the root cause.

But at the same time it is also an opportunity. Because it means that you are not tied to the past in an egregious manner. It means you can think smart and move fast. If what you have now will be of less value soon, then you will cherish the now more and try to get something out of it.

It also means that it gives you the freedom not to be tied to legacy systems or legacy tools or legacy data. You can move on to the next and better thing much faster than our Sisters and Brothers in the traditional world have been able to.

It means a lot more fun because you get to learn and adapt and get value and move on. It is damn exciting and damn liberating!

Yes, yes, yes, you knew this was coming . . . . .

Keep some history around. Aggregated data. Historical markers.

Weekly trend (counts) of Visits and Unique Visitors. Top ten referrers to your website by month. Monthly Bounce Rate. Weekly + monthly trends for Revenue and Products Sold. Perhaps Conversion Rate trends for your site Overall and important campaign categories. Top groupings of content consumed on the site.

Aggregated data, for your critical few metrics (that won't become less important with time!). And some revenue stats just to prove that you're worth it!

[Image: example of aggregated historical data in a spreadsheet]

Keep that around for as long as you have it. It will all fit on one tab in an Excel spreadsheet. That is all you'll need. Your metrics might be slightly different from those above, but I assure you they will fit in a spreadsheet.
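If it helps, here is a minimal sketch of rolling detailed data up into the kind of aggregated history described above before the detailed data is retired. It assumes pandas and a hypothetical session-level export with made-up column names:

```python
import pandas as pd

# Hypothetical session-level export: one row per visit
sessions = pd.DataFrame({
    "date": pd.to_datetime(["2008-01-07", "2008-01-08", "2008-01-14", "2008-01-15"]),
    "visitor_id": ["a", "b", "a", "c"],
    "referrer": ["google.com", "(direct)", "google.com", "yahoo.com"],
    "bounced": [False, True, False, False],
    "revenue": [0.0, 0.0, 25.0, 40.0],
})

# Weekly trend (counts) of Visits and Unique Visitors, plus Revenue
weekly = sessions.groupby(pd.Grouper(key="date", freq="W")).agg(
    visits=("visitor_id", "size"),
    unique_visitors=("visitor_id", "nunique"),
    revenue=("revenue", "sum"),
)

# Monthly Bounce Rate
monthly_bounce_rate = sessions.groupby(sessions["date"].dt.to_period("M"))["bounced"].mean()

# Top referrers to the site by month (top ten, though the sample here is tiny)
top_referrers = (
    sessions.groupby([sessions["date"].dt.to_period("M"), "referrer"])
    .size()
    .groupby(level=0, group_keys=False)
    .nlargest(10)
)

# These small tables are all that need to live on in the spreadsheet
print(weekly, monthly_bounce_rate, top_referrers, sep="\n\n")
```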

Or, if you prefer, here is another suggestion. . . . .

Keep your "click level" (detailed) data around for a year (assuming seasonality!) and your "session level" (aggregated) data for as long as you want to / have to (and it will fit in a spreadsheet).

In closing:

It is extremely difficult to get anything out of your web analytics data that you can action right now. I humbly recommend that, in the drive to conquer history, you don't forget the present or ignore the price you'll pay, every day that comes, for every day that has gone by.

History is important in other contexts, but in web analytics tools, for now, change on the web reduces the value of old data. This might cease to be the case at some point, but that point is some ways away.

Before you cut a big cheque for your consultant, consider the above, and consider what you are actually buying.
 

Oh and those of you worrying about switching tools & losing data, worry not (too much): Go forth and prosper!
 

As always it is now your turn. . . .

Agree? Disagree? What am I missing? Am I in Antarctica all by myself on this one? Am I wrong to think this is our own version of "an inconvenient truth"? What's your experience?

Please share your perspectives, critique, bouquets and brickbats via comments.

[Like this post? For more posts like this please click here, if it might be of interest please check out my book: Web Analytics: An Hour A Day.]
