Wednesday, March 30, 2011

What Do You Want From Your Data? Five Must-Haves

In a recent post, I wrote about the pitfalls of taking your data at face value, and last week I blogged about how you can’t know the true impact of critical organizational attributes without diving more deeply into the data (read: do more than explore it visually or look at up/down trends).

My business partner sent me an excellent article this morning by Stacy Harris that asks the question: Are You Considering Firing Your Employee Engagement Partner? I’ve been thinking a lot lately about data and how to make sense of it and I recommend this article for anyone else on a data journey. I think Stacy’s research applies to any type of data collection method. I’m summarizing what I took away from the article:

It’s not enough to have a survey; you need a strategy. A stand-alone survey gives you just that: one view. A survey designed to support a business strategy, on the other hand, provides data that can be aggregated with other data sets for a more accurate and complete picture.

If you ask generic questions, you get smiley or frown-y answers but no insight. Whether you have 12 questions or 40, if they don’t speak to your culture, your workforce or your targeted customers, I’m hard pressed to see how you will get any “aha’s” that are worth the time and expense of a survey.

What specifically do you want from a survey or any “listening post” that you design? “Because we’ve always done one” or “Because everyone else does it” are not specific objectives. What business issue do you hope to resolve? What’s keeping you or your boss up at night?

How are you analyzing the data? Scores alone may not tell you what you need to know, and trends with no connection to an underlying rationale are just up-and-down points on a chart. Desktop tools exist to assist. Help is available (call me).

If you’re hung up on benchmarks, you may never get to the “why” of your own project. I know that some organizations swear by a benchmark study, and who can argue, as long as the benchmarks map precisely to your own situation and it’s not your only measure. Like generic questions, a benchmark exercise without the appropriate construct can leave you with no specific roadmap for your success.

Collecting data is a critical component of every function these days. It's a project like any other, with objectives, outcomes and measures. Is your data giving you what you need?


Wednesday, March 23, 2011

Do Scores Matter if You Don't Know What is Critical to Your Customers?

Last week, I wrote about the importance of knowing what drives customer dissatisfaction. That post dealt with data interpretation and the need to dive more deeply into the business issues as well as the data.

There are exploratory ways to interpret data as well as more analytical methods. The key to correct interpretation is knowing which approach will deliver insights to your business. As I said in my last post, even with good data you can come to the wrong conclusions if you don’t use the right tools. In the case of customer loyalty, you could be investing in programs that have little or no impact on the customer’s intention to buy again, or you could be ignoring “dissatisfiers” that diminish a customer’s perception of your critical attributes.

So, there are a few initial questions to ask:

  • What do you believe are the attributes that contribute to your customers’ loyalty?
  • Are you measuring those attributes specifically in any data gathering exercise including social media monitoring?
  • Do you classify these attributes in terms of their importance or their impact on the customer?

The example below is from an article titled Guests’ Perceptions on Factors Influencing Customer Loyalty, which appeared in the March 2010 issue of the International Journal of Contemporary Hospitality Management.

  • Customer Service: Dissatisfier
  • Cleanliness: Neutral
  • Room quality: Dissatisfier
  • Value for money: Critical
  • Quality of food: Dissatisfier
  • Family friendliness: Neutral

The authors selected typical product and service attributes for a hotel guest and designed questions around them.
Using simple regression, they did something very interesting and, in my view, very revealing with the data they collected. The usual loyalty question (intention to return) was asked, and every other question was tested against it. Then, responses scoring above each question’s median were tested on their own, and those below the median were tested in the same way. The results were plotted against agreed criteria ranging from Critical to Neutral.

Critical Attributes were significant in both tests and would be considered a driver of loyalty as well as a reason to switch. These have high compliments and high complaints. Performing well in other areas won't compensate for low performance here.

Dissatisfiers were significant in testing low performance but not when high performance was tested. So, if customer service is bad, it influences a decision to switch but an average experience doesn’t critically drive loyalty. These are the attributes that should be maintained but not at the expense of more critical ones.

Neutrals generally may not be noticed by customers; although bad performance would reduce perceptions of quality, it would not reach the point where quality is considered poor.
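
To make the mechanics concrete, here is a minimal sketch in Python of how that median-split test could be run. This is not the authors’ code; the DataFrame layout, column names and the 0.05 significance threshold are my own assumptions.

```python
# Rough sketch of the median-split test described above (not the authors' code).
# Assumes a pandas DataFrame with one column per attribute rating and a
# "loyalty" column holding the intention-to-return score.
import pandas as pd
import statsmodels.api as sm

def classify_attribute(df: pd.DataFrame, attribute: str,
                       loyalty_col: str = "loyalty", alpha: float = 0.05) -> str:
    """Classify an attribute as Critical, Dissatisfier, or Neutral."""
    median = df[attribute].median()

    def is_significant(subset: pd.DataFrame) -> bool:
        # Simple regression of the loyalty score on this single attribute.
        x = sm.add_constant(subset[[attribute]])
        model = sm.OLS(subset[loyalty_col], x).fit()
        return model.pvalues[attribute] < alpha

    high_side = is_significant(df[df[attribute] >= median])  # high-performance responses
    low_side = is_significant(df[df[attribute] < median])    # low-performance responses

    if high_side and low_side:
        return "Critical"      # drives loyalty and drives switching
    if low_side:
        return "Dissatisfier"  # poor performance hurts; good performance adds little
    return "Neutral"

# Hypothetical usage:
# for col in ["customer_service", "cleanliness", "room_quality"]:
#     print(col, classify_attribute(survey_df, col))
```

The thresholds are illustrative; the point is that two inexpensive regressions per attribute are enough to separate the “must not fail” attributes from the “nice to have” ones.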


There are some lessons to be learned, I think, from this type of data interpretation:

  • Simple statistical tests, used this way, deliver insight that would be difficult to obtain by exploring low and high scores alone.
  • Knowing whether a key attribute drives loyalty, has no effect on it, or destroys it is enormously valuable when designing the customer experience and making the right investments.
  • This is the kind of "I know" insight that is compelling when reporting on Voice of the Customer issues.

How are you measuring customer loyalty? Do you know what is critical, which areas need only a minimum level of performance to maintain loyalty, and which attributes have no impact on loyalty at all?

Tuesday, March 15, 2011

What's Wrong With Taking Survey Data at Face Value?

I’ve been designing survey questionnaires and analyzing the data for so long that I often forget some people may not be doing the deep dive and asking the hard questions of the data they’ve collected, which is exactly what our clients hire us to do (thank you, you know who you are). Maybe a little exploratory analysis, a tad of correlation, a glance at the verbatim comments, and we’re done until the next time. Did we do a survey? Check. Did we do anything with it? Sure, sort of. Do we have a deep understanding of what the data means? Well….

What’s worse than not gathering intelligence from customers and employees?
Coming to the wrong conclusions!

I’m reminded of this fact by two articles I read last week: In This Case, Let’s Examine Dissatisfaction in the February issue of Survey magazine and Guest Perceptions on Factors Influencing Customer Loyalty in the current issue of the International Journal of Contemporary Hospitality Management.

In the case of customer dissatisfaction, the article suggests several calls to action:

Understand whether you have a category problem rather than a brand problem. In other words, your competitive space may allow easy switching with or without loyalty programs so make sure you know what your issue is before investing in programs that will not alleviate it.

Your market strategy will drive a customer’s perception of satisfaction. If you are a low-cost provider, you have accepted that a lower level of quality and service is part of the equation. The danger zone is trying to be low cost while also attracting a customer who expects a different level of product and service.

Benchmarking. I’ve never been a fan, but lots of companies do it, and the swirling vortex you get sucked into is comparing your performance to companies that target different customer segments.

Dissatisfaction may not arise from what you do but rather what other, similar companies do that you don’t do. Customers constantly evaluate decisions based on alternatives; some amount of dissatisfaction arises with your product and service even if you are executing your strategy perfectly.

My suggestions for arriving at the best conclusions possible from your data analysis:

  • Keep your strategy uppermost in mind when designing the project and return to it often when analyzing data. This means knowing who your competition is, who your ideal customer is and what your competitive advantages are.
  • Design survey questions to be particular rather than general. The more generic the question, the less likely you are to end up with actionable data and the more likely you are to arrive at the wrong conclusions.
  • Don’t confuse happy with satisfied. If you want to meet a customer’s needs, you are aiming for satisfaction. If you want happy, that’s a whole different level of expectations.
  • Perform data analysis from several different perspectives. Not all survey questions should be treated equally in reaching conclusions.
I’ll write more about the last topic next week.




Wednesday, March 2, 2011

Can You Hear Me Now? It's Your Customer Calling

Does anyone still breathing believe that we can succeed without customers? I didn’t think so. If people don’t want what we’ve got to sell, we’re toast. So, there should be no argument that customer satisfaction and loyalty are goals we should be eagerly pursuing since the pundits tell us that it costs five times as much to acquire a new customer as it does to keep one.

I know, some of you are saying satisfaction and loyalty, yes, “but not at all costs” and “not the wrong customers for our business”. Exactly. So how are you managing those customer issues for your benefit and theirs?

Hello? I’m Out Here and Here and Here…: There are so many channels from which to collect customer feedback and each one potentially provides data gold. However, what is interesting may not be useful. Before buying, jettisoning or adding onto your business intelligence platform, a game plan is imperative.

Customer orientation is a critical part of developing a business strategy. It’s hard to describe competitive advantage and competencies unless we’ve defined how we will deliver products and how we will serve our customers. Business intelligence needs to be aligned with both strategy and customer needs. A company’s market research used to be private; now it is open and available on every social media outlet and search engine, just by mining customer comments. We shouldn’t ignore the qualitative nature of that data or fail to integrate it with our other customer data sets; it is necessary for ongoing strategy development.
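
As a small illustration of what mining and aggregating those comments might look like, here is a hedged sketch in Python. The themes, keywords and sample comments are invented for the example and are not drawn from any real data set.

```python
# Illustrative only: tag customer comments with simple keyword themes and
# count how often each theme comes up, so the qualitative signal can sit
# alongside structured survey scores. Themes and keywords are hypothetical.
from collections import Counter

THEMES = {
    "service": ["rude", "helpful", "wait", "staff"],
    "price": ["expensive", "value", "cheap", "overpriced"],
    "quality": ["broken", "clean", "quality", "worn"],
}

def tag_comment(comment: str) -> set:
    """Return the set of themes whose keywords appear in a comment."""
    text = comment.lower()
    return {theme for theme, words in THEMES.items()
            if any(word in text for word in words)}

def theme_counts(comments) -> Counter:
    """Aggregate theme mentions across a batch of comments."""
    counts = Counter()
    for comment in comments:
        counts.update(tag_comment(comment))
    return counts

# Made-up comments for the example:
comments = [
    "Staff were helpful but the room felt worn",
    "Overpriced for the quality you get",
]
print(theme_counts(comments))  # Counter({'quality': 2, 'service': 1, 'price': 1})
```

Real comment mining would use far better text analytics than keyword matching, but even this crude tally produces a number per theme that can be joined to survey scores and tracked over time.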

See Your Company Through Your Customers’ Eyes: It’s disappointing to deal with companies that seem to have no idea how many customer touchpoints they have, and infuriating that those touchpoints are not integrated. It’s inexcusable that frontline employees are neither selected nor trained for a service job. Invite some of your co-workers into a room with a big whiteboard and start mapping customer interactions; it is a mind-boggling exercise. Or just read your customer comments online.

Create a Single View of a Customer: I’ve said it before: customers do not care about our silos. It’s not their job to sort out which department is responsible for resolving a problem. A fragmented approach, rather than Outside-In customer experience design, means that your employees have a perspective of your customer based on their job roles, not on customer expectations.

It’s Called Voice of the Customer: I design a lot of feedback and research surveys, and I am sent a few as well. It takes me less than 30 seconds to decide whether or not I’ll complete a survey, depending on how it’s constructed. Questions should be properly worded and of an appropriate length; free of bias; actionable; and linked to the customer experience as well as the company’s metrics and Key Performance Objectives.


I’ve been on a data and analytics soapbox for a while, and there is no doubt that through the collection of behavioral, rational and experiential data, we are doing a better job of knowing who buys from us and why. Understanding what we didn’t sell, and why, requires tapping into customer emotions. Maybe someone will develop an app for that. Until then, we have to collect, aggregate and understand qualitative data to hear what our customers are feeling as well as saying.

To paraphrase a line from my newest favorite film, The King’s Speech, your customers have a voice. Yes, they do.