Wednesday, March 30, 2011

What Do You Want From Your Data? Five Must-Haves

In a recent post, I wrote about the pitfalls of taking your data at face value, and last week I blogged about how you can’t know the true impact of critical organizational attributes without diving more deeply into the data (read: do more than explore it visually or look at up/down trends).

My business partner sent me an excellent article this morning by Stacy Harris that asks the question: Are You Considering Firing Your Employee Engagement Partner? I’ve been thinking a lot lately about data and how to make sense of it, and I recommend this article to anyone else on a data journey. I think Stacy’s research applies to any type of data collection method. Here is what I took away from the article:

It’s not enough to have a survey; you need a strategy. A stand-alone survey gives you just that: one view. Designing a survey that supports a business strategy, on the other hand, provides data that can be aggregated with other data sets for a more accurate and complete picture.

If you ask generic questions, you get smiley or frown-y answers but no insight. Whether you have 12 questions or 40, if they don’t speak to your culture, your workforce or your targeted customers, I’m hard-pressed to see how you will get any “aha’s” that are worth the time and expense of a survey.

What specifically do you want from a survey or any “listening post” that you design? “Because we’ve always done one” or “Because everyone else does it” are not specific objectives. What business issue do you hope to resolve? What’s keeping you or your boss up at night?

How are you analyzing the data? Scores alone may not tell you what you need to know. Trends without relationship to some rationale are just up or down points on a chart. Desktop tools exist to assist. Help is available (call me).

If you’re hung up on benchmarks, you may never get to the “why” of your own project. I know that some organizations swear by a benchmark study, and who can argue, as long as the benchmarks map precisely to your own situation and they aren’t your only measure? Like generic questions, without the appropriate construct a benchmark exercise can leave you with no specific roadmap for your success.

Collecting data is a critical component of every function these days. It's a project like any other, with objectives, outcomes and measures. Is your data giving you what you need?


Wednesday, March 23, 2011

Do Scores Matter if You Don't Know What is Critical to Your Customers?

Last week, I wrote about the importance of knowing what drives customer dissatisfaction; that post dealt with data interpretation and the need to dive more deeply into the business issues as well as the data.

There are exploratory ways to interpret data as well as more analytical methods. The key to correct interpretation is knowing which approach will deliver insights to your business. As I said in my last post, even with good data you can come to the wrong conclusions if you don’t use the right tools. In the case of customer loyalty, you could be investing in programs that have little or no impact on the customer’s intention to buy again, or you could be ignoring “dissatisfiers” that diminish a customer’s perception of your critical attributes.

So, there are a few initial questions to ask:

  • What do you believe are the attributes that contribute to your customers’ loyalty?
  • Are you measuring those attributes specifically in any data gathering exercise including social media monitoring?
  • Do you classify these attributes in terms of their importance or their impact on the customer?

The example below is from an article titled Guests’ Perceptions on Factors Influencing Customer Loyalty, which appeared in the March 2010 issue of the International Journal of Contemporary Hospitality Management.

  • Customer service: Dissatisfier
  • Cleanliness: Neutral
  • Room quality: Dissatisfier
  • Value for money: Critical
  • Quality of food: Dissatisfier
  • Family friendliness: Neutral

The authors selected typical product and service attributes for a guest at a hotel and designed questions around those.
Using simple regression, they did something very interesting and, in my view, very revealing with the data they collected. The usual loyalty question (intention to return) was asked, and every other question was tested against it. Then the questions with scores above the median were tested individually, and those below the median were tested in the same way. The results were plotted against agreed criteria ranging from Critical to Neutral.

Critical Attributes were significant in both tests and would be considered a driver of loyalty as well as a reason to switch. These attributes attract both high praise and high complaints. Performing well in other areas won’t compensate for low performance here.

Dissatisfiers were significant in testing low performance but not when high performance was tested. So, if customer service is bad, it influences a decision to switch but an average experience doesn’t critically drive loyalty. These are the attributes that should be maintained but not at the expense of more critical ones.

Neutrals generally may not be noticed by customers and although bad performance would reduce perceptions of quality, it would not be to the point where quality is considered poor.
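If you want to experiment with this kind of classification on your own survey data, here is a minimal sketch of what it might look like in Python. It only illustrates the median-split regression idea described above; it is not the authors’ code, and the column names, the 0.05 significance threshold and the use of simple linear regression are my assumptions.

```python
# Illustrative sketch only: label survey attributes Critical, Dissatisfier
# or Neutral using the median-split regression idea described above.
# Column names and the 0.05 threshold are assumptions, not from the article.
import pandas as pd
from scipy import stats

ATTRIBUTES = ["customer_service", "cleanliness", "room_quality",
              "value_for_money", "food_quality", "family_friendliness"]
LOYALTY = "intent_to_return"   # the usual loyalty question
ALPHA = 0.05                   # assumed significance threshold

def classify_attributes(df: pd.DataFrame) -> pd.Series:
    """Label each attribute by where it predicts intention to return."""
    labels = {}
    for attr in ATTRIBUTES:
        median = df[attr].median()
        low = df[df[attr] <= median]    # below-median performance
        high = df[df[attr] > median]    # above-median performance
        # Simple regression of loyalty on the attribute within each half
        # (assumes the scores still vary within each half).
        p_low = stats.linregress(low[attr], low[LOYALTY]).pvalue
        p_high = stats.linregress(high[attr], high[LOYALTY]).pvalue
        if p_low < ALPHA and p_high < ALPHA:
            labels[attr] = "Critical"       # matters at both ends
        elif p_low < ALPHA:
            labels[attr] = "Dissatisfier"   # only poor performance matters
        else:
            labels[attr] = "Neutral"
    return pd.Series(labels)

# Usage: print(classify_attributes(pd.read_csv("guest_survey.csv")))
```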


There are some lessons to be learned, I think, from this type of data interpretation:

  • Simple statistical tests can deliver insight that would be difficult to obtain by exploring low and high scores alone.
  • Knowing whether a key attribute drives loyalty, has no effect on it, or destroys it is invaluable when designing the customer experience and making the right investments.
  • This is the kind of "I know" insight that is compelling when reporting on Voice of the Customer issues.

How are you measuring customer loyalty? Do you know which attributes are critical, which areas need only minimum performance to maintain loyalty, and which have no impact on loyalty at all?

Tuesday, March 15, 2011

What's Wrong With Taking Survey Data at Face Value?

I’ve been designing survey questionnaires and analyzing the data for so long that I often forget that some people may not be doing the deep dive and asking the hard questions of the data they’ve collected, which is exactly what our clients hire us to do (thank you, you know who you are). Maybe a little exploratory analysis, a tad of correlation, a glance at the verbatim comments and we’re done until the next time. Did we do a survey? Check. Did we do anything with it? Sure, sort of. Do we have a deep understanding of what the data means? Well….

What’s worse than not gathering intelligence from customers and employees?
Coming to the wrong conclusions!

I’m reminded of this fact by two articles I read last week: In This Case, Let’s Examine Dissatisfaction in the February issue of Survey magazine and Guest Perceptions on Factors Influencing Customer Loyalty in the current issue of the International Journal of Contemporary Hospitality Management.

In the case of customer dissatisfaction, the article suggests several calls to action:

Understand whether you have a category problem rather than a brand problem. In other words, your competitive space may allow easy switching with or without loyalty programs so make sure you know what your issue is before investing in programs that will not alleviate it.

Your market strategy will drive a customer’s perception of satisfaction. If you are a low-cost provider, you have accepted that a lower level of quality and service is part of the equation. The danger zone is trying to be low cost while also attracting a customer who expects a different level of product and service.

Benchmarking. I’ve never been a fan, but lots of companies do it, and the swirling vortex you get sucked into is comparing your performance to companies that target different customer segments.

Dissatisfaction may not arise from what you do but rather what other, similar companies do that you don’t do. Customers constantly evaluate decisions based on alternatives; some amount of dissatisfaction arises with your product and service even if you are executing your strategy perfectly.

My suggestions for arriving at the best conclusions possible from your data analysis:

  • Keep your strategy uppermost in mind when designing the project and return to it often when analyzing data. This means knowing who your competition is, who the ideal customer is and what your competitive advantages are.
  • Design survey questions to be particular rather than general. The more generic the question, the less likely it is that you have actionable data and the more likely you are to arrive at the wrong conclusions.
  • Don’t confuse happy with satisfied. If you want to meet a customer’s needs, you are aiming for satisfaction. If you want happy, that’s a whole different level of expectations.
  • Perform data analysis from several different perspectives. Not all survey questions should be treated equally in reaching conclusions.
I’ll write more about the last topic next week.




Wednesday, March 2, 2011

Can You Hear Me Now? It's Your Customer Calling

Does anyone still breathing believe that we can succeed without customers? I didn’t think so. If people don’t want what we’ve got to sell, we’re toast. So, there should be no argument that customer satisfaction and loyalty are goals we should be eagerly pursuing since the pundits tell us that it costs five times as much to acquire a new customer as it does to keep one.

I know, some of you are saying satisfaction and loyalty, yes, “but not at all costs” and “not the wrong customers for our business”. Exactly. So how are you managing those customer issues for your benefit and theirs?

Hello? I’m Out Here and Here and Here…: There are so many channels from which to collect customer feedback and each one potentially provides data gold. However, what is interesting may not be useful. Before buying, jettisoning or adding onto your business intelligence platform, a game plan is imperative.

Customer orientation is a critical part of developing a business strategy. It’s hard to describe competitive advantage and competencies unless we’ve defined how we will deliver products and how we will serve our customers. Business intelligence needs to be aligned with both strategy and customer needs. A company’s market research used to be private; now it is open and available on every social media outlet and search engine just by mining customer comments. We shouldn’t ignore the qualitative nature of that data or fail to integrate it with our other customer data sets; it is necessary for ongoing strategy development.
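As a toy illustration of that last point, here is one way qualitative comment data might be folded into a behavioral data set. Everything in the sketch (the data layout, the crude keyword “sentiment”, the column names) is an assumption made for the example, not a recommended method.

```python
# Toy example: attach a crude sentiment label to customer comments and
# merge it with behavioral data for a single customer view.
# All data, column names and keyword lists here are invented placeholders.
import pandas as pd

comments = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "comment": ["love the new app", "shipping was slow again", "great support team"],
})
orders = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "orders_last_year": [7, 2, 5],
})

NEGATIVE = {"slow", "broken", "rude", "never"}   # crude keyword "sentiment"

def crude_sentiment(text: str) -> str:
    """Flag a comment as negative if it contains any of the keywords."""
    return "negative" if set(text.lower().split()) & NEGATIVE else "positive/neutral"

comments["sentiment"] = comments["comment"].apply(crude_sentiment)

# Join the qualitative signal to the behavioral data, one row per customer.
single_view = orders.merge(comments[["customer_id", "sentiment"]], on="customer_id")
print(single_view)
```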

See Your Company Through Your Customers’ Eyes: It’s disappointing to deal with companies that seem to have no idea how many customer touchpoints they have, and infuriating that those touchpoints are not integrated. It’s inexcusable that frontline employees are neither selected nor trained for a service job. Invite some of your co-workers into a room with a big whiteboard and start mapping customer interactions; it is a mind-boggling exercise. Or just read your customer comments online.

Create a Single View of a Customer: I’ve said it before: customers do not care about our silos. It’s not their job to sort out which department is responsible for resolving a problem. A fragmented approach, rather than Outside-In customer experience design, means that your employees have a perspective of your customer based on their job roles, not on customer expectations.

It’s Called Voice of the Customer: I design a lot of feedback and research surveys and I am sent a few as well. It takes me less than 30 seconds to decide whether or not I’ll complete a survey, depending on how it’s constructed. Questions should be properly worded and of an appropriate length, free of bias, actionable, and linked to the customer experience as well as the company’s metrics and Key Performance Objectives.


I’ve been on a data and analytics soapbox for a while, and there is no doubt that through the collection of behavioral, rational and experiential data, we are doing a better job of knowing who buys from us and why. To understand what we didn’t sell and why requires tapping into customer emotions. Maybe someone will develop an App for that. Until then, we have to collect, aggregate and understand qualitative data to hear what our customers are feeling as well as saying.

To paraphrase a line from my newest favorite film, The King’s Speech, your customers have a voice. Yes, they do.



Tuesday, February 22, 2011

"Even the Longest Journey Must Begin Where You Stand"

The quote is from Lao Tzu, the Chinese philosopher-turned-management-guru, and I like it for a couple of reasons. First, it’s so true: you have to honestly appraise your current situation in order to reach any goal. Second, no matter how bold your strategy, you can’t obfuscate the situation, thinking that a strategy only needs to be stated to be accomplished. As any successful person will tell you, there’s a lot of sweat equity that has to be paid between where you stand and the journey you take.

As you probably know by now, I’m fascinated by data and passionate about analytics and how both will transform our businesses. However, it’s a journey, not a sprint, and it begins with assessing what we call “the path to desired business results”:

There are five key drivers of performance in any organization and while analytics might be the means to an end, these enablers are the catalysts. So, standing where you are now, it’s worthwhile asking the following questions:

Strategy: How will analytics help us compete successfully? Will we be able to differentiate ourselves in our markets using analytics?

Leadership: Are we as leaders prepared to commit the organization to an analytics based way of making decisions? Can we give the employees who will have to make this work the buy-in they need to be successful? Can we put aside our impatience and allow them to Think Big but Start Small?

Culture: Do we have data fiefdoms that refuse to share data or collaborate on projects? Do we celebrate the efforts of the early adopters, even if success isn’t guaranteed every time, in the spirit of discovery and experimentation? Analytics is all about experiments, testing, and doing it over and over. Have we made too many of our decisions by the seat of our pants and been proud of it?

Employees: Do we want analytics to cascade down into the organization and, if so, are we prepared to properly train those whose jobs it will be to manage the technology that supports their business knowledge? Are we hiring employees for competencies that underpin the need for a broadly based analytics movement?

Customers: In most businesses, this is the area that has received the most analytics attention, so the questions here are: Is our customer data in one place? Is it at the lowest level of analysis possible? Have we aggregated it across all of our channels?

The analytics journey is probably never-ending, as technology and competencies improve exponentially to deliver more insight with less complexity. But knowing where you stand before you take the first step – or flying leap – will ensure that this critical initiative doesn’t crash and burn at the first turn in the road.

What is your analytics journey like? Are you looking first at where you stand or sprinting off for the unknown?

Tuesday, February 15, 2011

Transform Your Metrics From "So What?" Into "Who Knew?"

There are a lot of 3D movies out; have you noticed? I don’t seek them out, but I appreciate the fact that people may enjoy a film more when it is multi-dimensional and they can feel immersed in the action.

I think we get too fond of our metrics; we have them because we’ve always had them. We track metrics, manage them and present their variances against performance goals. The problem? They often are one-dimensional and not very meaningful outside of our own function. And if they aren’t tied to a real business outcome, it’s hard to make a case for the programs we want to implement. We often don’t monetize metrics, and money is the language of our bosses, so there’s a sense of “so what?” when we present.

So, how do you make a common metric like turnover (employee or customer) more three-dimensional and get people immersed in your action?

Embed Metrics With Data: Not just the obvious data of people in/people out. Drill down; explore the data. There’s an “aha” in there, I promise, and since you have the business context, there is no one better positioned to see it and explain it.

Use Data Sets from Other Departments: Make your metric multi-dimensional by bringing in data from HR, Sales, Marketing, Operations, Process, Call Center: whatever data set you have, add to it in a smart way by collaborating with other departments who also have valuable data that isn’t yet insight. We have to dismantle data fiefdoms and share. Where does turnover impact the business? How does it impact the business?

Try Simple Statistical Tests: This is the point at which people click off because they think it’s not in their skill set. If you have Excel on your PC, you have a statistical toolkit. Invest in a great little e-book that provides a huge amount of good, well-presented information (Using Excel to Solve Business Problems by Curtis Seare). Try out various assumptions to see which are more powerful. Who is leaving? What is driving turnover? How does it affect customers? How does it impact employees? Where does it affect business goals? Experiment with results and keep testing.
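The book’s examples live in Excel, but the same simple tests run anywhere. As one illustration of the “who is leaving, and from where?” question, a chi-square test of turnover by department might look like this in Python (the file name and column names below are invented, not from any source):

```python
# Illustration only: is turnover independent of department?
# The file name and column names are invented placeholders.
import pandas as pd
from scipy import stats

# Assumed layout: one row per employee, with a 'department' column
# and a 0/1 'left_company' flag.
df = pd.read_csv("employee_roster.csv")

# Cross-tabulate leavers vs. stayers by department...
table = pd.crosstab(df["department"], df["left_company"])

# ...and run a chi-square test of independence.
chi2, p_value, dof, expected = stats.chi2_contingency(table)

print(table)
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Turnover differs across departments more than chance alone would suggest.")
else:
    print("No evidence here that turnover depends on department.")
```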

Provide a Business Context: Sometimes people get hung up on statistics, even simple ones, and forget that the most important point is taking what the statistics can tell you and mapping that to what you know about the business.

Tell Me Something I Don’t Know: aka Monetize the Results. When you know what turnover really costs the company and what it costs to improve the situation, you will have the attention of people who haven’t seen your metrics/data/ideas presented in a way that they understand.
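For example, a back-of-the-envelope version of that monetizing step might look like the sketch below. Every figure in it is a placeholder to be replaced with your own numbers, not a benchmark.

```python
# Purely illustrative arithmetic for putting a dollar figure on turnover.
# Replace every number with your own data; none of these are benchmarks.
leavers_per_year = 25        # voluntary exits in the year
avg_salary = 60_000          # average salary of those who left
replacement_factor = 0.5     # assumed hiring plus ramp-up cost, as a fraction of salary

annual_turnover_cost = leavers_per_year * avg_salary * replacement_factor
print(f"Estimated annual cost of turnover: ${annual_turnover_cost:,.0f}")
# -> Estimated annual cost of turnover: $750,000
```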

Then, your metrics are multi-dimensional and provide real intelligence for the organization.

How are you helping your decision-makers get immersed in your metrics? Are they in 3D?

Thursday, February 10, 2011

What Kind of a Boss Are You Anyway?

The title of today’s post came from a reader who sent me an email after my blog about employee commitment. There’s a lot of pent-up frustration out there about us managers, people, and it’s bound, as my reader suggested, to chase away your best and brightest.

As someone who likes to research a topic in order to develop it properly, I’ve done my due diligence and here is how some of the issues stack up. Do you recognize even one that applies to you? If so, there are things you can change IF you have the heart and mind to do it.

Don’t Believe All the Stuff You Read in Reviews: Only Paris Hilton does that. Put your own boss’s comments in context. Is she bad about having productive performance conversations, and is the annual review (if you even get one) rushed and bland? Maybe you should ask your team how you’re doing and really listen for feedback.

Are Your Bad Habits Rubbing Off on Your Staff? Are you continuously late to team meetings or do you regularly ask employees to work extra hours because you couldn’t get your act together? As Stephen Covey said, being in the thick of thin things means that the important things never get done – until they reach a crisis. Putting out fires is not the measure of a good manager or leader no matter how good that adrenaline rush feels.

Do You Show Genuine Respect for Other People? I confess: I absolutely hate people responding to email and texting during meals and meetings, or continuing to work while I sit there for a scheduled meeting, so that the only functioning brain in the room is mine because the other person erroneously believes in the myth of multitasking. It’s been proven that multitasking is less productive than attending to one thing at a time and following through on it. If you don’t respect my time or the purpose of the meeting, don’t call one -- or me.

You Went on the Leadership Courses; Now What? I’m not one of those people who believe that leaders are born, not made, although the potential needs to be there. Winston Churchill was a terrible peacetime leader, but he understood what it took to lead in the darkest of times. I don’t think he went on a course to develop that skill set. The point is, leadership training can be generic and disconnected from either your job or your organization’s culture, making it very difficult to apply the content in everyday work. The best recipe is training that is a direct result of a good performance review (not filling a predetermined number of annual training hours), coupled with mentoring and coaching. Being curious and unafraid to ask questions are also vital development tools.

As managers, too often we don’t take stock of our own performance and how it affects the people who have to make things work on the team. Are we lazy or in over our heads? Are we modeling the behaviors inflicted on us by our own bosses? When you find one day that your best people are leaving, you may need to own up to the fact that people rarely leave their jobs; they leave their bosses.

We’ve all had great bosses. Who were yours and what made them great?