Saturday, March 29, 2014

Your Initial CX Assessment...Part 4 - Staff Interviews

Okay, it’s now time to push back from your desk and start some “management by walking around.”  If you were fortunate enough to have access to all or some portion of the qualitative and quantitative research discussed in previous posts, you should have at least a reasonable sense of how the customer experience stacks up at your company.  However, there’s a maxim I go by whenever reviewing research results: “trust but verify.”  As previously mentioned, even the most rigorously designed and executed research is not “perfect” in that it can never truly capture the intangible and elusive feelings, motivations, and perceptions that are part of customers’ considering, shopping for, purchasing, and using your product or service.  The best you can hope for is a fairly strong approximation of how your customers experience the various ways they interact with your organization.  So, in this post, I’d like to discuss a group of additional sources that will complement your formal research: in-person discussions with select internal and external staff who deal directly or indirectly with your customers.  And yes, when we get to the posts that will focus on designing and using CX research, we’ll discuss how best to have a personal chat with your customers.

Internal Office Staff
As you begin your “primary research,” I’m going to suggest a couple of reasons why you should start by identifying those internal staff who don’t necessarily deal directly with customers, but nevertheless have roles that may fulfill a customer inquiry, or that support a front-line person who does interact with customers.  First, these staff may be able to fill in some of the context you’ve gleaned from the secondary research you completed previously.  While they may not be receiving direct feedback from customers, they will likely know what’s running smoothly and what’s not in terms of the internal processes, policies, and tactics that support the delivery of products and services to customers.  Second, in a future post, we’re going to discuss one of the most valuable tools for the CX manager - a customer journey map.  In developing your journey maps, you’ll need input from those who provide the product or service directly or indirectly to the end customer.  As you go through your interviews, make a note of those who you think might be valuable members of your journey mapping team.
Here are some questions to guide you as you walk around your office...
  • What does this person do, and what’s the connection to the customer?  How well do they explain what they do?  This is an important criterion for qualifying them for your journey mapping team.
  • If they had to redesign their job to improve their particular output, how would they do it?  Would this save time, improve quality, or save money?
  • What do they think of the company’s customers?  No, not personally, but rather, do they see things that aren’t working...certain customers, for example, tend to return items more than others, or tend to contact the call center with specific inquiries.
  • What do they think of the internal processes that support specific customer touchpoints?  Are these processes efficient, or do they hinder the job?  Do they think the processes are money well spent, or could they be done in a way that might incur less cost or time? 

Suppliers
Suppliers to the company have a couple of particular vantage points that could be beneficial in your CX assessment.  You’ll need to identify those vendors who, in some way, support the interactions the company has with its customers.  An obvious example of this is the company’s customer research supplier.  The majority of companies will retain a third party to execute their customer satisfaction survey.  So, a key question to ask this supplier is how serious they think the company is about managing its customer experience.  To answer this, the supplier should be able to share such things as... 
  • How often do they meet with the company to review survey results?
  • How often does the company request changes to the questionnaire and/or reporting?
  • How often do company staff log-in to the online survey reporting, or request ad hoc analytics from the vendor?
The company’s suppliers may also provide you with another potentially useful insight: how does your company’s customer experience compare with that of their other clients?  Again, your company’s research supplier no doubt provides comparable services to other clients, including your organization’s competitors.  This question is certainly fair game, and a good vendor should be able to provide you with a candid response while still maintaining the confidentiality of their clients.  

By all means, expand the scope of your supplier investigation beyond your research vendor.  If yours is a manufacturing company, for example, it likely sources components from an extended supply chain.  Have a chat with these vendors and ask about such things as: What are your company’s expectations for the quality of the components it orders?  Do they think your company is a leader in serving the marketplace with products liked by customers? 

Front-Line Staff
Whether it’s a receptionist, a salesperson on a showroom floor, a flight attendant, or a call center agent, these staff, and others like them in direct customer-facing roles, literally hear the “voice of the customer.”  They’ve seen the good, the bad, and the ugly in customer experiences, and can likely talk at length about what’s working and what’s not.  Before we get into a few suggested questions for this group, there are a couple of caveats to keep in mind.  First, it’s critically important that you speak with as many front-line staff as is reasonably possible.  You’ll likely find that many staff will either have some ax to grind with the company, or conversely, see their employer as a place that can do no wrong.  Both of these views will adversely color what should be your balanced judgment, so talk to as many people as you can in the hope of smoothing out these extremes.  The second caveat flows from the first: make sure you talk to veterans and rookies alike.  They’ll each have their own unique perspectives that should provide rich context for your assessment.  Now for some suggested questions...
  • How long have they been with the company?  And what’s their tenure in this customer-facing role?
  • What are customers telling them about the company’s products and services?  You’ll want to confirm that this is not their spin on what customers are saying, but rather, is an almost precise replay of customers’ comments (i.e. “the unvarnished truth”).
  • What do they think is going well?  You’ll want to probe on this, so ask them to be as specific as possible (e.g., does this apply to all customers?  Is it going well in specific locations but not others?). 
  • What compliments are they receiving from customers?
  • In their opinion, where is the company not meeting customers’ expectations?  Again, try to collect as much detail as possible.
  • Finally, a question for yourself: of those you’ve interviewed, who do you think displays a strong level of credibility?  Who explains things well to you?  Make a list of these staff, as I strongly recommend you include them in future journey mapping and planning meetings.

In the next couple of posts, we’re going to continue our CX assessment by spending some time making some sense of all the information you’ve collected thus far.

Tuesday, March 25, 2014

Your Initial CX Assessment...Part 3 - Qualitative Research and Customer Exit Surveys

In this post we’re going to look at two seldom-used (in the customer experience context) but invaluable sources of information.  

Qualitative research...typically focus groups or individual interviews, and (increasingly used in the CX domain) ethnographic observation, provides your CX assessment with the all-important customer sentiments that are not available through quantitative research.  While quantitative research will tell you how many customers say “x” or purchase “y”, a well designed qualitative study should give you a sense of how and why customers feel the way they do about interacting with your company and its products and services.  As Anne Beall points out in her book, Strategic Market Research (1), “...qualitative work allows us to explore all the potential thoughts, feelings, and behaviors of a group of people.”  The results of qualitative research can be presented in various formats...written reports typically prepared by the moderator, audio or video recordings of the customer discussions, and in some cases, the transcripts that usually accompany online panel discussions.  If you’re fortunate enough to come across some recent qualitative research that your organization has completed on topics such as how customers learn of, evaluate, purchase, and use your product or service, here are some items to look for and potential questions to keep in mind...
  • How’s their body language?  Obviously you can only get a sense of this visually, so if you’re lucky enough to either personally view a focus group or have access to a video recording, pay careful attention to not only what customers say, but their expressions and body movements as well.  Saying positive things about their experience with the company in the absence of smiles or other positive expressions may tip you off that the customer is not being sincere.
  • In listening to audio or watching a video of the research, how would you characterize the sentiment of the participants?  Were their views generally favorable towards the company, or was there a noticeable sense of dissatisfaction?  More specifically, did the respondents appear or sound energetic when they spoke favorably of the company?  And, conversely, could you detect a sense of hostility, disappointment, or frustration when they discussed dissatisfaction?  Also, importantly, what specifically did they express their positive or negative sentiments about?  This seems obvious, but in qualitative research it’s often tempting to generalize that if customers like / dislike something in particular, they must like / dislike the entire experience.
  • If possible, take a step beyond the official reporting and recordings, and have a chat with the interviewer / moderator.  A good facilitator may be able to discern the hidden dynamics of a group discussion and identify “off the record” statements made by participants that may reveal their genuine feelings about your organization.  In my own moderating experience, for example, I asked the participants to walk around a prototype product and report back on their impressions.  Standing next to a couple of the respondents, I overheard their very critical and negative remarks about the prototype.  On returning to the group discussion, these participants then provided positive impressions of what they saw...these “disconnects” are not uncommon in group discussions.  A good moderator will catch this, and is thus a good source for learning what customers are “really thinking.”

Exit Surveys...these surveys or interviews, which are also referred to as defector or loss surveys, are surprisingly absent from many companies’ formal customer research.  As the name suggests, these surveys are completed by customers who no longer do business with your company.  And while in some cases it’s not easy to identify those who take their business elsewhere, in other instances it’s readily apparent.  If your customers subscribe to your company’s offering, you should be able to easily determine if they don’t renew.  The same logic applies to leasing or rental arrangements.  And of course, in certain sectors, such as financial services, a customer can simply close their account and have their funds transferred to a competitor.  I think these opportunities to capture customer feedback are so useful that if your organization only has the budget to execute one type of survey, this should be it...why?  Because you’ll learn much more from your failures than from your successes.  
From a well designed exit survey or interview, you should be able to add the following to your overall CX assessment....

  • Broadly speaking, do customers defect because of dissatisfaction with your company’s product, or with the supporting service?  Can you isolate which products and/or services?
  • What is the trend for these defections?  Are they consistent over a long time period, or is this something relatively recent?  If the latter, you’ll now need to determine if they’re possibly associated with a recently launched product or service.
  • Are the defections associated with a particular geographic region or retail location(s)?
  • Can you attribute the defection to customer dissatisfaction, or are customers no longer purchasing from you because your company doesn’t offer a product or service that may better meet their needs?
  • Lastly, and perhaps most importantly, are you able to identify who these customers are?  Specifically, I’m not referring to personal identification, but rather, whether the customers can be identified based on their purchase frequency and revenue to the company.  This implies that your company maintains a robust transactional database to facilitate this.  The ability to identify lost customers by the value they provide goes to the 80/20 rule...20 percent of your customers likely account for 80 percent of your organization’s profit...are these lost customers coming from that 20 percent?
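To make that 80/20 check concrete, here’s a minimal sketch of flagging which defectors come from your top 20 percent of customers by revenue.  The customer IDs and revenue figures below are invented for illustration; in practice you’d pull both from your transactional database.

```python
# Invented data: annual revenue per customer, and customers flagged as lost.
annual_revenue = {
    "C001": 12000, "C002": 300, "C003": 8500, "C004": 450,
    "C005": 9700, "C006": 150, "C007": 600, "C008": 11200,
    "C009": 700, "C010": 250,
}
defectors = {"C003", "C006", "C008"}  # IDs flagged as lapsed in the CRM

# Rank customers by revenue and take the top 20 percent.
ranked = sorted(annual_revenue, key=annual_revenue.get, reverse=True)
top_20 = set(ranked[: max(1, len(ranked) // 5)])

# Which defectors fall inside that high-value group?
high_value_losses = defectors & top_20
share = len(high_value_losses) / len(defectors)
print(f"{share:.0%} of defectors are top-20% customers: {sorted(high_value_losses)}")
```

Even a rough version of this calculation tells you whether your exit survey results are describing marginal buyers or the customers who fund the business.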
In future posts, we’ll look more closely at designing and implementing a comprehensive quantitative and qualitative Voice of Customer system.

(1) Strategic Market Research, by Anne E. Beall. Page 12

Friday, March 21, 2014

Your Initial CX Assessment...Part 2 - Internal Surveys

Continuing on in our initial attempt to get a high-level understanding of the state of customer experience at Widget Inc., the initial focus is on the information contained in customer surveys.  The previous post looked at industry-wide surveys typically conducted by market research firms, and identified some key items to look for from these sources, including peer rankings across various metrics, and the need to always be cognizant of sample sizes for each question.  In this post, we’re going to look at internal (sometimes referred to as “proprietary”) surveys conducted by companies themselves.  Many large and mid-size organizations will contract with a market research firm to execute their own survey of their customers.  These surveys are generally of two types...

Transactional Survey - in this type of survey, the customer receives a questionnaire after a particular transaction such as, for example, an online or in-store purchase, or a contact with the company’s call center.  Consequently, these surveys tend to be very “diagnostic” in that they ask questions about specific aspects of a transaction (e.g. were you greeted promptly, was the facility clean, how knowledgeable was your attendant).  Transactional surveys are the most common method companies use to gather customer satisfaction feedback.

Relationship Survey - these surveys can take a couple of different forms.  They could be presented in the form of a “brand health” assessment or audit, where respondents are asked to rate various components of the company’s operations, including its products, service, community involvement, etc.  Given the general, high-level questions asked in a brand health survey, it is not likely to provide a very revealing or informative picture of the customer’s satisfaction with the company’s offerings.  Another type of relationship survey asks more general questions about the customer’s experience over an extended period of time (e.g., quarterly or annually).  The objective of a relationship survey is to account for some of the variability often seen in transactional surveys by asking customers, not about a particular encounter with the company (which may be positive on one occasion, but negative on another), but about their perceptions across a variety of encounters with the company over time.

Let’s now move on to discuss some of the key benefits and drawbacks associated with most internal surveys.

Whether discussing a transactional or relationship method of gathering customer feedback, a key draw of an internal survey is the potential for associating a respondent with a particular transaction.  Typically, at either the beginning or end of the questionnaire, customers are asked for permission to reveal their identity, and in my experience, many respondents will disclose their name on the survey.  
The benefit of capturing the customer’s name is realized if the organization maintains a comprehensive transactional database.  If properly designed, these databases will contain a customer’s contact information, as well as their transaction history with the company.  Transactions could include purchases, repairs, and warranty claims, to name a few.  By matching a customer’s internal survey responses to a particular transaction, or series of transactions, insights can be derived into such areas as...
    • Do customers with similar attributes share the same type of feedback?  For example, if the company’s database segments customers along particular attributes, it would be worthwhile to determine whether other customers in that segment provided similar responses to survey questions.
    • The identification of a customer by name also makes it possible for the company to contact that person and offer a remedy for an unsatisfactory experience.  This response to a problem takes on added importance in the case of loyal customers who frequently repurchase.  We’ll explore this “closed-loop feedback” method in much more detail in future posts.
    • Like the third-party industry survey we reviewed in the March 15 post, a well designed internal survey should also contain multiple years of results.  This allows for the trending of customer feedback over time in order to identify specific points where performance improved or deteriorated.  Your knowledge of trends will be critical as you proceed with your initial customer experience assessment because you’ll be in a position to ask what may have occurred organizationally at the time of the change in performance.
    • Finally, a big potential benefit associated with internal surveys is that they typically capture unfiltered and unedited customer comments.  If you have a large enough sample of replies, these “verbatim” comments can be invaluable sources of learning and assessment.  Spend some time with these comments and discern any trends and insights.  For example, are they gender specific?  Do they come from particular geographic or sales regions?  Are they associated with particular products or services?  Be careful not to draw conclusions from only a handful of replies.
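As a rough illustration of the survey-to-database matching described above, here’s a small sketch that joins named respondents to their segment from a transactional database and averages satisfaction by segment.  All field names, segments, and ratings are invented; a real implementation would query your actual database rather than hard-coded records.

```python
from collections import defaultdict

# Invented survey responses from customers who disclosed their identity.
survey_responses = [
    {"customer_id": "C001", "satisfaction": 9},
    {"customer_id": "C003", "satisfaction": 4},
    {"customer_id": "C007", "satisfaction": 7},
]

# A slice of the (hypothetical) transactional database: segment per customer.
customer_segments = {"C001": "premium", "C003": "budget", "C007": "budget"}

# Group ratings by segment: do customers with similar attributes share feedback?
ratings_by_segment = defaultdict(list)
for response in survey_responses:
    segment = customer_segments.get(response["customer_id"], "unknown")
    ratings_by_segment[segment].append(response["satisfaction"])

average_by_segment = {
    seg: sum(scores) / len(scores) for seg, scores in ratings_by_segment.items()
}
print(average_by_segment)
```

The same grouping idea extends to the other cuts mentioned above: swap the segment lookup for gender, region, or product line and you can test whether verbatim comments and ratings cluster in any of those dimensions.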

Unfortunately, internal surveys can also have their share of shortcomings that you ignore at your peril. Let’s take a look at a few of those drawbacks that, if not recognized, can undermine your attempt to get a candid picture of your company’s customer experience...

  • Bias towards more favorable ratings - perhaps because customers perceive that their responses may be seen by the company’s staff, they tend to indicate relatively higher levels of satisfaction than on surveys administered by a third party.  This is the case even when customers submit their replies anonymously.  Conversely, customers tend to be more candid when they perceive that their responses will be viewed by an independent source not affiliated with your organization.  If you have access to the results of a third-party industry survey of your customers, it’s worthwhile to identify similar or identical questions that appear in both your internal survey and the industry survey.  Now you’ll be able to compare how your customers responded to these similar questions...take the time to identify any trends in the ratings.  All other things being equal, I would have more confidence in the validity of the replies from the third-party survey...just make sure that there’s a statistically valid sample size.
  • Survey “gaming” - a second potentially significant drawback with internal surveys is, unfortunately, the possibility of customers being manipulated in some way by the staff they dealt with during their transaction.  Gaming can take several forms including a bribe - “if you give me a good score on the survey you’re going to receive, I’ll throw in a little extra in your purchase;” or a hard luck story - “my compensation is based on the score you give in the survey...”  These types of manipulations tend to be more common in some particular industries, so be on the lookout for this.  If you can identify widespread “gaming” be very careful about drawing any substantive conclusions of what your customers are saying about your company.
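One way to operationalize the internal-versus-third-party comparison is sketched below.  The question labels, mean scores, and the 0.5-point flagging threshold are all hypothetical; you’d substitute the actual questions your two surveys share and a threshold that makes sense for your rating scale.

```python
# Invented data: mean scores for questions appearing in both surveys,
# on the same 1-10 scale: (internal survey mean, third-party survey mean).
shared_questions = {
    "overall_satisfaction": (8.6, 7.9),
    "likelihood_to_recommend": (8.2, 7.4),
    "service_quality": (7.8, 7.7),
}

# Gap per question: a persistently positive gap on the internal side is the
# favorability bias discussed above.
gaps = {q: round(internal - third_party, 1)
        for q, (internal, third_party) in shared_questions.items()}

for question, gap in gaps.items():
    flag = "  <- possible favorability bias" if gap >= 0.5 else ""
    print(f"{question}: internal-vs-third-party gap {gap:+.1f}{flag}")
```

A consistent gap across most shared questions suggests the bias is systematic, while a gap confined to one or two questions is more likely a wording or sampling difference.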

Saturday, March 15, 2014

Your Initial CX Assessment...Part 1 - Industry Surveys

In my previous post, I outlined some potential sources of customer satisfaction information that  can be used to form a preliminary assessment of the current state of the customer experience.  These sources include: industry studies of satisfaction by company (typically referred to as syndicated reports); internal or proprietary customer satisfaction surveys; qualitative research; internal reports directly from the front-line staff that deal directly with customers; and internal reports from the organization’s call center.  In today’s post, let’s have a look at how to use industry surveys as a diagnostic tool.

Industry-Wide Syndicated Surveys
You’ve no doubt seen the ubiquitous advertising touting a company’s rating in a J.D. Power survey of, take your pick, airlines, financial services providers, wireless phone carriers, etc.  In these surveys, researchers such as J.D. Power send thousands of questionnaires to, for example, a sample of customers who purchased a new car in the preceding 90 days.  Customers’ responses are statistically analyzed, and a host of performance metrics are produced.  From this, you can learn how your organization performs against select competitors on such dimensions as overall satisfaction, likelihood to recommend and repurchase, as well as satisfaction with such things as the company’s website, retail channels, and post-purchase service and support.  A few specific questions to ask when looking at industry satisfaction reports...

  • What is the sample size overall, and for each question?  (Remember, customers may respond to some questions, but not all.)  Keep an eye out for small samples (typically, fewer than 100 respondents), particularly if your company has large sales volumes.  If Widget Inc. sells 100,000 widgets every month, you don’t want to infer too much from a sample of 70 customers.
  • Select a few of the performance metrics in the report and look at the trends for your company over the last 3 to 5 years...what’s gone up and what’s gone down.  Make a note of these, as you’ll want to explore them in more detail when you start talking to internal staff later on in your current situation assessment.  I can’t overemphasize the importance of trend analysis in your assessment...when it comes time to address particular problem areas, you’ll want to know if this has been a long-standing issue, or something that appeared recently.
  • Some industry-wide reports include something called importance weightings, or loyalty drivers.  These are particular aspects of a company’s performance that, according to the statistical analysis completed by the research firms, are so important to customers that they significantly influence a purchase decision, or a customer’s overall satisfaction with a product or service.  In the case of a wireless carrier, for example, this could be coverage in remote areas.  For a car purchase, an important driver could be the vehicle’s reliability.  
  • A word of caution in using these drivers...not all customers are created equal.  Remember, results from industry surveys are typically aggregated across all respondents...young and old, male and female, thrifty and big-spender.  So, unless these drivers can be identified by (representative) sample sizes, be careful in concluding something like, “ALL of our customers say that the price of our ABC widget is an important purchase consideration.”  Price may be important to the thrifty ones, but the big spenders don’t really care.
  • Where do you rank versus your competitors?  In which facets of the customer experience do they perform well?  Are their customers the same as yours? If so, you may be able to use the survey data to compare their performance against yours along the same attributes.  This can be very instructive as it will allow you to pinpoint the specific areas where customers say you underperform against your competitor(s)...just be careful with sample sizes!
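If you want to quantify just how little a sample of 70 tells you, here’s a quick back-of-the-envelope calculation.  It assumes a simple random sample and uses the standard worst-case proportion of p = 0.5, so treat it as a rough gut check rather than a substitute for the research firm’s own methodology notes.

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion from a sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Compare a small sample against larger ones.
for n in (70, 100, 1000):
    print(f"n = {n:>4}: +/- {margin_of_error(n):.1%}")
```

With 70 respondents the margin is nearly 12 percentage points either way, which is why ranking shifts between closely scored competitors on small samples usually aren’t meaningful.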

In the next post, we’ll look at how to use the results from your company’s own customer surveys.

Thursday, March 13, 2014

Where to Start?

You were recently hired by Widget Inc. as the new customer experience manager, and this is your first day on the job.  Although Widget Inc. makes good quality widgets, the only department adding headcount is customer service because they need to deal with the increasing flood of dissatisfied purchasers.  Furthermore, in the last year, the cross-town rival has taken 5 percent market share away from Widget Inc. even though their widget is comparable in cost and quality.
What’s going on?

Your new boss has brought you in as the savior (no pressure), and over the next several weeks, she expects a detailed road map from you on how the company is going to “provide a great new experience that’s going to wow current and future customers.”  Sitting at your desk, you study the company’s organizational structure and realize that producing a widget involves lots of processes, lots of directions, and lots and lots of staff.  CX is daunting, you say...and it is.  But it’s also manageable...and that’s where we’re going to begin our adventure at Widget Inc.

Developing a strategy always begins with a very thorough understanding of the current situation. This understanding typically starts at a high level and then progressively works its way down into the weeds to get at all the nitty-gritty details.

So, let’s see if we can identify some of the following key resources that may help us to better understand why Widget Inc. finds itself in a bind...

Does the company participate in an industry customer satisfaction study, and if so, does it have detailed reports covering the last 5 years?

Does the company conduct its own customer satisfaction survey, and if so, does it also have detailed reports covering the last 5 years?

Has Widget Inc. recently (in the last year or so) completed any qualitative research (focus groups, individual interviews, etc.) with current customers as well as with those no longer buying from the company, and if so, is there any discussion about why customers buy widgets from the company, what they like and dislike about the widget and the accompanying service, why the defectors stopped buying from the company?

Is there any formal reporting (e.g. contact reports) directly from the sales staff, and does it contain their perspective on what they’re hearing from customers?

Is there any reporting from the company’s call center, and does the report organize and tabulate the calls into distinct categories?  Information collected by a company’s call center can be a gold mine for the CX manager because, if formatted properly, it addresses many of the deficiencies often associated with formal surveys.  For instance, obtaining a representative sample of the company’s customer base can be an issue when there’s a relatively small group of survey respondents.  A good call center report, on the other hand, usually tabulates and categorizes ALL of the calls made by customers.
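As a simple illustration of that tabulation, here’s a sketch that counts calls by category and reports each category’s share of total volume.  The log format and category labels are invented; a real call center system would export far richer data, but the counting logic is the same.

```python
from collections import Counter

# Invented call log: each record carries the category assigned by the agent.
call_log = [
    {"call_id": 1, "category": "billing"},
    {"call_id": 2, "category": "product_defect"},
    {"call_id": 3, "category": "billing"},
    {"call_id": 4, "category": "shipping_delay"},
    {"call_id": 5, "category": "product_defect"},
    {"call_id": 6, "category": "product_defect"},
]

counts = Counter(call["category"] for call in call_log)
total = sum(counts.values())

# Report categories from most to least frequent, with their share of all calls.
for category, n in counts.most_common():
    print(f"{category:>15}: {n} calls ({n / total:.0%})")
```

Because this covers every call rather than a survey sample, the resulting category shares are a census of complaints and inquiries, not an estimate.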

In the next few posts, we’ll explore how to proceed if some or all of this information is available, as well as what to do if there’s little or no customer feedback to go on.

Tuesday, March 11, 2014

First Post - Welcome

There is only one valid definition of business purpose: to create a customer.  The customer is the foundation of a business and keeps it in existence.  He alone gives employment. 
-- Peter Drucker

Welcome, and thanks for visiting.  

For years now, I’ve been involved with several facets of customer experience – research, analytics, process design, management – and I’ve created this blog to share what I’ve learned about developing, implementing, and sustaining a customer experience initiative in an organization.

The blog is primarily intended for the CX manager or consultant who is looking for tips, advice, and research that will help with the management of their CX activities.  As such, think of this site as a collection of CX information from various books, websites and magazines, as well as my own perspective from my time in the trenches.

Customer experience can be overwhelming to the practitioner simply because it encompasses so many facets of an organization's activities.  Consumer research, website usability, mobile applications, retail processes, service centers, are but a few of the typical business functions that fall under the CX umbrella.  To help make some sense of these varied items, this blog will attempt to take a structured approach to presenting CX from the ground up.  That is, the first few postings will be devoted to discussing ways to understand the current situation in your company.  From there, we'll proceed to focus on understanding the various tools you can use to identify your customers' expectations and pain points.  As our comfort level with CX increases, we'll then move to posts that focus more on strategic considerations as your CX initiative evolves.

For this blog, I'm going to use Kerry Bodine and Harley Manning's explanation of "customer experience" as referenced in their excellent book, Outside In...

"CX is not...soft and fluffy, customer service, usability...

Customer experience is how your customers perceive their interactions with your company." (1)

For the sake of practicality, I'd like to emphasize the reference to "interactions" in the context of defining CX.  Theoretically, I suppose, almost everything an organization does touches on the customer in some way.  That said, I personally think it's a mistake to hold a company's CX function accountable for all direct or indirect customer-related activities...this would be overwhelming to even the largest and most capable of teams, and would likely add an unnecessary layer of bureaucracy to the company.  To optimize both the effectiveness of a CX initiative, as well as the abilities of the CX staff, I propose that the scope of CX be limited to those activities that are truly "interactive" between a customer and the company.  So, in-store purchases, calls to a service center, online visits, and social media postings are all examples of interactive events.  On the other hand, such things as most brochures, posters, and TV commercials that are more "passively" consumed, should be outside the bounds of CX.  This blog will focus on CX with this approach in mind.

I’m a big fan of the late Peter Drucker, and I think his definition of the purpose of a business is spot on…the customer should be at the center of every business activity.  I hope you find Think Customers useful, and I welcome your comments.

(1) Outside In, p.7