There is a lot of talk today about customer experience management, but use of the term is vague, much as customer relationship management meant different things to different people. For some it is much the same as CRM; for others it is about using the voice of the customer to gain insights that inform customer-related decisions. I have another view. Let’s consider phone calls, which, according to my research into the use of technology in contact centers, are still the main way consumers interact with companies. What makes these calls good or bad experiences? Being driven mad by a badly designed IVR system and waiting in a queue both come down on the “bad” side, and companies should do something about them. Those aside, my research into agent performance management (APM) shows that the majority of companies correctly believe that how their agents deal with calls makes the difference. So it makes sense for companies to take action to help agents deliver good experiences. Here are four things they can do.
First, companies should route each call to the person most qualified to resolve the issue to the caller’s satisfaction and in the company’s best interest. This means moving beyond simple routing and adopting smart systems that route based on the caller’s profile, the agent’s skills, the context of the call and which agent is most likely to achieve the desired business outcome.
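To make the idea concrete, here is a minimal sketch of how outcome-based routing might score agents. The data fields, weights and sample values are my own illustration, not taken from any particular routing product.

```python
# Illustrative sketch only: a simplified scoring approach to "smart" routing.
from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    skills: set            # e.g. {"billing", "retention"}
    outcome_rate: float    # historical rate of achieving the desired business outcome

@dataclass
class Call:
    required_skill: str    # inferred from IVR input or the caller's history
    caller_value: str      # e.g. "high" or "standard", from the caller's profile

def route_call(call: Call, agents: list[Agent]) -> Agent:
    """Pick the agent most likely to resolve this call well for this caller."""
    def score(agent: Agent) -> float:
        skill_match = 3.0 if call.required_skill in agent.skills else 0.0
        # For high-value callers, weight the agent's track record more heavily.
        outcome_weight = 2.0 if call.caller_value == "high" else 1.0
        return skill_match + agent.outcome_rate * outcome_weight
    return max(agents, key=score)

agents = [
    Agent("Ana", {"billing"}, 0.82),
    Agent("Ben", {"billing", "retention"}, 0.67),
]
print(route_call(Call("retention", "high"), agents).name)  # prints "Ben"
```

The point is not the particular weights but that routing decisions draw on the caller, the agent and the likely business outcome rather than on queue position alone.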
Second, companies should train agents better and help them follow the best practices exhibited by the most effective agents. I’ll come back to the “help” aspect, but from a training perspective, companies have to move away from a “one size fits all” approach and deploy processes and systems that can focus on individual weaknesses and requirements.
Third is where the “help” comes in. Companies should make the agent’s desktop system easy to use and gently lead agents into following best practices. My research into APM showed that in the majority of companies the agent desktop can only be described as a mess. Some agents have more than 20 systems open at any one time; they have to know which screens to use for each step, have to jump between systems (often repeating what they have just done in the previous system) and get little guidance on how to answer questions, much less advice when trying, for example, to sell additional products or services. So it is no wonder that many callers have bad experiences and many issues go unresolved. The maturity model I developed as part of my APM research shows that innovative companies have spotted what they can do to help – implement a smart agent desktop.
The least mature companies, which we call Tactical, do nothing to improve the desktop and let their agents struggle with what they have (if they stay in the job). At the next level of maturity, Advanced, companies have implemented what was called a unified desktop when it was new. These systems typically did three things: they hid the other systems behind a new, more intuitive user interface; followed the typical flow of different call types; and automated the updating of data across several systems. At the highest two levels of maturity, companies have deployed increasingly smart desktops. These typically include the features of a unified desktop but go further, using rules-based systems to pop messages or new screens onto the desktop that automatically provide the required information and advise the agent on what to say or do next, within the context of the customer’s profile (including, for example, previous interactions and products bought) and the customer’s relationship with the company (such as high or low value, or the net promoter score). In this way a smart desktop guides agents to follow established best practices.
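As a rough illustration of how such rules-based guidance might behave, here is a small sketch. The rule conditions, field names and messages are hypothetical and chosen only to show the pattern of matching a caller’s profile to the next prompt an agent sees.

```python
# Illustrative sketch only: how a rules-based desktop might choose the guidance
# to pop onto an agent's screen. All rules and data fields here are hypothetical.

def next_prompt(customer: dict, call_reason: str) -> str:
    """Return the guidance an agent should see for this caller and call type."""
    rules = [
        # (condition on the customer's profile, message shown to the agent)
        (lambda c: c["value"] == "high" and call_reason == "cancel",
         "Offer the retention package before processing a cancellation."),
        (lambda c: c["nps"] is not None and c["nps"] <= 6,
         "Detractor: acknowledge past issues before any upsell attempt."),
        (lambda c: "broadband" in c["products"] and call_reason == "billing",
         "Billing query: confirm the latest broadband invoice on screen 2."),
    ]
    for condition, message in rules:
        if condition(customer):
            return message
    return "Follow the standard call flow."

customer = {"value": "high", "nps": 5, "products": {"broadband"}}
print(next_prompt(customer, "cancel"))
```

A real product would manage far richer rules and surface whole screens rather than one-line prompts, but the principle is the same: the desktop, not the agent’s memory, carries the best practice.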
There are many products now on the market that include some or all of these capabilities. To help customers decide which best suit their needs, we created the Ventana Research 2011 Customer Experience Management Agent Desktop Value Index. This report evaluates most of the vendors of agent desktop products in terms of seven standard categories: usability, manageability, reliability, capability, adaptability, vendor validation and TCO/ROI. Based on a scorecard driven by our CEM research model, we ranked the 10 vendors that took part as Frigid, Cold, Warm or Hot. Four received our Warm ranking: Riverstar, Altitude, Genesys and SmartPoint, while the other six achieved the top-level Hot ranking: Jacada, salesforce.com, Cincom, Cicero, OpenSpan and number-one-ranked vendor Upstream Works. Overall, the last four listed here scored within two percentage points of each other, and each represents an example of a smart desktop.
The fourth step companies should take to deliver good customer experiences is to put better information in the hands of people making decisions, whether at a strategic level or in the contact center as agents handle calls. My research into contact center analytics shows that most companies are not very mature in the use of analytics and metrics within their contact centers. The majority rely on spreadsheets to generate analytics, and only a few at the highest Innovative maturity level have realized the benefits of using one of the dedicated analytics products now on the market.
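To show what moving beyond spreadsheets can look like, here is a simple sketch that computes two common contact-center metrics, first-contact resolution and average handle time, straight from interaction records. The field names and sample data are invented for illustration; a dedicated analytics product would of course do far more.

```python
# Illustrative sketch only: deriving contact-center metrics from interaction
# records rather than hand-built spreadsheets. Field names are hypothetical.
from statistics import mean

calls = [
    {"customer": "C1", "handle_secs": 310, "resolved_first_contact": True},
    {"customer": "C2", "handle_secs": 540, "resolved_first_contact": False},
    {"customer": "C3", "handle_secs": 420, "resolved_first_contact": True},
]

first_contact_resolution = mean(1 if c["resolved_first_contact"] else 0 for c in calls)
average_handle_time = mean(c["handle_secs"] for c in calls)

print(f"FCR: {first_contact_resolution:.0%}, AHT: {average_handle_time:.0f}s")
```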
At a time when it is more important than ever to retain customers and gain the maximum business benefit from those relationships, companies have to pay attention to the customer experience. These four steps, especially improving the agent desktop, are vital to business success, and I recommend you take a long, hard look at how you can improve call handling. I’d love to hear your views and any plans you may have to improve the customer experience.
Regards,
Richard Snow – VP & Research Director