The answer could determine whether your first foray into automating customer responses is successful.
Comparing marketing messages in fast-moving markets is always difficult, and at IPsoft we are often asked how Amelia compares to this or that vendor. What makes these comparisons especially challenging is a limited understanding of what lies beneath the surface of natural language understanding (NLU) and other AI technologies.
Recently, my esteemed colleague Parit Patel wrote a very insightful blog on this topic, focusing on contextual comprehension in chat bots and virtual agents, with several practical examples. In this blog, I would like to take that discussion a bit further, but from a different angle. In my opinion, the simplest way to describe the difference between chat bots and virtual agents, and to help a potential adopter decide which is the better option, is to ask one single question:
What problem or use-cases is the vendor trying to solve with their solution?
There could be many answers, such as:
- Assisting customers with frequently asked questions (FAQs)
- Driving customers to hard-to-find web-pages for self-help
- Performing quick remediation of common problems
Because of our heritage with Virtual Engineers in IPcenter, which generally work autonomously, without human involvement, we decided to solve a much more complex challenge with Amelia.
We set out to build a Cognitive Platform capable of working as an Enhanced Human Agent servicing customers through natural conversations. I will get to the “Enhanced” part later, but let us first examine what it really takes to work as a human agent in the service center role.
First, you would need to understand the customer’s intent, or in essence pinpoint the exact nature of the customer’s problem.
Although this may appear to be a straightforward task, most customers are rarely direct in their explanations and tend to tell long stories before they get to the point of their request.
They also may not use the correct words to describe the situation or problem. So, a cognitive agent working as a human would need to:
- Disambiguate and clarify to get the right context of the problem.
- Perform co-referencing to understand what “it” and “they” actually refer to in the context of a conversation.
- Ask questions using determined context, and incorporate clarifying words and phrases from the customer’s previous responses.
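The three steps above can be sketched in miniature. The following toy Python example is purely illustrative, not Amelia's implementation: the intent lexicon, the naive pronoun resolver, and all function names are invented for this post.

```python
# Toy sketch of intent disambiguation, naive co-reference resolution,
# and clarifying-question generation. All names here are hypothetical.

INTENTS = {
    "reset_password": {"password", "reset", "login"},
    "unlock_account": {"account", "locked", "unlock", "login"},
}

PRONOUNS = {"it", "they", "them"}

def resolve_pronouns(words, last_entity):
    """Replace pronouns with the most recently mentioned entity (naive co-reference)."""
    return [last_entity if w in PRONOUNS and last_entity else w for w in words]

def interpret(utterance, last_entity=None):
    words = set(resolve_pronouns(utterance.lower().split(), last_entity))
    matches = [name for name, keywords in INTENTS.items() if keywords & words]
    if len(matches) == 1:
        return ("intent", matches[0])
    if len(matches) > 1:
        # Ambiguous: build a clarifying question from the candidate intents.
        options = " or ".join(m.replace("_", " ") for m in matches)
        return ("clarify", f"Did you mean {options}?")
    return ("clarify", "Could you tell me a bit more about the problem?")

print(interpret("I forgot my password"))                       # unambiguous intent
print(interpret("my login is broken"))                         # ambiguous: ask to disambiguate
print(interpret("can you reset it", last_entity="password"))   # pronoun resolved from context
```

Even this toy version shows why the combination matters: the pronoun must be resolved before keyword matching, and the clarifying question is built from the specific candidates that survived matching, not from a canned script.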
Secondly, you would need to deal with the complexities of a customer’s request, which is often non-linear and unpredictable.
- Revisiting and changing information already provided during the session, e.g. “I changed my mind, send me the yellow one instead.” This happens all the time in real-life customer interactions.
- Changing topics or asking questions in the middle of a standard process, e.g. “Do you accept AMEX?”, which needs to be handled seamlessly.
- Recognizing pieces of information provided earlier in a conversation which may need to be used later on as a part of resolving the customer’s issue.
Doing one of these things might be simple, but doing all of them at the same time becomes increasingly complex.
Thirdly, you would need to humanize the conversations to make them pleasant and guide the dialogue appropriately.
- Recognizing sentiment, both positive and negative, and using this emotional information to sympathize with the customer during a conversation.
- Using “social talk” to handle out-of-context situations and converse differently with customers who speak informally.
- Leveraging sentiment to make informed decisions about escalating to a human agent or offering to upsell the customer new services.
Again, all these inputs need to be analyzed, correlated and combined dynamically into the dialogue with the customer to create a seamless experience.
Finally, the “Enhanced” part, is all about performing a customer service function better than a human agent. There are some obvious and non-obvious aspects to this.
- It is obvious that a virtual cognitive agent can be faster than a human, especially because it can handle a request immediately and serve each customer exclusively. If you have ever chatted with an online human representative and waited 40-50 seconds for a response at every turn of the conversation, you will understand this “Enhanced” aspect of a cognitive virtual agent.
- To complete transactions, Amelia is integrated into backend systems and accesses information in milliseconds, scouring existing records to pinpoint what the customer is inquiring about, e.g. “Are you contacting us about the delayed flight and your connection?” This is something that advanced IVR solutions are also starting to do.
- Less obvious is what we call “Visually-Enhanced Dialogue”. For Amelia, this means she can control the UI (user interface), whether on mobile or the web, and thereby influence the user experience to guide customers and explain situations visually. The information Amelia presents can change dynamically based on the conversation and the options available.
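One way to picture visually-enhanced dialogue is an agent turn that carries a UI directive alongside its text reply, which the web or mobile client then renders. The payload shape below is entirely hypothetical, invented for illustration; the flight options echo the example earlier in this post.

```python
# Hypothetical sketch of a dialogue turn that bundles text with a UI
# directive for the client to render. The payload schema is invented.
import json

def reply_with_ui(text, component, options):
    """Bundle a spoken/written reply with a client-side UI directive."""
    return json.dumps({
        "say": text,
        "ui": {
            "component": component,  # e.g. a button group the client draws
            "options": options,      # dynamic, based on the conversation so far
        },
    })

turn = reply_with_ui(
    "Which flight are you asking about?",
    component="button_group",
    options=["AA 212 (delayed)", "AA 1450 (connection)"],
)
```

The key design point is that the options are computed per turn, so the visual layer stays in sync with the conversational state instead of being a static menu.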
In addition to the above, Amelia has several unique capabilities that make her more human, such as natural-language process creation (learning) through observation of other agents, and her Episodic Memory, which allows her to recall similar past conversations so she can answer queries and resolve issues faster and more accurately than before.
What is next for Amelia? See for yourself at our Inaugural Digital Workforce Summit in New York City.
Amelia’s latest enhancements, engineered in IPsoft’s Cognitive Innovation Laboratory, will be showcased at our Digital Workforce Summit on June 1st. The objective is, and continues to be, developing and incorporating those unique human nuances that are so understated in conversation, yet make all the difference to customers in live interactions. Amelia’s already-impressive cognitive capabilities, from her ability to learn through observation to her episodic memory, are having a profound financial and operational impact on businesses. It isn’t difficult to imagine the magnitude of Amelia’s current and potential business impact when you consider how she can unburden customer service departments from high-volume, low-level issues, and how cost-effective she is to train.
Early adopters of CVAs (cognitive virtual agents) like Amelia are creating a blueprint of standard best practices and business roadmaps for future implementers of this AI-driven technology. Some of these trailblazing companies will present what they’ve discovered at our Digital Workforce Summit. They’ll be relaying a belief that has gained momentum in recent times: the future of customer service lies in the coexistence and seamless collaboration of human and virtual agents.