em·pa·thy | \ ˈem-pə-thē \
noun
Definition: the ability to understand and share the feelings of another.
Customers rarely contact a brand to say how satisfied they are with its service or product. On the contrary, they usually contact the brand’s Customer Service because the purchased service or product has fallen short of expectations and must be corrected, repaired, replaced or refunded. It’s important that the support agents on the receiving end can relate to, and appropriately respond to, the customers’ disappointment or frustration. Research consistently shows that the quality of customer service is often a deciding factor when customers choose a brand.
Perhaps the most important skill for customer care agents to possess is empathy. However, when agents are bombarded with complaints and problems day in and day out, expressing sincere empathy can become difficult. It’s easy to fall into a rut, and customers quickly sense negativity in their interactions with customer care, even when it’s unintentional.
This is particularly the case over digital channels such as email, chat and messaging, where it’s not easy to inject tone and expression into the typed word. Too often, distraught customers receive a canned answer along the lines of “I am sorry to hear that you are facing this issue,” with no real sincerity behind it; it is ‘typed’ so quickly that it’s clearly a predefined response.
Offshore call-centres are now frequently employed as cheaper alternatives to home-grown support and, more often than not, the agents are conversing in a language that is not their first. While we must admire the advisors for their multilingual abilities, truly understanding a customer’s frustration and expressing adequate empathy in a foreign language is not always easy. The subtleties of language compound the problem: sarcasm, irony and other nuances can be hard enough to recognise in our first tongue, let alone in a second. Often, a brand will provide empathetic phrases for agents to use, but unless these are adapted to fit each situation, the exercise fails, and a mechanical response is often worse than expressing no empathy at all.
So, in a world where automation is now commonplace, could a chat bot show the same empathy as a human being? Some clearly believe it’s possible. In 2022, a Google engineer was placed on leave after claiming that a chat bot he was working on was as sentient as a seven- or eight-year-old child.
In 1950, mathematician and codebreaker Alan Turing devised a test, originally called the imitation game, to assess a machine’s ability to exhibit intelligent behaviour indistinguishable from that of a human. Despite major advances in artificial intelligence in the seven decades since, no computer has convincingly passed the Turing test. Perhaps the quality that is hardest to emulate is empathy.
Empathy isn’t something we learn entirely; it also depends on our genetic make-up, as studies comparing identical and non-identical twins have indicated. Research has also found that women tend to score slightly higher than men on measures of empathy.
But beyond the genetic factors, the ability to recognise another person’s thoughts and feelings, and to respond to them with suitable emotion, is also the result of upbringing, social interaction and experience, including loss, death and hope: experiences that a bot could never have.
Chat bots are really only as smart as the humans who program them. At its simplest level, natural language understanding uses software to interpret user input, whether typed or converted from speech to text, as whole sentences rather than isolated keywords. More advanced natural language processing tools also take context into account. Based on what they have understood, and on the algorithms running behind them, they can then ‘make decisions’ about how to respond.
When using a platform’s AI (Artificial Intelligence), there is really no self-learning: a considerable amount of human intervention is always needed, whether to train the input recognition or to improve the algorithm.
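To make this concrete, here is a minimal, purely illustrative sketch of intent recognition in Python. It is not any particular vendor’s platform, and the intents, example phrases and responses are all invented for the example. Notice that if a customer phrases something the examples don’t cover, the bot falls back to a generic reply until a human adds more training phrases; the bot never extends itself.

```python
# Hypothetical intent recognition: match an utterance against
# human-written example phrases. All intents and phrases are invented.
TRAINING_EXAMPLES = {
    "report_fault": [
        "my phone has died",
        "the screen is broken",
        "it stopped working",
    ],
    "request_refund": [
        "i want my money back",
        "please refund my order",
    ],
}

RESPONSES = {
    "report_fault": "I'm sorry to hear that. Let's get it fixed.",
    "request_refund": "I can certainly help you with a refund.",
    "fallback": "Could you tell me a little more?",
}

def classify(utterance: str) -> str:
    """Score each intent by word overlap with its human-written examples."""
    words = set(utterance.lower().split())
    best_intent, best_score = "fallback", 0
    for intent, examples in TRAINING_EXAMPLES.items():
        for example in examples:
            score = len(words & set(example.split()))
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent

# The bot 'decides' what to say purely from the matched intent.
print(RESPONSES[classify("my phone just died")])        # report_fault reply
print(RESPONSES[classify("refund my order please")])    # request_refund reply
print(RESPONSES[classify("hello?")])                    # fallback: no human wrote an example
```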
Then we could add a layer of sentiment on top of this. While the phrase “I am very sorry to hear that” could work well as a response to both “my phone has died” and “my mother has passed away,” it doesn’t mean the bot is actually feeling anything in either situation, simply that it has picked up on a keyword and been given something adequately empathic to say that covers all eventualities. Going further, the bot might be trained to notice whether the phrase ‘has died’ appears alongside an inanimate object (a phone) or a person, and to apply empathy in the first case and sympathy in the second.
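As a hypothetical sketch of that distinction, and assuming simple hand-written word lists stand in for the far richer animacy detection a real system would need, the sentiment layer might look like this:

```python
# Toy 'sentiment layer': the bot feels nothing, it only looks up words.
# The word lists and replies are invented for the example.
INANIMATE = {"phone", "laptop", "battery", "tablet"}
PERSON = {"mother", "father", "mum", "dad", "friend", "grandmother"}

def respond(message: str) -> str:
    words = message.lower().split()
    if "died" in words or "passed" in words:
        if any(w in PERSON for w in words):
            # Sympathy: the customer has lost someone.
            return "I am so sorry for your loss. Please take all the time you need."
        if any(w in INANIMATE for w in words):
            # Empathy for a frustrating, but fixable, problem.
            return "I am sorry to hear that. Let's see how we can get it working again."
    return "I am very sorry to hear that."

print(respond("my phone has died"))           # empathy branch
print(respond("my mother has passed away"))   # sympathy branch
```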
There are several software programs that cue contact centre agents when they are not showing enough empathy in their calls. The program recognises certain language and tone and reminds the agent to ‘relate to the customer.’ Again, the software is not feeling or relating itself; it is recognising the voice patterns to which it has been taught to respond. The danger here, it is argued, is that if we begin to rely on machines to tell us when to be empathic, we may cease to feel that emotion naturally in customer care interactions where we are distanced from the situation.
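A toy sketch of such a cue, assuming simple word lists stand in for the tone and language recognition a real product would perform on live audio, could look like this:

```python
# Hypothetical agent-assist cue: the software counts patterns it has been
# configured to watch for; the marker lists here are invented.
import re

EMPATHY_MARKERS = {"sorry", "understand", "appreciate"}
NEGATIVE_MARKERS = {"angry", "ridiculous", "unacceptable", "furious"}

def tokens(text: str) -> set[str]:
    return set(re.findall(r"[a-z']+", text.lower()))

def empathy_cue(customer_text: str, agent_text: str) -> str | None:
    """Return a reminder if the customer sounds upset and the agent has
    used none of the configured empathy markers."""
    if tokens(customer_text) & NEGATIVE_MARKERS and not (tokens(agent_text) & EMPATHY_MARKERS):
        return "Cue: relate to the customer."
    return None

print(empathy_cue(
    "This is unacceptable, I have been on hold for an hour.",
    "Your ticket number is 4521. Anything else?",
))  # -> Cue: relate to the customer.
```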
If empathy is a human quality that is partly in our genes and partly the result of life experience, then it will be impossible to confer that complete quality on a bot. However, as natural language processing improves, it may soon become harder to tell whether we are talking to a bot or to a human. While not feeling empathy itself, the bot will be able to detect mood and tone, understand speech patterns and, drawing on all of these, respond appropriately.
Streamline your customer support, sales and marketing through conversational AI and chatbots.
Get a Free Demo Today