To most of the external world, we are a call center platform; more specifically, we build call center change and insight software. Last week we had the opportunity to attend, speak, and really geek out at the AI Summit in San Francisco, the heart of technology innovation. If you can indulge me for the next few minutes, I would love to share my insights and thoughts.
Everybody’s Talking About AI, But Not Really Listening
There are a great number of AI technologies out there: facial recognition, drones, chatbots, knowledge management, dating, and others. Very few are actually involved in conversational AI. Simply put, no AI is listening.
Voice, specifically conversational voice, is the wilderness. In the AI world we call this unstructured conversational data. We were the only group at the AI Summit focusing on interaction and voice analytics. Why is this useful in a bot-centric world? It's all about the data: we have millions of voice calls to analyze. Really, no one else has this data set, and if you're a data scientist, once you have the data you can begin machine learning cycles. We have the data.
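To make the "machine learning cycles" idea concrete, here is a minimal sketch of how unstructured call transcripts might be turned into raw features for a model. The transcripts, labels, and bag-of-words approach are all hypothetical illustrations, not our actual pipeline.

```python
from collections import Counter

# Hypothetical call transcripts with outcome labels (illustrative only).
calls = [
    ("i want to cancel my subscription today", "churn"),
    ("thanks the upgrade worked great", "retained"),
    ("cancel my account this is the third call", "churn"),
]

def bag_of_words(text):
    """Turn an unstructured transcript into a word-count feature vector."""
    return Counter(text.lower().split())

# Tally how often each word appears per outcome - the raw material
# a machine learning cycle would train on.
word_counts_by_label = {}
for transcript, label in calls:
    word_counts_by_label.setdefault(label, Counter()).update(bag_of_words(transcript))

print(word_counts_by_label["churn"]["cancel"])  # -> 2
```

Toy as it is, the point stands: none of this is possible without the transcripts themselves, and that is the data set almost nobody in the bot world has.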
What is going on, and going on in spades, is text analytics leading to chatbots. Text is easy to get: Facebook is text, as are Wikipedia, tweets, and Google News. So natural language processing is advancing rapidly in the text space. I counted many chatbot providers, all text based. You type, they type back. Clever ones can speak back, like Alexa and Siri. But they are struggling to understand human interaction.
What the Chatbot heard
I spoke to a lot of chatbot companies, and to many people looking to buy chatbot technology for their business. The question I asked them all was, "What does your chatbot know, and how does it learn?" The answer, once I got through the AI buzz-speak of "learning systems and machine learning algorithms," was essentially: whatever it is told or can see and hear.
What became obvious was that our data and technology pick up exactly where a chatbot, IVR, IVA, or whatever technology is used to engage a customer without a human, leaves off. More precisely, where it fails. If you want to build a call-supplement technology, you should probably know what is happening in your calls.
Replacing the unknown
Without knowing all the words, sentiments, and context of calls, it is impossible to replace them with other channels, be it chat, social, or automated anything. It's like being asked to replace everything in a pitch-black room: it would take a while. With CallMiner, we can shed light on the entire room, help you understand what to replace, and tell you when it fails. We can help a call-replacement technology learn and succeed.
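As a toy illustration of what "shedding light on the room" could mean, here is a hedged sketch that scans transcripts for topic keywords to estimate which calls a bot might handle and which still need a human. The topics, keywords, and transcripts are invented for the example and are not CallMiner's actual method.

```python
# Hypothetical topics a replacement channel (e.g. a chatbot) can handle,
# each defined by a set of trigger keywords.
BOT_READY_TOPICS = {
    "balance": {"balance", "statement"},
    "hours": {"hours", "open", "closed"},
}

transcripts = [
    "what time are you open on saturday",
    "i need my account balance and last statement",
    "my loan application was rejected and i want to know why",
]

def classify(transcript):
    """Return the first bot-ready topic matched, or None if a human is needed."""
    words = set(transcript.lower().split())
    for topic, keywords in BOT_READY_TOPICS.items():
        if words & keywords:
            return topic
    return None

results = [classify(t) for t in transcripts]
automatable = sum(r is not None for r in results)
print(f"{automatable} of {len(transcripts)} calls look automatable")  # 2 of 3
```

The third call matches nothing, which is exactly the kind of failure you want to know about before, not after, you deploy a bot.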
So what is on the AI horizon?
Well, in AI it's accuracy and product deepening: better everything they do now. Most AI is incremental in nature right now, but there are some real innovations out there, and I count us among them. Drones to herd cows and sheep? Pretty cool. Deep ML is pretty amazing, and the new CNN chips might just make self-driving cars really work. For us, call genomes and full conversational context are on the horizon, the very near horizon. Prediction models for outcomes are right around the corner. But to be fair, I'll save that for another post.
Know that AI is rapidly innovating and going mainstream. Did you see how many commercials during the weekend's sports broadcasts had AI in them? I counted five, which is a lot. But AI innovation comes down to data and use cases. Thank you to all who are innovating to solve your next problems and improvements with the speed of AI.
Talk to us about how to take your unstructured voice data and do amazing things with it!