Brian LaRoche of CallMiner and Steve Ruszczyk of Praxidia answer three common questions organisations ask them before implementing speech analytics.
1. Do you feel implementing a speech analytics solution should be done in phases, i.e. a department or location at a time, or should you deploy it organisation-wide out of the gate?
I don’t mind having a long-term horizon for where I want to be, because that is like a north star to help navigate. But I am a big fan of incremental progress. What really thrills me is the advanced analytics: getting the plumbing of that ingestion of text and voice out of the way, then moving to the right of the value chain into customer journey mapping, brand protection, and becoming predictive and prescriptive. And, in a sense, moving the future into real time so we can affect outcomes.
That is the stuff that really thrills me, but to go after it all at once is a recipe for disaster. I want to be able to claim quick and multiple wins, and to keep winning. I am a big fan of realistic but aggressive goals. I would say pick some things that you know are low-hanging fruit. If you are doing manual QA, there is a lot to be had in the first six months to a year of operations that will give you the political capital, and even the financial capital, to do bigger and better things.
Steve, I’m right there with you. Our mantra is start small, plan big. It is really important to get your feet wet with analytics. Start off with a department or a group, maybe a subsection of your team. Or, if nothing else, start with only one or two benchmark objectives for your deployment of this technology.
I am fond of saying the best thing about speech analytics and interaction analytics is it does a million things. The worst part about interaction analytics is it does a million things. If you try to boil the ocean and do everything right out of the gate, you will fail. You just can’t serve all of the masters. You can’t deal with all of it. There is no easy button with speech analytics. It requires dedicating some resources and some staff to it.
In a recent webinar, Steve talked about how they shifted some personnel out of other roles and reallocated them to the analytics programme, with some becoming programme owners. Make sure you have the right staff in place, start with realistic, achievable goals in mind, and then build on that success.
2. How long would it normally take from an initial deployment before an organisation starts to see real results from your analytics programme?
Before we launched CallMiner, we had used a platform that was about eight years old and took almost a year and a half to get installed and running. Quite frankly, it never actually delivered on its promise. There were configuration issues that allowed only parts of the product to run. We never did get to some of the call features and other things we wanted to have.
So, when we first implemented CallMiner, I had surveyed about 12 different providers in the market, quickly narrowed that to around four, then to two, and compared the final two candidates on how long they took to implement. One took 14-16 weeks just to set up the environment; the other took ten days. One took months to set up the lexicons and categories, while CallMiner had pre-built content out of the box.
What I tell my clients these days is that it takes 30 days from the time an order gets processed to the time we are ingesting calls and are operational. From day 30, we are actually ingesting calls and getting smarter every day.
Easily within the first week of operation, we have our first set of data, which to me is a bunch of hypotheses, or hints of what is going on. Another 30 days in, I can start showing some real results. Certainly, within the first 90 days, everybody should be very strong operationally.
The only caveat I have is that you need someone who understands analytics. Training somebody from the very beginning on what analytics even is, checking whether I am going too fast for anybody, is very different from working with somebody who has been doing analytics for five years, even on a different platform; with them, I am not trying to catch them up on what it is all about.
So, assuming you have the right partner, and maybe you can get some professional services from outside or you have some experience in-house: if you’re going to be developing your own resources from zero, it takes longer. If you’re using somebody with professional services to guide you along, you can literally be seeing results progressively at 30, 60 and 90 days.
When Steve deployed CallMiner, we were amazed at how quickly they were able to see real, significant results. Because Steve’s organisation had background and experience with another programme when we introduced ours, they were able to hit the ground running and were seeing results in very short order. Statistically, if you’re starting a programme from scratch, Steve’s second estimate of 90 days to six months is typically when you will start to see significant results: your programme starts to mature, and your analyst or programme owner has a stronger sense of things.
3. What technology do you employ for text analytics?
CallMiner’s technology is multichannel, so it’s the same platform for text as it is for chat, email, or even surveys. There is nothing extra you have to bolt on, other than adding the text analytics software licence; it’s all part of the same platform.
By the way, that’s the same reason the technology enables journey analytics. You can map the customer’s interactions, multiple interactions, whether it’s a voice call, a chat, an email or a combination thereof, because it’s all being analysed by the same platform.
I just want to add that there’s a concept I haven’t discussed today on the interaction management front. Because of our experience, we actually have two market approaches.
A third of our company came from market research: surveys, profiling customers, predicting the likelihood to buy, keep or churn, whatever the case may be. My side, which came from the contact centre, was about the interactions themselves: calls, chats, emails, log files, whatever the case may be. More of a near-real-time type of thing.
In a sense, you can look at engagement analytics as an ingestion engine that takes unstructured data, the calls, the chats, the emails and so forth, brings it in and does this cool thing of redacting PCI data and other stuff you don’t want in your data centre. Then it normalises the calls and chat sessions, tags them, categorises them, and gives you insights and scores.
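That ingestion flow, redact, normalise, tag, score, can be sketched roughly in code. Everything below is an illustrative assumption, not CallMiner’s actual implementation: the card-number pattern, category keywords and scoring weights are invented for the example.

```python
import re

# Illustrative sketch of the ingestion steps described above: redact
# sensitive data, normalise the text, tag/categorise it, and score it.
# The pattern, keywords and weights are invented, not CallMiner's.

CARD_RE = re.compile(r"\b\d(?:[ -]?\d){12,15}\b")  # crude card-number pattern

CATEGORIES = {
    "billing": {"bill", "charge", "invoice", "refund"},
    "churn_risk": {"cancel", "switch", "leaving"},
}

def redact(text: str) -> str:
    """Mask anything that looks like a payment card number (toy PCI redaction)."""
    return CARD_RE.sub("[REDACTED]", text)

def normalise(text: str) -> list[str]:
    """Lower-case and tokenise the transcript."""
    return re.findall(r"[a-z']+", text.lower())

def tag(tokens: list[str]) -> set[str]:
    """Assign every category whose keywords appear in the interaction."""
    return {name for name, kws in CATEGORIES.items() if kws & set(tokens)}

def score(tags: set[str]) -> int:
    """Toy priority score: churn risk outweighs a billing query."""
    return 10 * ("churn_risk" in tags) + 5 * ("billing" in tags)

def ingest(raw: str) -> dict:
    """Redact, normalise, tag and score one interaction."""
    clean = redact(raw)
    tags = tag(normalise(clean))
    return {"text": clean, "tags": tags, "score": score(tags)}

call = "I want to cancel, there is a double charge on card 4111 1111 1111 1111"
print(ingest(call))  # card number redacted; tagged churn_risk + billing, score 15
```

The point of the shape is that redaction happens before anything is stored or scored, which mirrors the “don’t let PCI into your data centre” ordering described above.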
There are applications that then use that for QA, complaint management, regulatory compliance, and on and on. In our environment, we also have predictive analytics that feeds different things like HR hiring tools, predictive analytics around churn, keep, buy, sell and so forth.
And then we have an enterprise feedback management system that does customised surveys based on the scoring of the call that we think just happened. There is no sense in giving somebody a nice, happy survey asking how their call was when we already know they were furious and are going to call the CEO. That just makes them even madder.
We customise things. Our predictive models bring in e-commerce data, CRM data and billing data. I’m feeding that in and making bigger, wider decisions at a larger scope, using their API to integrate, and that is all cool stuff. We are using CallMiner as a near-real-time engine to feed bigger downstream processes.