The Quality Problem: Good Advisors Stay Good – Average Advisors Stay Average



The performance of contact centre advisors can vary greatly, in terms of both approach and style, even when handling the same call types.

This is according to Daniel Ord, the Founder and Director of OmniTouch International, who also says that “individual performance, over the course of ‘days’ and ‘times of day’, remains remarkably consistent.”

“I’ve been doing a number of call audits over the past few weeks and I have noticed that if the advisor is good, they tend to stay good. But if they’re mediocre in one or two calls, it’s likely that they’re mediocre all the time.”

“Practice does not make perfect. Practice makes permanent.”

While these thoughts are simply based on Daniel’s observations, is this a signal that the approach to quality in the contact centre needs to change? And, if so, what can be done?

If the advisor is good, they tend to stay good. But if they’re mediocre in one or two calls, it’s likely that they’re mediocre all the time.

Daniel Ord

Let’s Start with Recruitment


As Daniel says good advisors stay good and mediocre advisors stay mediocre, fine-tuning the recruitment process to identify the potentially good advisors seems the ideal place to start when looking to boost quality.

While some contact centres may struggle to find staff, there is an argument that holding a conversation and sharing correct information are basics that the contact centre should get right at the recruitment stage.

The fundamental principle of quality is to find broken processes and lagging issues, with the aim of improving the customer experience, not to retrain advisors in the basic principles of conversation.

DPD’s contact centre runs recruitment days instead of the standard interview process.

These involve introducing potential recruits to their future work environment, holding open Q&A sessions, and putting candidates in situations where their communication skills and maturity are tested.

[For more on DPD’s recruitment days, read our article: 15 Things You Can Learn from the DPD Contact Centre]

Consider Moving Away from Tick-Box Exercises

There is a strong argument for contact centres to move quality away from a box-ticking exercise and more towards a coaching and behaviour change environment.

In fact, many contact centre experts believe that those in the industry need to do a lot more to share what best practice looks like or, in a contact centre environment, sounds like.


Dave Salisbury is keen to make this point, saying: “Make sure to define ‘good’. What is the standard that makes a ‘good call’?

“When your advisors know this, allow them the freedom to explore and personalise.

“Ditch the canned phrases; promote agency and the flexibility to personalise, and ensure that the advisor knows they are free to make decisions and act in the best interests of the customer.”

Scripts Might Be Reinforcing Mediocrity


According to Philip Bennett, a Customer Service Operations Manager at Empire-Today, “too often companies reinforce mediocrity with their scripts and QA processes, forcing advisors to use verbiage that isn’t natural or that actively keeps the advisor from delivering excellent service.”

When advisors rigidly stick to a script, it can give the interaction a robotic tone, which can reduce quality in the eyes of the customer, even if not in the eyes of the company.

However, while ditching the script may improve tone, many companies will feel uneasy about getting rid of scripts completely, as they help guarantee consistency in showcasing the brand’s values, goals and so on.

So, as a solution, Mary Eplin, a Quality Analyst, recommends the process that her contact centre uses.

“We do not provide them with exact wording; we do provide them with suggestions and alternative word choices, and guide them on where empathy should have been offered.

“We do offer a scripted greeting, as we require them to provide specific information in the greeting and closing, but accept variations so the advisors will feel more comfortable delivering it.”

So, giving advisors the freedom to self-learn on the phone may be a useful long-term strategy for boosting quality. Providing a document of greeting phrases, empathy statements and acknowledgement statements that advisors can refer back to at will could also be a good addition, creating a blended approach.

It May Help to Rethink How Advisors Are Promoted


There will be contact centres out there that promote advisors into team leader roles even though those advisors may not be high performers themselves.

In fact, Adam Cunningham, CEO at ALKHEMY, is sure of it, saying: “I have seen, on countless occasions, poor-performing advisors become leaders relatively quickly, because they have had the right attitude, coaching framework and consistent QA.”

But does the team leader need to be a good performer if they have other strong attributes?


Benel Pilar, a Customer Experience Expert, seems to think so, noting: “I’ve always espoused performance variation of agents as an indicator of team leader effectiveness.

“Advisors of a good team leader will have similar performance, as good practices are shared by the supervisor across the team.”

Remember that Quality Analysis Is Primarily a Coaching Process

An increasing number of contact centres are deploying approaches such as sharing customer feedback with the advisor who handled the call, to help advisors self-learn. However, quality monitoring should not be used primarily as just another of these feedback strategies.

For more ideas on helping advisors to self-learn, including using self-scoring, have a read of this article on How to Get Advisors to Buy In to Your Quality Assurance Programme.


This is according to Simon Blair, a sales and service telephone coach, who says that too many contact centres are using quality analysis as a method of giving feedback, but are not training advisors to act on that feedback.

In fact, Simon Blair says that “the point of quality monitoring is to fuel the coaching process. Yet most managers don’t coach and default to feedback only.”

This “combines a lack of a coaching culture and poor communication expertise, allowing mediocrity, or worse, to reign supreme.”

Think About How Calls Are Selected for Quality Monitoring

Many contact centres select calls for quality monitoring on a random basis, choosing to listen to a certain number of calls per advisor each month.

However, Sarah Kennedy, a Call Centre Strategist, believes that a certain number of calls should be monitored per month on an organisation level, not an advisor level.

So, for example, instead of monitoring six calls per advisor each month, the contact centre would monitor 400 random calls of a certain type each month.

Why?

Well, Sarah Kennedy says that this will allow the contact centre to “redirect those tenured skills of QA staff to doing more forensic QA – QA on advisors that are struggling, or call types that always get a failing customer grade, etc.

“Coaching is then ‘fed’ by surveys and side-by-side observations, freeing it from the tyranny of performance metrics.”

Make Sure Quality Analysis and Customer Experience Are Not at Odds

A recent study from BrainFood Extra found that 75% of UK contact centres are under external regulation, where the whole approach to quality is weighted towards conformance, so feedback language is peppered with ‘breaches’, ‘fails’ and ‘mark down’.

This, according to Martin Hill-Wilson, is “the very antithesis of learning how to connect and engage authentically, which is what a customer experience agenda hopes for.


“Ironic then that quality and customer experience are often at odds in terms of agendas, which puts the frontline teams in a conflicted ‘no-win’ position.

“One compliance head was said to have retorted that they were not paid to care about the customer experience, just the avoidance of punitive judgement against the brand.

“QA is an interesting lens through which to view the mosaic of tribal cultures that make up a typical organisation. Getting it to a point of true effectiveness is therefore a long journey.”

Make Customer Emotion the First Thought of Quality Analysis

Research from Temkin Group has recently found that emotion has a greater effect on customer loyalty than success or effort.

With this being the case, Mary Eplin says the “first thought” of the contact centre in which she runs quality analysis is: “How do we think the caller felt at the end of the call?”

“Did the agent give them ‘homework’ creating a callback? This is where we start.”


Other things that Mary Eplin looks out for include: “Did the advisor meet the fundamentals? Greet the caller, close the caller, and meet the requirements for the call type?”

“How was the tone set? Did the caller feel as if they were a number not a person? Did the advisor really answer the caller’s question?”

“If things like these are missed, even though the fundamentals are met, the call will not pass muster, and we will positively reinforce, through coaching, better word choices, upbeat tones and working together to recognise what was done properly.”

However, some of these questions are subjective, meaning that it is very important to calibrate quality scores and/or feedback.

[To find out more, read our article: How to Calibrate Quality Scores]

What else can be done to improve quality in the contact centre?

Please leave your thoughts in an email to Call Centre Helper.

Author: Robyn Coppell

Published On: 4th Oct 2017 - Last modified: 18th Mar 2024
Read more about - Customer Service Strategy

