Recorded Webinar: Executive Briefing on Call Centre Quality
A recorded webinar on Call Centre Quality, along with all of the questions and answers.
Call quality monitoring can create a closed-loop system that catches problems at an early stage and alerts senior management to serious errors. This can improve both customer service levels and the whole customer journey.
But many call centres still use paper and Excel-based call monitoring forms. This is both inefficient and inadequate for regulated operations.
Our panel of experts share how call centre quality monitoring can improve customer operations.
- Introduction – Jonty Pearce, Call Centre Helper
- Call Quality Monitoring Overview – Chris McIlduff, Ember Services
- Overview of Technology Solutions – Bernie Kane, Infinity CCS
- Interactive Questions and Answers
Topics to be discussed:
- The importance of measurement
- The role of Call Centre Quality
- Customer feedback processes
- Quality assessments
- The role of analytics
- Getting Quality on the board agenda
- Overview of the technology solutions
This webinar is provided by Call Centre Helper and is sponsored by Infinity CCS.
Questions & Answers:
Q1: In which form should the rewards / bonus be?
Chris – A: If I understand the question correctly, it's impossible to answer generically, as every organisation has different budgets, conditions and access that will affect the rewards and bonuses it can offer. What we would recommend, however, is that where rewards and bonuses are used they should not be awarded for average or standard performance, but should be used to recognise great performance or evidence of having created a superior customer experience. If the question is about what form rewards should take – such as cash, vouchers, food or wine – then we would again say this needs to be considered on a case-by-case basis.
Q2: How do they recommend that the calls that are selected for scoring by agent are representative (i.e. of the good, bad and ugly that Lisa mentioned) and statistically relevant?
Lisa – A: We would use such things as time of day and length of call, and try to ensure as wide a range as possible is used.
Bernie – A: Infinity QA randomly selects calls from either all calls or a filtered list of calls. Standard features allow filtering by the call disposition code, so if the outcome of the call is sufficient to tell you which calls were good, bad and ugly then that will suffice. If it is other information captured during the call, we can modify the system to use that piece of data.
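The selection approach Bernie describes – a random draw, optionally filtered by disposition code – can be sketched in a few lines of Python. The call records and disposition codes below are hypothetical examples, not Infinity QA's actual data model:

```python
import random

# Hypothetical call records: (call_id, disposition_code)
calls = [
    ("C1001", "sale"),
    ("C1002", "complaint"),
    ("C1003", "no_sale"),
    ("C1004", "complaint"),
    ("C1005", "sale"),
]

def select_for_qa(calls, sample_size, dispositions=None):
    """Randomly pick calls for assessment, optionally filtered
    by disposition code (the outcome logged at call wrap-up)."""
    pool = [c for c in calls
            if dispositions is None or c[1] in dispositions]
    return random.sample(pool, min(sample_size, len(pool)))

# Pull one 'complaint' call at random for quality review
picked = select_for_qa(calls, 1, dispositions={"complaint"})
```

Filtering before sampling, rather than after, is what keeps the draw unbiased within the chosen outcome category.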
Chris – A: Lisa made a number of great points in her presentation and the one about the good, the bad and the ugly things we find is very true. The facts are that without the use of advanced technology (such as analytics) organisations tend to focus on measuring a certain number of contacts per advisor for a given period and will adjust the number of advisor assessments depending on how well, or not, they have scored previously.
Whilst this systematic approach is common and quite effective, it tends to focus mainly on conformance and often, as we said in our presentation, on things that don't matter.
Our recommendation for customers always starts with reconsidering the things that matter and making sure that it's these that are measured. If this is done, then the systematic approach, combined with the factors Lisa mentioned (time of day, length of call), call outcomes and perhaps customer feedback, will let us retrieve calls that are representative of the advisor's week, identify problems, and also surface great contacts that let the business focus on actions that make a real difference.
In terms of the statistical significance of the sample, this can be a complex area. What's key is generating a big enough sample to improve the confidence level we have in the scoring. If anyone wants to discuss this in more detail then we'd be happy to meet up and discuss how Ember could help.
The reality, though, is that if you are monitoring 4–10 calls per advisor per month, this is unlikely to be anything close to a representative sample of their work – it is an indication only. Put simply, though, a sample basis of 1 in x is a good start. So if Paul took 400 calls last month and you are going to review 4, search for and review every 100th call.
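Chris's "1 in x" rule (400 calls, review every 100th) is systematic sampling, which spreads the reviews evenly across the period rather than bunching them. A minimal sketch, with a hypothetical list of call IDs:

```python
def systematic_sample(call_ids, reviews):
    """Pick every Nth call so the sample spreads evenly across
    the period, e.g. 4 reviews from 400 calls -> every 100th."""
    if reviews <= 0 or not call_ids:
        return []
    step = max(len(call_ids) // reviews, 1)
    # Take the last call of each step-sized window
    return call_ids[step - 1::step][:reviews]

# Paul took 400 calls last month; review 4 of them
calls = [f"call_{i:03d}" for i in range(1, 401)]
sample = systematic_sample(calls, 4)
# -> calls 100, 200, 300, 400
```

As Chris notes, a sample this small is only an indication of performance; the even spacing just avoids accidentally reviewing four calls from the same afternoon.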
Q3: What challenges have you faced getting a ‘Sales acquisition’ company to invest (time & money) in quality management? And is it seen as an aide to sales or Big Brother?
Lisa – A: Actually, about 95% of our business in the direct sales channel is upgrades – in other words, a proactive retention arena – so given they are existing customers, it's key we deliver throughout their lifespan with us in order to retain them. It also helps that customer experience is one of the key metrics in the balanced scorecard of the business, and the desire to drive improved customer satisfaction comes from the top of the organisation and is key in all the strategic plans. The one 'problem' with delivering a great customer experience is that customer expectation tends to rise with it, so you can sometimes feel you are forever chasing a moving goal!
Chris – A: I think this question is directed to Lisa, so I'll not answer. Ember has, however, supported many customers in developing their quality management approach. Our approach is built on the philosophy of measuring what matters, and we'd be happy to discuss this further.
Q4: With Infinity how easy is the integration to all of these systems given this is usually the most problematic area?
Bernie – A: Of course the ease of integration depends on the target systems and the APIs they provide (if any). Infinity itself uses web services and has an open SQL Server database, making it easy to integrate with. Our systems integration team has lots of experience of integration, and we haven't come across anything that can't be integrated with. Integration projects used to be large and time-consuming; time and technology have moved on, and integration capability has become a given requirement, so vendors are used to providing it these days.
Q5: How does Infinity get around the fact that recordings are encrypted to support, for example, PCI DSS compliance?
Bernie – A: Infinity QA will launch the 'player' provided by the call recording vendor, and recordings will be decrypted during playback. The listening experience should not differ from the current method. Regarding PCI, the part of the call taking the payment information should not be recorded at all, so it should not cause a problem during playback.
Q6: Is it commonplace, or fair, to base part of a bonus scheme on rewarding agents who do well on quality monitoring, when monitoring roughly 12 calls per agent per month (daily call volumes of around 4,000 calls)?
Lisa – A: It’s key that the bonus is based on a balanced scorecard and that emphasis on a particular KPI doesn’t drive the wrong behaviour. For example, in Carphone we don’t reward our sales staff on trading margin generated individually, so that sales consultants aren’t tempted to sell the wrong handset/tariff to a customer because it would increase their bonus.
Bernie – A: If an agent is performing badly, there will be a trend in their behaviour that will be picked up from random selection of calls. Quality assessment needs to be ‘representative’ so sample rates are important. The proportion of calls assessed needs to be a company specific decision that incorporates historical quality scores, customer satisfaction surveys, customer complaints etc.
Chris – A: As we’ve said above we’re of the opinion that reward should not be generated for average or standard performance but rather it should be used to recognise superior performance or where evidence of a great customer experience has been delivered. In doing so however the reward should be based on a series of measures that have appropriate weightings applied that reflect what is most important to the organisation. In some instances it is also important to consider minimum standards within a reward structure. In the example given we’d need to consider the other measures, weightings and minimum standards associated with the scheme.
In addition we would expect other factors to be apparent, for example a good team leader will be aware of the performance and call style of their team without the need for individual call monitoring. Equally customer complaints and expressions of thanks all build up this picture.
Q7: As quality analysts we evaluate the calls, but we deliver the feedback to the CSR’s supervisor – do you think this is beneficial?
Lisa – A: The Team Managers will evaluate the calls of the advisors and give feedback; my team ensure the calibration of the Team Managers takes place on a regular basis, and spot checks of evaluations are also carried out regularly. My Quality Team spend approximately 60% of their time on this type of activity; the rest of their time is spent on projects, e.g. listening to calls from the evening to understand why AHT is significantly higher at that time of day.
Bernie – A: Yes, very much so, and perhaps it should go to the CSRs themselves. The CSR can provide context around the assessment – they are a new starter, had the flu, have just been trained on the new product, etc. It also engages them in the quality management process and helps them understand the need for consistent quality.
Chris – A: Our recommendation is usually that the team leader or the supervisor carries out call monitoring and feeds this back to the advisor. In our view this is the key role in any contact centre and this process should be a platform to meaningful coaching and improvement. The role of the quality team is then to ensure that the scoring and use of scoring is effective and consistent.
One thing to keep an eye on is where technology can improve this in the future. There are now examples of where the contact monitoring is being automated using analytics technology. In these examples sample sizes are much larger, scoring is more consistent and team leaders and advisors can spend their time focusing on coaching to drive improvement. As this approach further matures there is also significant potential to realise cost savings associated with quality monitoring. If anyone wants to discuss this in more detail then feel free to get in touch.
Q8: What criteria are used to create the QA benchmark used to measure an agent’s quality performance within an outbound sales environment?
Lisa – A: Currently ours is very task-orientated, based around DPA and FSA regulations. Our outbound team is also outsourced, so this is something we are working with our supplier to move forward.
Bernie – A: It depends on the business you are in, the product(s) you are selling, the regulation associated with them, etc.
Q9: What would be the advantage over NICE, as it provides analytical data for both reporting and voice?
Bernie – A: NICE’s analysis and assessment scheduling is not as advanced and, crucially, it only manages voice interactions. Alone it can’t provide the ‘Customer Insight Hub’ or deliver the cross-channel outputs that drive actions for business improvement.
Q10: We are Helpdesk not Sales – what would you suggest as outcome based measures?
Bernie – A: It depends on the outcomes you use … perhaps ‘escalated call’, ‘query resolved’, etc.
Chris – A: This is a difficult question to answer without knowing your business and what products and services your help desk actually supports. Some outcomes possibly to consider would be first contact resolution, customer feedback or in call recognition that the problem is resolved or escalated contact. I’m sure we could discuss this further if we knew a little more so if you’re interested then just let us know.
Q11: What are the most important points that should be measured when monitoring the quality of emails (irrespective of the industry the contact centre is in)?
Chris – A: Again this is a difficult question as it’s not usually the best approach to generalise. What’s important is to understand the reason for contact by considering the outcomes and customer feedback so that measures which matter can be applied to your approach. I’m sure we could discuss this further if we knew a little more so if you’re interested then just let us know.
Webinar attendee A: We set scoring as with calls; as well as process, we monitor whether the email was personalised, whether all questions were answered, whether they thanked the customer for their contact, whether they offered further assistance, etc. It’s quite in-depth.
Webinar attendee A: For email monitoring we look at standard opening, technical content, use of templates and correct corporate closure etc.
Webinar attendee A: We regularly monitor email and non-voice transactions. We tend to look for clarity of message as a key driver, as well as making sure all customer points raised are answered.
Webinar attendee A: It is a lot easier to measure than you might think, so long as your training and quality expectations are fully aligned. We also regularly calibrate with Operations and review CSAT generated as a result of email transactions.
Webinar attendee A: We regularly monitor our email, chat and social media conversations. We focus on all aspects, but particularly the ‘tone’ or ‘perceived tone’ of the writing.
Webinar attendee A: Hi, we monitor response time, accuracy of information being provided vs. requested, grammar, spelling etc. Tone is also considered.
Webinar attendee A: we’ve used a mystery shopper to monitor the quality of emails against a number of criteria
Q12: What about reporting in email-based contact centres?
Chris – A: Treat emails as you do voice calls. If this is a channel being used by customers, it needs to be measured in the same way.
Webinar attendee A: Also, don’t assume that people who are good agents on the phone can necessarily write good emails.
Lisa – A: Absolutely, written communication is a different skill set to verbal communication
Chris – A: We agree they are very different skills. We’d also say that written skills are now not necessarily measurable in the same way as they have been in the past. As different customers, of different age groups, engage with the business using multiple channels (direct and social networks), so the language they use and the expectations they have may change.
Q13: Does Infinity have a call recording facility, or does it work alongside a call recorder?
Bernie – A: Infinity has a very basic call recording facility that uses the sound card of the PC and MP3 or compressed WAV files, for very simple non-regulated applications. It would normally integrate with an existing call recorder using the call recorder vendor’s API.
Webinar attendee A: We use a system where the call recording system is built into the software that loads the script
Webinar attendee A: We listen to all sales calls, as most campaigns are FSA-regulated. We also do side-by-side listening and remote listening, and we don’t listen to a set amount per month – we normally listen every day, or at least three times a week.
Q14: How much is speech analytic equipment?
Chris – A: I’m afraid that this is a difficult question to answer, as there are many parameters that need to be considered before thinking about the best product(s) and then the budget, e.g. do you need to analyse in real time, what volume of contact is to be analysed, is it voice only or multi-channel, and is there a need for a standalone analytics solution or a WFO capability? This is an area where Ember can help, so if it’s of interest then let us know and we can follow up directly.
Q15: All our calls are listened to within 24 hours of being made and feedback given to the agent – is this quick enough, do you think?
Bernie – A: Depends on the volume of calls handled in a day. With Infinity QA the feedback to the agent (if turned on) is sent immediately following the assessment. The agent can then review at a convenient time.
Chris – A: As we said in our presentation the closer to the interaction the measurement and feedback is the better. In our experience your process and timing here is pretty good and to further improve this then perhaps it’s time to explore where analytics can help? If you’re interested in discussing that then give us a call.
Q16: Does Infinity work for small contact centres, i.e. 10 FTE? Or is it too heavy?
Bernie – A: Yes it can, but you would need to consider the payback / business benefit carefully.
Q17: We need to be PCI compliant and are struggling to find a recording system which supports this.
Bernie – A: With integration between your contact management / CRM system and the recorder, it is possible to pause the recording at the point of taking payment. This means you don’t need an expensive new recorder. There are vendors who provide a voice and telephone keypad based payment service external to your call centre. Contact Infinity CCS for more details.
Chris – A: This is an area where we have helped lots of customers. Two common approaches are to integrate the recorder with your advisor tools to disable recording of card information, or to use some very clever technology that lets the advisor maintain a conversation with the customer whilst the customer securely enters their card details via DTMF tones. If interested, we can discuss this further.
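The pause-and-resume approach Bernie and Chris describe can be sketched as a small control flow: recording is suspended only for the payment step and always resumed afterwards. The `CallRecorder` class and its `pause()`/`resume()` methods here are hypothetical stand-ins for a real recorder vendor's API:

```python
class CallRecorder:
    """Hypothetical stand-in for a call recorder integration."""
    def __init__(self):
        self.recording = True
        self.paused_segments = 0

    def pause(self):
        self.recording = False

    def resume(self):
        self.recording = True
        self.paused_segments += 1

def take_payment(recorder, read_card):
    """Suspend recording while card details are captured, so the
    sensitive data never reaches the recording (the PCI DSS aim)."""
    recorder.pause()
    try:
        card = read_card()    # e.g. agent keys in the card data
    finally:
        recorder.resume()     # recording always resumes, even on error
    return card
```

The `try`/`finally` is the important part of the sketch: recording must restart even if the payment step fails, otherwise the rest of the call goes unrecorded.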
Q18: How exactly will this support a reduction in staff absenteeism and how do you address personalised feedback or 1 to 1 coaching?
Bernie – A: More engaged staff who are rewarded for good operational performance AND good quality performance tend to be happier in their work. Providing the feedback and personal MI engages them more and helps them understand the importance of their behaviour to the business, and also their earning potential. We can automatically send completed assessments to the agents themselves and allow them to add their own comments. Those comments and assessments are then used in 1-2-1 reviews, and the outcomes of the reviews are also captured in Infinity to show an audit trail of assessments and comments made, as well as coaching and training sessions attended. This builds a complete history of each agent.