Quality Monitoring Forms


Training and Recruitment Manager

Menzies Distribution


In our contact centre we have been using Quality Monitoring forms with a 1-5 scale for each item being marked. The problem is that what one person sees as a 3, someone else might mark as a 5, so scoring was not consistent.

I made up a new form and it seems to be working a little better, although it is not perfect by any means.

I am interested to hear how other people quality monitor their calls, and if possible it would be great to see some examples.

I am putting together a new form that will be scored as yes/no - a work in progress.

Looking forward to hearing how everyone else does it.


Team Lead

Sykes


Hello,

Do you have calibration sessions? Basically, this is when everyone involved in monitoring scores the same call, then discusses any points of variation and agrees a consensus standpoint.

You can do this in a live environment (more time consuming), i.e. play the call in the meeting room while each monitor scores it, then at the end discuss how each point is scored. Alternatively, get everyone to monitor the call first and bring their scored monitoring forms with them - discuss it point by point and have the call available to play to support the consensus view. (I find this works best.)

The above assumes you have the ability to record calls?

Further, can you break down or add criteria for each point, describing the behaviours or actions the agent should demonstrate to achieve each score? For example: score 0, the agent did not use the customer's name at all during the call; score 1, the agent used the customer's name once; ... score 5, the agent used the customer's name at the beginning, middle and end of the call and personalised the closing.
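For illustration, a behaviour-based rubric like the customer-name example above could be encoded as explicit criteria. This is only a sketch - the thresholds, the intermediate score of 3, and the function name are assumptions, not a form anyone in this thread actually uses:

```python
# Hypothetical 0-5 rubric for "use of customer name" (descriptions illustrative).
NAME_USAGE_CRITERIA = {
    0: "Did not use the customer's name at all",
    1: "Used the customer's name once",
    3: "Used the customer's name more than once",
    5: "Used the name at beginning, middle and end, and personalised the close",
}

def score_name_usage(uses: int, personalised_close: bool) -> int:
    """Map observed behaviour to a 0-5 score, per the rubric above."""
    if uses == 0:
        return 0
    if uses == 1:
        return 1
    if uses >= 3 and personalised_close:
        return 5
    return 3

print(score_name_usage(0, False))  # 0
print(score_name_usage(3, True))   # 5
```

Spelling out the behaviour behind each number is what makes the scores calibratable: two monitors can disagree about "friendliness" but not about how many times a name was used.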


Planning & Control


We recently changed our quality monitoring form to something less subjective. It has an advantage, in that there is greater consistency in scoring, but can be a little limiting in what you are asking.

Previously we could ask "How friendly was the agent?" and give a mark out of 10. Reducing that to a simple yes/no results in either less information, or the need for several more specific questions.

In my opinion it's worth looking at if you do need to change from your current form. We have three people who monitor quality, all trained by the same person, and whilst they might mark a call differently on occasion, there is less than 0.2% variance in their average total scores, so we didn't have much need to change.


Training and Recruitment Manager

Menzies Distribution


Thanks for the replies, guys. We do regular calibration sessions, although people still find it difficult to decide what is a 3 and what is a 2. I am trialling a yes/no system and people seem to like it, as it takes the guesswork out: an advisor either did something or didn't. Can anyone send me an example of their forms?

Also I am interested in banding. Ours is as follows:

91%-100% WOW

82%-90% Spot On

64%-81% Miss the mark

63% and below OUCH

I feel the WOW benchmark is very high, as not many people are reaching it. A lot of people get 89% or 90%, and it is a bit disheartening.
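As a minimal sketch, the banding above maps an overall percentage to a label. The band edges are taken from the post; treating each lower bound as inclusive is my assumption:

```python
def band(score: float) -> str:
    """Map an overall quality percentage to its band label (edges from the post)."""
    if score >= 91:
        return "WOW"
    if score >= 82:
        return "Spot On"
    if score >= 64:
        return "Miss the mark"
    return "OUCH"

for s in (95, 89, 70, 50):
    print(s, band(s))
```

Keeping the thresholds in one place like this also makes it easy to lower the WOW cut-off (say, to 89%) and re-run historical scores to see how many agents would have reached it.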

Looking forward to hearing your ideas and thoughts.


Team Lead

Sykes


What type of calls are your agents handling? Are they giving technical advice, selling, or providing customer service? And what sort of products are they supporting?

I don't think I can send you an actual QM form for confidentiality reasons, but if I have a better idea of your agents' actual role I may be able to offer something useful.

Enjoy your day!



I'm just looking at this topic again. The hard bit for me is that, without knowing the technical aspects of a call, I am listening for empathy and personality, and I'd mark a great call highly from that perspective while a technical scorecard could mark it low. Getting the weightings right is a big issue. I really like the brackets above - spot on for KISS.


Group head of Contact Centres Quality and Development

Johnston Press


We are new to the contact centre world and predominantly an outbound-based business with nine sales centres in the UK.

We have a QA form with 20 questions covering the pattern of the sale, data input into the system, and soft skills.

It's a yes/no and N/A form. A 'yes' is only a yes if the item is completed 100%. If there's an aspect of 'not quite', it's scored 'no', but the manager needs to feed back and coach through why it didn't get a yes. Scores begin low, but we are seeing them improve. A 'score the scorer' programme is being implemented to get parity between the managers but, agreed, there will always be some element of subjectivity.

Happy to send a copy if email addresses are provided. I would like to see others' forms with a weighting, as we avoided this because it appeared to give more room for disparity.


Team Lead

Sykes


Hi schoalesytj, it sounds as if your form is working and you are beginning to see improvements in the scores, indicating quality is improving.

In the past we have used weighting when we wanted to focus on a specific issue, for example DPA, opening and closing scripts, call logging, or use of the customer's name. The weighting is used to generate a Fatal Error where an agent is not demonstrating the desired behaviour for the area of focus. Failing calls then have to be linked in with disciplinary action where improvement following coaching and guidance is not seen - this will drive improvements.

It is, of course, much easier to build a form and Fatal Errors around definite factors, such as accurately logging the call (which the person monitoring can physically check) or using an opening or closing script, than around the more subjective 'soft' areas, such as appropriate tone of voice, which may be the areas of disparity thornden4 is finding?


Quality assurance

Absa insurance and financial advisors


Hi Thornden

If you can provide me with your e-mail address I would gladly assist you with examples of assessment forms. We are using the yes/no principle and it works well. We do, however, separate questions into 'winning' and 'hygiene' questions: the winning questions cover things like customer interaction, and the hygiene questions cover things like compliance and following business process. The rating is calculated on two criteria - A = pass, B = pass, C = fail, D = fail - as well as a percentage outcome for the call, and is directly linked to our company's performance development system. What this entails is that an agent can still achieve a 69% quality rating but, due to hygiene errors, have a D (fail) for the call. This helps to build trends across the entire assessment and to see the specific questions where agents are getting it wrong, i.e. an agent can be brilliant in the interaction with the client but fail to mention compliance-related information.


Training and Recruitment Manager

Menzies Distribution


Hi Hastur, thanks for the reply. Is there a way to PM on this site so I can send you my e-mail address?

Thanks


Call Centre Manager


Hi Thornden4, I see you're a trainer for Menzies. I'm the service manager for News International - are you on LinkedIn or similar so we can share knowledge?

I also run a QA programme based on YES/NO answers and believe it works well.



I am implementing a quality system for my customer service teams, as the one they currently use is very subjective. What I have done is break the call down into categories, from the start of the call to the end, covering everything an operator should do, and it is as simple as yes, no or N/A. Add up all the yeses and all the nos and this gives you a percentage.

Work out your benchmark and then you have a percentage against it. This can be done individually, or you can calculate an overall department percentage.
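A minimal sketch of that calculation, assuming True for yes, False for no, and None for N/A (which is excluded from the total, as is usual for yes/no/NA forms):

```python
def quality_percentage(answers):
    """Percentage of applicable items answered 'yes'; None means N/A."""
    applicable = [a for a in answers if a is not None]
    if not applicable:
        return 0.0
    return 100.0 * sum(applicable) / len(applicable)

call = [True, True, False, None, True]  # 3 yes, 1 no, 1 N/A
print(quality_percentage(call))  # 75.0
```

A department figure is then just the same function run over every answer from every monitored call, or an average of the per-call percentages, depending on how you want rarely-applicable items to count.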


Service Quality Manager


I have been following this discussion and am also implementing an assessment based on yes/no, with the points then scored.

Where do fatal errors fit into this? Does anyone have a current programme that includes them in the score?


Training and Recruitment Manager

Menzies Distribution


Thanks to all who have been involved in this discussion. It seems like everyone is looking for the perfect form, lol. I am still working on mine.

[Part of Post removed. please do not give out your contact details on this forum.]

Thanks again and Merry Christmas


Supervisor

Beyond Payments


Hi All

thornden4, have you received any examples of assessment forms? If so, can you please forward them to me - or could anyone else? I am currently putting together a form for incoming calls.


Quality Analyst

LSLi Property Services


Hi Thornden,

I run the Quality Assessment for a large group of estate agents and we regularly calibrate our quality (at least once a month). We use a Yes/No/NA form which has points allocated to each answer, giving an overall percentage score.

<60% is a fail

61% - 74% is average

75% - 96% is good

Above 96% is outstanding.

This way, when quality is reported to each sales team, we can compare it to competing sales teams on the sales floor. Also, when the sales floor report is issued at the end of the working month, teams can see where they stand on quality compared to other teams doing the same job, but for a different region, for example.

This particular approach works really well for us here, as it's a simple Yes/No or N/A, so we don't have the issue where what one assessor might feel is a 3, another may feel is a 5. Another benefit is that when we are feeding back to agents, they motivate themselves to beat their percentage score the next month (often because colleagues' scores are higher than theirs).


Training and Recruitment Manager

Menzies Distribution


Hi QALoz, I like the idea of a yes/no form - it takes away the ambiguity of it all. I am currently trying to put one together. We are going to have a workshop with our Quality Team and Team Managers to thrash this out, as the form we are using at present is not working.

Shaun, I can send you examples if you would like.

Anyone else, if you can provide me with samples that would be great.

Thanks


Training and Recruitment Manager

Menzies Distribution


Hi all,

I have completed our new form. Is there any way to upload it onto the site?

Also we have just got our CCA plaque through (having passed our accreditation with flying colours) so had a nice unveiling last week.


Digital Media Manager

Call Centre Helper

Quality Monitoring Form
Just wanted to thank Thornden4 for sending his Quality Monitoring form, it is now available to download from the following link

https://www.callcentrehelper.com/free-call-monitoring-form-3507.htm


The password to unlock the form to make changes is: Example1 (although I seem to be able to edit the form without the password!)
The scores are worked out as follows:
The scores in RED are mandatory, and the marker chooses a score from the right-hand column.
Scores in blue can be set to 0 to make that part non-applicable.

CSI Coordinator

Enterprise Rent A Car


The form is quite interesting - we did something similar, and the use of slang was an eye-opener.

We were originally really hot on this: 'yeps', 'yeahs', etc.

Now, though, we allow their use as long as it's not overused (every other word, etc.).

Whenever we discuss slang we are always asked for a definitive list of what is and isn't slang, and I have found this hard: realistically there are so many terms that if we miss one, we open ourselves up to it being used.

How does everyone get round this?


Team Lead

Sykes


It is interesting, and mirrors many aspects of the various forms I have used over the years, which I guess means we all look for the same sort of things - which I find rather reassuring :O).

So how are the team doing?


Customer Services Team Leader

Santia


Hi there, I've just tried to download the form but the password is not working, so I can't edit it.

Can anyone help?

Thanks


Quality Auditor

Alegria Auto Sales


I'm having the same issue. We created one using the yes/no approach, but this template is really interesting. We can't modify it, though - it says the same, that the password doesn't work. Can anyone help us?


Telecoms/IT - Inside Sales Manager


Hi guys,

The whole area of QA really intrigues me, especially when it comes to looking at ways of empowering agents to allow them to make their own decisions.

As above, I am also unable to even open the link.

C.



Hi, this is such a great form, but the password doesn't work for me either. Does anyone out there know what the password is?


Quality and Assurance Coordinator

Waterloo Housing Group


Hello,

I am on a project to get a contact team up and running by December 2013, and I was looking for a call monitoring form, so this is brilliant - thank you, rboynton.

I am also unable to unlock the worksheet - can anyone help?

Many Thanks

Clare Johnson


Quality and Assurance Coordinator

Waterloo Housing Group


Just found this on the site, and you can edit the form!

https://www.callcentrehelper.com/free-call-monitoring-form-3507.htm

Hope it helps :-)

Many thanks

Clare Johnson


Team Manager

Fexco


Hi all, I'm just in the process of redesigning our call quality sheet and would really appreciate a copy of any of your sheets that are working well.



thornden4--

Can you please provide the password to your quality form?

Thanks,

Blaine


Digital Marketing Assistant

scorebuddy


Hi All,

When you are creating a quality monitoring form, you should always try to put a weighting system in place so that the most important factors are weighted more heavily, reflecting the impact they make on the quality of the call, email or web chat.

It definitely should be reviewed and updated over time!



Hi, I have just taken on the task of creating our call centre QA form (outbound sales) and would welcome any ideas that can help. I tried to open the form above but it didn't work.

Reading this forum has already provided me with some insight. I would like to take a look at a form, though.


Customer Director


Hi

Linking this with first-call resolution is also good.

Call monitoring seems to work with the first indicator being a clear yes or no - easy to understand and to give feedback on: you either did it or you did not. Then comes how effectively you did it, as a grade of performance, followed by the customer outcome: resolved or not resolved for the customer. Everyone wins.

JP



I have been working with a form where, instead of a 0-5 rating with 5 being very effective, I use 0 - 3 - 5: 5 remains very effective, 3 is average and 0 is poor. It makes it easier to calibrate when a team is grading, even though there are still times when we are off. This helps me determine how strong the agent is in certain areas and where the areas for improvement are. But reading about the yes/no form is interesting, and I would like to view one. I don't mind sharing mine.


Planning & Performance Manager

Thames Water


I'm not really adding anything to this debate, I know, but the use of language in contact centres does sometimes make me chuckle. "Fatal errors" - did the customer actually die?



Hi All,

I have been in the call monitoring process for a long time now, have used various types of forms, and have devised a few.

It all depends on how simple or complex the process is and what results you want to derive from the call monitoring process.

I am sure that once you have identified these requirements you will have an idea of what you want to include in the form, and to what extent you want the form to give you the information you are looking for.

I can help you in this area once I know exactly what you need.

You may use a simple yes/no checklist, or you may also want to use "N/A".

You may use a plain "1" and "0" method, or you may apply a weighting to each parameter and arrive at a weighted average score.
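The two approaches Parag describes could be sketched like this: a plain 1/0 checklist versus a weighted average, where each parameter's weight reflects its importance. The parameter names and weights are illustrative only:

```python
def plain_score(results):
    """Plain 1/0 checklist: share of items passed, as a percentage."""
    return 100.0 * sum(results) / len(results)

def weighted_score(results, weights):
    """Weighted average: each passed parameter contributes its weight."""
    total = sum(weights.values())
    earned = sum(weights[k] * results[k] for k in results)
    return 100.0 * earned / total

results = {"greeting": 1, "dpa": 1, "resolution": 0, "closing": 1}
weights = {"greeting": 10, "dpa": 30, "resolution": 40, "closing": 20}
print(plain_score(list(results.values())))  # 75.0
print(weighted_score(results, weights))     # 60.0
```

The same call passes three items out of four either way, but the weighted score is lower because the one miss (resolution) carries the largest weight - which is exactly why weighting is used to emphasise a focus area.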

Thanks,

Parag.



Our QA scoring forms have evolved continually over the last two years until we arrived at one that delivers results. That being said, it is probably time to 'move the goalposts' again. We use a simple YES/NO/NA form with individual weightings for a total of 14 attributes. The form is broken into three themes and is aligned with the overall objectives of the wider organisation.

1. Relationship building

2. FCR

3. System use

Our emails are also audited with a similar goal in mind, but based on email etiquette rather than relationship building for the first theme. We have a number of agents working in different teams providing different services: technical advice, customer service issues, and returns and warranty. Each form is adapted to serve that area of the business correctly, and also the different channels.

We have found our most recent form has positively impacted our NPS, FCR, and agent performance and confidence. This, combined with effective, regular coaching sessions, has resulted in a long-serving, competent workforce with a very low level of staff turnover.

PS: we deal primarily with inbound CS calls for an e-commerce business.


Digital Marketing Assistant

scorebuddy


Hi,

We help clients to build scorecards, and the most effective method is to have a combination of answer types: some questions have YES/NO/NA answers, some are 1-3 or 1-5, or even very poor, inadequate, adequate, good or excellent. Having the capability for additional comments, fail points, and star marks or kudos also proves beneficial. Having scorecards that can be easily adapted to fit the need, without losing historical data, is a key advantage for rapid changes.



Hi A Byne, I am currently testing a form with some of the same characteristics you mention: Yes/No/NA, areas with poor, average, good and excellent, and an area for comments from the reviewer. It seems to be working on the two campaigns I assigned it to. I would like to try using it on the bigger campaign, but it feels like there is something crucial missing that I can't identify as yet. (Our centre is an outbound centre with different sales campaigns running all at the same time.)


Digital Marketing Assistant

scorebuddy


Hi sburgess,

Thanks for responding. There are a few areas to consider: fail-all questions, sections/headings, organisational goals, and attaching calls.

Are you using Fail All sections? These are generally used for high-priority security information, e.g. "Did the agent verify the customer's account details?" (answered Yes/No/NA). If this is a requirement and the answer is no, then this could be a fail-all, as it is a major breach of policy.
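A fail-all question could be layered on top of an ordinary percentage like this. This is a sketch under my own assumptions - the question names are invented, and zeroing the score is just one common way of handling a fail-all:

```python
def score_with_fail_all(answers, fail_all):
    """Percentage of yes answers, or 0.0 if any fail-all question was missed.

    answers: dict of question -> bool; fail_all: set of question names whose
    'no' constitutes a major policy breach and zeroes the whole call.
    """
    for q in fail_all:
        if not answers.get(q, False):
            return 0.0
    return 100.0 * sum(answers.values()) / len(answers)

answers = {"verified_account": False, "greeting": True, "resolved": True}
print(score_with_fail_all(answers, {"verified_account"}))  # 0.0
```

Some teams instead record the underlying percentage alongside the fail flag (as in the winning/hygiene approach earlier in the thread) so trend data isn't lost.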

Have you broken your criteria into sections?

You should also try to incorporate the organisational goals into the scorecard, but this may be a design element you need to tackle with senior managers, working on a similar principle to a balanced scorecard.

Attaching the call/chat/mail to the scorecard is a good idea to increase understanding and transparency. Do these make sense?



Yes, it does make sense. Some of these I do have, like fail-all based on "not following client requirements" and about three other reasons. We also send off the recordings so the agent can listen to them, to justify the grade and for coaching reasons.

I have already talked with upper management, and I'm incorporating what they expect from the form and the agents.

Yes, I have sections on the form, and within those are subsections with specific points to gain or lose.

I have the form in a test phase on two campaigns and it seems to be working, but I'm not sure if it would work the same on a campaign with more client requirements.

I can have you take a look at the form if you would like.


Digital Marketing Assistant

scorebuddy


Hi,

Well, you are definitely on the right track, and doing a lot of testing before a major roll-out is always a good idea too.

I would be more than happy to take a look at the form for you.



Ok, cool... is there some way I can send it through here?


Editor

Call Centre Helper

Here is the latest Quality Monitoring Form
Here is a link to the latest Quality Monitoring Form

https://www.callcentrehelper.com/free-call-monitoring-form-3507.htm
