Service Level vs. Abandonment Rates
I’m in a contact centre handling, on average, 3,000 calls a day across around 30 service lines.
Our overall service level is usually around 95%, with an abandonment rate of around 1.5%. However, I have recently noticed that one of our individual service lines has a service level of around 96% but an abandonment rate of up to 15%.
We are targeted on 90% in 20 seconds and ignore all calls abandoned before 5 seconds. This leads me to believe that most of the calls for this line are being abandoned between 5 and 20 seconds.
Does anyone have any sort of calculation for what your expected abandonment rate should be if you have a service level of “n”?
My SDM thinks there should be 3% abandonment with a 90% service level and 5% with an 80% service level, but we would like to find something to back this up!
Sorry if I’ve confused you all, but any help would be appreciated!
Question asked by Laura
Understanding Service Level vs. Abandon Rates
The challenge you are facing is likely a function of the 100-year-old, interval-based forecasting methods put forth by Erlang. Erlang was brilliant, but he only had a mechanical odometer for collecting call counts.
He clearly stated that he would rather have been working from what he called the “inter-arrival patterns”. Essentially, he recognized that calls don’t always arrive randomly.
He built a method around the assumption of random arrivals, but felt he could have done much better with access to more descriptive data. Accordingly, Erlang recommended against using his own method any time call volume was “event-driven”, skewed or volatile.
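For reference, the interval-based calculation in question is the classic Erlang C formula, which yields a service level from a staffing count by assuming Poisson arrivals and exponential service times, exactly the randomness assumptions disputed here. A minimal sketch (the agent counts, traffic and handle time below are made-up illustrations, not figures from this thread):

```python
from math import exp, factorial

def erlang_c(agents: int, traffic: float) -> float:
    """Probability an arriving call has to wait (Erlang C), traffic in erlangs."""
    top = traffic**agents / factorial(agents) * agents / (agents - traffic)
    bottom = sum(traffic**k / factorial(k) for k in range(agents)) + top
    return top / bottom

def service_level(agents: int, traffic: float, aht_s: float, target_s: float) -> float:
    """Fraction of calls answered within target_s seconds."""
    p_wait = erlang_c(agents, traffic)
    return 1 - p_wait * exp(-(agents - traffic) * target_s / aht_s)

# Illustrative: 100 calls in a half hour at a 180-second AHT is
# 100 * 180 / 1800 = 10 erlangs of offered traffic.
print(f"SL in 20s with 13 agents: {service_level(13, 10, 180, 20):.0%}")
print(f"SL in 20s with 15 agents: {service_level(15, 10, 180, 20):.0%}")
```

Note that classic Erlang C assumes infinite caller patience, so it predicts no abandons at all; that is one reason the formula alone cannot answer the original question about the SL-to-abandon relationship.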
Practitioners of the 100-year-old methods try to sell us on the concept that it is OK to use Erlang methods because call volume is indeed random. Check out the following book excerpt and you will see an interesting graphic representing random call volume. Note, too, that the graphic was generated using random numbers, not real data.
We’ve found that the concept of “calls bunching up randomly” is completely false about 90% of the time.
If you look at any 30-minute slice of a typical forecast day, virtually every slice has a demand for agents that is skewed to the right, left or centre of the planning interval. Simply put, the demand is not bunched up; it’s skewed.
Now try holding staffing flat for 30 minutes across that diagonal demand. The result is a call centre that toggles between:
- understaffing and overstaffing
- instantly answered calls and long wait times
- no abandons and high abandons
- low labor utilization and 100% utilization
This reality is the premise for the science of High Definition Forecasting.
Returning to your puzzle over the high abandon rates…
A 2%–3% abandon rate is generally considered healthy. However, the typical ACD switch grossly understates abandons, because callers who hear a long-wait-time-in-queue message are likely to deflect before ever being offered to the queue.
Remember that in most cases, callers listening to a message have not yet been offered to the queue. Instead, they are listening to a recorded announcement that precedes the queue, and the majority of switches don’t even count callers who deflect while listening to such a message.
This undercounting of customer reaction to long wait times artificially improves service levels. Ignoring callers who abandon within the first 5–10 seconds overlooks even more of the customer frustration.
It means that things have to be pretty bad before inappropriate staffing patterns show up as either low service levels or high abandon rates.
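To see how quickly the undercounting compounds, here is a sketch with purely illustrative volumes (none of these figures come from the original post):

```python
# Illustrative only: how pre-queue deflections and a short-abandon
# exclusion rule shrink the reported abandon rate.
offered = 1000              # calls that dialled in
deflected_in_message = 80   # hung up during the announcement (never queued)
queued = offered - deflected_in_message
abandoned_under_5s = 30     # excluded by a "<5 seconds" reporting rule
abandoned_over_5s = 40      # the only abandons the report counts

reported_rate = abandoned_over_5s / queued
true_rate = (deflected_in_message + abandoned_under_5s
             + abandoned_over_5s) / offered

print(f"reported abandon rate: {reported_rate:.1%}")  # 4.3%
print(f"actual lost callers:   {true_rate:.1%}")      # 15.0%
```

With these invented numbers, the report shows a "healthy" 4.3% abandon rate while 15% of the people who dialled in never reached an agent.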
Your initial post made the following statement…
“SL is 96% but with an AB rate of up to 15%. We are targeted on 90% in 20secs and ignore all abandoned calls before 5secs. This leads me to believe that most of the calls for this line are abandoning between 5 and 20 seconds.”
If the demand for agents is skewed and you are trying to address it with a single staffing level across a 30-minute planning interval, then that staffing pattern is a really poor fit for the demand. Early in the interval you will be significantly overstaffed and answer all calls instantly.
As activity grows, you will flip into a state of understaffing that drives very long wait times and high abandon rates.
Your service levels for the interval will probably still look great, due to the effect of averaging in all of those calls that were answered instantly. Consequently, the long wait times towards the end of the interval are blurred and blended into acceptable service levels.
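This averaging effect can be sketched with a toy interval (the call counts and wait times below are invented for illustration):

```python
# One 30-minute interval with skewed demand (all numbers invented).
# Early in the interval: overstaffed, calls answered almost instantly.
# Late in the interval: understaffed, long waits and abandons.
early_waits = [2] * 95     # 95 calls answered in 2 seconds
late_waits = [60] * 5      # 5 calls answered after 60 seconds
abandons = 15              # callers who gave up late in the interval

answered = early_waits + late_waits
within_target = sum(1 for w in answered if w <= 20)

sl = within_target / len(answered)
ab_rate = abandons / (len(answered) + abandons)

print(f"interval service level (20 s): {sl:.0%}")   # 95%
print(f"interval abandon rate:         {ab_rate:.0%}")  # 13%
```

The blended figure reads as a healthy interval even though every caller in the back half of it either waited a full minute or gave up, which is exactly the "high SL, high abandons" pattern described in the question.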
If you would simply like to reduce abandons, then stop playing the wait-time-in-queue message. Naturally, your service levels will fall, because the message will no longer chase callers away before they can be counted.
If you would like to fix the problem with a more appropriate staffing pattern and forecast, then you need to look at a forecasting system that is immune to all of the interval effects that distort the 100-year-old planning methods. In this post, I have covered just the tip of the iceberg when it comes to interval-based planning distortions.
Also, be cautious about moving to 15-minute interval-based planning. Conventional wisdom might suggest that a shorter-interval forecast will provide a better fit with demand.
Unfortunately, the reverse is generally true with interval-based forecasts, because these older methods can only peg calls to one interval. The shorter the planning interval, the higher the distortion.
While your staffing level may change more frequently, the staffing levels are usually phase-shifted into patterns that are difficult to recover from.
Even with short 3-minute talk times, we have seen a 7% phase shift in 30-minute forecasts turn into a 14% phase shift in 15-minute forecasts. Phase shifting is really harmful because it affects all customers, not just the ones unlucky enough to arrive on the wrong side of a toggle.
High Definition Forecasting provides granular staffing levels (15 minutes, 10 minutes, 5 minutes, etc.) but in a distortion free manner that intricately respects the real life distribution and carry-over effects of past demand.
There is a free weekly webcast that introduces participants to the concepts of High Definition Planning.
With thanks to Paul