Chris Dealy looks at the 4 key elements you need to consider to ensure forecast accuracy in the contact centre.
1. Are you taking care of all the elements of the forecast?
It’s not just about the volume of work. Don’t forget the average time required to complete each unit (Average Handling Time – AHT) and the total workload the two produce together. Also make sure you are using the most appropriate metrics – e.g. offered calls, not answered calls.
While it is common for centres to analyse the call volume forecast, it is less common to see analysis of the AHT. Both are equal partners in the workload calculation and should be analysed separately, as well as in combination, to identify opportunities for accuracy improvement. You should question each metric and understand underlying causes.
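To make the "equal partners" point concrete, here is a minimal sketch of the workload calculation; the function name and figures are hypothetical, purely to illustrate the arithmetic:

```python
# Workload is volume x AHT, so a 10% error in either the volume forecast
# or the AHT forecast produces roughly a 10% error in the staffing need.
# Figures below are hypothetical.

def workload_hours(offered_calls: int, aht_seconds: float) -> float:
    """Hours of handling work generated in one interval."""
    return offered_calls * aht_seconds / 3600

# 120 offered calls at 300s AHT -> 10 hours of work in the interval
print(workload_hours(120, 300))
```

Because the two inputs multiply, analysing only call volume leaves half of the workload error unexplained.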
For example, in some centres, AHT is relatively constant across all timeframes and in others there is large variation; e.g. AHT during the night shift is much longer than during the day.
That raises the question of why this happens. Is it a function of less supervision? More new hires on unattractive shifts? Customers calling with more difficult problems when they have time to talk? Customers wanting short calls during the working day, or a combination of effects?
Digging into that type of question can not only improve the accuracy of forecasting the actual workload in each time period of the day, but can also help in identifying opportunities to reduce AHT through other measures.
2. Did you select the correct timeframes for analysis?
The period over which you analyse forecast accuracy is important.
Analysing accuracy at the monthly or weekly level serves as a reasonable scorecard – but does little to help you to discover where the forecast may be consistently over or under the actual demand.
If you have one hand in the oven and the other in the freezer, then on average you are comfortable! It is similarly misleading to measure service level or Average Speed of Answer (ASA) over long periods.
There can be dramatic fluctuations within the week or month that offset each other, making the overall average look good. You need to do the analysis at the interval (e.g. 30 minutes) level to focus attention on those elements of the forecast that can be improved.
Leaving the Operations Team to absorb wide swings at the daily or half-hourly interval level places an unrealistic demand on them. The goal is a consistently high level of accuracy in every interval, not a high average level over time.
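The offsetting effect described above can be sketched in a few lines; the figures are hypothetical, chosen so the forecast is badly under in the morning and badly over in the afternoon:

```python
# Four hypothetical intervals: the daily total looks almost perfect,
# but the interval-level errors swing from -20% to +20%.
forecast = [100, 100, 100, 100]
actual   = [120, 118,  82,  80]

daily_error = (sum(forecast) - sum(actual)) / sum(forecast)
interval_errors = [(f - a) / f for f, a in zip(forecast, actual)]

print(f"daily error: {daily_error:.0%}")        # averages away to 0%
print([f"{e:.0%}" for e in interval_errors])    # reveals the big misses
```

A daily or weekly scorecard would report this day as accurate, while every single interval was mis-staffed.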
3. Are you using the proper methodology for analysing accuracy?
Percentage variation, standard deviation of the variation and correlation coefficients can all be used to identify pattern anomalies and measure accuracy.
The percentage by which actual varies from forecast (forecast minus actual, divided by forecast) is the most commonly used measure of forecasting accuracy. Applied at the interval level, it also gives a reasonably faithful picture of performance.
Where there is a wealth of data to analyse, it helps to have a quick way to take the pulse of accuracy over a long period, and calculating the standard deviation of the variation percentages is the best way to do that.
A small deviation is desired rather than wide swings in the variation, and the standard deviation calculation will be revealing even if the average variation seems quite small.
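Both measures can be computed with the standard library alone; the forecast and actual figures below are hypothetical, picked so the average variation is tiny while the spread is not:

```python
import statistics

# Hypothetical forecast/actual pairs for a series of half-hour intervals.
forecast = [200, 180, 210, 190, 205]
actual   = [190, 195, 200, 210, 185]

# Percentage variation per interval: (forecast - actual) / forecast.
variation = [(f - a) / f for f, a in zip(forecast, actual)]

mean_var = statistics.mean(variation)
stdev_var = statistics.stdev(variation)

# A near-zero mean with a large standard deviation signals offsetting
# over- and under-forecasts rather than genuine accuracy.
print(f"mean variation:     {mean_var:.1%}")
print(f"stdev of variation: {stdev_var:.1%}")
```

Here the mean variation is well under 1%, yet the standard deviation is close to 9% – exactly the pattern the paragraph above warns about.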
Another tool to analyse variations over time is correlation coefficients, which analyse patterns from one period to another. The correlation coefficient analysis can be applied to the variation percentages, but is probably most useful when applied to the arrival patterns of work volume and the changes in AHT over the intervals. It compares two periods to see if the patterns are a match or not.
For example, the typical Monday might adhere to a relatively consistent pattern, but the correlation analysis may reveal that one particular Monday varies in pattern even if the total volume of workload is within normal boundaries. This would suggest that further understanding of what happened on that Monday is useful.
This level of detail is also critical to determining which historical data is “normal” and which is not when deciding to allow the data to average into the history kept for forecasting.
Data that falls outside an acceptable range should be considered for adjustment: either stored separately as a sample of a particular repeatable event, or discarded as an anomaly (such as a power outage) unlikely to recur.
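One simple way to implement such a range check is to flag values more than a few standard deviations from the mean of the clean history; the volumes, threshold and helper below are hypothetical:

```python
import statistics

# Hypothetical daily volumes; the last day covers a power outage.
history = [980, 1010, 995, 1025, 990, 400]

baseline_mean = statistics.mean(history[:-1])
baseline_stdev = statistics.stdev(history[:-1])

def is_anomaly(value: float, k: float = 3.0) -> bool:
    """Flag values more than k standard deviations from the baseline mean."""
    return abs(value - baseline_mean) > k * baseline_stdev

# Flagged days should be reviewed: tag them as a repeatable event,
# or exclude them before they average into the forecasting history.
print([v for v in history if is_anomaly(v)])
```

Whether a flagged day is kept, tagged or discarded remains a judgment call; the check only ensures it gets looked at before it contaminates the history.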
4. Are you collaborating with colleagues whose actions can influence workload?
People outside the workforce management (WFM) team have influence on forecast accuracy.
Any anomalies in the actual workload should be identified and tagged with a reason, e.g. marketing campaigns, mailings, billing cycles, so that future forecasts are better able to accommodate and predict these drivers.
A good WFM team will make sure that it communicates effectively with other departments. In most cases, full understanding of what makes customers and staff behave differently from usual is essential to improving the accuracy of the forecast.
Is this something that your team does on a regular basis?
With thanks to Chris Dealy at injixo