
Improved Forecasting Starts With Effective Data and Standard Modeling

This is Part 3 of a five-part series on Forecasting by Dan Bursik. To read Part 2, click here.

In my last blog I talked about the need for accurate forecasting as a vital link in the chain of your labor management and operational planning and execution strategies. We touched on three important ideas that are helpful to remember:

  1. You can’t get a good schedule out of a bad forecast.
  2. If there is no right place for the hours to be positioned, then you cannot create a good schedule, even with a good forecast.
  3. If you haven’t gotten serious about your best practices, labor standards, or labor modeling, you probably have bigger challenges to attend to before optimizing forecast accuracy.

But let’s assume you are working your way through the labor management process and see forecasting as an opportunity. It’s still important to consider what you are forecasting and whether your general approach to forecast data is aligned with your ultimate objective of putting the right people at the right place at the right time doing the right things. As far as I know, that’s the objective, and accurate forecasting is a critical means to that end.

So, think about that objective and consider what you are forecasting and how you are forecasting it. This question concerns the granularity of your forecast element (e.g., department items, category items, specific UPC items), the time granularity of your data (e.g., weekly, daily, or 15-minute interval volume), and the metrics to which you are applying your standards (e.g., sales, items, customers).

Let’s consider the labor needs of a Deli service counter. Some businesses use sales as the major (or even the only) forecast metric to quantify business volume. Some will drive it by items, some by customers, and some by pounds – all of which can be supported by data at the department, category, sub-category, or item level. In cases where department information is the driver, the business portion requiring service labor might be a fixed apportionment of the total Deli. If driven at lower levels, this apportionment becomes much more dynamic by day and by time of day.
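To make that contrast concrete, here is a minimal sketch. The 40% fixed service share, the category names, and the interval figures are all hypothetical, chosen only to show how a department-level driver holds the service portion constant while lower-level data lets it follow the actual mix in each interval.

```python
# A minimal sketch contrasting fixed vs. dynamic apportionment of Deli
# service-counter volume. The 40% service share, the category names, and the
# interval figures are all hypothetical illustrations.
FIXED_SERVICE_SHARE = 0.40
SERVICE_CATEGORIES = {"sliced_meats", "sliced_cheese", "made_to_order"}

def service_volume_fixed(dept_total):
    """Department-level driver: the service share never changes by day or time."""
    return dept_total * FIXED_SERVICE_SHARE

def service_volume_dynamic(category_sales):
    """Category-level driver: the service share follows the actual mix each interval."""
    return sum(sales for cat, sales in category_sales.items()
               if cat in SERVICE_CATEGORIES)

# Two 15-minute intervals with the same department total but a different mix.
lunch   = {"sliced_meats": 120, "made_to_order": 180, "grab_and_go": 100}
evening = {"sliced_meats": 60,  "made_to_order": 40,  "grab_and_go": 300}

for name, interval in (("lunch", lunch), ("evening", evening)):
    dept_total = sum(interval.values())
    print(name, service_volume_fixed(dept_total), service_volume_dynamic(interval))

# lunch:   fixed 160.0 vs. dynamic 300 -> the fixed share understates service work
# evening: fixed 160.0 vs. dynamic 100 -> the fixed share overstates it
```

Both intervals have the same department total, yet the service-counter workload they imply is very different once you look below the department level.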

Let’s use a set of three examples from the Deli to illustrate how the right data can capture, and the wrong data can mask, the underlying work content. Ask yourself if the labor time is different in each of these scenarios:

  1. One customer purchases 10 lbs. of sliced ham in 1 package.
  2. One customer purchases 10 lbs. of sliced ham but asks for them to be packaged in 10 packages of 1 lb. each.
  3. Ten customers each want 1 lb. of sliced ham.

I hope it is obvious that although each scenario involves selling the same 10 lbs. of ham, for the same dollar value, the work content of the three scenarios is significantly different. To reflect the right amount of labor for each scenario and meet your service objectives, three volume drivers need to be present in your forecasting methodology, preferably at 15-minute interval time granularity (a sketch after the list below works through the arithmetic):

  1. Number of customers to serve.
  2. Number of items (packages) to make.
  3. Number of pounds to slice or produce.
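Here is a minimal sketch of how a three-driver standard separates the scenarios. The per-customer, per-package, and per-pound minutes are illustrative assumptions, not engineered standards; the point is simply that the same 10 lbs. of ham, and the same sales dollars, yield three different labor totals once all three drivers are counted.

```python
# A minimal sketch of a three-driver labor standard for a Deli service counter.
# Every per-unit time below is an illustrative assumption, not a measured standard.
MIN_PER_CUSTOMER = 1.0   # greet, take the order, hand over, thank
MIN_PER_PACKAGE  = 0.5   # weigh, wrap, and label one package
MIN_PER_POUND    = 0.4   # slicing time per pound

def deli_labor_minutes(customers, packages, pounds):
    """Labor minutes implied by the three volume drivers."""
    return (customers * MIN_PER_CUSTOMER
            + packages * MIN_PER_PACKAGE
            + pounds * MIN_PER_POUND)

scenarios = {
    "1 customer, 1 package, 10 lbs":     (1, 1, 10),
    "1 customer, 10 packages, 10 lbs":   (1, 10, 10),
    "10 customers, 10 packages, 10 lbs": (10, 10, 10),
}

for name, (customers, packages, pounds) in scenarios.items():
    print(f"{name}: {deli_labor_minutes(customers, packages, pounds):.1f} min")

# Same pounds and same sales dollars in every scenario, yet the work content
# differs: 5.5, 10.0, and 19.0 minutes respectively.
```

A sales-only or pounds-only driver would assign the same labor to all three scenarios; only the multi-driver version tells them apart.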

So how does your approach to forecasting Deli service counter workload play out through these three examples? Would your approach capture the labor differently in each scenario?

It’s not that you can’t drive it from sales, or simply from items, customers, or pounds; it’s just that the best way is to handle all three units of measure dynamically. This makes package size and order size dynamic by day and by time of day. Is that important? I certainly know retailers who would say, emphatically, yes it is.

If you aren’t forecasting at this level of detail, you need to ask how you provide the right service to your customers and how you schedule the right number of associates at the right time.

Was it that no one thought to ask if such data could be captured? Was it that your POS or scale systems could not provide it? Was it that your labor forecasting and scheduling system couldn’t manage it? Was it a conscious decision, or is it one you should revisit? You could spend the same number of hours in a day, but if your business expects a higher level of business in the evening than in the morning, you need to make sure you have better coverage, and a blend of higher-skilled associates, in the evening.

As to data granularity, are you capturing the details at the department level? At the category level? At the UPC level? Is your labor model thoughtfully developed with the right data? Did you inherit what you got and simply accept it, or did you really get what you need to model labor effectively? Are you living with what your information technology team agreed to give you instead of what you really need? Or was your data design dumbed down because of functional limitations in your vendors’ solutions? What is the value of taking a fresh look at the data required to meet your needs today?

Regarding time granularity, are you capturing and using the relevant data on a weekly basis? On a daily basis? At 15-minute intervals? Is the data granular enough for you to provide the best service to your customers, assuming you can find a way to forecast at that time granularity?

Is the work of a Deli Service Clerk the only place where this sort of challenge occurs, where multiple units of measure are required to quantify the work content properly? No. Aside from any other service counters you may have (Meat, Seafood, Bakery, Prepared Foods, Service Center, etc.), the same is true for many other operations.

Cashier workload is best split into express-eligible and non-express-eligible transactions (with department-register and self-checkout volumes excluded). After that, the workload is a combination of customer interaction and item processing. Some of the customer time is direct interaction – meet, greet, and thank time. Other parts of the customer time should be based on the types of tender being processed (e.g., cash, credit with signature, credit without signature, debit, WIC, SNAP, check). The items then drive processing time by the method required (e.g., scan, key entry, weighed and keyed).
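As a rough illustration of how those pieces add up, here is a sketch of one 15-minute interval of non-express cashier workload. Every per-customer, per-tender, and per-item figure is an assumption chosen only to show the build-up, not a measured standard.

```python
# A rough sketch of cashier work content for one 15-minute interval of
# non-express volume. Every per-unit second below is an illustrative assumption.
SEC_PER_CUSTOMER = 20            # meet, greet, and thank time
SEC_PER_TENDER = {               # customer time by tender type
    "cash": 25, "credit_signature": 30, "credit_no_signature": 12,
    "debit": 15, "check": 60, "wic": 40, "snap": 20,
}
SEC_PER_ITEM = {                 # item time by processing method
    "scan": 3, "key_entry": 8, "weighed_and_keyed": 12,
}

def cashier_seconds(customers, tenders, items):
    """Workload seconds built up from customer, tender, and item drivers."""
    total = customers * SEC_PER_CUSTOMER
    total += sum(SEC_PER_TENDER[t] * count for t, count in tenders.items())
    total += sum(SEC_PER_ITEM[m] * count for m, count in items.items())
    return total

# Example interval: 12 customers, a mix of tenders, 280 items.
seconds = cashier_seconds(
    customers=12,
    tenders={"debit": 6, "credit_no_signature": 4, "cash": 2},
    items={"scan": 260, "weighed_and_keyed": 14, "key_entry": 6},
)
print(f"{seconds / 60:.1f} labor minutes")  # about 24 minutes of work content
```

The interval-level forecast then has to supply the customer counts, the tender mix, and the item counts by method, which is exactly the kind of multi-driver data strategy described above.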

So, I hope those examples illustrate that the data strategy you take into your forecasting approach really matters. The better your data and your standards combine to anticipate the work content of your associates’ work plans, the more important accurate forecasting becomes. Just how to measure and improve forecast accuracy will be the topic of our next blog. In the meantime, take a fresh look at your data and consider whether refinements are in order. Consider whether you are capturing all the right units of measure and whether you have them at the right level of granularity.

For forecasting to matter most, there has to be a right time, a best time, for hours to be positioned. Once you have that you can put the right people at the right place at the right time doing the right things!
