Restaurant Profit Margins: The Hard Truths of Inventory

The hard part of writing is writing. You have the topic and points in your mind, you mull it over a few times, sleep on it a few more days, and yet hardly a word gets written down. Well, blame it on indolence. Then one fine moment, your brain berates you for the laziness and you get to it all of a sudden, with half-collated thoughts and an unclear mind. But the beauty is, once you start, it comes along okay and you are in the zone; you can feel it.

Get the foundation right: you can’t do calculus if you don’t know basic maths

I have written a few monologues on restaurant inventory management and have come across many different use cases and challenges over the last few years. Conceptually, restaurant inventory is very simple, yet I am inclined to take a different path and also explain why it can be hard and what some tough-to-solve use cases are. First, the fundamentals: why we need it, and what’s at the heart of it.

Even today, inventory management for most is recording procurement, placing orders, taking inventory counts and other non-insightful, mundane chores. Food costing is simply purchases divided by sales, 4th-grade math. The more nuanced operators adjust the numerator with opening and closing stock values (consumption = opening stock + purchases - closing stock). Now there are three different sets of users.

  • No tracking – doesn’t matter
  • Excel – it’s the 8th wonder of the world
  • Software – collect data 
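The food cost calculation described above can be sketched in a few lines; the figures here are illustrative, not from any real outlet:

```python
# Food cost %: consumption (COGS) over sales for the same period.
# The "nuanced" version adjusts purchases with opening/closing stock.
purchases = 30000        # purchases during the period
opening_stock = 8000     # stock value at period start
closing_stock = 6000     # stock value at period end
sales = 100000           # net sales for the same period

cogs = opening_stock + purchases - closing_stock   # what was consumed, not just bought
food_cost_pct = cogs / sales * 100
print(f"COGS: {cogs}, food cost: {food_cost_pct:.1f}%")
```

Note that using raw purchases instead of consumption would give 30% here instead of 32%; the gap is exactly the stock drawdown.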

Then there’s either a finance person or a controller who summarises all of this for upper management. You will need a monthly P/L, with COGS, non-COGS and other operational expenses, to arrive at EBITDA. Pretty simple so far. If this were the only requirement, any restaurant management software would do. Unfortunately, this isn’t enough to be successful.

When I joined a food startup and was put in charge of operations and outlet-level P/L, our food cost was 55-60%. It took me just shy of two months to bring all outlets to 28-31%. I learned that there are two elements crucial to costing, improved profitability and better business decisions.

  1. Tracking variance
  2. Menu engineering 

And to do both, you need ‘recipes’. You need to know how to cost your entire menu, then fix your selling price. In this industry, people will tell you to price at 3x, 3.5x, 4x, etc., i.e. your food cost should be in the 25-35% range. Again, it depends; generalisation doesn’t apply. Among other things, it will depend on the city, location, type of cuisine, competition, the purchasing power of the neighbourhood, etc.
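The pricing multiple and the food cost percentage are two views of the same ratio (food cost % = 100 / multiple). A quick sketch, with a hypothetical plate cost:

```python
# How the common "price it at 3x / 3.5x / 4x" advice maps to food cost %.
plate_cost = 120  # cost to make the dish (hypothetical)

for multiple in (3, 3.5, 4):
    selling_price = plate_cost * multiple
    food_cost_pct = plate_cost / selling_price * 100
    print(f"{multiple}x -> price {selling_price:.0f}, food cost {food_cost_pct:.1f}%")
```

So 3x gives ~33.3% food cost and 4x gives 25%, which is where the 25-35% rule of thumb comes from.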

Below is a fictitious income/expense statement for a restaurant, for every 100 earned. I have taken a scenario of 15% EBIT. Most of your expenses under OpEx are fixed; labour can vary if you engage contractors at hourly rates, but that mostly applies to developed countries. The variable expense item, the one directly proportional to sales, will always be COGS: what did it cost you to make a sale of 100? Typically, businesses know what the ‘fixed’ expenses are, so you fit them in as per actuals. Then decide what EBIT you want to make, go back to COGS, and see what leverage you have to make the kind of profit you intend to. Then cost your menu (& mix) accordingly.
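Working backwards from a target EBIT is then a one-line subtraction per 100 of sales; the OpEx figure below is an assumed illustrative split, not from the statement above:

```python
# Per 100 of sales: fixed OpEx is known, target EBIT is chosen, so the COGS
# budget is whatever is left. Cost the menu (& mix) to hit this number.
sales = 100.0
opex_fixed = 55.0    # rent, labour, utilities, etc. (assumed fixed here)
target_ebit = 15.0

cogs_budget = sales - opex_fixed - target_ebit
print(f"COGS budget: {cogs_budget:.0f}% of sales")
```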

Note: For the sake of simplicity, I have ignored Depreciation/Amortization and other non-operational income/expenses like interest.

Scenario: Base plan vs higher rent and wages

In the second case, you have headroom of 30% for all purchases, and 28% for COGS expenses. Here, non-COGS purchases are the non-food categories (packaging, stationery, housekeeping, gas, water, etc.). Different companies may track these under different subheadings. As these are regular purchases, I prefer to track them under a Non-F&B SKU purchases head. One of the main reasons is that some of these expense subcategories still correlate directly with your sales: for example, more deliveries, more packing materials; more dishes made, more gas.

The Cheesy Stuff:

One of the companies I have been tracking over the last few months is The Cheesecake Factory. Below is an excerpt from their annual filings. Being an upscale casual dining setup, their labour cost is high but within the expected range; but boy, look at the COGS, a nice 22.6%. The average income per outlet is $10.7 million a year. Sales increased by 6% and 3% over the last two years, respectively.

Just to break this down for easy consumption. 

Below are excerpts from their filings. 

“Average check is driven by menu price increases and/or changes in menu mix. We generally update The Cheesecake Factory restaurant menus twice a year, and our philosophy is to use price increases to help offset key operating cost increases in a manner that balances protecting both our margins and customer traffic levels. We plan to continue targeting menu price increases of approximately 2% to 3% annually, utilizing a market-based strategy to help mitigate cost pressure in higher-wage geographies, and expect near-term increases to be at the higher end of this range.

We leverage our recipe viewer system to ensure timely and accurate recipe updates, and to provide instructional media content and detailed procedures enabling our staff to consistently prepare our highly-complex, diverse menu across all locations.”

Though the COGS numbers look cute and similar, had the 2017 COGS% held in 2019, they would have spent USD 8.7 million less, or USD 5.4 million less at the 2018 COGS%. I am talking about percentage-point differences of just 0.35 and 0.23, respectively. Do the savings look big now? Enough incentive? On average they spend $1.7 million to open a new outlet, so with such additional savings they could open 4-5 new outlets.
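The dollar figures follow directly from the percentage-point gaps; a back-of-envelope check, with revenue inferred from the article's own numbers rather than taken from the filings:

```python
# Small COGS% gaps times big revenue equals real money.
savings = 8.7e6            # extra spend vs holding the 2017 COGS% (from the text)
gap_pp = 0.35              # percentage-point gap (from the text)
revenue = savings / (gap_pp / 100)   # implied annual revenue, roughly $2.5B

outlet_capex = 1.7e6       # average cost to open a new outlet (from the text)
new_outlets = savings / outlet_capex
print(f"Implied revenue ~${revenue/1e9:.2f}B; the savings fund ~{new_outlets:.1f} outlets")
```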

Just to give a glimpse of how P/L expense heads can vary across segments, here’s the data for KFC (company sales, not franchise). Note that labour cost is much lower than food cost for KFC, whereas for Pizza Hut it’s the reverse.

*Amount in millions USD

Also, Pizza Hut and Taco Bell.  

Ajay, CEO of Windmills Craftworks, says, “If you snooze, you lose”. For him to retain his margins, his food cost needs to be ~25-26%. In September 2019 I received a call from him, and he sounded nervous, frustrated and a bit furious. His COGS, which had been within the expected range, had suddenly shot up that month, and it had to be fixed. I went to his office the next day and sat in the conference room with finance, purchase and the chefs all in one place. I could sense that everybody’s ass was on fire; there was palpable tension, and it felt like it was going to be a war. We went through all the data points: top variances, wastage, prep patterns, recipe costing, ingredient changes, etc. They were able to identify and act on the issues, getting back to expected numbers and better margins.

In most setups, meat and dairy will contribute the highest consumption and the biggest variances. Shashi, a key person in the organisation, is someone I am personally impressed with. Right from the early days of the EagleOwl implementation, he has tracked meat variance on a daily basis and knows his numbers extremely well. He can reel off the number of units sold and the quantity in each portion for all meat-based menu items across all outlets in a jiffy.

To keep control of COGS, what one needs to do is quite simple; it is actually 4th-grade math. Assume a restaurant sells only coffee, each cup containing 100 ml of milk and 20 ml of decoction. If they sell 100 cups, expected consumption is 10 L of milk and 2 L of decoction, respectively. If you have consumed anything more than this, you need to know why, because it will hurt your bottom line. Though the math remains the same, it becomes tedious and complicated when you have hundreds of menu items, yields, ingredient changes, price fluctuations, wastage, central production, transfers, back-dated edits, etc. It is almost impossible to scale if you don’t have good control over this.
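The coffee example above, written out as a variance check; the actual usage figures are illustrative stand-ins for what a stock count would give you:

```python
# Expected consumption from the recipe vs actual usage from stock counts.
recipe_ml_per_cup = {"milk": 100, "decoction": 20}
cups_sold = 100
actual_litres = {"milk": 11.2, "decoction": 2.0}   # hypothetical counted usage

for item, ml in recipe_ml_per_cup.items():
    expected = ml * cups_sold / 1000               # litres
    variance = actual_litres[item] - expected      # positive = over-consumption
    print(f"{item}: expected {expected:.1f} L, actual {actual_litres[item]:.1f} L, "
          f"variance {variance:+.1f} L")
```

Here the 1.2 L of extra milk is exactly the kind of number that needs a "why" before it hurts the bottom line.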

The devil’s advocate:

Earlier I mentioned that we would also cover some of the harder sides of inventory; let us go over the topic of yield. I must, however, humbly declare that no software in this space, including ours, can be a hundred per cent accurate; it is awfully difficult in this industry.

Let us take yield. Yield, expressed as a percentage, is the edible part of an ingredient, and it is not fixed for natural products like vegetables, fruits, meat, etc. A fixed yield works for Del Monte’s canned pineapple slices, where the usable part (drained weight) is clearly defined by the manufacturer, or for frozen basa, where the glazing factor is labelled. But for a natural product it is impossible to have a standard yield, yet recipe costing needs to account for the yielded price and not the raw quantity. The yield will also vary with the season. Different approaches can be taken to handle this: some software allows you to define the ingredient yield per recipe, some choose to have a common yield. In either case, since the yield keeps changing, so should your recipe cost, well, technically, but it isn’t really practical to do this every time. It is advisable to enter the average yield, i.e. measure the yield for an item a few times over and take the average.

One grave mistake many make in recipe costing is not accounting for yield loss. That’s a bummer; the average yield, even if not completely accurate, is closer to reality, so pick the lower end of the yield range for costing.

Let’s take the example of frozen basa fish costing INR 300/kg. Assuming an average yield of 80%, the effective price per kg is INR 375. Now, for a 200 g portion, if you don’t take yield into consideration the cost would be INR 60 as opposed to INR 75, an error of 20%. This results in inaccurate costing and a wrong selling price; that’s money down the drain.
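The basa numbers, worked through; the 200 g portion size is the one implied by the INR 60 vs INR 75 figures:

```python
# Yield-adjusted costing: effective price = raw price / yield.
raw_price_per_kg = 300.0   # INR, as purchased
yield_factor = 0.80        # 80% of the raw weight is usable
portion_kg = 0.200         # 200 g portion

effective_price = raw_price_per_kg / yield_factor      # INR 375/kg
cost_without_yield = portion_kg * raw_price_per_kg     # the naive (wrong) cost
cost_with_yield = portion_kg * effective_price         # the real cost
error_pct = (cost_with_yield - cost_without_yield) / cost_with_yield * 100
print(f"Effective price {effective_price:.0f}/kg; portion cost {cost_with_yield:.0f} "
      f"vs naive {cost_without_yield:.0f} ({error_pct:.0f}% understated)")
```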

Source: Delmonte label – google

Apart from costing, yield also plays an important part in forecasting purchases, issuing from the store, and generating variance reports. If you have average sales over the last 3 months, a system can easily predict the expected purchases based on recipes. In the Fish Tikka example above, to sell 100 portions in a month, each containing 200 grams, you need to purchase 25 kg and not 20 kg, since the yield is 80%: we multiply 200 g * 100 = 20 kg, then divide by 0.8 to arrive at 25 kg. Similarly, while generating variance reports over any given period, the estimated consumption needs to be converted to pre-yield quantity, i.e. normalised, since the other parameters such as purchases and opening/closing stock are ‘as-purchased’ quantities in pre-yield form. Again, if someone records physical stock post-thawing, or adds up unopened packet quantities with thawed quantities, it screws up the math. These are real challenges, solvable yet quite tricky and very time-consuming. Going from 98% to 99% accuracy might itself take five times the effort of going from 0 to the 90s.
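The purchase forecast above is the same yield division in the other direction, normalising recipe quantity back to as-purchased quantity:

```python
# Forecast as-purchased quantity from recipe (post-yield) quantity.
portions = 100
portion_kg = 0.200     # 200 g of yielded fish per portion
yield_factor = 0.80

recipe_qty = portions * portion_kg            # 20 kg needed post-yield
purchase_qty = recipe_qty / yield_factor      # 25 kg to actually buy
print(f"Need {purchase_qty:.0f} kg as-purchased for {recipe_qty:.0f} kg of yielded fish")
```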

Another issue is recipe cost fluctuation, which depends on SKU price fluctuations and yield changes. In most systems, the recipe cost will fluctuate with SKU price changes, and additionally if you update the yield factor. These changes happen at a random frequency. When we do menu engineering analysis or expected COGS calculation, which recipe cost should we take for a period during which fluctuations are numerous? Arguably, there isn’t a right answer to this, nor is it an easy problem to solve, and getting to that accuracy is frightfully difficult. An easy way is to pick the latest recipe price, but that misses all the past changes (leading to a variance report with lower accuracy). Also, what if ingredients were changed mid-month, or the quantities of some ingredients were changed? This becomes a nightmare to solve with technology unless we get into recipe versioning, which means recording and storing all changes to recipes whenever they occur. The cost taken can simply be an average, the latest price, or just the average of the opening and closing prices, but there isn’t a foolproof solution to these types of issues. And it is too costly an effort to solve for such use cases, as typically a recipe is frozen in terms of ingredients and quantities.
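The three pragmatic choices mentioned above, side by side; the cost series is purely illustrative:

```python
# Recipe cost over a month with several fluctuations: which single number to use?
period_costs = [74.0, 78.5, 76.0, 81.0]   # recipe cost at each change (illustrative)

latest = period_costs[-1]                              # ignores all past changes
open_close_avg = (period_costs[0] + period_costs[-1]) / 2   # ignores mid-period swings
simple_avg = sum(period_costs) / len(period_costs)     # weights every change equally
print(f"latest {latest}, open/close avg {open_close_avg}, simple avg {simple_avg:.3f}")
```

None of the three is foolproof; a duration-weighted average would be closer still, but only recipe versioning makes that possible.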

There are many more difficult-to-solve use cases in this industry. I tell our clients that we can get to ~98% accuracy, not 100%, and most accept it. I will try to cover some more nuances in my next monologue.

Let me know what you think and what else you would like us to cover, by dropping a comment. 
