On the Threshold: A Framework for Adopting AI (Part 2 of 2)

There’s more than one way to AI.

There is no single method or process for “turning on” AI for your restaurant company, but it certainly isn’t as simple as plugging ChatGPT into your POS and hoping it will start delivering business results.

Instead, the path to leveraging AI for restaurants is more of a journey, one that can feel circuitous at times. It’s about figuring out where there are real opportunities to apply advanced data automations, and then building out the tools, processes, and principles to achieve these objectives, while maintaining a core culture of innovation, curiosity, and even humanity.

Below are the four steps we recommend to any of our clients looking to activate data for their restaurants:

Step 1. Identify your essential business problems

Deciding how to incorporate adaptive decisioning technology is a little like writing a job description, but for a very specific, and very big, job. We’ve all read those job postings where you just can’t be sure what the candidate will be doing.

When you’re thinking about how you want to incorporate AI, start by identifying where in the business there is significant value to be gained if you could make instant, smart decisions, at scale. Where do you see regular inefficiencies in the business because people or processes are making uninformed, ill-timed, or incoherent decisions? And just as importantly, where do those inefficiencies lead to significant value loss?

One of the best areas to look is where short-term results may run counter to long-term business objectives. Humans have a strong bias towards immediate results over delayed outcomes. Data-driven systems, when well designed, can counteract this bias and weigh long-term results in their decisioning models.

For the restaurant industry, there are a lot of these opportunities, particularly in high-volume concepts, including fast casual and quick service models. The greatest challenge in high-volume restaurants is to deliver consistent service at an exceptionally high rate. Automation, when well applied, can shave down costs and inefficiencies at scale, while balancing short-term and long-term objectives.

Step 2. Organize and structure your data for coherence and accessibility

Once you identify the business problems worth solving, the next step is to identify and describe the sources of relevant data that are available within your system, and the specific factors and variables that will inform potential solutions. The good news is that there is exponentially more data than there used to be.

While the explosion in restaurant technology systems has created some challenges for restaurant operators looking to manage their G&A budgets, these systems have also produced a huge array of data, which were never available previously. Whether it’s training compliance, task completion, or clock-in behavior, data that used to be confined to specific functions, or were previously managed by paper and pencil, are now widely available, just an API away.

It’s easy to get lost in the mass of data that can now be accessed, but within the tranches of GUIDs, timestamps, and credential identifiers, there are a number of variables that are rich with meaning and essential insights.

Here are three things to look for in your data:

When something happens

A lot of system data is just timestamps of when various actions are completed by the system or by human operators. These simple fields are the building blocks for things like throughput measurements, sorting and organizing functions, and system update triggers. Just make sure that you have clarity on the function that is being timestamped. And of course, make note of the timezone (there are more than a few stories of systems that go awry because of a timezone misalignment).
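As a small sketch of the timezone pitfall above, here is how normalizing timestamps to UTC keeps elapsed-time math honest. The event names and times are hypothetical, standing in for any two timestamped actions from different systems:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# Two hypothetical events recorded in different local timezones. Comparing
# their naive local times ("11:58" vs. "9:06") would be badly misleading.
order_placed = datetime(2024, 6, 1, 11, 58, tzinfo=ZoneInfo("America/New_York"))
order_ready = datetime(2024, 6, 1, 9, 6, tzinfo=ZoneInfo("America/Los_Angeles"))

# Normalize everything to UTC before computing throughput metrics.
elapsed = order_ready.astimezone(timezone.utc) - order_placed.astimezone(timezone.utc)
print(elapsed.total_seconds() / 60)  # prints 8.0 (minutes from placement to ready)
```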

Who completes an action

For any system interaction that involves a human operator, whether that operator is an employee or a guest, the “who” is a really important factor, and is often used to aggregate actions, activities, and performance analysis around system users. The key is to identify the user data that can bridge systems and maintain a single view of individuals across multiple sources. Some advanced data systems can use programmatic processes to disambiguate and connect user data using a variety of variables across systems. But many simpler systems are limited to either an assumed unique identifier (e.g., user email address) or a manually input identifier (e.g., employee ID). It's important to recognize the requirements, as well as the limitations, of your systems when it comes to getting a full picture of user activity in your overall tech and data ecosystem.
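A minimal sketch of that bridging idea, assuming two hypothetical source systems (a POS and a training platform), matching first on a normalized email and falling back to a manually entered employee ID:

```python
# Hypothetical records from two systems; field names are illustrative only.
pos_users = [
    {"employee_id": "1042", "email": "Ana.Diaz@example.com", "shifts": 12},
    {"employee_id": "1043", "email": None, "shifts": 8},
]
lms_users = [
    {"employee_id": "1042", "email": "ana.diaz@example.com", "courses_done": 5},
    {"employee_id": "1043", "email": None, "courses_done": 2},
]

def match_key(user):
    """Prefer a normalized email; fall back to the manually input employee ID."""
    if user["email"]:
        return ("email", user["email"].strip().lower())
    return ("employee_id", user["employee_id"])

# Fold both sources into a single view of each individual.
merged = {}
for source in (pos_users, lms_users):
    for user in source:
        merged.setdefault(match_key(user), {}).update(user)

for key, profile in merged.items():
    print(key, profile["shifts"], profile["courses_done"])
```

Note that the normalization step (lowercasing, trimming) is doing real work here: without it, “Ana.Diaz@” and “ana.diaz@” would read as two different people.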

What is selected, enacted, or altered

These data relate to the content or definition of an event, including what someone orders, which tasks are completed, or what message was sent. This data can be the hardest to structure or analyze, because it is often less defined and more variable in nature. But it is also the richest in meaning because it provides deep insight into things like product mix, service issues, or engagement details. Categorization and tagging, either in the source system or later in an analytical system, can make these data more meaningful and less unwieldy for analysis, visualization, and automation.

After you’ve identified available data, the key is to bring that data into a single location and structure it so that it is meaningfully connected and summarized. Then the summarized and enriched data can be made available to downstream systems via API or other integration technique and used to automate functions and actions.

Step 3. Establish your system’s optimization principles and “ethics”

When people talk about machine ethics, they aren’t generally speaking about broader questions of morality. It’s a discussion of the specific parameters we set for our machine decisioning, and what our systems optimize for.

These principles are implicit in our human decision-making. But they’re often happening at an individual level. A manager may make a decision to keep someone on the clock a little longer to alleviate the stress on an over-tasked team, though he knows he could wring out a little more efficiency if he cut labor. A cashier may choose to discount an order for a guest because it took a little longer than expected to make, even if that means reducing total sales.

These little ethical dilemmas are everywhere in a business, but the decisions, and often the impact of those decisions, just get lost in the mix, and no one is the wiser.

But when machines start making decisions, and even more importantly, when machines start optimizing their decisions according to programmed priorities, we have to acknowledge and specify operating principles and priorities for our systems, and they may be more complex than we first assume.

Target outcomes have implicit ethical prioritizations.

In the first part of this article, I mentioned a use case related to order routing, and how a number of variables can be balanced to deliver an optimal outcome. However, the key question I didn't address is: what are we optimizing for? Are we seeking to give guests the best possible experience? Are we looking to create manageable working conditions without too many orders going to any one location? Are we seeking to reduce the cost of delivery to the company by routing to a less expensive delivery service, even if that means sacrificing the guest experience? Each of these factors now has to be defined and weighted in the calculus of the machine.
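One way to see how those weights encode a system’s priorities is a simple scoring sketch. The factors, weights, and store data here are all hypothetical:

```python
# The weights ARE the "ethics" of the router: how much guest experience is
# worth relative to kitchen load and delivery cost. Each factor is assumed
# pre-normalized to 0..1, where higher is better.
WEIGHTS = {"guest_experience": 0.5, "kitchen_load": 0.3, "delivery_cost": 0.2}

def score(candidate: dict) -> float:
    return sum(WEIGHTS[k] * candidate[k] for k in WEIGHTS)

candidates = [
    {"name": "Store A", "guest_experience": 0.9, "kitchen_load": 0.4, "delivery_cost": 0.6},
    {"name": "Store B", "guest_experience": 0.6, "kitchen_load": 0.9, "delivery_cost": 0.9},
]
best = max(candidates, key=score)
print(best["name"])  # prints Store B
```

Notice that with these weights the lower-cost, lighter-loaded store wins even though its guest-experience score is worse. Change the weights and the “winner” changes, which is exactly why these priorities have to be made explicit.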

Additionally, the power and risk of these decisions are now magnified. The occasional discount offered by a manager to win back a guest who had to wait too long may now become a programmatic decision to give a discount to anyone who has to wait 10% longer than a quoted time. That could suddenly spike a company’s discount expense. But on the other hand, it could lay a deep, consistent foundation for loyalty and greater guest satisfaction over time. One, or both, of these things could happen. In either case, it will be happening at a much larger scale than a single manager’s decision.

When incorporating adaptive automated decisioning into your company, these decisions are going to be everywhere, and the biggest challenge is not just the technical hurdle of getting the system configured and the data flowing. It becomes necessary to define

  • how you want your business to operate at scale,

  • how you assess and track the impact, and ultimately,

  • how your programmatic decisions will begin to define the kind of business you are, and even what your core values are.

Step 4. Build out functional workflows

Once you’ve established the business priorities, the data foundations, and the optimization principles, it’s time to plug that data into functional workflows.

This is where the rubber meets the road when it comes to data activation. I’ve written previously how dashboards can be the place where data goes to die. The reason for this is that even the most enriched, carefully constructed data insights can end up being meaningless unless they’re actually connected to systems that can act on them.

Workflows can be simple or complex, but the level of complexity isn’t where the core value lies. The key is to identify the relevant data points, and the combinations of those points, where the truths you’ve identified can be applied to generate real value.

For example, if analysis shows that guests who visit your restaurant more than three times in the first three months after their first visit are significantly more likely to become regulars, you can build out a workflow that engages guests and more directly encourages early repeat visits. Optimization models to discover the right methods to drive those repeat visits can add further power to these workflows, while mitigating financial risks, assuming the models are built with both these priorities in mind.
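The repeat-visit rule above could be sketched as a simple workflow trigger. The 90-day window and three-visit threshold mirror the example; the function name and data shapes are hypothetical:

```python
from datetime import date, timedelta

def should_nudge(first_visit: date, visit_dates: list[date], today: date) -> bool:
    """Flag guests still short of three visits within 90 days of their first."""
    window_end = first_visit + timedelta(days=90)
    if today > window_end:
        return False  # window closed; this guest is out of scope for the campaign
    visits_in_window = sum(1 for d in visit_dates if first_visit <= d <= window_end)
    return visits_in_window < 3  # still short of the three-visit threshold

first = date(2024, 1, 10)
visits = [date(2024, 1, 10), date(2024, 2, 2)]
print(should_nudge(first, visits, today=date(2024, 3, 1)))  # True: only 2 visits so far
```

In a real workflow this check would run on a schedule against the consolidated guest data from Step 2, and the “nudge” action (an offer, a message) would itself be measured so the model can learn which interventions actually drive the third visit.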

These workflows can function for a wide range of actions, from guest engagement to employee training to inventory management. The key is that the system is leveraging a key insight available within the data, acting on that insight, and then programmatically evaluating the impact of that action to evolve its actions going forward. This is where the business can continue to refine and produce value from the data.

A Call to Action for the Use of AI

Two weeks ago, Fourtop kicked off its first meeting of an initiative we’re calling the “Restaurant AI Assembly,” an episodic set of development and strategic discussions related to the activation of data and adaptive automated decisioning for restaurants.

One of the key parts of this initiative is to produce guidance on the use of this technology in a way that benefits companies and employees, alike.

In a world of scaled automation, we know that the decisions that we make with this technology can drive massive improvements in business performance, and with that performance, significant benefit for all stakeholders. But we also know that we could be producing rigid, single-track systems, which in their optimization of short-term productivity further depress the employee experience. That’s not just bad for employees, it’s bad for a business in the long-run, as negative employee experiences radiate out into poor guest experience, and ultimately brand degradation.

As restaurants look to start harnessing the massive potential of rich, instantaneous data to make decisions and automate processes, we encourage them to take a full look at the potential of these systems to create a better version of our industry, which provides rich work experiences, extraordinary guest experiences, and positive business outcomes.
