Buy vs. Build: It’s Not Whether, but Where to Invest in Data
The ROI for data infrastructure and activation is clear, but you may still have questions about the right moves to take you there.
More tech has meant more guests, more orders, more income. It also has meant more discrepancies, more operational woes, and more costs. How do we tip the scale so we are better off?
Restaurant staffing and stocking are often a guessing game against volatile sales numbers and online market dynamics that can shift rapidly and seemingly out of nowhere, with only the dings from third-party tablets warning of an impending tsunami of orders. Back-office and accounting teams work overtime to reconcile wildly divergent sales numbers as the realities of fees, chargebacks, cancellations, and other net-sales impacts make their way through the layers of reporting.
Despite technology’s promise of simplicity, the reality of operating a restaurant has only become more complex, as the orchestration, analysis, and management of the data generated by these tools has become essential to driving sales and maintaining profit margins. When brands come to realize that growth and efficiency come down to how they leverage the information gathered by their technologies, it inspires a new way of thinking about how those tools fit within a restaurant, as well as a desire for a higher level of visibility and control.
The most tech-forward and strategically advanced brands have been following a logical path back from the question of how they become more aware of, and responsive to, the data that is everywhere in their companies. They understand that…
to act effectively on data, they need their systems to interoperate,
which means…
they must have normalized data they can trust flowing across their ecosystem,
which means…
they first must be able to clean and centralize data in a way that is accessible, verifiable, and stable.
After countless conversations with restaurant executives, operators, and technologists, we recognize a pattern of thinking that leads from one solution to another, in search of an answer for how to effectively bring together, and act on, all of a restaurant’s data.
Should we leverage our point-of-sale as a data warehouse?
The point of sale is a compelling option when considering how to bring data together. These systems historically have originated the vast majority of sales, they house menu details, and often serve as the time clock for employees, making the POS the source of truth on labor. Given the critical mass of data generated, the idea that the POS would be the best solution is understandable.
However, the very things that make POS systems optimal as a source of data make them problematic as a repository for bringing data together from other systems. POS systems must maintain a fairly rigid internal structure and data model to ensure operational stability. And as restaurants pull in more and more data from systems with alternative formats, variables, and overarching data models, a POS system can’t—and shouldn’t—adapt or adjust to incorporate these new sources.
As the most powerful technology tool in a restaurant, the POS definitely is not dead, but also, it definitely is not your data warehouse.
Is it best to centralize data in back office and accounting tools?
Of course it’s smart to consider next the systems already in use to reconcile data and provide business reporting. Back office and accounting tools give restaurants a vital source of information; why shouldn’t they also serve as the data hub?
Because these systems are more cul-de-sac than junction. Their purpose is often to be the final check before data is canonized in business records for shareholders and auditors. The very nature of their function makes them suboptimal as centers of constantly moving and changing data. Moreover, most of these systems can pull data IN from various sources, but their capability to push data OUT is limited. They’re a one-way tool. A good and necessary tool, but not a hub.
What about a CDP?
The CDP (Customer Data Platform) is less of a defined technical product, and more of a range of tools and configurations that have become more common, particularly for marketing and customer service functions. Good CDPs can be extremely valuable and give brands a lot of power and capability when it comes to understanding and engaging with guests.
However, the challenge with CDPs is their specific focus on customers and customer activity data. They orient both normalization and data analysis around the guest relationship. While that relationship is an essential part of the restaurant business model, there are real limitations to viewing the restaurant solely through the lens of the customer.
We’ve seen brands effectively build a customer data infrastructure that drives real marketing value, only to have the rest of the organization feel that the tail is wagging the dog. Marketing efforts drive significant demand shifts, while a lack of orchestration across other systems and tools leaves Operations unprepared on staffing and inventory. Ultimately, the hit to operational efficiency undercuts both the net revenue and the guest experience that the marketing initiatives seek to deliver.
A restaurant’s data infrastructure must be more comprehensive than a CDP. In truth, it should be holistic.
So, I guess we need to build our own custom data infrastructure.
This is where so many data journeys end up. Having exhausted the other options, brands are driven by the comprehensive, and often idiosyncratic, nature of holistic restaurant data to start down the path of a custom build. And historically, this has been the only viable option for achieving the target objectives.
The opportunities are tremendous, both for improved top-line and operational performance, as well as long-term overhead cost reduction. However, the up-front investment is significant, and once made, represents a long-term commitment to ongoing maintenance and future investment as technologies and methodologies continue to evolve.
We recently spoke with the CEO of a major enterprise QSR brand, who led the charge on building a custom infrastructure to centralize all of their data. The up-front investment for initial development work represented nearly 10% of the company’s overhead spend for the year, and the ongoing costs of maintaining and activating the system account for around 5% of the annual planned overhead spend. This is before factoring in the future large investments that will be needed whenever major shifts happen in component system technologies and providers.
Now several years into the investment, the benefits are myriad for this brand. Greater marketing efficacy is yielding top-line growth. Multiple efficiency drivers in staffing and supply chain leverage are providing greater control over prime costs. Even strategic activations of AI tools have been adding value in core performance areas, like drive-thru and online order handling.
The investment has proven worth it, so far. But the risk was significant, and the time-to-value has been lengthy and required trust and significant buy-in from shareholders, franchisees, and ops leadership.
Is a custom data infrastructure worth the risk? Is there any other way to activate comprehensive and trustworthy first-party data?
So glad you asked.
The common perception is that custom development will yield greater capabilities and tailored functionality that pre-built systems can’t provide. But when it comes to data infrastructure, this is generally a mistaken view, and a high-cost mistake at that.
Data infrastructure is differentiated by the degree to which it can process and structure a wide range of data, from a large number of systems, into a predictable and broadly applicable resource. Custom data development, while seeming to promise greater flexibility, can lead to missed opportunities in model design and analytical processing. It also forgoes the savings that come when data system companies amortize the cost of their builds across many users, all while refining models based on best practices.
This is the case we make for considering a data infrastructure like Fourtop. Both in terms of the time-to-value and in total investment, you will reap the same rewards of bringing all of your data together into one place, but with significantly less risk and internal effort.
Here are three key benefits of going with a Buy, rather than a Build approach:
1. Leverage existing data models.
Restaurant data models do not vary significantly from each other. There are unique traits that make every successful restaurant special, but these aren’t generally found in the shape of their data. Even certain specializations in service delivery or marketing tactics require only a small number of customizations in the underlying data models.
Yet, when a restaurant company decides to set up its own infrastructure, it has to make a huge number of decisions about its own data models — for example, how to structure menu details within unified order objects, how to connect customer behavioral arrays to the overarching identity models, or how best to relate employee data to customer data when that person is one and the same. Smart decisions in these areas will yield a much more powerful infrastructure, but they aren’t, in themselves, strategically differentiating. At the same time, poor decisions can significantly hamstring a system and require massive additional investment to correct later.
Needless to say, it’s far more efficient and less risky to leverage standard restaurant data models, where most of these decisions have already been optimized, as long as you retain the ability to make small customizations when needed.
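To make those schema decisions concrete, here is a minimal sketch of what a unified order object might look like. All names and fields here are hypothetical illustrations, not Fourtop’s actual model: it nests menu detail inside the order and links the order to a single identity record that can represent a guest, an employee, or both.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical sketch of three common data-model decisions:
# nesting line-item detail inside a unified order object,
# pointing orders at a single identity record, and letting
# one identity be both a guest and an employee.

@dataclass
class OrderLine:
    menu_item_id: str       # stable ID, not the display name, so renames don't break history
    name: str
    quantity: int
    unit_price_cents: int   # integer cents avoid floating-point rounding in totals
    modifiers: list[str] = field(default_factory=list)

@dataclass
class Identity:
    identity_id: str
    is_guest: bool = True
    is_employee: bool = False  # one record can be both, avoiding duplicate profiles

@dataclass
class UnifiedOrder:
    order_id: str
    source_system: str         # e.g. "pos" or "third_party_delivery"
    identity: Optional[Identity]
    lines: list[OrderLine]

    def total_cents(self) -> int:
        # Total is derived from line items, never stored redundantly.
        return sum(l.quantity * l.unit_price_cents for l in self.lines)

order = UnifiedOrder(
    order_id="o-1001",
    source_system="pos",
    identity=Identity(identity_id="p-7", is_guest=True, is_employee=True),
    lines=[OrderLine("m-42", "Burger", 2, 899)],
)
print(order.total_cents())  # 1798
```

None of these choices is exotic, which is exactly the point: they’ve been made, and debugged, many times before.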
2. Outsource the management of an ever-shifting ecosystem.
Fifteen years ago, companies were coding their own video players to display content on websites. They eventually realized that YouTube or Vimeo could provide the same functionality, with the added benefit of handling all the technical details: correct playback across different devices, browsers, and screen ratios, plus buffering.
Given the wide array of restaurant tools and software, custom-built integrations have similarly reached their inflection point. There’s no real advantage to building your own, and if you do, you’ll end up monitoring countless changing integrations and API connections that have nothing to do with your core business.
Outsourcing management and monitoring of integrations to an external resource specifically dedicated to this function reduces risk and keeps your organization focused on differentiating itself by how you act on the data – not on keeping the pipes clean.
3. Scale your investment as you derive value.
Just getting to the starting gate when pursuing a custom data ecosystem can require hundreds of thousands of dollars (if not millions) when you consider consulting support, engineering costs, integrations, and back-end infrastructure setup. And many traditional data warehouse builds require that the system be fully architected before work can be initiated, so even incremental value is a challenge to realize. Bottom line, there’s a long wait before the system starts to pay for itself.
By deploying an existing system that can be rolled out over time, it’s possible to align the next portion of the investment with the particular business outcome you’re seeking to achieve. Then, only after the system is adopted by your team and delivering value, do you move on to the next phase of a deployment.
Incrementally paying for the system as it drives material value can not only put holistic data infrastructure within reach, but make it a priority across the organization.
Sounds reasonable. Can you share some numbers?
Sure. At a high level, in an analytical model we built for a 100-unit QSR brand with 10 source systems, the pre-fabricated data system produced nearly 3X the net value of a custom-built system, and that assumes the same all-in three-year cost for both approaches (unlikely, given the greater setup and deployment efficiency of a prefabricated system). This equates to more than $5 million in incremental, bottom-line benefit to the brand.
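The stated figures imply the underlying three-year net values. This is illustrative back-of-envelope arithmetic derived only from the numbers in the text, not the model’s actual inputs:

```python
# Back-calculating net values from the two stated claims:
# (1) prefab net value is roughly 3X custom net value, and
# (2) the difference exceeds $5M, at the same all-in cost.
incremental_benefit = 5_000_000   # "more than $5 million" difference
multiple = 3                      # prefab ~ 3X custom

# If prefab = multiple * custom and prefab - custom = incremental_benefit:
custom_net_value = incremental_benefit / (multiple - 1)
prefab_net_value = multiple * custom_net_value

print(custom_net_value)   # 2500000.0
print(prefab_net_value)   # 7500000.0
```

In other words, the claims are mutually consistent with roughly $2.5M of net value from a custom build versus $7.5M from the pre-fabricated approach over three years.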
A Value-Driven Data Investment Model
Costs for a custom build may seem to level out after year one, but to maintain and stay up to date with industry changes, it will require significant reinvestment year after year.
A pre-fabricated system will both generate value and reach breakeven about 10 months before a custom build will hit each of those benchmarks.
For underlying data and assumptions, see here.
Now that we know where to invest, any advice on how or when?
Make data part of your annual budgeting process.
An all-too-common issue is that investment in data infrastructure is considered only after challenges arise with the launch of other tools or technology initiatives. Tech providers can offer strong internal functionality, but the hurdle of getting to clean data can be nearly insurmountable, significantly delaying projects and increasing costs.
While it’s easy to say that it’s never too soon to start planning for data infrastructure investments, there are certain times when they are more likely to yield immediate value. During annual planning, if you are considering one of the following scenarios, it’s advisable first to consider an investment in your data infrastructure.
Before a Major Growth Phase
Expecting to open a number of new locations or to significantly expand franchising? Growth is essential for QSR brands, but significant growth adds a range of complexities, including more stakeholders, more management systems, and larger volumes to analyze and manage. Investing in data prior to a period of growth can smooth out many issues and provide a strong platform for greater scalability.
Before a Significant Shift in Service Offerings
We often hear brands say they want to first launch a new loyalty program or service delivery offering before they invest in their data. But actually, this can lead to significant limitations in the way the system is built and integrated, and may force future re-work as the data generated from these new systems will need to be adjusted to function with the rest of the business.
After a Merger or Acquisition
Bringing multiple brands’ worth of data together into a single location can be a massively complex and challenging task, especially because there’s never a good time to pause operations to get everything aligned. Planning for significant data investment around a merger or acquisition can make for a far smoother transition and can help the new entity derive value from the aggregated resources far faster.
And if we have more questions?
Ultimately, generating and making use of high quality first-party data is an ongoing effort. But with the right tools and partners, along with a strategy that is laser-focused on driving measurable value through specific data-driven initiatives and target outcomes, it’s possible to fundamentally alter the current and future potential of a company.
Fourtop provides open stack technology solutions for the hospitality industry. When companies implement our data tools, they are able to optimize their tech ecosystem, and unlock the ability to deploy data-activated automations that streamline operations, improve morale and amplify productivity. Delivered with knowledge sharing, tech support, and enterprise data security, Fourtop clients make the most of their tech, without reservation.