
Being Data-Driven: A Discussion with Founder, Brian Reece

Updated: Nov 1, 2023

Brian Reece is one half of the pair of long-time professional partners who founded Service Physics in 2019, having decided to take a highly successful training campaign “pro” and grow it into a full-time consultancy with a singular mission: solve the myriad problems that have long plagued the service industry. We recently spoke with Brian to learn more about his views on using analytical data to make informed decisions, and about the greater Service Physics ethos.




What does it mean to you for a company to be “data-driven”?


That’s a big question. There’s what companies do and say that they think makes them data-driven, and then there’s what I think a truly data-driven process should be. A lot of companies will set a numerical target, hand it to their employees, and say, “go achieve this target” - and think that’s being data-driven.

In my experience, corporate targets are often driven by a finance team that is more or less saying, “Hey, next year, we need to do 10% better.” What does that really mean for your particular business unit or your specific area? Say the Northeast region of X retailer is told, “you have to grow revenue by 20%.” Okay, great - but what’s that based on? And how am I supposed to do that? “Well, I don’t know. Go figure it out.” That’s the worst-case scenario.

The physics in Service Physics is intentional: it’s meant to convey that we seek to understand how the world actually works, and that we can help you configure your business so that it actually delivers the value you want it to deliver, versus just hoping for a better outcome or relying on people to continually work harder.


What is the Service Physics method of utilizing data in a more meaningful way?


There are two ways that we use data. The first is exploratory: let’s go and see what opportunity exists out there and what insights we can glean. Let’s look at the current state of things, try to understand what’s happening today, and pinpoint opportunities for improvement or growth. Then we take those findings and say, “What would it be worth for us to solve this problem, or to take advantage of that opportunity?” So it’s a bit more of a bottom-up approach to business. For example, the other day we found a $100 million opportunity for a company just by rearranging some of the work they do in their kitchen. There’s no major technology investment needed, you don’t have to ask those people to work any harder (in fact, it’ll probably be easier), and the company can grow by $100 million by taking advantage of an opportunity we discovered simply by understanding what’s happening today and looking for opportunity within the data. That’s one way we use data, from an exploratory standpoint.


The second way we use data is related to embedding an actual “data-driven” business process. The key to any growth (business, personal, or otherwise) is learning - and learning comes from understanding cause and effect (physics). One example of a data-driven learning process that can be embedded in any organization is the problem-solving process known as PDSA (Plan-Do-Study-Act).


The Plan element should include a goal: a numerical statement of what we wish the effort to achieve. The plan to achieve that goal should demonstrate an understanding of the physics involved: “If I do this, then I expect that to happen as a result.” Then we need to Do the plan all the way through to obtaining a result (unless there is a safety or quality problem). Next, we need to Study the result - a step that is very often skipped - to see if we achieved the original numerical goal. That study effort constitutes the embedded learning. Finally, we Act on what we learned, which is where organizations often fail to close the loop on new initiatives: either continuing through another PDSA cycle, or moving on to the next most important problem to solve.


World-class organizations are very disciplined in closing the loop and actually moving through not only the full PDSA cycle, but several iterations of it, until the problem is solved in a sustainable way.
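To make the cycle concrete, here is a minimal sketch of one PDSA iteration in Python. Everything in it - the kitchen scenario, the numbers, and the names - is a hypothetical illustration, not Service Physics tooling.

```python
# A minimal, hypothetical sketch of one PDSA (Plan-Do-Study-Act) iteration.
# The scenario, numbers, and names are invented for illustration only.
from dataclasses import dataclass


@dataclass
class Plan:
    hypothesis: str   # "If I do this, then I expect that to happen as a result."
    metric: str       # what we will measure
    baseline: float   # where we are today
    goal: float       # numerical statement of what the effort should achieve


def do(plan: Plan) -> float:
    """Run the plan all the way through to a result.

    In a real cycle this is weeks of changed work on the floor; here we
    pretend the change lifted the metric by a made-up 8%.
    """
    return plan.baseline * 1.08


def study(plan: Plan, result: float) -> bool:
    """Compare the result to the numerical goal - the step most often skipped."""
    achieved = result >= plan.goal
    print(f"{plan.metric}: goal={plan.goal}, observed={result:.1f} -> "
          f"hypothesis {'supported' if achieved else 'not supported'}")
    return achieved


def act(achieved: bool) -> str:
    """Close the loop: standardize the change, or iterate with a revised plan."""
    if achieved:
        return "standardize the change and move to the next problem"
    return "revise the plan and run another PDSA cycle"


plan = Plan(
    hypothesis="Rearranging kitchen prep stations will raise orders per hour",
    metric="orders per hour",
    baseline=50.0,
    goal=55.0,
)
print(act(study(plan, do(plan))))
```

The point of the sketch is the Study step: the result is compared against the numerical goal stated in the Plan, so the organization learns something whether or not the hypothesis holds.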


What makes Service Physics different from other consultants?


The first inclination for many organizations when problem solving is to implement technology as a silver-bullet “solution.” That same client I referenced earlier, where we found the $100 million opportunity, hired one of the big consultancies to come in and help them with the same problem. That consultancy came in and said, “You’ve got to implement this technology solution.” Now they’re of course spending millions of dollars rolling out that technology, but without an understanding of the physics involved - of why the technology will solve their problem. We went in and actually observed this new “solution”; it was not going to deliver a significant opportunity and was actually degrading the quality of their product. So our approach is “minds before wallets”: let’s come in and just understand what opportunity exists within the current system before we go and add technology or tools or other things to it.

My hypothesis - and I don’t necessarily know this to be true - is that Company X asks Consultant Y to come in and look at their business. Consultant Y looks at all the finance numbers and says, “Oh, you know, per our benchmarks for this other client of ours, if you did this thing you could grow by 10%.” I think that’s how a lot of consultants work: they look at existing, readily available data, make a bunch of assumptions, give a recommendation for technology (because everybody believes technology can make their lives better), and then they leave. It’s like, “here’s your work to go do,” and off they go. So they have no accountability to actually see their work through, or to know whether their recommendation was effective and impactful. No one is learning in this process, and there’s certainly no physics involved.

That’s a service-level difference between Service Physics and many other consultancies. We stick around. And we don’t just come in and look at existing data; we go and collect it ourselves, and we teach the client teams how to collect it and to see what’s actually happening. We don’t rely on what’s easy to access in terms of database numbers or finance projections; we’ll actually go and collect our own data that helps us understand the physics of what’s happening within the business, derive our own perspective, paint a picture of the business in numbers, and use that to create a real recommendation that’s rooted in reality.


So to paraphrase: Service Physics, in contrast to other consultants, generates its own metrics; it does deep research on how a business is functioning and produces recommendations based on that deeper research.


That’s right. And so, again - back in hypothesis land - Consultant Y comes in and says, “Give us your point-of-sale data, give us your financial P&L from the last three years, give us this or that data you already have.” They take that data, layer a bunch of assumptions onto it based on a book they read or some market study someone else did, and say, “Here’s what’s possible, based on all this research we did” - research that was basically generated while sitting at a desk rather than on the front lines where value is being delivered to customers. Service Physics goes into the operation; we put our eyes on what’s really happening, we talk to customers, we take operational data, and then we synthesize all of that into a clearer picture of the opportunity that exists and the physics of how value is being created (or not!). And that’s being data-driven, whereas the other approach is more like being data-validated: using existing data to “validate” their assumptions or whatever they’re recommending. That is, in my opinion, extremely dishonest.


What’s the difference between analytics and reporting?


Reporting is looking at what happened; analytics is looking at the data in different ways to uncover opportunity or to figure out why something happened - to get some understanding of what the drivers were that produced that result. Say we’re looking at a dashboard every Monday, and we see that the number of customers went from 10 to 12 - a 20% increase. Hurray! But why? If you can’t answer that question, you need to do some analytics to learn the physics involved.

In that way, reporting is perhaps more like studying history, whereas analytics might be more like studying anthropology: understanding the drivers behind the outcomes - which then may provide an ability to forecast.

I like this analogy. Because, sure - you can understand that there was a stock market crash in 1929. And you can look at that on the surface and say, “Well, the crash before that was 100 years back, the one before that was 120 years back, and the most recent was 75 years ago - I’ll take the average of those and say there’s a stock market crash every X number of years.” And I’ll take that information and call it data. However, that has nothing to do with why that stock market crash happened.


The PDSA cycle naturally provides rich analytics, because you can see the why behind the result. Organizations that actively and very quickly move through that process are truly data-driven, because they're always looking at data directly related to the problem they’re trying to solve; they’re making a hypothesis (forecast/prediction), they're testing that hypothesis, and they're improving, constantly and continuously. It’s the scientific method, really, and all human progress is based on the scientific method.
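To put toy numbers on the reporting-versus-analytics distinction from the exchange above, here is a short Python sketch. The data and the “catering” driver are invented for illustration: reporting surfaces the 20% increase, while analytics slices the same records to look for the driver behind it.

```python
# Toy sketch with invented data: reporting vs. analytics on the same records.
from collections import Counter

last_week = [{"customer": i, "source": "walk-in"} for i in range(10)]
this_week = last_week + [
    {"customer": 10, "source": "catering"},
    {"customer": 11, "source": "catering"},
]

# Reporting: what happened.
change = (len(this_week) - len(last_week)) / len(last_week)
print(f"Customers: {len(last_week)} -> {len(this_week)} ({change:.0%} increase)")

# Analytics: why it happened - slice the same data by a candidate driver.
print(Counter(record["source"] for record in this_week))
# Counter({'walk-in': 10, 'catering': 2}): all of the growth came from a new
# catering channel, which the top-line report alone could never show.
```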
