The Business Value of Data Products Is Often Miscalculated. Learn These Two Rules to Calculate It Correctly
For me, as a product manager, the Weighted Shortest Job First (WSJF) concept, built on the so-called Cost of Delay, changed my perspective on value. That's what we want to do as product managers: maximize value. And the essential ingredient in that formula is the business value of a task or job.
For data products (data-heavy products, machine learning solutions, business intelligence systems, in short, everything that has data at its core), I used to misjudge the business value by a lot.
That's not because the calculation is complicated. It's because a data product's business value obeys two simple rules that are not as prominent with other products:
Rule No. 1: There's always a (good) alternative you have to compare your product to.
Rule No. 2: The only value of data lies in improving someone’s decisions.
Let's take a look at a few examples to understand why this often misleads us.
The Evil Math Part — WSJF & Cost of Delay
I said we only need to roughly guess the value. And that's true. But the more basis we have for a guess, the better. I really like the Cost of Delay concept, which basically says: divide the Cost of Delay by the job size, and the highest result is the biggest value gain per unit of time. The Cost of Delay itself is the sum of three components:
- End-user value (what’s it worth to the end-user)
- Time criticality (hard deadline because a team or external user depends on it?)
- Risk reduction + Opportunity enablement
That's pretty easy, especially since we only need a rough, relative feeling for the things on our plate. We just have to estimate what holds more value, not find an actual $$$ figure.
Now, the most important component in my opinion is the end-user value. After all, even though the other two matter, they ultimately only pay in on end-user value indirectly. So the crux lies in the value. How do we calculate that for data products?
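The prioritization mechanics can be sketched in a few lines. The task names and scores below are made up purely for illustration; in practice they are rough, relative estimates, not dollar figures.

```python
# A minimal sketch of WSJF prioritization.
# Scores are relative (e.g. on a 1-10 scale), not dollar figures.

def wsjf(user_value, time_criticality, risk_opportunity, job_size):
    """WSJF = Cost of Delay / job size.
    Cost of Delay = end-user value + time criticality
                    + risk reduction / opportunity enablement."""
    cost_of_delay = user_value + time_criticality + risk_opportunity
    return cost_of_delay / job_size

# Hypothetical backlog items with hypothetical scores.
tasks = {
    "call-center report": wsjf(user_value=3, time_criticality=2,
                               risk_opportunity=1, job_size=2),
    "sales forecast":     wsjf(user_value=8, time_criticality=3,
                               risk_opportunity=5, job_size=8),
}

# The highest WSJF score goes first.
for name, score in sorted(tasks.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.1f}")
```

Note how a smaller job with a modest Cost of Delay can still outrank a big, valuable one: that is the whole point of dividing by job size.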
Let’s take a look at two data-related examples.
It’s The Alternative!
Your data team is asked to create a shiny new report displaying metrics on the call-center's operational units. You're told it will enable the team leads to organize the work, shift resources where necessary, and optimize overall operations. The business value is huge: they think they can save 5–10% of the resources through optimization.
Sounds like an important deal, right?
Except, have you asked: "So how do you work currently? How do you plan and optimize the work? After all, you're already making decisions and shifting people around, right?" They might respond: "Yes, of course! But we're using this huge Excel file that's compiled by a temp once a week. Takes him around 4 hours."
So suddenly, the actual value turns to …
Mmmmh. Now that looks like a task with a lot fewer zeros.
I'm not saying this isn't important; I'm just saying the alternative matters. In this case, it matters a lot.
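The back-of-the-envelope math goes like this. The hourly rate and the number of working weeks are assumptions I'm making up for illustration; only the "4 hours a week" comes from the example above.

```python
# Value over the alternative, as a rough yearly figure.
# HOURLY_RATE and WEEKS_PER_YEAR are assumed for illustration.

HOURLY_RATE = 25       # assumed temp hourly rate in $
WEEKS_PER_YEAR = 50    # assumed working weeks per year

# What the report actually replaces: the temp's weekly Excel work.
alternative_cost = 4 * HOURLY_RATE * WEEKS_PER_YEAR

# The claimed "5-10% resource savings" only counts as value if the
# Excel alternative could NOT deliver it. Since the team leads already
# optimize with the Excel file, the report's incremental value is
# roughly the hours it saves, not the full optimization gain.
print(f"Value over the alternative: ~${alternative_cost:,} per year")
```

That is the number you compare against the job size, not the headline figure from the original pitch.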
Let’s take a look at another example.
It’s About The Decision
Your data team is now asked to produce sales forecasts. The team really likes the idea of using some fancy machine learning mechanism.
But wait, we also have 10 other tasks with estimated business values. Do we do this first? Last? Not at all? What's the business value of forecasts? You ask the manager on the other side.
At first, he isn't sure. So you ask: "Which decisions could you make better if you had a rough estimate of sales?" Now he gets the point: "Mmmmh, so if the forecast is reliable, like 90% reliable, we can preorder things and avoid bottlenecks that cost us around 10% on top of our usual procurement costs. We're currently not able to do that forecasting by hand; it doesn't work."
So that's the point: if you can produce a solution that is at least 90% reliable, you will be able to save 10% on the average procurement costs.
You were also just told that an 80% solution will probably not produce any value. Great! So now you have your value, and in return, your team can re-estimate the "size", because you now know you have to hit the 90%.
It’s About The Whole Cycle
I hope this already helped you learn how to better judge data products and additional features by their ultimate end-user value.
Sometimes this still won't be enough, especially when you need to judge larger, complex products. In that case, you'll have to look at the whole datacisions cycle, which you can learn about in another one of my blog posts. It will help you further dissect what someone actually needs to make a better decision and how your product will help with it.
- The SAFe framework provides information about both the Cost of Delay and the Weighted Shortest Job First concept.
- Finally, if you want to dig deeper into data & decisions, and how to really make data products work, look at my article about good data vs. bad data strategy.