By John Fielding

Earlier this month, we were excited to meet with some very smart partners of the Quality Forum who wanted to show us how one of their recently completed big data projects might benefit us. This particular tool used an innovative algorithm to combine our clinical data with a wide variety of historical and projected environmental, socio-economic, political and social information to predict the emergence, over time, of disease hotspots on a map. The tool had proven itself in the lab to be more accurate and timelier than other well-known instruments out there.

It was really quite impressive.

But while our partner's impressive work was ready for "selling" to us, it occurred to me that it exposed only one-third of what our customers need. If I used the tool to show a major health care organization in, say, Alexandria, that there will be an influenza outbreak in their area next November, that would certainly spark interest, but it only answers the question "What will happen?" Still out there haunting the organization like a teenage ex-girlfriend are two more questions: "What happened, and what contributed to it?" and "How and when should I react?"

The point is: always look far downstream when developing new analytics. If your customer sells to other customers, then what you develop has to meet the needs of your customer's customers. With appropriate tweaks to the tool our partner presented, our customers would receive the holy triumvirate of analytics: the Descriptive (what happened yesterday, and what contributed to it?), the Predictive (what will happen, and why?), and the Prescriptive (how and when should I react?).
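To make the triumvirate concrete, here is a toy sketch of the three tiers applied to hypothetical weekly influenza case counts. Every name, number, and threshold below is illustrative, not taken from the tool our partner presented; the "prediction" is a deliberately naive trend extrapolation standing in for their far more sophisticated algorithm.

```python
# Hypothetical weekly influenza case counts for one region.
weekly_cases = [12, 15, 22, 30, 41, 55]

def descriptive(cases):
    """Descriptive: what happened, and what does the record show?"""
    return {"total": sum(cases), "latest": cases[-1],
            "trend": cases[-1] - cases[0]}

def predictive(cases):
    """Predictive: what will happen? (naive linear extrapolation
    of the most recent week-over-week change)."""
    slope = cases[-1] - cases[-2]
    return cases[-1] + slope

def prescriptive(forecast, surge_threshold=50):
    """Prescriptive: how and when should we react?
    Maps the forecast to an action using an illustrative threshold."""
    if forecast > surge_threshold:
        return "pre-position vaccine stock and staff"
    return "continue routine surveillance"

summary = descriptive(weekly_cases)   # {'total': 175, 'latest': 55, 'trend': 43}
forecast = predictive(weekly_cases)   # 69
action = prescriptive(forecast)       # 'pre-position vaccine stock and staff'
```

The point of the sketch is the shape, not the math: a customer-ready analytics product answers all three questions in sequence, with each tier feeding the next.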

And that's a three-piece Big Data Happy Meal that don't need no stinkin' bag of chips!