Accuracy can easily be mistaken for usefulness. If you pay a consultant a hefty sum to come up with a detailed analysis of some key business function, the results may be impressively accurate but can also be utterly useless. Take this example: The graph above shows IDC’s Intel Itanium sales forecasts made in nine consecutive years, along with the actual sales numbers; the difference is staggering. The sad part is that thousands of people paid good money to get these wrong forecasts, and they did so over and over again expecting better results.
Predictions are typically wrong, and very often misleading. We still use predictive models though, as they are the only tool we have for managing the future. As George E. P. Box said, "all models are wrong, but some are useful". But what about "real" facts? What about "looking at the data" – querying, analyzing, summarizing, and all that good stuff? No matter how much effort you spend on getting quality data, accuracy doesn't imply usefulness. Paul Graham tells this story: "I remember telling David Filo in late 1998 or early 1999 that Yahoo should buy Google, because I and most of the other programmers in the company were using it instead of Yahoo for search. He told me that it wasn't worth worrying about. Search was only 6% of our traffic, and we were growing at 10% a month. It wasn't worth doing better." David Filo relied on facts to make a [wrong] prediction. The facts were accurate all right, but his interpretation was arbitrary.
Turning data into useful information is not trivial. It requires experience, care, and often intuition. Simple problems, like deciding on the color of a button, are fairly easy to resolve: run an A/B test, see which color gets a better conversion rate, and go with it. This is true only if you can collect enough data points, of course. Startup companies very often don't have that luxury, and their only option is to JFDI.
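To make the button example a bit more concrete, here is a minimal sketch (in Python, with entirely hypothetical conversion counts) of how you might check whether the difference between two button colors is a real effect or just noise, using a standard two-proportion z-test. The variant names and numbers are invented for illustration.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical A/B test results: (conversions, visitors) per variant.
red = (210, 4000)    # variant A: red button
green = (254, 4000)  # variant B: green button

def two_proportion_z_test(a, b):
    """Return the z statistic and two-sided p-value for the
    difference between two conversion rates."""
    conv_a, n_a = a
    conv_b, n_b = b
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of "no difference".
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

z, p = two_proportion_z_test(red, green)
print(f"z = {z:.2f}, p = {p:.3f}")
# With only a handful of visitors per variant the p-value stays large
# no matter what -- the "not enough data points" problem mentioned above.
```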
When the number of variables increases, it becomes exponentially harder to draw useful conclusions from the data. This is where statistics comes into play. Using the right statistical tools for the job is key. Be careful and know what you're doing, otherwise your "data driven management" might be as chaotic as they come. As John von Neumann said: "There's no sense in being precise when you don't even know what you're talking about."
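One way to see why more variables call for more care is the multiple-comparisons trap: test enough unrelated metrics against an outcome and some will look "significant" by pure chance. The sketch below uses purely synthetic random data (Python assumed; the p-value uses a normal approximation) to show how easily naive data dredging produces false positives.

```python
import random
from math import sqrt
from statistics import NormalDist, correlation  # correlation() needs Python 3.10+

random.seed(42)

N_SAMPLES = 200   # observations
N_METRICS = 50    # unrelated "metrics" we dredge through

# A purely random outcome and purely random metrics: by construction
# there is nothing real to find.
outcome = [random.gauss(0, 1) for _ in range(N_SAMPLES)]
metrics = [[random.gauss(0, 1) for _ in range(N_SAMPLES)]
           for _ in range(N_METRICS)]

def p_value(r, n):
    """Approximate two-sided p-value for a Pearson correlation r
    over n samples (normal approximation to the t statistic)."""
    t = r * sqrt(n - 2) / sqrt(1 - r * r)
    return 2 * (1 - NormalDist().cdf(abs(t)))

false_alarms = sum(
    1 for m in metrics
    if p_value(correlation(outcome, m), N_SAMPLES) < 0.05
)
print(f"{false_alarms} of {N_METRICS} random metrics look 'significant' at p < 0.05")
# On average about 5% of them will -- a Bonferroni or similar correction
# is the standard guard against this kind of self-deception.
```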