I have been thinking about Business Intelligence and Business Analytics these last few weeks. In my earliest days designing and supporting BI, data mart, and data warehousing systems in the late 1990s, the work I did was aimed at something typically less than critical. Back then, my customers ran Oracle and SAP data warehouses with a sprinkling of Teradata and others. BI reports could take a few hours, if not a few days, to land on someone’s desk or in someone’s inbox. If a report failed, my customers were often OK with re-running the job that created it later that evening. They needed the information contained in the report, sure. But with the exception of the last few pages of their month-end reports, in many cases my customers didn’t need these reports in real time. It’s safe to say, then, that the data in these reports didn’t affect mission-critical business operations one way or the other. They were important but not mission-critical.
A decade later, nothing had really changed…. In the early 2000s, BI was still expensive and oftentimes still ‘batch’ oriented. The highest-performing BI systems – those capable of delivering complex real-time analytics – could only be afforded by very large organizations with equally large budgets. For the rest of us, Microsoft shipped OLAP Services (later renamed Analysis Services) with SQL Server, and a Reporting Services add-on for SQL Server 2000 in 2004, and those were nice steps in the right direction. SAP came out with its Google-like BI appliance a bit later and saw some decent adoption in the SAP ecosystem, but even then SAP’s customers didn’t really know what to do with all that power and real-time analytics capability. And it was still “SAP” expensive…. not exactly BI for the masses. In the end, we had a mix of expensive real-time and inexpensive near-real-time BI solutions, and end users (actually, the IT organizations making the purchasing decisions on their behalf) pretty much had to accept the trade-off.
Five years later, things are finally changing. Cost-effective real-time analytics are here. As the cost of world-class BI systems has fallen, and as BI’s capabilities for providing real-time business visibility have grown, we’re at a new crossroads. I have come to the conclusion that this combination of a new cost model and new capabilities has moved BI into the realm of mission-critical, gotta-stay-up applications. Why? Because a contemporary BI system’s failure may now cause an organization’s overall mission to fail: real mission-critical impact. Because waiting a day for a report means losing a day’s advantage to competitors who now have not only the same kind of inexpensive real-time data, but also the ability to do mash-ups in a way that uncovers new trends, markets, expense-reduction possibilities, and net-new OPPORTUNITIES. Cheap real-time BI gives businesses the agility they need today, this minute, right now, to make smarter decisions and employ course corrections…. to keep the business not only afloat but ahead of its competitors. Again, mission-critical impact.
What next? It’s time that the platform providers shared the kind of prescriptive guidance and practices necessary to host these real-time business capabilities in a mission-critical, unbreakable, resilient, and risk-mitigated manner. Yes, beyond unbreakable… which is arguably the most important attribute but at the end of the day really only represents ‘table stakes.’ Microsoft is one of very few players that can offer mission-critical platforms across a breadth of innovative and cost-effective hardware and hosting options: from on-premises to cloud, from HP and Dell to Fujitsu and Unisys hardware partners, from CSC and IBM to smaller hosting providers. The combination of these four attributes (unbreakable, innovative, cost-effective, and superior choice) sets the stage for a new generation of mission-critical platforms, a new normal in enterprise computing, and a new sense of the phrase ‘table stakes.’ All of this sounds like something we need to explore in another post…
Thanks for your (gentle!) comments and feedback,