I have been working in the Business Intelligence world for a great many years now, and one of the things I have noticed almost from the beginning is how parochial we practitioners are about our platforms and practices. IT shops have a standard BI platform, but typically only one. Consultants have their favored technology but often refuse to acknowledge the merits of others. There may be an established project methodology to which all projects must conform. The more I encounter this mindset, the more I realize how counterproductive it is in the BI space. It is tantamount to saying that a carpenter should be able to build a custom bookcase with nothing but a screwdriver.
I am being a little ham-fisted, but bear with me. I assure you that I believe wholeheartedly in standards. Without standards, there would be chaos. There would be little control over data consistency and accuracy, reports and metrics would yield conflicting results, and the cost of managing (not to mention licensing) the BI program would be enormous. However, having standards or core competencies does not mean that one size fits all any more than one tool is appropriate for all jobs. Let me offer food for thought on four areas of activity that occur within a BI program.
The Standard Platform

As noted above, there are many reasons for an IT shop to adopt a standard platform for its BI program. By the same token, no one platform does everything universally well, no matter what the sales folks tell us. This is why taking a strategic approach to standardization is so prudent. By looking strategically at the long-term BI needs of the program and the enterprise, it should be possible to identify a standard of two platforms that together cover these requirements in full while affording a choice of solution options. A two-platform standard is by no means necessary everywhere, especially in smaller companies. Where it is, though, the two platforms should be able to use the same master data sources and data models, cover the range of required analytic tasks with minimal overlap, and have a reasonable combined total cost of ownership.
The Methodology Mandate

For years, the notion of running an IT project using Agile was unthinkable. Only trash could result from a project that did not have a rigorous requirements phase followed by a thorough design phase. Now it almost seems that the converse is true. I recently completed a project plan and estimate for a client that has been bitten by the Agile bug (and where I have already completed two Agile projects myself). “Why aren’t you using Agile?” asked one of the IT directors. Since this was to be a complete architectural rewrite of an existing solution, the answer seemed obvious to me. “Because to achieve this redesign of the architecture in small increments without breaking the existing applications would take us fifteen months rather than fifteen weeks,” was my reply. A toothbrush is the wrong tool for sweeping the driveway. The lesson here is that every approach has strengths and weaknesses, and if we do not select appropriately we court disaster.
The “My Toolset” Mindset
I have to admit that I am more comfortable working with certain platforms. Nevertheless, I have never been able to subscribe to the mindset that any one of them is the only platform I should work with or that my client should consider. The needs of my clients rarely align that way. Rather, I have had the opportunity to work with a range of technologies, and I think this has made me a better consultant. There are profound benefits in being platform neutral. (Stay tuned. I plan to devote an entire rant to why the term “Platform Agnostic” or “Vendor Agnostic” is both ignorant and nonsensical and should be retired.) Not being tied to a single platform enables me to be more attuned to my clients’ business needs. It gives me different ways of looking at the same problem. It keeps me from becoming complacent. And while there are many substandard products on the market, I am able to avoid the hubris of thumbing my nose at the good ones, because I may well be using one of them tomorrow.
Change Management

Here is one last “One Size Fits All” situation that I encounter frequently. Personally, I insist on thorough change management processes in my consulting practice. The idea of deploying software that has not been fully tested and validated is appalling. My clients, mostly large companies, have very rigorous processes in place. The cost per minute of having your global POS system down because of a deployment bug is astronomical, so the scrutiny pays for itself. This scrutiny requires documentation a week in advance, testing results, and several meetings before a deployment can be approved. But should the same scope of paperwork and meetings be required for BI solutions? In general, while a BI outage is inconvenient, it is rarely so mission critical as to bring a company to its knees. A BI program cannot be nimble and adaptive when a small tweak to a metric requires a week to deploy. Documentation, test results, and approvals should all be part of the process, but they should be designed with agility in mind. The question here is not whether to use a wrench but rather what size of wrench to use.
These are all examples of how intelligent tool selection can improve BI practice. Standardization, methodology, core competencies, and good change management all benefit from appropriate tool selection and sizing. Understanding that one size does not always fit all can make your BI program more nimble, more scalable, and more profitable. These are my thoughts at any rate, but one is seldom a prophet in one’s own land. I caught my wife using the handle of a screwdriver to hang a picture the other day and asked her why she wasn’t using a hammer. “I would have had to go all the way out to the workshop to get a hammer. Why do that when this was right here in the junk drawer?”