Is measuring the variance in predictability really analytics?
Business intelligence as a platform has significantly improved the ability of businesses to gain insight into some of their most important performance questions. At a very basic level, here’s how it works:
- The designer of the data warehouse painstakingly sifts through the myriad information that business leaders say is important to running their business, looking for the appropriate data that will provide the answers.
- Once found, models are created so that the information is captured and monitored.
Now the question is: since this is a planned metric, at what point did the analysis take place? If we assume it occurred at design time, then the metric has become predictable, because the only thing it can report is what the model was originally designed to tell us. For example, the model may be designed to monitor the relationship between parts and suppliers: if inventory falls below 20%, an alert appears for someone to come and order new parts. Good designers look for all the combinations they can think of to understand why parts might drop below 20% and put in metrics, scorecards, dashboards, etc., to show what is happening.
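The pre-planned nature of such a metric can be sketched in a few lines. This is a minimal illustration, not any particular BI product: the part records, field names, and fixed 20% threshold are all assumptions chosen for the example.

```python
# Minimal sketch of a pre-planned inventory metric: the 20% threshold and
# the part/supplier record shape are illustrative assumptions.

REORDER_THRESHOLD = 0.20  # alert when stock falls below 20% of capacity


def check_inventory(parts):
    """Return an alert for every part whose stock ratio is below the threshold."""
    alerts = []
    for part in parts:
        ratio = part["stock"] / part["capacity"]
        if ratio < REORDER_THRESHOLD:
            alerts.append((part["name"], part["supplier"], round(ratio, 2)))
    return alerts


parts = [
    {"name": "gasket", "supplier": "Acme", "stock": 15, "capacity": 100},
    {"name": "valve", "supplier": "Globex", "stock": 60, "capacity": 100},
]
print(check_inventory(parts))  # only the gasket trips the alert
```

Note that the code can only ever answer the question it was designed for: the threshold and the fields it watches are fixed at design time, which is exactly the predictability the article describes.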
There is a slight problem, however.
The models used to build the business intelligence warehouse are static in nature. This means that if additional information is required in the future, so is the entire process of rebuilding the model, extracting the data, reloading it, and republishing the warehouse before the new data is available to answer the new question. Often, little sub-warehouses are created to speed up this process by moving less data and publishing information faster. Although ideal in theory, these sub-warehouses contribute to the proliferation of data – duplicating data that then needs to be updated in more than one location.
Our conclusion is that business intelligence is great at static analysis – measuring the predictable results of pre-planned conditions. But what do we do when something unexpected happens?
When static analytics are not enough, what’s next?
What’s next is “dynamic analytics.” Let’s take an internet search as an example. The first thing I would do is go to a search box and type “species of frogs.” I could then count the total number of species, but what if I just want to count bright green frogs? Because this data exists on the internet, in no particular structure, I can type “bright green frogs” and refine my search. This is fun: “bright green frogs found in South America,” “bright green frogs in South America that live in trees.” These queries are all possible, each one providing me with more information.
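The progressive narrowing described above can be sketched as a simple keyword search over unstructured text. The toy document collection and the naive word-matching are assumptions for illustration; real search engines use far richer ranking.

```python
# Sketch of progressively narrowing a free-text search over a toy
# collection of unstructured "documents" (short frog descriptions).

docs = [
    "bright green frog found in South America, lives in trees",
    "bright green frog found in Central America, lives near ponds",
    "brown frog found in South America, lives in trees",
]


def search(query, collection):
    """Keep only documents containing every word of the query."""
    words = query.lower().split()
    return [d for d in collection if all(w in d.lower() for w in words)]


print(len(search("green frog", docs)))                      # 2 matches
print(len(search("green frog South America trees", docs)))  # 1 match
```

Each extra word in the query is a new, unplanned question – nothing about the collection had to be redesigned to answer it.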
So what is the difference between internet searches and the business intelligence environment? Every day I could type “bright green frogs in South America that live in trees” into the search box, and every day I could potentially get a different answer. Maybe new data was added: destruction of the rainforest caused a species of green frogs to become extinct, or scientists discovered a new species of green frogs in another area of South America.
With Enterprise Search 2.0 platforms, this dynamic concept of searching and obtaining relevant information is now possible.
Shifting to Enterprise Search 2.0-powered dynamic analytics for business
Innovative and advanced organizations see the value and power of a unified search platform for their business. A series of state-of-the-art data connectors links the disparate systems in your information ecosystem, pulling information into a common unified index that consolidates, correlates, and normalizes the data in near real time and provides ubiquitous access to it.
Isn’t that what the internet is – a common index of information that is accessible by everyone? Like the internet, an Enterprise Search 2.0 platform can enrich the business environment, providing dynamic mash-ups of key relationships between non-integrated data systems through a search query rather than through a warehouse that takes days or weeks to rebuild by moving all the data. Instead of moving the data, the unified index approach only references it, so when new applications or new entities are added to existing applications, they become part of the index and are fully accessible.
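The reference-not-copy idea above can be sketched with a tiny inverted index: terms map to (source, record id) pointers, while the records themselves stay in their source systems. The source names, record shapes, and whitespace tokenizer are illustrative assumptions, not any real product’s API.

```python
# Sketch of a unified index that references records in their source systems
# instead of copying them into a warehouse. Sources and records are toy data.

from collections import defaultdict

sources = {
    "crm": {1: "Acme Corp bright green logo order"},
    "erp": {7: "green paint inventory South America plant"},
}

index = defaultdict(set)  # term -> set of (source, record_id) references


def index_source(name, records):
    """Add every term of every record to the index as a reference."""
    for rec_id, text in records.items():
        for term in text.lower().split():
            index[term].add((name, rec_id))


for name, records in sources.items():
    index_source(name, records)

# A query resolves to references; the underlying data never moves.
print(sorted(index["green"]))  # hits in both the CRM and the ERP
```

When a new source is connected, it is simply indexed the same way – no rebuild, no data duplication, which is the contrast with the sub-warehouse approach described earlier.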