From stone tablets to modern BI, your data has travelled a long way
Shivanshu Sharma
The term “Business Intelligence” was coined in a landmark 1958 article by IBM computer scientist Hans Peter Luhn. He described Business Intelligence as:
“An automatic system developed to disseminate information to the various sections of any organization”.
This description cuts to the core of what BI is: the practice of using tools to automate the collection of data from different sources, process it, and present it in the form of actionable information.
Since the term was first coined, BI has grown by leaps and bounds (and is still growing) thanks to advancements in computing power, computational analytics, data storage, reporting and networking technologies.
It has become a framework that organizations rely on for making tactical and strategic decisions.
But is that all? Was there a time before computers? If so, how did people make decisions back then?
Data Collection on Stone Tablets
It all started with armies and imperial bureaucracies trying to work out ways to collect taxes, wage wars and feed people. Stone tablets were used to store data.
The tablets were treated as trusted records, and the data collected was used for foresight and planning.
A Sumerian stone tablet represents the first recorded use of written language for data storage: it tracked shipments of wheat through the local granary.
It wasn’t long before the limitations of storing data on stone became apparent. Stone was well suited to public display and reasonably resistant to the destructive forces of rain and wind, but it was difficult both to store and to inscribe with long texts.
The Emergence of Paper
A major breakthrough came with the invention of paper. It allowed for more information to be stored and accessed in a smaller space.
Record keeping really came into its own as better forms of paper were invented.
Organizations recorded key pieces of information on paper and later used them to gain insights.
In fact, the term “business intelligence” was first used by Richard Millar Devens in his 1865 book “Cyclopaedia of Commercial and Business Anecdotes”.
He used it to describe a practice followed by Sir Henry Furnese, who collected three thousand anecdotes and incidents of trade and commerce to gain knowledge of the political issues, risks and general market conditions of the time. This gave him a leg up on his competitors.
Many organizations and individuals still use only paper to record and manage information.
But organizations soon realized that storing data on paper for long periods leaves it vulnerable to being lost, stolen or destroyed.
Further, the increasing volume of data made it extremely difficult to navigate through stacks of paper to find the right information at the right time.
We needed something else: something that could keep up with our ever-increasing appetite for data storage and analysis. And then computers made a red-carpet entry into organizations.
From Silicon in Stone to Silicon in Microchips
With the advent of the computer, organizations finally had an alternative to easily store data and navigate through it.
The first computers were tabulating machines, designed to perform quick one-off calculations. But soon scientists started developing information storage capabilities.
And then, around the 1940s, both the information storage capabilities of computers and their computing power exploded. There was no looking back.
Organizations needed more space to manage their growing data, and scientists responded with storage technologies like floppy disks, hard disks and laser discs.
Eventually, the complexity and volume of data outgrew what these media alone could handle. To address that, programmers developed Database Management Systems (DBMS), which dramatically decreased transaction time by splitting data into related elements and storing them separately.
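The splitting described above is what relational databases call normalization: rather than repeating a customer's details on every transaction row, those details live once in their own table, and transactions reference them by key. A minimal sketch using Python's built-in sqlite3 module (the table and column names here are purely illustrative, not from any real system):

```python
import sqlite3

# In-memory database for the sketch; a real DBMS would persist this.
conn = sqlite3.connect(":memory:")

# Customer details are stored once...
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
# ...and each sale references the customer by id instead of repeating the name.
conn.execute(
    "CREATE TABLE sales (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)

conn.execute("INSERT INTO customers VALUES (1, 'Acme Ltd')")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [(1, 1, 250.0), (2, 1, 99.5)],
)

# A join reassembles the split elements whenever a report needs them together.
rows = conn.execute("""
    SELECT c.name, SUM(s.amount)
    FROM sales s JOIN customers c ON s.customer_id = c.id
    GROUP BY c.name
""").fetchall()
print(rows)  # [('Acme Ltd', 349.5)]
```

The payoff is the one named in the text: transactions stay small and fast to write, because each sale stores only a reference rather than a full copy of the customer record.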
With growing volume and variety of data coupled with the simultaneous increase in power of computing systems, businesses started adopting applications and technologies for specific business functions.
Since these applications served only one purpose, they allowed faster access to crucial data and eliminated tedious manual data entry.
Point-of-sale systems are a classic example of a transactional system: they process sales transactions, record and track inventory, and keep the business in fighting shape. To streamline processes, they could easily be connected to accounting systems that gathered and organized sales information for subsequent use.
Big names like SAP also enabled these businesses to prepare reports based on that data.
However, there was one challenge.
Companies normally have many transactional systems – one for each purpose: a CRM, an ERP, a billing system, a distributor management system and so on, each designed for a specific job.
They all recorded information in a different manner following different naming conventions and storage protocols. The political barriers within the company made the search for truth even more difficult.
So when the CEO wanted to check on the state of the business, he had to peek through different reports and piece the picture together himself.
Needless to say, these weren’t the right tools for the job. They were meant for recording and not research.
And hence BI was born.
The Emergence of Decision Support Systems (DSS)
BI initiatives started as IT pet projects. They made it possible to extract data from multiple transactional systems, combine it into a single format, and present it in one place. Chief decision makers no longer had to wade through separate reports.
By now you will have realized that business intelligence does two major things: querying and reporting. Querying transforms data into a single format; reporting presents it in a single consumable form. When BI projects started, IT's major role was to query the data. Reporting was done mostly once a month or fortnight.
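As a toy illustration of that query-then-report loop, here is a minimal Python sketch that pulls records from two hypothetical systems with different naming conventions, maps them into one common format, and prints a single summary. All the system names, field names and figures are invented for illustration:

```python
from collections import defaultdict

# Two hypothetical transactional systems, each with its own naming convention.
crm_records = [{"cust": "Acme", "deal_value": 500}]
billing_records = [
    {"customer_name": "Acme", "invoiced": 120},
    {"customer_name": "Globex", "invoiced": 300},
]

# "Querying": transform both sources into one shared format.
unified = (
    [{"customer": r["cust"], "amount": r["deal_value"]} for r in crm_records]
    + [{"customer": r["customer_name"], "amount": r["invoiced"]}
       for r in billing_records]
)

# "Reporting": present the combined data as one consumable summary.
totals = defaultdict(int)
for row in unified:
    totals[row["customer"]] += row["amount"]

for customer, amount in sorted(totals.items()):
    print(f"{customer}: {amount}")
```

The hard part in practice was never the arithmetic but the mapping step: deciding, for every source system, which field means what – exactly the reconciliation work the early IT teams and "data miners" did by hand.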
As companies started realizing the potential value of their data, their BI investments grew, and so did the pressure on IT teams. Queries became more complex, and reporting frequency rose – from monthly to weekly, and then from weekly to daily.
It even led to the creation of a separate role in organizations: the “data miner”, whose job was to extract data from multiple applications and produce a single, meaningful report from it.
But then the dawn of the 21st century had a new challenge in store: the Internet started booming, bringing a host of changes along with it:
– The advent of new data sources which produced terabytes of complex data on a daily basis
– Customer-facing solutions were required to operate 24/7
Existing BI tools had been developed with IT professionals in mind and required extensive training.
Businesses started asking even more complex questions and demanded answers in real time.
They demanded intuitive, easy-to-use tools that anyone in the organization could use – tools able to handle data of any volume, velocity and variety and produce insights in real time. The existing tools just couldn't keep up.
The Proliferation of Self-Service Business Intelligence
Technology rose to meet these evolved requirements. Cloud-based programs expanded, computing power increased, and technology companies started providing far more flexible and easy-to-use platforms than ever before.
This led to a significant reduction in lag time and the decentralization of insights throughout the organization. Eventually, technology vendors began offering self-service software.
It allowed even a layperson to create beautiful, interactive reports through simple clicks and drag-and-drop. No technical skills required, no IT background required.
Microsoft went a step further and even included search functionality. Users don't even have to drag and drop: they just type in their query (much like a Google search) and voila, the answer appears.
From stone tablets to self-service BI, it has been a long journey. But is that where the journey ends?
The whole concept of BI began with a need to manage things better. As data volumes grew, organizations' requirements followed suit, and so did technology capabilities.
So, what’s next?
Here is what we predict – and these are hardly far-fetched predictions.
– Firstly, data will keep growing in variety, velocity and volume. We have already started talking about new data sources like IoT streaming data, voice and images.
– Secondly, organizations will want to move beyond descriptive analytics. The need to make better decisions will push them to develop predictive and prescriptive capabilities. In fact, machine learning and artificial intelligence are no longer mere buzzwords: organizations have already started building these capabilities to support business requirements.
– Thirdly, technology will continue to evolve. We are already talking about quantum computing and analytics at the edge.
– Fourthly, and most importantly, business intelligence initiatives will become even more decentralized across every user group, with queries becoming more complex than ever before.
As more data becomes available, we will continue to see a shift from BI being used merely to represent data towards BI operating as a centre of excellence.
One caveat worth noting: while this post traces an evolution of BI, progress is uneven in practice. Some organizations are already evaluating the potential benefits of predictive and prescriptive analytics, while others are still stuck on paper.
It is our strong belief that the future of any organization depends on how well it is able to use its data to make strategic and tactical decisions.