After decades of investment in business intelligence (BI), most employees still don’t have access to trusted, real-time insights. Traditional BI relies on a carefully choreographed arrangement of staging servers, pre-calculated cubes, batch report servers, static dashboards, PDF and PowerPoint reports, desktop visualization tools, and many other convoluted and ungoverned methods of sharing data in the enterprise.
What do all of these have in common?
They reflect the limitations of the existing on-prem analytics infrastructure. Giving all employees access to real-time insights would be too expensive — it’s cheaper to batch-process or copy the data instead.
Business intelligence today: Belts and gears
Today’s BI looks very much like the steam-powered manufacturing floors of the 19th century. A typical factory relied on an intricate set of belts and gears to deliver power to the individual workbenches, as pictured below. Just as those belts and gears drove the hammers, looms, and presses then, data staging servers, emails, and batch processes drive business intelligence today.
Yet, it took almost fifty years to shift manufacturing from steam to electricity.
Why?
To take full advantage of electricity, factory owners had to transform their factory floors. Electric motors delivered power exactly where and when it was needed, and as the technology developed, every workbench had its own tool with its own little electric motor. Unlike steam, electricity was distributed in real-time — in the moment that workers needed it.
The same transformation is happening today with the transition of data infrastructure from on-premises to the cloud. What was not economical a few years ago can be done efficiently today: Every query, every visualization, and every transformation can be processed in real-time in the cloud.
Monolithic and batch-oriented business intelligence suffers from other limitations as well. For example, the lack of governance and duplicated business rules result in many versions of the truth.
To take full advantage of cloud data infrastructure, we must undergo our own transition from steam to electricity.
A new paradigm: Data as a Service
The data industry now has a unique opportunity. Cloud-based data infrastructure can allow every decision to be data-driven. And as both people and machines make decisions today, this new infrastructure needs to support automated decision-making as well. We need to break down the monolithic nature of existing BI tools, and we need to deliver Data as a Service to every device and person so that access to data becomes truly pervasive.
The idea for Data as a Service comes from our experience with large analytics projects over the last decade and from numerous articles written by industry experts.
These are our Data as a Service principles:
Data as a Service: Tech principles
- Separation of analytics logic and the presentation layer. API-first analytics can be leveraged by a broad spectrum of users, tools, machines, and devices. Analytics logic needs to be defined only once and made available to everyone over the API — and APIs need to be flexible enough to allow unrestricted data querying.
- Declarative business rules. All business rules need to be defined declaratively — we need to express the logic of metrics without describing the query flow. The process logic and queries must then be inferred and generated automatically in real-time, otherwise the data services become too brittle and dependent on the underlying data structures and query engines.
- Cloud-based. All analysis is processed in the cloud, and no data is copied to desktops. Every time we copy data, we lose the ability to govern it.
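To make the declarative principle concrete, here is a minimal sketch in Python. All names are illustrative assumptions rather than any real DaaS API: the metric is declared purely as data, and a small compiler infers the SQL at request time, so the business rule itself never describes query flow.

```python
# Hedged sketch of a declarative business rule: the metric is pure data,
# and the query is inferred from it automatically. All names here are
# hypothetical, not an actual product API.

REVENUE_METRIC = {
    "name": "net_revenue",
    "aggregation": "SUM",
    "column": "amount",
    "table": "orders",
    "filters": [("status", "=", "'completed'")],
}

def compile_metric(metric: dict) -> str:
    """Infer a SQL query from a declarative metric definition."""
    where = " AND ".join(f"{c} {op} {v}" for c, op, v in metric["filters"])
    sql = (
        f"SELECT {metric['aggregation']}({metric['column']}) "
        f"AS {metric['name']} FROM {metric['table']}"
    )
    return f"{sql} WHERE {where}" if where else sql

print(compile_metric(REVENUE_METRIC))
# SELECT SUM(amount) AS net_revenue FROM orders WHERE status = 'completed'
```

Because the rule is data rather than code, the same definition can be served to any tool or device over an API, and the generated query can target a different engine without rewriting the business logic.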
Data as a Service: Business principles
- Predictable pricing and licensing. APIs don’t count seats. Period. At GoodData, we price by the logical domain (sales, customer support, marketing…), enabling access for every user without increased costs.
- Open ecosystem. Data as a Service needs to be available for anyone who wants to build a data presentation, lineage, or cataloging tool, or to perform data governance and lifecycle management.
- Multiplatform support. Data as a Service means no lock-in — so DaaS infrastructure needs to support any database, data streaming infrastructure, and any public or private cloud.
Next week, we will take a major step toward our DaaS vision with the free, community-focused release of our cloud-native platform, which offers these benefits:
- A modern, developer-friendly framework: GoodData.CloudNative lets users deploy analytics on modern cloud-native technologies such as Docker and Kubernetes. This developer-centric, API-first build delivers performance, UI flexibility, and efficiency benefits across the board.
- Deployment flexibility: The ability to deploy with any cloud gives developers real options and eliminates the need to move or copy data. This reduces costs, improves efficiency, and enables real-time analytics by reducing latency.
- Compliance and security: Allowing for on-premises options gives customers the freedom to choose the deployment and implementation options that best support their business requirements, backed by GoodData’s existing best-in-class security practices.
Our hope is that the transition from monolithic BI to Data as a Service will be much faster than that of steam to electricity.
The time for experimentation with analytics is over. To make every decision data-driven, we must move to a new paradigm.