How an API-in-a-Box Can Deliver Data Analytics Nirvana
The field of APIs for data analytics is still new, but it's an approach we strongly advocate for long-term success. Give it a try.
Join For Free"Information is the oil of the 21st century, and analytics is the combustion engine."
As early as 2011, Peter Sondergaard, a Senior Vice President at Gartner, predicted a change in data management strategies known as big data, the pursuit of which would create an unprecedented amount of information of enormous variety and complexity.
He was right. Today, organizations store vast amounts of data, much of it across multiple, disparate databases that are unable to talk to each other. It's a problem that's exacerbated by mergers and acquisitions, where new datasets are inherited.
Organizations generally understand the power behind analytics, but how do you make it work culturally and technically? We take a look at the barriers to data analytics success and suggest new approaches that buck the system, with dramatic results.
The Technology Challenge
Different departments will always need separate access rights. Most HR data, for example, requires different access controls than financial data. Highly sensitive data should only be accessed by authorized personnel, while other data, such as sales and marketing data, might need to be shared among cross-functional teams for certain periods of time.
A common approach to this problem is to store all "sharable" data in a data warehouse, which is expensive and time-consuming: an estimated 80% of the effort in a typical data project goes into cleaning data alone. Furthermore, extracting data from its original source into a warehouse duplicates that data, increasing storage demands. Another problem is that data, such as sales forecasts, ages quickly. Without a way to continually and automatically update that data in the warehouse, in real time, your analytics will be founded on outdated information.
The Cultural Challenge
To become a truly data-driven organization, a cultural shift is necessary. Change always prompts concern. Fear of change and the resulting data fiefdoms are among the main reasons why data analytics projects fail. Data owners fear they'll lose relevance or control of their data if they are forced to share datasets with other departments or agencies, or if external expertise is brought in.
Without a Shift, Data Analytics Is Destined to Fail
Welcome to the world where technological complexity and cultural fiefdoms are killing data analytics projects. It comes as no surprise that 60% of data analytics projects fail.
How can organizations counteract these challenges and find a way to connect disparate data (only using the original data source) while gaining buy-in from the team? The answer lies in an unlikely source: application program interfaces (APIs).
Addressing the Technology Challenge: Break Down Silos With APIs
Data may be today’s oil, but it will be tomorrow’s oxygen. Mobile devices, IoT, and cloud applications generate vast data streams. We’ve come to expect access to valuable information at our fingertips.
Enterprise data problem-solving has also changed. Gone are the days when software giants such as Microsoft, SAP, Oracle, and MicroStrategy were one-stop shops for addressing your data challenges. Today, you can mix and match data from different systems without the help of the big guys.
Thanks to APIs, disparate systems can now interact with one another and exchange data. Because APIs are lightweight (they eliminate the need for traditional hard-coded system integrations), modern, flexible, and less risky than other data-sharing approaches, API use is booming.
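To make that concrete, here is a minimal sketch of API-based data blending; the endpoint URLs, field names, and token are hypothetical placeholders for illustration, not any particular vendor's API.

```python
# Minimal sketch of blending data from two departmental systems via their APIs.
# Endpoint URLs, field names, and the token are hypothetical placeholders.
import requests

CRM_API = "https://crm.example.com/api/v1/customers"         # hypothetical endpoint
BILLING_API = "https://billing.example.com/api/v1/invoices"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <token>"}                 # per-system credentials

def fetch(url):
    """Pull live records straight from the system of record: no ETL, no copy."""
    response = requests.get(url, headers=HEADERS, timeout=10)
    response.raise_for_status()
    return response.json()

# Join the two live datasets in memory for an ad hoc, real-time view.
customers = {c["customer_id"]: c for c in fetch(CRM_API)}
revenue_by_region = {}
for invoice in fetch(BILLING_API):
    region = customers.get(invoice["customer_id"], {}).get("region", "unknown")
    revenue_by_region[region] = revenue_by_region.get(region, 0) + invoice["amount"]

print(revenue_by_region)
```

Nothing is extracted or duplicated: each system stays the system of record, and the blended view is computed on demand.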
Concurrently, data warehouses are losing relevance. They still have a role to play, but they are no longer the single or predominant source of data in the enterprise. And that's okay. It's not necessary to maintain a "golden record" of every data entity in your organization. That's where APIs truly excel: they allow you to work with real-time data (as opposed to historical data) and real-time analytics to provide a better understanding of what's going on at any given time.
Addressing the Cultural Challenge: Take an API-Enabled Iterative Approach
With APIs, fear and entrenched data fiefdoms are a thing of the past. Instead of grabbing data from a department's database, cleaning it, and prepping it for analysis, the data stays right where it is, under the control of the data owner. Opening your API also helps you maintain the health of your business intelligence program by promoting data hygiene. Knowing that their data will be shared, data owners instinctively become more accountable for keeping that data clean. With a data warehouse approach, by contrast, once the data leaves the department, data owners no longer feel responsible for it.
APIs also support an iterative approach to analytics. Data owners can decide what to share based on what they feel most comfortable with. As they see the fruits of their sharing, they start giving up their data monopoly. It’s a nimble and cost-effective approach that increases team buy-in.
Of course, it doesn’t happen overnight. How can your organization achieve this shorter, nimbler path to actionable data insights? Read more about what we call the Minimal Viable Prediction (MVP) approach.
Turn Your APIs Into a Powerful Analytics Foundation: Meet the API-in-a-Box
More and more businesses are embracing an API business model. But how do you enable this API-driven analytics transformation? Allow us to introduce the API-in-a-Box.
An API-in-a-Box is a containerized API adapter that can be deployed in a plug-and-play fashion, quickly and cost-effectively. It integrates disparately stored data by providing a safe passage for non-sensitive data, or for data that a department has given the green light to share. With an API-in-a-Box, data remains in situ at its original source but is accessed in real time.
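As a rough illustration only (the web framework, table name, and database path below are assumptions for this sketch, not part of any specific product), such an adapter can be little more than a thin, read-only API layer over a department's existing database, packaged to run as a container alongside it:

```python
# Sketch of an "API-in-a-Box"-style adapter: a thin, read-only API over a
# department's existing database, intended to run as a container next to it.
# Framework, table name, and database path are illustrative assumptions.
import os
import sqlite3

from flask import Flask, jsonify

app = Flask(__name__)
DB_PATH = os.environ.get("SOURCE_DB", "/data/sales.db")  # data stays in situ

@app.route("/api/v1/forecasts")
def forecasts():
    """Serve only the rows the data owner has agreed to share, read live."""
    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    rows = conn.execute(
        "SELECT region, quarter, forecast_amount FROM sales_forecasts"
    ).fetchall()
    conn.close()
    return jsonify([dict(row) for row in rows])

if __name__ == "__main__":
    # A containerized deployment would run this behind a production WSGI server
    # and mount the source database read-only.
    app.run(host="0.0.0.0", port=8080)
```

Because the adapter reads the source directly on every request, consumers always see current data, and the owning department never hands over a copy.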
APIs are a proven method for encouraging cross-departmental collaboration, analytics, and reporting, while facilitating the identification and correction of data discrepancies. Teams maintain full control of their data and can provide exact rules as to who can access that data.
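One hedged way to express those rules is a small, owner-maintained policy checked before any request is served; the roles, field lists, and helper below are illustrative assumptions, not a standard:

```python
# Illustrative access policy a data owner might maintain for an adapter like
# the one sketched above. Roles, fields, and the helper are assumptions only.
ACCESS_POLICY = {
    "finance":   {"fields": ["region", "quarter", "forecast_amount"]},
    "marketing": {"fields": ["region", "quarter"]},  # trends only, no raw amounts
}

def filter_for_role(rows, role):
    """Return only the columns this role may see; deny unknown roles outright."""
    policy = ACCESS_POLICY.get(role)
    if policy is None:
        raise PermissionError(f"role '{role}' has no access to this dataset")
    allowed = set(policy["fields"])
    return [{key: value for key, value in row.items() if key in allowed} for row in rows]
```

The adapter would apply such a filter to every response, so, for example, marketing analysts could see regional trends while raw forecast amounts stay with finance.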
An API-in-a-Box can be spun up in an extremely short period of time, eliminating the time-consuming data integration problem. Plus, after data errors are found and one department's data is merged with another, actionable insights start to emerge and the barriers of fear and fiefdom start to break down.
Go Ahead, Resist the Big Bang Approach
The traditional approach to data analytics is often risky, "big bang" thinking. Some of these projects have worked, but those successes are few and far between. They call for a huge planning endeavor, one that's beyond the time and resources of many organizations. That old safeguard, the data warehouse, has also run its course as the stalwart of business intelligence initiatives.
As the arguments in this piece show, it’s time for a new approach.
Using new technology concepts (the API-in-a-Box) and iterative approaches (minimal viable prediction), results emerge, sometimes in a matter of weeks rather than months or years, and at a fraction of the cost of doing it the old way.
Data owners become heroes as new and actionable insights are achieved. A culture shift starts to take place as more people pull in the direction of a data culture.
The field of APIs for data analytics is still new, but it's an approach we strongly advocate for long-term success. Give it a try.
Published at DZone with permission of Wolf Ruzicka.