AI and BI Projects Get Bogged Down With Data Preparation Tasks
Data quality challenges are often why organizations are reassessing AI and BI projects. Here are four best practices to help you do data prep efficiently.
IBM reports that data quality challenges are a top reason why organizations reassess (or end) artificial intelligence (AI) and business intelligence (BI) projects.
Arvind Krishna, IBM’s senior vice president of cloud and cognitive software, stated in a recent interview with the Wall Street Journal, “about 80% of the work with an AI project is collecting and preparing data. Some companies are not prepared for the cost and work associated with that going in. And you say: ‘Hey, wait a moment, where’s the AI? I’m not getting the benefit.’ And you kind of bail on it.” 
Many businesses are not prepared for the cost and effort of data preparation (DP) when starting AI and BI projects. To compound matters, hundreds of data and record types and billions of records are often involved in a project’s DP effort.
However, data analytics projects are increasingly imperative to organizational success in the digital economy, hence the need for DP solutions.
What is AI/BI Data Preparation?
Gartner defines data preparation as “an iterative and agile process for exploring, combining, cleaning, and transforming raw data into curated datasets for data integration, data science, data discovery, and analytics/business intelligence (BI) use cases.” 
A 2019 International Data Corporation (IDC) study reports that data workers spend a striking share of each week on data-related activities (see Figure 1):
Thirty-three percent goes to data preparation, compared with 32 percent on analytics (and, sadly, just 13 percent on data science). The top challenge, cited by more than 30 percent of all data workers in the study, was that “too much time is spent on data preparation.”
Figure 1: 2019 International Data Corp. (IDC) study, “Time Spent on Data-related Activities.”
The variety of data sources, the array of data types, the enormity of data volumes, and the numerous uses for data analytics and business intelligence all add sources and complexity to each project. Consequently, today’s data workers often use multiple tools to prepare data successfully.
Capabilities Needed in Data Preparation Tools
Evidence in the Gartner Research report, Market Guide for Data Preparation Tools, shows that implementing DP tools can cut data preparation time, and the time needed to report on what preparation uncovers, by more than half.
In the same report, Gartner profiles vendors and DP tools. The analyst firm predicts that the market for DP solutions will reach $1 billion this year, with nearly a third (30 percent) of IT organizations employing some form of self-service data preparation toolset.
Another Gartner Research Circle survey on data and analytics trends revealed that over half (54 percent) of respondents want and need to automate their data preparation and cleansing tasks within the next 12 to 24 months.
To accelerate data understanding and improve trust, data preparation tools should provide crucial capabilities, including the ability to:
- Extract and profile data. Typically, a data prep tool provides a visual environment that enables users to interactively extract, search, sample, and prepare data assets.
- Create and manage data catalogs and metadata. Tools should create and search metadata, and track data sources, data transformations, and user activity against each data source. Tools should also keep track of data source attributes, data lineage, relationships, and APIs. All of this enables access to a metadata catalog for data auditing, analytics/BI, data science, and other operational use cases.
- Support basic data quality and governance features. Tools should integrate with other products that support data governance/stewardship and data quality requirements.
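The catalog capability above can be made concrete with a small sketch. This is a minimal, in-memory model of a catalog entry, assuming illustrative source and owner names; real DP tools persist these records and capture lineage automatically.

```python
# Minimal sketch of a metadata catalog entry; names are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CatalogEntry:
    source_name: str                  # table, file, or API endpoint
    owner: str                        # accountable data owner
    attributes: list                  # column/field names
    lineage: list = field(default_factory=list)       # upstream sources
    activity_log: list = field(default_factory=list)  # transformations applied

    def record_transformation(self, description: str) -> None:
        """Track each transformation applied to this source, with a timestamp."""
        stamp = datetime.now(timezone.utc).isoformat()
        self.activity_log.append((stamp, description))

entry = CatalogEntry("crm_accounts", "sales-ops", ["id", "name", "region"])
entry.record_transformation("standardized region codes")
print(len(entry.activity_log))  # 1
```

A persistent version would store these entries in a searchable repository so auditors and analysts can trace any dataset back to its sources.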
The challenge is getting good at data preparation. As a recent report by business intelligence pioneer Howard Dresner discovered, 64 percent of respondents frequently perform end-user DP, but only 12 percent reported they were very effective. Nearly 40 percent of data professionals spend half of their time prepping data rather than analyzing it.
The following are a few practices that help ensure optimal DP for your AI and BI projects. Many more are available from data preparation service and product suppliers.
Best Practice #1: Decide Which Data Sources Are Needed to Meet AI and BI Requirements
Take these three general steps to data discovery:
- Identify the data needed to meet required business tasks.
- Identify potential internal and external sources of that data (and include its owners).
- Ensure that each source will be available at the required frequencies.
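The third step above can be automated with a simple availability check. This is a hedged sketch, assuming illustrative source names, refresh requirements, and observed refresh times; in practice these would come from your catalog or scheduler.

```python
# Sketch: flag candidate sources whose refresh cadence misses the requirement.
from datetime import datetime, timedelta

required_frequency = {                 # business requirement per source (assumed)
    "orders_db": timedelta(hours=1),
    "partner_feed": timedelta(days=1),
}
last_refreshed = {                     # observed latest refresh times (assumed)
    "orders_db": datetime(2024, 1, 10, 9, 30),
    "partner_feed": datetime(2024, 1, 8, 0, 0),
}

def stale_sources(now: datetime) -> list:
    """Return sources whose latest refresh is older than the allowed interval."""
    return [
        name for name, freq in required_frequency.items()
        if now - last_refreshed[name] > freq
    ]

print(stale_sources(datetime(2024, 1, 10, 10, 0)))  # ['partner_feed']
```

Running such a check before committing to a source helps avoid discovering, mid-project, that a feed cannot keep up with the required cadence.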
Best Practice #2: Identify Tools For Data Analysis and Preparation
It will be necessary to load data sources into DP tools for data to be analyzed and manipulated. It’s essential to get the data into an environment where it can be closely examined and readied for the next steps.
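As a minimal sketch of this loading step, assuming pandas as the analysis environment (the article does not prescribe a specific tool), heterogeneous sources can be pulled into one tabular workspace:

```python
# Sketch: load an extract into an environment where it can be examined.
import io
import pandas as pd

# Stand-in for a real file; in practice: pd.read_csv("extract.csv"),
# pd.read_sql(...), pd.read_json(...), etc.
csv_extract = io.StringIO("id,amount\n1,10.5\n2,\n3,7.0\n")
df = pd.read_csv(csv_extract)

print(df.shape)    # (3, 2) -- rows x columns loaded
print(df.dtypes)   # quick first look at inferred types
```

Once the data sits in a single environment like this, the profiling and cleansing steps that follow become straightforward to iterate on.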
Best Practice #3: Profile Potential and Selected Source Data
Profiling is a crucial (but often discounted) step in DP. A project must analyze source data before it can be adequately prepared for downstream consumption. Beyond simple visual examination, you need to profile data, detect outliers, and find null values (and other unwanted data) among sources.
The primary purpose of this profiling analysis is to decide which data sources are even worth including in your project. As data warehouse guru Ralph Kimball writes in his book, The Data Warehouse Toolkit, “Early disqualification of a data source is a responsible step that can earn you respect from the rest of the team.”
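The profiling described above can be sketched in a few lines. This example, assuming pandas and an illustrative numeric column, counts nulls and flags outliers with a simple interquartile-range (IQR) rule; real profiling would cover every attribute of every candidate source.

```python
# Sketch: profile a column for nulls and outliers before accepting the source.
import pandas as pd

df = pd.DataFrame({"amount": [10.0, 12.0, 11.0, None, 13.0, 400.0]})

null_counts = df.isna().sum()                     # nulls per column
q1, q3 = df["amount"].quantile([0.25, 0.75])      # NaNs skipped by default
iqr = q3 - q1
outliers = df[(df["amount"] < q1 - 1.5 * iqr) |
              (df["amount"] > q3 + 1.5 * iqr)]

print(int(null_counts["amount"]))  # 1
print(len(outliers))               # 1 (the 400.0 record)
```

Profiles like this give you the evidence to disqualify a weak source early, as Kimball advises, rather than discovering its flaws downstream.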
Best Practice #4: Cleanse and Screen Source Data
Based on your knowledge of the end business analytics goal, experiment with different data cleansing strategies that will get the relevant data into a usable format. Start with a small, statistically valid sample to experiment iteratively with different data prep strategies, refine your record filters, and discuss the results with business stakeholders.
When you discover what seems to be an appropriate DP approach, take time to rethink the subset of data you really need to meet the business objective. Running data prep rules on the entire data set is time consuming, so think critically with business stakeholders about which entities and attributes you do and don’t need and which records you can safely filter out.
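The sample-first workflow above can be sketched as follows. The column names and filter rules here are assumptions for illustration; the point is that the same cleansing function runs first on a small reproducible sample for stakeholder review, then on the full set.

```python
# Sketch: iterate cleansing rules on a sample, then apply to the full data set.
import pandas as pd

df = pd.DataFrame({
    "region": ["US", "us", "EU", None, "US"],
    "amount": [100.0, -5.0, 250.0, 80.0, 60.0],
})

sample = df.sample(n=3, random_state=42)   # small, reproducible sample

def cleanse(frame: pd.DataFrame) -> pd.DataFrame:
    """Candidate rules: drop records with no region, normalize region codes,
    and filter out non-positive amounts (e.g. refunds or bad records)."""
    out = frame.dropna(subset=["region"]).copy()
    out["region"] = out["region"].str.upper()
    return out[out["amount"] > 0]

preview = cleanse(sample)   # review this with business stakeholders first
result = cleanse(df)        # then run the agreed rules on the full set
print(len(result))          # 3
```

Keeping the rules in one function means the version stakeholders approved on the sample is exactly the version applied to the full data set.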
Proper, thorough data preparation, conducted from the start of an AI/BI project, leads to faster, more efficient AI and BI down the line. The DP steps and processes outlined here apply to whatever technical setup you are using, and they will get you better results.
Note that DP is not a “do once and forget” task. Data is generated continuously from multiple sources that may change over time, and the context of your business decisions will undoubtedly change over time. Partnering with data preparation solution providers is an essential consideration for the long-term capability of your DP infrastructure.
References
- The Wall Street Journal, “AI Projects Bogged Down in Data Preparation,” May 29, 2019, p. B3
- Gartner Research, Market Guide for Data Preparation Tools, 2019
- Gartner Research, “Gartner Survey Shows Organizations Are Slow to Advance in Data and Analytics”
- Import.io, “10 Best Practices in Data Preparation”
About the Author
Wayne Yaddow has over 12 years of experience leading data migration/integration/ETL testing projects at organizations, including J.P. Morgan Chase, Credit Suisse, Standard and Poor’s, and IBM. Additionally, Wayne has taught International Institute of Software Testing (IIST) courses on data warehouse, ETL, and data integration testing. He continues to lead numerous ETL testing and coaching projects on a consulting basis. You can contact Wayne at firstname.lastname@example.org
Opinions expressed by DZone contributors are their own.