Using AI to clean up big data

Big data is a hot topic right now, but successfully exploiting that data largely rests on an organization's ability to give employees clean, accurate and usable data from which to draw real-time insights.  Suffice it to say, much of the data held in organizational databases is anything but clean, and few organizations seem willing to undertake the laborious job of cleaning it up.

AI may be about to come to the rescue, as a team of researchers from Columbia University and the University of California at Berkeley have developed some automated software to do the job for you.

AI to the rescue

The software, called ActiveClean, runs prediction models over a dataset and uses the results to work out which fields need cleaning, updating the models as it goes.
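
In practice that means cleaning a small batch of records, retraining the model, and repeating, rather than scrubbing the entire dataset before any analysis happens. Below is a rough sketch of that clean-a-little, retrain-a-little loop in Python; the toy dataset and the "fix" step are stand-ins for illustration and are not part of ActiveClean itself.

```python
# A minimal sketch of the clean-a-little, retrain-a-little loop described above.
# The toy dataset and the "fix" step are illustrative stand-ins; none of this is
# ActiveClean's actual API.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy data: 1,000 records, 5 features; keep a pristine copy so "cleaning" a
# record simply means restoring its true values.
X_true = rng.normal(size=(1000, 5))
y = (X_true[:, 0] + X_true[:, 1] > 0).astype(int)
X = X_true.copy()
dirty = rng.random(1000) < 0.2                        # ~20% of rows are corrupted
X[dirty] += rng.normal(scale=5.0, size=(int(dirty.sum()), 5))

model = LogisticRegression(max_iter=1000).fit(X, y)

# Clean in small batches and retrain after each batch, instead of waiting
# until the whole dataset has been scrubbed.
to_clean = list(np.flatnonzero(dirty))
batch_size = 50
while to_clean:
    batch, to_clean = to_clean[:batch_size], to_clean[batch_size:]
    X[batch] = X_true[batch]                          # stand-in for a real cleaning step
    model.fit(X, y)                                   # update the model as we go
    print(f"cleaned {len(batch)} more records, accuracy now {model.score(X, y):.2f}")
```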

“Big data sets are still mostly combined and edited manually, aided by data-cleaning software like Google Refine and Trifacta or custom scripts developed for specific data-cleaning tasks,” the researchers say. “The process consumes up to 80 percent of analysts’ time as they hunt for dirty data, clean it, retrain their model and repeat the process. Cleaning is largely done by guesswork.”

As with so many laborious processes, human error can be a significant factor, so ActiveClean takes people out of the equation in the two most error-prone areas: finding the dirty data in the first place, and then updating the models accordingly.

The software uses machine learning to analyze the structure of a model and determine the kinds of errors that model is likely to generate.  When tested against a couple of control methods, the software delivered positive results.
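
To give a feel for how a model's own predictions can flag which records are worth cleaning first, here is a small illustrative sketch; the ranking rule used here (per-record log loss) is an assumption for illustration, not a detail taken from the ActiveClean work.

```python
# A sketch of one way a model can point at likely-dirty records: rank them by
# how badly the current model fits them and send the worst offenders to the
# cleaner first. This illustrates the idea above; it is not ActiveClean's code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
y = (X @ np.array([1.0, -1.0, 0.5, 0.0]) > 0).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Per-record log loss under the current model.
p_correct = model.predict_proba(X)[np.arange(len(y)), y]
per_record_loss = -np.log(np.clip(p_correct, 1e-12, None))

# Highest-loss records are the most suspicious, so they get inspected first.
priority = np.argsort(per_record_loss)[::-1]
print("records to inspect first:", priority[:10])
```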

The testing was undertaken using ProPublica's Dollars for Docs database, which contains over 240,000 records of corporate donations to doctors.  The data was notably 'messy', with multiple names for a single drug commonplace.  Because of this messiness, improper donations could only be detected 66% of the time, but after ActiveClean had cleaned just 5,000 records, this jumped to 90%.

“As datasets grow larger and more complex, it’s becoming more and more difficult to properly clean the data,” the researchers say. “ActiveClean uses machine learning techniques to make data cleaning easier while guaranteeing you won’t shoot yourself in the foot.”
