AI data management

As interest in artificial intelligence (AI) continues to grow rapidly, data is often called the new oil. The machine and deep learning algorithms that drive AI applications require access to massive amounts of data to work.

A new survey of 100 business executives, conducted by Corinium Digital, an online community focused on analytics, and commissioned by Paxata, Accenture Applied Intelligence, and Microsoft, finds that 72 percent of large enterprise IT organizations have already allocated more than $2 million to AI initiatives in the 2018/2019 timeframe. A full 93 percent of respondents say their organizations are investing more than $1 million in analytics initiatives over the same period.

Data management challenges

But when it comes to AI and analytics, there’s a significant hurdle to overcome. Just like oil, data needs to be refined before it has any real business value. A total of 70 percent of respondents, however, admit their IT/data management teams struggle to support analytics applications. Well over a third (38 percent) say data quality is a problem. More than half (54 percent) say they’re only somewhat confident in the quality of the results, and 15 percent express low confidence in the quality of the results they’re getting.

The data management issue most organizations are having is that managing data as an asset is very different from managing data as a burden. Historically, most IT organizations have tried to limit the amount of data being actively managed. Now they’re being asked to manage massive amounts of data being generated in the cloud or on premises to support algorithms being employed in multiple AI models. In some instances, that even means moving or replicating massive amounts of data from one physical location to another.

Groundwork for AI success

All that data is at the heart of digital business transformation projects being enabled by AI, which explains why so much money is being dedicated to them. It’s only a matter of time before even more dollars get allocated to cleaning up the data mess that exists within most organizations.

Today, an untold number of disparate applications regularly generate data in isolation from one another. There are no structured processes in place to feed all that data into the AI applications that organizations are betting will ultimately make them more competitive. Because AI applications are considered strategic, the amount of money that will be allocated to build the data pipelines needed to feed those applications will be significant.

AI applications may seem like something that will bear fruit in the distant future. But the millions of dollars already being spent on AI applications should begin to have a significant impact on how businesses are run before the end of the decade. The AI opportunity managed service providers have in front of them now is enabling organizations to implement a more disciplined approach to managing data.

The truth is that few organizations would earn the equivalent of a Good Housekeeping seal of approval when it comes to data management. Yet it’s already becoming apparent that the difference between success and failure in the coming AI-infused era will come down to how well all that data gets managed.

Subscribe to SmarterMSP

Photo: ktsdesign/Shutterstock.com

Posted by Mike Vizard

Mike Vizard has covered IT for more than 25 years, and has edited or contributed to a number of tech publications including InfoWorld, eWeek, CRN, Baseline, ComputerWorld, TMCNet, and Digital Review. He currently blogs for IT Business Edge and contributes to CIOinsight, The Channel Insider, Programmableweb and Slashdot. Mike blogs about emerging cloud technology for Smarter MSP.
