
Given the extensive history most IT organizations have managing data, leveraging all that information to drive a wide variety of digital processes should be a simple enough proposition. But it turns out melding data sources is anything but easy.

A new survey of 400 IT professionals in the U.S. and Europe, conducted by the research firm Vanson Bourne on behalf of Devo Technology, a provider of IT operational analytics applications delivered as a cloud service, sheds light on the scope of the challenge. While a full 91 percent of respondents agree that combining historical and real-time data would be valuable to their organization, providing truer and more actionable insights, 95 percent face obstacles when trying to get a single view of their data. The root cause of that problem is that 74 percent report their business currently uses different systems for real-time and historical data storage and analysis. In fact, 98 percent state their organization experiences challenges trying to reduce these silos as more data is generated at the edge of the enterprise, thanks to the rise of the Internet of Things (IoT).

IT organizations for some time now have been trying to rise to this data management challenge by investing in Big Data platforms such as Hadoop. The problem is that Big Data applications are difficult to set up and require a lot of data science expertise to master. That complexity is one of the reasons so many organizations are now researching machine and deep learning algorithms. Rather than spend so much time and effort trying to organize data, the hope is that algorithms will essentially discover patterns within the data with much less effort. In fact, algorithms are poised to utterly transform how data gets managed. Artificial intelligence (AI) applications require access to massive amounts of data to train the models built using algorithms. That data requirement in turn requires organizations to take a much more disciplined approach to how they construct data pipelines to collect, store, and prepare data to be consumed by those AI models, an approach also known as DataOps.
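To make the DataOps idea concrete, here is a minimal sketch in Python of the three pipeline stages described above: collecting raw readings (here, simulated IoT sensor data), storing them in one place so historical and real-time records don't end up in separate silos, and preparing them as normalized features a model can consume. Every name in it (the functions, the readings.csv file, the standardization step) is a hypothetical illustration of the pattern, not any particular vendor's pipeline.

import csv
import random
import statistics
from datetime import datetime, timezone

def collect():
    """Collect: simulate readings arriving from edge devices (e.g., IoT sensors)."""
    return [
        {"device_id": f"sensor-{i}",
         "temp_c": random.gauss(21.0, 2.5),
         "ts": datetime.now(timezone.utc).isoformat()}
        for i in range(100)
    ]

def store(readings, path="readings.csv"):
    """Store: persist raw readings so real-time and historical data land in one place."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["device_id", "temp_c", "ts"])
        writer.writeheader()
        writer.writerows(readings)

def prepare(path="readings.csv"):
    """Prepare: normalize the stored data into features a training job can consume."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    temps = [float(r["temp_c"]) for r in rows]
    mean, stdev = statistics.mean(temps), statistics.stdev(temps)
    # Standardize each reading (zero mean, unit variance) -- a typical
    # preprocessing step before data is fed to an AI model.
    return [{"device_id": r["device_id"],
             "temp_z": (float(r["temp_c"]) - mean) / stdev}
            for r in rows]

if __name__ == "__main__":
    raw = collect()
    store(raw)
    features = prepare()
    print(f"Prepared {len(features)} records for model training")

The point of the sketch is the discipline, not the code: each stage has one job, and the handoffs between them are explicit, which is what makes the pipeline repeatable as data volumes grow.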

The golden opportunity for MSPs

As organizations start to focus more on how the data they collect needs to be melded with other data sources to drive digital business processes, managed service providers (MSPs) are being presented with a golden opportunity. Most organizations don't want to be in the business of managing data. They want to enjoy the business benefits of being able to analyze that data. MSPs that position themselves as providers of data management services that enable next-generation analytics applications are going to be warmly received. A recent survey of 500 IT leaders conducted by SnapLogic, a provider of integration software, finds that the budget dollars specifically allocated to operationalizing data are, on average, set to double over the next five years.

The truth of the matter is that most IT organizations today don't have a clue as to how to go about operationalizing data. Most IT organizations are tasked with storing, protecting, and making sure data is available. Being able to identify what data is most valuable to the business is not something most internal IT organizations do especially well. Being able to combine various data sources in a way that drives value for the business is even more problematic. MSPs that can bridge the data gap between internal IT and lines of business trying to operationalize data will clearly be worth their weight in digital gold.

Photo: Jirsak / Shutterstock.



Posted by Mike Vizard

Mike Vizard has covered IT for more than 25 years, and has edited or contributed to a number of tech publications including InfoWorld, eWeek, CRN, Baseline, ComputerWorld, TMCNet, and Digital Review. He currently blogs for IT Business Edge and contributes to CIOinsight, The Channel Insider, Programmableweb and Slashdot. Mike blogs about emerging cloud technology for Smarter MSP.
