IT organizations are starting to signal their intent to consolidate the analytics applications they need to support around a common pool of “big data” whenever possible. A recent report from Dresner Advisory Services finds that 60 percent of respondents would prefer a single analytical data infrastructure (ADI) platform, with the majority of those respondents preferring that platform to reside in the cloud.

The challenge most organizations have historically faced is that it has been relatively simple for individual lines of business to set up their own analytics applications, because the cost of deploying these applications locally or accessing them as a cloud service has continued to decline. As more organizations come to view data as a strategic asset, however, there is a push to centralize it. The primary driver of that shift will be the rise of artificial intelligence (AI) applications, which require access to massive amounts of data to accurately create AI models that automate business processes. While that shift is still relatively nascent, the critical mass of data being aggregated will eventually exert enough gravity to pull all of an organization's analytics applications within the scope of the big data repository being employed to drive AI applications.

Most of those big data repositories will inevitably be deployed on public clouds, because the cost of deploying them in an on-premises IT environment is simply going to be prohibitive. Recognizing that, cloud service providers are doing everything they can to make their platforms the most economically attractive place to store massive amounts of data. Every cloud service provider knows that the more data that winds up on its platform, the greater the effects of data gravity become. The more data there is on a cloud, the more likely it becomes that developers will want to build applications on the cloud closest to the data those applications need to access.

Industry insights on the effects of data gravity

Jason Woodrum is a director of public cloud solutions for Ensono, a provider of managed services for both Amazon Web Services (AWS) and Microsoft Azure. He says the data gravity issue is going to drive more organizations to focus their efforts on a single cloud platform, because the operational costs of trying to manage data strewn across multiple clouds will simply be too high.

“We tell clients to pick a cloud and go,” says Woodrum.

Savvy managed service providers (MSPs) will clearly need to align their resources based on the amount of data their clients have collectively decided to host on any of the major cloud platforms. That doesn’t mean MSPs won’t need to support multiple clouds, but the number of cloud platforms any MSP will be able to effectively support will be limited.

It will be a few more years before the rise of AI and big data starts to force these cloud issues, but as far as MSPs should be concerned, the writing is already on the wall.

Photo: whiteMocca / Shutterstock


Posted by Mike Vizard

Mike Vizard has covered IT for more than 25 years, and has edited or contributed to a number of tech publications including InfoWorld, eWeek, CRN, Baseline, ComputerWorld, TMCNet, and Digital Review. He currently blogs for IT Business Edge and contributes to CIOinsight, The Channel Insider, Programmableweb and Slashdot. Mike blogs about emerging cloud technology for Smarter MSP.
