With the rate at which workloads have been moving to public clouds, it may not come as a surprise to managed service providers (MSPs) that capital spending in hyperscale environments is increasing.

A survey of 200 IT decision makers published by Schneider Electric finds that capital spending in hyperscale environments totaled $116 billion over the last four quarters, with consulting engineers dedicating almost half of their time (45 percent) to hyperscale and colocation projects during the same period. That share is expected to climb to 52 percent within the next three years, according to the survey.

The debate between public cloud and on-premises environments

Based on those forecasts, it might be easy for MSPs to conclude that every workload is eventually headed to the public cloud. However, the public cloud has now been around for a decade, and 70 to 80 percent of workloads still run in on-premises environments.

As it turns out, there are plenty of very good reasons to continue running workloads in a local data center, including application performance, data governance, security, and cost. In many cases, running certain classes of applications in the cloud can be more expensive than running them in an on-premises environment.

Beyond the debate about costs, the primary reason many organizations embrace public clouds is agility. However, local data centers are becoming more agile thanks to modern hyperconverged infrastructure augmented by managed services, often provided by the vendor. Hewlett Packard Enterprise (HPE), for example, wraps its GreenLake managed service around its servers, and that service is now consumed by 740 customers. Even more surprising, it is being resold by roughly 500 HPE partners.

In fact, HPE GreenLake is only one of many managed services now being provided by vendors. Microsoft made it clear during its Ignite conference this week that it intends to extend the reach of the Azure cloud service into on-premises IT environments.

At the same time, Volterra, a startup backed by $50 million in financing, this week launched a managed service built on top of Kubernetes that can be deployed on any public cloud or in a local data center. The next wave of managed services may very well span hybrid cloud computing environments, enabling customers to deploy workloads wherever and whenever they please.

MSPs as co-pilots

The one thing that all these platforms have in common is that MSPs in each of these cases are being asked to function as co-pilots. Rather than having to build and manage an entire service from the ground up, the service is delivered in partnership with a vendor.

However, it’s not only IT vendors that want more control. End customers now also often want to be able to manage the application environment alongside the MSP. In effect, MSPs are being asked to co-pilot the management of infrastructure alongside vendors and the management of applications alongside the end customer.

That fundamental shift is obviously going to take some getting used to for MSPs that are accustomed to controlling the entire IT environment. Like it or not, many MSPs will need to accept that they are not exercising as much control over the proverbial IT flight as they once were — even though they are still likely to get blamed, should that plane not arrive at the right destination.

Photo: light poet / Shutterstock


Posted by Mike Vizard

Mike Vizard has covered IT for more than 25 years, and has edited or contributed to a number of tech publications including InfoWorld, eWeek, CRN, Baseline, ComputerWorld, TMCNet, and Digital Review. He currently blogs for IT Business Edge and contributes to CIOinsight, The Channel Insider, Programmableweb and Slashdot. Mike blogs about emerging cloud technology for Smarter MSP.
