Docker containers

By now, most managed service providers (MSPs) are at least aware of Docker containers. But far too many of them view containers simply as a tool for building next-generation applications based on microservices. In practice, one of the most pressing use cases for Docker containers is lifting and shifting existing enterprise applications into the cloud.

That capability creates a unique opportunity for MSPs to cost-effectively address one of the biggest issues associated with migrating workloads to the cloud. The main reason more workloads are not being migrated to the cloud is that, in most cases, they first need to be refactored. For example, it's not possible to run an application that was originally deployed on a VMware hypervisor on an instance of the open source hypervisor employed by Amazon Web Services (AWS). In fact, that issue is at the heart of why VMware eventually allied with AWS to host its software stack on the AWS cloud.

Easing the cloud migration process

But many organizations don't want to continue paying licensing fees for VMware software in the cloud. The path of least resistance for many is to encapsulate an existing application in a Docker container, which enables that application to run anywhere without being refactored.
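As an illustration of what that encapsulation can look like, the following is a minimal, hypothetical Dockerfile sketch. The base image, application name, and port are assumptions for illustration, not details from any specific migration:

```dockerfile
# Hypothetical sketch: packaging an existing Java web application
# into a container image so it can run unchanged on any Docker host.
FROM tomcat:9.0-jdk11

# Copy the legacy application archive into the servlet container.
COPY legacy-app.war /usr/local/tomcat/webapps/ROOT.war

# The port the application listens on inside the container.
EXPOSE 8080
```

Once the image is built with `docker build` and pushed to a registry, the same image can be deployed on premises or on any cloud that runs Docker, without refactoring the application itself.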

The biggest challenge then becomes making some form of persistent storage available to those encapsulated applications in the cloud. Most of the applications an enterprise IT organization will want to port are stateful, which means they need access to storage. While advances have been made in making storage available to containers, it's still a work in progress.
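To make the storage requirement concrete, a common pattern is to back a stateful container with a named volume that outlives any single container. The following docker-compose sketch is hypothetical; the service name, image, and paths are assumptions for illustration:

```yaml
# Hypothetical sketch: a stateful database container whose data
# directory lives on a named volume, so state survives container
# restarts and re-deployments.
services:
  legacy-db:
    image: postgres:15
    volumes:
      - app-data:/var/lib/postgresql/data

volumes:
  app-data:
```

The named volume decouples the data from the container's lifecycle. In the cloud, that volume is typically provisioned from the provider's block or file storage service, which is where much of the remaining integration work lies.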

One way around that issue comes from DH2i, whose DxEnterprise (DxE) software makes it possible to containerize the underlying database that most applications rely on to access storage. The version of DxE announced today extends that capability by making it possible to run, for example, containerized instances of Microsoft SQL Server on Windows that can automatically fail over to an instance of SQL Server running on Linux.

How MSPs can help

In general, DH2i CEO Don Boxley says, MSPs have been slow to realize the opportunities afforded by containers, mainly because they're still focused on virtual machines as the primary mechanism around which they deliver services. Customers are under pressure to move workloads to the cloud, but most organizations are not migrating as quickly as they'd like because the tasks involved in moving applications remain a challenge. Docker containers give MSPs a mechanism to offer a service that specifically addresses that issue by simplifying the migration process. The atomic unit of computing, however, becomes the container rather than the virtual machine.

Naturally, once those applications are migrated to the cloud, they need to be managed. Long term, many of those monolithic applications will be carved up into sets of more manageable microservices. The opportunity created by containers therefore extends well beyond a one-time migration. But for MSPs to take advantage of that opportunity, they need to develop core Docker container competency now. That means investing in training to get personnel certified. After all, technicians with Docker skills aren't going to magically appear one day. They need to be developed as part of a carefully crafted modernization strategy that spans well into the next decade.


Photo: MOLPIX/Shutterstock.com

Posted by Mike Vizard

Mike Vizard has covered IT for more than 25 years, and has edited or contributed to a number of tech publications including InfoWorld, eWeek, CRN, Baseline, ComputerWorld, TMCNet, and Digital Review. He currently blogs for IT Business Edge and contributes to CIOinsight, The Channel Insider, Programmableweb and Slashdot. Mike blogs about emerging cloud technology for Smarter MSP.
