As the IT industry enters 2019, there’s a lot of excitement about edge computing. Of course, edge computing is not necessarily a new idea. IT organizations have been deploying applications in, for example, remote offices or on various classes of embedded systems for decades. What is changing is the nature and characteristics of the applications being deployed.
The future of edge computing
Larger amounts of data will increasingly be processed at the edge as a new generation of applications gets deployed on smaller, denser systems. Many of those systems will be based on the same hyperconverged infrastructure (HCI) that is now deployed widely in the cloud and in on-premises IT environments. In some circles, the processing of data at the edge is also referred to as “fog computing.” Systems deployed at the edge will in turn be connected back to applications running in the cloud or in a local data center using 4G/5G wireless connections or software-defined wide area networks.
Most of the applications deployed at the edge will also be based on modern microservices architectures. These typically employ containers and Kubernetes orchestration software to deliver a more granular set of applications that promise to be simpler to continually update using DevOps processes. The goal is to deliver a more real-time application experience by pushing the processing of as much code as possible out to the network edge.
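To make that pattern concrete, here is a minimal sketch of how a containerized microservice might be pinned to edge hardware using Kubernetes. The application name, container image, and node label below are all hypothetical, chosen purely for illustration; the key idea is the `nodeSelector`, which tells the scheduler to place the workload only on nodes labeled as edge systems, with tight resource limits to suit small, dense hardware.

```yaml
# Hypothetical example: a Deployment that schedules a microservice
# onto edge nodes only. Names, images, and labels are illustrative.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sensor-ingest
spec:
  replicas: 2
  selector:
    matchLabels:
      app: sensor-ingest
  template:
    metadata:
      labels:
        app: sensor-ingest
    spec:
      nodeSelector:
        node-role.example.com/edge: "true"   # label previously applied to edge nodes
      containers:
      - name: sensor-ingest
        image: registry.example.com/sensor-ingest:1.0
        resources:
          limits:
            memory: "128Mi"   # edge systems are small and dense,
            cpu: "250m"       # so workloads are kept lightweight
```

In practice, an operator would first tag the relevant machines with a command such as `kubectl label node <node-name> node-role.example.com/edge=true`, after which the scheduler handles placement automatically.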
A skill and resource gap means opportunity for MSPs
Naturally, pushing all that code out to the edge creates a massive opportunity for providers of managed services. Most internal IT organizations simply don’t have the skills and resources required to manage a highly distributed application environment. What has yet to be determined, however, is who precisely will deliver those managed services.
IT vendor VMware has launched a Project Dimension initiative around a stack of HCI software that can be deployed on various classes of dense servers being developed by Dell EMC and other manufacturers of traditional HCI platforms. As part of the Project Dimension initiative, VMware is making a case for managing all those platforms as part of a managed service it delivers.
It’s still early days as far as any form of edge computing is concerned, and which MSP models will dominate it remains to be seen. MSPs, however, will have no shortage of options, whether building out their own managed services or opting to resell a service provided by a vendor.
In the meantime
Fortunately, industry resources that MSPs can tap to create an edge computing practice are starting to coalesce. The Industrial Internet Consortium, for example, has agreed to merge operations with the OpenFog Consortium in 2019. Those consortiums are already refining various types of reference architectures for a wide number of edge computing use cases, including Internet of Things (IoT) applications. The market research firm Technavio estimates those use cases will fuel $5.7 billion of additional IT spending between now and 2023. In fact, as edge computing continues to evolve, the size of that market opportunity may one day rival that of public clouds.
It may take the better part of a decade for edge computing to fully manifest itself. But, as MSPs head into 2019, the size and scope of the opportunities ahead continue to expand exponentially.
Photo: Jakub Grygier / Shutterstock