There’s an old saying that the key to success is to find a customer’s pain points and solve them. A survey of 218 IT professionals published this week by the research firm Futuriom suggests that the most critical pain point in all of IT right now is optimizing data center utilization.

To achieve that goal, organizations are exploring several approaches: adopting smart network interface cards (NICs), improving the overall efficiency of application code, relying more on graphics processing units (GPUs) and other domain-specific processors, making networks faster, and, finally, adding more servers.

The Futuriom survey suggests the two core issues IT organizations are trying to address with new technologies are employing smart NICs to offload the processing of network traffic in the data center in a way that provides more control over east-west traffic, followed by adopting various classes of emerging and existing virtualization technologies. The top reasons organizations are investing in smart NICs are to improve the efficiency of virtual machines and containers such as Docker (56 percent), followed by a need to share Flash-based storage more efficiently (55 percent), enable more software-defined networking (54 percent), and accelerate hyperconverged infrastructure (50 percent).

For example, the Futuriom survey found that virtual machines and containers ranked highest among technologies needed to drive data center efficiency (41 percent), followed closely by software-defined storage (40 percent), software-defined networking (39 percent), and software-defined security (36 percent).

The downstream opportunity

What quickly becomes apparent to savvy managed service providers (MSPs) is how much downstream opportunity will be driven by adoption of smart NICs.

Overall, the Futuriom survey makes it clear that despite the number of application workloads moving to public clouds, interest in optimizing existing data centers remains high. As much as 80 percent of all workloads still run in on-premises data centers. IT organizations may not be expanding the number of data centers they operate, but their focus on optimizing the utilization of their current data centers remains maniacal.

Part of that focus is also driven by concerns that Moore’s Law is starting to brush up against the limits of physics. Moore’s Law stipulates that processing power will double roughly every 18 months for the same price. As the number of transistors that can fit on a physical chip becomes limited, some worry that throwing hardware at application performance issues will no longer be as easy as it once was. As an alternative, chipmakers are stacking more chips on top of one another. It’s still not clear how effective that approach will be in addressing the inevitable heating and cooling issues.
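The compounding behind that 18-month doubling claim is easy to sketch. The snippet below (the helper name is illustrative, not from the article or any library) shows why the doubling cadence mattered so much: over a three-year hardware refresh cycle, it implied roughly a 4x gain for the same price.

```python
# Minimal sketch of the compounding implied by Moore's Law,
# assuming capacity doubles every 18 months.
def moores_law_factor(months: float, doubling_period: float = 18.0) -> float:
    """Return the growth multiple after `months`, given the doubling period."""
    return 2 ** (months / doubling_period)

# Over a typical three-year refresh cycle (36 months), capacity quadruples:
print(moores_law_factor(36))  # 4.0
```

If that cadence slows, the same three-year wait yields far less: stretch the doubling period to 36 months and the multiple drops from 4x to 2x, which is the arithmetic behind the worry that simply throwing new hardware at performance problems gets harder.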

Whatever the long-term outcome, the immediate opportunity for MSPs is to build a practice around all aspects of data center optimization. After all, data centers still represent the largest capital investment made by most organizations, so there will always be strong interest in optimizing those investments. The challenge and opportunity for MSPs is finding a way to start that conversation that resonates most with the end user.


Photo: Frame Stock Footages / Shutterstock

Posted by Mike Vizard

Mike Vizard has covered IT for more than 25 years, and has edited or contributed to a number of tech publications including InfoWorld, eWeek, CRN, Baseline, ComputerWorld, TMCNet, and Digital Review. He currently blogs for IT Business Edge and contributes to CIOinsight, The Channel Insider, Programmableweb and Slashdot. Mike blogs about emerging cloud technology for Smarter MSP.
