A research report published this week by CAST Software, a provider of tools for analyzing software, uncovers an interesting correlation between how old an application is and the potential impact it can have on the business should it become unavailable.

A total of 2,067 applications, representing 733 million lines of code across 14 different technologies and developed and maintained by more than 12,000 people across multiple verticals, were examined using a Software-as-a-Service (SaaS) application the company developed. The research concludes that over 75 percent of applications built in the 1980s would have a critical impact on business operations if they went down, compared to just under 50 percent of those written in this decade. Newer applications, however, are significantly more resilient than older ones, which the study suggests creates a paradox: organizations are relying on older software for the mission-critical applications that are most likely to be disrupted.

Naturally, an application that is more than a decade old is only likely to still be in use if it has withstood the test of time. But from a managed service provider (MSP) perspective, the report makes it clear that organizations relying on older applications face the highest risk. Those organizations are left with a stark choice: either modernize or replace the legacy applications, or invest more in various forms of data protection to minimize any disruption to them.

Making the case for data protection services

The research published by CAST Software suggests that MSPs that specialize in data protection should be applying a new calculus to how they make the case for data protection services. Identifying the applications that would have the biggest impact on the business if they went down is only the start of the exercise. MSPs should also calculate the probability of that disruption occurring based on the age of the application. The older an application is, the more important it is to the business; otherwise, the organization would have already replaced it. MSPs that can identify those older applications can make a more compelling economic case for the data protection services they provide.

Of course, once an organization understands the economic risk inherent in relying on older applications, it may move to replace them. That creates a separate application modernization opportunity for the MSP. No application, no matter how modern, is free from potential disruption, so given that the business value of the application has already been established, applying the appropriate level of data protection is still a relevant discussion.

How much an MSP can charge for those services will vary widely. But the more critical an application is to an organization, the more likely it is that the organization will pay a premium to protect it. For example, Disaster Recovery-as-a-Service (DRaaS) becomes a lot more interesting to a customer that equates how long it really takes to regain access to an application with the economic disruption to the business. After all, data protection only becomes a compelling conversation once the age-old relationship between time and money becomes a real number to the business.

Photo:  sdecoret / Shutterstock.

Posted by Mike Vizard

Mike Vizard has covered IT for more than 25 years, and has edited or contributed to a number of tech publications including InfoWorld, eWeek, CRN, Baseline, ComputerWorld, TMCNet, and Digital Review. He currently blogs for IT Business Edge and contributes to CIOinsight, The Channel Insider, Programmableweb and Slashdot. Mike blogs about emerging cloud technology for Smarter MSP.
