
Artificial intelligence is all the rage these days, but many managed service providers (MSPs) have been making use of it in some capacity for years. What has changed is the rise of generative AI: machine learning algorithms can now be employed not only to predict events but also to generate text, images, and code.

In general, there are three types of AI models. Predictive AI models use machine learning algorithms to determine the probability of specific outcomes based on data collected from past events. They are already widely used in various IT operations platforms to determine, for example, how likely a service is to run out of capacity as the amount of data being processed continues to increase.
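
To make that idea concrete, the sketch below shows one simple way such a capacity forecast could be built. It is a minimal illustration, assuming a month of daily storage-usage samples, a hypothetical 500 GB limit, and scikit-learn for the trend model; none of these details come from any specific platform.

```python
# Minimal sketch: projecting when a service might run out of capacity,
# assuming daily storage-usage samples (GB) and an illustrative 500 GB limit.
import numpy as np
from sklearn.linear_model import LinearRegression

days = np.arange(30).reshape(-1, 1)                               # day index for the past month
usage_gb = 200 + 8 * days.ravel() + np.random.normal(0, 5, 30)    # synthetic usage history

model = LinearRegression().fit(days, usage_gb)                    # fit a simple growth trend

capacity_gb = 500
future_days = np.arange(30, 90).reshape(-1, 1)
forecast = model.predict(future_days)
breach = future_days[forecast >= capacity_gb]
if breach.size:
    print(f"Projected to reach {capacity_gb} GB around day {int(breach[0][0])}")
else:
    print("No capacity breach projected in the next 60 days")
```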

The second type of AI model is encountered less frequently. Known as causal AI, it employs machine learning algorithms to determine the cause of a complex series of events. The goal is to provide insight into the root cause of, for example, an IT incident so that IT teams can prevent it from happening again.

Finally, there is generative AI, which uses machine learning algorithms to create a large language model (LLM) that generates content in response to prompts from an end user. The goal is to automatically create text, images, and code based on the data used to train the LLM. These capabilities will make it much easier for MSPs to onboard new IT staff, who will be able to use natural language to learn far more quickly how various IT environments are behaving. Just as importantly, an LLM can also be used to generate the code needed to remediate a newly discovered vulnerability.
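
As a rough illustration of that last point, the sketch below asks an LLM to draft remediation steps for a scanner finding. It assumes the OpenAI Python SDK and an API key in the environment; the model name, prompt, and finding are illustrative placeholders, and any generated script would still need human review before use.

```python
# Minimal sketch: asking an LLM to draft remediation code for a vulnerability.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable;
# the model name and the example finding below are illustrative.
from openai import OpenAI

client = OpenAI()

finding = "Outdated OpenSSH package on ubuntu-web-01 flagged by the vulnerability scanner"
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You are an IT operations assistant for an MSP."},
        {"role": "user", "content": f"Draft a remediation script for this finding: {finding}"},
    ],
)
print(response.choices[0].message.content)  # draft only; review before running anywhere
```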

The best-known instance of generative AI is the ChatGPT service created by OpenAI, but there will soon be thousands of LLMs trained on narrow sets of data to create content for specific domains. The expectation is that these domain-specific LLMs, which will be easier and less costly to build, will also be less likely to hallucinate because their training data is smaller and more thoroughly vetted.

MSPs will use AI models to optimize processes

None of these AI models supersedes the need for the others. In fact, MSPs will soon find themselves using them in concert to optimize a wide range of processes. The issue MSPs will need to come to terms with is to what degree they want to rely on platforms they have already adopted that embed these models versus customizing or building an LLM themselves.

A wide range of foundational LLMs that MSPs could customize with their own data is already available. In the short term, however, most MSPs will look to extend an LLM by loading data into a vector database the LLM can then query alongside the data it was trained on. Crucially, the vector database also provides a layer of isolation that prevents proprietary data from inadvertently being used to retrain an LLM in the future.
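
The sketch below illustrates that retrieval pattern at its simplest. It assumes the chromadb library as the vector store, and the runbook snippets and query are made-up examples; the retrieved text would be passed to the LLM as prompt context rather than becoming part of its training data.

```python
# Minimal sketch: proprietary data lives in a vector database and is retrieved
# at query time, so the LLM itself is never retrained on it.
# Assumes the chromadb library; the documents below are illustrative.
import chromadb

client = chromadb.Client()                      # in-memory vector store
runbooks = client.create_collection("msp_runbooks")

runbooks.add(
    ids=["rb-001", "rb-002"],
    documents=[
        "Backup job failures for client A are usually caused by expired credentials.",
        "Client B's VPN outages correlate with firmware updates on the edge firewall.",
    ],
)

# Retrieve the most relevant internal context for a question, then include the
# retrieved text in the LLM prompt alongside the user's question.
hits = runbooks.query(query_texts=["Why do backups keep failing for client A?"], n_results=1)
print(hits["documents"][0][0])
```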

Every aspect of delivering managed IT services will be transformed in 2024. The challenge and the opportunity for MSPs is to determine to what degree processes will be automated so they can remain competitive tomorrow.

Photo: Prostock-studio / Shutterstock



Posted by Mike Vizard

Mike Vizard has covered IT for more than 25 years, and has edited or contributed to a number of tech publications including InfoWorld, eWeek, CRN, Baseline, ComputerWorld, TMCNet, and Digital Review. He currently blogs for IT Business Edge and contributes to CIOinsight, The Channel Insider, Programmableweb and Slashdot. Mike blogs about emerging cloud technology for Smarter MSP.
