
While there is a massive amount of interest in all things relating to generative artificial intelligence (AI), the number of organizations that have the skills and resources required to fully exploit it remains limited.

Under the hood of generative AI platforms

There are, of course, millions of users taking advantage of platforms such as ChatGPT to become more productive. However, any organization that wants to enable generative AI capabilities will need to acquire significant data management expertise in order to train or customize the underlying large language model (LLM). It will also need the ability to deploy and manage some type of vector database.

A vector database makes it possible to present unstructured data in a format that an LLM can recognize. The LLM then uses that external data alongside the data it was originally trained on to generate better-informed responses and suggestions. Once that foundation is in place, organizations can go a step further by using a framework such as LangChain to build and deploy an AI application.
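For illustration, here is a minimal sketch of the retrieval pattern a vector database enables, assuming hypothetical embed() and generate() helpers as stand-ins for a real embedding model and LLM. A framework such as LangChain wraps these same steps (indexing, similarity search, prompt augmentation) behind higher-level abstractions.

```python
import numpy as np

# Stand-ins for a real embedding model and LLM (hypothetical, for illustration only).
def embed(text: str) -> np.ndarray:
    # A real implementation would call an embedding model; here we hash words into a toy vector.
    vec = np.zeros(64)
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    return vec

def generate(prompt: str) -> str:
    # A real implementation would send the augmented prompt to an LLM.
    return f"[LLM response to a {len(prompt)}-character prompt]"

# 1. Index unstructured documents as vectors (the job a vector database performs at scale).
documents = [
    "Invoices are processed within five business days.",
    "Support tickets are triaged by severity, then by age.",
    "Refunds over $500 require manager approval.",
]
index = [(doc, embed(doc)) for doc in documents]

def retrieve(query: str, k: int = 2):
    """Return the k documents whose vectors are closest to the query (cosine similarity)."""
    q = embed(query)
    scored = [(float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v) + 1e-9)), doc)
              for doc, v in index]
    return [doc for _, doc in sorted(scored, reverse=True)[:k]]

# 2. Augment the prompt with retrieved context before asking the LLM.
question = "Who has to sign off on a large refund?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(generate(prompt))
```

In production, the in-memory list would be replaced by a purpose-built vector database and the toy helpers by hosted or self-managed models, but the flow (embed, store, retrieve, augment, generate) is the same.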

Building their own

Some organizations may even go so far as to build their own LLM or customize an existing one to ensure the highest level of accuracy. As most end users have come to discover, a general-purpose LLM such as ChatGPT that was trained on a mass of often conflicting data is prone to "hallucinations" that produce erroneous responses to queries. Just as challenging, a general-purpose LLM might not provide a consistent answer to the same query launched at different times.

Most organizations that want to take advantage of generative AI are clearly going to need some outside help. Organizations with the data scientists, data engineers, application developers, and cybersecurity experts required to build and deploy generative AI applications are still few and far between. And the majority are still trying to understand the use cases that generative AI technologies can enable.

The one thing that is certain is that generative AI will eventually transform almost every business process, as natural language prompts replace existing user interfaces to automate a wide range of tasks. Arguably, the most limiting factor right now is not the underlying technology but rather our collective imaginations.

The generative AI opportunity for MSPs

Managed service providers (MSPs) that can exhibit a mastery of generative AI will be in high demand for many years to come. Many organizations are leery of putting their most sensitive data into a cloud service, so the number looking to build or customize LLMs running in a private cloud or on-premises IT environment will only steadily increase. There is also a wide range of data sovereignty and privacy issues that will only become more challenging to navigate as AI regulations inevitably grow more stringent.

Of course, there is no shortage of service providers that see the same opportunity. The race to provide managed generative AI services is already on. The only thing that remains to be seen is how many winners there will be. Generative AI will ultimately not only transform every existing business process, but also enable new ones that previously would never have been thought possible.

Photo: Deemerwha studio / Shutterstock



Posted by Mike Vizard

Mike Vizard has covered IT for more than 25 years, and has edited or contributed to a number of tech publications including InfoWorld, eWeek, CRN, Baseline, ComputerWorld, TMCNet, and Digital Review. He currently blogs for IT Business Edge and contributes to CIOinsight, The Channel Insider, Programmableweb and Slashdot. Mike blogs about emerging cloud technology for Smarter MSP.
