
“Hello, how are you today?” A simple enough question, often used to start a conversation. When a generative AI (GenAI) engine opens with it, the human will usually give a simple response, such as “I’m fine, thanks.” The GenAI engine can then move on to the main purpose of the interaction, such as “How can I help you today?”

In the first instance, any common response by a GenAI engine is based on analyzing a large dataset of how humans interact. This is often data scraped from the open internet and used to train a large language model (LLM). LLMs underpin most GenAI systems – but there are issues that have had to be addressed as the AI market has evolved.

Transformers help filter out harmful or factually incorrect content

The problem with using the public internet as a data source is that much of it is not factually correct – arguably, there is more ‘wrong’ data out there than accurate content. Training an LLM on it and using it without filters in a GenAI environment leads to undesirable outcomes. In early tests, many GenAI systems produced responses that were not only factually incorrect, but also racist, misogynistic, religiously biased, and so on. Luckily, such issues were noticed quickly, and steps have been taken to weed them out.

This has been done by using what are called ‘transformers.’ At a simple level, the transformer is the neural network architecture that underpins modern LLMs, giving the GenAI engine its base way of operating. It ensures that output follows common syntactical usage and that probabilistic analysis is used to complete sentences (for example, managing to fill out a sentence such as “I would * to know the cost involved with * a product” as “I would LIKE to know the cost involved with BUYING a product”).
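As a simplified illustration of that probabilistic completion, the sketch below asks a small masked language model to fill the gap in a similar sentence. It assumes the Hugging Face transformers library and the bert-base-uncased model, both chosen purely for illustration – any given GenAI engine will use its own, far larger model.

```python
# A minimal sketch of probabilistic sentence completion, assuming the
# Hugging Face "transformers" library and a small BERT-style masked model.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Ask the model to fill the gap; it returns its most likely tokens with scores.
for prediction in fill_mask(
    "I would [MASK] to know the cost involved with buying a product."
):
    print(f"{prediction['token_str']:>10}  score={prediction['score']:.3f}")
```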

It is now widely recognized that transformers can also be used to filter what an LLM produces. By seeding the system with examples of unacceptable content, you can make the output from an LLM more suitable for use in both commercial and domestic environments.
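As a deliberately simplified stand-in for that idea, the sketch below screens generated text against a seeded list of unacceptable terms before it reaches the user. Production systems rely on trained classifiers rather than keyword lists, and the function name and placeholder terms here are hypothetical.

```python
# A toy output filter: real deployments use trained moderation models, not
# keyword lists. The blocked terms below are placeholders, not real examples.
BLOCKED_TERMS = {"example-slur", "example-insult"}

def is_acceptable(generated_text: str) -> bool:
    """Return True only if the model output contains no seeded term."""
    lowered = generated_text.lower()
    return not any(term in lowered for term in BLOCKED_TERMS)

response = "Here is the pricing information you asked for."
print(response if is_acceptable(response) else "Sorry, I can't help with that.")
```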

Focused language models address specific business use cases

However, this still leaves us with one other problem. Once the basics are in place, you want GenAI to close business or to create a funnel of opportunities for human follow-up.

Let’s return to the “How can I help you today?” stage. If the human response is “I see that you offer <such and such a product>. How much would it cost for 20 people?”, then the GenAI engine can simply hand off to a rules-based engine to create the data for a suitable response – no problem.
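That hand-off can be as simple as the sketch below, where a deterministic lookup produces the figures and the GenAI engine merely relays them. The product names, per-seat prices, and function name are hypothetical, used only to illustrate the rules-based step.

```python
# A hypothetical rules-based pricing step: deterministic lookup, no LLM needed.
PRICE_PER_SEAT = {"shared workflow system": 12.50, "shared document store": 8.00}

def quote(product: str, seats: int) -> str:
    """Build a quote the GenAI front end can relay back to the prospect."""
    if product not in PRICE_PER_SEAT:
        return "Let me connect you with a sales specialist."
    total = PRICE_PER_SEAT[product] * seats
    return f"{product} for {seats} people costs ${total:,.2f} per month."

print(quote("shared document store", 20))
```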

How about if the response is something like “I have a group of people working on the creation of marketing content. I am not sure whether I should use a shared workflow system or a shared document store. What do you recommend?” It is doubtful that a GenAI engine using a public LLM will be able to provide a suitable response – and any response that it does provide could be dangerous to your business. Since the LLM doesn’t know your business, it might suggest a competitor as a suitable solution. It also could produce complete nonsense that then scares the prospect off.

This is where a more focused language model comes into play – one that is specific to your own business. Transformers apply here too: the model first goes through a self-supervised learning session on the large, general dataset, and humans then supervise fine-tuning on a small, task-specific dataset to produce a focused language model (FLM).

This final fine-tuning session ensures that the GenAI output meets the needs of the end user.
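At a very high level, that supervised fine-tuning step can look something like the sketch below. It assumes the Hugging Face transformers and datasets libraries, a small base model (distilgpt2) standing in for whatever pretrained LLM is actually used, and a hypothetical business_corpus.csv file of customer-specific text in a column named "text" – none of these specifics comes from the article.

```python
# A minimal fine-tuning sketch: take a base model that has already been
# pretrained (self-supervised), then adapt it to a small, task-specific
# business dataset. All file and model names are illustrative placeholders.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base_model = "distilgpt2"  # stands in for the pretrained LLM
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base_model)

# "business_corpus.csv" is a placeholder for the customer-specific dataset.
dataset = load_dataset("csv", data_files="business_corpus.csv")["train"]
tokenized = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=256),
    batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="flm-checkpoint",
                           num_train_epochs=3,
                           per_device_train_batch_size=4),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()  # the fine-tuned weights become the business-focused model
```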

The impact of GenAI on MSPs

Simply installing a publicly available GenAI engine will provide some basic capabilities that will meet some customers’ needs. However, it is more likely to lead to problems, as customers who do not understand the pitfalls of GenAI usage suddenly find major issues in what they are offering to their own customers.

Instead, MSPs can profit by helping customers develop an FLM that suits their needs, then integrating that specific model into the more generalized GenAI platform. Remember, such an FLM must deal not only with the complexities of the customer’s own products and services, but must also ensure that output does not stray from what they expect: no recommendations to go outside of what the customer offers; no recommendations that simply will not work.

Getting GenAI right is not easy. However, MSPs can leverage it to aid their customers while bringing in extra revenue streams.

Photo: 3rdtimeluckystudio / Shutterstock



Posted by Clive Longbottom

Clive Longbottom is a UK-based independent commentator on the impact of technology on organizations and was a co-founder and service director at Quocirca. He has also been an ICT industry analyst for more than 20 years.
