What Executives Need to Know About Knowledge Management, Large Language Models and Generative AI

This article originally appeared in the Journal of Applied Marketing Analytics.


Abstract

This paper discusses the opportunities and risks presented by large language models (LLMs), which power the popular and widely adopted ChatGPT class of applications. The potential benefits include support for enhancing the customer journey and efficient management of an ever-increasing volume of information for employees. Risks include hallucinations (made-up answers from generative AI that are not factually correct), exposure of corporate intellectual property (IP) to training models, lack of traceability and audit trails, and misalignment with brand guidelines. The approach to handling risk described in this paper is retrieval-augmented generation (RAG), which references corporate knowledge and data sources in order to identify precise answers and retrieve exactly what users want. The paper also outlines the need for a knowledge architecture that enables enriched embeddings in vector databases, retaining the context of intelligently componentised content. Using RAG requires knowledge hygiene and metadata models, and the paper discusses an experiment in which results were measured with and without the knowledge architecture. The improvement was significant: 53 per cent of questions were answered correctly without the model versus 83 per cent with the model. The use of RAG virtually eliminated hallucinations, secured corporate IP and provided traceability and an audit trail.
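To make the RAG pattern described above concrete, the sketch below shows, in schematic Python, how componentised content enriched with metadata can be embedded into a simple in-memory vector index and then retrieved to ground a generated answer with citations. It is a minimal illustration under stated assumptions, not a production implementation: the toy `embed` function, the `KnowledgeChunk` structure, the `VectorIndex` class and the prompt format are hypothetical stand-ins for a real embedding model, vector database and LLM call.

```python
import math
from dataclasses import dataclass, field

# --- Toy embedding: a production system would call a real embedding model. ---
VOCAB_DIM = 256

def embed(text: str) -> list[float]:
    """Hash words into a fixed-size bag-of-words vector (placeholder for a real embedder)."""
    vec = [0.0] * VOCAB_DIM
    for word in text.lower().split():
        vec[hash(word) % VOCAB_DIM] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

@dataclass
class KnowledgeChunk:
    """A componentised piece of content plus the metadata that preserves its context."""
    text: str
    metadata: dict          # e.g. product, audience, content type, source document
    vector: list[float] = field(default_factory=list)

class VectorIndex:
    """Minimal in-memory stand-in for a vector database."""
    def __init__(self) -> None:
        self.chunks: list[KnowledgeChunk] = []

    def add(self, chunk: KnowledgeChunk) -> None:
        chunk.vector = embed(chunk.text)
        self.chunks.append(chunk)

    def search(self, query: str, top_k: int = 3) -> list[KnowledgeChunk]:
        # Rank chunks by cosine similarity to the query (vectors are already normalised).
        q = embed(query)
        scored = sorted(
            self.chunks,
            key=lambda c: sum(a * b for a, b in zip(q, c.vector)),
            reverse=True,
        )
        return scored[:top_k]

def build_grounded_prompt(question: str, retrieved: list[KnowledgeChunk]) -> str:
    """Assemble the prompt sent to the LLM: retrieved sources plus citations, not open-ended generation."""
    context = "\n".join(
        f"[{c.metadata.get('source', 'unknown')}] {c.text}" for c in retrieved
    )
    return (
        "Answer using ONLY the sources below and cite them.\n"
        f"Sources:\n{context}\n\nQuestion: {question}"
    )

# Usage: index curated, metadata-tagged content, retrieve, and ground the answer.
index = VectorIndex()
index.add(KnowledgeChunk("Model X100 supports remote diagnostics.",
                         {"source": "X100-service-guide", "type": "support"}))
index.add(KnowledgeChunk("The X100 warranty covers parts for two years.",
                         {"source": "warranty-policy", "type": "policy"}))

hits = index.search("How long is the X100 warranty?")
print(build_grounded_prompt("How long is the X100 warranty?", hits))
```

Because the answer is constrained to retrieved, curated sources, each response carries its citations, which is what provides the traceability and audit trail referred to above.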

KEYWORDS: RAG, retrieval augmented generation, generative AI, ChatGPT, LLMs, large language models, KM, knowledge management, LLM challenges, LLM solutions, knowledge models, metadata models, knowledge architecture


INTRODUCTION

The customer journey is a knowledge journey. Every time a customer interacts with an organisation, they seek answers to their questions: What products do you have? Which ones are best for me? How do I decide on a purchase? How/where can I purchase your product? How do I maintain it? How can it be serviced and supported? Customers need answers, and if they cannot find the answer on the company’s website, they either call customer service or turn to a competitor.

Increasingly, organisations are offering better-performing chatbots to deflect calls from the call centre, and many are now looking at generative artificial intelligence (AI) to power those bots, but both of those options have risks. Generative AI can support conversational search, product selection, support questions and more, and can therefore impact every stage of the journey. Generative AI can also support a more personalised experience. Figure 1 illustrates example questions that customers have along their journey.

Applying generative AI is not as simple as it may seem, or as easy to implement as many vendors claim. The stampede of vendors entering the large language model (LLM)/ChatGPT market is likely to produce a deluge of failed projects, wasted efforts and damaged careers, because neither the vendors nor their customers understand the risks or how to counteract them to achieve success.

Executives and line-of-business managers are trying to determine how LLMs and generative AI will fit into their current and future capabilities, and how their transformation roadmaps will be affected. The risks of diving into this technology are significant, and include:

• unrealistic expectations of LLMs as a magic solution to managing corporate content without human involvement;

• generating responses not aligned with company policies or brand;

• the knowledge gap that occurs when LLMs are not trained on current knowledge or on the organisation's own knowledge, and therefore cannot produce answers or, worse, make up answers;

• difficulty distinguishing between creative outputs and fabricated responses (hallucinations);

• absence of clear audit trails and citation sources;

• decisions around training models: balancing usefulness with the threat of exposing trade secrets or other proprietary knowledge;

Figure 1: The customer journey is a knowledge journey. Source: Earley Information Science, Inc., 2023

Meet the Author
Seth Earley

Seth Earley is the Founder & CEO of Earley Information Science and the author of the award-winning book The AI-Powered Enterprise: Harness the Power of Ontologies to Make Your Business Smarter, Faster, and More Profitable. He is an expert with more than 20 years' experience in Knowledge Strategy, Data and Information Architecture, Search-based Applications and Information Findability solutions. He has worked with a diverse roster of Fortune 1000 companies, helping them achieve higher levels of operating performance.