Multiple LLM Engines
Natural Language Processing can be based on Pyramid's internal Portable Language Model (PLM), or customers can leverage more powerful third-party Large Language Models (LLMs). When you use the Chatbot to ask Pyramid questions and automatically generate visuals, presentations, and publications, the underlying behavior depends on which language model interprets and completes your requests.
The ability to pick one or more language models is the basis of Pyramid's multi-pronged language model strategy.
Note: You can use different LLM engines for different models. A separate option selects the language model that drives the speech-audio interpreter.
LLM Engine Selection
The engines available for sending NLQs to the Chatbot always include the Pyramid Internal Portable Language Model (PLM) and may also include one or more third-party LLM engines that your administrator has added to the LLM Manager.
Your administrator selects which LLM engines should be used by the Chatbot in the following locations in the Admin Console:
- In AI Settings, the administrator:
  - Enables NLQ and selects the Default Chatbot Engine to use. The list of available engines includes Pyramid Internal and any third-party LLM engines that have been added from the LLM Manager.
  - Enables the Chatbot's speech-to-text functionality and selects the Provider used to transform spoken queries into text.
- Optionally, in Data Model Management, under NLQ, the administrator:
  - Configures the model's NLQ settings to use a specific engine for that model, overriding the default where required (see the sketch after this list).
  - Configures synonyms for the same model.
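Conceptually, the engine used for a given NLQ resolves as a per-model override falling back to the system-wide default. The following minimal sketch illustrates that resolution order only; the class and field names are illustrative assumptions, not Pyramid's API:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AiSettings:
    """System-wide settings (AI Settings in the Admin Console)."""
    nlq_enabled: bool
    default_chatbot_engine: str                    # "Pyramid Internal" or a third-party engine from the LLM Manager
    speech_to_text_provider: Optional[str] = None  # separate setting for the speech-audio interpreter

@dataclass
class ModelNlqConfig:
    """Per-model NLQ settings (Data Model Management > NLQ)."""
    engine_override: Optional[str] = None          # specific engine configured for this model, if any
    synonyms: dict = field(default_factory=dict)   # synonyms defined for the same model

def resolve_chatbot_engine(settings: AiSettings, model: ModelNlqConfig) -> str:
    """Return the engine that interprets NLQs for this model."""
    if not settings.nlq_enabled:
        raise RuntimeError("NLQ is disabled in AI Settings")
    # A per-model override takes precedence over the system default.
    return model.engine_override or settings.default_chatbot_engine
```

For example, a model configured to use a third-party engine will use that engine even when the Default Chatbot Engine is set to Pyramid Internal.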
The availability of multiple LLMs means that different capabilities can be offered for different models. Note that the engines also behave slightly differently from one another in response to the same prompts; that variability is inherent to this kind of generative intelligence.
Pyramid's Portable Language Model (PLM)
The Pyramid Internal engine is a "portable" language model (PLM). This built-in engine is very fast and does not need to connect to external services to run; that is, it can run "off the grid". It also operates without any risk of sharing data or insights with external LLM providers.
The PLM engine enables the Chatbot at Design time in Discover and at Runtime in Present.
However, Pyramid's PLM supports English-language "chats" only, and its syntactical interpretation capabilities are limited compared with its LLM cousins. As such, stronger guidance is provided to help you access its capabilities.
Third-Party LLM Engines
Third-party LLM engines can be included if they are set up and configured in the LLM Manager.
The third-party LLM engines all enable the Chatbot at Design time in Discover, Present, and Publish, and at Runtime in Discover and Present. They include capabilities that are not available with the Pyramid Internal engine, such as multi-language support.
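The differences described above can be summarized as a simple capability lookup. This is an illustrative summary only; the flag names and structure are assumptions rather than a Pyramid data structure, and exact behavior depends on the specific third-party engine configured in the LLM Manager:

```python
# Illustrative summary of the engine types described above; not a Pyramid API.
ENGINE_CAPABILITIES = {
    "Pyramid Internal (PLM)": {
        "runs_off_grid": True,            # no connection to external services required
        "external_prompt_processing": False,  # no data or insight sharing with LLM providers
        "languages": ["English"],
        "design_time": ["Discover"],
        "runtime": ["Present"],
    },
    "Third-party LLM": {
        "runs_off_grid": False,           # queries are processed by the external provider's service
        "external_prompt_processing": True,
        "languages": ["multi-language support"],
        "design_time": ["Discover", "Present", "Publish"],
        "runtime": ["Discover", "Present"],
    },
}
```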