Generative AI
The Generative AI module adds a general OpenAI-based fallback.
Dependencies
The Generative AI module depends on the following modules:
Base Intents
VectorDB
Configuration
The Generative AI module reads the values listed below from settings. To add or edit these configuration values, follow this path: Experience Manager > Create tab > Content component > Settings item > generativeAi section. A sample settings object follows the list.
index (string, required): The name of the VectorDB index to use.
prompt (string, optional): The prompt to use. If not specified, a default of “Answer the question based on the context below.” will be used.
model (string, optional): Either “gpt-3.5-turbo” or “gpt-4”. If not specified, the default is “gpt-3.5-turbo”.
scoring (string, optional): Either “embedding” or “token”. If not specified, the default is “embedding”.
enableSourceLinks (boolean, optional): Whether to display a link to the source content from which the answer was derived.
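For illustration, a generativeAi settings section might look like the sketch below. The field names come from the list above; the index name and the chosen values are placeholders, not recommendations.

// Hypothetical example of a generativeAi settings section.
const generativeAiSettings = {
  index: "support-articles",                                  // required: name of the VectorDB index
  prompt: "Answer the question based on the context below.",  // optional: this is also the default prompt
  model: "gpt-3.5-turbo",                                     // optional: "gpt-3.5-turbo" or "gpt-4"
  scoring: "embedding",                                       // optional: "embedding" or "token"
  enableSourceLinks: true                                     // optional: link answers to their source content
};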
Fallback Implementation
A fallback handler is registered with the Base Intents module. When triggered, it feeds the query text into the VectorDB node using the configured index name, prompt, and model, and returns the answer from the VectorDB node as the response to the user.
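A minimal sketch of that flow is shown below, assuming a baseIntents.registerFallback call, a getSettings helper, and a queryVectorDb wrapper around the VectorDB node; these names are illustrative, not the module's actual API.

// Sketch only: registerFallback, getSettings and queryVectorDb are assumed names.
baseIntents.registerFallback(async (msg) => {
  const settings = getSettings("generativeAi");  // hypothetical settings lookup
  const answer = await queryVectorDb({
    index: settings.index,
    prompt: settings.prompt || "Answer the question based on the context below.",
    model: settings.model || "gpt-3.5-turbo",
    question: msg.payload                        // query text from the Node-RED message
  });
  msg.payload = answer;                          // returned to the user as the response
  return msg;
});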
Handlers
Handlers for the following events can be registered by calling generativeAi.registerHandler and passing it the handler key; an example registration follows the list.
followup
This handler is executed after an answer has been returned from the VectorDB.
The handler is passed the following parameters:
msg (object): The Node-RED message object.
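A followup handler registration might look like the sketch below. Only generativeAi.registerHandler, the followup key, and the msg parameter come from this document; the handler body is a placeholder.

// Sketch only: the handler body is illustrative.
generativeAi.registerHandler("followup", (msg) => {
  // msg is the Node-RED message object carrying the answer returned from the VectorDB.
  // For example, append a follow-up question to the answer text.
  msg.payload = msg.payload + "\n\nWas this answer helpful?";
  return msg;
});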