LLM Slots
An LLM slot uses a large language model (LLM) to scan user input in a specified State or Start Conversation block and extract information based on a custom description. This functionality provides greater flexibility than predefined or machine learning-based slots, making it ideal for extracting complex or dynamic data.
Using LLM slots incurs additional costs. For pricing details or contract modifications, contact your Customer Success Manager.
LLM capabilities must be enabled for your tenant. Contact your Customer Success Manager for activation.
Use pre-built slots for structured data such as dates, email addresses, or monetary amounts. These slots are optimized for standard data types.
Use LLM slots to extract unstructured or dynamic information that goes beyond the capabilities of pre-built slots.
Pre-built slot use case: "What is your email address?" → Use the pre-built email slot.
LLM slot use case: "What items do you want to buy?" → Use an LLM slot with a description such as "Extract a list of items mentioned in the user's input" (see the conceptual sketch below).
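The platform performs the LLM call for you, so no code is required. Purely as a conceptual illustration of what the custom description does, the following Python sketch shows how a slot description can be turned into an extraction instruction for a generic LLM. The OpenAI client, model name, and the extract_llm_slot function are illustrative assumptions, not part of the product's configuration or API.

```python
# Conceptual sketch only: shows how an LLM slot's custom description can
# drive extraction. Client, model name, and function are assumptions for
# illustration; they are not the product's actual implementation.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def extract_llm_slot(user_input: str, slot_description: str) -> list[str]:
    """Ask an LLM to extract slot values according to a custom description."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": (
                    "You extract slot values from user input. "
                    f"Instruction: {slot_description} "
                    "Return a JSON array of strings and nothing else."
                ),
            },
            {"role": "user", "content": user_input},
        ],
    )
    return json.loads(response.choices[0].message.content)


# Example using the LLM slot description from the use case above
items = extract_llm_slot(
    "I'd like two lattes and a blueberry muffin, please.",
    "Extract a list of items mentioned in the user's input",
)
print(items)  # e.g. ["latte", "latte", "blueberry muffin"]
```

The key idea is that the custom description acts as the extraction instruction: the more precisely it states what to pull from the user's input, the more reliable the extracted values.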
LLM slots are supported in Phone 2 and Chat releases.
Open a State or Start Conversation block.
Click the Intents tab.
Use the global toggle (highlighted below) to enable or disable LLM slot extraction for the block.
When the global toggle is disabled, none of the LLM slots in the block will function.
When LLM slot extraction is enabled in a block, the block no longer extracts other slot types. Ensure this configuration aligns with your requirements.
Click the button.
When the global toggle is enabled, you can control individual slots using their corresponding toggles.