LLM Slots
An LLM slot uses a large language model (LLM) to scan user input in a specified State or Start Conversation block and extract information based on a custom description. This functionality provides greater flexibility than predefined or machine learning-based slots, making it ideal for extracting complex or dynamic data.
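Conceptually, the platform combines the slot's description with the user's input and asks the model to return the matching value. The Python sketch below only illustrates that idea under assumed behavior; call_llm, the prompt wording, and the JSON return format are placeholders, not the product's actual API.

```python
import json

def call_llm(prompt: str) -> str:
    """Placeholder for whatever LLM backend performs the extraction.
    It returns a canned response here so the sketch runs as-is."""
    return '["apples", "bread", "milk"]'

def extract_llm_slot(slot_description: str, user_input: str):
    """Build a prompt from the slot description and parse the model's answer."""
    prompt = (
        f"Instruction: {slot_description}\n"
        f"User input: {user_input}\n"
        "Respond with the extracted value as JSON."
    )
    return json.loads(call_llm(prompt))

items = extract_llm_slot(
    "Extract a list of items mentioned in the user's input",
    "I'd like to buy apples, a loaf of bread, and some milk.",
)
print(items)  # ['apples', 'bread', 'milk']
```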

LLM capabilities must be enabled for your tenant. Contact your Customer Success Manager for activation.
When to Use LLM Slots
Use pre-built slots for structured data such as dates, email addresses, or monetary amounts. These slots are optimized for standard data types.
Use LLM slots to extract unstructured or dynamic information that goes beyond the capabilities of pre-built slots.
Pre-built slot use case: "What is your email address?" → Use the pre-built email slot.
LLM slot use case: "What items do you want to buy?" → Use an LLM slot with a description such as "Extract a list of items mentioned in the user's input".
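To make the contrast above concrete, the sketch below juxtaposes the two approaches: a pre-built slot behaves like a fixed recognizer for a known data type, while an LLM slot is driven purely by its natural-language description. The regex, slot names, and example values are illustrative assumptions, not the platform's internals.

```python
import re

# Pre-built slot: a well-defined data type with a fixed recognizer.
EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def extract_email(user_input: str):
    match = EMAIL_PATTERN.search(user_input)
    return match.group(0) if match else None

# LLM slot: no fixed pattern, only a description of what to pull out.
SHOPPING_SLOT_DESCRIPTION = "Extract a list of items mentioned in the user's input"

print(extract_email("You can reach me at jane.doe@example.com"))
# jane.doe@example.com

# An LLM slot configured with SHOPPING_SLOT_DESCRIPTION would return
# something like ['croissants', 'oat-milk latte'] for:
# "I'll take two croissants and an oat-milk latte, please."
```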
How to Enable and Use LLM Slots
Create an LLM slot
In Speech Assets, click the Slots tab.
Click the button to add a new slot.
Select LLM Slot as the slot type.
Enter a name for the slot.
Enter a description to specify the data to extract from the user's input. For example: "Extract a list of items mentioned in the user’s input."
The slot saves automatically.
Configure LLM slot extraction
Open a State or Start Conversation block.
Click the Intents tab.
Use the global toggle to enable or disable LLM slot extraction for the block.
When the global toggle is disabled, none of the LLM slots in the block will function.
When the global toggle is enabled, you can control individual slots using their corresponding toggles.
If an LLM slot is enabled in the block, the block no longer extracts other slot types. Make sure this behavior matches your requirements.
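One way to reason about how these settings interact: an LLM slot is only extracted when both the global toggle and its own toggle are on, and while any LLM slot is active the block skips other slot types. The snippet below is a conceptual model under that reading, not the product's actual logic; all names and values are illustrative.

```python
def slots_extracted(global_llm_toggle: bool,
                    llm_slot_toggles: dict[str, bool],
                    other_slots: list[str]) -> list[str]:
    """Conceptual model of which slots a block extracts."""
    # Individual LLM slot toggles only take effect while the global toggle is on.
    active_llm_slots = [name for name, on in llm_slot_toggles.items()
                        if global_llm_toggle and on]
    if active_llm_slots:
        # With LLM slot extraction active, other slot types are not extracted.
        return active_llm_slots
    return other_slots

print(slots_extracted(True, {"shopping_items": True, "budget": False}, ["email"]))
# ['shopping_items']
print(slots_extracted(False, {"shopping_items": True}, ["email"]))
# ['email']
```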