LLM Slots


An LLM slot uses a large language model (LLM) to scan user input in a specified State or Start Conversation block and extract information based on a custom description. This provides greater flexibility than prebuilt or machine learning slots, making it ideal for extracting complex or dynamic data.

Using LLM slots incurs additional costs. For pricing details or contract modifications, contact your Customer Success Manager.

LLM capabilities must be enabled for your tenant. Contact your Customer Success Manager for activation.

When to Use LLM Slots

Use prebuilt slots for structured data such as dates, email addresses, or monetary amounts. These slots are optimized for standard data types.

Use LLM slots to extract unstructured or dynamic information that prebuilt slots cannot capture.

  • Prebuilt slot use case: "What is your email address?" → Use the prebuilt email slot.

  • LLM slot use case: "What items do you want to buy?" → Use an LLM slot with a description such as "Extract a list of items mentioned in the user's input".

LLM slots are supported in Phone 2 and Chat releases.
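
Conceptually, the slot description acts as the extraction instruction that is passed to the model together with the user's input. Parloa performs this call internally; the sketch below is only an illustration of the idea, written against an OpenAI-style chat completions endpoint. The endpoint, model name, OPENAI_API_KEY variable, and the extractLlmSlot helper are assumptions for this example, not part of Parloa's API.

// Illustration only: Parloa performs the LLM call for you. This sketch assumes an
// OpenAI-style chat completions endpoint and an API key in OPENAI_API_KEY; none of
// the names below are part of Parloa's API.

const SLOT_DESCRIPTION = "Extract a list of items mentioned in the user's input.";

async function extractLlmSlot(userInput) {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [
        {
          role: "system",
          content: SLOT_DESCRIPTION +
            " Respond with a JSON array of strings only. If nothing matches, respond with [].",
        },
        { role: "user", content: userInput },
      ],
    }),
  });

  const data = await response.json();
  // The model was instructed to return a bare JSON array, e.g. ["a phone case", "two chargers"].
  return JSON.parse(data.choices[0].message.content);
}

// extractLlmSlot("I'd like a phone case and two chargers")
//   → ["a phone case", "two chargers"]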

How to Enable and Use LLM Slots

Step 1: Create an LLM slot

  1. In Speech Assets, click the Slots tab.

  2. Select LLM Slot as the slot type.

  3. Enter a name for the slot.

  4. Enter a description to specify the data to extract from the user's input. For example: "Extract a list of items mentioned in the user’s input."

The slot saves automatically.

Step 2: Configure LLM slot extraction

  1. Open a State or Start Conversation block.

  2. Click the ⚙️ button.

  3. Click the Intents tab.

  4. Use the global toggle to enable or disable LLM slot extraction for the block.

    • When the global toggle is disabled, none of the LLM slots in the block will function.

    • When the global toggle is enabled, you can control individual slots using their corresponding toggles.

If an LLM slot is enabled in a block, that block no longer extracts other slot types. Ensure this configuration aligns with your requirements.
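
After extraction, the slot's value can be read in later steps like any other slot variable (see Variables > Slots). The sketch below assumes the value arrives as an array under a slot you named shoppingItems; both the name and the value shape are hypothetical.

// Illustration only: the slot name "shoppingItems" and the shape of its value are
// assumptions for this example; use the variable syntax described under Variables > Slots.

function summarizeItems(slots) {
  const items = slots.shoppingItems; // value extracted by the LLM slot
  if (Array.isArray(items) && items.length > 0) {
    return "You want to buy: " + items.join(", ") + ".";
  }
  return null; // slot not filled; re-prompt the user instead
}

summarizeItems({ shoppingItems: ["a phone case", "two chargers"] });
// → "You want to buy: a phone case, two chargers."
summarizeItems({});
// → null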