Conversation Logic


Last updated 3 months ago


All blocks within the Conversation Logic category share a common component – Responses. The following documentation details the Responses tab, applicable to all blocks in the Conversation Logic category.

Responses

Responses enable you to define what the chatbot communicates to the caller.

Components of the Responses Tab

The currently selected response. It contains the messages that will be communicated to the caller.

The currently selected response element – in this case SSML, which makes the chatbot sound more human. The available response elements include:

  • SSML (Speech Synthesis Markup Language): Enhances the chatbot's vocal responses for a more natural sound.

  • Text: Allows the chatbot to send text responses.

  • Cross-Platform:

    • SSML & Text: Enables the chatbot to respond with both voice and text.

    • JSON: Transforms the cross-platform response into a JSON element. This structured data format remains readable, but the conversion cannot be reversed.
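As a sketch of the first response element, standard SSML wraps spoken text in markup tags; the tags shown here (break, say-as, prosody) come from the SSML standard and are illustrative – the SSML section of this documentation covers the elements supported by each TTS provider:

```xml
<speak>
  Welcome! <break time="500ms"/>
  Your customer number is
  <say-as interpret-as="characters">AB12</say-as>.
  <prosody rate="slow">Please keep it ready.</prosody>
</speak>
```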

The + Add prompt button enables you to insert additional SSML prompts. The chatbot will select one at random if multiple prompts are available.

The Generate More Prompts button enables you to create multiple AI-generated prompt suggestions.

Define what the chatbot should say to the caller if there is no response within a designated timeout period (typically 5 seconds).

This is a predefined response that the chatbot uses when it cannot understand the caller’s input.

The + button enables you to add more responses.

Intents

Components of the Intents Tab

Start Conversation block – When a conversation begins, typically through an incoming call, the LaunchRequest intent is triggered to deliver your welcome message and set the stage for the interaction.

State block – The Intents tab in the State block is used to create and manage intents that determine how a conversation progresses. Each intent can lead to different paths or outcomes in a conversation⁠.

Choose your preferred method for recognizing intents:

  • Utterances: Leverage predefined phrases to identify user intentions using Natural Language Understanding (NLU).

  • Descriptions: Employ a Large Language Model (LLM) for enhanced intent detection that interprets descriptive text along with NLU.
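For example, a pizza-ordering intent configured with the utterance-based approach might include phrases such as the following (the intent name and layout are illustrative, not Parloa's actual configuration syntax):

```
Intent: OrderPizza
Utterances:
  - "Order a pizza"
  - "I want to order a pizza"
  - "Can I get a pizza delivered?"
  - "I'd like a large pepperoni pizza"
```

The NLU matches the caller's phrasing against these examples, so variations of the same request can still trigger the intent.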

Speech to Text (STT): This field allows you to select a Speech-to-Text model tailored for specific input types, such as alphanumeric input.

Specialized models are currently available only for German (de-DE) and French (fr-FR); for all other languages, the system uses the default Generic STT model. The following table shows which specialized models are available per language:

| Specialized model | German (de-DE) | French (fr-FR) |
| --- | --- | --- |
| Alphanumeric | ✅ | ✅ |
| Date | ✅ | ❌ |
| License Plate | ✅ | ✅ |
| Number | ✅ | ✅ |
| Ordinal | ✅ | ❌ |
| Spelling | ✅ | ✅ |
| Given name (hints) | ✅ | ❌ |

The specialized model Given Name (Hints) should only be used in conjunction with Speech-to-Text Hints functionality for selected conversational turns. It is not recommended to use this model throughout the entire conversation.

The Local no-match logic defines how your chatbot handles unrecognized intents, either by referring to a local else intent or by deferring to a global fallback.

The else intent serves as the fallback; it is activated when no intent from the LaunchRequest intent's list is recognized.

The prompt plays the text defined in the Authenticated text snippet, followed by the static text.

The Start Conversation and State blocks within the Conversation Logic category share a common component – Intents. The following documentation details the Intents tab.

Intents are user expressions or phrases that prompt specific responses from your chatbot. They can be as straightforward as "Order a pizza" to initiate a pizza-ordering dialog. Intents bridge the gap between user requests and your chatbot's ability to understand and respond appropriately.
