2026.6.3
Cognigy.AI
Released April 7, 2026
Bug Fixes
- Fixed the connection between Cognigy.AI and Voice Gateway, ensuring Mute Speech Input prevents speech barge-in. This update complements the fix from Voice Gateway patch 2026.6.2. To resolve this issue, make sure you update Cognigy.AI to this version and Voice Gateway to 2026.6.2.
2026.6.2
Cognigy.AI
Released March 25, 2026
Improvements
- Added a redaction limit. Log entries with `data` payloads over 2.5 MB are no longer redacted and now show a placeholder message
- Improved internal log redaction for PCI compliance
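The size limit can be pictured as a simple check that runs before redaction: oversized payloads are swapped for a placeholder instead of being scanned. A minimal sketch, assuming hypothetical function and field names (only the 2.5 MB threshold comes from the note above):

```python
# Illustrative sketch of the redaction size limit described above.
# The 2.5 MB threshold comes from the release note; the function name,
# sensitive keys, and placeholder wording are hypothetical.
import json

REDACTION_SIZE_LIMIT = int(2.5 * 1024 * 1024)  # 2.5 MB in bytes


def redact_log_entry(data: dict) -> "dict | str":
    """Redact a log entry's data payload, or replace it with a
    placeholder when it exceeds the size limit."""
    payload = json.dumps(data).encode("utf-8")
    if len(payload) > REDACTION_SIZE_LIMIT:
        # Oversized payloads skip redaction and are replaced entirely.
        return "[payload exceeds 2.5 MB - content not logged]"
    # Hypothetical redaction: mask values of sensitive keys.
    return {k: ("***" if k in {"cardNumber", "cvv"} else v)
            for k, v in data.items()}
```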
Cognigy Voice Gateway
Released April 7, 2026
Bug Fixes
- Fixed the issue where activating the Mute Speech Input setting didn’t prevent speech barge-in
2026.6.1
Cognigy.AI
Released March 18, 2026
Bug Fixes
- Fixed the service configuration to ensure correct deployment and operation
Cognigy Voice Gateway
Released March 31, 2026
Bug Fixes
- Fixed the issue on the Login page in the Voice Gateway Self-Service Portal where the page displayed v2026.5.x as the current version even after the v2026.6.0 release. Now, the page displays the correct version
2026.6.0
Cognigy.AI
Released March 17, 2026
Features
New Multimodal Agent Widget: Click To Call
Click To Call lets you easily embed a widget in your website or build a custom application that connects to a voice AI Agent for multimodal communication. With Click To Call, users can talk to your AI Agents directly from your website or application with a single click.
New Models from OpenAI and Microsoft Azure OpenAI
Introduced support for the `gpt-5-1` model from OpenAI and Microsoft Azure OpenAI.
New Models from Anthropic
Introduced support for the `claude-sonnet-4-6` model from Anthropic.
Improvements
- Added a Language Configuration toggle to the AI Agent wizard to stop automatic language detection and prevent unintended switching
- Added OAuth2 authentication support to the Cognigy.AI API
- Deactivated Knowledge Connector actions when the required Extension is missing
- Added an error message when LLMs reach the token limit
- Added a safety context preamble to Simulator prompts, reducing content policy errors when using Azure OpenAI with personas or missions that contain terms flagged by Azure’s content management filters
- Removed the beta tag from the Data Redaction settings section
- Renamed the Sentences API tag to Example Sentences for clarity
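The OAuth2 support mentioned above typically follows the standard client-credentials flow (RFC 6749): exchange a client ID and secret for an access token, then send that token as a `Bearer` header on API calls. A hedged sketch of the token request, where the endpoint path and field names are assumptions for illustration rather than the documented Cognigy.AI API:

```python
# Hedged sketch of an OAuth2 client-credentials token request, as
# commonly used with APIs that support OAuth2. The /oauth2/token path
# is a hypothetical placeholder, not a documented Cognigy.AI endpoint.
from urllib.parse import urlencode
from urllib.request import Request


def build_token_request(base_url: str, client_id: str,
                        client_secret: str) -> Request:
    """Build a standard client-credentials token request (RFC 6749)."""
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode("ascii")
    return Request(
        f"{base_url}/oauth2/token",  # hypothetical endpoint path
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )
```

The access token returned by such a request would then be attached to subsequent calls as `Authorization: Bearer <token>`.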
Bug Fixes
- Fixed the issue where the Management UI flickered when users were navigating between pages and scrolling content
- Fixed the issue where the error message caused by LLMs reaching the token limit didn’t provide the correct information
- Fixed the issue where the Save button in the schedule configuration in the Simulator remained inactive without displaying any validation error message
- Fixed the issue where the `sourceName` value for scheduled sources in the Input object included the `_scheduled` postfix
- Fixed the issue where uploading Snapshots to Projects that didn’t include a global LLM caused cache issues
- Fixed the issue where setting the maximum tokens above 4000 in the LLM Prompt Node threw an error for Microsoft Azure OpenAI’s `gpt-4o` and `gpt-4o-mini` models, and similar limits applied to Anthropic’s `claude-opus-4-0` and `claude-sonnet-4-0` and AWS Bedrock’s `claude-3-5-sonnet` models
- Fixed the issue where setting the Transcript Turns parameter to 0 in the LLM Prompt Node transcript didn’t have any effect
- Fixed the issue where Interaction Panel calls dropped if the Flow contained a Set Session Config Node that set Microsoft (provided by NiCE) as the TTS or STT vendor
- Fixed the issue where the wrong schedule date was displayed for Knowledge syncs when scheduling was previously deactivated
- Fixed the issue where the legacy `privacy_policy` option was displayed in the Profile Schema field in the Update Profile Node
- Fixed the issue preventing the import of Packages with Knowledge Chunks created using custom models
- Fixed the issue where the Last Edited column on the Connector Configuration page displayed edits that didn’t take place
- Fixed the issue where the Knowledge Connectors didn’t work in trial environments
- Fixed the issue where the error responses for the `POST /persona/from-script/` API method used `message` instead of `detail` and didn’t provide the correct data
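Since error responses from this method now report their description in a `detail` field rather than `message`, client code that parsed the old field needs updating. A minimal, illustrative helper (the function itself is not part of any Cognigy SDK) that reads both fields so older and newer deployments keep working:

```python
# Sketch of client-side error handling for the fix above: prefer the
# corrected `detail` field, fall back to the legacy `message` field.
# The helper is illustrative, not part of any Cognigy SDK.
import json


def extract_error_detail(response_body: str) -> str:
    """Return the error description from an API error response."""
    error = json.loads(response_body)
    return error.get("detail") or error.get("message") or "unknown error"
```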
Cognigy Voice Gateway
Released March 17, 2026
Bug Fixes
- Fixed the issue where the Register status wasn’t updated correctly on the Carrier page in the Voice Gateway Self-Service Portal due to a race condition in the outbound authentication process
Cognigy Live Agent
Released March 17, 2026
Bug Fixes
- Fixed the issue where the avatar image wasn’t aligned properly
- Fixed the issue where notifications for conversations escalated to supervisors weren’t translated
Cognigy Insights
Released March 17, 2026
Improvements
- Redesigned the search logic in the Transcript Explorer to improve filtering and sorting against data in MongoDB
- Removed the default `order-by` condition in the OData query
- Added a lower timestamp bound to optimize deletion queries and avoid timeout errors
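The deletion-query optimization amounts to replacing an open-ended "everything older than X" scan with a bounded time range. A sketch of the idea, assuming hypothetical field names and a hypothetical lookback window (neither is taken from Cognigy's schema):

```python
# Illustrative sketch of the deletion-query optimization above: a lower
# timestamp bound turns an open-ended range scan into a bounded one.
# The `timestamp` field name and 90-day window are hypothetical.
from datetime import datetime, timedelta


def build_deletion_filter(cutoff: datetime,
                          max_lookback_days: int = 90) -> dict:
    """Build a MongoDB-style filter that matches records older than
    `cutoff` but never scans further back than `max_lookback_days`."""
    lower_bound = cutoff - timedelta(days=max_lookback_days)
    return {"timestamp": {"$gte": lower_bound, "$lt": cutoff}}
```

Bounding the range this way lets the database walk only a window of the timestamp index instead of the whole collection, which is what avoids the timeouts.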
Bug Fixes
- Fixed the issue where sorting and filtering in the Transcript Explorer didn’t work correctly
- Fixed the issue where the `PATCH /v2.0/analytics` API method failed while updating the PostgreSQL records due to a schema mismatch
- Fixed the issue where the `GET /v2.0/conversations` API method didn’t work correctly and returned missing, incorrect, or `null` rating values
- Fixed the issue where messages sent through the WhatsApp Endpoint weren’t displayed in the Transcript Explorer when the payload didn’t contain top-level text