Deprecations in Cognigy.AI 2026.6.0:
  • The support for the Completions API for custom LLMs is deprecated. The full removal is planned for December 2026. Refer to your LLM provider’s API reference to switch to the appropriate API before the removal date.
  • The support for Google’s gemini-2.0-flash and gemini-2.0-flash-lite models is deprecated. The full removal is planned for June 1, 2026. Switch to the newer models, such as gemini-2.5-flash and gemini-2.5-flash-lite, before the removal date.
  • The POST /v2.0/endpoint/notify API method is deprecated. The full removal is planned for September 2026. Use the POST https://endpoint-trial.cognigy.ai/notify/{URLToken} API method instead.
  • The POST /v2.0/endpoint/inject API method is deprecated. The full removal is planned for September 2026. Use the POST https://endpoint-trial.cognigy.ai/inject/{URLToken} API method instead.
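The new notify and inject methods move the Endpoint's URL token from the request body into the URL path. As a minimal migration sketch, the helper below builds the replacement URL from the trial host shown above; the token value is a hypothetical placeholder, and your actual host and token come from your Endpoint configuration:

```python
# Base host of the new per-Endpoint API (trial host, as quoted above).
NEW_API_HOST = "https://endpoint-trial.cognigy.ai"

def build_endpoint_url(action: str, url_token: str) -> str:
    """Build the URL that replaces the deprecated
    POST /v2.0/endpoint/notify and /v2.0/endpoint/inject methods.
    `url_token` is the Endpoint's URL token."""
    if action not in ("notify", "inject"):
        raise ValueError("action must be 'notify' or 'inject'")
    return f"{NEW_API_HOST}/{action}/{url_token}"

# Example with a placeholder token:
print(build_endpoint_url("notify", "abc123"))
# → https://endpoint-trial.cognigy.ai/notify/abc123
```

POST your existing notify or inject payload to the resulting URL instead of the deprecated /v2.0 route.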

2026.6.0

Cognigy.AI

Released March 17, 2026

Features

New Multimodal Agent Widget: Click To Call
Click To Call lets you easily embed a widget in your website or build a custom application that connects to a voice AI Agent for multimodal communication. With Click To Call, users can talk to your AI Agents directly from your website or application with a single click.
New Models from OpenAI and Microsoft Azure OpenAI
Introduced support for the gpt-5-1 model from OpenAI and Microsoft Azure OpenAI.
New Models from Anthropic
Introduced support for the claude-sonnet-4-6 model from Anthropic.

Improvements

  • Deactivated Knowledge Connector actions when the required Extension is missing
  • Added a Language Configuration toggle to the AI Agent wizard to stop automatic language detection and prevent unintended switching
  • Added an error message when LLMs reach the token limit
  • Added a safety context preamble to Simulator prompts, reducing content policy errors when using Azure OpenAI with personas or missions that contain terms flagged by Azure’s content management filters
  • Removed the beta tag from the Data Redaction settings section
  • Added the Billing API 3.0. This API includes channel-wise conversation counts
  • Marked the inject and notify APIs as deprecated in the API docs

Bug Fixes

  • Fixed the issue where the Management UI flickered when users were navigating between pages and scrolling content
  • Fixed the issue where the error message caused by LLMs reaching the token limit didn’t provide the correct information
  • Fixed the issue where the Save button in the schedule configuration in the Simulator remained inactive without displaying any validation error message
  • Fixed the issue where the sourceName value for scheduled sources in the Input object included the _scheduled postfix
  • Fixed the issue where uploading Snapshots to Projects that didn’t include a global LLM caused cache issues
  • Fixed the issue where setting the maximum tokens above 4000 in the LLM Prompt Node threw an error for Microsoft Azure OpenAI’s gpt-4o and gpt-4o-mini models, and where similar limits applied to Anthropic’s claude-opus-4-0 and claude-sonnet-4-0 models and AWS Bedrock’s claude-3-5-sonnet model
  • Fixed the issue where setting the Transcript Turns parameter to 0 in the LLM Prompt Node transcript didn’t have any effect
  • Fixed the issue where Interaction Panel calls dropped if the Flow contained a Set Session Config Node that set Microsoft, provided by NiCE, as the TTS or STT vendor
  • Fixed the issue where the wrong schedule date was displayed for Knowledge syncs when scheduling was previously deactivated
  • Fixed the issue where the legacy privacy_policy option was displayed in the Profile Schema field in the Update Profile Node
  • Fixed the issue preventing the import of Packages with Knowledge Chunks created using custom models
  • Fixed the issue where the Last Edited columns on the Connector Configuration page displayed edits that didn’t take place
  • Fixed the issue where the Knowledge Connectors didn’t work in trial environments
  • Fixed the issue where the error responses for the POST /persona/from-script/ API method used message instead of detail and didn’t provide the correct data

Cognigy Voice Gateway

Released March 17, 2026

Bug Fixes

  • Fixed the issue where the Register status wasn’t updated correctly on the Carrier page in the Voice Gateway Self-Service Portal due to a race condition in the outbound authentication process

Cognigy Live Agent

Released March 17, 2026

Bug Fixes

  • Fixed the issue where the avatar image wasn’t aligned properly
  • Fixed the issue where notifications for conversations escalated to supervisors weren’t translated

Cognigy Insights

Released March 17, 2026

Improvements

  • Redesigned the search logic in the Transcript Explorer to improve filtering and sorting against data in MongoDB
  • Removed the default order-by condition in the OData query
  • Added a lower timestamp bound to optimize deletion queries and avoid timeout errors

Bug Fixes

  • Fixed the issue where sorting and filtering in the Transcript Explorer didn’t work correctly
  • Fixed the issue where the PATCH /v2.0/analytics API method failed while updating the PostgreSQL records due to a schema mismatch
  • Fixed the issue where the GET /v2.0/conversations API method didn’t work correctly and returned missing, incorrect, or null rating values
  • Fixed the issue where messages sent through the WhatsApp Endpoint weren’t displayed in the Transcript Explorer when the payload didn’t contain top-level text

Infrastructure

Version Compatibility Matrix

The version compatibility matrix was updated for the following Cognigy products: