Cognigy.AI Docs

COGNIGY.AI is the Conversational AI Platform focused on the needs of large enterprises to develop, deploy, and run Conversational AIs on any conversational channel.

Given the rising need for voice interfaces as the most natural way of communicating with brands, Cognigy was founded in 2016 by Sascha Poggemann and Phil Heltewig. Our mission: to enable all devices and applications to intelligently communicate with their users via naturally spoken or written dialogue.

Extensions

Extensions enable anyone to build JavaScript modules and expose them as Flow Nodes within Cognigy. There are no restrictions on node modules (NPM) or functionality.

Figure 1: Example Extensions Page

▶️

Techinar video "Cognigy Extensions"

Watch this Episode of Cognigy Sessions for a technical deep dive

Extensions allow developers to extend the capabilities of Cognigy.AI by uploading JavaScript Modules into Cognigy.AI. These modules can expose Cognigy Flow Nodes and be used directly within Cognigy Flows.

Please have a look at the Extensions GitHub repository for more information.
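
To give a sense of what an Extension module looks like, the following is a minimal sketch of a single Flow Node built with the "@cognigy/extension-tools" package used in that repository. The node type, label, and field keys are illustrative, and descriptor options may differ between package versions, so treat the repository examples as the authoritative reference.

import { createExtension, createNodeDescriptor, INodeFunctionBaseParams } from "@cognigy/extension-tools";

// A simple Flow Node that reverses a text input and outputs the result.
const reverseTextNode = createNodeDescriptor({
    type: "reverseText",
    defaultLabel: "Reverse Text",
    fields: [
        {
            key: "text",
            label: "Text to reverse",
            type: "cognigyText",
            params: { required: true }
        }
    ],
    function: async ({ cognigy, config }: INodeFunctionBaseParams) => {
        const { api } = cognigy;
        const { text } = config as { text: string };

        // Send the reversed text back into the conversation
        api.output(text.split("").reverse().join(""), null);
    }
});

// The default export describes the Extension and lists the nodes it exposes
export default createExtension({
    nodes: [reverseTextNode]
});

Once built and packaged (for example with "npm pack"), the resulting archive can be uploaded on the Extensions page shown above, and the node becomes available in the Flow editor.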

🚧

Extension Timeout

Extensions have a default timeout of 20 seconds. The timeout can be changed on dedicated Cognigy.AI installations.

Extension Modules

If you're looking for Cognigy's currently released Extensions to install into your Cognigy.AI instance, see the link below.

Source Code

If you want to develop your own Extensions and would like to see examples, visit our GitHub repository.

Get Started

If you're keen to get started developing Extensions right away, see our Get Started documentation.

📘

Where are Custom Modules?

Extensions are the successor to the Cognigy Integration Framework Custom Modules and replace them completely.

Extension performance

Cognigy.AI considers the code within an Extension to be "untrusted", meaning that the code is executed in a secure, additionally hardened environment by default. There is a certain overhead in bootstrapping this secure environment per execution - hence Flow Nodes from Extensions generally execute slower than our built-in ones (e.g. our "Say" Node).

With Cognigy.AI v4.1.6, we introduced the ability to "trust" the code of an Extension, letting customers decide whether they want to execute the code in the secure environment or in the normal execution environment in which our own Flow Nodes run.

🚧

Feature availability

This feature is only available for our on-premise customers or dedicated SaaS customers with their own Cognigy.AI installation.

To enable the feature, set the following additional environment variable:

FEATURE_ALLOW_TRUSTED_CODE_CONFIGURATION=true

Our customers usually accomplish this by adding the following to their "config-map_patch.yaml" in the "kubernetes" repository in which the manifest files for deployment are located:

- op: add
  path: /data/FEATURE_ALLOW_TRUSTED_CODE_CONFIGURATION
  value: "true"

Enabling the feature will not change anything automatically. Once the feature has been activated, an additional API endpoint (see our RESTful API documentation) can be used to update the "trustedCode" property of an Extension.
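
As a rough illustration only, such a call could look like the sketch below (TypeScript, assuming Node 18+ with a global fetch). The base URL, authentication header, and endpoint path are assumptions made for this example; the authoritative endpoint and payload are described in the RESTful API documentation of your installation.

// Hypothetical sketch: mark an uploaded Extension as trusted via the REST API.
// Base URL, header name, and path are assumptions - consult the API reference.
const baseUrl = "https://api.my-cognigy-installation.example";   // assumed API base URL
const extensionId = "<extension-id>";                            // the _id of the uploaded Extension

async function trustExtension(): Promise<void> {
    const response = await fetch(`${baseUrl}/v2.0/extensions/${extensionId}`, {
        method: "PATCH",
        headers: {
            "Content-Type": "application/json",
            "X-API-Key": process.env.COGNIGY_API_KEY ?? ""        // assumed authentication header
        },
        body: JSON.stringify({ trustedCode: true })
    });

    if (!response.ok) {
        throw new Error(`Updating the Extension failed with status ${response.status}`);
    }
}

trustExtension().catch(console.error);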

❗️

Security considerations

There is a reason why Extensions and their code are executed in the secure environment by default! Never trust the code of an Extension without properly reviewing it! Extensions can use external packages from NPM which might contain harmful code and routines; once an Extension runs in the "native context", it might be able to steal sensitive information. Please make sure that you are aware of these implications before changing the execution context.
