
Input Transformer

Description

The Input Transformer is triggered on every message from the user before the Flow is executed. This makes it possible to manipulate the text before it is sent to the Flow, communicate with external systems, implement integrations with a new channel, and much more.

The Input Transformer is configured by implementing the handleInput function in the Endpoint's Transformer.
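
For illustration, a minimal handleInput sketch for a REST Endpoint could look like the following. This assumes the createRestTransformer wrapper from the default Transformer template; the fields read from request.body are placeholders and depend on your channel:

createRestTransformer({
    handleInput: async ({ endpoint, request, response }) => {
        // Pull the conversation identifiers and the message text out of the channel request.
        // The field names below are placeholders; every channel structures its body differently.
        const userId = request.body.user;
        const sessionId = request.body.conversation;
        const text = request.body.messageText;

        return { userId, sessionId, text, data: request.body };
    }
});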

Transformer Function Arguments

The handleInput function gets a configuration object as an argument. This object always contains the key endpoint, which contains the Endpoint configuration. The rest of the keys in the object depend on the base type of the Transformer. An overview of the keys is given in the table below.

| Argument | Description | Webhook Transformers | REST Transformers | Socket Transformers |
| --- | --- | --- | --- | --- |
| endpoint | The configuration object for the Endpoint. Contains the URL Token etc. | X | X | X |
| request | The Express request object with a JSON-parsed body. | X | X | |
| response | The Express response object. | X | X | |
| payload | The payload object contains the userId, sessionId, text and data that were sent through the Socket. It also contains the channel of the client. | | | X |
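
As a sketch of how the arguments differ by base type, a Socket Transformer only receives endpoint and payload, while Webhook and REST Transformers work with request and response instead. The createSocketTransformer wrapper is assumed to come from the Transformer template:

createSocketTransformer({
    handleInput: async ({ endpoint, payload }) => {
        // Socket Endpoints have no Express request/response pair;
        // everything arrives pre-parsed in the payload object.
        return {
            userId: payload.userId,
            sessionId: payload.sessionId,
            text: payload.text,
            data: payload.data
        };
    }
});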

Return Values of the Transformer

Regular Transformer Usage

The Input Transformer can return a valid user ID, session ID, and text and/or data that should be sent to the Flow. These values should be extracted from the body of the request. It is important to note that the format of the request body differs between channels, e.g. a request from Alexa looks very different from a request from Facebook Messenger. It is therefore necessary to read the documentation of the specific channel to know how its request body is formatted.

Example:

return {
    userId: request.body.user,
    sessionId: request.body.conversation,
    text: request.body.messageText,
    data: { "test": 1 }
};
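
As a channel-specific sketch, extracting the values from a Facebook Messenger-style webhook body might look like this; the entry/messaging structure follows Messenger's webhook format, so verify the exact fields in the Messenger documentation:

// Messenger wraps incoming messages in entry[].messaging[].
const messaging = request.body.entry[0].messaging[0];

return {
    userId: messaging.sender.id,
    sessionId: messaging.sender.id,
    text: messaging.message.text,
    data: { raw: messaging }
};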

Partial Transformer Results

If undefined is returned for userId, sessionId, text or data, the value already extracted by the Endpoint is used instead.

The following example overwrites text and data, but keeps the userId and sessionId as they are:

return {
    userId: undefined,
    sessionId: undefined,
    text: request.body.messageText,
    data: { "test": 1 }
};

Stopping Execution

If the Input Transformer returns a falsy value (e.g. null) instead of an object, the message from the user is never sent to the Flow.

Example:

return null;
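
A more realistic sketch only stops execution for messages without text, e.g. delivery receipts or typing events. The field names are placeholders, as in the earlier examples:

// Drop events that carry no text so they never reach the Flow.
if (!request.body.messageText) {
    return null;
}

return {
    userId: request.body.user,
    sessionId: request.body.conversation,
    text: request.body.messageText
};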

Transformers and Conversation Counts

Conversations in Cognigy.AI are only counted if the Input Transformer returns a non-falsy result.

🚧 Return Value Validation

The return value of the Input Transformer, if provided, will be validated against a set of rules and rejected if the rules are not met.
Every value can be undefined. If anything else is returned, these rules apply:

  • userId is a string with a max length of 256 characters.
  • sessionId is a string with a max length of 256 characters.
  • text is a string with a max length of 10000 characters.
  • data is an object.
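
A sketch of a return value that stays within these limits; the field names are placeholders, and truncating is just one possible way to satisfy the length rules:

return {
    userId: String(request.body.user).slice(0, 256),            // string, max 256 characters
    sessionId: String(request.body.conversation).slice(0, 256), // string, max 256 characters
    text: String(request.body.messageText).slice(0, 10000),     // string, max 10000 characters
    data: { raw: request.body }                                  // must be an object
};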
