The NiCE Cognigy MCP Server is an npm package that provides a Model Context Protocol (MCP) server for connecting MCP clients like Claude, Cursor, or VS Code to the Cognigy.AI REST API. With the NiCE Cognigy MCP Server, you can create, test, and iteratively improve LLM-based AI Agents without leaving your development environment.

Key Benefits

  • Comprehensive API coverage. Workflow tools in the NiCE Cognigy MCP Server span the Cognigy.AI API, enabling end-to-end automation.
  • Fast AI Agent setup and iteration. Build an AI Agent persona, Flow, Node, and REST Endpoint in one workflow, then test, refine, and repeat until it performs as expected.
  • Knowledge and guidance built in. Access Knowledge Stores for RAG use cases to give your AI Agent the information it needs.
  • Reliable and safe operations. Rate limiting, Zod input validation, and standardized RFC 7807 error responses keep operations with the NiCE Cognigy MCP Server reliable and safe.

Prerequisites

  • A Cognigy.AI API key. To create one, go to User Menu > My Profile > API Keys > Create New.
  • A Cognigy.AI API base URL. For example, for the trial environment, use https://api-trial.cognigy.ai.
  • An MCP-compatible client such as Claude Desktop, Claude Code, Codex, Cursor, or VS Code with GitHub Copilot.
  • Node.js 20 or later for command-line and manual setup. To confirm that your API key and base URL work, see the verification sketch after this list.
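
If you want to verify the API key and base URL before connecting an MCP client, a single request to the Cognigy.AI REST API is enough. The sketch below is illustrative only: the /new/v2.0/projects path, the X-API-Key header, and the COGNIGY_BASE_URL and COGNIGY_API_KEY environment variable names are assumptions, so check the OpenAPI documentation for your environment before relying on them.

```typescript
// verify-credentials.ts: minimal sketch for checking a Cognigy.AI API key.
// The /new/v2.0/projects path, X-API-Key header, and environment variable
// names are assumptions; adjust them to match your environment.
const baseUrl = process.env.COGNIGY_BASE_URL ?? "https://api-trial.cognigy.ai";
const apiKey = process.env.COGNIGY_API_KEY ?? "";

async function verifyCredentials(): Promise<void> {
  // Listing Projects is a cheap, read-only call that exercises authentication.
  const response = await fetch(`${baseUrl}/new/v2.0/projects`, {
    headers: { "X-API-Key": apiKey },
  });
  if (!response.ok) {
    throw new Error(`Cognigy.AI API returned ${response.status} ${response.statusText}`);
  }
  console.log("Credentials look valid:", await response.json());
}

verifyCredentials().catch((error) => {
  console.error("Verification failed:", error instanceof Error ? error.message : error);
  process.exit(1);
});
```
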

How to Set Up

The easiest setup path is to install the .mcpb bundle in Claude Desktop; it doesn't require Node.js.
  1. Download the .mcpb file from the npm package page.
  2. Double-click the file to open the installation dialog in Claude Desktop.
  3. Enter your Cognigy.AI API base URL and API key.

Security

  • API keys are passed through environment variables and are never logged.
  • All inputs are validated with Zod schemas before reaching the API; a sketch of this pattern follows this list.
  • Rate limiting protects the API from excessive or abusive requests.
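
The sketch below makes the validation and error pattern concrete: a Zod schema rejects malformed input before any request reaches the Cognigy.AI API, and failures are reported as RFC 7807 problem-details objects. The schema fields and helper are illustrative only, not the server's actual implementation.

```typescript
import { z } from "zod";

// Illustrative input schema; the real tool schemas in the server differ.
const createAgentInput = z.object({
  projectId: z.string().min(1),
  name: z.string().min(1).max(100),
});

// RFC 7807 "problem details" shape used for standardized error responses.
interface ProblemDetails {
  type: string;
  title: string;
  status: number;
  detail?: string;
}

// Validate tool input before it reaches the API; return a problem-details
// object instead of throwing, so the client receives a structured error.
function validateCreateAgent(
  input: unknown
): { ok: true; data: z.infer<typeof createAgentInput> } | { ok: false; problem: ProblemDetails } {
  const result = createAgentInput.safeParse(input);
  if (!result.success) {
    return {
      ok: false,
      problem: {
        type: "about:blank",
        title: "Invalid tool input",
        status: 400,
        detail: result.error.issues.map((i) => `${i.path.join(".")}: ${i.message}`).join("; "),
      },
    };
  }
  return { ok: true, data: result.data };
}
```
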

Privacy

The NiCE Cognigy MCP Server sends requests only to the Cognigy.AI API base URL. The server doesn't collect, store, or share any data. All data stays between your MCP client and your Cognigy.AI instance. For more information, see the Cognigy Privacy Policy.

How to Use

Once the server is connected to your MCP client, you can interact with Cognigy.AI using natural language prompts. The following examples show prompts for common workflows and the tools the MCP client runs in response. For a full list of available tools, refer to the Tools section of the NiCE Cognigy MCP Server npm package documentation.

Ask your MCP client to create an AI Agent in a Cognigy.AI Project:

    Create a Cognigy AI Agent called "Support Bot" in Project <projectId>.
    Give it a helpful customer support persona, set up GPT-4 as the LLM,
    and return the endpoint URL so I can test it.

This workflow uses the setup_llm tool to create the LLM resource and the create_ai_agent tool to provision the AI Agent, Flow, and REST Endpoint.

Send a message to the AI Agent through its Endpoint:

    Talk to my Support Bot at <endpointUrl> and ask "How do I reset my password?"

This workflow uses the talk_to_agent tool and returns the response in your MCP client.

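Under the hood this is a plain HTTP call to the REST Endpoint, so you can reproduce it outside the MCP client if needed. The sketch below assumes the userId, sessionId, and text body fields of the Cognigy REST Endpoint; confirm the exact request format in the Endpoint reference for your environment.

```typescript
// Minimal sketch of sending a message to a Cognigy.AI REST Endpoint directly.
// The userId/sessionId/text body fields are assumed from the REST Endpoint
// convention; verify them against the Endpoint reference for your version.
async function askSupportBot(endpointUrl: string, question: string): Promise<unknown> {
  const response = await fetch(endpointUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      userId: "mcp-test-user",
      sessionId: `session-${Date.now()}`, // a new session per test run
      text: question,
    }),
  });
  if (!response.ok) {
    throw new Error(`Endpoint returned ${response.status}`);
  }
  return response.json(); // contains the AI Agent's reply
}

// Example: askSupportBot("<endpointUrl>", "How do I reset my password?");
```
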
Refine the AI Agent instructions and compare responses:

    Update the job description to make the response more concise and actionable.
    Talk to it again and compare the responses.

This workflow uses a self-improvement loop: talk_to_agent → evaluate → update_ai_agent → talk_to_agent.

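Written out, the loop is: send a test message, evaluate the reply, update the AI Agent's job description, and try again. The sketch below uses hypothetical wrappers (talkToAgent, updateAiAgent, evaluateReply); these are placeholders for the MCP tool calls, not functions the package exports.

```typescript
// Hedged sketch of the self-improvement loop. talkToAgent, updateAiAgent,
// and evaluateReply are hypothetical placeholders for the MCP tool calls.
declare function talkToAgent(endpointUrl: string, text: string): Promise<string>;
declare function updateAiAgent(agentId: string, jobDescription: string): Promise<void>;
declare function evaluateReply(reply: string): { ok: boolean; revisedInstructions: string };

async function refineAgent(agentId: string, endpointUrl: string): Promise<void> {
  for (let attempt = 0; attempt < 3; attempt++) {
    const reply = await talkToAgent(endpointUrl, "How do I reset my password?");
    const evaluation = evaluateReply(reply);
    if (evaluation.ok) {
      return; // the reply is concise and actionable, so stop iterating
    }
    await updateAiAgent(agentId, evaluation.revisedInstructions);
  }
}
```
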
Attach a Knowledge Store so the AI Agent can search approved content when answering questions:

    Create a Knowledge Store in Project <projectId>, add the URL
    https://docs.example.com/faq as a source, then attach it to my Support Bot
    so the AI Agent can reference it to answer questions.

This workflow uses the manage_knowledge tool to create the store and ingest the source, then create_tool to attach it to the AI Agent Node.

List resources to inspect Projects, AI Agents, or conversations:

    List all my Cognigy.AI Projects and show which AI Agents exist in each one.

This workflow uses the list_resources tool with the resource types project and agent. You can also ask for recent conversations and summarize the returned data.

Export a Package from the Project, list exportable Packages, and import the Package into another Project:

    Upload the package at `/absolute/path/to/support-bot.zip` into Project <projectId>,
    show me the import preview, then import it using the default selections.

You can also drag and drop files into some MCP clients to upload them.

This workflow uses the manage_packages tool with the upload_and_inspect, list_exportable, and import operations.

More Information