Prerequisites
- Configure a contact center platform that you can integrate with a handover provider. For example, for Live Agent, you need to have an Inbox with the Project ID and the Webhook URL set up and, if necessary, the Enable AI Copilot Workspace option activated.
- Add a compatible LLM for the LLM Prompt and Copilot: Sentiment Tile Nodes and configure the LLM in the Generative AI settings.
Setup Process
The setup process includes the following steps:
- Create a handover provider. The handover provider connects the AI Agent and Agent Copilot to your contact center.
- Create a Handover Flow. The Handover Flow controls the interaction with the AI Agent and the handover to a human agent.
- Create an Agent Copilot Flow. The Agent Copilot Flow controls the Agent Copilot workspace and the individual widgets in it.
- Test the Agent Copilot workspace through a Webchat v3 Endpoint.
Set Up Agent Copilot for Chat
1. Create a Handover Provider
In Deploy > Handover Providers, create a Handover Provider according to your contact center platform.
2. Create a Handover Flow
- In Build > Flows, create a Flow and give it a clear name, for example, `Handover`.
- In the Flow editor, add a Say Node and set its Text parameter to `Hi, let's hand you over to the human agents.`
- Add a Handover to Agent Node and configure it as follows:
  - Handover Provider — set to the previously created Handover Provider and configure it accordingly. For example, if you use Live Agent, you need to set the Live Agent Inbox ID.
  - Handover Accepted Message — enter `Transferring to human agents...`.
3. Create an Agent Copilot Flow
- In Build > Flows, create a Flow and give it a clear name, for example, `Agent Copilot`.
- Add a Copilot: Set Grid Node and configure the grid as follows:
- After the Copilot: Set Grid Node, add an LLM Prompt Node. Configure it to generate the next best response for the agent based on the user's latest input and to store the result in the Input object (a conceptual sketch of this data flow follows the list):
  - System Prompt — enter the following:
  - In the Storage & Streaming section, configure the following:
    - How to handle the result — set to Store in Input.
    - Input Key to store Result — set to `input.promptResult`.
- Add the following Agent Copilot Nodes after the LLM Prompt Node and configure them as follows:
  - Copilot: Next Action Tile Node:
    - Tile ID — `next-best-action`.
    - Text — reference the stored LLM result, for example, `{{input.promptResult}}`. This expression dynamically retrieves the LLM-generated suggestion for the tile.
    - (Optional) In the Settings section, set Label to `Next Best Action`.
  - Copilot: Sentiment Tile Node:
    - Tile ID — `sentiment`.
    - (Optional) In the Settings section, set Label to `Customer Sentiment`.
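The Nodes in this step pass data to each other only through their settings: the LLM Prompt Node writes its result to the Input object, and the tile Nodes read it from there. The following TypeScript sketch illustrates that data flow conceptually; it is not Cognigy code. The `CopilotTile` and `ConversationInput` types, `callLlm`, and `buildCopilotTiles` are assumptions made for this illustration, while the `promptResult` key, the tile IDs, and the labels come from the steps above.

```typescript
// Conceptual sketch only: shows the data flow that the LLM Prompt Node and the
// Copilot tile Nodes implement through their settings. None of these names are
// Cognigy APIs.

type CopilotTile = { tileId: string; label?: string; text: string };

interface ConversationInput {
  text: string;           // latest user message
  promptResult?: string;  // result stored by the LLM Prompt step
}

// Stand-in for the LLM configured in the Generative AI settings (hypothetical).
async function callLlm(systemPrompt: string, userText: string): Promise<string> {
  return `[${systemPrompt}] Suggested reply for: "${userText}"`;
}

async function buildCopilotTiles(input: ConversationInput): Promise<CopilotTile[]> {
  // LLM Prompt Node: generate the next best response and store it in the Input object.
  const suggestion = await callLlm(
    "Suggest the next best response the human agent could send.",
    input.text,
  );
  input.promptResult = suggestion;

  return [
    // Copilot: Next Action Tile Node — its Text field reads the stored result,
    // as {{input.promptResult}} would in the Node settings.
    { tileId: "next-best-action", label: "Next Best Action", text: suggestion },
    // Copilot: Sentiment Tile Node — the sentiment is computed by the Node itself,
    // so this placeholder only marks where it appears in the grid.
    { tileId: "sentiment", label: "Customer Sentiment", text: "(computed by the Node)" },
  ];
}

// Example: simulate one incoming user message.
buildCopilotTiles({ text: "Hello, I need help with my order" }).then((tiles) =>
  console.log(tiles),
);
```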
4. Test Agent Copilot
To test Agent Copilot, create a Webchat v3 Endpoint:
- In Deploy > Endpoints, create a Webchat v3 Endpoint.
- In the Endpoint settings, configure the following:
  - Set Flow to the previously created Handover Flow, for example, `Handover`.
  - In the Copilot section, set Copilot Flow to the previously created Agent Copilot Flow, for example, `Agent Copilot`.
- Click Save to activate the Open Demo Webchat button.
- In the Webchat v3 Endpoint settings you created, click Open Demo Webchat in the upper-right corner. Demo Webchat opens.
- Click Start conversation and send the AI Agent a message, for example, `Hello, I need help with my order`. You should receive a message with the text you set in the Say Node, followed by the text from the Handover to Agent Node.
- Go to your contact center platform and search for the session with the message you sent. The Agent Copilot workspace displays the two tiles you configured, suggesting the next best action and the customer sentiment.