Designing conversational experiences with sentiment analysis in Amazon Lex
To have an effective conversation, it is important to understand the user’s sentiment and respond appropriately. In a customer service call, a simple acknowledgment when talking to an unhappy customer can help, such as, “Sorry to hear you are having trouble.” Understanding sentiment is also useful for determining when you need to hand over the call to a human agent for additional support.
To achieve such a conversational flow with a bot, you have to detect the sentiment expressed by the user and react appropriately. Previously, you had to build a custom integration using the Amazon Comprehend APIs. As of this writing, you can determine the sentiment natively in Amazon Lex. This post demonstrates how to use user sentiment to better manage the conversation flow. We describe the steps to build a bot, add logic to adjust the response based on user sentiment, and configure handover to an agent.
Building a bot
We will use the following conversation to model a bot:
User: When is my package arriving? It’s so late.
Agent: Apologies for the inconvenience. Can I get your tracking number?
User: (provides tracking number)
Agent: Got it. It should be delivered to your home address on Nov 27th.
User: Great, thanks.
Now, let’s build an Amazon Lex bot with intents to track delivery status and change the delivery date. The CheckDeliveryStatus intent elicits the tracking number and responds with the delivery date. The ChangeDeliveryDate intent updates the delivery to a new date. In this post, we maintain a database with the tracking number and delivery date. You can use an AWS Lambda function to update the delivery date.
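As a minimal sketch of the fulfillment side, the following Lambda function looks up the delivery date for the CheckDeliveryStatus intent. The in-memory DELIVERIES dict stands in for the tracking database described above, and the TrackingNumber slot name and sample values are assumptions for illustration:

```python
# Sketch of a fulfillment Lambda for the CheckDeliveryStatus intent (Lex V1
# event/response shape). DELIVERIES stands in for the tracking database.
DELIVERIES = {
    "1Z999": {"delivery_date": "Nov 27th", "address": "home address"},
}

def close(message):
    """Build a 'Close' dialog action with a fulfilled plain-text message."""
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {"contentType": "PlainText", "content": message},
        }
    }

def lambda_handler(event, context):
    intent = event["currentIntent"]["name"]
    if intent == "CheckDeliveryStatus":
        tracking = event["currentIntent"]["slots"].get("TrackingNumber")
        record = DELIVERIES.get(tracking)
        if record is None:
            return close("Sorry, I could not find that tracking number.")
        return close(
            "Got it. It should be delivered to your {} on {}.".format(
                record["address"], record["delivery_date"]
            )
        )
    return close("Sorry, I can't help with that yet.")
```

The same handler could branch on ChangeDeliveryDate to write the new date back to the table.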
To enable sentiment analysis in the bot, complete the following steps:
- On the Amazon Lex console, choose your bot
- Under Settings, choose General
- For Sentiment Analysis, choose Yes
- Choose Build to create a new build
Adding logic to modify response
Now that you have set up the bot, add logic to respond to the user’s sentiment. The dialog code hook in CheckDeliveryStatus examines the sentiment score. If the score for negative sentiment is above a certain threshold, you can inject an acknowledgment such as “Apologies for the inconvenience” when prompting for the tracking number. See the following Lambda code snippet:
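A minimal sketch of such a dialog code hook follows. It assumes the Lex V1 event shape, in which sentimentResponse.sentimentScore arrives as a string; the 0.6 threshold and the TrackingNumber slot name are illustrative assumptions:

```python
import re

NEGATIVE_THRESHOLD = 0.6  # illustrative value; tune for your bot

def negative_score(event):
    """Extract the negative-sentiment score from a Lex V1 event, where
    sentimentScore arrives as a string such as
    '{Positive: 0.01, Negative: 0.95, Neutral: 0.03, Mixed: 0.01}'."""
    raw = (event.get("sentimentResponse") or {}).get("sentimentScore") or ""
    match = re.search(r'Negative"?\s*:\s*([0-9.]+)', raw)
    return float(match.group(1)) if match else 0.0

def elicit_slot(event, slot_to_elicit, message):
    """Build an ElicitSlot dialog action for the current intent."""
    return {
        "dialogAction": {
            "type": "ElicitSlot",
            "intentName": event["currentIntent"]["name"],
            "slots": event["currentIntent"]["slots"],
            "slotToElicit": slot_to_elicit,
            "message": {"contentType": "PlainText", "content": message},
        }
    }

def lambda_handler(event, context):
    prompt = "Can I get your tracking number?"
    if negative_score(event) > NEGATIVE_THRESHOLD:
        # Acknowledge the negative sentiment before re-prompting.
        prompt = "Apologies for the inconvenience. " + prompt
    if not event["currentIntent"]["slots"].get("TrackingNumber"):
        return elicit_slot(event, "TrackingNumber", prompt)
    # Slot already filled; let Lex continue its built-in dialog management.
    return {"dialogAction": {"type": "Delegate",
                             "slots": event["currentIntent"]["slots"]}}
```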
The following event is passed to the Lambda function:
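The exact event shape depends on the Lex version; for a Lex V1 bot it looks roughly like the following. The bot name, user ID, and score values here are illustrative; note that sentimentScore is delivered as a string:

```json
{
  "messageVersion": "1.0",
  "invocationSource": "DialogCodeHook",
  "userId": "user-1234",
  "bot": {"name": "DeliveryBot", "alias": "$LATEST", "version": "$LATEST"},
  "currentIntent": {
    "name": "CheckDeliveryStatus",
    "slots": {"TrackingNumber": null},
    "confirmationStatus": "None"
  },
  "inputTranscript": "When is my package arriving? It's so late.",
  "sentimentResponse": {
    "sentimentLabel": "NEGATIVE",
    "sentimentScore": "{Positive: 0.01, Negative: 0.95, Neutral: 0.03, Mixed: 0.01}"
  }
}
```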
You can also perform analytics across multiple conversations by keeping track of the aggregated score at the conversation level. This post maintains a database with an entry for each intent. You can store the aggregate of the sentiment scores for each intent per conversation in the table, and use this information to get insights into how specific intents are performing. You can also track overall sentiment at a user or bot level.
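As a sketch of that per-intent aggregation (the structure and names here are illustrative assumptions, not the post's actual schema), a running average of negative-sentiment scores can be maintained like this:

```python
from collections import defaultdict

class SentimentAggregator:
    """Keep a running average of negative-sentiment scores per intent.
    In production, the counts and totals would live in a database table
    keyed by conversation and intent; a dict stands in for that table."""

    def __init__(self):
        self._stats = defaultdict(lambda: {"count": 0, "total": 0.0})

    def record(self, intent_name, negative_score):
        """Record one turn's negative-sentiment score for an intent."""
        stats = self._stats[intent_name]
        stats["count"] += 1
        stats["total"] += negative_score

    def average(self, intent_name):
        """Average negative-sentiment score for an intent (0.0 if unseen)."""
        stats = self._stats[intent_name]
        return stats["total"] / stats["count"] if stats["count"] else 0.0
```

Record each turn's score from the dialog code hook, then query the averages to see which intents frustrate users most.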
Configuring the handover
Lastly, let us review the configuration for handover to an agent. You can trigger this path when the user sentiment is very negative, for example: “Where’s my delivery? This is so frustrating.”
Use an Amazon Connect contact flow to perform the handover. You can set a higher threshold to initiate the handover. Add an AgentHandover intent to the bot definition, and trigger it in the dialog code hook Lambda if the negative sentiment is above the threshold. The following screenshot shows the contact flow in Amazon Connect:
The following Lambda code snippet triggers the handover to an agent:
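A minimal, self-contained sketch of that trigger follows. It assumes the Lex V1 event shape and uses a ConfirmIntent dialog action to switch the conversation to the AgentHandover intent; the 0.85 threshold and the confirmation message are illustrative:

```python
import re

HANDOVER_THRESHOLD = 0.85  # illustrative; higher than the acknowledgment threshold

def negative_score(event):
    """Pull the negative-sentiment score out of a Lex V1 event, where
    sentimentScore arrives as a string like '{Positive: 0.01, Negative: 0.95, ...}'."""
    raw = (event.get("sentimentResponse") or {}).get("sentimentScore") or ""
    match = re.search(r'Negative"?\s*:\s*([0-9.]+)', raw)
    return float(match.group(1)) if match else 0.0

def lambda_handler(event, context):
    if negative_score(event) > HANDOVER_THRESHOLD:
        # Switch the conversation to the AgentHandover intent; the Amazon
        # Connect contact flow then routes the call to a human agent.
        return {
            "dialogAction": {
                "type": "ConfirmIntent",
                "intentName": "AgentHandover",
                "slots": {},
                "message": {
                    "contentType": "PlainText",
                    "content": "I'm sorry for the trouble. "
                               "Shall I connect you to an agent?",
                },
            }
        }
    # Otherwise continue the normal dialog for the current intent.
    return {"dialogAction": {"type": "Delegate",
                             "slots": event["currentIntent"]["slots"]}}
```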
This post demonstrated how you can understand user sentiment and enhance the conversation flow. You can also perform analytics on sentiment information or hand over the call to a human agent. For more information about incorporating these techniques into your bots, see the Amazon Lex documentation.
About the authors
Anubhav Mishra is a Product Manager with AWS. He spends his time understanding customers and designing product experiences to address their business challenges.
Kevin Cho works as a Software Development Engineer at Amazon AI. He works on simplifying and improving the Lex user experience. Outside of work he can be found discovering new food around Seattle or playing basketball with friends and family.