Building enterprise-grade, stable, smart bots using machine learning services from AWS

Abbott Laboratories has more data than its field team can decipher while on-site with clients. Their solution? Working with Smart Bots to build an enterprise-grade, reliable, and stable chatbot called Maya, powered by AWS machine learning services like Amazon Lex, AWS Lambda, Amazon Comprehend, and Amazon SageMaker.

For context, Abbott Laboratories is a multinational healthcare company and a forerunner in AI deployment in India. Maya serves Abbott’s 3,000+ person field force in India, providing sales operations support and putting contextual information at employees’ fingertips.

The chatbot proves especially helpful while employees are in the field meeting doctors. Maya can handle the nitty-gritty of querying and fetching information from enterprise applications so that employees can focus on higher-order tasks.

Maya is integrated with the customer relationship management (CRM) system at Abbott. For each query, the bot gets authenticated on behalf of the user and retrieves the required information.

Amazon Lex enables the language model

Amazon Lex is core to the Maya solution; it was chosen after extensive discussions about the conversation flows and the protocol for accessing data from the backend system.

The team identified intents from the conversation flows. Maya today has more than 50 intents—including a “small talk” intent to make the bot more human-like—and close to 250 slots. Most of the intents revolve around data-related actions (for example, filter, compute, and so on). The small talk intent handles phrases like “thank you for your help.”
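As a purely hypothetical illustration (the intent and slot names here are invented, not Maya’s actual ones), a data-related utterance might resolve into an intent and slots roughly like this:

# Hypothetical example of an utterance resolving into a data-related intent and slots.
utterance = "Show my sales for cardiology in March"
resolved = {
    "intent": "FilterSalesData",        # one of the data-related intents (for example, filter)
    "slots": {
        "specialty": "cardiology",
        "time_period": "March",
    },
}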

Lambda determines the response

All of the intents are linked to a single Lambda function, which performs the following steps on every request:

  • Validate the slots based on business rules.
  • Call all the subscribed methods related to the newly filled slots.
  • Identify the next state.
  • Construct the response object.

Lambda proved the right fit for implementing the validation and state-flow logic described above.
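A minimal sketch of what such a single handler might look like, assuming the Lex V1 Lambda event and response shapes (the helper functions and business rules below are placeholders, not Abbott’s actual code):

# Placeholder implementations of the business logic; the real rules live in
# Abbott's backend systems and are not public.
def validate_slots(intent, slots):
    missing = [name for name, value in slots.items() if value is None]
    if missing:
        return {"valid": False, "slot": missing[0],
                "message": "Please provide a value for {}.".format(missing[0])}
    return {"valid": True}

def run_subscribed_methods(intent, slots, session):
    # Would call the methods subscribed to the newly filled slots (queries, filters, computations).
    return {"data": "results for {}".format(intent)}

def identify_next_state(intent, slots, session, results):
    # Would decide whether to elicit another slot, chain to another intent, or close.
    return {"session": session, "reply": results["data"]}

def lambda_handler(event, context):
    """Single entry point for every intent (sketch using the Lex V1 event/response format)."""
    intent = event["currentIntent"]["name"]
    slots = event["currentIntent"]["slots"]
    session = event.get("sessionAttributes") or {}

    # 1. Validate the slots based on business rules.
    validation = validate_slots(intent, slots)
    if not validation["valid"]:
        return {
            "sessionAttributes": session,
            "dialogAction": {
                "type": "ElicitSlot",
                "intentName": intent,
                "slots": slots,
                "slotToElicit": validation["slot"],
                "message": {"contentType": "PlainText", "content": validation["message"]},
            },
        }

    # 2. Call the subscribed methods related to the newly filled slots.
    results = run_subscribed_methods(intent, slots, session)

    # 3. Identify the next state.
    next_state = identify_next_state(intent, slots, session, results)

    # 4. Construct the response object.
    return {
        "sessionAttributes": next_state["session"],
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {"contentType": "PlainText", "content": next_state["reply"]},
        },
    }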

Session attributes handle the context

The team used intent chaining to enhance the conversation flow; they credit it with making the bot smarter and streamlining bot management. For those less familiar with the concept, intent chaining allows the conversation to shift between multiple intents without losing context. In Maya, context is stored as JSON in the session attributes. The context object is structured as follows:

sessionAttributes: {
  "context": {
    "previous-context": {
      "primary-context": true,
      "intent-name": "intent-A",
      "slots": {
        "slot-name": "slot-value",
        ...
      },
      "context-variable-1": "value",
      "context-variable-2": "value"
    },
    "current-context": {
      "intent-name": "intent-B",
      "context-variable-3": "value",
      "context-variable-4": "value"
    }
  }
}

* Values in session attributes can only be strings, so the context JSON object has to be stringified before it is assigned.
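In a Python handler, reading and writing that stringified context might look like the following minimal sketch (the helper names are ours, not Maya’s):

import json

def read_context(event):
    # Session attribute values are strings, so the context arrives as a JSON string.
    session = event.get("sessionAttributes") or {}
    return json.loads(session["context"]) if "context" in session else {}

def write_context(session, ctx):
    # Stringify the context object before assigning it back to the session attributes.
    session = dict(session or {})
    session["context"] = json.dumps(ctx)
    return session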

In the above example, the flow was shifted from intent A to intent B (leaving intent A pending fulfillment). After the current intent (intent B) is fulfilled, the dialogue state goes back to intent A, retaining the previous state.

In real-world terms, this example is applicable in the healthcare space when a user wants to toggle between analysis of a large dataset and individual patient health records. For example, users may want to view the analysis for the causes, symptoms, and likelihood of various diseases.

Results and next steps

With the Maya chatbot deployed in the field, about a third of the queries that medical representatives raise are now answered by Maya rather than a human.

In the coming months, the team looks to expand the use of the chatbot and make it smarter. In particular, they’re looking at using Amazon SageMaker Reinforcement Learning with the Gym interface to facilitate ongoing training while engaging users. The idea is to prompt a user with what the bot expects to be the next set of useful interactions, then reward or penalize the bot based on the relevance of its recommendations.
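The details of that training loop aren’t public, but as a rough illustration of the idea, a toy Gym-style environment in which the reward reflects the relevance of the suggested interaction might look like this (the class, state, and reward logic are invented for illustration):

import gym
from gym import spaces

class SuggestionEnv(gym.Env):
    """Toy Gym environment: the agent chooses which follow-up interaction to
    suggest, and the reward reflects whether the suggestion was relevant.
    Purely illustrative; not Abbott's actual training environment."""

    def __init__(self, num_interactions=10):
        self.action_space = spaces.Discrete(num_interactions)        # which interaction to suggest
        self.observation_space = spaces.Discrete(num_interactions)   # the user's last interaction
        self._last_user_action = 0

    def reset(self):
        self._last_user_action = 0
        return self._last_user_action

    def step(self, action):
        # In production the reward would come from the user accepting or ignoring
        # the suggestion; here it is simulated for illustration.
        reward = 1.0 if action == self._last_user_action else -0.1
        self._last_user_action = self.observation_space.sample()
        return self._last_user_action, reward, False, {}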

Amazon SageMaker is also core to a mother-bot architectural approach that is currently being tested. The mother bot is effectively the coordination point that can query the correct child bot to get an answer for the user. This ensemble of bots is expected to perform even better than a single bot handling all the intents. From a technical perspective, the mother bot is a classification algorithm implemented in Amazon SageMaker, a relatively easy task thanks to the streamlined workflow that Amazon SageMaker enables.
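As a rough sketch of how that routing could work (the endpoint name, bot names, and alias below are hypothetical placeholders), a coordinator might call a SageMaker-hosted classifier and then forward the utterance to the chosen child bot:

import json
import boto3

sm_runtime = boto3.client("sagemaker-runtime")
lex_runtime = boto3.client("lex-runtime")

def route_to_child_bot(user_id, text):
    """Sketch of the mother-bot idea: a SageMaker-hosted classifier picks the
    child bot, and the query is forwarded to it."""
    # Ask the classifier endpoint which child bot should handle the utterance.
    response = sm_runtime.invoke_endpoint(
        EndpointName="mother-bot-classifier",      # hypothetical endpoint name
        ContentType="application/json",
        Body=json.dumps({"text": text}),
    )
    child_bot = json.loads(response["Body"].read())["bot_name"]

    # Forward the utterance to the chosen child bot.
    return lex_runtime.post_text(
        botName=child_bot,
        botAlias="prod",                            # hypothetical alias
        userId=user_id,
        inputText=text,
    )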


About the Author

Marisa Messina is on the AWS AI marketing team, where her job includes identifying the most innovative AWS-using customers and showcasing their inspiring stories. Prior to AWS, she worked on consumer-facing hardware and then university-facing cloud offerings at Microsoft. Outside of work, she enjoys exploring the Pacific Northwest hiking trails, cooking without recipes, and dancing in the rain.