QnABot on AWS is a multi-channel, multi-language conversational interface (chatbot) that responds to your customers' questions, answers, and feedback. It allows you to deploy a fully functional chatbot across multiple channels, including chat, voice, SMS, and Amazon Alexa. The solution's content management environment and contact center integration wizard let you set up and customize an environment that provides the following benefits:
- Enhance your customers' experience by providing personalized tutorials and question and answer support with intelligent multi-part interaction
- Reduce call center wait times by automating customer support workflows
- Implement the latest machine learning technology to create engaging, human-like interactions for chatbots
Deploying this solution with the default parameters deploys the following components in your AWS account (bordered components are optional).
Figure 1: QnABot on AWS architecture
The high-level process flow for the solution components deployed with the AWS CloudFormation template is as follows:
- The admin deploys the solution into their AWS account, opens the Content Designer UI or Amazon Lex web client, and uses Amazon Cognito to authenticate.
- After authentication, Amazon API Gateway and Amazon S3 deliver the contents of the Content Designer UI.
- The admin configures questions and answers in the Content Designer, and the UI sends requests to Amazon API Gateway to save the questions and answers.
- The Content Designer AWS Lambda function saves the input in Amazon OpenSearch Service in a question bank index. If using text embeddings, these requests first pass through an ML model hosted on Amazon SageMaker to generate embeddings before being saved into the question bank on OpenSearch.
- Users of the chatbot interact with Amazon Lex via the web client UI or Amazon Connect.
- Amazon Lex forwards requests to the Bot Fulfillment AWS Lambda function. Users can also send requests to this Lambda function via Amazon Alexa devices.
- The Bot Fulfillment AWS Lambda function takes the user's input and uses Amazon Comprehend and Amazon Translate (if necessary) to translate non-English requests to English, then looks up the answer in Amazon OpenSearch Service. If using LLM features such as text generation and text embeddings, these requests first pass through various ML models hosted on Amazon SageMaker to generate the search query and embeddings to compare with those saved in the question bank on OpenSearch.
- If an Amazon Kendra index is configured for fallback, the Bot Fulfillment AWS Lambda function forwards the request to Kendra if no matches were returned from the OpenSearch question bank. The text generation LLM can optionally be used to create the search query and to synthesize a response given the returned document excerpts.
- User interactions with the Bot Fulfillment function generate logs and metrics data, which is sent to Amazon Kinesis Data Firehose and then to Amazon S3 for later data analysis.
Refer to the implementation guide for detailed instructions on deploying QnABot in your AWS account.
Alternatively, if you want to deploy a custom build of QnABot on AWS, refer to the details below.
- Run Linux (tested on Amazon Linux).
- Install npm >8.6.0 and node >18.X.X (instructions)
- Install and configure git lfs (instructions)
- Clone this repo.
- Set up an AWS account. (instructions)
- Configure AWS CLI and a local credentials file. (instructions)
Navigate to the root directory of QnABot (this directory is created when you clone the repo).
Install the Node.js modules for QnABot:
npm install
Next, set up your configuration file:
npm run config
Then edit config.json, setting the following parameters:
| param | description |
| --- | --- |
| region | the AWS region to launch stacks in |
| profile | the AWS credential profile to use |
| namespace | a logical namespace to run your templates in, such as dev, test, and/or prod |
| devEmail (required) | the email to use when creating admin users in automated stack launches |
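For example, a completed config.json might look like the following (all values are illustrative placeholders; use your own region, profile, namespace, and email, and note that the generated config.json may contain additional fields beyond those in the table above):

```json
{
  "region": "us-east-1",
  "profile": "default",
  "namespace": "dev",
  "devEmail": "admin@example.com"
}
```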
Next, use the following command to launch a CloudFormation template that creates the S3 bucket used for Lambda code and CloudFormation templates. Wait for this template to complete (you can watch progress from the command line or the AWS CloudFormation console):
npm run bootstrap
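If you prefer to watch progress from the command line, one way is to poll the stack status with the AWS CLI. This is a sketch only: the stack name below is a placeholder, since the actual name depends on your namespace; look it up in the AWS CloudFormation console.

```
# Sketch: poll the bootstrap stack status from the CLI.
# <your-bootstrap-stack-name> is a placeholder; find the real stack
# name in the AWS CloudFormation console for your account and region.
aws cloudformation describe-stacks \
  --stack-name <your-bootstrap-stack-name> \
  --query "Stacks[0].StackStatus" \
  --output text
```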
Finally, use the following command to launch the template that deploys QnABot in your AWS account. When the stack has completed, you will be able to log in to the Content Designer UI (the URL is an output of the template). A temporary password will be sent to the email address in your config.json:
npm run up
If you have an existing stack you can run the following to update your stack:
npm run update
Currently the only browsers supported are:
- Chrome
- Firefox

We are currently working on adding Microsoft Edge support.
Refer to LICENSE.txt file for details.
Refer to CHANGELOG.md file for details of new features in each version.
A workshop is also available that walks you through QnABot features.
As QnABot evolves over the years, it makes use of various services and functionality which may go in and out of support. This section serves as a reference to the deployable solution versions along with links to their Public and VPC CloudFormation templates.
Note: Deployable solution versions refers to versions of QnABot that can still be deployed in your AWS account. Active support is only available for the latest version of QnABot.
- v5.4.5 - Public/VPC
- For those upgrading from v5.4.X to later versions: if you are upgrading from a deployment with LLMApi set to SAGEMAKER, set this value to DISABLED before upgrading. After upgrading, return this value back to SAGEMAKER.
- v5.4.4 - Public/VPC
- v5.4.3 - Public/VPC
- We do not recommend using this version due to a potential issue with the testall functionality, which may introduce a high number of versions stored in the testall S3 bucket. Please use the latest version available, or v5.4.4+.
- v5.4.2 - Public/VPC
- We do not recommend using this version due to a potential issue with the testall functionality, which may introduce a high number of versions stored in the testall S3 bucket. Please use the latest version available, or v5.4.4+.
- v5.4.1 - Public/VPC
- v5.4.0 - Public/VPC
- Note: Lambda runtimes have been updated in this release. The solution now uses nodejs18 and python3.10.
- v5.3.4 - Public/VPC
- v5.3.3 - Public/VPC
- v5.3.2 - Public/VPC
- v5.3.1 - Public/VPC
- v5.3.0 - Public/VPC
- v5.2.7 - Public/VPC
- v5.2.6 - Public/VPC
- v5.2.5 - Public/VPC
- v5.2.4 - Public/VPC
- v5.2.3 - Public/VPC
- v5.2.2 - Public/VPC
- v5.2.1 - Public/VPC
- Note: Lambda runtimes have been updated in this release. The solution now uses nodejs16 and python3.9.
- All solutions below v5.2.1 are no longer deployable due to Lambda runtime deprecations.
- Upcoming/recent deprecations:
- nodejs16 will enter Phase 1 deprecation on Mar 11, 2024.
For QnABot, the most common reason is due to AWS Lambda Runtimes being deprecated. When a Lambda runtime has been marked as deprecated, customers can no longer create new Lambda functions in their AWS account. This means that older versions of our solutions that make use of those runtimes will fail to deploy. This makes it hard for the community to provide support as we are unable to deploy a similar environment to investigate issues and reproduce bug reports.
If you currently have an existing deployment working for you, there is nothing requiring you to update. However, it is strongly recommended that you build a plan to test and migrate production deployments to a supported version. The further a deployment falls behind the latest version, the greater its risk of instability (especially with regard to deployment).
And for those looking to get started with the solution for the first time, it is always recommended you use the latest version. It is the most secure, stable, and feature-rich version of QnABot!
In most cases, a simple Update Stack operation should allow you to migrate your instance onto a newer version while maintaining your data on the new deployment.
Note: For those upgrading from v5.4.X to later versions: if you are upgrading from a deployment with LLMApi set to SAGEMAKER, set this value to DISABLED before upgrading. After upgrading, return this value back to SAGEMAKER.
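If you perform the update from the CLI rather than the CloudFormation console, flipping that parameter might look like the following sketch. This is illustrative only: the stack name is a placeholder, and when updating an existing stack every other parameter must also be carried over (typically with UsePreviousValue=true), which is elided here.

```
# Illustrative sketch -- <your-qnabot-stack> is a placeholder.
# The remaining stack parameters (elided) must each be passed
# with UsePreviousValue=true when updating an existing stack.
aws cloudformation update-stack \
  --stack-name <your-qnabot-stack> \
  --use-previous-template \
  --capabilities CAPABILITY_NAMED_IAM \
  --parameters ParameterKey=LLMApi,ParameterValue=DISABLED
```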
The team strongly recommends that any upgrades (especially between minor/major versions) first be tested on a non-production instance to check for any regressions. This is critical if you have made custom modifications to your deployment, integrate with external services, or are jumping between multiple versions.
Some additional precautions you can take are:
- export all of your questions using the Content Designer UI (instructions)
- export all of your settings using the Content Designer UI (click Export Settings at the bottom of the settings page)
- back up the DynamoDB table (instructions)
- create a manual snapshot of your OpenSearch domain (instructions)
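As an illustration of the DynamoDB step, an on-demand backup can also be taken from the CLI. The table name below is a placeholder; your deployment's actual table name can be found in the stack's resources in the CloudFormation console.

```
# Sketch: take an on-demand backup of the QnABot DynamoDB table.
# <your-qnabot-table> is a placeholder for your deployment's table name.
aws dynamodb create-backup \
  --table-name <your-qnabot-table> \
  --backup-name qnabot-pre-upgrade
```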
This solution collects anonymous operational metrics to help AWS improve the quality and features of the solution. For more information, including how to disable this capability, please see the implementation guide.
Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.