Creating an Integrated Chat Bot for an AWS Serverless Web Application
This customizable, templateable bot makes it simple to add conversational help to an AWS serverless application.
We will discuss the capabilities of a few AWS components to build a simple serverless application with a customized bot integrated within the application.
We will be using Amazon Lex; the other components used in the architecture include:
- API Gateway
- Lambda
- DynamoDB
- CloudFormation
There are various inline AJAX REST invocations in the HTML pages that make the website responsive and orchestrate the responses, but those are beyond the scope of this post.
Let’s look at the high-level architecture. The website is a simple portal to add various tasks for a user incorporating the type, timeline, and complexity of the tasks. There is a help service inside the website which is based on some pre-configured instructions and fulfills the desired intent.
We have uploaded the user credentials into DynamoDB. When the user logs into the portal with valid credentials, they can create new tasks, which are saved in the database for further processing by the bot.
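As a minimal sketch of this login flow (the `Users` table name, the `username` key attribute, and the plain-text password check are assumptions for illustration, not taken from the article), the credential-checking Lambda might look like:

```python
import json


def parse_credentials(event):
    """Extract username/password from an API Gateway proxy event body."""
    body = json.loads(event.get("body") or "{}")
    return body.get("username"), body.get("password")


def lambda_handler(event, context):
    # boto3 is imported lazily so the parsing logic above stays unit-testable.
    import boto3

    username, password = parse_credentials(event)
    table = boto3.resource("dynamodb").Table("Users")  # assumed table name
    item = table.get_item(Key={"username": username}).get("Item")
    # NOTE: comparing plain-text passwords is only for illustration; a real
    # application should store salted hashes (or use Amazon Cognito).
    ok = item is not None and item.get("password") == password
    return {
        "statusCode": 200 if ok else 401,
        "body": json.dumps({"authenticated": ok}),
    }
```

In production you would also wire this Lambda to the API Gateway login endpoint and return a session token rather than a bare boolean.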
Serverless Web Application
Here we have a simple HTML page invoking the endpoint exposed by API Gateway. The data is validated and manipulated in Lambda, which connects to DynamoDB in the backend.
The above diagram shows the API connecting to Lambda, producing a simple but end-to-end serverless web application in which Lambda performs the basic CRUD operations against the database. On the second page of the application, the user can add tasks and see the previously added ones.
The user can enter all the details shown in the above diagram; these details are stored in DynamoDB across multiple tables, as directed by the logic implemented inside the Lambda functions.
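A sketch of the task-creation Lambda might look like the following (the `Tasks` table name and the `taskId` key are assumptions; the `type`, `timeline`, and `complexity` attributes mirror the portal's fields described above):

```python
import json
import uuid


def build_task_item(username, body):
    """Build a DynamoDB item for a new task from the request body."""
    return {
        "taskId": str(uuid.uuid4()),
        "username": username,
        "type": body.get("type"),
        "timeline": body.get("timeline"),
        "complexity": body.get("complexity"),
    }


def lambda_handler(event, context):
    # Lazy boto3 import keeps build_task_item unit-testable without AWS.
    import boto3

    body = json.loads(event.get("body") or "{}")
    item = build_task_item(body.get("username"), body)
    boto3.resource("dynamodb").Table("Tasks").put_item(Item=item)  # assumed table name
    return {"statusCode": 201, "body": json.dumps({"taskId": item["taskId"]})}
```

The same handler pattern (parse body, build item, `put_item`/`query`) covers the other CRUD endpoints as well.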
We have an embedded UI inside the web application which interacts with the REST API exposed by API Gateway (using a different endpoint this time) and creates a continuing conversation scenario.
On the right side of the application there is a simple interactive chat window that carries on a conversation with the end user and derives the final result after backend validation by Lambda and Lex.
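One way the chat endpoint can proxy user messages to the bot (a sketch; the bot name, alias, and fallback message are assumptions) is through the Lex runtime `PostText` API:

```python
import json


def extract_reply(lex_response):
    """Pull the user-facing message out of a PostText response."""
    return lex_response.get("message", "Sorry, I did not understand that.")


def lambda_handler(event, context):
    # Lazy import so extract_reply stays unit-testable without AWS.
    import boto3

    body = json.loads(event.get("body") or "{}")
    resp = boto3.client("lex-runtime").post_text(
        botName="TaskHelpBot",  # assumed bot name
        botAlias="prod",        # assumed alias
        userId=body.get("userId", "anonymous"),
        inputText=body.get("message", ""),
    )
    return {"statusCode": 200, "body": json.dumps({"reply": extract_reply(resp)})}
```

Passing a stable `userId` per browser session is what lets Lex keep the multi-turn conversation state between calls.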
We have designed Amazon Lex to sit behind a REST-based API that incorporates all the incoming values and constructs a complete JSON request, which is executed by Lambda to fetch the result from the backend.
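A minimal Lex fulfillment handler (the `CreateTask` intent and `TaskType` slot names here are hypothetical) inspects the event Lex sends and returns a `Close` dialog action once the slots have been collected:

```python
def close(message, state="Fulfilled"):
    """Build the Lex (V1) response that ends the conversation."""
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": state,
            "message": {"contentType": "PlainText", "content": message},
        }
    }


def lambda_handler(event, context):
    intent = event["currentIntent"]["name"]
    slots = event["currentIntent"].get("slots") or {}
    if intent == "CreateTask":  # hypothetical intent name
        # Real code would persist the collected slots to DynamoDB here.
        return close(f"Created a {slots.get('TaskType', 'new')} task for you.")
    return close("Sorry, I cannot help with that yet.", state="Failed")
```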
We also define the bot's intents and fulfillment inside the Lex console. There is also a separate API-based invocation that can be used to create additional intents and validations for the bot. More details are available in the official Lex documentation.
API Gateway is also one of the integral parts of the solution. We have separate endpoints, each with various methods (GET), which together cater to the whole solution.
In the below diagram, you can see the methods for each endpoint; each endpoint is connected to its own Lambda function, which performs a single independent task.
Finally, we are done!
We can also create a CloudFormation template to automate all of this configuration in a dedicated stack (not in scope for this post). The bot embedded inside the application can also be exposed through other messaging channels, such as Slack or SMS.
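Deploying such a stack can be scripted with boto3 as a sketch like this (the stack name, template path, and parameter names are assumptions):

```python
def stack_parameters(params):
    """Convert a plain dict into the Parameters list CreateStack expects."""
    return [{"ParameterKey": k, "ParameterValue": v} for k, v in params.items()]


def deploy_stack(stack_name, template_path, params):
    # Lazy import keeps stack_parameters unit-testable without AWS.
    import boto3

    with open(template_path) as f:
        template_body = f.read()
    cfn = boto3.client("cloudformation")
    cfn.create_stack(
        StackName=stack_name,
        TemplateBody=template_body,
        Parameters=stack_parameters(params),
        Capabilities=["CAPABILITY_IAM"],  # needed when the template creates IAM roles
    )
    # Block until the stack is fully created (or raises on failure).
    cfn.get_waiter("stack_create_complete").wait(StackName=stack_name)
```

For example, `deploy_stack("taskbot-stack", "template.yaml", {"TableName": "Tasks"})` would stand up the whole solution in one call.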
This Slack configuration is a one-time authentication setup that creates an app inside any Slack workspace.
Published at DZone with permission of Aritra Nag. See the original article here.