According to Forrester Research, “globally, 57% of companies either use chatbots already, or plan to do so in the coming year.” That’s incredible! Experts predict that the use of chatbots will only explode in the near future.
Think about it. Chatbots have been around for quite a long time. So why the sudden surge of interest in them now? There are various reasons. Unlike in earlier days, many AI and NLP capabilities are now available as consumable services. Also, serverless technologies make chatbots easier to build and scale.
But wait a minute — what’s this buzzword “serverless” and what’s it all about?
In layman's terms, serverless allows us to upload code into the cloud (in the form of functions) and execute it when needed, without worrying about the underlying servers. In technical terms, serverless functions are event-driven, are short-lived, and can scale as needed. These very characteristics make serverless functions well suited to developing chatbots.
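To make "event-driven" concrete, here is a minimal sketch of a serverless function in the AWS Lambda style. The handler name, event shape, and greeting logic are illustrative; the point is that the platform invokes the function per event and the function holds no state of its own.

```python
# A minimal sketch of a serverless function in the AWS Lambda style.
# The runtime invokes handler(event, context) once per incoming event;
# the function is short-lived and keeps no state between invocations.
def handler(event, context):
    # 'event' carries the trigger payload, e.g. a chat message
    name = event.get("name", "there")
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# Locally, we can simulate an invocation (no servers involved):
print(handler({"name": "MovieBot user"}, None))
```

In the cloud, the same function would be wired to a trigger (an API Gateway request, a chat message, and so on) instead of being called directly.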
A chatbot is a program that mimics human conversations (as text or speech) using artificial intelligence techniques such as natural language processing (NLP).
Say you want to build a program for booking movie tickets (i.e. a MovieBot). A user can ask about movies, book tickets, or cancel them by chatting with the program. For example, you can ask, “Is Annabelle playing in Gopalan mall in Bangalore tonight?” And the program should respond with a “yes” or “no” and then allow you to choose a number of tickets, show time, seats, and then actually book it.
Building such a MovieBot requires three elements:
A chat interface channel (like Skype or Facebook Messenger).
A natural language processor to understand user intent (examples: “book a ticket,” “ticket availability,” and “cancellation”).
Access to a backend where the transactions and data related to movies are stored.
The chat interface channels are universal and can be used for different kinds of bots. NLP can be implemented using technologies like Amazon Lex or IBM Watson. The question is, how is the backend served? Would you set up a dedicated server (or a cluster of servers) and an API gateway, deploy load balancers, and put identity and access control mechanisms in place? That’s costly, painful, and time-consuming! That’s where serverless technology can help.
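The way the three elements fit together can be sketched as follows. The intent names, slot fields, and backend functions below are hypothetical stand-ins: in a real bot, the NLP layer (e.g. Amazon Lex) would resolve the user's text into an intent plus slot values, and the backend functions would query a real datastore.

```python
# Illustrative sketch of how MovieBot's pieces fit together.
# Intent names and backend functions are hypothetical.

def check_availability(slots):
    # In a real bot this would query the movie backend/database.
    return f"Yes, {slots['movie']} is playing at {slots['theatre']} tonight."

def book_ticket(slots):
    return f"Booked {slots['count']} ticket(s) for {slots['movie']}."

# The NLP layer resolves the user's text into an intent plus slots;
# the bot then dispatches the request to the matching backend action.
INTENT_HANDLERS = {
    "TicketAvailability": check_availability,
    "BookTicket": book_ticket,
}

def handle_intent(intent, slots):
    return INTENT_HANDLERS[intent](slots)

print(handle_intent("TicketAvailability",
                    {"movie": "Annabelle", "theatre": "Gopalan mall"}))
```

Each of these dispatch targets is a natural fit for a serverless function, which is exactly the backend question the next sections address.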
In the rest of this article, we discuss why serverless is well suited for chatbots, as well as the challenges in using serverless to develop them.
Serverless and Chatbots
There are many advantages to using the serverless approach for building a chatbot. Let’s take a look at a few of them.
Automatic Scaling
Serverless functions automatically scale with the rate of incoming requests, without needing any configuration. There is practically no limit to the number of requests your code can handle, which makes serverless a strong contender for chatbots. For example, if a blockbuster like “Spiderman” is released, there could be a huge surge of users accessing the MovieBot at the same time, and the setup would scale effortlessly (you have to pay for the calls, though!).
Pay-Per-Use Pricing
There are no upfront costs to using serverless. We are billed only for the number of calls and the amount of time our code runs (there are no charges for idle time). This is where serverless functions have an edge over the legacy server approach: if no users are engaging with the chatbot, you don’t pay anything! The chatbot is still available (though it may be idle), and the serverless function gets triggered if and when a user uses the bot.
In other words, serverless proves effective not just when there are spikes in workload but also when the bot is left idle for a long time.
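A quick back-of-the-envelope calculation shows how modest the bill can be. The rates below are AWS Lambda's published 2017 prices ($0.20 per million requests and $0.00001667 per GB-second) and ignore the free tier; check current pricing before relying on these numbers.

```python
# Back-of-the-envelope Lambda cost estimate (2017 public rates,
# free tier ignored; check current AWS pricing).
PRICE_PER_REQUEST = 0.20 / 1_000_000   # USD per invocation
PRICE_PER_GB_SECOND = 0.00001667       # USD per GB-second of compute

def monthly_cost(requests, avg_ms, memory_mb):
    # Compute charge = requests * duration (s) * memory (GB)
    gb_seconds = requests * (avg_ms / 1000) * (memory_mb / 1024)
    return requests * PRICE_PER_REQUEST + gb_seconds * PRICE_PER_GB_SECOND

# 1M chatbot requests/month, 200 ms each, at 128 MB of memory:
print(f"${monthly_cost(1_000_000, 200, 128):.2f}")  # → $0.62
```

Under a minute's worth of compute cost for a million chatbot turns, and zero for any month the bot sits idle.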
Easy Integration With Other Services
Going serverless allows a seamless integration to various other cloud services from the same provider. For example, if you are using the AWS platform for chatbots, then you can use DynamoDB for the database, write programming logic as Lambda functions, and expose them through the API Gateway.
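As a hedged sketch of that integration, here is a Lambda-style handler that records a booking in DynamoDB via boto3 (which is preinstalled in the AWS Lambda Python runtime). The table name `MovieBookings`, the item schema, and the `save_booking` helper are all hypothetical; the core logic takes the table as a parameter so it can be exercised without AWS.

```python
def save_booking(event, table):
    # Persist one booking item. 'table' is a DynamoDB Table resource,
    # or anything with a compatible put_item method, which keeps the
    # core logic testable without AWS credentials.
    table.put_item(Item={
        "booking_id": event["booking_id"],
        "movie": event["movie"],
        "tickets": event["tickets"],
    })
    return {"statusCode": 200, "body": "Booking saved"}

def handler(event, context):
    # Invoked via API Gateway; no servers or VMs to provision.
    import boto3  # preinstalled in the AWS Lambda Python runtime
    table = boto3.resource("dynamodb").Table("MovieBookings")
    return save_booking(event, table)
```

Deploying this amounts to uploading the function and pointing an API Gateway route at it; the database, compute, and HTTP frontend all come from the same provider.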
This entire setup does not require you to provision any infrastructure or have any knowledge about underlying servers/VMs in the cloud.
Challenges in Going Serverless
Though the term serverless seems really exciting, and the cost advantages and scaling of serverless functions are very attractive, there are several challenges and drawbacks to deal with in real-world applications. A few key challenges are the following.
Debugging and Monitoring
Let’s assume that the MovieBot provides an inconsistent response or doesn’t understand the user’s intent; it might even break. Now, the questions are: How can we deal with situations like these? How can we debug the MovieBot code when it is running remotely? One possible way is to use extensive logs. You can insert instrumentation into your code to help you validate that it is working as expected. AWS Lambda automatically integrates with Amazon CloudWatch Logs and pushes all logs from your code to CloudWatch.
For debugging the MovieBot, we need to include a lot of details — such as the NLP scores, the dialog responses, and the query results of the movie ticket database. Then, we have to manually analyze and do detective work to find out what could have gone wrong. And, that is painful. It is also difficult to know if and when a lambda function is not working properly or misbehaving. Hence, monitoring is also a significant challenge.
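Such instrumentation might look like the following sketch, where the logger name, event fields, and responses are illustrative. Anything written through the standard `logging` module from a Lambda handler ends up in CloudWatch Logs.

```python
# Sketch of instrumenting a handler so each step is visible in logs.
# Logger name, event fields, and responses are illustrative.
import logging

logger = logging.getLogger("moviebot")
logger.setLevel(logging.INFO)

def handler(event, context):
    logger.info("incoming event: %s", event)
    intent = event.get("intent", "Unknown")
    # Record the NLP result so confidence scores can be inspected later
    logger.info("resolved intent=%s confidence=%s",
                intent, event.get("confidence"))
    if intent == "Unknown":
        logger.warning("could not understand user input: %s",
                       event.get("text"))
        return {"response": "Sorry, I didn't get that."}
    return {"response": f"Handling {intent}..."}
```

Even with logs this rich, correlating them across many short-lived invocations remains the detective work described above.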
Maintaining Conversation State
Any conversation represents a sequence of chats. It is important for the MovieBot to understand and remember the flow of the entire conversation. For example, if for the query, “Is Annabelle playing in Gopalan mall in Bangalore tonight?” the answer from MovieBot is “yes,” then the next query from the user could be, “Are two tickets available?” If the MovieBot confirms availability, then the user could say, “Okay, book it.” For this transaction to work, the MovieBot should remember key details of the entire dialog, including the name of the movie, the theatre location, the city, and the number of tickets to book. This entire dialog represents a sequence of stateless function calls, so we need to retain the details somewhere for the final transaction to succeed. Maintaining state external to the functions is a tedious task, but a necessary one.
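The pattern can be sketched as below, where an in-memory dict stands in for an external store such as a DynamoDB table or Lex session attributes; the session key and field names are illustrative.

```python
# Sketch of keeping dialog state outside the stateless functions.
# A dict stands in for an external store (e.g. DynamoDB or Lex
# session attributes); keys and fields are illustrative.
SESSION_STORE = {}

def handle_turn(session_id, updates):
    # Merge this turn's details into the saved conversation state.
    state = SESSION_STORE.get(session_id, {})
    state.update(updates)
    SESSION_STORE[session_id] = state
    return state

# Each turn is a separate stateless call, but state accumulates:
handle_turn("u1", {"movie": "Annabelle", "theatre": "Gopalan mall"})
handle_turn("u1", {"tickets": 2})
final = handle_turn("u1", {"confirmed": True})
print(final)
```

By the final “Okay, book it” turn, the accumulated state carries everything the booking transaction needs.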
Managing Dependencies
Each serverless function typically has third-party library dependencies. When deploying the function, we need to deploy those dependency packages as well, which increases the deployment package size. Further, maintaining and versioning all the dependent packages is a practical challenge, and it becomes even more effort-intensive as deployment packages grow larger. Because containers are used underneath to execute serverless functions, the increased deployment size also increases the latency to start up and execute the functions.
Chatbots are becoming wildly popular. And serverless is all about creating solutions without thinking or worrying about servers; think of it as, “just put your code in the cloud and run it!” Serverless is a game-changer because it shifts the way you look at how applications are composed, written, deployed, and scaled. Serverless is a natural choice for implementing chatbots. You can choose serverless technologies from different providers and use tools/frameworks for developing chatbots. If you are interested in developing a chatbot, check out this detailed presentation: Chatbots With AWS Lambda: Step-by-Step. Happy botting!
Learning More About Serverless and Chatbots
Want to explore more about it? Check out these resources:
Serverless Architectures on AWS by Peter Sbarski
Serverless Architectures by Martin Fowler
The Complete Beginner’s Guide to Chatbots by Matt Schlicht
How Bots Will Completely Kill Websites and Mobile Apps by Matt Schlicht
It’s All Going to Be Serverless — the Question Is, “When?” by Jouni Heikniemi
These are the best of times, and these are the worst of times! There are so many awesome new technologies to catch up on, but we simply can’t keep up with them all. We have seen a progression of computing models, from virtualisation and IaaS to PaaS, containers, and now serverless, all in a matter of a few years. You certainly don’t want to be left behind. So join us at the Serverless Summit, India’s first confluence on serverless technologies, being held on October 27, 2017 in Bengaluru. It is the best place to hear from industry experts, network with technology enthusiasts, and learn how to adopt serverless architecture. The keynote speaker is John Willis, director of ecosystem development at Docker and a DevOps guru (widely known for co-authoring ‘The DevOps Handbook’). For more details, please visit the website www.inserverless.com.