Choosing a Chatbot: From Hubot to Yetibot, What You Need to Know
If your third-party integrations aren't available, aren't flexible enough to work with your unique situation, or are missing core functionality, it's time for a chatbot.
From firing off an API call to resetting a server, chatbots are a way to trigger a set of automations using chat functionality. Let’s get started!
It’s time for a chatbot when third-party integrations are:
- Not available.
- Not flexible enough to work with your unique situation.
- Not feature rich or missing core functionality your team needs.
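At its core, the kind of chatbot this article covers is a dispatcher that maps chat messages to automation functions. Here is a minimal sketch of that idea in Python; the command names and the `restart_server` placeholder are illustrative, not the API of any particular bot:

```python
# A chatbot, stripped to its essentials: parse a chat message and
# dispatch it to an automation function.

def restart_server(name: str) -> str:
    # Placeholder for a real automation (an SSH call, an API request, etc.).
    return f"restarting {name}"

# Registry mapping command verbs to handlers.
COMMANDS = {"restart": restart_server}

def handle_message(text: str) -> str:
    """Parse a message like 'restart web-01' and run the matching automation."""
    verb, _, arg = text.partition(" ")
    handler = COMMANDS.get(verb)
    if handler is None:
        return f"unknown command: {verb}"
    return handler(arg)
```

Everything else a bot framework provides (chat-service adapters, script loading, permissions) is layered on top of a loop like this.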
There are several well-known chatbots available with more and more popping up all of the time. However, much of the media attention and conversations online about chatbots are more focused on the business-to-consumer (B2C) style of chatbot development.
These are not the chatbots we’re talking about. For our purposes, we are primarily focused on chatbots that help our teams manage their day-to-day work lives rather than facilitating interaction with consumers.
GitHub originally built Hubot for internal use, then rebuilt it from the ground up and open-sourced the project, making it available to the public and allowing others to contribute to its ongoing development. Because Hubot has been around the longest, it has a larger number of contributors to its core application, and a longer (and still growing) list of scripts for managing interactions with third-party services and infrastructure, than any other chatbot available today.
As ChatOps has gained in popularity, a chatbot written in Ruby named Lita has caught the attention of teams around the world. Definition files and instructions are written in Ruby in the form of modules that allow much of the same functionality that Hubot provides. With a strong and growing community consistently contributing to the source code and modules, this chatbot has become very popular and easy to implement.
Errbot is an open-source project written in Python. It has a number of advantages over both Lita and Hubot, the most notable one being that the application does not need to be restarted each time a new script is added to the library. Errbot is able to recognize new Python scripts and begin running them on behalf of users as soon as they have been placed in the correct directory. Following its nearly complete redesign a year ago, Errbot has a gentle learning curve and intriguing features to consider. These include:
- Ability to “pipe” commands, automation, and conversations.
- Ability to manage the bot through chat.
- Built-in ACLs for security.
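Errbot's most notable feature above is picking up new scripts without a restart. The sketch below illustrates the general idea behind that behavior, loading every Python file found in a scripts directory at runtime; the function and directory names are illustrative and are not Errbot's actual internals:

```python
# Sketch: dynamically load automation scripts from a directory, so new
# scripts can be added without restarting the bot process.
import importlib.util
from pathlib import Path

def load_scripts(directory: str) -> dict:
    """Load every .py file in `directory`; return {module_name: module}."""
    modules = {}
    for path in Path(directory).glob("*.py"):
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        modules[path.stem] = module
    return modules
```

A bot built this way can simply re-scan the directory on a timer (or watch it for changes) and begin serving commands from any newly dropped-in script.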
Cog is another new player in the ChatOps space, but it’s much more than simply a bot. Cog is engineered to be more of a framework that addresses a number of concerns many teams have, such as security. With built-in access control and audit logging functionality, Cog allows teams to collaborate on sensitive tasks with higher confidence. Taking inspiration from the command-line interface, Cog has a “pipe” operator that allows users to run a command and use that output as the input for another command in a process.
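The "pipe" operator Cog borrows from the command line can be sketched in a few lines: each command's output becomes the next command's input. The example commands below (`list_servers`, `only_web`) are made up for illustration:

```python
# Sketch of shell-style piping: thread the output of one command
# into the input of the next.
from functools import reduce

def run_pipeline(commands, text=""):
    """Apply each command in order, passing the output along the chain."""
    return reduce(lambda data, cmd: cmd(data), commands, text)

def list_servers(_):
    # Placeholder for a real inventory lookup.
    return "web-01\nweb-02\ndb-01"

def only_web(data):
    # Filter the previous command's output, like `| grep web`.
    return "\n".join(line for line in data.splitlines() if line.startswith("web"))
```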
Recently, a new chatbot named Yetibot has begun to attract attention.
A self-proclaimed “communal command line,” Yetibot is written in Clojure. This chatbot has a few features that may pique the interest of technical teams looking to piece together strings of commands. Similar to Cog’s piping functionality, Yetibot allows users to chain together complex, flexible commands and to embed the output of one command in an outer command. Commands can be nested as many levels deep as you like.
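Embedding one command's output inside another can be sketched as recursive substitution: evaluate the innermost command first, splice its output into the enclosing command, and repeat. The `$(...)` syntax and the `echo`/`upper` commands below are illustrative, not Yetibot's actual grammar:

```python
# Sketch of nested command evaluation: innermost $(...) expressions are
# evaluated first and their output embedded in the outer command.
import re

COMMANDS = {
    "echo": lambda arg: arg,
    "upper": lambda arg: arg.upper(),
}

# Matches an innermost $(...) group (one containing no other parentheses).
INNERMOST = re.compile(r"\$\(([^()]*)\)")

def evaluate(text: str) -> str:
    """Expand nested commands from the inside out, then dispatch."""
    while INNERMOST.search(text):
        text = INNERMOST.sub(lambda m: evaluate(m.group(1)), text)
    verb, _, arg = text.partition(" ")
    return COMMANDS[verb](arg)
```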
When it comes to instructing your bot to take action on your behalf, the scripts, modules, or files that contain instructions will vary depending on your choice of bot.
- Hubot is instructed using CoffeeScript or JavaScript.
- Lita is instructed using Ruby.
- Errbot is instructed using Python.
- Cog is extensible in any language.
- Yetibot is instructed using Clojure files.
Bot- and Language-Agnostic Libraries
It can be tough coming to a consensus on which bot and (as a result) programming language to use, especially for larger organizations with many teams. But there is another approach.
Some make the strong argument that it is much better to build a library of automation scripts, written in any language, that people can access via not only a group chat tool, but also an API, command-line interface, or graphical user interface. Because there are multiple ways to interact with a library of scripts, teams can focus more on building small, decoupled automation scripts that can be peer reviewed, version controlled, and made available across the organization.
In fact, you can put in place fine-grained access control to ensure those who should have the ability to execute commands can and those that shouldn’t can’t. This adds a level of abstraction away from the complex inner workings of the automation scripts.
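Fine-grained access control in front of a shared script library can be as simple as mapping each command to the roles allowed to run it. The role and command names below are hypothetical:

```python
# Sketch of per-command access control for a shared automation library.
ACL = {
    "deploy": {"ops", "release-managers"},
    "status": {"ops", "developers", "release-managers"},
}

def can_execute(user_roles: set, command: str) -> bool:
    """Allow a command only if the user holds at least one permitted role."""
    allowed = ACL.get(command, set())
    return bool(user_roles & allowed)
```

Because the check sits in front of the script library rather than inside any one bot, the same policy applies whether the command arrives via chat, an API, or a CLI.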
What Are the Benefits of Building These Automation Scripts?
If you build these scripts, you can expose their actions to teams regardless of the programming language or chatbot in use. Teams are less likely to become dependent on any particular bot or language and can instead focus on building small yet powerful scripts in their languages of choice. Any chatbot mentioned here, or any other that may surface and evolve, will be able to execute these scripts no matter what language it’s developed in.
Syntax: Command Versus Natural Language
At present, all of the chatbots described here require a very specific syntax in order to execute commands. Much like when entering commands with variables, triggers, and flags on the command line, the bots will only execute your commands if they are typed or pasted into the chat client in a very specific way. A typing error, a missing variable, or an errant space will prevent the bot from executing anything. The upside is that this strictness acts as a basic layer of protection against accidental execution.
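That strictness is easy to see in code: a command either matches an exact pattern or the bot does nothing. The `!deploy` command and its grammar below are invented for illustration:

```python
# Sketch of strict command syntax: the bot acts only on an exact match,
# so typos and malformed input are silently ignored.
import re

DEPLOY = re.compile(r"^!deploy (?P<app>[\w-]+) to (?P<env>staging|production)$")

def parse_deploy(message: str):
    """Return (app, env) for a well-formed command, or None otherwise."""
    match = DEPLOY.match(message)
    if match is None:
        return None  # typo, extra space, or missing argument: do nothing
    return match.group("app"), match.group("env")
```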
With that said, some teams are experimenting with ways to create a more “human-like” interaction with their preferred chatbots. Natural language processing (NLP), and the development of APIs that allow more natural interactions with applications such as chatbots, is an area of technology that is gaining more and more attention. In fact, many teams are now exploring interactions with chatbots and services through voice-recognition assistants such as Amazon’s Alexa.
As ChatOps continues to evolve, the ability to use natural language processing with the chatbots to make the interactions more seamless and “human-like” will continue to improve. Operators will be able to interact with bots as though they are real-life members of the team; hello, Artificial Intelligence! Being able to immediately begin interacting with a chatbot, not knowing anything about the correct syntax, will open the door for more people to use chatbots in their ChatOps endeavors. Exciting possibilities are on the horizon.
Published at DZone with permission of Jason Hand, DZone MVB. See the original article here.
Opinions expressed by DZone contributors are their own.