Diving Into Cognitive Programming With IBM Watson and the Mule Platform


Learn how one company implemented Watson's AlchemyLanguage and Visual Recognition API services by using the Mule platform.


Watson is a service from IBM for cognitive programming. What makes Watson so remarkable is that it processes data more like a person than like a conventional computer. This allows Watson to extract valuable information from unstructured data such as text, images, and speech.

Watson offers several services for working with different types of data. At Admios, we built connectors for two of them, AlchemyLanguage and Visual Recognition:

  • AlchemyLanguage is a collection of APIs that offer text analysis through natural language processing. It can analyze text and help you understand its tone, keywords, entities, high-level concepts, and more.
  • Visual Recognition is an API that provides image analysis by recognizing scenes, objects, faces, and text.
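As a concrete illustration of the kind of call AlchemyLanguage serviced, here is a minimal sketch of assembling a sentiment-analysis request. The endpoint URL and parameter names below are assumptions based on the AlchemyAPI-era REST interface, not details taken from the article:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class AlchemySentimentRequest {

    // Hypothetical base URL; the real endpoint depends on your IBM account setup.
    static final String BASE_URL =
            "https://gateway-a.watsonplatform.net/calls/text/TextGetTextSentiment";

    // Build the query string for a sentiment-analysis call.
    static String buildRequestUrl(String apiKey, String text) {
        try {
            return BASE_URL
                    + "?apikey=" + URLEncoder.encode(apiKey, "UTF-8")
                    + "&outputMode=json"
                    + "&text=" + URLEncoder.encode(text, "UTF-8");
        } catch (UnsupportedEncodingException e) {
            throw new IllegalStateException("UTF-8 is always supported", e);
        }
    }

    public static void main(String[] args) {
        System.out.println(buildRequestUrl("my-api-key", "MuleSoft connectors are great"));
    }
}
```

Sending the request and parsing the JSON response are left out so the sketch stays focused on the shape of the API.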

Taking the Leap

The main goal for both of these connectors was to gain real experience using the Mule platform. While we could have used any service, no one on our team had worked with Watson before, so we figured, why not use this as an opportunity to learn a cool new technology in an emerging field like cognitive programming?

It sounds more complex than it really is. Sure, Watson performs complex operations and implements sophisticated algorithms — but you don't need to know any of that to use it. IBM keeps it simple: all you interact with is a standard, well-maintained REST API.

Our first step was to download and learn about the Java SDK for Watson, which is open source. Once we felt comfortable with the SDK, we implemented a wrapper around it that exposed its operations and classes to the Mule ecosystem.
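The wrapper idea can be sketched in a few lines. The `SentimentService` interface below is a hypothetical stand-in for a Watson SDK type (stubbed out so the example is self-contained), and the class names are illustrative, not the connector's real ones:

```java
// Hypothetical stand-in for a Watson SDK service type.
interface SentimentService {
    String getSentiment(String text);
}

// Thin wrapper that exposes one SDK operation to the Mule side.
// In the real connector, this method would carry DevKit annotations.
public class WatsonWrapper {
    private final SentimentService service;

    public WatsonWrapper(SentimentService service) {
        this.service = service;
    }

    // One connector operation delegating straight to the SDK.
    public String analyzeSentiment(String text) {
        if (text == null || text.isEmpty()) {
            throw new IllegalArgumentException("text must not be empty");
        }
        return service.getSentiment(text);
    }

    public static void main(String[] args) {
        WatsonWrapper wrapper = new WatsonWrapper(t -> "positive"); // stubbed SDK
        System.out.println(wrapper.analyzeSentiment("Great product!"));
    }
}
```

The wrapper adds only input validation and Mule-facing naming; the actual analysis stays inside the SDK.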

Building the Connector

We used DevKit 3.8, the official SDK for building custom connectors. It let us define the operations available inside Anypoint Studio, along with their configuration. We followed a set of guidelines and best practices when defining the request classes so that they contained every variable we wanted to pass to the connector, while Studio could still present them in a user-friendly way.

The internal architecture was also pretty straightforward. You need at least one class that defines the connector and another that defines the configuration. Beyond that, we broke our logic into several handlers that implemented the operations declared in the connector class and internally used the Watson SDK.
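Stripped of the DevKit annotations (omitted so the sketch stays self-contained and compilable), the shape described above looks roughly like this; every class name is illustrative, not taken from the real connector:

```java
// Configuration class: holds the credentials the connector needs.
class WatsonConfig {
    private final String apiKey;
    WatsonConfig(String apiKey) { this.apiKey = apiKey; }
    String getApiKey() { return apiKey; }
}

// Handler: owns the logic for one group of operations and would
// internally call the Watson SDK (stubbed here as plain string building).
class LanguageHandler {
    private final WatsonConfig config;
    LanguageHandler(WatsonConfig config) { this.config = config; }

    String keywords(String text) {
        // Real code would call the SDK, authenticated with config.getApiKey().
        return "keywords(" + text + ")";
    }
}

// Connector class: the operations Anypoint Studio exposes.
public class WatsonConnector {
    private final LanguageHandler language;

    public WatsonConnector(WatsonConfig config) {
        this.language = new LanguageHandler(config);
    }

    public String extractKeywords(String text) {
        return language.keywords(text);
    }

    public static void main(String[] args) {
        WatsonConnector connector = new WatsonConnector(new WatsonConfig("demo-key"));
        System.out.println(connector.extractKeywords("Watson on Mule"));
    }
}
```

Keeping the connector class thin and pushing logic into handlers made each piece easy to test in isolation.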


In addition to the implementation, if you want to certify your connector, you need to provide a test for each operation. DevKit makes it easy to spin up an embedded server and write functional or integration tests.

If you’re planning on making a connector for MuleSoft, then I highly recommend you certify your connector — or, as we did in our case, make a small test connector using a service that you wish to learn. The certification team will ask you to apply a variety of good practices and will help you learn the right way to work inside the Mule ecosystem so that you can successfully implement a connector for your company in no time.


Published at DZone with permission of Gianluigi Pierini , DZone MVB. See the original article here.

