Using RESTful APIs and Microservices to Work With Db2
In this article, see an in-depth tutorial on how to use RESTful APIs and microservices to work with Db2.
The new Db2 Data Management Console is a free browser-based user interface included with Db2 for Linux, UNIX, and Windows. It is more than a graphical user interface for monitoring, managing, running, and optimizing Db2: it is also a set of open RESTful APIs and microservices for Db2.
Anything you can do in the user interface is also available through REST. You can also embed parts of the user interface into your own webpages, or Jupyter notebooks.
This article demonstrates how to use the open RESTful APIs and the composable user interfaces that are available in the Db2 Console service.
To learn more about the Db2 Console, check out the community at www.ibm.biz/Db2Console. There you can download the Db2 Console and find more coding examples, best practices, and access to a free hands-on coding lab.
Constructing API Calls
To access the service, we first authenticate with it and create a reusable token that we include in each subsequent call. The token lets us avoid sending a user ID and password with every command while keeping the communication secure.
Each request is constructed of several parts. First, you need to identify the URL of the service. For example:
http://localhost:11080. In this example, we assume that the console service is running on the same machine as the Jupyter notebook, but it can be remote. Port 11080 is typical for an unsecured connection (port 11081 for HTTPS).
You then need to identify the API path and the API version. In this case, the service version is v4.
The next part of the call identifies the REST request and its options. For example, '/metrics/applications/connections/current/list' names the request. Options follow a '?' and are separated by an ampersand (&).
So a complete call combines the host URL, the API path and version, the request, and its options.
In this case, the options are the start and end time, whether to include system-generated connections, how to sort the results, and where to start the list.
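Assembling those parts in Python might look like the following sketch. The host, the /dbapi/v4 service path, and the option names here are illustrative assumptions; check the exact names in the API documentation in the Db2 Console help menu.

```python
from urllib.parse import urlencode

# Assemble a complete API call from its parts. The option names
# (start, end, include_sys, sort, offset) are illustrative assumptions
# standing in for the real parameter names.
host = 'http://localhost:11080'
api = '/dbapi/v4'
request = '/metrics/applications/connections/current/list'
options = {'start': 1579600000000,         # start time (epoch ms)
           'end': 1579603600000,           # end time (epoch ms)
           'include_sys': 'false',         # skip system-generated connections
           'sort': '-application_handle',  # sort order for the results
           'offset': 0}                    # where to start the list
url = host + api + request + '?' + urlencode(options)
print(url)
```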
Some complex requests also include a JSON payload. For example, running SQL includes a JSON object that identifies the script, statement delimiters, the maximum number of rows in the results set as well as what to do if a statement fails.
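As a sketch, such a payload might be shaped like this; every field name below is an illustrative assumption, not the documented schema:

```python
import json

# Hypothetical sketch of the JSON payload for running a SQL script.
# Field names (commands, limit, separator, stop_on_error) are assumptions;
# consult the API documentation in the Db2 Console help menu.
payload = {
    'commands': 'SELECT * FROM SYSCAT.TABLES; SELECT * FROM SYSCAT.SCHEMATA;',
    'limit': 10,            # maximum number of rows in each result set
    'separator': ';',       # statement delimiter
    'stop_on_error': 'no'   # keep going if a statement fails
}
print(json.dumps(payload, indent=2))
```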
The full set of APIs is documented as part of the Db2 Data Management Console user interface in the help menu.
You can also use the developer tools in your favorite browser to see the API traffic between the Db2 Console user interface and the Db2 Console API service.
Simple cURL Example
cURL is a command-line tool for getting or sending data including files using URL syntax. The name stands for "Client URL". It is particularly useful to write simple scripts to access the RESTful APIs of the Db2 Console. In this example, we include cURL calls in a BASH script. The script constructs the URLs that contain the RESTful calls, then submits them to the Db2 Console's 11080 communication port.
The jq command-line tool used in these examples makes parsing JSON a snap.
The first call establishes a connection and retrieves a reusable token that is used to securely submit additional requests.
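A minimal sketch of that first call; the host, the credentials, and the /dbapi/v4/auth/tokens path are assumptions to adjust for your installation:

```shell
#!/bin/bash
# Hypothetical sketch: obtain a reusable token from the Db2 Console service.
# HOST, the credentials, and the /dbapi/v4/auth/tokens path are assumptions.
HOST="http://localhost:11080"
RESPONSE=$(curl -s -X POST "$HOST/dbapi/v4/auth/tokens" \
  -H "Content-Type: application/json" \
  -d '{"userid": "db2inst1", "password": "mypassword"}' || true)
# jq pulls the token field out of the JSON response
TOKEN=$(echo "$RESPONSE" | jq -r '.token')
# Later calls pass the token in the Authorization header, for example:
# curl -s -H "Authorization: Bearer $TOKEN" "$HOST/dbapi/v4/schemas"
```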
Once we have a reusable token, we can start to retrieve data from the service. In this first example, we retrieve information about the database connection we are using:
The Db2 Console saves the information to connect to a Db2 database in a connection profile. Here is what that looks like in the JSON that is returned:
You can get more than just setup and configuration information. The next example returns a list of the schemas available in the database:
Here is the result set:
Python and Jupyter Notebooks and the Db2 Console
Once you start getting into more complex projects, you will need a more powerful language and a more powerful environment. I found that Python and Jupyter notebooks are a perfect way to build some very powerful routines to automate your work with Db2 through the Db2 Console.
For database administrators who rely on scripts today, there are several advantages beyond access to a powerful language like Python. Jupyter notebooks are part of a simple but powerful development environment that lets you build and share your notebooks. Notebooks can combine code, microservices, documentation, and data analysis into a single document.
Let's start with a few Python routines that will get you started. (All of the examples here are available through GitHub with links at the end of the article.)
To get started, you need to include a few standard Python libraries so that we can work with REST, JSON, and communicate with the Db2 Console APIs.
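Under typical setups, those imports look something like this; requests is the usual third-party choice for HTTP, though any HTTP client works:

```python
# Libraries for JSON handling and REST communication with the
# Db2 Console APIs. requests is third-party (pip install requests).
import json
import requests
```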
The Db2 Class
To make life easier, we created a Db2 helper class that encapsulates the REST API calls that access the Db2 Console service. The class below includes retrieving the reusable token as well as the basic structures to POST and GET requests and extract JSON from the API responses.
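A minimal sketch of such a helper class follows. The /dbapi/v4 path and the /auth/tokens endpoint name are assumptions based on the calls described in this article; the article's full class on GitHub is more complete.

```python
import requests

class Db2Console:
    """Minimal sketch of a helper class for the Db2 Console REST service.

    The /dbapi/v4 path and /auth/tokens endpoint are assumptions; check
    the API documentation in the Db2 Console help menu."""

    def __init__(self, url, api='/dbapi/v4'):
        self.url = url + api  # e.g. 'http://localhost:11080/dbapi/v4'
        self.token = None

    def authenticate(self, userid, password):
        # POST credentials once and keep the reusable token for later calls.
        r = requests.post(self.url + '/auth/tokens',
                          json={'userid': userid, 'password': password})
        if r.status_code == 200:
            self.token = r.json()['token']
        return r.status_code

    def headers(self):
        # Every subsequent request carries the token in an Authorization header.
        return {'Authorization': 'Bearer ' + self.token,
                'Content-Type': 'application/json'}

    def get(self, api, json=None):
        return requests.get(self.url + api, headers=self.headers(), json=json)

    def post(self, api, json=None):
        return requests.post(self.url + api, headers=self.headers(), json=json)
```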
To connect to the Db2 Data Management Console service, you need to provide the URL, the service name (v4), the Db2 Console username and password, and the name of the connection profile used by the Db2 Console to connect to the database you want to work with.
To confirm that your connection is working, get the details of the specific database connection you are working with. Since your console user id and password may be limited as to which databases they can access, you need to provide the connection profile name to drill down on any detailed information for the database.
Take a look at the JSON that is returned by the call in the cell below. (This is similar to what you did with cURL.) You can see the name of the connection profile, the database name, the database instance the database belongs to, the version, release, and edition of Db2 as well as the operating system it is running on.
You can also check the status of the monitoring service. This call takes a bit longer since it is running a quick diagnostic check on the Db2 Data Management Console monitoring service. You should see that both the database and authentication services are online.
The console lets you work with objects in the Database. Here are just two examples of how to access that through REST and the composable interface.
You can call the Db2 Data Management Console microservice to provide an active console component that you can include in an IFrame directly into your notebook. The first time you access this, you will have to log in just like any other time you use the console in a new browser session. If you want to see all the schemas including the catalog schemas, select the "Show system schemas" toggle at the right side of the panel.
In the schema explorer microservice interface, click on Show system schemas at the right side of the screen. This displays all the schemas in the Db2 catalog as well as user schemas.
You can get the same list through the RESTful service call, which we walk through in the next example.
Many of the examples below use the Pandas DataFrames library. JSON data sets can be much easier to work with if you can convert them to a DataFrame. To use them, you need to import the required libraries.
DataFrames are a powerful way to represent data in Python as an in-memory table. The library has many functions that can manipulate the data in the frame.
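The required imports, plus a quick illustration of DataFrame manipulation (the sample data here is invented for the example):

```python
import pandas as pd
from pandas import json_normalize  # flattens nested JSON into tabular form

# A small illustrative DataFrame; the schema names and sizes are invented.
df = pd.DataFrame({'name': ['SYSCAT', 'DB2INST1'], 'size_mb': [120, 45]})

# DataFrames behave like in-memory tables: sort, filter, and slice
# with one-line expressions.
largest_first = df.sort_values('size_mb', ascending=False)
print(largest_first.head())
```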
Listing Schemas Through the API
This next example calls the /schemas API. If the call is successful, it returns a 200 status code along with a JSON structure that contains the list of schemas. The code extracts the 'resources' section of the JSON result and passes it to the json_normalize function to create a DataFrame. A single call then displays the names of the first 10 schemas.
display(df[['name']].head(10)) is a quick way to display the first 10 values in the DataFrame's name column.
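Putting that flow together on a canned response (the 'resources' key and the field names below are illustrative of the shape the API returns, not the documented schema):

```python
from pandas import json_normalize

# Canned stand-in for the JSON a successful /schemas call returns;
# the 'resources' key and field names are illustrative assumptions.
response_json = {
    'resources': [
        {'name': 'SYSCAT',    'definer': 'SYSIBM'},
        {'name': 'SYSIBMADM', 'definer': 'SYSIBM'},
        {'name': 'DB2INST1',  'definer': 'DB2INST1'},
    ]
}
df = json_normalize(response_json['resources'])
print(df[['name']].head(10))  # first 10 schema names
```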
You can search the objects in your database through the /admin API. This API requires a JSON payload to define the search criteria. In this example, we are looking for views with "table" in their name. The search covers both user and catalog views.
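The search payload might be shaped like this; every field name here is an illustrative assumption for the sketch:

```python
# Hypothetical search criteria for the /admin API: find views whose
# names contain "table", in both user and catalog views.
# All field names below are illustrative assumptions.
search_payload = {
    'search_name': 'table',   # substring to match in the object name
    'obj_type': 'view',       # restrict the search to views
    'show_systems': 'true',   # include catalog (system) views
    'rows_return': 100        # cap on the number of matches returned
}
# r = console.post('/admin/searches', json=search_payload)  # hypothetical call
```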
Tables in a Schema
This example uses the /schemas API to return all the tables in a single schema.
You can access key high-level performance metrics by including the monitoring summary page directly in an IFrame or by calling the /metrics API. To see the time-series history of the number of rows read in your system, run the statement below, then scroll to the right side and find the Database Throughput widget. Then select Rows Read and Last 1 hour.
To access the same data directly through an API, you can use the /metrics/rows_read API call. To extract the time-series data from the JSON returned by the API, you need to access the 'timeseries' part of the full JSON data set.
The example below retrieves the last hour of data and converts it to a Pandas DataFrame for easy manipulation. It prints out the names of the columns available in the DataFrame.
Epoch Time Conversion
The time-series data is returned in UNIX epoch time: the number of milliseconds since January 1, 1970. Notice that the start and end times in the REST request are also specified in epoch time.
To make the time-series data readable, we need a routine that converts an epoch value to a human-readable timestamp.
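A sketch of such a conversion routine:

```python
from datetime import datetime, timezone

def epoch_to_timestamp(epoch_ms):
    """Convert epoch milliseconds to a human-readable UTC timestamp."""
    return datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc) \
                   .strftime('%Y-%m-%d %H:%M:%S')

print(epoch_to_timestamp(0))  # 1970-01-01 00:00:00
```

Applying it across a DataFrame is then a one-liner such as df['time'] = df['timestamp'].apply(epoch_to_timestamp), where 'timestamp' stands in for whatever column name the API returned.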
Next, we will apply the routine to each row in the DataFrame and display the last 20 measurements.
Finally, we can plot the results.
You can access the storage report page directly by calling it into an IFrame, or you can access the data from an API. In the report below, you can select the timeframe for storage usage, group results by table or schema, select the object you want to analyze, and then select View Details from the Actions column.
You can also list storage by schema. The following example retrieves the current level of storage usage. There are lots of columns available in the JSON results but the example below displays the schema name and size.
These were just a few examples of what is possible by using the open RESTful APIs and the microservice user interface in the Db2 Console.
You can also access a free hands-on interactive lab that uses all of the notebooks at www.ibm.biz/DMCDemosPOT. After you sign up for the lab, you will get access to a live cloud-based system running Db2, the Db2 Console, as well as extensive Jupyter Notebooks and Python to help you learn by doing.
Published at DZone with permission of Peter Kohlmann . See the original article here.