Using Cloud Functions for Automated, Regular Cloud Database Jobs

With IBM Cloud Functions, it is easy to implement automated, regular maintenance jobs, such as cleaning up data in a database.


A few days ago, I blogged about a tutorial I wrote. The tutorial discusses how to combine serverless and Cloud Foundry for data retrieval and analytics. The scenario came up when I looked into regularly downloading GitHub traffic statistics for better usage insights. What I needed was a mechanism to execute a small Python script daily or weekly. After looking into possible solutions, IBM Cloud Functions was the clear winner. In this article, I discuss how simple it is to implement regular, automated activities, such as maintenance jobs for a cloud database.

Code Your Action

The action is the part that gets executed. IBM Cloud Functions supports several programming languages for coding an action: JavaScript, Swift, Python, and others can be used, or even a custom Docker image can be provided. In my case, I implemented a Python action that fetches the GitHub account information and the list of repositories from Db2, retrieves the traffic data from GitHub, and finally merges it into Db2. The code for that particular action can be found in this file on GitHub.
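The actual script lives in the repository linked above, but the basic shape of a Python action is worth sketching. The snippet below is illustrative only, not the author's code: the function body, the `repos` parameter, and the return values are my assumptions. An OpenWhisk Python action is simply a module whose `main` function receives a dict of parameters and returns a dict; credentials attached via `bx wsk service bind` typically arrive in the `__bx_creds` parameter, keyed by service name.

```python
# Hypothetical sketch of an OpenWhisk Python action (not the author's code).
def main(params):
    # Credentials bound with `bx wsk service bind` usually show up
    # under the `__bx_creds` parameter, keyed by the service name.
    creds = params.get("__bx_creds", {}).get("dashDB", {})
    repos = params.get("repos", [])
    # ... connect to Db2 with `creds`, fetch GitHub traffic data per
    # repository, and merge the results back into Db2 ...
    return {"processed": len(repos), "has_creds": bool(creds)}

if __name__ == "__main__":
    # Local smoke test with hypothetical parameters
    print(main({"repos": ["org/repo1", "org/repo2"]}))
```

The same `main(params) -> dict` contract applies regardless of what the body does, which makes such actions easy to test locally before deploying.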

Create Action, Trigger, and Rule

Once the code is ready, it can be used to create a Cloud Functions action. The available runtime environments already include drivers for several database systems, including Db2. The ZIP file ghstats.zip bundles extra files for modules that are not part of the standard environment. The second step is to bind the action to the database service; this way, the database credentials are obtained automatically and do not need to be hardcoded.

# Create the action to collect statistics
bx wsk action create collectStats --kind python-jessie:3 ghstats.zip

# Bind the service credentials to the action
bx wsk service bind dashDB collectStats --instance ghstatsDB --keyname ghstatskey 

# Create a trigger for firing off weekly at 6am on Sundays
bx wsk trigger create myWeekly --feed /whisk.system/alarms/alarm\
   --param cron "0 6 * * 0" --param startDate "2018-03-21T00:00:00.000Z"\
   --param stopDate "2018-12-31T00:00:00.000Z"

# Create a rule to connect the trigger with the action
bx wsk rule create myStatsRule myWeekly collectStats 

A trigger emits an event on the given schedule. The trigger definition above uses cron syntax to fire every Sunday at 6 AM. Finally, a rule connects the trigger to the action, causing the action to be executed on that weekly schedule.
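To make the schedule concrete: the cron expression `0 6 * * 0` reads as minute 0, hour 6, any day of the month, any month, day-of-week 0 (Sunday). The small, hypothetical Python check below illustrates that interpretation; note that cron numbers Sunday as 0, while Python's `datetime.weekday()` numbers it as 6.

```python
from datetime import datetime

def fires_at(dt):
    # cron "0 6 * * 0": minute 0, hour 6, Sunday
    # (cron uses 0 for Sunday; Python's weekday() uses 6)
    return dt.minute == 0 and dt.hour == 6 and dt.weekday() == 6

print(fires_at(datetime(2018, 3, 25, 6, 0)))  # True: 2018-03-25 was a Sunday
print(fires_at(datetime(2018, 3, 26, 6, 0)))  # False: a Monday
```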


With IBM Cloud Functions, it is easy to implement automated, regular maintenance jobs: cleaning up data in a database, calling the APIs of web services, summarizing activities and sending out a weekly report, and much more. For my use case, it is the ideal tool for the problem. It is also inexpensive, because it only consumes resources once a week, for a few seconds. Read the full tutorial in the IBM Cloud documentation.

If you have feedback, suggestions, or questions about this post, please reach out to me on Twitter (@data_henrik) or LinkedIn.

Published at DZone with permission of
