How to Test Your Mainframe Environment With Open-Source Brightside, Taurus, and Jenkins
Test your Mainframe environment with these open-source tools!
Mainframe and the applications that run on it are central to IT operations for the majority of Fortune 500 companies. Therefore, despite the rise of enterprise cloud adoption, the mainframe isn't going away any time soon. But mainframe solutions require developers and testers with high levels of expertise, often resulting in lengthy release cycles and testing bottlenecks. This can change if enterprises adopt open-source tools that enable shifting left.
Incorporating DevOps methodologies and open-source tools into developers' skill sets will speed up time to production. In this blog post, we will cover how to test your mainframe environment with open-source tools: Brightside, Taurus, BlazeMeter (which is based on open source), and Jenkins.
But first, let's understand some of the challenges developers face with mainframe testing and how open-source Brightside can help address them.
Mainframe testers are bound to an older generation of tools specifically designed for the mainframe. These tools are expensive, often with limited licensing, and need specialist knowledge to operate them. In addition, traditional mainframe testing solutions don't integrate with today's open-source testing tools or continuous integration processes.
As a result, mainframe testing usually only happens at the end of the development process, by a small number of experts, stalling the whole development process. This causes bottlenecks in application development, longer deployment cycles, and higher development costs. The solution is open-source Brightside, which connects open-source tooling to the mainframe.
Brightside was launched in June 2018 and open-sourced as part of Project Zowe under the Linux Foundation. Brightside is a mainframe interaction framework, enabling development teams to control, script, and develop for the mainframe like any other cloud platform.
For open-source tools and implementation, Brightside allows developers to easily work with modern DevOps toolchains and frameworks, like Taurus, an open-source test automation framework, and Jenkins, for CI/CD on the Mainframe.
Brightside connects to REST and other APIs of mainframe services and products and makes them usable by using simple command-line commands. Brightside can run on systems with Windows, Mac, or Linux and connects to services on z/OS via the network.
Typical mainframe applications that run on z/OS provide interfaces that are not easy to use outside of mainframes such as JCL or TSO. Open-source testing tools (such as Taurus) cannot run directly on z/OS. Brightside enables users to invoke CLI commands that interact with mainframe from Windows, Mac, and Linux.
Brightside offers two major benefits to mainframe development: a new generation of testers who already know open-source tools can apply them to mainframe applications, and traditional mainframe developers can speak the same language as the rest of their teams.
In this post, we will test an application running on the mainframe. We will use the following tools: Brightside, Jenkins, and Taurus.
Our application is a web application running on z/OS and developed in Java. The technology that is used to develop it is not relevant to how it will be tested since the testing is done over HTTP protocol and the described scenario will work for any web application.
- Brightside will be used to prepare the test environment. In this example, it is used to start the application, which has several z/OS started tasks that are activated by Brightside.
- Taurus will be used for measuring the response time of that mainframe application.
- Jenkins will be used for scheduling the test and implementing continuous testing.
Preparing the Testing Environment Prior to Running a Performance Test
First, we need to set up the testing environment before we can start the load test. We will use CA Brightside to do this.
Before you can use Brightside, you need to install it. Brightside is a Node.js-based tool, so it can be installed using NPM.
1. Issue the following command to set the npm registry to the CA Brightside scoped package:
npm config set @brightside:registry https://api.bintray.com/npm/ca/brightside
2. To install CA Brightside, issue the following command:
npm install -g @brightside/core@next
For more details, you can see: Install CA Brightside.
3. Before you can connect to your mainframe system, you need to set up a Brightside profile using the following command:
bright profiles create zosmf-profile <system> --host <system> --user <userid> --password <password>
4. At this point, you can use Brightside to interact with your mainframe system using its commands in the terminal. If you want to see what commands are available, type just bright and it will list all available commands.
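If you later want to drive these commands from your own scripts rather than the terminal, one option is to shell out to the bright CLI. The sketch below is a hypothetical wrapper, not part of Brightside itself: it assumes the CLI's JSON output mode (--response-format-json, as in Zowe CLI, which wraps results in an envelope with a "data" field), and it takes an injectable runner so the parsing logic can be exercised without a mainframe connection.

```python
import json
import subprocess

def run_bright(command, runner=None):
    """Run a bright CLI command and return the parsed JSON payload.

    `command` is the argument string after `bright`, e.g.
    "zos-jobs list jobs --owner MASSERV". The `runner` parameter is
    injectable so the function can be tested without the CLI installed;
    by default it invokes the real bright executable.
    """
    if runner is None:
        def runner(args):
            return subprocess.run(
                args, capture_output=True, text=True, check=True
            ).stdout
    # --response-format-json makes the CLI emit machine-readable output
    output = runner(["bright"] + command.split() + ["--response-format-json"])
    return json.loads(output)["data"]
```

With the real CLI installed and a zosmf profile configured, `run_bright("zos-jobs list jobs --owner MASSERV")` would return the job list as Python dictionaries.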
Using Brightside commands, you can set up the tested web application: create datasets, load them with test data, or start the tested address space. Our tested application has multiple started tasks that need to run before the performance test can start. We will use a Python script that will issue several Brightside commands to start the required started tasks.
5. We will write the following Python script and save it to a file named prepare_test_environment.py:
from pybright.cli import bright

def start_not_running_tasks_of_api_layer(prefix):
    expected_jobnames = {
        "SYS1": ['JOB1A', 'JOB1B'],
        "SYS2": ['JOB2A', 'JOB2B']
    }
    owner = "MASSERV"
    active_jobs = []
    for job in bright(f"zos-jobs list jobs --prefix {prefix}* --owner {owner}"):
        if job['status'] == 'ACTIVE':
            active_jobs.append(job['jobname'])
    print("Active jobs:")
    print(", ".join(active_jobs))
    for system, jobnames in expected_jobnames.items():
        for jobname in jobnames:
            if jobname not in active_jobs:
                print(f"Job {jobname} is not active on system {system}. Starting...")
                bright(f'zos-console issue command --sysplex-system {system} "S {jobname}"')

if __name__ == '__main__':
    # "JOB" matches the job names above; use your own job name prefix
    start_not_running_tasks_of_api_layer("JOB")
This script can be executed in any system where Brightside and Python are installed. Before running the real test on Jenkins, we will run the script locally on the workstation to make sure that it works.
If you want to use this script on your systems with different system names, you need to change the system names and job names in the expected_jobnames variable and the owner in the owner variable.
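The core decision in the script, determining which expected jobs are missing from the active list, can be factored into a small pure function. This is a hypothetical refactoring sketch, not part of the original script, but it makes the logic easy to unit test without any mainframe connection:

```python
def jobs_to_start(expected_jobnames, active_jobs):
    """Return (system, jobname) pairs for expected jobs that are not active.

    expected_jobnames maps a system name to the job names that should be
    running there; active_jobs is the list of currently active job names.
    """
    active = set(active_jobs)
    return [
        (system, jobname)
        for system, jobnames in expected_jobnames.items()
        for jobname in jobnames
        if jobname not in active
    ]
```

The Brightside start commands would then be issued only for the pairs this function returns.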
6. Execute the script locally by the following command:
python prepare_test_environment.py
If the script runs without any error, you can proceed with the following instructions.
At this point, we have a Python script that sets up the tested application and prepares the test environment for performance tests using Brightside commands. When you execute this script, your application is started on z/OS.
We will use Taurus to execute the tests. Taurus is an automation-friendly framework for continuous functional and performance testing. Test scenarios are defined in the simple Taurus YAML format, which allows straightforward definition of web API tests or embedding of tests from a variety of existing frameworks (JMeter, Selenium, Mocha, JUnit, pytest, and more). It has rich methods for evaluating and reporting results (open formats understood by Jenkins plugins, or BlazeMeter). You can find more information about Taurus on its website.
We have chosen Taurus because the performance test can be defined by using simple YAML definitions, and the same definition can be used to run the test from your workstation and on Jenkins as well.
7. Install Taurus by using the following command:
pip install bzt
8. This is the Taurus test definition that we have used — save to the file named perftest.yml:
execution:
- concurrency: 100
  ramp-up: 10s
  hold-for: 1m
  scenario: quick-test

scenarios:
  quick-test:
    requests:
    - ${BASE_URL}/api/v1/myapplication/endpoint

services:
- module: shellexec
  prepare:
  - python3 -u prepare_test_environment.py

settings:
  artifacts-dir: test-results/%Y-%m-%d_%H-%M-%S.%f
  env:
    BASE_URL: https://localhost:10010
    REPORT_NAME: Master Build

reporting:
- module: blazemeter
  report-name: ${REPORT_NAME}
  test: My Application Performance Test
  project: Default project
- module: passfail
  criteria:
  - p90>200ms for 10s, stop as failed
- module: junit-xml
  filename: test-results/bzt_test_report.xml
  data-source: pass-fail
- module: final-stats
  summary: true  # overall samples count and percent of failures
  percentiles: true  # display average times and percentiles
  summary-labels: false  # list of sample labels, status, percentage completed, avg time and errors
  failed-labels: false  # list of sample labels with failures
  test-duration: true  # provides test duration
  dump-xml: test-results/bzt_dump.xml

modules:
  blazemeter:
    token: <API Key Id:API Key Secret>
You can adapt this script for your application by changing the URL in the scenarios section. You need to provide your BlazeMeter API token in order to report the results to your BlazeMeter environment.
9. To obtain the BlazeMeter API token, follow these steps:
- In BlazeMeter, open your account settings, go to the API Keys section, and press Generate.
- Copy the API Key Id and API Key Secret and use them in the YAML file in place of <API Key Id:API Key Secret>. The Id and Secret are separated by a colon (:).
The YAML file defines the following:
- The execution section defines how long the test will run (1m in this example) and how many concurrent users will access the tested application
- The scenarios section defines URL endpoints that are used during the performance test
- The services section defines that a Python script should be executed before the test starts. This script invokes Brightside commands.
- The settings section defines the path where the results are stored and default values for environment variables.
- The reporting section defines that the results are posted to BlazeMeter and stored in XML files that can be used by Jenkins. The passfail module defines the condition under which the test is considered failed.
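To make the passfail criterion concrete: p90 is the 90th-percentile response time, the value below which 90% of the samples fall. Taurus computes this internally; the sketch below only illustrates the idea, using the nearest-rank method.

```python
import math

def percentile(samples_ms, p):
    """Return the p-th percentile of response times (nearest-rank method):
    the smallest sample such that at least p percent of the samples
    are less than or equal to it."""
    ordered = sorted(samples_ms)
    rank = math.ceil(p / 100 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]
```

With the criterion p90>200ms for 10s, the test is stopped and marked failed once the 90th percentile stays above 200 ms for 10 seconds.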
10. Then, you can use Taurus to execute this test from the command-line:
bzt -o settings.env.BASE_URL=https://sys1:10010 perftest.yml
We have pointed the base URL to the test system "sys1." This variable can be used for testing on different systems. For instance, you can point to a development instance during development and to a dedicated testing environment for the performance tests executed from Jenkins.
You can adapt the command to your environment by changing the value of BASE_URL to the hostname of your test system and the port of the test instance of your application. For example, if your hostname is testsys.mydomain.net, the port is 8080, and you use the HTTP protocol, then the new value would be: http://testsys.mydomain.net:8080
11. Taurus displays a live console dashboard while the test is running.
These results show the status of the performance test: how many users are accessing the applications and the current values for the response time. You can also see the HTTP response code to be sure that the tested application is working as expected.
12. You can see full reporting and historical data in BlazeMeter (Taurus reports can be stored to BlazeMeter).
The web page with the BlazeMeter report is opened automatically when you execute the test by Taurus.
Congratulations! You have now run your open-source performance test on your mainframe application. Let's see how you can schedule it in Jenkins as part of your continuous integration process.
For continuous testing, we will use Jenkins to schedule our tests, because we need regular execution of the performance tests. Running the previous steps on your laptop is fine for developing the performance tests and related scripts, but it does not provide consistent performance results.
Another option is to set up a few machines as BlazeMeter Private Locations and use them to test our on-premise application from BlazeMeter.
13. Install Jenkins on a Linux server with Ubuntu, following the instructions in How To Install Jenkins on Ubuntu 18.04.
14. Before we can use Brightside and Taurus on a fresh Jenkins machine, we need to install them together with their prerequisites. Taurus requires Python and Brightside requires Node.js. Python is already installed in most Linux distributions, so you just need to install Taurus using the command:
pip install bzt
For more details, refer to this link.
15. You can install Node.js using the package manager of your Linux distribution, following the steps here.
16. The next step is to install Brightside. You need to use the same commands on the Jenkins machine as on your computer under the Jenkins user:
sudo -u jenkins bash
npm config set @brightside:registry https://api.bintray.com/npm/ca/brightside
npm install -g @brightside/core@next
bright profiles create zosmf-profile <system> --host <system> --user <userid> --password <password>
For more details, you can see: Install CA Brightside.
At this moment, we have the Python script with Brightside commands and the Taurus YAML file on our computer. We need to push them to a GitHub repository so that the Jenkins machine can read them from there.
17. Create a new repository on GitHub. If you do not use GitHub, you can use a different source code management system.
We will commit these files to the Git repository and push them to the GitHub repository.
- git init
- git add .
- git commit -m "Taurus test with Brightside"
- git push
Now, we will define a Jenkins pipeline that pulls the code from GitHub and starts Taurus to run the performance test.
18. The first step is to press the New Item button in Jenkins.
19. Choose the Pipeline job type.
When you create a job, you can change its settings.
20. We need to schedule the execution of the test on Jenkins. The Jenkins syntax allows any type of schedule. In our example, we would like to run the test between 5 AM and 6 AM in the local time zone.
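In a declarative pipeline, that schedule can be expressed with a cron trigger; the H token lets Jenkins pick a hashed minute within the 5 AM hour so that jobs are spread out across controllers:

```groovy
// Add inside the pipeline { } block: run once per day
// at a Jenkins-chosen minute between 5:00 and 5:59 local time.
triggers {
    cron('H 5 * * *')
}
```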
21. The last step is to define the pipeline in the Pipeline section:
You need to provide your Git repository URL and the base URL of the tested application.
The full source code of the pipeline is:
pipeline {
    agent any
    environment {
        BASE_URL = "http://yourAppHost:port"
    }
    stages {
        stage('Preparation') {
            steps {
                // Get your code from a Git repository
                git '...'
            }
        }
        stage('Performance Test') {
            steps {
                sh """
                bzt -o settings.env.BASE_URL=${BASE_URL} -o settings.env.REPORT_NAME=Jenkins_Build_${BUILD_NUMBER} perftest.yml
                """
            }
        }
    }
    post {
        always {
            junit allowEmptyResults: true, testResults: 'test-results/bzt_test_report.xml'
            perfReport percentiles: '0,50,90,100', sourceDataFiles: 'test-results/bzt_dump.xml'
        }
    }
}
The junit and perfReport steps in the post section report the success of the performance test and create charts that show the trend of your application's performance, using the Performance plugin for Jenkins.
To adapt this pipeline for your environment, you will need to update the BASE_URL environment variable at the top of the pipeline and use the hostname and port of your test system.
22. The charts are displayed on the page of your Jenkins job.
That's it! In this post, we have developed a performance test for a web application running on z/OS. We used CA Brightside to start the web application on z/OS. Taurus lets you develop the performance test quickly on your computer, and then you can run the same test on Jenkins on a regular schedule.
You can learn more about mainframe testing with open-source tools including JMeter, Taurus, BlazeMeter, and Jenkins here.
Published at DZone with permission of Petr Plavjanik, DZone MVB. See the original article here.