DevOps for Legacy: Continuous and Faster Delivery
The goal behind DevOps is to enable continuous delivery (CD). This means that software is designed, written, tested, and deployed to end users continuously, with no delay in the pipeline.
Even today, legacy systems play an important role in running critical business processes — not simply as stand-alone platforms, but also as key components of modern distributed applications. Clearly, legacy systems are still a workhorse.
What may not be as obvious is the legacy system's relevance in supporting customer-, employee-, and partner-facing applications — and the difficulties inherent in supporting those applications. Organizations are finding that traditional approaches to software development and delivery are not sufficient to meet customer needs.
To meet customer needs, a new software development process evolved: DevOps. DevOps is an embodiment of the requirements for development and operations to work together more seamlessly across the application lifecycle to deliver faster and continuously. This article will cover the implementation of DevOps for mainframe legacy systems.
What Is DevOps?
The current buzzword of technology is DevOps. It is an enterprise software development term used to describe an Agile relationship between development and IT operations. The goal of DevOps is to improve that relationship by advocating better communication and collaboration between these business units. Rather than seeing the two groups as silos that pass things along without really working together, DevOps recognizes the interdependence of software development and IT operations and helps an organization produce software and IT services more rapidly and with frequent iterations. DevOps is the cure for the enterprise disease known as silo sickness.
Figure 1: DevOps Environment.
The DevOps movement evolved from Agile. IT organizations moved away from the Waterfall model to the Agile model to speed up software delivery. In Agile, the total software development requirement is broken down into smaller chunks known as "user stories," which accelerate feedback loops and align product features with market needs.
DevOps fills the gaps in Agile and provides an environment that promotes communication and collaboration between development and operations teams with a set of automation tools. DevOps and Agile complement each other; the idea of both is to deploy working functionality to production as quickly as possible. Core DevOps practices include:
- Collaboration across disciplines.
- Develop and test against production-like systems.
- Deploy frequently using a repeatable and reliable process.
- Continuously monitor and validate operation quality characteristics.
- Amplify feedback loops.
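As a concrete, if simplified, illustration of "a repeatable and reliable process," the sketch below chains build, test, and deploy stages and halts at the first failure. The stage functions are placeholders for this article, not real mainframe tooling — an actual pipeline would invoke compilers, test runners, and deployment tools at each step:

```python
# Minimal sketch of a repeatable delivery pipeline: each stage is a
# function returning True on success; the pipeline stops at the first
# failure. Stage bodies are placeholders for real build/test/deploy tools.

def build():
    print("build: compiling sources")
    return True

def run_tests():
    print("test: running automated tests")
    return True

def deploy():
    print("deploy: pushing artifact to target environment")
    return True

def run_pipeline(stages):
    """Run stages in order; succeed only if every stage succeeds."""
    for stage in stages:
        if not stage():
            print("pipeline failed at stage:", stage.__name__)
            return False
    print("pipeline succeeded")
    return True

if __name__ == "__main__":
    run_pipeline([build, run_tests, deploy])
```

Because every run executes the same stages in the same order, the process is repeatable by construction — the essence of the practice above, whatever tools actually fill in the stages.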
The Need for DevOps in Legacy
The rise of startups and technologies involving big data, faster networks, cloud, and distributed computing led many to believe that these older legacy technologies would become obsolete. Interestingly, legacy system workloads are growing, and organizations are still adding new logic to systems of record. Even today, most mission-critical business runs on mainframes, which process billions of transactions daily to fulfill critical business needs. These systems are not going anywhere — they aren't easily replaced.
Meanwhile, the IBM mainframe has evolved considerably and found ways to work alongside modern technology. This is where the concept of legacy modernization emerged and began to participate in the technology revolution. Over the years, legacy modernization has been implemented in many ways, such as exposing legacy systems as services using SOA or REST, rehosting on new systems, or rewriting legacy systems in new technology. To some extent, the legacy problem was resolved by implementing different legacy modernization methodologies.
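One common modernization pattern mentioned above is wrapping a legacy transaction behind a REST-style interface. The sketch below shows the shape of such a facade in plain Python; the legacy call is a stub for illustration (in practice it might go through a CICS/IMS connector or a message queue), and the account data is invented:

```python
import json

def legacy_lookup_account(account_id):
    """Stub standing in for a call into a legacy system of record.
    The returned values are hypothetical, for illustration only."""
    return {"account_id": account_id, "balance": "1250.00"}

def handle_get_account(path):
    """Tiny REST-style facade: map a URL path like /accounts/<id>
    to the legacy call and return an (HTTP status, JSON body) pair."""
    parts = path.strip("/").split("/")
    if len(parts) != 2 or parts[0] != "accounts":
        return 404, json.dumps({"error": "not found"})
    record = legacy_lookup_account(parts[1])
    return 200, json.dumps(record)
```

The point of the pattern is that modern consumers see only an HTTP/JSON contract, while the legacy transaction behind it remains unchanged.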
Then, legacy and modern technology worked together in two different work environments, and customers were not happy with the software delivery process. Legacy systems were following the old Waterfall model, a heavy and long-running software delivery process. Agile was introduced to address its problems, but organizations found issues with Agile as well, such as favoring delivery over quality and development over planning. To address these issues, organizations adopted DevOps, which brings together the development and operations teams for the entire service lifecycle, from design through development to production support.
"Mainframe application delivery does not need to be inherently slow and complex. Automating the mainframe application delivery pipeline reduces risk, and complexity." — Forrester Resarch
How to do DevOps on Legacy
Implementing DevOps on legacy can be a little difficult because mainframe environments are very different. In addition, jumping from a mainframe workflow to a DevOps-centric workflow is a big cultural change. Still, these challenges can be overcome using tools that connect mainframes seamlessly with the rest of an organization's infrastructure.
IBM and other organizations have come up with various automation toolsets for development, source quality management, test automation, monitoring automation, infrastructure automation, and deployment automation.
In conclusion, here are a few good DevOps automation tools:
- Rational Developer for z Systems: an IBM product that provides an interactive environment for creating and maintaining z/OS applications quickly and efficiently.
- OMEGAMON: an IBM product for performance monitoring.
- UrbanCode Deploy: IBM's tool for automating application deployment through your environments.
- Syncsort Ironstream: a tool for moving machine data such as SMF records, Syslogs, RACF, and DB2 from the mainframe into a platform hosted elsewhere.
- Syncsort DMX-h: a tool that helps simplify big data integration and achieve faster time to value.