Generative AI Leverage in Application Modernization
Generative AI is becoming a powerful enabler for accelerating application modernization programs in the era of extreme automation.
Application modernization is the process of updating legacy applications, leveraging modern technologies, and enhancing performance. Applications can be made adaptable to evolving business speeds by infusing cloud-native principles such as DevOps and infrastructure-as-code. Treatment of legacy applications can range from a complete rewrite to a rehost, based on value, criticality, and objectives. Benefits are typically highest for rewrites, as they provide an opportunity to reach a true cloud-native model with a high degree of agility and speed. Yet many CIOs and CTOs are hesitant to invest, given the cost and timelines involved in realizing value and the difficulty of balancing high-investment rewrite initiatives against low-value rehost approaches. Service providers and tooling vendors are addressing this space by building accelerators that can be customized for enterprise consumption and that speed up specific areas of modernization: Evolvware, IBM Consulting Cloud Accelerators, and cloud service provider (AWS, Azure, GCP, etc.) specific tools. While attempting to drive acceleration and optimize the cost of modernization, Generative AI is becoming a critical enabler in how modernization programs are accelerated. This article focuses on Generative AI possibilities across the application modernization process.
Application Modernization Overview
Application modernization starts with an assessment of the current legacy applications, data, and infrastructure, and applies the right modernization strategy (rehost, re-platform, refactor, or rebuild) to achieve the desired result. Rebuilding yields the maximum benefit but requires a high degree of investment, whereas rehosting moves applications and data to the cloud without any optimization and requires less investment while delivering lower value. Modernized applications are deployed, monitored, and maintained, with ongoing iterations to keep pace with technology and business advancements. Typical benefits include increased agility, cost-effectiveness, and competitiveness, while challenges include complexity and resource demands. Many enterprises are realizing that moving to the cloud gives them neither the desired value nor agility and speed beyond basic platform-level automation. The real problem lies in how IT is organized, which is reflected in how current applications and services are built and managed (see Conway's law). This, in turn, leads to the following challenges:
- Duplicative or overlapping capabilities offered by multiple IT systems/components create sticky dependencies and proliferation, which impact productivity and speed to market.
- Duplicative capabilities across applications and channels give rise to duplicative IT resources (e.g., skills and infrastructure).
- Duplicative capabilities (including data) result in duplicated business rules and, ultimately, an inconsistent customer experience.
- Lack of alignment of IT capabilities to business capabilities impacts time to market and business-IT alignment. In addition, enterprises end up building several band-aids and architectural layers to support new business initiatives and innovations.
- Legacy technologies and monolithic architectures impact speed and agility, and also weaken the security and compliance posture.
Hence, application modernization initiatives need to focus more on value to the business, which involves a significant element of transforming applications into components and services aligned to business capabilities. The biggest challenge is the amount of investment needed, and many CIOs and CTOs are hesitant to invest due to the cost and timelines involved in realizing value. Many are addressing this by building accelerators that can be customized for enterprise consumption and that speed up specific areas of modernization; one such example from IBM is IBM Consulting Cloud Accelerators. Accelerators from cloud service providers such as AWS, Azure, and GCP also help expedite the application modernization journey. While attempting to drive acceleration and optimize the cost of modernization, Generative AI is becoming a critical enabler in how modernization programs are accelerated. We will explore key areas of acceleration with examples in this article.
A simplified lifecycle of application modernization programs (with emphasis on re-write) is depicted below.
Discovery focuses on understanding the legacy applications, infrastructure, and data; the interactions between applications, services, and data; and other aspects such as security.
Planning breaks the complex portfolio of applications into iterations to be modernized, establishing an iterative roadmap and an execution plan to implement it.
Blueprint/design phase activities vary with the modernization strategy, from decomposing applications and leveraging domain-driven design to establishing a target architecture on new technology and building executable designs.
Subsequent phases are build, test, and deploy to production. Let's explore the Generative AI possibilities across these lifecycle areas.
Discovery and Design
The ability to comprehend legacy applications with minimal Subject Matter Expert (SME) involvement is a pivotal accelerator in the modernization process. SMEs are often occupied with system maintenance tasks and critical issues, and may have limited knowledge depending on how long they have supported the system. The discovery and design phases of modernization projects typically consume significant time, whereas development becomes considerably smoother once the legacy application's functionality, integration aspects, logic, and data complexity are decoded.
Modernization teams rely on code analysis and reference numerous dated documents, making the use of code analysis tools vital. In cases of rewrite initiatives, it is essential to map functional capabilities to the context of the legacy application, facilitating effective domain-driven design and decomposition exercises. Generative AI proves invaluable in this context, as it can correlate domain and functional capabilities with code and data, creating a coherent view of business capabilities connected to application code and data. Naturally, these models need to be fine-tuned and contextualized according to the enterprise domain model or functional capability map. The API mapping facilitated by Gen AI, discussed later in this article, serves as a prime example.
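As a rough illustration of how such capability-to-code correlation might work, the sketch below uses a general-purpose sentence-embedding model to match legacy artifact descriptions against a functional capability map. The model name, capability entries, and artifact descriptions are placeholder assumptions, not the data or tooling of any specific engagement.

```python
# Illustrative sketch only: match legacy artifact descriptions against an
# enterprise capability map using semantic similarity.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # general-purpose embedder

# A slice of a functional capability map (hypothetical entries)
capabilities = [
    "Customer onboarding and KYC",
    "Payment initiation and settlement",
    "Account statement generation",
]

# Descriptions harvested from code comments, API docs, or code-analysis tools
code_artifacts = {
    "CUSTMAST.cbl": "Maintains customer master records and address changes",
    "PAYBATCH.cbl": "Processes end-of-day batch payment instructions",
}

cap_embeddings = model.encode(capabilities, convert_to_tensor=True)
for artifact, description in code_artifacts.items():
    scores = util.cos_sim(model.encode(description, convert_to_tensor=True),
                          cap_embeddings)[0]
    best = int(scores.argmax())
    print(f"{artifact} -> {capabilities[best]} (score={scores[best].item():.2f})")
```

In practice the embedding model would be fine-tuned on the enterprise domain model, as the article notes, rather than used off the shelf.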
For application decomposition and design, event-storming requires process maps, and Gen AI assists by contextualizing and mapping information from process mining tools. Gen AI also aids in generating use cases based on code insights and functional mappings. Overall, Generative AI plays a crucial role in mitigating the risks associated with modernization programs by ensuring comprehensive visibility into legacy applications and their dependencies.
Furthermore, Gen AI supports the generation of target designs tailored to specific cloud service provider frameworks through model adjustments based on standardized patterns, encompassing elements such as ingress/egress, application services, data services, and composite patterns. Additionally, Generative AI serves various other purposes, including the creation of target technology framework-specific code patterns for security controls. It extends its capabilities to generate detailed design specifications, including user stories, User Experience Wireframes, API Specifications (e.g., Swagger files), component relationship diagrams, and component interaction diagrams, among others.
Planning
Creating a comprehensive macro roadmap for a modernization program is a challenging endeavor that involves striking a delicate balance between parallel efforts and sequential dependencies, all while identifying co-existence scenarios. Traditionally, this task is treated as a one-time effort, but achieving continuous realignment through Program Increments (PI) planning exercises incorporating execution-level inputs poses a much greater challenge. Generative AI proves invaluable in this context by facilitating the generation of roadmaps based on historical data, encompassing factors such as applications to domain area mappings, effort and complexity considerations, and dependency patterns. This capability extends to modernization programs within specific industries or domains.
To make this process manageable, it becomes imperative to offer a suite of assets and accelerators that can effectively address the complexities inherent in the enterprise landscape. This is precisely where Generative AI assumes a pivotal role by establishing meaningful correlations between application portfolio details and the discovered dependencies.
Build and Test
Code generation is a well-known and widely recognized application of Generative AI. However, its scope extends far beyond application code. It encompasses the generation of related code artifacts such as Infrastructure-as-Code (IaC), including templates for Terraform or CloudFormation. Additionally, Generative AI can create code and configurations for pipelines, embed critical security design elements like encryption and IAM integrations, and generate application code from Swagger specifications or other code insights, which is particularly relevant when dealing with legacy systems. This capability also extends to the creation of firewall configurations, where resource files are generated based on the services being instantiated.
Generative AI streamlines these processes through a well-orchestrated approach that leverages predefined application reference architectures built from patterns while integrating outputs from design tools and other sources.
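For illustration only, the sketch below shows one way such an orchestration step might prompt a model to emit Terraform from a predefined reference pattern. The pattern text, prompt wording, model name, and use of the OpenAI client library are assumptions for the example; any generated IaC would still need review before it is applied.

```python
# Hypothetical orchestration step: prompt a model to emit Terraform for a
# predefined reference architecture pattern.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

reference_pattern = """
Pattern: stateless API service
- One container service behind an HTTPS load balancer
- Secrets pulled from a managed secrets store
- Logs shipped to central logging
"""

prompt = (
    "Generate Terraform for AWS implementing the reference pattern below. "
    "Include least-privilege IAM roles and encryption at rest.\n"
    + reference_pattern
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)  # review before committing to the repo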
Another significant domain where Generative AI plays a pivotal role is testing. It has the ability to automatically generate appropriate test cases, test code, and even test data, optimizing the execution of test cases and enhancing the efficiency of the testing process.
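A minimal, hypothetical sketch of the test-generation idea follows: an OpenAPI fragment is passed to a model with a request for test cases, test data, and pytest functions. The spec fragment, prompt wording, base URL, and model are illustrative placeholders rather than a prescribed approach.

```python
# Hypothetical sketch: derive test cases, test data, and pytest functions
# from an OpenAPI fragment.
import json
from openai import OpenAI

client = OpenAI()

openapi_fragment = json.dumps({
    "paths": {
        "/customers/{id}": {
            "get": {
                "summary": "Fetch a customer profile",
                "responses": {"200": {"description": "OK"},
                              "404": {"description": "Not found"}},
            }
        }
    }
})

prompt = (
    "From this OpenAPI fragment, list positive and negative test cases with "
    "sample test data, then emit pytest functions using the requests library "
    "against http://localhost:8080.\n" + openapi_fragment
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)  # generated tests still need review
```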
Deploy
Numerous critical "last mile" activities in modernization programs typically consume days to weeks, contingent upon the complexity of the enterprise. An essential Generative AI use case is the ability to derive insights for security validation by analyzing application and platform logs, design points, Infrastructure-as-Code, and more. This capability greatly expedites the security review and approval processes. Furthermore, Generative AI is instrumental in generating inputs for configuration management (CMDB) and change management, drawing from release notes generated through Agility tool work items completed per release.
While the aforementioned use cases hold immense promise throughout the modernization journey, it's crucial to acknowledge that enterprise complexities demand a contextually orchestrated approach to leverage many of these Generative AI accelerators effectively. The development of enterprise-specific contextual patterns is an ongoing effort to accelerate modernization programs. We have observed substantial benefits from investing time and effort upfront and continuously customizing these Generative AI accelerators to align with specific patterns that exhibit repeatability within the enterprise.
Let us now examine some proven examples:
Example 1: Re-Imagining API Discovery With BIAN and AI for Visibility of Domain Mapping and Identification of Duplicate API Services
The Problem
A Large Global Bank manages a vast ecosystem of over 30,000 APIs, encompassing both internal and external interfaces across diverse domains such as Retail Banking, Wholesale Banking, Open Banking, and Corporate Banking. Within this extensive API portfolio, there exists significant potential for the presence of duplicate APIs across various domains. This redundancy not only escalates the total cost of ownership for maintaining the extensive API collection but also poses operational challenges related to the management of API duplication and overlap.
The absence of a robust visibility and discovery mechanism for these APIs compels API Development teams to inadvertently create identical or similar APIs rather than identify and leverage relevant ones for reuse. Furthermore, the inability to gain a holistic perspective of the API portfolio from a Banking Industry Model viewpoint hinders both Business and IT teams in comprehending the pre-existing capabilities at their disposal and discerning the novel capabilities essential for the bank's evolution.
Gen AI-Based Solution Approach
The solution harnesses the power of Generative AI models, specifically BERT Large, Sentence Transformer, and a Triple Multiple Negatives Ranking Loss Function, in conjunction with domain-specific rules. It is further fine-tuned using the comprehensive knowledge of the BIAN Service Landscape to facilitate the understanding of the bank's extensive API portfolio. This solution offers a seamless capability to identify and uncover APIs while automatically associating them with corresponding BIAN standards. In essence, it effectively maps API Endpoint Methods to the granular level 4 of the BIAN Service Landscape Hierarchy, specifically within the realm of BIAN Service Operations.
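To make the approach concrete, here is a minimal fine-tuning sketch in the spirit of the description above, using the sentence-transformers library with triplets built from past successful and unsuccessful mappings and a Multiple Negatives Ranking loss. The base model, the sample triplet, and the BIAN labels shown are illustrative assumptions rather than the bank's actual data or configuration.

```python
# Minimal fine-tuning sketch, assuming triplets of (API endpoint description,
# correct BIAN service operation, rejected mapping) curated from earlier
# successful and unsuccessful mapping reviews.
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")

train_examples = [
    InputExample(texts=[
        "POST /accounts/{id}/payments - initiate a domestic payment",  # anchor
        "Payment Order - Initiate",   # positive: hypothetical BIAN level-4 label
        "Customer Offer - Evaluate",  # hard negative: a previously rejected mapping
    ]),
    # ... more triplets from past mapping decisions
]

train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=16)
train_loss = losses.MultipleNegativesRankingLoss(model)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=1,
    warmup_steps=10,
)
model.save("bian-api-mapper")  # hypothetical output path
```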
The core functions of the solution are the ability to:
- Import Swagger specifications and other API documentation to comprehensively interpret the API, including endpoints, operations, and their respective descriptions.
- Ingest BIAN details to gain a deep understanding of the BIAN Service Landscape.
- Employ fine-tuning processes, incorporating both successful and unsuccessful mappings between API Endpoint Methods and the BIAN Service Landscape.
- Offer an intuitive visual representation of the mapping results, complete with matching scores. This visual interface includes hierarchical navigation within the BIAN framework, as well as filters for BIAN levels, API categories, and matching scores, providing users with an accessible and informative view of the data (a minimal sketch of the underlying matching step follows this list).
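The sketch below illustrates that matching step, assuming a fine-tuned embedding model (such as the hypothetical "bian-api-mapper" above) and a flattened list of BIAN level-4 service operations. The model path, operation labels, score threshold, and endpoint description are illustrative placeholders.

```python
# Sketch of the endpoint-to-BIAN matching step with matching scores and a
# score-threshold filter.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("bian-api-mapper")  # hypothetical fine-tuned model

bian_operations = [
    "Payment Order - Initiate",
    "Current Account - Retrieve Balance",
    "Customer Offer - Evaluate",
]
bian_embeddings = model.encode(bian_operations, convert_to_tensor=True)

def map_endpoint(description: str, min_score: float = 0.5, top_k: int = 3):
    """Return candidate BIAN mappings with matching scores above a threshold."""
    scores = util.cos_sim(model.encode(description, convert_to_tensor=True),
                          bian_embeddings)[0]
    ranked = sorted(zip(bian_operations, scores.tolist()), key=lambda x: -x[1])
    return [(op, round(score, 2)) for op, score in ranked[:top_k] if score >= min_score]

print(map_endpoint("GET /accounts/{id}/balance - fetch available balance"))
```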
Below is a depiction of the solution view:
Key Benefits
The solution significantly simplified the process for developers to discover reusable APIs organized by BIAN business domains. It provided multiple filter and search options, empowering teams to locate the APIs they needed efficiently. Furthermore, it enabled teams to pinpoint crucial API categories essential for bolstering operational resilience. In the upcoming iteration, the search functionality will evolve to incorporate natural language processing and support conversational use cases.
By identifying duplicate APIs within the context of BIAN service domains, the solution played a pivotal role in shaping a modernization strategy aimed at addressing redundancy while rationalizing capabilities.
Remarkably, this use case was successfully implemented in just 6-8 weeks, a feat that would have taken the bank a full year to achieve, given the vast number of APIs that needed to be discovered and managed.
Example 2: Automated Modernization of MuleSoft API to Java Spring Boot API
The Problem
The ongoing effort to modernize MuleSoft APIs into Java Spring Boot faced significant challenges. The sheer volume of APIs, coupled with limited documentation and inherent complexity, had a noticeable impact on the project's pace and efficiency.
Gen AI-Based Solution Approach
The process of modernizing Mule APIs to Java Spring Boot was notably streamlined through a Generative AI-powered accelerator developed in-house. The initiative commenced with establishing an in-depth understanding of the APIs, their components, and underlying logic, followed by finalizing response structures and code specifications. Subsequently, we employed IBM's version of Sidekick AI to generate Spring Boot code, leveraging LLMs such as OpenAI GPT, which precisely adhered to the API specifications originally designed in MuleSoft. This code generation process encompassed the creation of unit test cases, design documentation, and user interface elements.
The transformation of Mule API components into their Spring Boot equivalents was executed methodically, with each component integrated into the tool one by one. The accelerator generated the corresponding Spring Boot components, which were then interconnected while addressing any encountered errors. Additionally, it facilitated the generation of a user interface tailored for the desired channel, complete with unit test cases, test data, and comprehensive design documentation. This design documentation encompasses various crucial elements, including sequence and class diagrams, request and response details, endpoint specifications, error codes, and considerations related to the overall architecture.
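As a generic illustration (not IBM's Sidekick AI accelerator itself), the sketch below shows how a Mule flow definition could be passed to an LLM with instructions to produce equivalent Spring Boot components and tests. The file names, prompt, model, and client library are assumptions for this sketch only.

```python
# Generic illustration: pass a Mule flow definition to an LLM and request
# equivalent Spring Boot components plus JUnit tests.
from pathlib import Path
from openai import OpenAI

client = OpenAI()

mule_flow_xml = Path("get-customer-flow.xml").read_text()  # hypothetical Mule export

prompt = f"""Convert this MuleSoft flow to Java Spring Boot.
Produce: a @RestController, a service class, DTOs matching the original
request/response structures, and JUnit 5 unit tests. Preserve endpoint paths,
status codes, and error handling.

Mule flow:
{mule_flow_xml}
"""

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": prompt}],
)

Path("generated").mkdir(exist_ok=True)
Path("generated/conversion-output.md").write_text(response.choices[0].message.content)
```

In a real accelerator this step would be wrapped in component-by-component orchestration and error handling, as described above, rather than a single prompt.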
Key Benefits
Sidekick AI, leveraging OpenAI GPT, augments application consultants' daily work by pairing a multi-model Generative AI technical strategy with deep domain and technology context. Key benefits are:
- Automatically generates Spring Boot code and test cases that are optimized, maintainable, and align with best practices, ensuring repeatability.
- Simplifies the integration of APIs with front-end channel layers, enhancing the overall development process.
- Provides developers with code that is easy to understand and offers valuable insights for efficient debugging.
The accelerator PoC was completed in three sprints over six weeks, covering four different scenarios of code migration, unit test cases, design documentation, and UI generation.
Conclusion
Many CIOs and CTOs have expressed reservations when contemplating modernization initiatives, citing the challenges outlined at the outset: the extensive SME (Subject Matter Expert) involvement required, potential disruptions to the business due to change, and the need for alterations in the operating model across organizational functions such as security and change management. While Generative AI is not a one-size-fits-all solution for these complex challenges, it undeniably contributes to the success of modernization programs. It does so by accelerating the process, reducing the overall cost of modernization, and, most importantly, mitigating risk by ensuring that no critical functionality is overlooked.

However, introducing Large Language Models (LLMs) and related libraries into the enterprise environment entails a significant commitment of time and effort, including rigorous security and compliance reviews and scanning procedures. Moreover, improving the quality of the data used to fine-tune these models is a focused effort that should not be underestimated. While cohesive Generative AI-driven modernization accelerators are not yet ubiquitous, it is expected that, with time, integrated toolkits will emerge to accelerate specific modernization patterns, if not an array of them.