Leverage Legacy Systems With APIs and Microservices
Legacy enterprises can accelerate innovation without replacing their backend systems.
It was great catching up with Zeev Avidan, Chief Product Officer of OpenLegacy, on the heels of the company closing a $30 million funding round and launching version 4.2 of its API integration and management platform.
To date, OpenLegacy has earned their stripes by enabling financial services and insurance companies with significant legacy infrastructure to leverage existing systems and avoid or mitigate the risk of costly migration and modernization projects.
Most of OpenLegacy's clients are already engaged in some level of transformation. They typically use several cloud and microservices technologies and vendors, but find that connecting their core legacy systems gets bogged down by the lack of tools that can fully integrate legacy applications.
OpenLegacy enables organizations to connect legacy systems to microservices and deploy to production in a few days rather than a few months. Their approach is neither middleware nor service-oriented architecture; it is native to the world of microservices, combining automation with architectural simplicity to deliver the agility and velocity of a software factory and modern software architecture.
What do you mean when you say, "jumping middleware?"
One of the challenges we see organizations struggling with is the hybrid integration problem: not just the cloud, but cloud to on-prem, or on-prem to on-prem. They try to solve it with the same kind of middleware, and those middleware platforms are large pieces of software that try to do everything. They will do 1,000 things very well, but when you try to do the 1,001st thing, you get stuck. This has been a problem with integration for two decades now.
If you are already distributing your software architecture, why not also distribute your middleware and your integration? That is exactly what we do. In technical terms, each microservice has the entire integration stack embedded in it, so it can be deployed as a single unit, and in a way that is not cumbersome: very scalable, very lightweight, and automatically regenerated.

That gives you a non-middleware approach to integration, where a transaction on a mainframe or a stored procedure on a database is not only exposed as an API but also enhanced with additional functionality, and all with agility. If you need to make further changes, you make them without worrying about the house-of-cards effect and the coupling that come with a monolithic application. You get the best of both worlds: the agility and velocity of microservices, without paying the price, time, or risk of downtime of a migration.
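The idea of embedding the integration inside each microservice can be sketched in plain Java. This is a minimal, self-contained illustration of the pattern described above, not OpenLegacy's actual API: all class, method, and transaction names here are invented, and the legacy connector is stubbed so the example runs on its own.

```java
// Illustrative sketch: a microservice that embeds its own integration
// logic instead of routing calls through shared middleware.
// All names (AccountService, LegacyConnector, "ACCT-INQ") are
// hypothetical; they are not OpenLegacy's real classes.

import java.util.Map;

public class AccountService {

    // In a real deployment this connector would speak the legacy
    // protocol (e.g. invoke a CICS transaction or an RPG program);
    // here it is a functional interface so it can be stubbed.
    interface LegacyConnector {
        Map<String, String> call(String transaction, Map<String, String> input);
    }

    private final LegacyConnector mainframe;

    public AccountService(LegacyConnector mainframe) {
        this.mainframe = mainframe;
    }

    // The legacy transaction is exposed as an ordinary Java method;
    // callers never see screens, copybooks, or wire formats.
    public String getAccountStatus(String accountId) {
        Map<String, String> out =
                mainframe.call("ACCT-INQ", Map.of("ACCT-ID", accountId));
        return out.getOrDefault("STATUS", "UNKNOWN");
    }

    public static void main(String[] args) {
        // Stub connector standing in for the real mainframe.
        AccountService svc = new AccountService((txn, in) ->
                Map.of("STATUS", "ACTIVE", "ACCT-ID", in.get("ACCT-ID")));
        System.out.println(svc.getAccountStatus("12345")); // prints ACTIVE
    }
}
```

Because the connector is injected, the whole unit (service plus integration) deploys together, which is the "distributed integration" point being made in the interview.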
What do developers, engineers, and architects need to know about OpenLegacy 4.2?
Our implementers are not legacy-skills people; they don't need to know COBOL, mainframes, AS/400, or any other kind of backend. They're going to love that the artifacts created as a deployable unit are 100% standard Java artifacts. To test, version, and do all those things, you just use whatever is available in the Java ecosystem. It's very easy for developers: the skills are transferable, and you can do everything in a standard way. We integrate into any kind of DevOps pipeline, and developers work with well-known Java frameworks like Spring, Spring Boot, and Spring Cloud. You don't have to be an expert on OpenLegacy; you just need a good Java foundation to do the most complex things. And you can do the simplest things without knowing Java at all.
Everything is standard Java code. We follow a code-first approach, so everything is based on the code; there is no additional metadata hidden somewhere. If you change the code, the graphical user interface reflects it, so if you are a coder, you can be very comfortable working this way. Essentially, you are building Java SDKs on top of legacy systems, bringing them all to a level of abstraction where you are just dealing with Java objects. You don't need to concern yourself with the integration because it was done automatically for you. This gives you great efficiency and standardizes your different backends into one place where you can do everything very easily. The approach, together with complete adherence to and compatibility with modern microservices architecture, gives you the flexibility to customize it for any runtime environment that supports Java.
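The "just dealing with Java objects" abstraction can be pictured as a plain Java class layered over a legacy record. The sketch below is an invented example of such an artifact, assuming a hypothetical fixed-width record layout (8-character customer id followed by a 20-character name); it is not real generated code.

```java
// Hypothetical sketch of a Java object over a legacy record, so
// application code works with typed getters instead of raw buffers.
// The fixed-width layout (bytes 0-7 id, bytes 8-27 name) is an
// assumption for illustration only.

public class CustomerRecord {
    private final String customerId;
    private final String name;

    public CustomerRecord(String customerId, String name) {
        this.customerId = customerId;
        this.name = name;
    }

    // Parse a fixed-width record as a mainframe program might return it.
    public static CustomerRecord fromFixedWidth(String raw) {
        return new CustomerRecord(
                raw.substring(0, 8).trim(),
                raw.substring(8, 28).trim());
    }

    public String getCustomerId() { return customerId; }
    public String getName()       { return name; }

    public static void main(String[] args) {
        // Build a sample 28-character record: 8-char id + padded name.
        String raw = "C0000042" + String.format("%-20s", "Jane Smith");
        CustomerRecord c = CustomerRecord.fromFixedWidth(raw);
        System.out.println(c.getCustomerId() + " / " + c.getName());
    }
}
```

Once the parsing lives in one generated class, the rest of the application never touches the record layout, which is the standardization the interview describes.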
Do you have a use case you would like to highlight?
An interesting use case is Fern Expo. They organize events and provide trade show services for large companies like Google, Samsung, and Lego. One of their challenges is managing logistics for thousands of exhibitors and trade shows, which requires tracking hundreds of thousands of exhibitor shipments in and out of each event.
To execute with excellence, Fern established many manual processes over their 110-year history. However, they faced challenges connecting modern devices to their legacy AS/400 system. They wanted to enable remote access to their legacy data via a mobile web app from the trade show floor.
Our platform helped them expose and extend the AS/400 applications as REST APIs, so exhibitors and event staff could see real-time shipment locations on their mobile devices from the event floor. They are now able to scan and record a shipment with a mobile app, photograph it, and automatically notify the exhibitor of the status of their freight by email or SMS.
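To make the "legacy lookup exposed as a REST API" step concrete, here is a minimal sketch using only the JDK's built-in HTTP server. The endpoint path, port, shipment ids, and the stubbed lookup are all illustrative assumptions; this is not Fern's or OpenLegacy's actual service.

```java
// Illustrative sketch: a legacy status lookup exposed as a REST
// endpoint with the JDK's com.sun.net.httpserver. The lookup is a
// stub standing in for a call into an AS/400 backend.

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.Map;

public class ShipmentApi {

    // Stand-in for the real AS/400 call; ids and statuses are invented.
    static String lookupStatus(String shipmentId) {
        return Map.of("SHP-1", "ON-FLOOR", "SHP-2", "IN-TRANSIT")
                  .getOrDefault(shipmentId, "UNKNOWN");
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        // GET /shipments?id=SHP-1 returns a small JSON status document.
        server.createContext("/shipments", exchange -> {
            String query = exchange.getRequestURI().getQuery(); // e.g. id=SHP-1
            String id = (query != null && query.startsWith("id="))
                    ? query.substring(3) : "";
            String body = String.format(
                    "{\"id\":\"%s\",\"status\":\"%s\"}", id, lookupStatus(id));
            byte[] bytes = body.getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, bytes.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(bytes);
            }
        });
        server.start();
    }
}
```

Once running, a mobile web app (or `curl http://localhost:8080/shipments?id=SHP-1`) can query shipment status from the event floor, which is the shape of the integration the use case describes.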
Everything was deployed in less than a month. The bulk of the time was spent defining parameters that make up the inputs and outputs from the AS/400 and training on the new system. Fern developers were able to create and implement the API in just one hour. Previously, this kind of project would have taken six months. This kind of efficiency is very important to them.
What does the future hold for OpenLegacy?
Looking forward, we're doubling down on making integration with the DevOps pipeline even easier. We will always provide an opinionated platform with best practices built in, but it will be completely customizable. You will have a pre-built DevOps pipeline process that you can customize however you want, so that deployment, testing, and versioning will be much easier and will fit your process.
Opinions expressed by DZone contributors are their own.