Giving Legacy Applications New Life
Application instrumentation can give legacy applications new functionality by monitoring, modifying, and automating the user interface. Unlike test automation or screen scraping, app instrumentation does not use automation to replay user input for testing or batch processing. Instead, it adds new behaviors to existing controls, such as data validation, field-level permissions, service invocation, usage-statistics gathering, and task automation. The approach is interesting because it draws on ideas from aspect-oriented programming, such as bytecode instrumentation.
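The idea of adding a behavior to an existing control, rather than replaying input against it, can be sketched as follows. This is a hedged illustration: `TextControl`, `instrument_with_validation`, and the change-event API are made-up stand-ins, not a real instrumentation framework.

```python
# Hypothetical sketch: injecting a validation behavior into an existing
# control by subscribing to its change event. TextControl is a made-up
# stand-in for a legacy UI text field.

class TextControl:
    """Stand-in for a legacy text field that exposes a change event."""
    def __init__(self):
        self._handlers = []
        self.value = ""

    def on_change(self, handler):
        self._handlers.append(handler)

    def set_value(self, value):
        self.value = value
        for handler in self._handlers:
            handler(value)

def instrument_with_validation(control, validator):
    """Inject a new behavior (validation) into an existing control."""
    rejected = []
    def validate(value):
        if not validator(value):
            rejected.append(value)   # the injected behavior fires here
    control.on_change(validate)
    return rejected

field = TextControl()
rejected = instrument_with_validation(field, lambda v: v.isdigit())
field.set_value("abc")   # fails validation
field.set_value("42")    # passes
print(rejected)          # ['abc']
```

The key point is that the original control is untouched; the instrumentation layer only attaches new handlers to events the control already exposes.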
App instrumentation injects new code into an application at runtime that interacts directly with UI objects. It does not use graphics, accessibility, or testing interfaces. Instead of simulating user input, it uses the properties, methods, and events of an application's controls, which makes it more robust than many alternative approaches. It can intercept existing methods and modify their parameters or return values, and it can automate an application that is out of focus, minimized, or hidden.
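Intercepting a method to rewrite its parameters and return value can be sketched with a simple wrapper. `LegacyForm`, its `save()` method, and the `intercept` helper are hypothetical names chosen for illustration; real instrumentation tools do this at the binary or bytecode level rather than in Python.

```python
# Sketch of method interception: wrap an existing method so the
# instrumentation layer can rewrite arguments and return values.

class LegacyForm:
    """Stand-in for a legacy application object."""
    def save(self, text):
        return f"saved:{text}"

def intercept(obj, method_name, before=None, after=None):
    """Replace obj.method_name with a wrapper around the original."""
    original = getattr(obj, method_name)
    def wrapper(*args, **kwargs):
        if before:
            args, kwargs = before(args, kwargs)   # modify parameters
        result = original(*args, **kwargs)
        if after:
            result = after(result)                # modify return value
        return result
    setattr(obj, method_name, wrapper)

form = LegacyForm()
intercept(form, "save",
          before=lambda a, k: ((a[0].strip().upper(),), k),
          after=lambda r: r + ":audited")
print(form.save("  hello "))   # saved:HELLO:audited
```

Because the wrapper calls the original method, the application's existing logic keeps running; the interception only shapes what flows in and out of it.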
Application virtualization virtualizes only a portion of the operating system, unlike a virtual machine, which virtualizes the entire OS and the physical hardware. The file system and the registry are virtualized, but not the memory, graphics, or windowing systems. The virtualized application runs as a native Windows process and is indistinguishable from one started locally, but when it attempts to access a file or a registry key, the virtualization system intercepts the request and returns virtualized content. The application still runs locally, yet its components can be stored anywhere, which makes deployment and management much easier and allows system administrators to deploy virtualized applications using modern paradigms such as remote storage and on-demand streaming.
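The redirection at the heart of this can be illustrated with a toy pass-through: requests for managed paths are intercepted and served from a virtual store, while everything else reaches the real file system. The path and contents below are invented, and real virtualization platforms hook the OS file APIs rather than a Python `open` call.

```python
# Toy illustration of file-request redirection: intercepted paths are
# served from a virtual store, others pass through to the real file system.

import io

# Made-up virtual package contents.
VIRTUAL_FILES = {
    r"C:\LegacyApp\config.ini": "[settings]\nserver=streamed-host\n",
}

def open_virtualized(path, real_open=open):
    """Return virtual content when the path is managed, else open normally."""
    if path in VIRTUAL_FILES:
        return io.StringIO(VIRTUAL_FILES[path])  # intercepted request
    return real_open(path)                        # pass-through to real OS

handle = open_virtualized(r"C:\LegacyApp\config.ini")
print(handle.read())   # prints the virtualized INI content
```

The application code calls `open` as usual; it never knows whether the bytes came from the local disk or a streamed package.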
When used together, application instrumentation and virtualization could, for example, standardize desktops on a new architecture while integrating old applications written in a different language than the rest of the SOA. Developers first use application instrumentation technology to identify the legacy applications' controls. They can then consume the web service's WSDL, generate a proxy, link control events to web service methods, and display a results window. To integrate this solution into a website, the application instrumentation software can add an automation process that navigates the website and invokes the appropriate functions. Once the applications have been instrumented, developers only have to build a deployment package that contains XML descriptions of the application UIs and the SOA's automation assemblies.
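Wiring a control event to a web service through a generated proxy might look like the following sketch. `CustomerService` and `lookup()` are invented stand-ins for the kind of proxy class a WSDL tool would generate; the event list and results callback stand in for the instrumented control and results window.

```python
# Hypothetical sketch: bind an instrumented control's event to a
# WSDL-generated proxy call and route the response to a results display.

class CustomerService:
    """Stand-in for a proxy class generated from a web service's WSDL."""
    def lookup(self, customer_id):
        # A real proxy would issue a SOAP/HTTP call here.
        return {"id": customer_id, "status": "active"}

def bind_event_to_service(control_events, proxy, show_results):
    """Subscribe a service call plus a results display to a control event."""
    def on_submit(customer_id):
        result = proxy.lookup(customer_id)
        show_results(result)
    control_events.append(on_submit)

events, shown = [], []
bind_event_to_service(events, CustomerService(), shown.append)
for handler in events:       # fired when the legacy control is used
    handler("C-42")
print(shown[0])              # {'id': 'C-42', 'status': 'active'}
```

The legacy UI stays the trigger; the proxy and the results window are the new behavior layered on top of it.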
After completing the application instrumentation stage, developers must sequence all of the applications by running the applications' installers (along with the instrumentation installer) while the sequencer monitors the OS. Then developers update the deployment package to point to the correct virtual file locations and add it to the virtual application package. Once all of these components are in the virtual package, it can be deployed to the virtual server and streamed to the re-imaged, standardized desktops. As long as legacy applications remain a large part of the enterprise, application virtualization and instrumentation will be powerful tools for bringing modern deployment and integration paradigms to them.
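The XML UI descriptions carried in the deployment package might resemble the small descriptor built below. Every element and attribute name here is invented for illustration; the source does not specify the actual schema.

```python
# Sketch of the kind of XML UI description a deployment package might
# carry. Element and attribute names are hypothetical.

import xml.etree.ElementTree as ET

app = ET.Element("application", name="LegacyOrderEntry")
window = ET.SubElement(app, "window", id="MainForm")
ET.SubElement(window, "control", id="txtCustomer", type="TextBox")
ET.SubElement(window, "control", id="btnLookup", type="Button",
              behavior="InvokeCustomerLookup")  # links control to an automation assembly

descriptor = ET.tostring(app, encoding="unicode")
print(descriptor)
```

At deployment time, the instrumentation runtime would read such a descriptor to know which controls to attach which behaviors to.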