Parasoft's New Testing Framework Tackles the SOA Lifecycle
Parasoft recently unveiled the latest release of its SOA Quality Solution, an end-to-end, automated testing framework for SOA applications. The framework allows organizations to create, execute and maintain end-to-end test suites that can be initiated from rich user interfaces, through the logic in the message layer, through the implementation component, to the database (or mainframe) and back, validating the entire business process, according to Wayne Ariola, Parasoft VP of Strategy.
Some of the product's features include:
- End-to-End Testing - validation of complex transactions that may extend through web interfaces, backend services, ESBs, and databases.
- Testing Support for Web UIs, RIAs and Ajax - guidance on developing robust, noiseless regression tests for rich and highly dynamic browser-based applications.
- Team Workflow and Task Management - ability to establish a sustainable workflow between different team members, including Web UI developers, QA, and business analysts.
- Increased Visibility into the ESB - visualization of intra-process events triggered by tests, real-time transaction monitoring, and test generation from transaction messages.
- Improved Performance Testing - via a centrally managed load test configuration and execution environment.
- Platform Awareness - built-in vendor-specific awareness of TIBCO, Progress Sonic, Oracle/BEA, IBM, Software AG webMethods, and other platforms.
DZone had a chance to follow up with Rami Jaamour, SOA Solutions Manager at Parasoft, to get his thoughts on some common challenges associated with application testing, as well as on some testing best practices.
DZone: What are some of the most common pain points associated with integration testing, regression testing and security testing?
Rami: With regard to integration testing, not all systems are available or accessible during this type of testing. The systems you want to validate have dependencies that are out of your control, which is why you need the ability to create emulated versions of services to fill those gaps.
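The idea of filling such a gap with an emulated service can be sketched in plain Python. The inventory service and its interface below are hypothetical stand-ins for illustration, not part of Parasoft's product:

```python
# Sketch: emulating an unavailable dependency during integration testing.
# "InventoryService" and its interface are hypothetical, illustrative names.

class RealInventoryService:
    """The real backend -- assumed unreachable from the test environment."""
    def stock_level(self, sku: str) -> int:
        raise ConnectionError("inventory backend not accessible from test lab")

class EmulatedInventoryService:
    """Emulated version of the service, returning canned responses."""
    def __init__(self, canned: dict):
        self.canned = canned
    def stock_level(self, sku: str) -> int:
        return self.canned.get(sku, 0)

def can_fulfil_order(inventory, sku: str, quantity: int) -> bool:
    """System under test: its logic depends on the inventory service."""
    return inventory.stock_level(sku) >= quantity

# During the test run, the emulated service stands in for the real one:
emulated = EmulatedInventoryService({"ABC-123": 5})
print(can_fulfil_order(emulated, "ABC-123", 3))   # True
print(can_fulfil_order(emulated, "ABC-123", 10))  # False
```

Because both services expose the same interface, the system under test cannot tell the difference, and the integration scenario stays executable even when the real backend is down.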
Moreover, it is difficult to pinpoint what went wrong in an integrated system due to the many components and layers involved. And when things seem to execute correctly, how do you make sure that all the involved systems and components did what they were supposed to do in the backend? For example, did the database record get updated correctly at the end of the transaction? This is why the testing solution needs a framework that gives the tester such deep system control and visibility.
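This kind of backend validation can be sketched with an in-memory SQLite database. The account schema and the transfer logic are illustrative assumptions, not taken from the product:

```python
# Sketch: after exercising a transaction, verify the backend state directly.
# The schema and transfer() logic are illustrative, not Parasoft APIs.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 50)])

def transfer(conn, src, dst, amount):
    """The business transaction under test."""
    with conn:  # commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?",
                     (amount, src))
        conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?",
                     (amount, dst))

transfer(conn, "alice", "bob", 30)

# Backend validation: did the database records get updated correctly?
balances = dict(conn.execute("SELECT id, balance FROM accounts"))
assert balances == {"alice": 70, "bob": 80}
print(balances)
```

The point is that the test does not stop at the response message: it queries the backend afterwards and asserts on the resulting state.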
When there is too much complexity and too many parts involved you need a way to manage that to ensure proper validation and compliance with system quality requirements.
Systems under test change as development progresses, or between releases of the system, which makes it difficult to define validation criteria on system behavior at the right level: criteria that catch regressions that are genuine defects, but that are not sensitive to (and do not raise false positives on) changes that are not defects. For example, consider a scenario that validates a series of Web UI clicks, one of which validates account information for a customer. The test should not fail if the account information shifted slightly within the page, but it must fail if the data came back incorrectly -- unless, of course, verifying location is part of the intended validation. A similar situation applies to services, XML, and protocols: dynamic messages and values need to be handled properly when validating message contents.
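One common way to handle dynamic values is to mask volatile fields before comparing a response to its expected contents. The field names below are hypothetical; the technique is what matters:

```python
# Sketch: validating message contents while tolerating dynamic values.
# Masking volatile fields (timestamps, generated IDs) avoids false
# positives, while genuine data regressions still fail the check.
VOLATILE_FIELDS = {"timestamp", "request_id"}

def stable_view(message: dict) -> dict:
    """Drop fields that legitimately change between runs."""
    return {k: v for k, v in message.items() if k not in VOLATILE_FIELDS}

expected = {"account": "ABC-123", "balance": 250}

response_v1 = {"account": "ABC-123", "balance": 250,
               "timestamp": "2009-01-01T10:00:00Z", "request_id": "r-1"}
response_v2 = {"account": "ABC-123", "balance": 999,   # a real data regression
               "timestamp": "2009-01-02T11:30:00Z", "request_id": "r-2"}

print(stable_view(response_v1) == expected)  # True: volatile changes ignored
print(stable_view(response_v2) == expected)  # False: wrong balance is caught
```

The same masking idea extends to XML payloads and UI assertions: validate the data that carries business meaning, not the incidental values around it.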
Maintaining the regression tests can also be challenging. You have a great suite of tests for version 1.0 of your system. Version 2.0 changes many things; do you have to go back and redo or change hundreds of test cases (the test definitions, the validation criteria, and so on)? The cost and effort of such regression test maintenance can become prohibitive unless you have a framework that minimizes these maintenance activities and a workflow that allows you to carry them out effectively.
With regard to security testing, the problem is that it is done too late, merely as an audit activity rather than as an inline process that prevents security issues from the beginning of the SDLC. That's why Parasoft not only assists with security verification through penetration testing and the execution of complex authentication, encryption, and access control test scenarios, but also uses static analysis and other strategies to prevent security vulnerabilities from the start of the SDLC.
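A minimal penetration-style check probes an entry point with hostile input and asserts that nothing leaks. The vulnerable and safe query builders below are illustrative sketches of the class of flaw involved, not code from any product:

```python
# Sketch: probing a data-access entry point with a SQL injection payload.
# Both lookup functions are illustrative; the unsafe one shows the kind
# of string-built query that static analysis can flag early in the SDLC.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_unsafe(conn, name):
    # String concatenation: hostile input becomes part of the SQL itself
    return conn.execute(
        "SELECT secret FROM users WHERE name = '" + name + "'").fetchall()

def lookup_safe(conn, name):
    # Parameterized query: hostile input is treated as data, not SQL
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
print(lookup_unsafe(conn, payload))  # leaks every secret in the table
print(lookup_safe(conn, payload))    # returns nothing
```

Running such checks continuously, rather than once at audit time, is the "inline process" the answer above describes.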
DZone: In which tier of the application should testing be done first?
Rami: It depends. Generally, unit tests come first (JUnit, NUnit, etc.) because the developer can use them while writing code components, following a Test Driven Development (TDD) paradigm whenever possible. When the functionality is at a higher level, i.e., when you are working on something that involves the application at runtime, then functional testing (at the service interface level, Web UI level, etc.) is needed to guide the development process. After that, the test artifacts at these different layers (especially the functional layer) can be extended and leveraged for more complete testing, typically by QA.
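A test-first unit test might look like the following sketch, here in Python's unittest (the JUnit/NUnit analogue); the discount function and its rules are invented for illustration:

```python
# Sketch: a unit test written first, TDD-style, against a logical unit
# of the application. apply_discount and its rules are illustrative.
import unittest

def apply_discount(price: float, percent: float) -> float:
    """The code component whose design the tests below drive out."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountTest(unittest.TestCase):
    # In TDD these cases are written before the implementation exists
    def test_typical_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_rejects_bad_percent(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("all unit tests passed:", result.wasSuccessful())
```

Functional and end-to-end tests then build on top of units like this one, exercising the same logic through the service or UI layer at runtime.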
It's important to make quality a continuous process throughout the SDLC, not just a QA phase. The result is a sustainable process that delivers greater productivity and significantly fewer software defects.
Development teams should prevent errors and continuously test logical units of the application. QA testers or business analysts can then focus on validating end-to-end business scenarios as an iterative process, not as a quality task at the end of a development cycle.
DZone: There is a plethora of free, open source testing tools available to developers today. What criteria should developers use when trying to select the best tool for the job?
Rami: This issue really boils down to assessing the ROI. Commercial solutions have upfront and maintenance costs, but there are also costs of continuing to do the work without the tool versus doing it with the tool. Certainly, open source software has no upfront costs, and in many cases that makes it attractive and the right choice for some organizations; but the ROI calculation can take a different turn once you consider those ongoing costs, which can make commercial solutions the right choice from a cost perspective. There is also another important factor, related to risk rather than foreseeable costs. You've got to ask yourself: if a critical need arises around the tool to carry out a business-critical task, such as in the SOA quality space -- the need to validate a certain system with a capability that is not directly supported by the tool out of the box -- can the tool be extended easily and quickly to achieve that? Or would I need to fulfill my needs manually, with significant effort and cost, by writing my own code or scripts? The mere fact that a tool is open source does not mean we can reasonably expect the end user to add or modify its functionality; it is theoretically possible, but quite impractical and prohibitive in practice. That is why it is important to consider the vendor's viability, its stability in the market, and its ability to execute on needs as they arise on demand, as a solution partner, not just a tool vendor.
DZone: What new features can we expect to see in upcoming versions of SOAtest?
Rami: We want to further assist users with workflows around creating and maintaining their testing assets, with correlation to their activities from a policy perspective. The goal is to expand our capabilities in how quality policies are defined and managed, and how they are enforced and tracked, with automation that includes both machine and human tasks. Furthermore, today we have special capabilities and integrations with IBM WebSphere products, Microsoft .NET, Oracle/BEA, TIBCO, Progress Sonic, HP, Software AG/webMethods, and others, so you can also expect us to continue expanding our support for specific target platform features, which enables our customers with capabilities that help them maximize value from the joint solution they have from Parasoft and its partners.