Data Integration vs. API Integration vs. Systems of Engagement Level Integration
Originally authored by Dave West, Chief Product Officer at Tasktop Technologies and former analyst at Forrester Research.
The method wars are over and Agile has won. But just like any war, peacetime brings its own set of problems and issues. As Agile becomes the de facto way teams build software, its impact on surrounding processes and systems becomes widespread. For many organizations the result is Water-Scrum-Fall: planning and release management are run in the traditional manner while development follows a different approach. Even in organizations that are more Agile in nature, the resulting process, though labeled Agile, is a mix of processes and practices. The reality is that Agile will always be a mix of different practices. That leaves organizations with a serious problem: how do they operate in an Agile way, using feedback to respond to their environment, when the practices and tools that make up their complete process are not integrated?
For the majority of organizations, the tools used in their software delivery practice, or Application Lifecycle Management (ALM) process, do not integrate. And as software delivery speeds increase, Agile becomes the norm, and software becomes an assembly of components from other sources, consistent information about an application becomes crucial. Connecting the disciplines of business analysis, development, testing, and project management, along with third parties such as outsourced testing, cloud vendors, and open source, will not be a nice-to-have but a fundamental requirement for success. By analogy: would you expect your customer, billing, and delivery information on Amazon to sit in separate, disconnected systems, with a spreadsheet telling you when your order will arrive? As business processes increase in speed, automated, integrated practices become essential for business success.
Why is integration so bad?
For many years tool vendors promoted the idea that one tool could solve all your development process needs. Many touted their product suites as the ERP for software delivery, and they made great progress building integrated suites of products. Borland, Rational, and Microsoft drove the idea of one vendor providing everything a company needs to deliver software. But from the very first implementations, these "super suites" rarely provided everything a development team needed. With the rise of open source, developer-driven tooling and the increased complexity of languages, architectures, and platforms, it became a challenge for any one suite to support everything. Add to this technical complexity new development methods such as Agile and the increased value and importance of software to the business, and you have the perfect storm for heterogeneous tool stacks. But the motivation for tool diversity does not guarantee that those tools will integrate, for the following reasons:
• Tool vendors are not motivated to integrate. Not for any quasi-evil reason, but because they want to invest their finite resources in making their own products better. For many vendors, getting access to a competitor's tools also has license implications and requires building skills in those tools.
• Open source is slow outside of development. With the creation of Mylyn and the work of Eclipse, we have seen developers build strong integrations for commercial tools and open source products, but the disciplines of project management, testing, and requirements do not attract as much developer interest.
• Integration is not easy. Complex workflows, tool customization, and weak APIs make building robust, flexible, and efficient automated integration difficult. That level of difficulty makes the economic decision much more challenging.
But what about OSLC?
OSLC (Open Services for Lifecycle Collaboration) defines a set of standards for sharing data between software lifecycle tools. In practical terms, it defines an architecture and a series of API standards for accessing data held in other tools. OSLC follows the web principle of leaving information in the originating tool and providing mechanisms to surface that information in other tools; a minimal request sketch appears at the end of this section. This makes for a very robust model: it sidesteps the challenges of passing information around by delegating presentation and updates back to the originating tool. However, OSLC by its very nature requires tool vendors to build components that not only surface the information but also expose APIs that allow access from other tools. The primary problems with OSLC today include:
• Users like to use their own tool and have all the information present in that tool. Whether because of training, culture, or simple stubbornness, for many users, getting dialogues or browser pages from another tool inside their own makes for a confusing user experience.
• Users want not only to see information but also to update it. Nothing stops OSLC implementations from allowing edits as well as views, but in practice most implementations make updating information difficult and push it outside the tool being used, relying instead on external dialogue boxes and web pages.
• It requires vendors to buy into the OSLC vision. Very few of the primary ALM vendors have joined the group, and without strong vendor representation it is difficult to see how successful any standard will be.
• Limited support for reporting. One of the primary use cases for integration is the need to report across tools: for example, a traceability report that shows requirements and their associated tests, or a build report describing which requirements are covered. Reporting requires information not only to be accessed from the other tools but also aggregated, processed, and formatted for that context, a large overhead for any linked integration model.
• Legacy tools abound. OSLC works great for next-generation tools that are architected around its model and built on the web, but for many organizations changing tools is not easy, given large investments in heavily customized existing tools.
The bottom line: OSLC is a great idea and may, over time, become the standard for software lifecycle tool integration, but today the reality of existing tool implementations and vendor strategies makes OSLC interesting, yet not effective as a complete integration strategy.
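To make the linking model concrete, here is a minimal sketch, in Python, of how a consumer might fetch a change request from an OSLC provider as RDF over plain HTTP. The service URL, credentials, and resource shape are hypothetical placeholders, not a specific vendor's implementation; the point is simply that the record stays in the provider tool and is surfaced elsewhere on demand.

```python
# A minimal sketch of OSLC's linked-data style: the record lives in the
# provider tool; consumers fetch it as RDF over HTTP rather than copying it.
# Requires: pip install requests rdflib
import requests
from rdflib import Graph

# Hypothetical OSLC Change Management resource URL and credentials.
CHANGE_REQUEST_URL = "https://alm.example.com/oslc/cm/changeRequests/1234"

resp = requests.get(
    CHANGE_REQUEST_URL,
    headers={"Accept": "application/rdf+xml"},  # OSLC resources are served as RDF
    auth=("integration-user", "secret"),
    timeout=30,
)
resp.raise_for_status()

# Parse the RDF/XML and list whatever properties the provider chose to expose.
g = Graph()
g.parse(data=resp.text, format="xml")
for subject, predicate, obj in g:
    print(predicate, "->", obj)
```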
Strategies for tool integration
There are a variety of ways of slicing and dicing ALM integration, and each choice has fairly significant ramifications for an organization's long-term tool strategy. In many cases the wrong choice results in no gain, leaving tools poorly integrated with inflexible automation. Agile methods make the situation even worse, requiring that any integration be based on a model that supports change. By their very nature, process models for Agile organizations have to be flexible and support change; integration should follow the same principle.
Point-to-Point vs. Bus vs. Hub
Point-to-point integration is what is typically found in most companies today; it is how they deal with the integration challenge in their software delivery process flow. There is a need for better coordination or improved flow of information between two constituents in the software value chain, so a point-to-point integration is developed or purchased. In general, point-to-point solutions map the first tool to the second tool directly: there is no abstraction layer, and the key is mapping the two tools' data models to each other.
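As a rough illustration, the sketch below maps a record from one tracker's data model directly onto another's. Both payload shapes and the status mapping are hypothetical; note how every field and status pair must be wired by hand, which is exactly what makes point-to-point integrations brittle as either tool changes.

```python
# A sketch of point-to-point integration: tool A's record is mapped
# field-by-field onto tool B's data model, with no abstraction layer.
# Both payload shapes are illustrative, not real product schemas.

def tracker_issue_to_test_defect(issue: dict) -> dict:
    """Map a (hypothetical) tracker issue straight onto a test tool defect."""
    # Every status pair must be mapped by hand; an unmapped value breaks the sync.
    status_map = {"To Do": "New", "In Progress": "Open", "Done": "Closed"}
    return {
        "name": issue["summary"],
        "description": issue["description"],
        "state": status_map[issue["status"]],
        "severity": issue.get("priority", "Medium"),
    }

issue = {"summary": "Login fails", "description": "500 on submit", "status": "To Do"}
print(tracker_issue_to_test_defect(issue))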
Bus integration solutions, sometimes known as Enterprise Service Buses (ESBs), have gained significant prominence in numerous areas outside software development and delivery thanks to large companies like Tibco and open source technologies like Mule and Camel. The bus serves as a vehicle for communication between the various tools. Many buses are stateless; they act as an abstraction layer that facilitates synchronization and communication. The bus uses connectors or adapters in which all of the tool-specific translation intelligence resides.
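The sketch below shows the idea in miniature: each tool gets an adapter that translates between its native records and a shared canonical message, and the bus itself just routes. The tool names and fields are illustrative; this is not a real Mule or Camel configuration.

```python
# A sketch of the bus model: adapters hold all tool-specific translation
# intelligence; the bus is stateless and only forwards canonical messages.

class TrackerAdapter:
    """Translates a (hypothetical) tracker issue into the canonical shape."""
    def to_canonical(self, issue: dict) -> dict:
        return {"id": issue["key"], "title": issue["summary"], "state": issue["status"]}

class TestToolAdapter:
    """Translates canonical messages into a (hypothetical) test tool's shape."""
    def from_canonical(self, msg: dict) -> dict:
        return {"defect_id": msg["id"], "name": msg["title"], "phase": msg["state"]}

class Bus:
    """Stateless: holds no records, only delivers messages to subscribers."""
    def __init__(self):
        self.subscribers = []
    def publish(self, msg: dict):
        for deliver in self.subscribers:
            deliver(msg)

bus = Bus()
test_tool = TestToolAdapter()
bus.subscribers.append(lambda msg: print("test tool receives:", test_tool.from_canonical(msg)))
bus.publish(TrackerAdapter().to_canonical(
    {"key": "PROJ-1", "summary": "Login fails", "status": "Open"}))
```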
Hub solutions are generally aggregators. All tools connect to the hub, and the hub serves as the central data store receiving information from the various tools and then sending it along (sometimes in aggregated or summarized form) to other tools in the ecosystem that need it.
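In contrast to the stateless bus sketched above, a hub keeps its own copy of what every tool reports, which is what lets it answer aggregated, cross-tool questions that no single tool can. The field names below are illustrative.

```python
# A sketch of the hub model: all tools feed a central store, and the hub can
# hand out aggregated or summarized views to the tools that need them.
from collections import defaultdict

class Hub:
    def __init__(self):
        self.store = defaultdict(dict)  # the hub owns a central data store

    def receive(self, tool: str, record_id: str, record: dict):
        """Accept a record from one connected tool."""
        self.store[record_id][tool] = record

    def summary(self, record_id: str) -> dict:
        """Aggregate what every connected tool knows about one work item."""
        return dict(self.store[record_id])

hub = Hub()
hub.receive("requirements", "REQ-7", {"title": "Login", "status": "Approved"})
hub.receive("testing", "REQ-7", {"tests": 12, "passing": 9})
print(hub.summary("REQ-7"))  # a cross-tool view no single tool holds
```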
Data Integration vs. API Integration vs. Systems of Engagement Level Integration
Data integration means that the integration takes place at the database level. Products or custom solutions that integrate at this level do so either by copying the data from one ALM product's database to another or by having all the ALM products read from the same database.
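Here is a minimal sketch of the database-level approach, using in-memory SQLite tables as stand-ins for the two products' real databases. Real ALM schemas are vendor-specific, often undocumented, and prone to change between releases, which is the chief weakness of this style.

```python
# A sketch of database-level integration: rows are copied straight from one
# product's tables into another's. Table and column names are hypothetical.
import sqlite3

src = sqlite3.connect(":memory:")  # stand-in for tool A's database
dst = sqlite3.connect(":memory:")  # stand-in for tool B's database
src.execute("CREATE TABLE issues (id TEXT, summary TEXT, status TEXT)")
src.execute("INSERT INTO issues VALUES ('PROJ-1', 'Login fails', 'Open')")
dst.execute("CREATE TABLE defects (external_id TEXT PRIMARY KEY, name TEXT, state TEXT)")

# Copying rows directly bypasses each product's API and business rules,
# which is why schema changes between releases routinely break this approach.
for issue_id, summary, status in src.execute("SELECT id, summary, status FROM issues"):
    dst.execute(
        "INSERT OR REPLACE INTO defects (external_id, name, state) VALUES (?, ?, ?)",
        (issue_id, summary, status),
    )
dst.commit()
print(dst.execute("SELECT * FROM defects").fetchall())
```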
API integration focuses on workflows. A triggering event takes place that sets in motion a series of activities, whether those activities live in the ALM tool that raised the event or in a separate tool. Any number of things can fire the trigger (user activity, tool activity, and so on), but the integration itself takes place in the background. The trigger kicks off a workflow, and that workflow often drives activities and updates in other ALM tools.
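A minimal sketch of this trigger-and-workflow pattern follows: a webhook from one tool fires on a status change, and a background handler updates a second tool through its REST API. Both endpoints, the webhook route, and the payload fields are assumptions for illustration, not any particular vendor's API.

```python
# A sketch of API-level integration: a triggering event (a webhook) kicks off
# a background workflow that updates a second tool via its REST API.
# Requires: pip install flask requests
import requests
from flask import Flask, request

app = Flask(__name__)
TEST_TOOL_API = "https://testtool.example.com/api/defects"  # hypothetical endpoint

@app.route("/webhook/issue-updated", methods=["POST"])
def on_issue_updated():
    event = request.get_json()
    if event.get("status") == "Ready for Test":
        # The workflow: a status change in the tracker creates a work item in
        # the test management tool, with no user switching between tools.
        requests.post(
            TEST_TOOL_API,
            json={"external_id": event["key"], "name": event["summary"]},
            timeout=30,
        )
    return "", 204  # the integration runs silently in the background

if __name__ == "__main__":
    app.run(port=8080)
```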
Systems of engagement integration focuses on providing a consolidated view of information from multiple sources under "one pane of glass," that is, in a single user interface (UI). In some cases this is accomplished with "widgets," where part of one ALM tool's UI is displayed within another tool's UI; in other cases it is accomplished via a mash-up or a content portal.
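As a trivial sketch of the "one pane of glass" idea, the page below composes widgets from two hypothetical tools side by side. The data never leaves the source tools; only their UIs are brought together. The widget URLs are placeholders.

```python
# A sketch of systems-of-engagement integration: a tiny mash-up page that
# embeds two (hypothetical) tool widgets in one UI via iframes.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def dashboard():
    # Each iframe surfaces another tool's UI; nothing is copied or synchronized.
    return """
    <h1>Delivery Dashboard</h1>
    <iframe src="https://tracker.example.com/widget/burndown" width="45%" height="400"></iframe>
    <iframe src="https://testtool.example.com/widget/coverage" width="45%" height="400"></iframe>
    """

if __name__ == "__main__":
    app.run(port=8081)
```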
Conclusion
Agile requires the software delivery stack to be integrated. Manual processes, email, and informal collaboration technologies do not provide adequate speed or rigor, nor do they allow for effective reporting and history gathering. ALM is therefore a key business process that requires focus, attention, and investment if software is crucial to the business's success. Too many great business ideas are undermined by a lack of execution, and increasingly the systems that support that execution are software delivery systems. To improve your tool integration, and therefore your ALM, application development professionals should:
• Define an ALM architecture. Capture not just the tools being used, but also the process flows, information model, and supported technologies. Having an architecture group or individual own the ALM architecture allows for optimization.
• Understand the lifecycle of ALM. Map today's workflows between silos and determine what the optimal workflows would be if information flowed cleanly and in real time.
• Run root-cause analysis on software failures. Determine whether touch points between silos are a problem, and focus on fixing those touch points.
• Involve the PMO and audit. ALM is not just the responsibility of development; it also includes the PMO and audit groups. Make sure their reporting and traceability needs are captured in the associated data model.
Bio
Dave West is the Chief Product Officer at Tasktop. In this capacity, he engages with customers and partners to drive Tasktop’s product roadmap and market positioning. As a member of the company’s executive management team, he also is instrumental in building Tasktop into a transformative business that is driving major improvements in the software industry.
As one of the foremost industry experts on software development and deployment, West has helped advance many modern software development processes, including the Unified Process and Agile methods. He is a frequent keynote speaker at major industry conferences and a widely published author of articles and research reports, including his acclaimed book, Head First Object-Oriented Analysis and Design, which helped define new software modeling and application development processes.
He led the development of the Rational Unified Process (RUP) for IBM/Rational. After IBM/Rational, West returned to consulting and managed Ivar Jacobson Consulting for North America. For the past four years he served as vice president and research director at Forrester Research, where he worked with leading IT organizations and solution providers to define, drive, and advance Agile-based methodology and tool breakthroughs in the enterprise.
If you have questions about this article, you can ask Dave on Twitter: @davidjwest