How are OData, “Good Enough,” and Microservices Reshaping Design?
Integration, long the bane of existence for both over-extended architects and impatient business stakeholders, is undergoing true disruption. As the trends of Cloud, Hybrid, Mobile, Internet of Things (IoT), Data Science, and Microservices rage on, fully leveraging business data has never been more critical. While the current picture may seem grim, there are a number of technologies, design patterns, and strategies that are accelerating modern integration approaches and finally delivering on the promises of loosely-coupled, heterogeneous connectivity across applications.
Conventional Approaches Cannot Keep Up
Enterprises have grappled with heterogeneous application portfolios for decades, focusing on service-orientation, API-first, and similar methods to weave together business data, processes, and composite applications. Yet although Service-Oriented Architecture (SOA) has been a key part of IT integration strategy for nearly 20 years, only roughly 20% of SOA projects can actually be described as "successful".
The challenges to large-scale SOA success are well documented.
In addition, enterprises continue to consume more cloud-based applications (including Software as a Service - SaaS), leading to additional integrations that need to be reconciled with on-premises business data. Also, the emphasis on mobile applications continues to skyrocket, with IDC predicting a four-fold increase by 2016. Add to this the coming tsunami of IoT data (the “trillion sensor economy” by 2025), and it becomes clear that conventional integration strategies simply cannot support the evolving needs of the modern enterprise.
Three Ways out of the Desert: Tools, Process, Design
As architects examine their current and projected set of integration challenges, they realize that a number of approaches can both meet short-term needs and position them to reduce some of the complexity described above. The mechanisms described here are intended to provoke conversation and provide viable options. We offer three approaches to chart a course "out of the desert":
1. Tools - "play it where it lies" via OData : Depending on the integration patterns being implemented, the approach often involves data movement, which brings a number of considerations: security, performance, impact on operational systems, data translation, and so on. One approach avoids movement altogether; it is based on the Open Data Protocol (OData) - see below.
OData Integration (source: Salesforce)
OData allows disparate systems to access each other’s data by reference, instead of actually moving the data. This provides a few remarkable benefits, including:
reduced impact on operational systems: since data is accessed by reference, only record pointers are retrieved, drastically reducing the load on the system being queried. Using a traditional paging pattern can further reduce the load (by as much as 90%).
simplification in mapping: depending on the OData implementation, integrations are typically created using graphical mapping tools between source and destination. This leads to a reduction in the required skill set (i.e., business analysts can typically create the mappings without the need for integration architects).
data abstraction: OData-based services provide the ability to “surface” external data objects in foreign applications as near “first class” citizens. This abstraction enables integration to fade into the background and instead allows IT and the business to focus on the business process (the “what”) vs. the integration plumbing (the “how”).
Additionally, all major IT vendors now ship OData providers for their data sources and applications, and traditional SOA and integration vendors alike are rolling out OData support for their products.
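To make the paging pattern above concrete: an OData request is nothing more than query options on a URL, so accessing data "by reference" amounts to asking the provider for a filtered, trimmed page of records. The sketch below (the service root and entity names are hypothetical, not tied to any particular product) builds such a request in Python:

```python
from urllib.parse import quote

def odata_query(service_root, entity_set, filter_expr=None,
                select=None, top=None, skip=None):
    """Build an OData query URL that accesses data by reference."""
    params = []
    if filter_expr is not None:
        params.append("$filter=" + quote(filter_expr))  # server-side filter
    if select is not None:
        params.append("$select=" + ",".join(select))    # trim the payload
    if top is not None:
        params.append("$top=" + str(top))               # page size
    if skip is not None:
        params.append("$skip=" + str(skip))             # paging offset
    query = "&".join(params)
    return f"{service_root}/{entity_set}" + (f"?{query}" if query else "")

# e.g. page through APAC accounts, 50 records at a time:
url = odata_query("https://example.com/odata", "Accounts",
                  filter_expr="Region eq 'APAC'",
                  select=["Id", "Name"], top=50, skip=0)
```

Because `$select` and `$top` push filtering and paging to the provider, the querying application never pulls the full data set, which is precisely where the reduced load on operational systems comes from.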
2. Process - use “Good Enough” Data that improves over time: Most enterprises have undertaken Master Data Management (MDM) programs, with a historical failure rate of greater than 75%. As traditional Systems of Record are blended with Systems of Aggregation (marketing/campaigns, IoT/sensor data, social network data, etc.), one effective way of proceeding is to adopt a “good enough” data strategy. Consider these steps:
- Map key subject areas to producers and consumers - while most enterprises have done portions of this mapping, adding the dimension of the emerging portfolio (i.e., mobile, IoT, data science) can shed light on key gaps in the data strategy, especially as it will evolve over the planning period.
- Identify the “good enough” source(s), and integrate accordingly - instead of waiting for the complete MDM strategy, tooling, process, and implementation to be “done”, modern architectures instead allow for faster (but admittedly imperfect) integration strategies. For instance, it may be deemed acceptable for Asia Pacific to source financial data from the Japan regional headquarters, while North America’s integrations could key off the source in the New York regional office.
The key to this design approach is to have checkpoints by which the “better” (and finally, the “best”) data sources can be swapped into view. Over time, more consumers will then be accessing data from a consistent source. As integrations become more lightweight and “disposable”, sources will be able to be “upgraded” with a minimum of impact on consumers.
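One lightweight way to realize the checkpoint-and-swap idea is a small source registry that consumers resolve through, so a "better" source can replace a "good enough" one without touching the consumers. The sketch below is illustrative only; the subject-area and source names are invented:

```python
# Sketch of a "good enough" source registry: consumers resolve a subject
# area through the registry, so a better source can be swapped in at a
# checkpoint with no change to the consumers themselves.

class SourceRegistry:
    def __init__(self):
        self._sources = {}  # subject area -> callable returning the data

    def register(self, subject, fetcher):
        """Bind the current 'good enough' source for a subject area."""
        self._sources[subject] = fetcher

    def upgrade(self, subject, fetcher):
        """Checkpoint: swap in a better source; consumers are unaffected."""
        self._sources[subject] = fetcher

    def fetch(self, subject):
        return self._sources[subject]()

registry = SourceRegistry()
# APAC initially sources financials from the Japan regional headquarters:
registry.register("financials/apac", lambda: {"source": "japan-hq"})
# ...later, at a planned checkpoint, the consolidated source replaces it:
registry.upgrade("financials/apac", lambda: {"source": "global-mdm"})
```

Consumers always call `registry.fetch("financials/apac")`; only the binding behind the subject area changes at each checkpoint, which is what keeps the integrations "disposable".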
3. Design - Plan on Eventual Consistency when adopting Microservices: The Microservices design pattern is sweeping through the enterprise architecture world, and it has some profound impacts on go-forward application design. Martin Fowler lays out a great set of tradeoffs that architects need to take into consideration:
Strong Module Boundaries: Microservices reinforce modular structure, which is particularly important for larger teams.
Distribution: Distributed systems are harder to program, since remote calls are slow and are always at risk of failure.
Independent Deployment: Simple services are easier to deploy, and since they are autonomous, are less likely to cause system failures when they go wrong.
Eventual Consistency: Maintaining strong consistency is extremely difficult for a distributed system, which means everyone has to manage eventual consistency.
Technology Diversity: With Microservices you can mix multiple languages, development frameworks and data-storage technologies.
Operational Complexity: You need a mature operations team to manage lots of services, which are being redeployed regularly.
As the classical definition of microservices includes team-focused, independent deployment, the “price” of eventual consistency needs to be taken into account, along with changes to the integration design. This implies a few key design considerations:
"Anti-Corruption" Layer between bounded contexts - a crucial area of microservices design is the notion of "Domain-Driven Design", which allows separate "contexts" to be designed and developed independently (a key difference between microservices and traditional SOA). While traditional integration approaches typically employ a single canonical view of data (or in-process transformations), microservices can include an "Anti-Corruption" layer (see diagram below), which makes them more robust and resilient in the face of changes in neighboring contexts.
Subject-area "service levels" - what are the acceptable limits of data consistency for each subject area? If the use case is such that eventual consistency cannot be tolerated (i.e., multiple microservices must access the most up-to-date data entity), then the granularity of the microservice may be too fine and might need to be expanded.
Re-playable event store - some microservices-based architectures are adopting a re-playable event store as a “steel thread” of integration. In this manner, microservices are less about point-to-point integrations and more about “point-in-time” integrations.
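The "point-in-time" idea behind a re-playable event store can be sketched in a few lines. The in-memory log below is a toy stand-in (a real deployment would use something like Kafka or a dedicated event store); the event names are invented for illustration:

```python
# Minimal sketch of a re-playable event store: an append-only log that any
# consumer can replay from its own offset, turning point-to-point feeds
# into "point-in-time" integrations.
import itertools

class EventStore:
    def __init__(self):
        self._log = []                 # append-only list of (offset, event)
        self._seq = itertools.count()  # monotonically increasing offsets

    def append(self, event):
        offset = next(self._seq)
        self._log.append((offset, event))
        return offset

    def replay(self, from_offset=0):
        """Yield every event at or after from_offset, in order."""
        for offset, event in self._log:
            if offset >= from_offset:
                yield offset, event

store = EventStore()
store.append({"type": "OrderPlaced", "id": 1})
store.append({"type": "OrderShipped", "id": 1})
# A late-joining microservice rebuilds its state by replaying from offset 0:
events = [e["type"] for _, e in store.replay(0)]
```

Because each consumer tracks its own replay offset, a new microservice can be added (or an existing one rebuilt) without asking any producer to re-send data, which is what makes the event log a "steel thread" of integration.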
Anti-corruption Layer (source: http://www.markhneedham.com/blog/2009/07/07/domain-driven-design-anti-corruption-layer/)
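The anti-corruption layer pictured above boils down to a translation boundary: all knowledge of a neighboring context's data shape is confined to one adapter, so upstream change is absorbed in a single place. A minimal sketch, with the CRM field names invented purely for illustration:

```python
# Sketch of an anti-corruption layer between two bounded contexts.
# The ACL translates the partner context's representation into this
# context's own model, so changes upstream do not leak everywhere.
from dataclasses import dataclass

@dataclass
class Customer:
    """This bounded context's own model of a customer."""
    customer_id: str
    display_name: str

class CrmAntiCorruptionLayer:
    """Translates records from a (hypothetical) CRM context into Customer."""

    def to_customer(self, crm_record: dict) -> Customer:
        # All knowledge of the CRM's field names lives in this one method.
        return Customer(
            customer_id=str(crm_record["AccountNumber"]),
            display_name=f'{crm_record["FirstName"]} {crm_record["LastName"]}',
        )

acl = CrmAntiCorruptionLayer()
customer = acl.to_customer(
    {"AccountNumber": 42, "FirstName": "Ada", "LastName": "Lovelace"})
```

If the CRM context later renames `AccountNumber`, only `to_customer` changes; every consumer of `Customer` is untouched, which is exactly the resilience the pattern is after.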
Clearly, architects contemplating an adoption of the microservices model need to consider these impacts to their designs (and associated DevOps strategies).
The disruption of the application portfolio is well underway, and architects who do not consider modern integration will lag their peers and fail to meet the accelerating demands of business users. Technology trends such as OData are finally catching up to the needs of “composable” and event-driven models. The need for rapid prototyping is forcing “good enough” data into the view of business users. Finally, understanding the tradeoffs of design choices shows the value (and limitations) of new programming models, including microservices.