Issues Affecting the Integration of Applications and Data
The scale and complexity of the data and the applications
To gather insights on the state of application and data integration, we spoke with 18 executives from 15 companies who are involved in the integration of applications and data.
Here’s who we talked to:
Shawn Ryan, V.P. Marketing Digital as a Service, Axway | Kurt Collins, Director of Technology Evangelism and Partnership, Built.io | Thomas Hooker, V.P. of Marketing, CollabNet | Piyush Mehta, CEO, Data Dynamics | Daniel Graves, VP of Product Management, Delphix | Samer Fallouh, V.P. of Engineering, and Andrew Turner, Senior Solutions Engineer, Dialexa | Andrew Leigh, V.P. of Marketing and Alliances, Jitterbit | Trevor Hellebuyck, CTO, Metalogix | Mike Stowe, Developer Relations Manager, MuleSoft | Zeev Avidan, V.P. Product Management, Open Legacy | Sean Bowen, CEO, Gordon McKinney, Senior Solution Architect, Ross Garrett, Product Marketing, Push Technology | Joan Wrabetz, CTO, Quali | Razi Sharir, V.P. of Products, Robin Systems | Girish Pancha, CEO, StreamSets | Bob Brodie, CTO, SUMOHeavy |
And here’s what they told us when we asked them, "What are the most common issues you see affecting the integration of applications and data?"
- It’s easy to integrate a system with your own APIs and applications. APIs now belong to the application teams; they used to belong to the integration team. Integration teams are learning the value of letting application developers own their own APIs. This saves time and is the correct way to build APIs.
- Bridging the gap between legacy systems and the open standards of newer platforms. A single enterprise typically runs many different platforms, and public, standards-based APIs have to integrate with old versions of enterprise platforms with closed APIs (e.g., Documentum or IBM) where no REST endpoints are available.
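One common way to bridge that gap is a thin REST facade that translates HTTP-style requests into calls against the closed vendor SDK. The sketch below is illustrative only: `LegacyClient` and its `fetch_document` method are hypothetical stand-ins for a proprietary connector, not any specific vendor's API.

```python
import json

class LegacyClient:
    """Hypothetical stand-in for a closed vendor SDK with no REST endpoints."""
    def fetch_document(self, doc_id):
        # A real client would speak the proprietary protocol here.
        return {"id": doc_id, "title": "Quarterly Report", "format": "pdf"}

class RestFacade:
    """Maps REST-style GET paths onto legacy SDK calls."""
    def __init__(self, client):
        self.client = client

    def get(self, path):
        parts = path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "documents":
            # GET /documents/<id> -> legacy fetch_document(<id>)
            return 200, json.dumps(self.client.fetch_document(parts[1]))
        return 404, json.dumps({"error": "not found"})

facade = RestFacade(LegacyClient())
status, body = facade.get("/documents/42")
```

The facade lets newer, standards-based consumers stay unaware of the closed platform behind it, which is the usual first step before migrating off the legacy system entirely.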
- Integration used to be a 1:1 connection that allowed for unidirectional or bidirectional synchronization. We allow for more flexibility, enabling clients to customize their data before moving it and to take action on it. You used to have to keep data in sync; now it’s about taking action based on what you’ve learned from the data. The platform takes real-time information and inputs into account. All integrations can run on a clock, but most now run when an action takes place (e.g., a lead comes in and an automated response is sent).
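The shift from clock-driven to action-driven integration can be sketched as a simple event bus: handlers fire the moment an event such as a new lead arrives, rather than on a polling schedule. The names here (`EventBus`, `on`, `emit`, the `lead.created` event) are illustrative assumptions, not any particular product's API.

```python
from collections import defaultdict

class EventBus:
    """Minimal event dispatcher: handlers run when an action occurs."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def on(self, event, handler):
        # Register a handler for a named event.
        self.handlers[event].append(handler)

    def emit(self, event, payload):
        # Run every registered handler immediately; no polling loop.
        return [handler(payload) for handler in self.handlers[event]]

bus = EventBus()
bus.on("lead.created", lambda lead: f"auto-reply sent to {lead['email']}")
responses = bus.emit("lead.created", {"email": "prospect@example.com"})
```

A scheduled sync would instead wake up every N minutes and diff two systems; the event-driven form reacts in real time and only does work when something actually happened.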
- 1) Moving from VMs into containers: the issues are more complex, but you avoid VM sprawl and maintain performance while keeping the benefits of virtualization. 2) The ability to save capital and operating expense with more compact, compressed data. 3) Data running in the appropriate place and at the appropriate speed to achieve the predicted performance.
- 1) Loss of data in the pipeline due to an incorrect, or less than ideal, system setup. Think through your setup so that you’re not losing data. 2) Not knowing what to do with the data. Show clients how to use the data they already have to segment their audience and deliver more relevant, valuable information to their customers.
- The lack of collaboration across the organization – identifying business objectives and reaching agreement about how the objectives will be achieved, by whom, by when, and with what tools.
- Customers buy apps believing they will meet 100% of their requirements. In reality, integration and connectivity make or break the value of an application. How frequently the connectivity runs and how quickly you can build the next integration affect performance. Two or three days to build an integration is now the norm.
- 1) Lack of, or outdated, API documentation. 2) Changes to the API or backend made without informing the consuming apps. 3) Building platforms without thinking about their future and versioning, or over-architecting solutions that are difficult to maintain.
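One common remedy for the versioning complaint is to keep old API versions live alongside new ones, so a backend change never silently breaks a consuming app. A minimal sketch of URL-based versioning, with illustrative route names and fields (not any specific framework's API):

```python
def get_user_v1(user_id):
    # Original contract: a flat "name" field. Old clients depend on this.
    return {"id": user_id, "name": "Ada Lovelace"}

def get_user_v2(user_id):
    # Newer contract splits the name; v1 stays untouched for old clients.
    return {"id": user_id, "first_name": "Ada", "last_name": "Lovelace"}

ROUTES = {
    ("v1", "users"): get_user_v1,
    ("v2", "users"): get_user_v2,
}

def dispatch(path):
    # e.g. "/v1/users/7" -> ("v1", "users") handler called with "7"
    _, version, resource, item_id = path.split("/")
    return ROUTES[(version, resource)](item_id)

old = dispatch("/v1/users/7")
new = dispatch("/v2/users/7")
```

The design choice is that a version bump is additive: publishing v2 never edits v1's handler, so consumers migrate on their own schedule.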
- Big pharma and media companies are using SQL on relational databases and SQL on Hadoop. There is real-time pressure to combine traditional batch and real-time data integration. Today you have to get more from the data than “set and forget.” You must continually iterate in order to get insights into what’s happening in real time.
- Reliance on vendor services. Companies pay an “arm and a leg” for siloed services and get locked in with the provider. It’s a challenge to grow and manage with traditional scripting.
- A lot of companies claim they can provide all of these services with scale and security; however, as the application scales, latency becomes an issue and degrades the user experience. You must understand the data to be efficient. Google’s changes have shown the importance of data. Clients want solutions that hit acceptance levels, but protocols have their limitations. Latency and security each involve many elements.
- Culture change, which comes faster to start-ups and slower to enterprises well entrenched in ESBs. We see APIs used as a lightweight ESB that can move faster. Data integration is hard, and speed and technology may not align with the culture. Companies are using API strategies for security and integration. Developers need APIs for what they are creating. More mature organizations may be able to span both groups: an API management plan caters to both the architects and the security team.
- The inability to test real production data sources due to the riskiness of doing so.
- The most common issue businesses face today is their inability to integrate applications and data quickly enough. IT is being crushed by project demands from other parts of the business, which require applications and data to be always available and always connected. IT leaders can no longer meet the demand by just running faster on the hamster wheel; instead, they need to promote reuse within the organization by building application networks. Leveraging APIs to connect applications, data, and devices allows any digital asset to be quickly discovered and easily reused by consumers on the application network.
- 1) It is difficult to access production data without disrupting ongoing operations or introducing downtime. 2) Production dataset extracts may need to contend with shrinking batch windows. 3) High-touch, manual processes, and coordination between multiple teams required to copy and move data. 4) Sensitive data (names, addresses, payment information, healthcare records) need to be secured. 5) Components may be out-of-scope for dev/test: not yet completed, still evolving, controlled by a third-party, available in a limited capacity or only at certain times, etc.
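The fourth point above, securing sensitive data in extracts handed to dev/test, is often handled by masking fields before the copy leaves production. A minimal sketch, with illustrative field names and a deterministic-hash masking rule chosen as an assumption (real tooling offers many masking strategies):

```python
import hashlib

# Hypothetical set of sensitive field names to scrub from extracts.
SENSITIVE = {"name", "address", "payment_card"}

def mask_record(record):
    """Replace sensitive values with short deterministic hashes."""
    masked = {}
    for key, value in record.items():
        if key in SENSITIVE:
            # Deterministic: the same input always masks to the same token,
            # so joins across masked tables keep working in dev/test.
            masked[key] = hashlib.sha256(str(value).encode()).hexdigest()[:12]
        else:
            masked[key] = value
    return masked

prod_row = {"id": 1, "name": "Jane Doe", "address": "1 Main St", "balance": 250}
test_row = mask_record(prod_row)
```

Non-sensitive fields pass through untouched, so the masked dataset still has realistic shape and referential integrity for testing without exposing customer data.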
What issues do you see affecting the integration and application of data?
Opinions expressed by DZone contributors are their own.