Surmounting Cloud Adoption Challenges With an iPaaS Model


Ready to move to a hybrid cloud environment but aren't sure about how to handle the integration challenges ahead? Check out this proposed iPaaS adoption model.


Cloud is no longer the next big thing; it is the big thing. Immersive cloud-based technologies have transformed the IT landscape, and organizations now understand that centralized computing and archaic architectures cannot drive the edge. The push for low latency and hyper-interactivity is driving organizations toward cloud-based technologies. Integration is being approached in new ways, yet organizations face the same old challenges that plagued earlier frameworks. This article covers those challenges and ways to circumvent them with an Integration Platform as a Service (iPaaS) framework.

Barriers to Cloud Adoption

A fair share of cloud migration challenges stem from weak integration between on-premise and cloud-based applications. These problems keep resurfacing when point-to-point networks and hairball coding are used to integrate applications. Developers write code and throw it over the wall to the testing team for validation, and whenever there is a change, they must repeat the entire process. This strategy does not scale for companies with thousands of applications.

Once upon a time, applications were developed with little focus on integration with other applications. They were primarily stove-piped applications that offered one or two endpoints. Those endpoints don't scale to an SOA infrastructure, and lengthy code must be redeveloped to accommodate even small changes. The following are some drawbacks of the hand-coded approach:

  • Growth in complexity leading to the formation of an integration hairball.
  • Lack of scalability for Service Oriented Architecture.
  • Increased Total Cost of Ownership (TCO).
  • Long delays in onboarding partner data.
  • Lack of dedicated features for firewall mediation, data management, governance, etc.
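To make the hairball problem concrete, here is a back-of-the-envelope sketch (illustrative only, not from the original article): hand-coded point-to-point integration needs one connection per application pair, while a hub-and-spoke model needs only one connection per application.

```python
# Illustrative comparison: integration links to build and maintain as the
# application count grows, point-to-point versus hub-and-spoke.

def point_to_point_links(n_apps: int) -> int:
    """Connections required when every application pair is wired directly."""
    return n_apps * (n_apps - 1) // 2

def hub_and_spoke_links(n_apps: int) -> int:
    """Connections required when each application talks only to a central hub."""
    return n_apps

for n in (10, 100, 1000):
    print(f"{n} apps: {point_to_point_links(n)} point-to-point links "
          f"vs {hub_and_spoke_links(n)} hub links")
# 1000 apps: 499500 point-to-point links vs 1000 hub links
```

The quadratic growth on the left is exactly why "thousands of applications" makes hand-coded integration untenable.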

Historically, data was brought to the Central Processing Unit (CPU) for processing. That approach broke down as massive amounts of data overwhelmed the processor. The response was to bring multiple processors to the data: each server processed individual elements of the data sets, an arrangement called parallel processing, in which many CPUs work on many data sets concurrently. Even in such an ecosystem, organizations face difficulties scaling processing to match the variety, volume, and velocity of their data. As a result, teams encountered treacherously difficult challenges while:

  • Deriving true value from their data.
  • Overcoming data silos that restrain capabilities.
  • Getting the skills required for marshaling and orchestrating data.
  • Linking the data with Big Data and other digital initiatives.
  • Safely exchanging data with business partners.
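The "bring the processors to the data" idea above can be sketched in a few lines: partition a data set and process the chunks concurrently. This is a minimal, self-contained illustration; real systems distribute the chunks across separate machines or processes, while threads are used here only to keep the example runnable.

```python
# Minimal sketch of parallel processing over partitioned data.
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Stand-in for real per-partition work (parsing, aggregation, etc.).
    return sum(chunk)

def parallel_total(data, n_partitions=4):
    """Split `data` into partitions, process them concurrently, combine results."""
    size = max(1, len(data) // n_partitions)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=n_partitions) as pool:
        return sum(pool.map(process_chunk, chunks))

print(parallel_total(list(range(1000))))  # prints 499500
```

The hard part, as the bullet list above notes, is not the mechanics of splitting work but marshaling, orchestrating, and safely exchanging the data being split.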

To surmount these challenges, experts recommend that organizations embrace a strategic integration approach based on industry best practices. Hybrid cloud adoption will continue to increase, and organizations that underestimate integration will fail to realize the benefits of cloud migration.

Conventional Approaches to Hybrid Integration

Initially, application leaders used ESB architecture to manage integration between applications and services. However, they realized that an ESB doesn't support scenarios where new and old applications run in parallel. It lacked the scalability to accommodate new technology initiatives like Salesforce, Workday, QuickBooks, etc. Frequent IT intervention was needed at every layer to develop, test, and deploy code.

Extract, Transform, and Load (ETL) was also used, but it suffers from many drawbacks of its own. It only allows pulling data from a structured data repository, and with the advent of Hadoop, that functionality has become dated: traditional row-and-column ETL doesn't support storing both structured and unstructured data.

Modern applications often use the Extensible Markup Language (XML) format for storing and exchanging data, while many legacy systems and machine interfaces rely on comma-separated values (CSV). The ESB/ETL approaches are overwhelmed when large volumes of data must be mapped from XML to CSV or CSV to XML, so it is unwise to use them for high-speed, high-volume projects.
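To show what the XML-to-CSV mapping burden looks like in practice, here is a toy sketch (the element and field names are hypothetical, not from any real SAP or Salesforce schema) using only the Python standard library:

```python
# Toy XML-to-CSV flattening: turn XML order records into comma-separated rows.
import csv
import io
import xml.etree.ElementTree as ET

XML_ORDERS = """
<orders>
  <order><id>1001</id><customer>ACME</customer><value>250.00</value></order>
  <order><id>1002</id><customer>Globex</customer><value>99.50</value></order>
</orders>
"""

def xml_to_csv(xml_text: str) -> str:
    """Flatten each <order> element into one CSV row."""
    root = ET.fromstring(xml_text)
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    writer.writerow(["id", "customer", "value"])
    for order in root.findall("order"):
        writer.writerow([order.findtext("id"),
                         order.findtext("customer"),
                         order.findtext("value")])
    return out.getvalue()

print(xml_to_csv(XML_ORDERS))
```

A dozen lines suffice for two fields and two records; multiply this by hundreds of document types and millions of records and the scaling problem described above becomes clear.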

Connecting applications through APIs is good, but it is not a silver bullet for more pervasive B2B integration needs. Specific APIs must be developed for specific integration needs, and strain builds when the same API is stretched across multiple integrations. Organizations also need to buy separate licenses for specific application integration scenarios.

Data has become a critical corporate asset; it is no longer secondary. Jurassic integration approaches don't help teams leverage data and engage customers. A greater degree of support from data integration tools is required to bring in data from mobile, Internet of Things (IoT), and social channels.

Data security has become an even bigger concern as new and stringent compliance regimes like the General Data Protection Regulation (GDPR) loom. Conventional approaches don't provide safe passage through dense API cloud networks; a reliable pathway is needed that secures data at every endpoint.

Manual Coding and Thorny Challenges

Thorny challenges await application leaders when on-premise systems must be integrated with cloud-based systems, and disruptions keep resurfacing in manual workflows. Let's walk through a real-world scenario in which manual steps are used to connect SAP with Salesforce, a powerful combination of two trusted platforms that can help organizations become more productive.

Salesforce offers the Data Loader to integrate with other applications. To download the application, go to Salesforce Setup → Data Management → Data Loader.

Lightning Connect is another approach to connect an SAP ERP Central Component (ECC) system with Salesforce, and it is the one discussed in this article. Here are the steps to integrate SAP with Salesforce using this tool.

Step 1: Configure the Lightning Connector to perform queries and connect to the SAP ERP system.

Step 2: Log in to the SAP Community Network

  • Select Join us to get a free login at http://scn.sap.com/
  • A pop-up appears.
  • Register so that you become part of the SAP Community Network.

Step 3: Get access to the Public SAP System

Step 4: Configure the Lightning Connector for SAP Access

  • Click Setup → Develop → External Data Sources.
  • Click New External Data Source.
  • Fill out the following information:


  • Label: SAP Data
  • Type: Lightning Connect OData 2.0
  • Connection Timeout:
  • High Data Volume:
  • Compress Requests:
  • Include in Salesforce Searches:
  • Custom Query Option:
  • Identity Type: Named Principal
  • Authentication Protocol: Password Authentication
  • Username: <The SAP-supplied user name from Step 2>
  • Password: <The SAP-supplied password from Step 2>

  • Click Save. A page appears: Connect to a Third-Party System or Content System.
  • Click Validate and Sync. The Validate External Data Source page appears.

Step 5: Synchronize tables from Salesforce to SAP (creating corresponding custom objects inside Salesforce)

  • Click Sync to allow Salesforce to read the SAP tables.
  • Click SOHeaders to see the custom object and its custom fields.

Objects that end in __x are the external objects created by the sync.

Step 6: Create an Apex Class to Retrieve SAP Data

  • Go to Setup → Develop → Apex Classes and click New.
  • Paste the following code into the code editor and then click Save:
public class SAPsalesordersExtension {
    // Reads the external object SOHeaders__x that was created by the OData sync.
    // Used to display sales order data for a specific customer number via
    // a Visualforce page.
    private final Account acct;
    List<SOHeaders__x> orderList;

    public SAPsalesordersExtension(ApexPages.StandardController stdController) {
        Account a = (Account)stdController.getRecord();
        List<Account> res = [SELECT Id, AccountNumber FROM Account WHERE Id = :a.Id LIMIT 1];
        this.acct = res.get(0);
    }

    public String getSAPCustomerNbr() {
        return acct.AccountNumber;
    }

    public List<SOHeaders__x> getOrderList() {
        if (null == this.orderList) {
            orderList = [SELECT ExternalId, CustomerId__c,
                                SalesOrg__c, DistChannel__c,
                                Division__c, DocumentDate__c,
                                DocumentType__c, OrderId__c,
                                OrderValue__c, Currency__c
                         FROM SOHeaders__x
                         WHERE CustomerId__c = :this.acct.AccountNumber
                         LIMIT 300];
        }
        return orderList;
    }
} // end of OData Apex class


Step 7: Create a Visualforce page to display results

  • Go to Setup → Develop → Pages and click New.
  • The Visualforce page editor appears.

Include the page information (label and name), plus a description such as: "A simple example of getting SAP data without any middleware!"

  • Paste this code in the Editor
<apex:page standardController="Account" extensions="SAPsalesordersExtension">
  <style type="text/css">
    td {
      border-bottom-color: rgb(224, 227, 229);
      border-bottom-style: solid;
      border-bottom-width: 1px;
      background-color: #FFFFFF;
      border-collapse: separate;
      padding-bottom: 4px;
      padding-left: 5px;
      padding-right: 2px;
      padding-top: 5px;
    }
    th {
      border-color: rgb(224, 227, 229);
      border-style: solid;
      border-width: 1px;
      background-color: #F7F7F7;
      border-collapse: separate;
      font-size: 11px;
      font-weight: bold;
      padding-bottom: 4px;
      padding-left: 5px;
      padding-right: 2px;
      padding-top: 5px;
    }
    table {
      border-color: rgb(224, 227, 229);
      border-style: solid;
      border-width: 1px;
    }
  </style>
  <apex:dataTable value="{!orderList}" var="order">
    <apex:column>
      <apex:facet name="header">Id</apex:facet>
      <apex:outputText>{!order.ExternalId}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Sales Org</apex:facet>
      <apex:outputText>{!order.SalesOrg__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Dist Channel</apex:facet>
      <apex:outputText>{!order.DistChannel__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Division</apex:facet>
      <apex:outputText>{!order.Division__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Customer Id</apex:facet>
      <apex:outputText>{!order.CustomerId__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Document Type</apex:facet>
      <apex:outputText>{!order.DocumentType__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Order Id</apex:facet>
      <apex:outputText>{!order.OrderId__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Order Value</apex:facet>
      <apex:outputText>{!order.OrderValue__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Currency</apex:facet>
      <apex:outputText>{!order.Currency__c}</apex:outputText>
    </apex:column>
    <apex:column>
      <apex:facet name="header">Date</apex:facet>
      <apex:outputText>{!order.DocumentDate__c}</apex:outputText>
    </apex:column>
  </apex:dataTable>
</apex:page>

  • Click Save.
  • The SAP OData page appears.

Step 8: Assign the Visualforce page

  • Go to Setup → Customize → Accounts → Page Layouts.
  • Click Edit to modify the page layout. The Account Layout page populates.
  • Drag the Section option to the spot where the new section belongs.
  • A pop-up appears to name the new section; click OK.
  • Locate the Visualforce Pages list and drag the newly created Visualforce page onto the Accounts page layout.
  • Click Save to save the updated Accounts page layout.

Step 9: Test Drive Data Movement

  • Click the Accounts tab.
  • Click New to create a new account.
  • Populate the following test data, for example:

  • Account Name: Belmont Cafe Inc
  • Account Number: <a customer number that exists in the SAP system>
  • Click the Save button on the Salesforce Account page.
  • Click the newly created account under the Recent Accounts section.

All real-time SAP data will be returned.

This method is cumbersome and cannot be reused for integrating Salesforce data with other applications. It only provides access to SAP data, and moving large chunks of data can be an uphill task.

iPaaS Approach for Becoming a Cloud-First Enterprise

Previously, it was only employees who generated and accumulated organizational data in computer systems. Now users and machines also generate data across social channels, forums, online commerce, etc. As a result, organizations today must deal with far larger volumes of accumulated data generated by their customer-facing platforms, monitoring systems, smart meters, and more. The next big challenge is unlocking this colossal amount of data, which holds massive hidden opportunities. Processing and refining such volumes during cloud migration is a next-level challenge that can be addressed through iPaaS.

IT experts consider iPaaS the best solution for defying the impact of disruption on security, data and analytics, communications, and endpoint technology. Smarter organizations are successfully overcoming their integration weaknesses and setting up future-ready IT architectures with this model. A next-generation iPaaS model delivers compelling business benefits:

  • 3,000x faster lead times.
  • 300x increase in deployments.
  • 30x faster recovery.
  • 10x lower failure rate.
  • 60x more reusability.

By simplifying application and data integration, iPaaS helps modernize IT architecture and set up a cloud-first enterprise. The framework enables even business users to integrate with a gamut of external and internal business applications and processes safely and cost-effectively. Organizations can access new workloads from new channels (social, analytics, cloud, and the Internet of Things) with a few clicks. This allows users to connect faster with partner networks, bringing in data faster, reducing total cost of ownership, and accelerating time to revenue.

Leveraging iPaaS: No Code Approach for Integrating a Cloud Application (Salesforce) with an ERP (SAP)

An advanced iPaaS framework automates integration between Salesforce and SAP. It provides a secure bridge connecting Salesforce with ERP and other applications, so ordinary business users can handle exceptions and replicate Salesforce data to other applications faster. Here are some steps for Salesforce API integration with SAP.

Adeptia iPaaS Interface to Connect Salesforce with SAP

Choose from a shared list of Salesforce connections

Step 1: Log into the Adeptia Integration Suite

Step 2: Choose from a variety of Salesforce to SAP Connectors

Create Connections

Step 3: Use Triggers and Actions.

  • Triggers: send data from Salesforce to the target system.

  • Actions: sync data from other business applications into Salesforce.

Visually Map Data Fields

Step 4: Map Data between Source and Target fields with drag-and-drop ease

Step 5: Click Save

The data between Target and Source Systems is mapped!
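Conceptually, what the drag-and-drop mapping in Step 4 produces is a declarative source-to-target field map that the platform applies to every record. The sketch below illustrates that idea; the Salesforce and SAP field names are purely illustrative and not taken from any specific connector.

```python
# Declarative field mapping: translate a source record's field names into the
# target system's field names, with no per-integration hand coding.
# Field names here are illustrative assumptions, not a real connector schema.
FIELD_MAP = {
    "AccountNumber": "KUNNR",   # Salesforce field -> SAP field
    "Name":          "NAME1",
    "BillingCity":   "ORT01",
}

def apply_mapping(record: dict, field_map: dict) -> dict:
    """Return a new record keyed by the target system's field names."""
    return {target: record.get(source) for source, target in field_map.items()}

lead = {"AccountNumber": "0000123", "Name": "Belmont Cafe Inc", "BillingCity": "Boston"}
print(apply_mapping(lead, FIELD_MAP))
```

The point of the iPaaS model is that this map is configuration, not code: changing an integration means editing the map, not redeveloping and retesting a pipeline.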

In this way, an iPaaS framework allows ordinary business users to connect with any business application. Users can update leads, contacts, and campaigns in Salesforce in simple, non-technical steps.

Guidelines for Adopting an iPaaS Framework

Adopting an iPaaS framework is a long-term decision, and organizations should evaluate their requirements before investing in the right platform.

It is high time organizations became more data-oriented instead of system-oriented. View iPaaS adoption as a process that streamlines IT integration step by step. Here are some guidelines for succeeding with iPaaS adoption.

Preparing a Proof of Concept (PoC): Data governance and management are as important as any other change-management initiative. Organizations need to demonstrate that their iPaaS approach delivers significant value and return on investment, i.e., improved forecasting, a greater degree of personalization, optimized resources, better-targeted marketing, etc. It is important to consider the continuum of data challenges arising from a wide variety of data sources. Establishing a strong data governance model will require continuous support from all departments at every layer, so bring all departments together to gather ideas and prepare a first-rate data governance framework.

Testing the Hypothesis: It is better to fail fast during testing than at later stages. The next step is putting candidate frameworks to the test and shortlisting the model that fits best. The frameworks should be evaluated on the basis of data readiness, feasibility analysis, usability, etc.

Validating the Roadmap: At this stage, the data governance model should be tested in the actual working environment. Application leaders must demonstrate at every stage that the model allows an organization to harness more value with less friction. Some of the factors to consider are capital expenditure, operational expenditure, total cost of ownership, and ROI.

Implementing the Data Governance Model: Then the model has to be handed over to the business teams and embedded into the organization. It is important to verify that the model delivers real-time benefits in a continuous operating environment.

Emerging business needs are driving more innovation in the iPaaS market, which grows denser with every newly added functionality. To achieve continued success from an investment, an organization should follow these guidelines and select an iPaaS model that promises a stable trajectory and lasting success.


Opinions expressed by DZone contributors are their own.
