Building Salesforce Using Atlassian Bitbucket Pipelines

In this tutorial, we explore using Bitbucket Pipelines, Docker, and the Force Migration Tool to deploy Salesforce code, with Zone Leader John Vester.

By John Vester · Apr. 19, 17 · Tutorial

Earlier this year, I wrote an article about Atlassian introducing Pipelines to its Bitbucket Git-based repository. Back in December 2015, I wrote another article focused on using Atlassian Bamboo to deploy to Salesforce environments. This article provides a simple example of how to use Bitbucket Pipelines to deploy to Salesforce, leveraging what I presented in those two articles.

Setting Up Bitbucket

The first step is to make sure a repository exists within Atlassian Bitbucket, containing the Salesforce org Metadata to be deployed. While it is not possible to deploy every Metadata element using the Force Migration Tool, I still prefer to extract all the Metadata into the source repository. I feel better knowing that a copy of the design/data is stored and maintained outside the Salesforce ecosystem.
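
For completeness, the Metadata itself can be pulled into the repository using the Force Migration Tool's sf:retrieve task. The target below is only a minimal sketch (it is not part of the build.xml shown later) and assumes the same ant-salesforce.jar, the antlib:com.salesforce namespace declaration, and the sf.* credential properties used for deployment:

<!-- Hypothetical retrieve target: pulls the org Metadata described by -->
<!-- src/package.xml into the src folder.                              -->
<target name="retrieveAll">
  <sf:retrieve
      username="${sf.username}"
      password="${sf.password}"
      serverurl="${sf.serverUrl}"
      retrieveTarget="${basedir}/src"
      unpackaged="${basedir}/src/package.xml"
      maxPoll="${sfdc.maxPoll}"/>
</target>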

As a result, my src structure resembles the following screenshot:

[Screenshot: the src folder structure]

For this project, I am using the following .gitignore:

.project
.settings
.metadata
salesforce.schema
Referenced Packages
apex-scripts/log
config
localCopy

In addition to the standard package.xml in the src folder, I created a deployPackageCSCore.xml file, which is a subset of the package.xml (created by Salesforce/Force Migration Tool) and contains a listing of the elements being deployed in our environment. As developers introduce new objects or remove unneeded ones, the changes are reflected in the deployPackageCSCore.xml file.
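
To illustrate, a trimmed-down deployPackageCSCore.xml might look like the following; the metadata types and API version shown here are placeholders, not the actual contents of our file:

<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <!-- Hypothetical subset of the full package.xml: only these types deploy -->
    <types>
        <members>*</members>
        <name>ApexClass</name>
    </types>
    <types>
        <members>*</members>
        <name>ApexTrigger</name>
    </types>
    <types>
        <members>*</members>
        <name>CustomObject</name>
    </types>
    <version>38.0</version>
</Package>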

Setting Up Force Migration Tool

At the root level of the project, a build folder exists and contains the following files:

[Screenshot: the build folder contents]

Below is a summary of each of the files shown above:

  • ant-salesforce.jar - Force Migration Tool jar.

  • build.properties - contains static property data (maxPoll, pollWaitMillis, and general Slack information); a sample appears after this list.

  • build.xml - logic used by Apache Ant (detailed below).
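
A sample build.properties is shown below. The property names match those referenced in build.xml; the values here are placeholders for illustration only:

# Force Migration Tool polling configuration
sfdc.maxPoll = 200
sfdc.pollWaitMillis = 10000

# Slack integration (placeholder values)
slack.token = xoxp-your-slack-api-token
slack.channel = #salesforce-deployments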

The build.xml executes the following logic:

  1. Paint banner.

  2. Paint deployment information.

  3. Copy Source (optional) and rename deployPackageCSCore.xml to package.xml.

  4. Delete Unmigrateable Files.

  5. Deploy Code.

  6. Post to Slack.

An example build.xml is shown below:

<project name="Retrieve and Deploy SFDC metadata" default="testDeployOnly" basedir=".." xmlns:sf="antlib:com.salesforce">
    <taskdef uri="antlib:com.salesforce"
        resource="com/salesforce/antlib.xml"
        classpath="${basedir}/build/ant-salesforce.jar"/>

    <property file="${basedir}/build/build.properties"/>
    <property name="slackMessage" value="Salesforce%20Deployment%20Complete!%0A%20-%20commit%20=%20${commit.id}%20%0A%20-%20branch%20=%20${branch.id}%0A%20-%20host%20=%20${sf.serverUrl}%0A%20-%20username%20=%20${sf.username}%0A%20-%20testLevel%20=%20*${sf.testLevel}*%0A%20-%20checkOnly%20=%20*${sf.checkOnly}*"/>

    <target name="deploy" depends="deleteUnmigrateableFiles">
      <sf:deploy
          username="${sf.username}" 
          password="${sf.password}" 
          serverurl="${sf.serverUrl}"
          testLevel="${sf.testLevel}"
          checkOnly="${sf.checkOnly}"
          logType="Debugonly"
          deployRoot="${basedir}/localCopy"
          pollWaitMillis="${sfdc.pollWaitMillis}"
          maxPoll="${sfdc.maxPoll}" 
          allowMissingFiles="false"
          autoUpdatePackage="false"
          rollbackOnError="true"
          ignoreWarnings="true"/>
        <antcall target="postToSlack"/>
    </target>

    <target name="postToSlack">
      <exec executable="curl">
        <arg line="-d 'token=${slack.token}&amp;channel=${slack.channel}&amp;text=${slackMessage}&amp;pretty=1' https://slack.com/api/chat.postMessage"/>
      </exec>
    </target>

    <target name="deleteUnmigrateableFiles" depends="copySource">
        <echo level="info">Removing files that cannot be migrated:</echo>
        <delete dir="${basedir}/localCopy/flows" />
        <delete dir="${basedir}/localCopy/layouts" />
        <delete dir="${basedir}/localCopy/permissionSets" />
        <delete dir="${basedir}/localCopy/profiles" />
        <delete dir="${basedir}/localCopy/quickActions" />
        <delete dir="${basedir}/localCopy/settings" />
        <delete dir="${basedir}/localCopy/workflows" />

        <echo level="info">Cleaning up build.xml for references that cannot be migrated:</echo>
        <echo level="info">  - unfiled$public</echo>
        <replaceregexp
            match="^        &lt;members&gt;unfiled\$public&lt;/members&gt;$"
            replace=""
            flags="gm"
            byline="false">
            <fileset
            dir="${basedir}/localCopy"
            includes="**/package.xml"
            />
        </replaceregexp>
    </target>

    <target name="copySource" depends="deployInformation">
      <echo level="info">Initializing localCopy folder</echo>
      <delete dir="${basedir}/localCopy" />
      <mkdir dir="${basedir}/localCopy" />
      <echo level="info">Copying src to localCopy folder</echo>
      <copy todir="${basedir}/localCopy" >  
        <fileset dir="${basedir}/src" includes="**"/>  
      </copy> 
      <echo level="info">Deleting standard package.xml</echo>
      <delete file="${basedir}/localCopy/package.xml" />
      <echo level="info">Renaming ${sf.deployFile} to package.xml</echo>
      <move file="${basedir}/localCopy/${sf.deployFile}" tofile="${basedir}/localCopy/package.xml"/>
    </target>

    <target name="deployInformation" depends="banner">
      <echo level="info"> Information for this deployment:</echo>
      <echo level="info"> - Target Host Name = ${sf.serverUrl}</echo>
      <echo level="info"> - Login ID = ${sf.username}</echo>
      <echo level="info"> - Deployment File = ${sf.deployFile}</echo>
      <echo level="info"> - Test Only Mode = ${sf.checkOnly}</echo>
      <echo level="info"> - Apex Test Level = ${sf.testLevel}</echo>
    </target>

    <target name="banner">
      <echo level="info">╔═════════════════════════════════════════════════════╗</echo>
      <echo level="info">║    ____ _                  ____  _       _          ║</echo>
      <echo level="info">║   / ___| | ___  __ _ _ __ / ___|| | __ _| |_ ___    ║</echo>
      <echo level="info">║  | |   | |/ _ \/ _` | '_ \\___ \| |/ _` | __/ _ \   ║</echo>
      <echo level="info">║  | |___| |  __/ (_| | | | |___) | | (_| | ||  __/   ║</echo>
      <echo level="info">║   \____|_|\___|\__,_|_| |_|____/|_|\__,_|\__\___|   ║</echo>
      <echo level="info">║                                                     ║</echo>
      <echo level="info">║    Salesforce Continuous Integration Deployment     ║</echo>
      <echo level="info">║       created by CleanSlate Technology Group        ║</echo>
      <echo level="info">╚═════════════════════════════════════════════════════╝</echo>
    </target>
</project>

The Copy Source step is optional, but it makes things a lot easier when running the Force Migration Tool locally. It makes a copy of the entire source directory so that the files can be manipulated as needed for the deployment to Salesforce. This should not be an issue, since the localCopy folder is in the .gitignore file and Pipeline deployments will be pushed via a Docker image, which is discarded after use.

With everything in place, it is possible to execute the deployment by running the following command:

ant -buildfile build/build.xml deploy \
  -Dsf.username=enter.username@here.com \
  -Dsf.password=passwordHerePlusSecurityToken \
  -Dsf.serverUrl=https://test.salesforce.com \
  -Dsf.checkOnly=true \
  -Dsf.testLevel=NoTestRun \
  -Dsf.deployFile=deployPackageCSCore.xml

Setting Up Pipelines

With the Force Migration Tool process working, the Pipeline process can automate the deployment. From the Settings | Pipelines | Settings screen within Bitbucket, make sure Enable Pipelines is set to "enabled."

The next step is to configure environment variables used by the pipeline processing. From the Settings | Pipelines | Environment variables screen within Bitbucket, I configured the following items:

My personal sandbox:

  • SFDC_JV_CHECK_ONLY - true/false to specify if code will be deployed (true = check only is performed and code is not deployed).

  • SFDC_JV_HOST_NAME - host address (https://test.salesforce.com).

  • SFDC_JV_USER_ID - username used to log in to Salesforce.

  • SFDC_JV_PASSWORD_TOKEN - password + security token (configured as a Secured variable within Bitbucket).

  • SFDC_JV_TEST_LEVEL - NoTestRun/RunLocalTests/RunAllTestsInOrg.

The process was repeated for SFDC_QA (QA sandbox) and SFDC_PROD (Production) environment variables.

Finally, configure and commit the bitbucket-pipelines.yml file. A very simple example is displayed below:

# ----- 
image: johnjvester/docker-salesforce

pipelines:
  default:
    - step:
        script:
          - echo "Running Default Script against QA environment (Mock Deploy, No Tests Executed)"
          - ant -buildfile build/build.xml deploy -Dsf.deployFile=deployPackageCSCore.xml -Dsf.checkOnly=true -Dsf.testLevel=NoTestRun -Dsf.username=$SFDC_QA_USER_ID -Dsf.password=$SFDC_QA_PASSWORD_TOKEN -Dsf.serverUrl=$SFDC_QA_HOST_NAME -Dcommit.id=$BITBUCKET_COMMIT -Dbranch.id=$BITBUCKET_BRANCH

  custom: # Pipelines that are triggered manually
    sandbox-jv: # John Vester's Sandbox
      - step: 
          script: 
            - echo "Running JV Sandbox"
            - ant -buildfile build/build.xml deploy -Dsf.deployFile=deployPackageCSCore.xml -Dsf.checkOnly=$SFDC_JV_CHECK_ONLY -Dsf.testLevel=$SFDC_JV_TEST_LEVEL -Dsf.username=$SFDC_JV_USER_ID -Dsf.password=$SFDC_JV_PASSWORD_TOKEN -Dsf.serverUrl=$SFDC_JV_HOST_NAME -Dcommit.id=$BITBUCKET_COMMIT -Dbranch.id=$BITBUCKET_BRANCH

    qa: # QA Full Copy Sandbox
      - step: 
          script: 
            - echo "Running QA (showing banner only and posting to Slack)"
            - ant -buildfile build/build.xml banner
            - ant -buildfile build/build.xml postToSlack -Dsf.checkOnly=true -Dsf.testLevel=NoTestRun -Dsf.username=$SFDC_QA_USER_ID -Dsf.serverUrl=$SFDC_QA_HOST_NAME -Dcommit.id=$BITBUCKET_COMMIT -Dbranch.id=$BITBUCKET_BRANCH

    prod: # Production
      - step: 
          script: 
            - echo "Running Production (showing banner only)"
            - ant -buildfile build/build.xml banner 

This file leverages the johnjvester/docker-salesforce Docker image that I have uploaded to Docker Hub (feel free to use it) and completes the following tasks when commits are made to Bitbucket:

  1. Use/download the johnjvester/docker-salesforce image.

  2. Execute the default pipeline:

    1. Echo text to the console.

    2. Run the deploy target, using environment variables for the following parameters:

      1. Username.

      2. Password.

      3. Server URL.

The Bitbucket commit hash and branch are included in the processing as well, for use in the Slack message.

The file also includes three custom pipelines, which can be run on demand. In my example, there is one for my personal sandbox, one for our QA environment, and one for Production. These environments could be updated automatically, but we prefer to control when they are updated.

Again, this is just a simple example Pipeline file.

Running the Pipeline

With Bitbucket Pipelines in place, a merge into the master branch yields the following flow:

  1. The banner is displayed, along with migration information.

  2. Deployment information is displayed.

  3. The Copy Source processing is executed, which also renames deployPackageCSCore.xml to package.xml.

  4. Files that cannot be migrated are removed from the localCopy folder.

  5. The deployment is performed.

  6. The results are posted to Slack.

  7. The build completes successfully.

Additionally, the Bitbucket commit itself exposes a Run pipeline action, which allows the custom deployments to be executed on demand.

Conclusion

Using a simple Docker image that I uploaded to Docker Hub and a working Force Migration Tool configuration, Bitbucket Pipelines can be introduced without much effort. As a result, it is possible to bring CI/CD to your Salesforce environment while keeping all of the deployment information with the source code - where it truly belongs.

Have a really great day!
