DevOps Pipeline Managing PCF App and Resources

This tutorial shows how to set up an automated pipeline to reduce the use of memory/storage for your PCF application, thus reducing your DevOps team's bill.

By Rajesh Bhojwani · Updated Jul. 25, 18 · Tutorial

There are many tools available for doing DevOps with PCF, and automating the deployment of artifacts to PCF is easy; many articles have already been published about it. So you may be asking: what different aspects is this article going to cover?

In my current project, I have observed that developers keep deploying applications to PCF with little control, so resources pile up and lead to a huge bill for the DevOps team that manages the platform. After analyzing the issue, I found that teams build applications and deploy them to a PCF test environment, but those apps then sit idle 80% of the time. This is a huge waste of a test environment, as IaaS charges are based on the consumption of memory and storage.

To address this waste, I have come up with a DevOps process that not only deploys the application to PCF but also automates the provisioning and de-provisioning of the dependencies around it. This ensures that all resources are used optimally and never sit idle: create the resources when you need them and delete them as soon as the work is finished. The solution below first creates the org/space in PCF, then creates the dependent backing services, deploys the application, tests it with automation, cleans up the resources after testing is complete, and finally deletes the org/space itself. No environment is left sitting idle and adding to your bill.

For this pipeline, I have used Bamboo, but it can be implemented with any other CI/CD tool, like Jenkins, GoCD, etc.

Prerequisites

  1. A Bamboo pipeline setup

  2. A Spring Boot application

  3. PCF CLI and Maven plugins for Bamboo

  4. A basic understanding of Bamboo pipelines

Stage 1 — Create Build and Analysis

This first stage checks out the code and integrates it with SonarQube. The SonarQube dashboard will show the application analysis results based on the available widgets.

  1. First, check out the code from Git.

  2. The next two steps are for copying the build and SonarQube scripts and retrieving the Git user and branch details for running the build.

  3. The third step is the Maven build. I have disabled the Gradle task, as my application uses a Maven POM for the build.

  4. The last step runs the SonarQube scan (a sketch of the equivalent Maven commands follows this list).
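
For reference, the build and scan steps boil down to a couple of Maven commands. This is only a minimal sketch, assuming the SonarQube Scanner for Maven plugin is available to the build; the server URL and token below are placeholders.

# Build and run the unit tests
mvn clean verify

# Run the SonarQube analysis (URL and token are placeholders)
mvn sonar:sonar -Dsonar.host.url=https://sonarqube.example.com -Dsonar.login=$SONAR_TOKEN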

Stage 2 — Secure Code Scanning

This step is pretty standard, and you can use one of the many tools available, like Checkmarx, Coverity, SourceClear, etc. These tools do static code scanning from a security point of view and generate log reports.

Stage 3 — Deploy Artifact to Repository (Nexus)

This stage pushes the build artifact (a JAR or WAR file) to an artifact repository, such as Nexus or JFrog Artifactory.
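
If Maven is doing the publishing, the upload can be scripted as below. This is only a sketch: the repository URL, repository ID, and artifact coordinates are placeholders, and the repository ID is assumed to match a <server> entry with credentials in settings.xml.

# Publish the build artifact to Nexus (all values below are placeholders)
mvn deploy:deploy-file \
  -Durl=https://nexus.example.com/repository/releases \
  -DrepositoryId=nexus-releases \
  -Dfile=target/demo-app-1.0.0.jar \
  -DgroupId=com.example \
  -DartifactId=demo-app \
  -Dversion=1.0.0 \
  -Dpackaging=jar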

Stage 4 — Create the PCF Environment and Deploy the App

This is the most important part of this article. This step is going to create the PCF environment and then deploy the app.

  1. Copy the manifest file from the source code and make any changes through scripts, as required.

  2. Log into PCF using the PCF CLI plugin or a bash script.

  3. Create the org/space where the application and its backing services will be deployed, then target the new org/space.

  4. Create a service instance for each required backing service using the cf CLI command create-service.

  5. Push the application artifact downloaded from the repository to PCF. The manifest file (a minimal sketch follows this list) takes care of the service binding before the app starts.

  6. Log out of PCF.
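
For context, here is a minimal manifest sketch. The app name, memory size, and service instance names are hypothetical; your own manifest would list the backing services created in step 4.

# manifest.yml (illustrative only; names and sizes are placeholders)
applications:
- name: demo-app
  path: target/demo-app-1.0.0.jar
  memory: 1G
  instances: 1
  services:
  - demo-mysql       # backing service instances from step 4
  - demo-rabbitmq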

All the above steps can be implemented in two ways.

  1. Write a shell script and keep it in the source code repository. This script can be imported into a Bamboo task and executed. 

  2. Use the Bamboo PCF CLI plugin, which lets you create a task for each command to log in, create services, deploy the app, and so on.

I have used a mix of both approaches to showcase them (the disabled tasks are for the second approach). A sketch of the scripted approach follows.
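
As a rough sketch of the scripted approach, the shell script for this stage might look like the following. The API endpoint, org/space names, service offerings, and plans are all placeholders, and the credentials are assumed to be injected as pipeline variables.

#!/usr/bin/env bash
set -euo pipefail

# Placeholders; in practice these come from Bamboo plan variables
CF_API="https://api.sys.example.com"
ORG="demo-org"
SPACE="demo-space"

# 1. Authenticate non-interactively
cf api "$CF_API"
cf auth "$CF_USER" "$CF_PASSWORD"

# 2. Create and target the org/space for this run
cf create-org "$ORG"
cf create-space "$SPACE" -o "$ORG"
cf target -o "$ORG" -s "$SPACE"

# 3. Create the backing services (offerings and plans are examples;
#    asynchronous services may need a wait/poll loop before the push)
cf create-service p.mysql db-small demo-mysql
cf create-service p.rabbitmq single-node demo-rabbitmq

# 4. Push the artifact downloaded from the repository; the manifest binds the services
cf push demo-app -f manifest.yml -p target/demo-app-1.0.0.jar

# 5. Log out
cf logout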

Now that we have configured and provisioned everything the application needs to run, it will be easy to de-provision it all when the job is complete.

Stage 5 — Run Automated Tests

This stage is also key. Unless testing is automated, you need the app up and running for manual testing, which leaves it sitting idle most of the time. So, automate as much of the testing as possible to reduce idle time.
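
What the tests look like depends on the app. As a minimal sketch, a smoke test against the route of the freshly pushed app could gate the pipeline; the route below is a placeholder, and the health endpoint assumes Spring Boot Actuator is on the classpath.

#!/usr/bin/env bash
set -euo pipefail

# Placeholder route assigned to the app by the push in Stage 4
APP_URL="https://demo-app.apps.example.com"

# Fail the build if the health endpoint does not answer with HTTP 200
STATUS=$(curl -s -o /dev/null -w "%{http_code}" "$APP_URL/actuator/health")
if [ "$STATUS" -ne 200 ]; then
  echo "Smoke test failed: HTTP $STATUS from $APP_URL/actuator/health"
  exit 1
fi
echo "Smoke test passed"

# A fuller automated suite (for example, Maven Failsafe integration tests
# pointed at the same route) would run here as well.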

Stage 6 — Delete the Resources and App

Once testing is complete, the app and all of its resources can be de-provisioned.

Again, this can be done either with a script or with a separate task for each command; a sketch of the script approach follows.
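
A teardown script for this stage is essentially the Stage 4 script in reverse. The names below are the same placeholders used when the environment was created.

#!/usr/bin/env bash
set -euo pipefail

ORG="demo-org"
SPACE="demo-space"

cf target -o "$ORG" -s "$SPACE"

# Delete the app, then its backing service instances (-f skips the confirmation prompt)
cf delete demo-app -f
cf delete-service demo-mysql -f
cf delete-service demo-rabbitmq -f

# Finally, remove the space and the org so nothing is left running
cf delete-space "$SPACE" -o "$ORG" -f
cf delete-org "$ORG" -f

cf logout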

If you compare PCF before and after running this pipeline, you won't see any new space, app, or services left behind, yet you have still fulfilled your purpose of using PCF to deploy and test the app.

Miscellaneous Use Case

The above strategy works very well for a dev environment, where developers keep churning through a lot of resources. Other environments might not fit this pattern and may need a different strategy. Let me explain that as well.

Let's take the example of a UAT environment, where developers push the app and users do the manual testing (and no, don't argue that this should also be automated; there is always something the user wants to see and test for themselves before approving it for production). In that scenario, you need to keep the app up and running for a certain period, so you need a pipeline that runs just the Stage 6 cleanup. Keep that pipeline on hand to do the job in an automated way rather than cleaning up manually; a parameterized sketch follows.
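
As a minimal sketch of such a standalone cleanup pipeline, a single parameterized script can reclaim whatever org/space it is pointed at. The bamboo_-prefixed variables are hypothetical plan variables, and the script assumes the pipeline has already authenticated with cf api and cf auth.

#!/usr/bin/env bash
set -euo pipefail

# Hypothetical Bamboo plan variables identifying the environment to reclaim
ORG="${bamboo_target_org:?set bamboo_target_org}"
SPACE="${bamboo_target_space:?set bamboo_target_space}"

# Deleting the space removes its apps, service instances, and routes in one go
cf delete-space "$SPACE" -o "$ORG" -f
cf logout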

That's all for this article. I hope you find it useful for minimizing your bill.

Please share your ideas on how to minimize resource waste and cost on the PCF platform in the comments.


Opinions expressed by DZone contributors are their own.
