
Use Shared Packages From an Azure DevOps Feed in a Maven Project

We run through a tutorial on how to create Azure DevOps Artifacts, connect a Maven project to the Artifacts feed, and build and push changes to the feed.

By Sreekumar C and Vidya Subramanyam · May 11, 2021 · Tutorial

Introduction

While building multiple apps, you will come across reusable code. This common code can be packaged, versioned, and hosted so that it can be reused within the organization. 

Azure Artifacts is a service in the Azure DevOps toolset that allows organizations to create, host, and share packages securely.

In this post, we will cover how to create a library and share it securely within the organization.

The following are the key points we'll discuss:

  • Create and build a pipeline to publish the package to an Azure Artifacts feed.
  • Connect an application to the feed to consume the library.
  • Build and push changes to the artifact and consume the newer version from your application.

The demo library and application are built with Scala and use a Maven feed.

Setting Up a Private Feed

Let us start by creating an organizational feed. Azure Artifacts supports Maven, npm, NuGet, Python packages, and more.

In this demo, we will be creating a private Maven feed.

Follow the steps below.

1. Log into the Azure DevOps portal and make sure that you have an organization and a project.

Select the Artifacts menu and click Create feed.

2. Enter a name for the new feed, set the visibility, and select Organization as the scope. Click Create.

Integrate the Artifacts Feed With Build Pipeline

1. Create a Maven project and check it in to the Azure DevOps repo.

(Sample Scala Maven project)

Scala

package org.util.spark

import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.regexp_replace

object DateUtils extends Serializable {
  // Strip all non-alphanumeric characters from a column value
  def cleanColumn(inputCol: Column): Column = {
    regexp_replace(inputCol, "[^0-9A-Za-z]", "")
  }
}

2. From Artifacts, select the feed we created in the previous step and click Connect to feed:

3. Select Maven from the menu and update both the <repositories> and <distributionManagement> sections of the project pom.xml:
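The Connect to feed dialog generates XML you can paste into pom.xml. It looks roughly like the following sketch; `my-org` and `my-feed` are placeholder names, so copy the exact snippet (with your organization, feed name, and feed URL) from your own dialog:

```xml
<repositories>
  <repository>
    <id>my-org-my-feed</id>
    <url>https://pkgs.dev.azure.com/my-org/_packaging/my-feed/maven/v1</url>
    <releases>
      <enabled>true</enabled>
    </releases>
    <snapshots>
      <enabled>true</enabled>
    </snapshots>
  </repository>
</repositories>

<distributionManagement>
  <repository>
    <id>my-org-my-feed</id>
    <url>https://pkgs.dev.azure.com/my-org/_packaging/my-feed/maven/v1</url>
  </repository>
</distributionManagement>
```

The same `<id>` value is used later in settings.xml to match credentials to this repository, so keep it consistent.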

4. Create a build pipeline and add the following line in the azure-pipelines.yml file:

YAML

mavenAuthenticateFeed: true
5. Run the build pipeline to publish the package to the feed you created.

Connect to the Feed and Consume the Package on Your Local Machine

1. Create a personal access token (PAT) to access the Azure DevOps service from the local machine.

2. Select the PAT screen as indicated below.

3. Create a personal access token with read and write access for packaging.

4. Maven pulls credentials from settings.xml ("%USERPROFILE%/.m2/settings.xml" on Windows) to connect to the feed, so add the tags below to settings.xml, replacing the password with the token generated above.
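A sketch of the settings.xml entry, assuming the repository `<id>` used in pom.xml is `my-org-my-feed` (a placeholder — it must match your pom.xml exactly). The username is typically your organization name, and the password is the PAT:

```xml
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <servers>
    <server>
      <!-- Must match the <id> of the repository in pom.xml -->
      <id>my-org-my-feed</id>
      <username>my-org</username>
      <!-- Paste the personal access token generated above -->
      <password>PERSONAL_ACCESS_TOKEN</password>
    </server>
  </servers>
</settings>
```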

5. Create a Maven project to test the feed. 

(Sample Scala Maven project)

Scala

package org.example

import org.apache.spark.sql.SparkSession
import org.util.spark.DateUtils._

object Main extends App {
  val spark = SparkSession.builder().appName("DevOpstest").master("local").getOrCreate()

  import spark.implicits._

  /* Create a sample dataframe */
  val initialDF = Seq((1, "02-03-2021"), (2, "2021-04-05"), (3, "2021-04-25")).toDF("item", "trans_date_str")
  val transformedDF = initialDF.withColumn("trans_date_dt", cleanColumn($"trans_date_str"))
  transformedDF.show()
}
6. Update the application's pom.xml with the dependency details for the created Artifact Feed:
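Assuming the library was published with coordinates like `org.util.spark:spark-date-utils:1.0.0` (the artifactId and version here are hypothetical — use the coordinates from the library's own pom.xml), the dependency entry would look like:

```xml
<dependency>
  <groupId>org.util.spark</groupId>
  <artifactId>spark-date-utils</artifactId>
  <version>1.0.0</version>
</dependency>
```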

7. Update the <repositories> and <distributionManagement> sections of the project pom.xml file with the same content we used before for the library project.

Conclusion

Azure DevOps Artifacts is a great place to securely host and share libraries, with access to feeds controlled by configuration.

Multiple versions of the same library can be hosted side by side. The version specified in pom.xml determines which package is used, thereby reducing compatibility issues between library versions.


Opinions expressed by DZone contributors are their own.
