Dependency Management with .NET - Doing it Right
The problem of dependency management is neither new nor original; it exists on all development platforms, and .NET is no different. Let's go through the different solutions and see how they perform. I'll list them here in no particular order.
Keeping Dependencies in Your Source Control
That's a very popular solution, and for a reason. The benefits are obvious. Here are some of them:
- No setup. You already have your source control in place (hm, I hope you do!). Add a \bin directory, and you are done.
- No learning curve. Developers are used to working with source control.
- Shared. The whole team gets changes and updates from the server as they occur.
- Enterprisey (in a good way). The software is proven, backed up, and a disaster recovery plan (DRP) is in place.

And the drawbacks:
- It isn't a proxy. A VCS can't download the dependency you need from a central repository when you need one. You have to download it manually and add it to the VCS. The history starts from there; you just lost the link to the original file. So you work hard and, on top of it, lose information; this repeats itself for each new dependency.
- Versioning mismatch. Source files are versioned by their content. VCSs know how to diff them and understand what changed. Binaries, on the other hand, are usually versioned by their name. From the VCS's point of view they are different entries, each one without any version history.
- Some very popular VCSs (like Subversion) can't obliterate files. That means once a file is added, it stays in the repository forever. That's not a big issue for small source files, but it can become quite a pain when it comes to obsolete large binaries.
- Source control knows how to search sources, and, of course, the most important type of search is by content. Searching for binaries is different: what matters there is the location, the structure of the file name and, in the case of an archived artifact, the contents of the archive.
- The permissions scheme of VCSs is tailored for versioning sources (again!). For example, there is no override permission. That's because overriding sources is something we do all the time (that's what diff is for in a VCS); it's at the same security level as, let's say, adding a new source file. With binaries the situation is very different. While adding new binaries is fine, overriding a released binary is something that shouldn't be done, and one should need a special permission for it.
- Distributed VCSs, awesome by themselves, are particularly unsuited for handling big binary files. When cloning a remote repository to your machine, you bring all the history of all the files in it. Now just think about all the huge binaries sitting there...
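To make the versioning-mismatch point concrete: a VCS can diff source text, but for a binary the only version it usually sees is whatever is encoded in the file name. A minimal Python sketch (the naming pattern and helper function are illustrative, not part of any VCS):

```python
import re

def version_from_filename(name):
    """Extract a dotted version from an artifact filename such as
    'NUnit.2.5.10.nupkg'; returns None when no version is present."""
    m = re.search(r'(\d+(?:\.\d+)+)', name)
    return m.group(1) if m else None

# A binary's "history" is only as good as its naming convention:
print(version_from_filename("NUnit.2.5.10.nupkg"))  # 2.5.10
print(version_from_filename("log4net.dll"))         # None: no version at all
```

Two binaries with different names are unrelated entries to the VCS, even when one is just a newer build of the other.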
As you see, the conclusion is simple: we can do better. Let's try something specialized for binaries.
GAC and WebGAC
The Global Assembly Cache is, in contrast to a VCS, tailored for storing binaries. It understands versions, prevents conflicts, and generally does a good job of being your local dependencies storage. The main problem with the GAC is being local, which means each and every developer has to take the binaries from somewhere and install them in their local GAC. You see the trouble coming in that setup, don't you?
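The conflict prevention mentioned above comes from the GAC keying each assembly by its full strong name, so different versions can coexist side by side. A toy Python model of that idea (the assembly names, versions and public key token are made up; the real GAC is managed with tools like gacutil, not code like this):

```python
# Toy model: the GAC as a dict keyed by the full strong-name identity.
gac = {}

def gac_install(name, version, public_key_token, path):
    """Install an assembly; the same identity cannot be installed twice,
    but different versions of the same name live side by side."""
    key = (name, version, public_key_token)
    if key in gac:
        raise ValueError(f"already installed: {key}")
    gac[key] = path

gac_install("NUnit.Framework", "2.5.10.0", "96d09a1eb7f44a77", "/gac/a.dll")
gac_install("NUnit.Framework", "2.6.0.0", "96d09a1eb7f44a77", "/gac/b.dll")
print(len(gac))  # 2: both versions coexist
```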
WebGAC comes to the rescue here. It's essentially a GAC shared over WebDAV, and it enables clients to fetch dependencies from the server, simplifying dependency management for a team. Let's do our pros/cons math. The benefits:
- The GAC is a good, standard, proven solution for binaries management. It deals well with versions.
- WebGAC is a central binaries repository for a team. Every team member synchronizes with it.
- WebDAV is a popular, well-known HTTP extension with locking, security management, etc. Working with the Apache WebDAV module is generally straightforward.

And the drawbacks:
- It isn't a proxy. WebGAC can't download the dependency you need from a central repository when you need one. You have to download it manually and add it to WebGAC, and only then does it become available to the team. Not only must you work for every version of every dependency needed, but the link to the original file is lost.
- No notion of packages. The GAC contains single DLLs. You install them one by one. But think about NUnit, as an example. It contains about a dozen DLLs along with various XML and configuration files. How can you install it into the GAC?
- Security is cumbersome. You'll need to configure the Apache server's security, and even then it won't be flexible enough to distinguish between a deployer (a user that can publish private dependencies) and a promoter (a user that can move dependencies from a private repository to a public one).
- Search is basic. WebDAV by itself only knows about files. It doesn't care about the structure of the filename, or about the presence of strong names.
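The "no notion of packages" problem is worth making concrete: a NuGet package is just a zip archive bundling DLLs, config files and metadata, which a single-DLL store like the GAC cannot represent. A Python sketch using an in-memory archive (the file list is invented for illustration; real packages are produced by `nuget pack`):

```python
import io
import zipfile

# Build a toy .nupkg-like archive in memory.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("lib/net20/nunit.framework.dll", b"binary payload")
    z.writestr("lib/net20/nunit.mocks.dll", b"binary payload")
    z.writestr("NUnit.nuspec", "<package><metadata /></package>")

# A package-aware tool sees the whole archive at once, not one DLL at a time.
with zipfile.ZipFile(buf) as z:
    names = sorted(z.namelist())
print(names)
```

Installing "NUnit" then means unpacking the whole bundle, something that has no equivalent in a one-DLL-at-a-time model.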
Looks like we still didn't find what we're looking for, and then...
Here Comes NuGet
This is something else. NuGet is designed to be "a developer focused package management system for the .NET platform intent on simplifying the process of incorporating third-party libraries into a .NET application during development". That's exactly what we need. Let's look at how great it is:
- Manages packages, not DLLs.
- Provides the NuGet Gallery: almost 4.5k packages (at the time of writing) are at your disposal for all your development needs.
- Supports binary versioning.
- Integrates with Visual Studio.
- Integrates with your build.
- Integrates with your build server (only TeamCity at the moment of writing).

And the drawbacks:
- The content of submissions to the Gallery is (almost) unverified. Everyone can register, get an API key and start uploading whatever they like. Scary, isn't it? (Yes, very scary.)
- Being public, the NuGet Gallery can't be used for inter-team package exchange. Private remote feeds are the recommended solution. Next we'll see if they are good enough.
Working with NuGet Remote Feeds
Remote feeds, introduced in NuGet 1.4, are a crucial need for any development team. They serve a dual purpose: they allow sharing third-party packages that aren't available on the Gallery (or even replace the Gallery for those who can't trust it), and they serve as a target for internal deployments, both for team collaboration and for other usages, such as making packages available to QA, or even serving them to customers in the outside world. If that's so right, what's wrong? Here's what:
- You saw it coming: it isn't a proxy. You know the score by now.
- It can't aggregate. The NuGet remote feed exposes one monolithic repository: the one you have on your machine. It can't aggregate NuGet packages from remote repositories, or expose a number of local repositories (separated for security reasons, for example).
- You can't attach your own metadata. Let's say you want to annotate some package with compatibility information (e.g. works with certain browsers). No, can't do.
- The repository is very simplistic. It doesn't provide any web interface; it is browsable and searchable only from a client, be it Visual Studio or the command-line interface (pretty basic by itself).
- Even the VS search interface is very basic (all you have is arbitrary sorting and free-text search). It should be enough for starters, but the lack of searching inside the packages or by properties (from the previous bullet) will bite you eventually.
- The security scheme is even less than simplistic. A single API key authorizes deployments and deletions for all users at once. What about separation of duties? Some users should only be able to read, others only to annotate with metadata (the QA team that tests compatibility in my previous example), and only a small subgroup to deploy. The all-or-nothing scheme is definitely insufficient.
- The storage format is suboptimal:
  - The packages are stored on the filesystem in a naive, simple format. That's fine for a small repository, but as you grow you'd expect storage that is more optimized for binaries.
  - The metadata is not indexed. Again, fine for a small repo, but trouble is foreseen when it comes to scaling.
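The aggregation gap can be sketched in a few lines: an aggregating repository searches an ordered list of backing repositories and returns the first match, which is exactly what a single monolithic feed cannot do. A hypothetical Python sketch (repository contents and URLs are invented):

```python
def make_resolver(repos):
    """repos: an ordered list of dicts mapping (name, version) -> url.
    Returns a lookup that tries each repository in turn, so several
    local and remote repositories appear as one."""
    def resolve(name, version):
        for repo in repos:
            hit = repo.get((name, version))
            if hit is not None:
                return hit
        return None  # not found anywhere
    return resolve

# Two backing repositories: an internal one and a cached public gallery.
team = {("Internal.Utils", "1.0"): "http://team/Internal.Utils.1.0.nupkg"}
gallery_cache = {("NUnit", "2.5.10"): "http://cache/NUnit.2.5.10.nupkg"}

resolve = make_resolver([team, gallery_cache])
print(resolve("NUnit", "2.5.10"))
```

The client only ever talks to `resolve`; where a package actually lives becomes a server-side concern.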
So, is there a good alternative to the NuGet remote feed to be your in-house gallery for NuGet packages? We, the proud makers of Artifactory, believe there is.
Artifactory is an enterprise-grade binary repository that centralizes all aspects of managing software binaries. That means we tackle all the problems mentioned above. We have been developing Artifactory since 2006, and being used by millions of users for storing, sharing and managing binaries, we have gathered great feedback from our users.
Here's what we've learned:
- Binary packages are different from sources (by being big and binary) and deserve smart storage.
- Binary packages are usually archives (be they JARs, ZIPs, RPMs or nupkgs). They should be browsable and searchable without the need to download them locally to a developer's machine.
- Big public repositories exist on the net; they need to be proxied smartly (variations, auditing, managing).
- Users come in different flavors. Their permissions should match their possible responsibilities (and in the case of binary packages these are different from other cases).
- A binary repository holds critical information; it should be rock-solid, backed up, and DRP-ready.
Your software ends up being a package. We know how to help you...
- Build it in a reproducible manner, integrating with your build tools and your build server.
- Stage it to ensure the best quality.
- Distribute it to your customers.
You can get the gist of Artifactory by watching this 2.5-minute YouTube video. If you're not in the mood for movies (or ran out of popcorn), here's a quick recap:
As you see, instead of working with a number of NuGet feeds (the NuGet Gallery, the Orchard Gallery, remote feeds from co-workers and from different teams), developers work with exactly one repository. It simplifies setup and daily work, and centralizes management and maintenance. The work is bi-directional: users resolve their third-party dependencies from Artifactory and deploy the packages they create into it. Now let's add a build server to the picture (literally):
Yup, with numbers this time. So, here we go:
1. Developers find and fetch new third-party packages from Artifactory in Visual Studio. The packages are downloaded from Artifactory to the developer's machine. If the packages aren't present in Artifactory, it will look for them in remote galleries/feeds. On the developer machines, packages.config is updated with the list of used packages.
2. Developers commit their code and packages.config (but not the binaries) to the VCS.
3. The build server (as I already mentioned, TeamCity now supports NuGet) takes the changes from the VCS.
4. It builds the solution and packs the produced artifacts as NuGet packages.
5. During the build it fetches the needed packages from Artifactory. If the packages aren't present in Artifactory, it will look for them in remote galleries/feeds.
6. Once the packages are built, they are deployed to Artifactory.
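Step 2 above commits packages.config rather than the binaries; it is a small XML manifest that the build server later uses to restore the same packages. A Python sketch of what such a file contains and how a tool might read it (the package entries are illustrative):

```python
import xml.etree.ElementTree as ET

# A minimal packages.config, as a developer would commit it to the VCS
# (ids and versions are examples only).
PACKAGES_CONFIG = b"""<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="NUnit" version="2.5.10.11092" />
  <package id="log4net" version="1.2.10" />
</packages>"""

root = ET.fromstring(PACKAGES_CONFIG)
# Map each package id to its pinned version, as a restore step would.
deps = {p.get("id"): p.get("version") for p in root.iter("package")}
print(deps)
```

Because the manifest pins exact versions, every build that reads it restores the same set of binaries, without any binaries living in the VCS.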
Built packages in Artifactory can be used by other teams (as their third-party dependencies), by QA for running tests, and even by end users (Chocolatey FTW), all with fine-grained permissions and robust promotion procedures (moving a package between repositories with different visibility rules). You know what? It deserves a dedicated how-to blog post. I'll link it here once published.
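The promotion idea can be sketched as a permission-gated move between repositories. This is only a toy model of the concept (the user names, repositories and permission vocabulary are invented):

```python
def promote(package, src, dst, user, permissions):
    """Move a package between repositories (e.g. staging -> release).
    permissions maps user -> set of allowed actions; 'promote' is
    required, which is the separation of duties a single flat API key
    cannot express."""
    if "promote" not in permissions.get(user, set()):
        raise PermissionError(f"{user} may not promote packages")
    dst.add(package)
    src.discard(package)

staging, release = {"MyApp.1.0.nupkg"}, set()
perms = {"alice": {"deploy", "promote"}, "bob": {"read"}}

promote("MyApp.1.0.nupkg", staging, release, "alice", perms)
print(sorted(release))  # ['MyApp.1.0.nupkg']
```

A read-only user like "bob" would be rejected before anything moves, while "alice" can push a package from the private repository into the public one.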
Assuming you've read up to this point, you've gathered that starting from Artifactory version 2.5.0 we are proud to serve the .NET world with full NuGet support. We can proxy any remote NuGet feed (starting with the NuGet Gallery, of course), we can host the packages that aren't found on any remote NuGet feed, we can host the packages you produce, and we can aggregate any number of repositories of any kind under a single URL. We provide you with an awesome UI for configuring your repositories and for browsing and searching your packages. We also feature smart storage that enables attaching searchable metadata on top of your binaries. We can do it all in the cloud with our SaaS version.
Hopefully you're convinced by now and are probably looking for the download link on our site (here it is, btw; click on "Evaluation"). If not, give it a try by playing with our live demo. Look at the nuget-gallery cache: that's how we proxy the NuGet Gallery. You'll find some of the packages saved locally; once you've selected a package, you'll see all kinds of information about it: its name and size, who deployed it to Artifactory, where it came from (from the NuGet Gallery, naturally, for this is the NuGet Gallery cache) and the operations you can perform on this package (as an anonymous user the selection is naturally limited). Clicking on the triangle in the tree will open the package and let you dive into its contents, including downloading specific files from the archive:
We at JFrog believe that Artifactory is the missing piece of the puzzle for robust, agile .NET dependency management, and that it can make the development process easier compared to the alternatives. We'll be happy to receive any insights, thoughts and comments on the ideas presented in this blog, and/or on your experience using Artifactory together with NuGet.
Opinions expressed by DZone contributors are their own.