
Azure DevOps Agent With Docker Compose


Learn more about using Docker for Azure DevOps Linux Build Agent with Docker Compose.


In the past, I've covered using Docker for an Azure DevOps Linux build agent in a post called Configure a VSTS Linux Agent With Docker in Minutes, and I've blogged about how you can use Docker inside a build definition to provide prerequisites for testing (like MongoDB and SQL Server). Now, it is time to move a step further and leverage Docker Compose.

You may also like:  A Developer's Guide to Docker — Docker Compose

Using Docker commands in a pipeline definition is nice, but it has some drawbacks. First of all, this approach suffers in execution speed, because the container must start each time you run a build (and should be stopped at the end of the build). It is true that if the Docker image is already present on the agent machine, start-up time is not high, but some images, like MsSQL, are not immediately operative, so you need to wait for them to be ready on every build. The alternative is to leave them running after the build finishes, but this could lead to resource exhaustion.

Another problem is the dependency on the Docker engine. If I include Docker commands in the build definition, I can build only on a machine that has Docker installed. If most of my projects use MongoDB, MsSQL, and Redis, I can simply install all three on my build machine, maybe using a fast SSD as storage. In that scenario, I expect to use my physical instances, not wait for Docker to spin up a new container.

Including Docker commands in pipeline definition is nice, but it ties the pipeline to Docker and can have a penalty in execution speed.

What I'd like to do is leverage Docker to spin up an agent and all needed dependencies at once, then use that agent with a standard build that does not require Docker. This gives me the flexibility of setting up a build machine with everything pre-installed, or of simply using Docker to spin up, in seconds, an agent that can build my code. Removing the Docker dependency from my pipeline definition gives the user the utmost flexibility.

For my first experiment, I also want to use Docker on Windows Server 2019 to leverage Windows Containers.

First of all, you can read the nice MSDN article about how to create a Windows Docker image that downloads, installs, and runs an agent inside a Windows Server machine with Docker for Windows. This allows you to spin up a new Docker agent based on a Windows image in minutes (just the time to download and configure the agent).

Thanks to Windows Containers, running an Azure DevOps agent based on Windows is a simple Docker Run command.
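As a sketch, once you have built an agent image following that article (I'm assuming it is tagged dockeragent:latest, the same tag used in my compose file), that single command looks like this. The AZP_URL, AZP_TOKEN, and AZP_AGENT_NAME environment variables are the ones the agent startup script documented by Microsoft expects; the placeholder values are mine:

```shell
# Start the Azure DevOps agent container in the background.
# Replace the organization URL and personal access token with your own.
docker run -d \
  -e AZP_URL=https://dev.azure.com/yourorganization \
  -e AZP_TOKEN=<your-personal-access-token> \
  -e AZP_AGENT_NAME=mydockeragent \
  dockeragent:latest
```

The agent registers itself with your organization on startup and appears in the agent pool like any other agent.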

Now, I need that agent to be able to use MongoDB and MsSQL to run integration tests. Clearly, I can install both DB engines on my host machine and let the Docker agent use them, but since my agent is already running in Docker, I wish for my dependencies to also run in Docker. So, welcome Docker Compose!

Thanks to Docker Compose, I can define a YAML file with a list of images that are part of a single scenario, so I specified an agent image followed by SQL Server and MongoDB images. The beauty of Docker Compose is the ability to refer to other containers by name. Let's look at an example; here is my complete Docker Compose YAML file.

YAML

version: '2.4'

services:
  agent:
    image: dockeragent:latest
    environment:
      - AZP_TOKEN=efk5g3j344xfizar12duju65r34llyw4n7707r17h1$36o6pxsa4q
      - AZP_AGENT_NAME=mydockeragent
      - AZP_URL=https://dev.azure.com/gianmariaricci
      - NSTORE_MONGODB=mongodb://mongo/nstoretest
      - NSTORE_MSSQL=Server=mssql;user id=sa;password=sqlPw3$secure

  mssql:
    image: mssqlon2019:latest
    environment:
      - sa_password=sqlPw3$secure
      - ACCEPT_EULA=Y
    ports:
      - "1433:1433"

  mongo:
    platform: linux
    image: mongo
    ports:
      - "27017:27017"

To simplify everything, all of my integration tests that need a connection string to MsSQL or MongoDB grab the connection string from an environment variable. This is convenient because each developer can use the DB instances of their choice, but this technique also makes it super easy to configure a Docker agent, specifying database connection strings as seen in Figure 1. I can specify, in environment variables, the connection strings to use for testing, and I can simply use the other Docker service names directly in the connection strings.

Figure 1: Environment variable to specify connection string.

As you can see (1), connection strings refer to other containers by name, nothing could be easier.
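A minimal sketch of this pattern (the localhost fallback value is my assumption, not from the article): a test setup script reads the connection string from the environment, so the compose file can inject the service-name-based value, while a developer without Docker falls back to a local instance.

```shell
# Prefer the connection string injected by the agent's environment
# (e.g. mongodb://mongo/nstoretest from the compose file);
# otherwise fall back to a local MongoDB instance.
MONGODB_CONNECTION="${NSTORE_MONGODB:-mongodb://localhost/nstoretest}"
echo "$MONGODB_CONNECTION"
```

The same expansion works for the NSTORE_MSSQL variable, so the tests never hard-code a host name.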

The real advantage of using Docker Compose is the ability to include the Docker Compose file (as well as Dockerfiles for any custom images you need) inside your source code. With this approach, you can version both the build pipeline, leveraging Azure DevOps YAML builds, and the configuration of the agent with all its dependencies.

Since you can configure as many agents as you want for Azure DevOps (you actually pay for the number of concurrently executing pipelines), thanks to Docker Compose you can set up an agent suitable for your project in less than one minute. But this is optional: if you'd rather not use Docker Compose, you can simply set up an agent manually, just as you did before.

Including a Docker Compose file in your source code allows the consumer of the code to start a compatible agent with a single command.
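That single command is just the usual Compose invocation from the directory containing the file (assuming it is named docker-compose.yml, the Compose default):

```shell
# Start the agent and all its dependencies in the background;
# -d detaches so the containers keep running after the command returns.
docker-compose up -d
```

When you are done, docker-compose down stops and removes the whole stack in one step.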

Thanks to Docker Compose, you pay the price of pulling images only once, and you pay only once the time needed for an image to become operative (like MsSQL or other databases that need a little while before they can satisfy requests). After everything is up and running, your agent is operative and can immediately run your standard builds: no Docker reference inside your YAML build file, and no time wasted waiting for your images to become operational.

Figure 2: Small modification to SQL Server docker windows image file, targeting Windows Server 2019.
Figure 3: My build result ran on the Docker agent.
Figure 4: Agent running in Docker. It is treated like any standard agent.

Thanks to an experimental feature of Windows Server 2019, I was able to specify a docker-compose file that contains not only Windows images but also Linux images. The only problem I had is that I could not find a Windows Server 2019 image for SQL Server; I got an error using the standard MsSQL images (built for Windows Server 2016). So, I decided to download the official Dockerfile, change the base image reference, and rebuild the image, and everything worked like a charm!
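The change shown in Figure 2 is essentially a one-line edit to the FROM instruction. A hedged sketch (the exact base image tag is my assumption; check the tag that matches your host build):

```dockerfile
# Original Dockerfile targeted a Windows Server 2016 base image;
# point it at the Windows Server 2019 Server Core image instead.
FROM mcr.microsoft.com/windows/servercore:ltsc2019
# ...rest of the official SQL Server Dockerfile unchanged...
```

Rebuilding with docker build -t mssqlon2019:latest . produces the mssqlon2019 image referenced in the compose file.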

Since it is used only for tests, I'm pretty confident that it should work, and indeed, my build runs just fine.

Actually, my agent created with Docker Compose is identical to all other agents. From the point of view of Azure DevOps, nothing is different, but I've started it with a single docker-compose command.

That's all: with a little effort, I'm able to include in my source code both the YAML build definition and the Docker Compose file that specifies the agent with all prerequisites to run the build. This is especially useful for open-source projects, where you want to fork a project and then activate CI with no effort.

Further Reading

Docker Basics: Docker Compose

Speed Up Development With Docker Compose

A Developer's Guide to Docker — Docker Compose

Topics:
agent, azure, compose, devops, docker

Published at DZone with permission of Ricci Gian Maria , DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
