Maximising CI capacity in Azure DevOps Pipelines with self-hosted Docker agents for multiple workflows

Simon Headley · Published in Nerd For Tech · 3 min read · Feb 25, 2021


If you, like me, manage a self-hosted agent pool in Azure DevOps and have found yourself wanting more capacity, then Docker is possibly the solution for you. By running the Azure Pipelines agent in Docker, you can make full use of the resources already available to you without introducing new virtual machines and increasing the running and maintenance costs of your agent pool. With Docker you can run multiple instances of your build agent on a single machine and use this to extend your CI capacity.

The Dockerfile

Let’s quickly work through the Dockerfile that eventually catered for all our use cases.

The base

The initial steps set the escape character to “`”, as the default “\” interferes with Windows install paths; this keeps things a bit simpler. I went with mcr.microsoft.com/dotnet/framework/runtime:4.8 as the base image. I had tested various other base images, such as the Windows Server Core, Nano Server and .NET images, but all of these had some tool or feature missing. Another potential base image is mcr.microsoft.com/dotnet/framework/sdk:4.8, which comes with the Visual Studio 2019 build tools and test tools installed, but doesn’t come with SQL Server Data Tools, which ruled it out for me. It was easier to install Visual Studio myself than to try to patch something that was already installed.

If you use mcr.microsoft.com/dotnet/framework/sdk:4.8 as your base image, you can skip the portion of this code that installs the Visual Studio Build Tools as well as the Agents for Visual Studio tools. You can read more about the various Visual Studio workloads and components at Microsoft’s Docs.
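The opening of the Dockerfile might look something like the sketch below. The escape directive and base image match what is described above; the Visual Studio workload IDs and install flags are examples only, so adjust them to whatever your builds actually need:

```dockerfile
# escape=`
FROM mcr.microsoft.com/dotnet/framework/runtime:4.8

# Use PowerShell for all RUN instructions and fail fast on errors.
SHELL ["powershell", "-Command", "$ErrorActionPreference = 'Stop'; $ProgressPreference = 'SilentlyContinue';"]

# Install the Visual Studio 2019 Build Tools.
# Skip this step entirely if you base on the sdk:4.8 image.
# The workload IDs below are illustrative; pick the ones your builds require.
RUN Invoke-WebRequest -Uri 'https://aka.ms/vs/16/release/vs_buildtools.exe' -OutFile 'C:\vs_buildtools.exe'; `
    Start-Process -Wait -FilePath 'C:\vs_buildtools.exe' -ArgumentList `
        '--quiet', '--wait', '--norestart', '--nocache', `
        '--add', 'Microsoft.VisualStudio.Workload.MSBuildTools'; `
    Remove-Item 'C:\vs_buildtools.exe'
```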

Git

Next I needed to install Git. I opted to use the MinGit installation.

MinGit comes with an interesting issue where the default gitconfig file has an infinite loop pointing back to itself, so I simply downloaded that file and removed the loop, copied it into my Docker image and overwrote the default. You can read more about this issue reported on GitLab. My gitconfig file is available on GitHub.
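The Git step might be sketched as below. The MinGit release URL is an example (use whichever version you need), and the location of the default gitconfig inside the extracted archive can vary between MinGit versions:

```dockerfile
# Download and extract MinGit, then add it to the system PATH.
RUN Invoke-WebRequest -Uri 'https://github.com/git-for-windows/git/releases/download/v2.30.1.windows.1/MinGit-2.30.1-64-bit.zip' -OutFile 'C:\mingit.zip'; `
    Expand-Archive -Path 'C:\mingit.zip' -DestinationPath 'C:\git'; `
    Remove-Item 'C:\mingit.zip'
RUN setx /M PATH $($Env:PATH + ';C:\git\cmd')

# Overwrite the default gitconfig with the fixed copy (loop removed).
COPY gitconfig C:\git\etc\gitconfig
```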

NodeJS

Next I needed to install NodeJS.

I needed to support both new and legacy Node modules, and found that the last version of Node.js 10.x was sufficient. If you do not need to support legacy systems, I would strongly suggest using the latest stable release of Node.js instead.
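A sketch of the Node.js step, using a zip install so no MSI tooling is needed; the exact 10.x version below is an example:

```dockerfile
# Node.js 10.x (the last release line that supported our legacy modules);
# swap in the current LTS release if you have no legacy constraints.
RUN Invoke-WebRequest -Uri 'https://nodejs.org/dist/v10.24.1/node-v10.24.1-win-x64.zip' -OutFile 'C:\node.zip'; `
    Expand-Archive -Path 'C:\node.zip' -DestinationPath 'C:\'; `
    Rename-Item -Path 'C:\node-v10.24.1-win-x64' -NewName 'nodejs'; `
    Remove-Item 'C:\node.zip'
RUN setx /M PATH $($Env:PATH + ';C:\nodejs')
```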

Java

To support Java builds, along with various static analysis tools in the pipeline, both the Java JDK and JRE were necessary. Thankfully the OpenJDK and OpenJRE are readily available and easy to install. It was, however, necessary to set environment variables and update the PATH to reference the installations.
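The environment-variable part of the Java step might look like this. The install paths are placeholders for wherever you extract the OpenJDK and OpenJRE archives (the download and extract steps are omitted here; use the builds your pipelines require):

```dockerfile
# Point JAVA_HOME/JRE_HOME at the extracted installs (paths are examples)
# and expose both bin directories on the PATH.
ENV JAVA_HOME=C:\java\jdk `
    JRE_HOME=C:\java\jre
RUN setx /M PATH $($Env:PATH + ';C:\java\jdk\bin;C:\java\jre\bin')
```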

NuGet

Having NuGet.exe installed and on your PATH can be extremely useful for agents that are set up to run .NET builds, although it is not strictly necessary, as there are pipeline tasks to install NuGet that can be added to the azure-pipelines.yml. The main benefit of installing NuGet directly onto the agents is that you can use the nuget CLI in your Dockerfile to authenticate with any private feeds you may have.
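Since nuget.exe is a single binary, the step is short. The private-feed line is an example only (feed name, URL and credential handling are placeholders):

```dockerfile
# Download nuget.exe and put it on the PATH.
RUN New-Item -ItemType Directory -Path 'C:\nuget'; `
    Invoke-WebRequest -Uri 'https://dist.nuget.org/win-x86-commandline/latest/nuget.exe' -OutFile 'C:\nuget\nuget.exe'
RUN setx /M PATH $($Env:PATH + ';C:\nuget')

# Optionally pre-authenticate with a private feed (all values are placeholders):
# RUN nuget sources Add -Name 'MyFeed' `
#     -Source 'https://pkgs.dev.azure.com/myorg/_packaging/MyFeed/nuget/v3/index.json' `
#     -Username 'build' -Password $Env:FEED_PAT
```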

Docker-in-Docker

This next step assumes that, when launching the container, the host’s installation of Docker is mounted into the running container. This allows various build processes to run Docker tasks in the pipeline. If the host’s Docker installation is not mounted into the running container, the Docker capability will simply not be present when viewing the agent in Azure DevOps.
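On Windows this means mounting the Docker named pipe at launch, along the lines of the command below. The organisation URL, pool name and token are placeholders; the AZP_* environment variables are the ones Microsoft’s start.ps1 expects:

```powershell
docker run -d `
  -e AZP_URL=https://dev.azure.com/myorg `
  -e AZP_TOKEN=<personal-access-token> `
  -e AZP_POOL=Docker-Agents `
  -v \\.\pipe\docker_engine:\\.\pipe\docker_engine `
  krylixza/azure-pipelines-windows-build-agent:1.0.0.38
```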

Chrome headless

It is necessary to have a headless browser installed directly in the Docker agent to allow UI tests to run. I opted to use Chocolatey to install Google Chrome.

The base Windows images no longer come with any fonts installed, for obvious size reasons. However, for Chrome to work, fonts need to be present on the OS, which is what FontsToAdd.tar and Add-Font.ps1 take care of.
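A sketch of the Chrome and font steps. The Chocolatey bootstrap and package name are standard; the way Add-Font.ps1 is invoked here is illustrative, since its parameters depend on the script you copy in alongside the Dockerfile:

```dockerfile
# Bootstrap Chocolatey, then install Google Chrome for headless UI tests.
RUN Set-ExecutionPolicy Bypass -Scope Process -Force; `
    Invoke-Expression ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))
RUN choco install googlechrome -y --ignore-checksums

# Restore the fonts the base image ships without; both files live in the
# repository next to the Dockerfile. The Add-Font.ps1 invocation is an example.
COPY FontsToAdd.tar Add-Font.ps1 C:\Fonts\
RUN tar -xf C:\Fonts\FontsToAdd.tar -C C:\Fonts; `
    C:\Fonts\Add-Font.ps1 C:\Fonts
```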

Cleaning up and setting the entry point

The final bit simply cleans up all temporary install directories and files, then sets the entry point for the running container.

The start.ps1 file is slightly modified from the one provided by Microsoft: the install and run directory of the Azure Pipelines agent is C:\agent rather than C:\azp\agent.
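The closing steps might be sketched as below; the temporary directory being removed is an example of whatever staging paths your earlier steps created:

```dockerfile
# Remove temporary install files left over from earlier steps (path is an example).
RUN Remove-Item -Recurse -Force 'C:\temp' -ErrorAction SilentlyContinue

# Run the agent from C:\agent rather than Microsoft's default C:\azp\agent.
WORKDIR C:\agent
COPY start.ps1 C:\agent\start.ps1
ENTRYPOINT ["powershell", "-File", "C:\\agent\\start.ps1"]
```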

The complete Dockerfile

Docker Hub image

If you need, you can pull the Docker image from Docker Hub.

docker pull krylixza/azure-pipelines-windows-build-agent:1.0.0.38
