Azure DevOps Build and Release Agents With Docker

Azure DevOps offers a great pipeline feature for automating the build and release process. After we realized Microsoft’s hosted agents took more time than we wanted to gather all the required assets for our build, we decided to investigate building our own agents to run the process (Microsoft supports hosting your own agents).

Since we use .NET Core for most of our projects, we couldn’t use Azure’s managed container services, which don’t support Windows containers. Instead, we created Windows containers with Docker on a native Windows machine, which allows us to host our build agents in Docker on a Windows virtual machine in Azure.

In this post, we’ll walk through the whole process of creating a custom build agent with Docker. Microsoft has its own solution on GitHub, which was the basis for this project. All the files mentioned in this post are included at the bottom of this page.

Creating the Dockerfile

The dockerfile specifies the contents of our container image, so it’s important to include everything the build agent will need. I’ll highlight the main elements of the dockerfile in this section, but you can find the entire dockerfile at the bottom of the page.

The first element of every dockerfile is the base image the new image will be built on. Since we need a Windows image to install .NET Core, we will base ours on Windows Server Core.

FROM microsoft/windowsservercore:10.0.14393.1358
ENV WINDOWS_IMAGE_VERSION=10.0.14393

We need to install several components on these servers, and we’ve found that chocolatey is a convenient package manager that will allow us to run clean installs. We can invoke a PowerShell command to download and run the chocolatey install script, which will allow us to set the default choco configuration.

RUN @powershell -NoProfile -ExecutionPolicy Bypass -Command "iex ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))" && SET "PATH=%PATH%;%ALLUSERSPROFILE%\chocolatey\bin"
RUN choco config set cachelocation C:\chococache

Once chocolatey is installed, we can use it to install all our dependencies, most of which are shown in the list below. You can also add anything else you may need. Most tools have a chocolatey package associated with them, and you can find specific installation instructions on their site.

RUN choco install \
    git  \
    nodejs \
    curl \
    docker \
    dotnet4.6.1 \
    --confirm \
    --limit-output \
    --timeout 216000 \
    && rmdir /S /Q C:\chococache

Most build servers I’ve seen rely on a full copy of Visual Studio being installed, but I found that installing Visual Studio Build Tools is really all that is needed. Here we install the build tools the same way we installed the other packages with choco.

RUN choco install \
    visualstudio2017buildtools

The last component to install is .NET Core, which can’t be installed with chocolatey, so we invoke a web request to download it instead. Note that the full dockerfile at the bottom of the page switches the default shell to PowerShell (with a SHELL instruction) before this step so that Invoke-WebRequest is available. Once the installer is downloaded, we extract the files and remove the no-longer-needed zip file.

# Install .NET Core
ENV DOTNET_VERSION 2.1
ENV DOTNET_DOWNLOAD_URL https://download.visualstudio.microsoft.com/download/pr/ce443d89-75f1-4122-aaa8-c094a9017b4a/255b06ace4207a8ee923758160ed01c3/dotnet-runtime-2.1.5-win-x64.zip

RUN Invoke-WebRequest $Env:DOTNET_DOWNLOAD_URL -OutFile dotnet.zip; \
    Expand-Archive dotnet.zip -DestinationPath $Env:ProgramFiles\dotnet -Force; \
    Remove-Item -Force dotnet.zip
    
# Install .NET Core SDK
ENV DOTNET_SDK_VERSION 2.1
ENV DOTNET_SDK_DOWNLOAD_URL https://download.visualstudio.microsoft.com/download/pr/28820b2a-0aec-4c24-a271-a14bcb3e2686/5e0ad8ae32f1497e8d0cace2447b9e01/dotnet-sdk-2.1.403-win-x64.zip

RUN Invoke-WebRequest $Env:DOTNET_SDK_DOWNLOAD_URL -OutFile dotnet.zip; \
    Expand-Archive dotnet.zip -DestinationPath $Env:ProgramFiles\dotnet -Force; \
    Remove-Item -Force dotnet.zip

The last thing we need to do is download the tools that connect our agent to the Azure DevOps agent pool. If you navigate to your agent pools in Azure DevOps, you will find a “Download Agent” button; copy the “Download the Agent” link to get a zip file with all the tools needed to connect our agent to the pool.

Once we have the link, we can add another step to our dockerfile to download and extract this zip file in our container image. We will create a new directory for the build agent files and extract the file there, then remove the zip file.

#Install Agent
RUN mkdir C:\BuildAgent;

ENV VSTS_ACCOUNT_DOWNLOAD_URL "<agentdownloadurl>"

RUN Invoke-WebRequest $Env:VSTS_ACCOUNT_DOWNLOAD_URL -OutFile agent.zip; \
    Expand-Archive agent.zip -DestinationPath c:\BuildAgent -Force; \
    Remove-Item -Force agent.zip

We want these agents to be fully automated, so we need a script that will configure and connect our agent to the agent pool when the docker container is started. We’ll take a detailed look at creating this script in the next section, but go ahead and add these steps to the end of the dockerfile. This makes the build agent directory we just created the working directory and copies our start scripts into that directory. When the container is started, it will run the start.cmd file.

WORKDIR C:/BuildAgent
COPY ./start.* ./
CMD ["start.cmd"]

Creating the Start Scripts

The start script is what actually connects the agent to Azure DevOps, making use of the tools we downloaded from the agent pool earlier.

Before we write the script, we’ll create a personal access token for the agent to use for authentication. To create one, click on your user profile at the top right of Azure DevOps, select the security tab, navigate to Personal Access Tokens, and choose “New Token.” You can name the token and specify the privileges it requires; for build agents, it only needs to read and manage agent pools. Once you finish, you’ll have a token that can be saved for use in the start script.

We can now create the PowerShell script start.ps1. First, we add the variables the script will use. Here we declare them as environment variables at the beginning of the script; if you prefer, you can instead pass environment variables into the container when starting it with Docker.

$env:VSTS_ACCOUNT = ""
$env:VSTS_TOKEN = ""
$env:VSTS_POOL = ""
    VSTS_ACCOUNT is the name of your organization (e.g., VSTS_ACCOUNT.visualstudio.com or dev.azure.com/VSTS_ACCOUNT)
    VSTS_TOKEN is the personal access token that we just generated
    VSTS_POOL is the name of the agent pool that should be joined; if left blank, the agent will be added to the default pool

Refer to the files at the end of the page for the full PowerShell script.

We now need to create a simple start.cmd file to trigger our PowerShell script. In our dockerfile, we already set this script to run every time the container starts.

PowerShell.exe -ExecutionPolicy ByPass .\start.ps1

Starting the Docker Containers

After all the configuration is done, we’re ready to build our custom Docker image. We need to create a directory containing all the files we’ve created so far: start.ps1, start.cmd, and the dockerfile. Note that dockerfiles have no extension.

Now open PowerShell, navigate to the newly created directory, and run the following docker command. Be sure to name your image here.

docker build -t "<imagename>" .

The build will take a little while as the dockerfile installs all the components. The .NET Core step in particular may appear to be hung; be patient, and the installer will finish its work. Once the image has been created, the console will output “Successfully tagged imagename:latest.”

We can now start our container with the newly created docker image.

docker run -it -d --restart always --name "<containername>" --hostname "<hostname>" imagename:latest

I configured my start script to take the hostname of the Docker container and use it as the name of the agent in the Azure DevOps agent pool. Docker randomly generates a hostname for each container, so I pass the hostname in as a parameter here to give each agent a specific name. This is also where you would pass in the environment variables if you did not declare them in the start.ps1 file. Many more Docker run arguments can be found in the Docker documentation.
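For example, if the variables are left blank in start.ps1, they could instead be passed at run time with -e flags; the account, token, and container names below are illustrative:

```powershell
docker run -it -d --restart always `
    --name "buildagent01" --hostname "buildagent01" `
    -e VSTS_ACCOUNT="myorganization" `
    -e VSTS_TOKEN="<personalaccesstoken>" `
    -e VSTS_POOL="Default" `
    imagename:latest
```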

To see the running Docker containers, issue a “docker ps” command in the console. We should also be able to see the newly created agent in the specified agent pool in Azure DevOps.

If you don’t see your agent here after a few minutes, try restarting the Docker container and connecting to it with PowerShell. This will allow you to run the start.ps1 file manually to see if there is an error with the configuration.
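One way to do this is with docker exec; the container name here is illustrative:

```powershell
# Open an interactive PowerShell session inside the running container
docker exec -it buildagent01 powershell

# Inside the container, run the start script manually and watch for errors
cd C:\BuildAgent
.\start.ps1
```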

Conclusion

We can now start as many build and release agents as we need from this Docker image. If future projects require other dependencies on the build servers, we can simply edit the dockerfile and rebuild the image. In our use case, we found that the build agents used more CPU power than RAM, so we selected a compute-optimized server with several cores and found we were saving minutes on our build and release process.

Project Code

dockerfile

FROM microsoft/windowsservercore:10.0.14393.1358
ENV WINDOWS_IMAGE_VERSION=10.0.14393

ENV chocolateyUseWindowsCompression=false

RUN @powershell -NoProfile -ExecutionPolicy Bypass -Command "iex ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))" && SET "PATH=%PATH%;%ALLUSERSPROFILE%\chocolatey\bin"

RUN choco config set cachelocation C:\chococache

RUN choco install \
    git  \
    nodejs \
    curl \
    docker \
    dotnet4.6.1 \
    visualstudio2017buildtools \
    azure-cli \
    azurepowershell \
    --confirm \
    --limit-output \
    --timeout 216000 \
    && rmdir /S /Q C:\chococache

# common node tools
RUN npm install gulp -g && npm install grunt -g && npm install -g less && npm install phantomjs-prebuilt -g

SHELL ["powershell", "-Command", "$ErrorActionPreference = 'Stop'; $ProgressPreference = 'SilentlyContinue';"]

# Install .NET Core
ENV DOTNET_VERSION 2.1
ENV DOTNET_DOWNLOAD_URL https://download.visualstudio.microsoft.com/download/pr/ce443d89-75f1-4122-aaa8-c094a9017b4a/255b06ace4207a8ee923758160ed01c3/dotnet-runtime-2.1.5-win-x64.zip

RUN Invoke-WebRequest $Env:DOTNET_DOWNLOAD_URL -OutFile dotnet.zip; \
    Expand-Archive dotnet.zip -DestinationPath $Env:ProgramFiles\dotnet -Force; \
    Remove-Item -Force dotnet.zip
    
# Install .NET Core SDK
ENV DOTNET_SDK_VERSION 2.1
ENV DOTNET_SDK_DOWNLOAD_URL https://download.visualstudio.microsoft.com/download/pr/28820b2a-0aec-4c24-a271-a14bcb3e2686/5e0ad8ae32f1497e8d0cace2447b9e01/dotnet-sdk-2.1.403-win-x64.zip

RUN Invoke-WebRequest $Env:DOTNET_SDK_DOWNLOAD_URL -OutFile dotnet.zip; \
    Expand-Archive dotnet.zip -DestinationPath $Env:ProgramFiles\dotnet -Force; \
    Remove-Item -Force dotnet.zip

#Install Agent
RUN mkdir C:\BuildAgent;

ENV VSTS_ACCOUNT_DOWNLOAD_URL "<agentdownloadurl>"

RUN Invoke-WebRequest $Env:VSTS_ACCOUNT_DOWNLOAD_URL -OutFile agent.zip; \
    Expand-Archive agent.zip -DestinationPath c:\BuildAgent -Force; \
    Remove-Item -Force agent.zip

SHELL ["cmd", "/S", "/C"]

RUN setx /M PATH "%PATH%;%ProgramFiles%\dotnet"

# Trigger the population of the local package cache
ENV NUGET_XMLDOC_MODE skip

RUN mkdir C:\warmup \
    && cd C:\warmup \
    && dotnet new \
    && cd .. \
    && rmdir /S /Q C:\warmup 

WORKDIR C:/BuildAgent

COPY ./start.* ./
CMD ["start.cmd"]

start.ps1

$ErrorActionPreference = "Stop"
$env:VSTS_ACCOUNT = ""
$env:VSTS_TOKEN = ""
$env:VSTS_POOL = ""

if ([string]::IsNullOrEmpty($env:VSTS_ACCOUNT)) {
    Write-Error "Missing VSTS_ACCOUNT environment variable"
    exit 1
}

if ([string]::IsNullOrEmpty($env:VSTS_TOKEN)) {
    Write-Error "Missing VSTS_TOKEN environment variable"
    exit 1
} else {
    if (Test-Path -Path $env:VSTS_TOKEN -PathType Leaf) {
        $env:VSTS_TOKEN = Get-Content -Path $env:VSTS_TOKEN -ErrorAction Stop | Where-Object {$_} | Select-Object -First 1
        
        if ([string]::IsNullOrEmpty($env:VSTS_TOKEN)) {
            Write-Error "Missing VSTS_TOKEN file content"
            exit 1
        }
    }
}

if ([string]::IsNullOrEmpty($env:VSTS_AGENT)) {
    $env:VSTS_AGENT = $env:COMPUTERNAME
}

if ($env:VSTS_WORK -ne $null)
{
    New-Item -Path $env:VSTS_WORK -ItemType Directory -Force
}
else
{
    $env:VSTS_WORK = "_work"
}

if ([string]::IsNullOrEmpty($env:VSTS_POOL))
{
    $env:VSTS_POOL = "Default"
}

# Set The Configuration and Run The Agent
Set-Location -Path "C:\BuildAgent"

& .\bin\Agent.Listener.exe configure --unattended `
    --agent "$env:VSTS_AGENT" `
    --url "https://$env:VSTS_ACCOUNT.visualstudio.com" `
    --auth PAT `
    --token "$env:VSTS_TOKEN" `
    --pool "$env:VSTS_POOL" `
    --work "$env:VSTS_WORK" `
    --replace

& .\bin\Agent.Listener.exe run

start.cmd

PowerShell.exe -ExecutionPolicy ByPass .\start.ps1

 

Cosmos Graph DB Automation with Gremlin

We’re getting started on a new project using a graph DB on Microsoft’s Cosmos DB, so we want to set up a process for Cosmos DB deployment automation. We use Azure Resource Manager Templates for automating our Azure resource deployment.

This post assumes you have previous experience with ARM templates, Gremlin, and Cosmos DB in general, so we’ll jump straight into some of the quirks that popped up in our setup. If you want some more basic info on Gremlin, check out our previous blog post or head over to the Gremlin documentation.

Graph DBs in Cosmos consist of three main components: a Cosmos database account, one or more databases, and one or more graphs (collections) in each database. Database accounts can be provisioned using ARM templates, but you’ll have to find another way to create databases and graphs automatically. We do this with some .NET code that runs on startup of our application, which we’ll show in a bit.

Provisioning the Cosmos Graph DB Account

You can find the ARM template format for Cosmos DB accounts here.

Most of the settings are straightforward, and you can view the property descriptions for information on some of the various settings for Cosmos DB accounts.

However, you need to use some specific settings for graph databases. When we first created our DB account, we got a lot of strange, unspecific errors when trying to load a graph in the Azure portal because we were missing a few graph DB-specific settings. I was able to find those settings in GitHub’s Azure quickstart templates. The settings we’re concerned with are:

    "kind": "GlobalDocumentDb"
    "tags": { "defaultExperience": "Graph" }
    "properties": { "capabilities": [ { "name": "EnableGremlin" } ] }
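Putting those settings together, a minimal sketch of the account resource in an ARM template might look like the following; the apiVersion and the accountName/location parameters are assumptions modeled on the quickstart templates, not our exact production template:

```json
{
  "type": "Microsoft.DocumentDB/databaseAccounts",
  "apiVersion": "2015-04-08",
  "name": "[parameters('accountName')]",
  "location": "[parameters('location')]",
  "kind": "GlobalDocumentDb",
  "tags": {
    "defaultExperience": "Graph"
  },
  "properties": {
    "databaseAccountOfferType": "Standard",
    "capabilities": [
      { "name": "EnableGremlin" }
    ]
  }
}
```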

Creating the Database and Graph

Once we set up our database account, we’re ready to create our database and graph. You can do this manually, but we opt to do it automatically as part of our application startup using the DocumentClient class in the Microsoft.Azure.DocumentDB.Core NuGet package.

We need a couple of parameters to create our DocumentClient:  the .NET SDK URI and the primary auth key for the database account. You can find the URI in the “Overview” or “Keys” tab for your DB account (be sure to use the .NET SDK URI and not the Gremlin Endpoint) and the primary key in the “Keys” tab.

[Screenshot: Cosmos DB account settings in the Azure portal]

We call the following method in our application’s startup:

public async Task SetupGremlinDb()
{
    DocumentClient docDbClient = 
            new DocumentClient(new Uri(_gremlinDbOptions.Uri), _gremlinDbOptions.AuthKey);
    await CreateDatabaseIfNotExists(docDbClient);
    await CreateDocumentCollectionIfNotExists(docDbClient);
}

This function creates a DocumentClient, which it uses to create our database and our graph, if necessary. In our first method, we check to see if the database already exists, and if it doesn’t, we create it.

public async Task CreateDatabaseIfNotExists(DocumentClient client)
{
    // Check to see whether the database already exists
    var result = client.CreateDatabaseQuery()
        .Where(d => d.Id == _gremlinDbOptions.Database).AsEnumerable().FirstOrDefault();
    if (result == null)
    {
        // If the database does not exist, create a new database
        await client.CreateDatabaseAsync(new Database { Id = _gremlinDbOptions.Database });
    }
}

We also need the database ID, which is whatever we want to call the database. Next, we check to see if our graph already exists, and if it doesn’t, we create it.

public async Task CreateDocumentCollectionIfNotExists(DocumentClient client)
{

    var result = client.CreateDocumentCollectionQuery(
            UriFactory.CreateDatabaseUri(_gremlinDbOptions.Database))
            .Where(c => c.Id == _gremlinDbOptions.Collection).AsEnumerable().FirstOrDefault();
    if (result == null)
    {
        // If the document collection does not exist, create a new collection
        var collectionInfo = new DocumentCollection
        {
            Id = _gremlinDbOptions.Collection
        };

        // Here we create a collection with 400 RU/s.
        await client.CreateDocumentCollectionAsync(
            UriFactory.CreateDatabaseUri(_gremlinDbOptions.Database),
                collectionInfo,
                new RequestOptions { OfferThroughput = 400 });
    }
}

This code could also be reused to set up multiple databases and graphs. Once that’s all in place, running this code will create our database and graph, ready to use. Depending on what sort of tooling/framework you use for your CI/CD process, you could likely streamline this further.
