
Database Delivery With Docker and SQL Change Automation


See the basics of how to automate database builds into a Linux SQL Server container running on Windows and then back up the containerized database and restore it into dedicated containerized development copies for each developer and tester.

An obvious use for Docker images of SQL Server is to run up a working database from a backup quickly, maybe to test it or possibly to mask the data. We'll start by doing that in this article. We'll then use SQL Change Automation (SCA) to synchronize an empty copy of a development database in a Docker container with the latest build in source control and fill it with data ready for testing. Finally, we'll do a backup of the containerized database so we can restore it into each developer's local container. These techniques, combined with 'glue scripts,' can be used for supporting continuous delivery of databases.

As in most of my previous Product Learning articles, I'll take an automated approach because it is much faster and eliminates simple errors. It is also a 'DevOps' approach that allows faster, more frequent releases and eliminates much of the tedium of database deployment.

Docker containers offer a very different way of propagating databases from SQL Clone. SQL Clone is a database-level 'push' service, delivering database copies to every user's machine, and is ideal if you need to distribute multiple copies of a large source database. A Docker container is a server-level 'pull' service: the user pulls a version of SQL Server and then loads a version of a database into it.

With Docker containers, tests are rather more complicated because you must use SQL Server authentication, with a user ID and password. The SQL Server instance name also has a port number appended, to differentiate the containerized instances. If you need any features or configuration beyond what comes with the standard SQL Server image, you must create a new image from the image of the SQL Server version you choose, but we won't go into that in this article.
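For example, connecting from Windows to a container whose port 1433 is mapped to host port 1436 might look like the following sketch. The port, password, and database name here are illustrative assumptions, not values from the listings below:

```powershell
# A sketch only: the server address, port, and credentials are placeholders
$connectionString = 'Server=localhost,1436;Database=master;User Id=sa;Password=MyPassword;'
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$sqlConnection.Open()
$command = $sqlConnection.CreateCommand()
$command.CommandText = 'SELECT @@SERVERNAME' # a quick liveness check
Write-Output $command.ExecuteScalar()
$sqlConnection.Close()
```

SSMS and the other Windows-based tools use the same convention: the server name is the host address followed by a comma and the mapped port.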

Creating a SQL Server Container, Stocked With Any Databases You Need

In Running Linux SQL Server as a Container, I explained enough of the principles behind the SQL Server Docker image to get you started. We'll continue with this approach, because this is a technology you need to play with in order to build confidence slowly. This is a conventional Linux container, and the database works in the same way as other Linux-based relational databases. The Docker for Windows application manages the virtual machine in which the containers run. The SQL Server instance can communicate with the outside world in at least four ways:

  1. By a port address that you specify (for ODBC, SMO, SSMS, and other Windows-based SQL Server tools)
  2. By running the core Linux SQL Server tools, sqlcmd and bcp (lowercase, please), via docker exec
  3. By opening a bash shell in the container and running the Linux SQL Server tools
  4. By running mssql-cli, after creating a new image that includes it, and then using docker exec into the new container

We'll just use the second method because it cuts down on dependencies and keeps things neat. We'll assume that you will want to restore one or more backups into your new instance. If you don't specify any, then that's fine; there is less work for the script to do!
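Before the full script, the bare bones of that second method is a single docker exec call. The container name and SA password here are placeholders, just to show the pattern:

```powershell
# Hypothetical container name and password: run a query via the containerized sqlcmd
docker exec -it MyInstance /opt/mssql-tools/bin/sqlcmd `
       -S localhost -U SA -P 'MyPassword' -Q 'SELECT @@VERSION'
```

Everything in the script below is a variation on this call, plus the docker commands to create the container and copy files into it.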

Here is the script in a stripped-down form, to allow you to refine it, check out the parts, and walk through it. Once you have it working the way you want, it would be best to turn it into a function. I have introduced one or two checks that show how they can be done but aren't strictly necessary. The ContainerID, for example, is fetched from the container because some Docker commands still require it.

The script scans the container output to spot errors, but some informational messages are worth checking too; it always pays to check the logs.

Listing 1: Run a SQL Server instance in a container, copy and restore backup files

<#---- these parameters need to be changed for your requirements ----#>
  $VerbosePreference = 'Continue' # for debugging!
  ## Create an instance via docker
  $InstanceName = 'MyInstance' #the name of the sql server instance you will create
  $password = 'MyPassword' #the SA password which must conform to policy
  # The 'sa' password has a minimum complexity requirement (8 characters, uppercase,
  # lowercase, alphanumerical and/or non-alphanumerical)
  #https://docs.microsoft.com/en-us/sql/relational-databases/security/password-policy?view=sql-server-2017
  # otherwise it doesn't install completely.
  $Timezone = 'Europe/London' #The Timezone for the database
  $PortToUse = '1436' #The host port that the container's port 1433 is mapped to
  $BackupNames = @(@{
          BackupName = 'WideWorldImporters-Full.bak';
        DatabaseName = 'WideWorldImporters'
      },
      @{
          BackupName = 'AdventureWorks2016.bak';
          DatabaseName = 'AdventureWorks'
      }) #The database backup names
  $BackupPath = "C:\Backup\" #The full path to the backup file
  $ImageName = 'mcr.microsoft.com/mssql/server'
  #$ImageName = 'mcr.microsoft.com/mssql/server:2019-CTP2.2-ubuntu'
  <# -------the processing starts here --------#>
  $LogErrors = @() # initialize variable for any install logfile errors
  <# Create the shared directory for the container if it doesn't exist (
  this is only necessary if you opt for path mapping rather than creating
  a volume #>
  if (-not (Test-Path -PathType Container "C:\config\$InstanceName"))
  {
      # we create the  directory if it doesn't already exist
      New-Item -ItemType Directory -Force -Path "C:\config\$InstanceName";
  }
  # now run the docker image as a container
  docker run -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=$password"  `
         -e "TZ=$timeZone" -p "$($portToUse):1433" --name $InstanceName  `
         -v "$($InstanceName):/var/opt/mssql" `
         --restart unless-stopped  `
         -h $InstanceName -d "$ImageName"
  If ((docker ps --filter "name=$InstanceName" --format "{{.Names}}") -ne $InstanceName)
  {
      Throw "Failed to create SQL Server instance $InstanceName"
  }
  <# On a slow or cpu-throttled laptop you may need a brief sleep to ensure the container is 'alive' before making any calls to it #>
  Start-Sleep -seconds 2 
  <# it is still useful to have this container ID as some docker commands require it #>
  $Containerid = (docker ps --filter "name=$InstanceName" --format "{{.ID}}")
  <# check the install log for errors. e.g. Was that password OK? #>
  $LogErrors += docker logs "$InstanceName" | where { $_ -like '*Error *' }
  If ($LogErrors.Count -gt 0)
  {
      Throw "Errors were returned by SQL Server  $($InstanceName): $LogErrors"
  }
  <# Simple check to make sure the container is running #>
  $ContainerState = (docker inspect -f "{{.State.Running}}" $CONTAINERID)
  if ($ContainerState -ne 'true')<# returns a string 'true' if it is running #>
  { write-error "Docker container $InstanceName 'state.Running is $ContainerState" }
  $backupNames | <# Now copy the backup files to the container address space #>
  foreach {
      write-verbose "copying $($BackupPath)$($_.BackupName) to $($InstanceName)"
      docker cp "$($BackupPath)$($_.BackupName)" "$($InstanceName):/var/opt/mssql/data"
  }
  $Iterations = 3
  While ($Iterations -gt 0)
  {
      $result = docker exec -it "$InstanceName" /opt/mssql-tools/bin/sqlcmd `
                       -S localhost -U SA -P "$password" `
                       -Q "SELECT @@SERVERNAME"
      if ($result -like '*Error:*') #no exception is triggered  
      {
          # an error means that the instance won't allow logins yet
          $Iterations-- #count down so that we eventually give up
          if ($Iterations -gt 0)
          {
              write-verbose "the Container $InstanceName is still initializing"
              Start-Sleep -seconds 10
          }
      }
      else
      { $Iterations = -1 } #flag that it was successful
  }
  if ($Iterations -eq -1) { write-verbose "$InstanceName responds to queries" }
  else { throw "container $InstanceName never allowed logins" }
  if (-not ($result -like "*$InstanceName*")) #make sure that it reported its instance name
  { throw "container gave '$result' for ServerName query, not '$InstanceName'" }
  <# Specify where the database files go. First find out what there are #>
  $backupNames | foreach{
      $BackupName = $_.BackupName
      $DatabaseName = $_.DatabaseName
      write-verbose "Restoring $BackupName ..."
      $FileList = docker exec -it "$InstanceName" /opt/mssql-tools/bin/sqlcmd `
                         -S 127.0.0.1 -U SA -P "$password"    `
                         -Q "RESTORE FILELISTONLY FROM DISK='/var/opt/mssql/data/$BackupName'" -s "," -W
  <# Now build the SQL for the restore, specifying each file #>
      $SQLCommand = "RESTORE DATABASE $DatabaseName
      FROM DISK='/var/opt/mssql/data/$BackupName'
      WITH RECOVERY"
      #now we must specify where every file should go
      $SQLCommand += $FileList |
      Select-object -Skip 2 | convertfrom-string -Delimiter ',' |
      Select P1, P2 | where { $_.P2 } | <# ignore any trailing 'rows affected' line #>
      foreach {
          ",
      MOVE '$($_.P1)' TO '/var/opt/mssql/data/$(split-path -leaf $_.P2)'"
      }

      <# Now We have the SQL for this backup, we can execute it.#>
      docker exec -it "$InstanceName" /opt/mssql-tools/bin/sqlcmd `
             -S 127.0.0.1 -U SA -P "$password" `
             -Q $SQLCommand
  }

If successful, you'll have a running containerized SQL Server instance whose databases you can access through SSMS:

Your container will be running a standard instance of SQL Server from the chosen image. If you want features that aren't in this image, you would need to create a new image with the features you need, using docker image build. An example of a docker file to do this is here. We'll tackle this in more detail in a later article! It is also possible to add features by running a bash session in an existing container.
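As a rough sketch of the idea (not a tested recipe; the package installed is purely illustrative, and later images may run as a non-root user), such a Dockerfile starts from the standard image and layers the extras on top:

```dockerfile
# Illustrative only: extend the standard SQL Server image with an extra package
FROM mcr.microsoft.com/mssql/server
USER root
RUN apt-get update \
    && apt-get install -y --no-install-recommends curl \
    && rm -rf /var/lib/apt/lists/*
```

You would then build it with something like `docker image build -t my-mssql .` in the directory containing the Dockerfile, and use the new tag as the $ImageName in Listing 1.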

Building a Database Into a Container From a Script Directory

Imagine that you have the latest build and you want to test it. You can, of course, use SQL Server 2017 LocalDB to do this, but then you will be limited in what you can test. Instead, we'll create a Docker container hosting whatever version of SQL Server we want and stock it with the latest database build from source control.

To get a running container, just use Listing 1 without specifying any backups to restore. If you already have a container with the same name, you'd have to stop and remove the previous incarnation first. Then, in Listing 2 below, we create an empty database in the container using SMO, and then use SCA's Sync-DatabaseSchema cmdlet to synchronize the empty database with the latest development version of the database from the Git source code directory. If you are scripting, it is best to use SMO to create the database, because it takes care of generating SQL code for the CREATE DATABASE command that is appropriate for the version of SQL Server.

Having built the database, the script will stock the tables with your test datasets, assuming you specify a data directory. There are two quick ways of adding the data: either using native BCP from outside the container or, alternatively, using the native Linux bcp from within the container. The external method isn't as fast as the 'internal' one, but it is good enough for our purposes.
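For comparison, the 'internal' route looks something like the following sketch. The file, table, database, and container names are all placeholders, and it assumes a native-format .bcp file is first copied into the container:

```powershell
# Hypothetical names throughout: copy a native-format data file into the
# container, then run the Linux bcp utility inside it
docker cp 'C:\Data\dbo_Customers.bcp' 'MyInstance:/var/opt/mssql/data'
docker exec -it MyInstance /opt/mssql-tools/bin/bcp `
       'MyDatabase.dbo.Customers' in '/var/opt/mssql/data/dbo_Customers.bcp' `
       -S localhost -U SA -P 'MyPassword' -n -E
```

Because the transfer happens entirely inside the container, it avoids pushing every row across the mapped port, which is what makes it faster.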

Listing 2: Latest database build from source control, in a container

<#---- these parameters need to be changed for your requirements ----#>
  $ServerInstance = 'Server,1234'; #ensure the port number is right
  $DatabaseName = 'MyDatabase'
  $Uid = 'sa'
  $password = 'MyPassword'
  # a path for a scripts folder, created by SQL Compare or from SQL Source Control database repository
  $scriptDirectory = 'PathToMyGitDirectory\MyDatabase'
  # a path to a data folder, containing a native BCP output of all the tables 
  $DataDirectory='PathToDataDirectory'
   # a database connection string 
  $MyConnection  = "Server=$ServerInstance;Database=$DatabaseName;User Id=$Uid;Password=$password;"
  <# -------the processing starts here --------#>
  $Errors = @()
  $MS = 'Microsoft.SQLServer'
  $My = "$MS.Management.Smo"
  $Mc = "$MS.Management.Common"
  <# ------- import any modules you need --------#>
  Import-Module SqlChangeAutomation    `
      -ErrorAction silentlycontinue -ErrorVariable +Errors
  Import-Module sqlserver    `
     -DisableNameChecking -ErrorAction silentlycontinue -ErrorVariable +Errors
  <# ------- Connect to the container --------#>
  $sqlConnection = new-object System.Data.SqlClient.SqlConnection($MyConnection )
  $conn = new-object ("$Mc.ServerConnection")($sqlConnection)
  $InstanceObject = new-object ("$My.Server") $conn
  if ($InstanceObject.Version -eq $null)
  {
    $errors += "Could not connect to the server $ServerInstance with the credentials"
  }
  else
  {
      <# -------Kill any existing version of the database --------#>
    If (@($InstanceObject.Databases | % { $_.Name }) -contains $DatabaseName)
    { $InstanceObject.KillDatabase($DatabaseName) }
      <# -------Create a new version of the database --------#>
    $DatabaseObject = New-Object ("$My.Database") ($InstanceObject, $DatabaseName)
    $DatabaseObject.Create()
      <# -------Add any database configuration stuff here --------#>
    if ($DatabaseObject.name -ne $DatabaseName)
    {
      $errors += "Can't create the database '$DatabaseName' in '$ServerInstance'"
    };

  }
  if ($Errors.Count -eq 0)
  {
     <# -------Synchronize the empty database as the target --------#>  
    $syncResult =
    Sync-DatabaseSchema    `
        -Source $scriptDirectory -Target $MyConnection    `
        -AbortOnWarningLevel None -SQLCompareOptions 'NoTransactions'   `
        -ErrorAction silentlycontinue -ErrorVariable +Errors
  }
  <# -------Fill the new database with data if required --------#>  
  <# Now we BCP all the table data in. As we are using native mode the utility
          disables constraints for the table before doing the import #>
  If (($DataDirectory -ne $null) -and ($Errors.Count -eq 0))
  {
    If ($DataDirectory -ne '')
    {
      $DatabaseObject.Tables | Select name, Schema |
      foreach {
        $filename = "$($_.schema)_$($_.Name)" -replace '[\\\/\:\.]', '-';
        $TheScriptPath = "$($DataDirectory)\$($filename).bcp";
        $WhatHappened =
        BCP "`"$DatabaseName`".`"$($_.Schema)`".`"$($_.Name)`""    `
          in "`"$TheScriptPath`"" -q -n -N -E "-U$($Uid)"    `
          "-P$($Password)" "-S$($ServerInstance)";
        if ($WhatHappened -like '*Error*')
        { throw ("$whatHappened $ServerInstance $DatabaseName $filename") };
      }
      # native-mode BCP disables constraints before each import, so re-enable
      # and re-check them all once the data is in
      $result = $InstanceObject.ConnectionContext.ExecuteNonQuery(
        "EXEC sp_msforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT all'");
    }
  }
  <# -------Handle all soft errors here --------#>
  if ($Errors.Count -gt 0) { $errors | foreach{ Write-Error "$_" } }

Backing up a Containerized Database

Building a database and then backing it up allows other team members to pull an image of SQL Server and restore the database into their own container. It also makes plenty of test operations possible, particularly those that make changes to the databases.

Backing up a database in a container is an interesting activity because it means writing to the file system within the container. If you make a backup from a container, you will then need to copy the backup file out, whether you perform the backup via docker or via SSMS.

The PowerShell script in Listing 3 copies the resulting file into the host directory you specify.

Listing 3: Backup a database in a container

<# -------Backup a named database --------#>
  <#---- these parameters need to be changed for your requirements ----#>
  $InstanceName = 'MyInstance' #the name of the sql server instance you will create
  $password = 'MyPassword' #the SA password which must conform to policy
  $DatabaseName = 'MyDatabase'
  $Destination = 'c:\Backup'
  <# -------the processing starts here --------#>
  $SQLCommand = "BACKUP DATABASE $DatabaseName
       TO DISK = '/var/opt/mssql/data/$DatabaseName.bak'
          WITH FORMAT;"
  docker exec -it "$InstanceName" /opt/mssql-tools/bin/sqlcmd `
         -S 127.0.0.1 -U SA -P "$password" `
         -Q $SQLCommand
  <# -------Copy from the container to the host file system --------#>
  docker cp "$($InstanceName):/var/opt/mssql/data/$DatabaseName.bak" "$Destination"

Conclusions

In this article, I've shown the basics of how one can use containers to automate a few common database development tasks. I haven't given you an entire worked script, as these tend to be rather intimidating, but it should be very easy to combine these components to create a scripted build and test process. I would keep a shared backup directory that I'd fill with the latest version of the databases required for the application and build all the test or development containers on the destination machines, taking the backups from the shared directory.
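As a sketch of the glue involved, suppose Listing 1 had been wrapped into a hypothetical function called New-SqlServerContainer; the per-machine script might then look something like this (the share path, function name, and parameters are all illustrative assumptions, not part of the listings above):

```powershell
# Everything here is illustrative: the share path, function name, and
# parameter names are assumptions rather than tested values
$SharedBackups = '\\BuildServer\DatabaseBackups' # shared directory with current backups
$LocalBackups = 'C:\Backup'
$Required = @(
    @{ BackupName = 'WideWorldImporters-Full.bak'; DatabaseName = 'WideWorldImporters' })
# fetch the latest backup of each database the application needs
$Required | foreach { Copy-Item "$SharedBackups\$($_.BackupName)" $LocalBackups -Force }
# then run Listing 1, imagined here as a function, to create a local
# container and restore those backups into it
New-SqlServerContainer -InstanceName 'DevInstance' -PortToUse '1437' `
    -BackupPath "$LocalBackups\" -BackupNames $Required
```

Each developer or tester runs the same glue with a different instance name and port, so everyone gets an identical, disposable copy of the databases.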

Docker containers are very useful, but they are most useful for simple database applications. I don't mind having to use SQL Server authentication, and can live with the complications of a foreign filesystem in a VM and having to specify a container instance by its port address. It takes more head-scratching, though, to obtain an image with any out-of-the-ordinary features, and I haven't dared go into the complexity of creating a custom image in this article.

Containers certainly take more server resources than SQL Clone does because containers duplicate an entire server rather than reference a shared VHD, as with SQL Clone. However, I reckon that both technologies have their place in speeding the delivery process.


