SQL Server DevOps: Continuous Integration and Deployment with Redgate’s DLM Automation

November 15th, 2017 | Application Lifecycle Management (ALM)

I recently wrote about how deploying changes to a SQL Server database has become easier than ever and highlighted the three primary methods of automating database deployments: Redgate’s DLM Automation, Redgate’s ReadyRoll and Microsoft Visual Studio SQL Server Projects. In this installment, I will go into detail on exactly how to use Redgate’s DLM Automation to configure a database for Continuous Integration and Continuous Deployment. As I briefly described in my earlier post, the Redgate tool is designed to work with a physical database that functions essentially as a local workspace for development. My development database is hosted on a local instance of SQL Server, but I will be deploying to an instance of Azure SQL Database.

The first thing that is needed, and at the core of the Redgate solution, is Redgate’s SQL Source Control. SQL Source Control can leverage several different source control systems, including Git, TFVC, SVN, Perforce and others. I will be using a Git repository hosted in a VSTS team project. Redgate provides basic access to Git for operations such as pull, commit and push, but for the initial configuration of your Git repository you will need to use a more fully featured Git client. I won’t go through the steps of setting up a Git repo since they’re a bit out of scope. However, with a Git repository in place and SQL Source Control installed in my local development environment, I will link my database to the repository by selecting the Link to my source control system option and then choosing Git from the Setup page.
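Since the repository setup itself is out of scope, here is only a minimal command-line sketch; the VSTS clone URL below is a placeholder for your own team project’s URL:

```shell
# Minimal sketch: create a local repo and register the central VSTS
# repository as its remote. The URL is a placeholder, not a real project.
mkdir PartsUnlimited-db && cd PartsUnlimited-db
git init
git remote add origin https://myaccount.visualstudio.com/_git/PartsUnlimited
git remote -v    # confirm the remote is registered
```

With the remote registered, any full-featured Git client (or the command line) can handle the initial push before handing day-to-day commits over to SQL Source Control.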


Choose an existing Git repository and the linking is complete.


After linking the database, I generally edit the source control filters so that user and role information is not added to source control. The filter itself is checked in to source control. Static data from lookup tables can also be added; in this case I will link the Product table’s data to identify changes that may need to propagate across environments.


Once I have made the changes to the filter and static data options, selecting the Commit page shows all the changes that need to be committed to my local repository.


After committing the changes, they need to be pushed to the central repository. In my example I am pushing to master. However, with Git, this can be changed by using the Git command line to switch branches. The Redgate tool automatically picks up on the branch checkout and, after a refresh, indicates the current branch.
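Switching branches happens outside the Redgate UI; a minimal command-line sketch (the branch name is only an example) looks like this:

```shell
# Create and switch to a feature branch; after a refresh, SQL Source
# Control will show it as the current branch. Branch name is an example.
git checkout -b feature/price-update
git rev-parse --abbrev-ref HEAD   # prints the current branch name
git checkout master               # switch back when the work is merged
```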


Now that the database schema and the linked Product table data are safely checked in to VSTS, it is time to create and test a build as well as configure an automatic deployment of the database changes to my Azure SQL Server. I will then make some changes to both the data and the schema to test the process.
On the VSTS Build page for my PartsUnlimited project, I create a new build definition and choose to start with an empty process.


With my empty build definition I choose an appropriate agent. ‘Hosted’ works, but a local build server can be configured easily if that is desired. Clicking on the Add Task link brings up the task selection.


The Redgate build task isn’t included in the default list of build tasks, so it’s off to the Marketplace to find and install it.


A quick search of the Marketplace reveals four available extensions. The ones I am interested in for this exercise are the DLM Automation build and deploy extensions which can be quickly installed to VSTS or downloaded and installed to a local instance of TFS.


Once I have the extensions installed, I can choose the Redgate build task to add to my build definition and configure it to build my database. In the list of operations for the build task are four options:

  • Build a database package from Redgate Source Control
    • Creates a NuGet package
  • Test a database package built by DLM Automation using tSQLt tests
    • Runs database unit tests
  • Publish a database package built by DLM Automation to a NuGet feed
    • Publishes a NuGet package to a feed
  • Sync a DLM Automation Package to a target database
    • Deploys database changes contained in a NuGet package

For this example I am only going to use the Build package option. However, just as in an application build, unit tests should not be neglected. Redgate has created a UI that utilizes the open-source tSQLt framework (http://tsqlt.org) to create tests that exercise database functionality. Adding the build task and choosing the test option enables you to configure which of the tests created and committed to source control should be run. If you are utilizing a specific NuGet feed for deployments, you can add a task to publish a built package. Finally, if deploying to a specific database at the end of the build is desired, then a task can be configured to perform that operation as well. Since I want to manage my deployments to multiple environments with a managed workflow, I will use the deployment extension to deploy my database instead of the sync option of the build extension.
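For a sense of what such a test looks like, a tSQLt test is just a stored procedure in a test-class schema; the class and test names below are illustrative, not from the PartsUnlimited project:

```sql
-- Illustrative tSQLt unit test; class and procedure names are examples.
EXEC tSQLt.NewTestClass 'testProduct';
GO
CREATE PROCEDURE testProduct.[test no Product has a negative Price]
AS
BEGIN
    DECLARE @BadRows INT;
    SELECT @BadRows = COUNT(*) FROM dbo.Product WHERE Price < 0;
    EXEC tSQLt.AssertEquals @Expected = 0, @Actual = @BadRows;
END;
GO
-- Run every test in the class:
EXEC tSQLt.Run 'testProduct';
```

Committed alongside the schema, tests like this are what the Test a database package build task picks up and executes.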

To configure the build task, I choose the Build a database package option and specify a name for my NuGet package. I end the name with an underscore because I want my package to take on the format of PackageName_MajorVersion.MinorVersion.BuildNumber. I specify the major and minor version and choose the option to append the build number to the package version to accomplish this.


Now that the configuration is complete, I will save and queue the build which runs and completes successfully.


To configure my build as a Continuous Integration (CI) build, I enable the Continuous Integration trigger so that any future changes to the database that are pushed to my repository will cause the build to execute.


The next step is creating a release to deploy the changes to my Azure SQL Server. On the release page, clicking on the New Release button brings up the option for choosing an appropriate release template. Since I will be using the Redgate release task, I choose to start with an empty process, which brings up the release configuration page. I need to ensure that my CI build is linked to the release, as well as configure the deployment to my first environment to be automatic (Continuous Deployment).


I will be deploying to three environments: user acceptance test (UAT), staging and production. I name my first environment UAT. Once I configure the UAT environment, I can use it as a template and clone it to create the Stage and Production environments. Clicking on the Task link in the UAT environment brings up the release configuration page.
I added and installed the deploy extension at the same time I installed the build extension, so the Redgate Release extension is available in the list of deployment extensions.


Just like the build extension, the release extension offers four separate tasks that can be configured:

  • Create a database release
    • Creates a release which can be used to deploy changes. This option is very useful when there are specific scripts that need to be run along with the packaged changes contained in the NuGet package artifact created by the build
  • Deploy database changes from a package
    • Deploys changes directly to a database from the NuGet package
  • Deploy database changes from a release
    • Deploys the release created with the first option
  • Deploy database changes from a live database
    • This option can be used to reference a database instead of a release or NuGet package

For this example, I am going to keep it simple and deploy the changes from the NuGet package. This is the option I commonly use since it mirrors the typical activity of deploying application changes contained in a build to various targets.
Before configuring the deployment task, I want to create scoped variables so that after configuring the UAT environment I can clone the environment, which will also clone the variables. That way I only have to define the variables once and then simply update their values for each cloned environment.


Masking a value is simply a matter of selecting the lock icon, which masks the value and stores it encrypted. Clicking the lock again to unlock the field does not reveal the value.


The Redgate deployment task is not complicated. As mentioned, I pick the “Deploy from Package” option and then configure the package path. There is an option for specifying additional SQL Compare options in case there are aspects of your database that you want to omit from the comparison operation that generates the final changes to the database.
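The field takes a comma-separated list of SQL Compare option names; for example (an illustrative selection, not the full list):

```
IgnoreWhiteSpace,IgnoreComments,IgnoreUsers
```

Options like these exclude cosmetic or environment-specific differences from the comparison that generates the deployment changes.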


For the second set of values, I use the variables defined earlier.


Now that I have the UAT environment configured, I can clone it to preserve the settings and variables for the Staging and Production environments.


After cloning, the variables and their scope have been created for me so all I have to do is update the values for the cloned environments.


Next I will configure the workflow to require approval prior to deploying to staging and production. I also verify that the UAT environment is set for automatic deployment, so anytime a new build is successfully completed, the deployment to UAT is executed.


For the staging and production environments, I configure the conditions so that the previous environment must have deployed successfully and the deployment has been approved. Approvals can be configured so that all approvers must approve the release in a specific order or in any order, or so that only one of the approvers is needed.


The final step is testing the workflow. Submitting a change to my database will trigger the process. Since I linked the data in the Product table, I am going to increase the product prices, then commit and push the changes to trigger the build.

UPDATE dbo.Product
SET Price = Price * 1.10;


After pushing the changes to VSTS, the CI build is triggered as expected.


When the build completes successfully, it triggers the first deployment to UAT. When that deployment is successful the deployment to Staging waits for the approval process.


Checking the Price data in staging before and after the release shows the changes have propagated as expected:
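The before-and-after check is just a query against the staging database; the Name column below is my assumption about the Product table’s schema:

```sql
-- Run against the staging database before and after the release.
SELECT TOP (10) Name, Price   -- Name column assumed; adjust to your schema
FROM dbo.Product
ORDER BY Name;
```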


After approving the deployment to staging and production, all environments were successful.


The changes deployed to the database are idempotent which, in case you have not run across that term, simply means (in the DevOps application of the term) that the process can run multiple times without the output changing. In other words, if I redeploy the same release, the product prices will not be increased again, and if my change had added a table, the deployment would not try to re-add the table and introduce errors the second time around.

The Redgate DLM Automation extension is straightforward and fairly easy to configure. Of the three database deployment approaches I am covering, the DLM Automation extension is my preferred approach. There are many more things that can be done with it that I have not covered. More complex schema and data changes that may be problematic with the conventional check-in and build process can be addressed with migration scripts that are committed to source control as well. Compare options can be configured that ignore specific changes for each deployment. For those organizations where a deployment to production cannot be automated due to technical or regulatory reasons, a deployment package can be created and executed manually. A full description of the tool can be found on Redgate’s website: https://www.red-gate.com/products/dlm/dlm-automation/.
