ALM, DLM and Facing the Future

April 2nd, 2015 | Application Lifecycle Management (ALM), Tools

I have a passion for databases. I have to admit that my passion sometimes runs afoul of developers looking to short-cut the data requirements of an application. It especially runs afoul of a software configuration management system that doesn’t recognize that databases exist at all.

Fortunately, the folks at Redgate share my passion. They also share the passion of delivery managers and database architects around the world. How can we take a system designed to develop, build and deliver applications and make it work with databases as well? It is a question that has plagued the vast majority of the clients I have worked with over the past few years.

Many of these questions have been answered by Redgate in their fairly recent venture into the world of DLM (Database Lifecycle Management, if you are not familiar with the acronym). DLM has been around for quite a while, but for some reason it is only recently earning the attention it has always deserved. Most organizations rise and fall based on the data they generate, collect and/or analyze. Even with that fact firmly understood, a majority of organizations still do not source control their databases, or if they do, it is done without the same care and attention that is applied to their application code. A significant number of database teams make a nod toward configuration management, but remain in the dark ages when it comes to true continuous integration, unit testing and automated deployments.

The bottom line is that the days of absolute control of the production data platform are over. In truth, that absolute control was always a myth. For years, the DBA team would receive a list of scripts that needed to be run against production to support the latest release. Most often these scripts would be run manually by the DBA on duty, and it was the same for the Dev and QA systems. The human factor was always involved because automating database builds and delivery was just not feasible. Or was it?

In the modern age of application development, most serious developers would scoff at the idea of manually deploying every file needed for a new release, yet that is exactly what happens with the database changes associated with the release: they are manually applied, manually tested and then manually deployed. Every step of the way there is an opportunity for a tired DBA, up for the past 36 hours supporting a high-maintenance system, to make a mistake. Sometimes the mistake is just annoying. Sometimes, however, it can end up costing close to $1,000,000 (not that I would know anything about that).
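To make the contrast concrete, here is a minimal sketch of the kind of automated, repeatable deployment described above: a tiny migration runner that applies SQL scripts in order and records what has already been run, so nobody has to remember which scripts have been delivered at 3 a.m. This is illustrative only, not Redgate's tooling; it uses Python's built-in sqlite3 module as a stand-in database, and the `schema_migrations` table and function names are my own invention.

```python
import sqlite3
from pathlib import Path

def apply_migrations(conn, migrations_dir):
    """Apply .sql scripts from migrations_dir in filename order,
    skipping any already recorded in the schema_migrations table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_migrations (name TEXT PRIMARY KEY)"
    )
    applied = {row[0] for row in conn.execute("SELECT name FROM schema_migrations")}
    for script in sorted(Path(migrations_dir).glob("*.sql")):
        if script.name in applied:
            continue  # never re-run a script that already shipped
        conn.executescript(script.read_text())
        conn.execute(
            "INSERT INTO schema_migrations (name) VALUES (?)", (script.name,)
        )
        conn.commit()  # record success only after the script applied cleanly
```

Because the runner checks the tracking table first, it is safe to point at the same script folder in Dev, QA and production and get the same result every time, which is exactly what the manual process cannot guarantee.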

So what’s the answer? As you may have guessed, I am a big fan of Redgate tools. They get it. They know that databases deserve the same level of attention and respect that application code receives. In light of that, Northwest Cadence has teamed up with Redgate to provide some awesome opportunities to learn how to use the Redgate tools effectively. The first will be offered in Bellevue, Washington on May 15, 2015. You can find the registration information for that event here:
