Why Continuous Delivery for the Database Matters
“The day started at 6:30 this morning.”
“Unfortunately, I had a late afternoon meeting with my boss and emails have been piling up. Several of the development teams finished some work and their QAs are ready to test. As their DBA, I just need to quickly check over the scripts and push them out to the Test Database so the QA teams can get started. Done.”
“Now, to get to my real work…”
“An IM pops up: another DBA tried to push changes to staging, but he applied them in the wrong order. Apparently the documented deploy steps were wrong. I think I can get him out of this, so I help him back out the changes and finish the deployment.”
“Really? More emails? The QA Team found problems and the Developers need another push to Test. Two more teams need pushes as well. I don’t have time to review… well, I’ll just push it out. They’ll find any problems in test anyway. Done.”
“Seriously, is it already 5:15? I spent my entire day fighting fires instead of finishing the production database maintenance plan... again.”
“There has to be a better way.”
As it turns out, there IS a better way. But first, we need to understand why this is happening.
From the 10,000 ft. level, today’s businesses are struggling to keep pace with consumer demand. Not so much the demand for goods and services, but the demand for the ideal customer experience. Consumers want to interact with their favorite companies when and where they choose, on the platform of their choice. And if they don’t receive the experience they’re looking for, the competition is literally only a click away. At the same time, the window for capitalizing on market opportunities is shrinking, driven by fierce competition, the rapid pace of technological innovation, and the demands of consumers themselves.
In response, the business has steadily increased the pressure on IT to help the company become more nimble. This demands that IT deliver products and services faster and more often. IT has responded in kind through the development and adoption of practices like Agile development, Continuous Delivery, and DevOps.
Over the past decade, we’ve seen significant innovation in development, test, and release tools to support these new philosophies. Source code control systems, continuous integration servers, automated testing infrastructure, and application release automation are all examples of tools developed to help IT deliver stable, high-quality software faster.
Continuous Delivery, as a philosophy, is gaining rapid adoption among today’s enterprises. It borrows concepts from lean manufacturing and extends them to the process of creating and delivering software. It coaches us to view the entire SDLC as one process, and to map that process out to understand how value flows through the system. Once understood, Continuous Delivery coaches us to maximize the flow of value by identifying inefficiencies in the process, and improving those steps through a combination of standardizing and automating workflows.
Standardizing a workflow makes it repeatable. Once a workflow is repeatable, it becomes a good candidate for automation. Once it is automated, the workflow becomes faster and far less prone to human error. In Continuous Delivery, the goal is to automate the entire process from development through test, and in some cases all the way through to production – creating a deployment pipeline.
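To make the “standardize, then automate” idea concrete, here is a minimal sketch of one automated step in a database deployment pipeline: applying SQL migration scripts in a fixed, sorted order and recording which ones have already run, so every environment receives the same changes in the same sequence. The file layout, the `deploy` function, and the `run_sql` callback are illustrative assumptions, not the API of any real tool (a product like Datical DB handles this far more robustly).

```python
from pathlib import Path

def deploy(migrations_dir: Path, applied_log: Path, run_sql) -> list[str]:
    """Apply pending *.sql migrations in name order; return the ones applied.

    run_sql is a callable that hands a SQL string to the database -- an
    assumption standing in for a real database client.
    """
    # Read the names of migrations already applied in this environment.
    applied = set(applied_log.read_text().splitlines()) if applied_log.exists() else set()
    newly_applied = []
    for script in sorted(migrations_dir.glob("*.sql")):
        if script.name in applied:
            continue  # already deployed here; skip to keep the run repeatable
        run_sql(script.read_text())
        newly_applied.append(script.name)
    # Record everything applied so far, so the next run is a no-op for these.
    applied_log.write_text("\n".join(sorted(applied | set(newly_applied))) + "\n")
    return newly_applied
```

Because the script order is deterministic and the log makes the step idempotent, the same command can run unattended against test, staging, and production, which is exactly the “wrong order” failure from the staging story above that automation removes.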
Continuous Delivery works. Where it’s been successfully implemented, the IT organization is delivering more features – faster and with fewer errors. This, in turn, has given the business the ability to get to market faster while reducing operating expenses, thereby creating a sustainable competitive advantage.
But, this recent phenomenon is now causing a problem for the Database Team whose primary job is to provide a safe, high-performance environment for the company’s most valued assets – the data. That mandate is being threatened by the speed of innovation in development. Increasingly, DBAs have to figure out how to maintain the pace of development without sacrificing the quality and security of the data.
THAT is the challenge. THAT is why Continuous Delivery for the Database matters.
In DZone’s recently released Guide to Continuous Delivery, the editors report that Continuous Delivery for the database is a rapidly growing area of interest among IT organizations. In a survey of more than 900 IT professionals, 30% report having implemented Continuous Delivery for the database, and another 51% “want to implement.” While 30% is still only half the adoption rate for application Continuous Delivery, it represents 9% growth over the past year for the database. This suggests that a growing number of IT organizations face the same problems and are working to remove the constraints that database deployments place on the Continuous Delivery process, in the hope of further process improvement.
If you’d like to learn more about Continuous Delivery for the database, join us for a webinar we’re hosting next Wednesday, Feb. 18th from 12:00 – 1:00 pm EST. We’ll discuss the differences between Continuous Delivery for the database and for the application, the considerations for a Continuous Delivery implementation, and the common challenges DBA teams face in supporting increased release velocity. We’ll also show what a typical Continuous Delivery pipeline looks like, and demonstrate how to build an automated database deployment pipeline using Datical DB.