In traditional software development, the professionals responsible for building a company's applications made up the development team. A separate quality assurance (QA) team tested the applications. Once testing was complete, the program was handed off to operations, which was then responsible for maintenance and update management.
This methodology was called waterfall development, and it had obvious shortcomings. First, it created divides, or silos, in which these separate teams functioned. Each performed a specific task with little regard for the requirements of the others. Quality assurance, for instance, might not be given enough time to adequately test a solution before the projected release date. Likewise, operations might run into unforeseen hurdles that keep the software from functioning as intended. Often, the result was a cumbersome software creation process that incited animosity between departments.
As the name implies, DevOps was meant to bring development and operations teams together into a more cohesive IT staff. QA, likewise, was absorbed into what essentially became a collaborative team working toward a common goal. The idea was to eliminate the kinks in the product development and deployment chain by dissolving the barriers between departments. Different organizations have achieved this with varying degrees of success, often through practices such as agile operations and continuous delivery.
A case could be made that those companies that didn't succeed in their transition to DevOps did not adequately define it – and herein lies a significant area of contention for organizations and industry experts.
Philosophy or Process?
According to DevOps.com contributor Vincent Geffray, DevOps is a philosophy first, mainly because there is no hard-and-fast rule for how teams should behave in a DevOps environment.
"There are many articles out there that try to define DevOps, but DevOps is more of a philosophy than a procedure," Geffray wrote. "If you polled 100 organizations on their experiences implementing a DevOps strategy within their existing business processes, you'd likely get 100 different answers. An organization's journey toward DevOps is usually unique, since there are so many different factors that influence a DevOps integration."
This view makes sense, considering all organizations are different and will have varying objectives driven by any number of unique industry circumstances. Thus, the denotation of the term "process," which Merriam-Webster defines as "a series of actions that produce something or that lead to a particular result," doesn't seem to fit with this need for flexibility, as the "series of actions" and the "particular result" will vary significantly between organizations that claim to have DevOps teams. The only constant across these DevOps-inspired organizations is therefore the underlying philosophy informing their processes. DevOps.com contributor Mike Schmidt sums up this thought model in the following way:
"[A] culture and environment where building, testing and releasing software can happen rapidly, frequently and more reliably."
Processes Are Like Branches of the DevOps Tree
That said, the DevOps philosophy cannot be entirely divorced from processes, much like the branches of a tree cannot be dissociated from the trunk. This is where development models come into play. Schmidt supplies the example of continuous delivery, which entails building a solution in such a way that it can be released at any point in production. This doesn't mean that it must be released in its crudest form, only that it hypothetically could be, because loose ends are kept tied up at every stage. Achieving this model requires extremely well-choreographed collaboration among developers, QA, designers and other departments – in other words, an unremitting adherence to the DevOps philosophy.
Continuous delivery is essentially agile software development on steroids. The objective of agile is still to produce defined builds for delivery. However, unlike waterfall development, these builds are far less cumbersome and occur much more frequently, sometimes on a daily basis.
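The "releasable at any point" invariant described above can be sketched in a few lines of code. Everything here is illustrative: the stage names and check functions are hypothetical stand-ins for a project's real build, test and packaging tooling, not any particular CI product's API.

```python
from typing import Callable, List, Tuple

# Hypothetical stage checks; a real pipeline would invoke the
# project's actual build and test tools instead of returning True.
def compile_ok() -> bool:
    return True  # e.g. the build produced artifacts without errors

def unit_tests_ok() -> bool:
    return True  # e.g. the automated test suite passed

def integration_tests_ok() -> bool:
    return True  # e.g. the deployed build responded correctly

STAGES: List[Tuple[str, Callable[[], bool]]] = [
    ("build", compile_ok),
    ("unit tests", unit_tests_ok),
    ("integration tests", integration_tests_ok),
]

def releasable() -> Tuple[bool, List[str]]:
    """A build is releasable only if every stage passes --
    the core invariant of continuous delivery."""
    failures = [name for name, check in STAGES if not check()]
    return (not failures, failures)

ok, failed = releasable()
print("releasable" if ok else f"blocked by: {failed}")
```

The point of the sketch is the gate itself: no single team "signs off" at the end; instead, every change is continuously checked against all stages, which is why the model demands the cross-team collaboration described above.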
The Commonality of Collaboration and Organization Tools
It's clear that continuous delivery and agile are kindred branches on the DevOps family tree. Despite being slightly different development models, they have considerable overlap that manifests itself in a very concrete way: tools that accommodate the pace at which software is developed. Again, this will vary significantly between companies. However, one of the defining aspects of the DevOps mindset is the ability to make software fast – if need be – but also robust, and this requires resources that waterfall developers wouldn't have needed.
That's why many software development platforms provide real-time project tracking and collaboration. It's vital that developers, testers, designers and ops are always on the same page. Every time a new line of code is written, everyone needs to know about it. On the QA side, ensuring that each new update functions well with the existing codebase requires frequent testing.
To this end, test automation integration is a huge help. A strong test management tool not only integrates seamlessly with the overall project management software; it also integrates with automation tools that allow regression tests to run automatically with each new build. This frees testers for less repetitive, more hands-on work such as user acceptance testing, load testing and exploratory test cases. These agile testing practices are vital to making sure that code is vetted for quality as quickly as it's updated.
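As a concrete illustration, here is a minimal sketch of the kind of regression suite such automation would run on every build. The `apply_discount` function and its tests are hypothetical examples standing in for real application code, not part of any particular tool.

```python
import unittest

# Hypothetical application code under test; stands in for the
# existing code that each new build must not break.
def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by percent (0-100)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class RegressionSuite(unittest.TestCase):
    """Tests pinned to known-good behavior, run automatically on
    every build so a new change cannot silently break old code."""

    def test_basic_discount(self):
        self.assertEqual(apply_discount(100.0, 20), 80.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(59.99, 0), 59.99)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(10.0, 150)

if __name__ == "__main__":
    # A CI hook would run this suite on every push, e.g. via
    # `python -m unittest discover`.
    unittest.main(exit=False)
```

Because the suite runs without human intervention, a failing regression test flags a breaking change minutes after it is committed, which is exactly the fast feedback loop the DevOps mindset calls for.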
In conclusion, DevOps is really a culture first, one that drives concrete activity. That activity may vary significantly between organizations, but at the end of the day, collaboration and flexibility will underpin it everywhere.