Docker and DevOps: Developing Stateful Applications and Deploying in Docker


Check out what can happen when you make DevOps part of the process for stateful apps.



This whitepaper discusses strategies for building a stateful Java application with the Jenkins continuous integration server, packaging the application as a Docker image, and deploying the database and web server as containers on the Docker platform. It can be used as a reference for moving any application into containers and deploying it in the Docker ecosystem.

Docker has become the leader in the container software platform category. RightScale's Cloud Report indicates that Docker usage is not limited to containers alone: usage of Docker in the DevOps tools category is increasing continuously. In addition, Docker has been reducing the need for configuration management tools such as Chef, Puppet, and Ansible. By combining the power of Docker and the DevOps process, applications are developed quickly and deployed with end-to-end automation. This eliminates the issues of manual builds, manual deployments, and hosting on a fixed set of servers with dependency conflicts and constraints.


Docker has been widely used to deploy services and products and automatically customize them for the end users. This reduces the effort required to install the product and configure a different subsystem of the product, and saves the install and configuration time. In addition, Docker supports multiple activities within a software lifecycle. These include:

  • Packaging the application as a Docker image using DockerFile
  • Generating a Docker image from the Dockerfile with the docker build command (or through Docker Compose's build support)
  • Versioning different builds of the Docker images in Docker Registry
  • Using the Docker cluster ecosystem to deploy the containers with load balancing and high availability
  • Deployment of the application stack on the Docker machine

In this white paper, let us analyze these features from a DevOps perspective and explore in detail how the basic building blocks of DevOps, such as continuous integration, continuous delivery, and controlled/continuous deployment, can be achieved, along with best practices.

DevOps Process and Stages

DevOps broadly consists of elements like process, continuous integration, continuous inspection, continuous delivery, and others.

In this paper, continuous integration, delivery and deployment are discussed, including all the Docker operations related to these phases.

Continuous Integration

Docker has plugins for most continuous integration servers such as Jenkins, Bamboo, and TeamCity. In this white paper, Jenkins is used as the continuous integration tool.

Jenkins polls the version control system and triggers the build based on the code change. Regular best practice and quality checks are performed subsequently. These include building the code, static analysis, unit testing, and functional testing with quality threshold values defined. Once the quality gate checks are passed, the code is ready for deployment.
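These stages can be sketched as a declarative Jenkins pipeline. The stage names, Maven goals, and image tag below are illustrative assumptions, not a prescribed setup:

```groovy
// Sketch of a declarative Jenkins pipeline for the stages described above.
// Tool invocations and names are illustrative.
pipeline {
    agent any
    stages {
        stage('Build')           { steps { sh 'mvn -B clean package' } }
        stage('Static Analysis') { steps { sh 'mvn sonar:sonar' } }
        stage('Unit Tests')      { steps { sh 'mvn test' } }
        stage('Docker Image') {
            steps { sh 'docker build -t myapp:${BUILD_NUMBER} .' }
        }
    }
}
```

A quality-gate plugin (for example, the SonarQube quality gate step) can be wired between the analysis and image stages so the build fails before an image is ever produced.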

Dockerfile and Containerization

Docker provides base images for all the popular operating systems and business application servers. A Dockerfile lets developers pack their binary files inside a Docker image.

Based on the application type, select the corresponding Docker image from Docker Hub. This becomes the base image, on top of which the application is layered.

For example, if the requirement is a database engine with pre-populated data, then the base image is PostgreSQL, Oracle, MySQL, etc. If the requirement is to run an application server, then it could be Tomcat, JBoss, etc.

Steps in the Dockerfile are defined to copy the built binary into the base Docker image, and a new Docker image is created from it. This image is then hosted in a private Docker registry or on Docker Hub. The application-specific image is used to create containers for the application during deployment. A Docker image for a simple binary package can be created with a few lines of instructions in the Dockerfile; applications with more configuration steps require correspondingly more steps in the Dockerfile.

The docker build command interprets the steps in the Dockerfile and generates a Docker image; Docker Compose can also trigger this build when bringing up a stack.


Packing a binary file that requires no further configuration steps:

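A minimal Dockerfile for this case might look like the following; the base image tag and WAR file name are illustrative assumptions:

```dockerfile
# Start from an application-server base image pulled from Docker Hub.
FROM tomcat:9-jre11

# Copy the binary built by the CI server into the server's deploy directory.
COPY target/myapp.war /usr/local/tomcat/webapps/

# Build with: docker build -t myapp:1.0 .
```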

In this example, the required additional software, Ansible, is installed first (yum install ansible), and the Ansible scripts that install and configure the database are copied inside the image (ADD hosts provision.yml). The Ansible script is then executed to prepare the environment and populate the database. In a nutshell, all the steps the end customer would perform to install and configure the software are executed once and captured in a single Docker image, which can then be moved around and deployed quickly and with ease.

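A sketch of such a Dockerfile, assuming a CentOS base image and the hosts/provision.yml files mentioned above (the base image tag and file paths are illustrative):

```dockerfile
# Base image for the database environment (tag illustrative).
FROM centos:7

# Install the additional provisioning software.
RUN yum install -y ansible

# Copy the inventory and the playbook that install and configure the database.
ADD hosts provision.yml /etc/ansible/

# Run the playbook so the image already contains a configured,
# pre-populated database when containers are created from it.
RUN ansible-playbook -i /etc/ansible/hosts /etc/ansible/provision.yml
```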

Continuous Delivery

If the above strategy is followed, every build in the continuous integration server generates a Docker image that can be preserved, similar to versioning built binaries in an artifact repository at the end of a successful build. This allows the development, verification, and operations teams to choose and deploy a particular version of the Docker image with the corresponding functionality/features, thereby enabling a quicker delivery cycle.

Archiving in Docker Registry

Docker provides a private registry that can be used to host images inside the organization, and popular artifact servers such as Nexus or JFrog Artifactory also provide Docker repositories. Docker Hub can be used to share images publicly. Build steps/plugins in Jenkins help tag the local image and push it to the Docker registry.

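The tag-and-push step boils down to two CLI commands; the registry host, repository, and tag below are illustrative:

```shell
# Tag the locally built image for the target registry
docker tag myapp:1.0 registry.example.com/myteam/myapp:1.0

# Push it so other environments can pull this exact version
docker push registry.example.com/myteam/myapp:1.0
```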

Application binaries produced by continuous integration servers are deployed immediately into the lower environments, such as development or testing, as part of continuous deployment, but deployment into sensitive environments is controlled with automated and manual approvals. Docker images stored in the registry are deployed into the environments with the help of Jenkins and the Docker Compose utility.

Docker Host/Swarm Environment

Docker is installed on top of an operating system that is hosted either directly on a physical system or on a virtual machine. Docker provides a complete ecosystem for hosting containers and isolates each container's environment from the host operating system as well as from other containers. In most cases, a single sufficiently powerful host is enough for small applications in development, which may need only a few containers to run components such as a database and a web server.

If the application is large and needs several containers, or the load on the application is high and needs to be balanced, Docker provides a clustering mechanism in Docker Swarm. Docker Swarm combines the Docker processes running on several hosts and presents their combined resources to run the application. Load balancing and orchestration are handled by Docker Swarm internally.
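Standing up a small swarm takes only a few commands; the service name, replica count, and port mapping below are illustrative:

```shell
# Initialize a swarm on the first node, which becomes a manager
docker swarm init

# On each additional host, join using the token printed by the command above:
#   docker swarm join --token <token> <manager-ip>:2377

# Run a service with several replicas; Swarm spreads them across the nodes
docker service create --name web --replicas 3 -p 8080:8080 myapp:1.0
```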

Docker persists data on the host machine or on shared storage through volumes. Docker volumes can map a mount point inside the container either to a storage location on the physical machine or to external storage through volume drivers. In a multi-host cluster environment, external storage is preferred because the data needs to be shared across hosts.
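For the single-host case, a named volume mapped into the database's data directory is enough; the image tag and data path below are illustrative:

```shell
# Create a named volume (the default local driver stores it on the host)
docker volume create dbdata

# Mount it at the database's data directory so data survives the container
docker run -d --name db -v dbdata:/var/lib/postgresql/data postgres:15
```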

Deployment of Services in Docker

Docker images versioned/tagged in the Docker registry are deployed into the Docker environment through the Docker command line interface (CLI) or through Docker Compose. Docker Compose allows the engineer to define the deployment options for a container in a text file. All the containers in an application can be defined together in the Compose file and connected to each other through different options. A Compose file written once for an application can be reused multiple times on the same environment or on different environments.

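A Compose file for the database-plus-web-server example might look like the sketch below; the registry paths, port, and volume path are illustrative:

```yaml
version: "3"
services:
  db:
    image: registry.example.com/myteam/mydb:1.0
    volumes:
      - dbdata:/var/lib/postgresql/data   # state survives container restarts
  web:
    image: registry.example.com/myteam/myapp:1.0
    ports:
      - "8080:8080"
    depends_on:
      - db                                # start the database service first
volumes:
  dbdata:
```

Running `docker-compose up -d` against this file brings up the whole stack, and the same file can be reused for every environment.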

Stateful Application

Applications that predate microservices architectures were designed with the application's state persisted in a database or on physical storage. The current status of a transaction is decided based on the persisted values. This state is preserved across restarts and used during backup and recovery if required.

To compare development with and without Docker and DevOps, consider an application with a single database and multiple web server instances connecting to that database instance.

Development Without DevOps and Docker

Historically, applications and enterprise systems have been developed starting from the database/storage design. Over the years, the design evolved from a single application with a heavy load on the database to client-server architectures, multi-layer architectures, and microservices on clustered environments with high availability. Though the load on the database and the business logic moved to different layers, the database layer continues to be important.

In environments where DevOps is not yet implemented and practiced with tools like Docker, application development and deployment are manual and rely heavily on documentation full of manual steps.

In our example, with one database and many web server instances:

  • Developers need to pass the instructions on how to set up the database, install the web server, and bootstrap the database with required tables, users, populated fields, etc.

  • Subsequently, these instructions need to be captured in install/configuration guides to share with internal teams and the end customer.

  • In every step, manual instructions have to be followed strictly, and any mistake requires troubleshooting and effort from the team involved.

  • Depending on the issue, effort is required from internal engineers, support engineers, customers, and everyone else involved.

  • Apart from the effort, the impact on the business and the cost associated with it are high.

The state of the application is stored in the database, which is hosted on a single server. Hence, the dependency on that server is high, and the data needs to be backed up regularly. In the event of hardware failure, data migration, patching, upgrades, or scaling, spinning up a new environment and making the same data available on the new system is time-consuming and error-prone.

Many production issues are reported when additional software is installed in the environment where the application is hosted. Common examples are conflicts caused by installing a different version of the Java Runtime Environment (JRE), and different modules of the same application requiring different versions of the same third-party component.

End-to-End Automation of Application Development

DevOps tools and processes help the team automate the development and deployment phases of the software lifecycle. In each step, appropriate tools are evaluated and used, and manual steps are reduced. Docker plays an important role in this automation: it eliminates manual errors and provides an ecosystem that is flexible for different integrations and for scaling.

During the initial stages of development, the DevOps toolchain is prepared and configured with quality gates defined at each level. For example, the toolchain monitors check-ins, triggers code analysis, performs unit testing, and posts the results for code review only if the threshold values in the quality gate are met. Once the binary is built, deployment is completely automated: the required systems are provisioned, third-party software is installed, and the built binary is deployed. When this automation is configured with Docker, the containers are already prepared with a bootstrapped database and web server instance, ready to run on any environment. Hence, Docker reduces the effort spent on configuration management and optimizes the automation further.

In the example of hosting an application with one database and multiple web server instances, the components are segregated by functionality. The database and related operations are considered an independent service that runs on its own; it can be created as a container hosted on the Docker environment. The web server and the application hosting the business logic can be considered another service, connected to the database service through a well-defined interface. The database instance, running as a container, stores its data in volumes, which can be kept on the host machine's storage or on common storage with volume drivers such as Flocker, EMC, or NetApp.

Docker clusters such as Swarm may move container instances between available hosts, so the volume data needs to be available across machines. The volumes of the database container (which hold the state of the application) are stored in shared storage and moved between nodes with the help of volume-sharing techniques such as Flocker.
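With a volume plugin installed, this looks much like the single-host case; the driver name (Flocker is one such plugin) and paths below are illustrative:

```shell
# Create a volume backed by a shared-storage driver so the data can
# follow the database container as it moves between swarm nodes
docker volume create --driver flocker dbdata

# Attach it to the database container exactly as with a local volume
docker run -d --name db -v dbdata:/var/lib/postgresql/data mydb:1.0
```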

Advantages of Using Docker and DevOps Together

DevOps and Development: Since the toolchain is configured with the required quality gates and appropriate tools, engineers can focus on their part of the development and need not worry about other things such as environmental issues, configuration problems, and software conflicts.

Install and Configuration: Since Docker images are prepared with the required third-party software and application already installed and preconfigured, installation and configuration issues are eliminated.

Migration and Upgrade: Containers running on one environment can be moved to another environment as long as the volume data is made available in the new environment. Docker offers support on sharing the volumes or movement of data which is inside the volume.

Scaling and Performance: When the load on a service increases, Docker provides support to replicate the service and spin up new instances immediately. Since the required environment with the necessary software is already available in the container, scaling is easy.
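In a Swarm cluster, scaling is a one-liner; the service name and replica count are illustrative:

```shell
# Increase the number of running instances of the web service to 5
docker service scale web=5
```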

High Availability and Maintenance: When the application is hosted in a Docker cluster environment such as Swarm or Kubernetes, multiple nodes are configured and prepared to share the volume data across nodes. Therefore, the dependency on any one machine is eliminated, and Docker provides support for rolling updates and zero-downtime patching.
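A rolling update in Swarm replaces tasks gradually; the service name and image tag below are illustrative:

```shell
# Roll out a new image version one task at a time, so some replicas
# keep serving traffic while the others are being replaced
docker service update --update-parallelism 1 --image myapp:2.0 web
```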


Developing software applications with a properly defined DevOps toolchain, the appropriate tools, and Docker reduces a great deal of manual effort. This helps the organization develop faster and produce deliverables frequently.

Docker is already popular for developing and hosting microservices in clustered environments with orchestration tools. With more support now available to persist data and share it across machines and environments, moving stateful applications that have a database and other dependencies is achievable. Case studies are available, and the features being added to Docker and related systems indicate that many applications are migrating to Docker.


Application Development Consideration

  • Since the service may move between machines, implement a connection pool mechanism and attempt to reestablish the connection before failing.
  • Implement session-sharing logic, and avoid changing the fields in the session between releases, to support rolling updates, zero downtime, and scaling.
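The first consideration is language-agnostic; the sketch below (in Python, with hypothetical names and defaults) shows the retry-before-failing idea:

```python
import time

def connect_with_retry(connect, retries=5, delay=1.0):
    """Call `connect` until it succeeds, retrying before failing.

    `connect` is any callable that returns a connection object or raises
    ConnectionError; the function name and defaults here are illustrative.
    """
    last_err = None
    for _ in range(retries):
        try:
            return connect()
        except ConnectionError as err:
            last_err = err          # remember the failure
            time.sleep(delay)       # back off before the next attempt
    raise last_err                  # all attempts exhausted
```

In a real application, the same loop would wrap the connection pool's checkout call, so a container that moved to another node causes a short delay rather than a hard failure.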



Opinions expressed by DZone contributors are their own.
