Subtle Art of Leveraging Docker Compose for Test Automation
Learn to set up Docker Compose and push images to Docker Hub.
Many of us have, over the years, executed test automation against production-like test environments. By the term "production-like," I mean test environments that have the same setup as the production environment but may or may not have its exact configuration.
However, when it comes to executing test automation against these test environments, an engineer always faces a certain degree of challenge (although solvable). Classic examples include:
- A QA engineer is performing manual/exploratory testing on the test environment when the CI pipeline is triggered by a new commit. This deploys the current build snapshot and triggers automated tests that might hamper the test data already set up.
- This can be mitigated through the design of the test suites and the test framework, but parallel execution then becomes impossible for certain use cases.
- The overall cost associated with a dedicated, full-fledged, production-like test environment.
These are some of the challenges that can be solved by executing test automation using Docker Compose. Here are some of the things we were able to achieve with this approach:
- Whether running tests against Docker images instead of a full-fledged test environment is better is always debatable. But at the end of the day, the purpose of test automation is to certify the functionality baked into these Docker images, not the infrastructure setup. There can certainly be a subset of tests that runs against a real production-like environment, preferably a system integration test or development environment, while a Docker Compose-based setup is used to run the time-consuming regression suite.
- Multiple isolated environments can be created, possibly on the same host, and test suites can be run in isolation. The biggest advantage here is that instead of being dependent on automation-framework-level parallelization, we can leverage the parallelization offered by the underlying infrastructure, in this case the isolated environments created by Docker Compose (see the sketch after this list).
- Do test data setup and teardown at the test case or test suite level without worrying much about the impact when multiple test suites run in parallel. When parallelization is done at the automation framework level, the execution still points to a single environment. When it is done by executing tests against isolated environments, the responsibility shifts to the underlying infrastructure, bringing in a certain degree of separation of concerns.
- If a hotfix has to be deployed and regression has to be executed against it, spinning up an independent environment using Docker Compose and certifying the hotfix there is far more convenient than deploying the snapshot to the shared testing environment and putting ongoing sprint testing and deliverables on hold.
- For backward compatibility tests that require deploying the current production snapshot, it is far more effective to spin up an independent environment using Docker Compose, create the test data in the state it currently has in production, upgrade to the current release snapshot, and verify that the new changes do not break anything.
- Volumes can be leveraged to copy data needed for future reference, automated test execution reports in our case, from the containers to the host.
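As a minimal sketch of the isolation point above, Docker Compose can run the same stack several times on one host by giving each run its own project name via the -p flag; the project names below are illustrative:
# Spin up two isolated copies of the same stack on a single host,
# each under its own Compose project name
docker-compose -p regression-env-1 up -d
docker-compose -p regression-env-2 up -d
# Tear one environment down without touching the other
docker-compose -p regression-env-1 down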
In this article, we will see an example of how to run a REST API test automation suite against a microservice using Docker Compose. We will:
- Create a Docker image of the microservice and push it to Docker Hub.
- Create a Docker image of the REST API automation suite and push it to Docker Hub.
- Create a docker-compose.yaml that:
- Pulls the microservice Docker image from Docker Hub and brings the service up.
- Pulls the REST API automation Docker image from Docker Hub and brings it up such that the automated test suite gets executed against the microservice from the previous point.
- Copy the report of the test automation execution to a volume on the host on which the Docker containers are running.
Now let us have a look at each of these steps in detail.
Microservice Repository and Dockerfile
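The original Dockerfile is shown as a screenshot in the article. Purely as a hedged sketch, a Dockerfile for a Node.js-based microservice could look like the following; the base image, port, and start command are assumptions, not the author's actual file:
# Illustrative Dockerfile for the microservice; details are assumed
FROM node:12-alpine
WORKDIR /app
# Install dependencies first so this layer is cached between builds
COPY package*.json ./
RUN npm install
# Copy the rest of the service source
COPY . .
# Port the service listens on (assumed)
EXPOSE 3030
CMD ["npm", "start"]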

REST API Test Framework Repository and Dockerfile
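Again, the article shows this Dockerfile as a screenshot. As a sketch only, assuming a Maven-based REST API test framework, it could look like this; the base image, project layout, and report location are assumptions:
# Illustrative Dockerfile for the REST API test framework; details are assumed
FROM maven:3.6-jdk-11
WORKDIR /tests
COPY pom.xml .
# Pre-fetch dependencies so container startup is not dominated by downloads
RUN mvn dependency:go-offline -B
COPY src ./src
# Run the suite when the container starts; reports land under /tests/target
CMD ["mvn", "test"]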
Step 1: Build Docker Images for the Microservice Repository and the REST API Test Framework Repository
# Command to build the microservice Docker image
docker build . --tag shreyasc27/apiplayground:1.0.0
# Command to build the REST API automation framework Docker image
docker build . --tag shreyasc27/bestbuyapitests:1.0.0
# Command to list the tagged Docker images
docker images
Step 2: Running docker images
The docker images command lists the images created as a result of the above docker build commands.
Step 3: Push the Created Docker Images to Docker Hub
# Push the microservice Docker image to Docker Hub
docker push shreyasc27/apiplayground:1.0.0
# Push the REST API automation framework Docker image to Docker Hub
docker push shreyasc27/bestbuyapitests:1.0.0
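Note that pushing assumes you are already authenticated against Docker Hub and that the image tags use your own Docker Hub namespace in place of shreyasc27:
# Authenticate against Docker Hub before pushing
docker login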
Step 4: Pushed Images Listed in Docker Hub
Step 5: Add a Local Directory to Be Mounted Into the Docker Containers
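As a sketch of this step, the directory name below is illustrative; any host path works as long as docker-compose.yaml mounts the same path:
# Create a host directory next to docker-compose.yaml to receive the test reports
mkdir -p ./test-results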
Step 6: Inspect the Configuration of the docker-compose.yaml
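The article presents the docker-compose.yaml as a screenshot. The sketch below is only an approximation of such a configuration; the service names, port, BASE_URL variable, and container-side report path are assumptions, while the image names and the host volume mount follow the steps above:
# Illustrative docker-compose.yaml; service names, port, and paths are assumed
version: "3"
services:
  apiplayground:
    image: shreyasc27/apiplayground:1.0.0
    ports:
      - "3030:3030"
  apitests:
    image: shreyasc27/bestbuyapitests:1.0.0
    depends_on:
      - apiplayground
    environment:
      # Base URL the test suite targets, resolved over the Compose network
      - BASE_URL=http://apiplayground:3030
    volumes:
      # Write generated reports back to the mounted host directory
      - ./test-results:/tests/target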
Step 7: Run docker-compose up
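A typical invocation stops all containers once the test container finishes and propagates its exit code, which is handy in CI; the service name apitests refers to the illustrative compose sketch above:
# Bring up the microservice and the tests, stop when the tests finish,
# and use the test container's exit code as the overall result
docker-compose up --abort-on-container-exit --exit-code-from apitests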
Step 8: Result Files Generated in the Mounted macOS Directory
Step 9: Detailed HTML Report From the Mounted macOS Directory
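On macOS, the generated HTML report can then be opened straight from the mounted directory; the exact file name depends on the framework's reporting plugin and is assumed here:
# Open the HTML report from the mounted host directory (file name assumed)
open ./test-results/index.html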
The video tutorial for the steps can be viewed here.
Achieving the steps above needs strong coordination between Devs, QAs, and Ops. Without the support of Ops, it will be relatively tricky to achieve this goal.