
Why Should We Care About Containers for Development


The use of containers in development clears away various concerns surrounding application version control and interdepartmental deployment.

· Cloud Zone ·

More than once in your development career, you've probably spent a few hours troubleshooting an issue only to discover it was a dependency or version mismatch, right?

Environments that vary from one to the next, outdated components, and the chore of setting up development machines are frustrations we could all do without.

VMs solve some of these issues, but managing an entire machine for each environment, and underutilizing it, is costly. This is where containers have come to solve many of these challenges.

Why Containers?

There's no doubt you've heard the buzz about containers over the last year or more. If not containers themselves, then some technology, framework, or tooling associated with them: Docker, Kubernetes, and microservices, to name a few. But what are the benefits from a developer's perspective?

"Works on My Machine"

Each environment your code or application is deployed to can differ. The number of servers and the configuration of CPU, RAM, and so on may vary across Development, QA, and Production, so there is no guarantee the app will perform the same in each. Containers let you define these parameters at startup, so each image runs with the same resource limits regardless of where it is deployed:

docker run --cpus=2 --memory=1.5g myapp

Microservices

Applications composed of many services may not all be developed in the same language or framework. If you are developing a service in .NET Core, for instance, that calls a downstream service written in Node.js or Go, the team responsible for that service can build the container image and publish it to your private registry.

Using a docker-compose file, you can then pull in that container image, without needing Node.js or Go installed on your machine, and test your calls against the development version:

services:

  nodeservice:
    image: nodeservice:dev
    environment:
      - NODE_ENV=Development

  namesweb:
    depends_on:
      - nodeservice
    image: namesweb
    build:
      context: .
      dockerfile: namesweb/Dockerfile
    environment:
      - NODE_SERVICE_ENDPOINT=NodeService-Dev
    ports:
      - "57270:80"

Test Data

Few applications don't involve data, and maintaining an instance of your production system locally can be a real task. Installing the tools, platform, and server, and keeping them all up to date, is more than you probably want to worry about as a developer.

In the docker-compose file below, an instance of SQL Server on Linux is started; a script file then creates the test database and, finally, loads test data using the Bulk Copy (bcp) tool.

version: '3.0'  
services:

  mssql:
    image: microsoft/mssql-server-linux:latest
    container_name: mssql
    ports:
      - 1433:1433
    volumes:
      - /var/opt/mssql
      # we copy our scripts onto the container
      - ./sql:/usr/src/app
    # bash will be executed from that path, our scripts folder
    working_dir: /usr/src/app
    # run the entrypoint.sh that will import the data AND sqlserver
    # run start.sh (which imports the data) alongside SQL Server itself
    command: sh -c 'chmod +x ./start.sh; ./start.sh & /opt/mssql/bin/sqlservr;'
    environment:
      ACCEPT_EULA: 'Y'
      SA_PASSWORD: P@$$w0rd


start.sh

#!/bin/bash
wait_time=20s

echo creating resources in $wait_time  
sleep $wait_time  
echo starting...

echo 'creating Names DB'  
/opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P $SA_PASSWORD -i ./init.sql

echo 'creating Names Table'  
/opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P $SA_PASSWORD -i ./data/NameTable.sql

echo 'importing data...'  
# Bulk load data from a csv file
/opt/mssql-tools/bin/bcp Names in data/Names.csv -S 0.0.0.0 -U sa -P $SA_PASSWORD -d Names -c -t ','

echo 'add uniqueid column'  
/opt/mssql-tools/bin/sqlcmd -S localhost -d Names -U SA -P $SA_PASSWORD -I -Q "ALTER TABLE Names ADD ID UniqueIdentifier DEFAULT newid() NOT NULL;"

echo 'checking data'  
/opt/mssql-tools/bin/sqlcmd -S localhost -d Names -U SA -P $SA_PASSWORD -I -Q "SELECT TOP 5 ID,FullName FROM Names;"


init.sql & NameTable.sql

DROP DATABASE IF EXISTS Names;
GO

CREATE DATABASE Names;  
GO

USE Names;  
GO

CREATE TABLE Names(  
  LastName nvarchar(max),
  FirstName nvarchar(max),
  FullName nvarchar(max)
);
GO  


Using this test-database image, a subset of the production data is loaded and available for the application to use. The container_name: mssql value serves as the server name in the connection string, so pointing the application at this server requires nothing more than a settings change.
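As a sketch, that settings change might look like the following in the app's compose service (the key name ConnectionStrings__NamesDb is hypothetical; adjust it to your app's configuration):

```yaml
  namesweb:
    depends_on:
      - mssql
    environment:
      # "mssql" (the container_name above) resolves to the database
      # container on the compose network
      - ConnectionStrings__NamesDb=Server=mssql;Database=Names;User Id=SA;Password=P@$$w0rd;
```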

Deployment

Deploying applications is never easy, right? Rolling back on failure is an even harder story, and at times it is almost another deployment in itself.

Whether you build your containers with a full CI/CD system, with command-line tooling through docker build, or with the Visual Studio Docker Tools, all of the dependencies are isolated inside the image. When it is deployed to the host machine from a registry, it is still that same container.

Things do go wrong, and should there be problems with a deployment, rolling back can be as simple as changing the version tag of an image from 1.1 to 1.0, and your app is reset.
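A sketch of what that rollback looks like in a compose or deployment file (the registry host and tag names here are illustrative):

```yaml
services:
  myapp:
    # roll back by pinning the previous tag (was :1.1) and redeploying;
    # the 1.0 image in the registry is byte-for-byte the one that ran before
    image: myregistry.example.com/myapp:1.0
```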

Summary

Containers provide a packaging and deployment mechanism for our application and its dependencies. The container registry is a powerful concept that helps with the deployment and distribution of our application.

Containers also improve the "inner loop" of our development experience when working locally, particularly as we trend towards building microservices over monolithic applications. They provide greater parity across local and remote environments including the cloud and help our infrastructure to become more immutable.

The vibrant ecosystem of tooling around containers also helps us adopt cloud-native platforms and development methodologies. Whether we are using serverless containers, a PaaS platform that supports containers, or an orchestrator like Kubernetes, we focus on our application instead of thinking about and managing the individual host or hosts we deploy it to.

