Microservices Automation Deployment Using AWS and Docker

Learn how to automate microservice deployment so you can focus on feature development.

by Aritra Nag · Oct. 31, 18 · Tutorial


Using Docker + AWS to Build, Deploy, and Scale Your Application

For this tutorial, you'll set up a pipeline that automatically builds your Docker application on every commit and deploys it to an Elastic Beanstalk environment sitting behind a load balancer for scalability. This continuous integration pipeline will allow you to worry less about your deployments and get back to focusing on feature development within your application.

Components:

  • AWS CodeCommit – source control (Git)
  • AWS CodeBuild – source code compiler and test runner
  • AWS CodePipeline – builds, tests, and deploys code every time the repository changes
  • AWS Elastic Beanstalk – service that manages EC2 instances, handling deployments, provisioning, load balancing, and health monitoring
  • Docker + Spring Boot – our containerized Spring Boot application for the demo

Application: Web Portal using Spring Boot

Repository: https://github.com/aritnag/DockerMicroServiceDEMO

When built with Maven, the application produces a microservice.jar file, which is the artifact our Dockerfile references.

Maven build: mvn clean install

This will produce target/microservice.jar. The Dockerfile below uses a Java 8 base image to add, expose, and run the Spring Boot application.

FROM java:8
EXPOSE 9000
ADD /target/microservice.jar microservice.jar
ENTRYPOINT ["java","-jar","microservice.jar"]
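
To sanity check the image before wiring up the pipeline, you can build and run the container locally with the standard Docker CLI (the image tag below is just an example, not something the article prescribes):

# Build the jar, then build the image from the project root
mvn clean install
docker build -t microservice-demo .

# Run the container, mapping the exposed port 9000 to the host
docker run -p 9000:9000 microservice-demo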

1. Git Repository Initialization Using CodeCommit

First things first, we need a Git repository to build our code from. AWS CodeCommit is cheap, reliable, and secure. It uses S3, which is a scalable storage solution subject to S3 storage pricing.

Begin by logging into your AWS console and creating a repository in CodeCommit. For the purpose of this tutorial, I have given the repository the same name as the Spring Boot application. Once created, you will be presented with the standard HTTPS and SSH URLs of the repository.
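
If you prefer the command line, the repository can also be created with the AWS CLI (a sketch; the repository name matches the remote URL used later in this tutorial, and your configured region applies):

# Create the CodeCommit repository from the CLI
aws codecommit create-repository --repository-name DockerAutomateMS --repository-description "Spring Boot microservice demo"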

1A. Configuring Identity and Access Management (IAM)

We can create a separate user for uploading the code and committing to the repository.

Here, we have used the root account, which, by default, has all the required policies.

1B. Moving the Code to the New CodeCommit Repository

With the new repository created, clone the GitHub repository holding our sample Spring Boot application, change the remote to your new CodeCommit repository location, and finally push the master branch.

git clone https://github.com/aritnag/DockerMicroServiceDEMO.git
git remote set-url origin https://git-codecommit.us-east-1.amazonaws.com/v1/repos/DockerAutomateMS
git push origin master

2. CodeBuild Setup

Now that the CodeCommit repository holds our sample Spring Boot application, the code needs to be built for deployment. Navigate to CodeBuild, a managed build service that compiles source code and runs tests, billed on demand.

Start by creating a new build project and point the source to the AWS CodeCommit repository created in Step 1. Here, I have selected AWS CodeCommit as the source provider and specified the DockerCodePipeline repository.

Next, it asks for environment information. The default system image is fine for this build process. The most important part is to tell CodeBuild to use the buildspec.yml. The buildspec contains the commands to run and defines the artifacts needed to deploy to Elastic Beanstalk.

Included in the sample Spring Boot application is a buildspec.yml. This file is used to tell CodeBuild what commands to run in each phase, and what files to bundle up and save in the artifacts.

buildspec.yml:

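The original article shows this file as a screenshot. A minimal sketch of what it might contain for this Maven and Docker setup (the phases and paths here are assumptions, not the exact file from the repository):

version: 0.2

phases:
  build:
    commands:
      # Build the Spring Boot jar the Dockerfile expects at target/microservice.jar
      - mvn clean install

artifacts:
  files:
    # Bundle the jar and the Dockerfile so Elastic Beanstalk can build the image
    - target/microservice.jar
    - Dockerfile
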
The final setup for the build process is to specify where the artifacts produced by the buildspec.yml will be stored. In the example below, I put all artifacts in Amazon S3, in a folder named dockerAWSCodePipeline inside a bucket named dockerAWSCodePipeline. You can use any bucket of your choice, but it must be created in S3 before creating the build project.
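
The bucket can be created from the console or with the AWS CLI. Note that S3 bucket names must be lowercase, so a lowercased variant of the name above would be needed; the name and region below are only placeholders:

# Create the S3 bucket that will hold the build artifacts
aws s3 mb s3://docker-aws-codepipeline --region us-east-1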

The build project is now configured and ready to use. Builds can be run manually from the console, creating artifacts that are stored in S3 as defined above.

3. Elastic Beanstalk Setup

Now that the code is in CodeCommit and the artifacts are built using CodeBuild, the final resource needed is a server to deploy the code to. That is where Elastic Beanstalk comes in. Elastic Beanstalk is a service that automatically handles provisioning, load balancing, auto-scaling, and more. It is a very powerful tool to help you manage and monitor your application servers.

Let's assume, for example, that my API needs four servers due to the number of requests it receives. Elastic Beanstalk makes scaling those servers simple through configuration options.

Begin by creating a new web server environment and give it a name and domain name. This domain name is your AWS domain name; if you have a personal domain name, you can point it to the load balancer being created here using Route 53.
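
If you prefer the command line over the console, a roughly equivalent setup can be done with the EB CLI (a sketch; the application and environment names are placeholders):

# Initialize the EB CLI for a Docker platform application
eb init docker-microservice-demo -p docker -r us-east-1

# Create the load-balanced web server environment
eb create docker-microservice-env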

The last step of creating your web server environment is to tell Elastic Beanstalk that we want to run Docker and to start with the sample application code. Later, our code from CodeBuild will replace the AWS sample application.

The server and environment will take several minutes to start. Once complete, navigate to the configuration page of your new Elastic Beanstalk environment.

By default, the environment has a load balancer installed and auto-scales. A scaling trigger can be set to adjust the number of running instances based on certain requirements. For example, I could set my minimum instances to 1 and maximum to 4 and tell the trigger to start a new instance each time CPUUtilization exceeds 75%. The load balancer would then spread requests across the instances currently running.
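
As a rough CLI equivalent of that console configuration, the same limits and trigger could be applied with option settings like these (the environment name is a placeholder, and the values are just the example from above):

# Set min/max instances and a CPU-based scaling trigger on the environment
aws elasticbeanstalk update-environment \
  --environment-name docker-microservice-env \
  --option-settings \
    Namespace=aws:autoscaling:asg,OptionName=MinSize,Value=1 \
    Namespace=aws:autoscaling:asg,OptionName=MaxSize,Value=4 \
    Namespace=aws:autoscaling:trigger,OptionName=MeasureName,Value=CPUUtilization \
    Namespace=aws:autoscaling:trigger,OptionName=Unit,Value=Percent \
    Namespace=aws:autoscaling:trigger,OptionName=UpperThreshold,Value=75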

4. CodePipeline Configuration

This is the final piece of the puzzle, bringing steps 1-3 above together. Notice that up until now we have had to manually tell CodeBuild to run and then go to Elastic Beanstalk to manually specify the artifact for deployment. Wouldn't it be great if all of this could be done for us?

That is exactly what CodePipeline does. It fully automates the building and provisioning of the project. Once new code is checked in, the system magically takes care of the rest. Here is how to set it up.

Begin by creating a new pipeline in CodePipeline. In each step, select the repository, build project, and Elastic Beanstalk environment created in steps 1-3 above.

Once complete, the pipeline will begin monitoring your repository for changes. When a change is detected, it will build the project and deploy it to the available servers in your Elastic Beanstalk application. We can monitor the pipeline in real time from its detail page.
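
The same state is available from the CLI if you want to check on a run without opening the console (the pipeline name is a placeholder):

# Show the current stage-by-stage state of the pipeline
aws codepipeline get-pipeline-state --name docker-microservice-pipeline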

Conclusion

When configured properly, CodePipeline is a handy tool for the developer who wants to code more and spend less time on DevOps.

This pipeline gives a developer an easy way to manage an application, big or small. It doesn't take a lot of time or money to set yourself up with a scalable application that uses a quick and efficient build and deployment process.

If you need a solution to build, test, deploy, and scale your application, consider AWS CodePipeline as a great way to get your project up and running quickly.
