
Deploying Spring Boot Microservices to Multiple AWS EC2 Instances

Learn how to deploy Spring Boot microservices to multiple AWS EC2 instances using Docker Swarm.

By Rida Shaikh · Updated Sep. 04, 19 · Tutorial



Swarm of bees. But you knew that already.

In a previous tutorial, we deployed services in a Docker Swarm using Docker stacks, using Play With Docker to simulate multiple nodes. In this tutorial, we will start multiple AWS EC2 instances and deploy the microservices on them using Docker Swarm.

You may also enjoy: 
Running Services Within a Docker Swarm (Part 1)


AWS EC2 Docker Swarm Tutorial

This tutorial is explained in the following YouTube video.


Getting Started

Starting Multiple EC2 Instances Using Docker Swarm

For this, you will need to register with Amazon Web Services and create an AWS account. When registering, you will need to provide credit card details. The AWS Free Tier is valid for one year and has usage limits; if you exceed them, AWS will charge you. In this tutorial, we will start two AWS EC2 instances. Once you are done with the tutorial, remember to stop or terminate the EC2 instances.

Once registered with AWS, go to the Services section and select EC2.

AWS services

This opens the EC2 Dashboard, which shows that there are zero instances running.

EC2 services

From the left-side menu, select Security Groups.

EC2 Security Group

For Docker Swarm to work across nodes, Docker requires several ports to be open between them: TCP port 2377 for cluster management, TCP and UDP port 7946 for node-to-node communication, and UDP port 4789 for overlay network traffic. We need to open the following ports:

Docker Container cloud port rules

Create a new security group named Docker with the following inbound and outbound rules.

Docker Container inbound rules

Docker Container outbound rules
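If you prefer scripting this step, a roughly equivalent security group can be created with the AWS CLI. The sketch below is illustrative and assumes the AWS CLI is configured for your account and that you are using the default VPC; adjust the CIDR ranges to your needs (outbound traffic is allowed by default, so only inbound rules are added here).

# Create the security group used by both EC2 instances
aws ec2 create-security-group --group-name Docker --description "Docker Swarm ports"

# Swarm cluster management traffic between the nodes
aws ec2 authorize-security-group-ingress --group-name Docker --protocol tcp --port 2377 --source-group Docker

# Node-to-node communication
aws ec2 authorize-security-group-ingress --group-name Docker --protocol tcp --port 7946 --source-group Docker
aws ec2 authorize-security-group-ingress --group-name Docker --protocol udp --port 7946 --source-group Docker

# Overlay network traffic
aws ec2 authorize-security-group-ingress --group-name Docker --protocol udp --port 4789 --source-group Docker

# SSH access for PuTTY and the port published by the producer service
aws ec2 authorize-security-group-ingress --group-name Docker --protocol tcp --port 22 --cidr 0.0.0.0/0
aws ec2 authorize-security-group-ingress --group-name Docker --protocol tcp --port 8080 --cidr 0.0.0.0/0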

Next, go back to the EC2 home page and click Launch Instance.

EC2 services

Select the Amazon Linux 2 AMI (HVM) Machine.

Amazon Linux 2 AMI

Select t2.micro as the instance type (this is the default option), then select Configure Instance Details.

Amazon Linux Instance Type

Keep the default instance details as provided and select Add Storage.

Amazon Linux Configure Instance Details

Keep the default storage setting and click Add Tags.

Amazon Linux Storage setting

In the Tags section, add a new tag named ec1 and select Configure Security Group.

Amazon Linux Tags section

In the Configure Security Group section, select the existing security group named Docker that we created previously.

Amazon Linux Configure Security Group

Finally, launch the instance. Create a new key pair named ec1 and download the key file ec1.pem.

Amazon Linux Launch

Follow the same steps to create another EC2 instance, this time adding the tag ec2. When launching it, don't create a new key pair; reuse the existing ec1 key pair. We have now launched two EC2 instances.

AWS EC2 instances
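The same two instances could also be launched from the AWS CLI instead of the console. This is a minimal sketch assuming a default VPC; <ami-id> stands for the Amazon Linux 2 AMI ID in your region and is not taken from the article.

# Launch one t2.micro instance using the ec1 key pair and the Docker security group
aws ec2 run-instances --image-id <ami-id> --instance-type t2.micro --key-name ec1 --security-groups Docker --tag-specifications 'ResourceType=instance,Tags=[{Key=Name,Value=ec1}]'

# Repeat with Value=ec2 for the second instance (same key pair and security group)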

Next, we will connect to them using PuTTY. To do so, we first need to convert the ec1.pem key to ec1.ppk format using PuTTYgen, as follows:

Open PuTTYgen.

Select the ec1.pem file from where you saved it, click Save private key, and save the key as ec1.ppk.

Select pem key in PuTTYgen

Next, we will connect to both EC2 instances using PuTTY.

Open PuTTY

Open PuTTY. In the AWS console, when you select the EC2 instance, there is a Connect button that shows the details needed to connect to that instance.

EC2 instance details


In PuTTY, enter the host name from above, ec2-user@ec2-18-216-91-80.us-east-2.compute.amazonaws.com, and under SSH -> Auth, select the ec1.ppk key. Then click Open to connect.

EC2 instance select key

The EC2 instance is now connected using PuTTY.

EC2 instance using PuTTY

Similarly, connect to the second EC2 instance.
EC2 instance
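If you are working from macOS, Linux, or an OpenSSH-enabled Windows terminal rather than PuTTY, the .pem-to-.ppk conversion is not needed; you can connect directly with the downloaded key (host name shown is the example one from above):

chmod 400 ec1.pem
ssh -i ec1.pem ec2-user@ec2-18-216-91-80.us-east-2.compute.amazonaws.com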

Starting Services on AWS EC2 Instances Using Docker Swarm

On both EC2 instances, install Docker and then start the Docker service:

sudo yum install docker
sudo service docker start


EC2 instance install Docker service
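Optionally, you can add ec2-user to the docker group on each instance so that Docker commands work without sudo; log out and back in for the change to take effect. The rest of this tutorial keeps sudo to match the screenshots.

sudo usermod -aG docker ec2-user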

On the EC2 instance that will be the leader (manager) node, initialize Docker Swarm:

sudo docker swarm init


EC2 instance start Docker service
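The init command prints the full join command, including the token and the manager's address and port. If you need it again later, it can be reprinted on the manager node at any time:

sudo docker swarm join-token worker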

On the second EC2 instance, which will be the worker node, run the join command printed by swarm init:

sudo docker swarm join --token <Token> <Manager-IP>:2377


EC2 instance init swarm

We can list the nodes in the Docker Swarm from the manager node as follows:

sudo docker node ls


EC2 instance list Docker service

Now, as in the previous tutorial, we will create the Docker stack file named docker-compose.yaml:

sudo vi docker-compose.yaml


EC2 instance Docker stack

The content of the file will be as follows:

version: "3"
services:
  consumer:
    image: javainuse/employee-consumer
    networks:
      - consumer-producer
    depends_on:
      - producer

  producer:
    image: javainuse/employee-producer
    ports:
      - "8080:8080"
    networks:
      - consumer-producer 

networks:
  consumer-producer:


EC2 instance Docker stack configuration

Next, deploy the Docker stack across the AWS EC2 instances using the stack file created above:

sudo docker stack deploy -c docker-compose.yaml dockTest


EC2 instance Docker stack deploy

We can list the running services in Docker Swarm as follows:

sudo docker service ls


EC2 instance Docker services list

Also, by listing the running containers, we can see which EC2 instance each service is running on. Below, the employee-consumer service is running on the manager node, while the employee-producer service is running on the worker node.

sudo docker container ls


EC2 instance Docker container list

Also, if we check the employee-consumer logs, we can see that the REST service exposed by the employee-producer is being consumed successfully:

sudo docker container logs <container-id>

EC2 instance Docker container logs

EC2 instance Docker service logs
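The same logs can also be followed from the manager node by service name, without looking up a container ID (this assumes the default json-file logging driver, which is what these services use unless overridden):

sudo docker service logs -f dockTest_consumer

Finally, as noted at the beginning, clean up when you are done so you are not billed: remove the stack, dissolve the swarm, and then stop or terminate both EC2 instances from the AWS console.

sudo docker stack rm dockTest        # on the manager node
sudo docker swarm leave              # on the worker node
sudo docker swarm leave --force      # on the manager node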



Further Reading 

Docker Tutorial: Play With Docker and Docker Swarm


Opinions expressed by DZone contributors are their own.
