
How to Install the ELK Stack on Google Cloud Platform


Learn how to successfully set up, install, and run Elasticsearch, Logstash, and Kibana (ELK) on the Google Cloud Platform.


Finding the right cloud provider can be difficult, even when you are evaluating it only for your log analytics needs. Here at Logz.io, we use the ELK Stack to monitor our own cloud infrastructure, and we also offer the log analytics platform as an AI-powered, enterprise-grade cloud service. So, we are always happy to give installation advice whenever we can.

Here, we'll take a look at installing the ELK Stack on Google Cloud Platform, the cloud infrastructure offering from Google.

This post is just a proof of concept, but Google Cloud offers a $300 credit for trial users that should be more than enough to deploy an application, get the ELK Stack set up, and prove that it will cover your needs should you want to move forward with Google Cloud.

Logistical Setup

First, you'll need to set up a Google Cloud Platform account. This is relatively straightforward: fill out the forms, and you will have your $300 free trial set up. Setting up your application is easy, too; it just requires getting the gcloud SDK set up for CLI deployment.

For this example, we will use a simple ToDo application — the code can be found here. Since the application isn't the focus, we can deploy it and hit it a couple of times so that there are some logs to work with later. The assumption here is that your own application will be much more robust and in need of log management.

Getting Started With ELK Setup

Once your application is deployed, logs should start flowing. This means it's time for the real setup to begin. Start by setting up Elasticsearch and Kibana on your Google Cloud installation. This is done via the Networking console in the web interface, where you will need to create firewall rules:

[Image: setting up Google Cloud firewall rules]

We'll add rules for Elasticsearch and Kibana with the source IP range set to 0.0.0.0/0 and the TCP ports set to 9200 for Elasticsearch and 5601 for Kibana. (Note: Both rule names are all lowercase, as prescribed by the Google Cloud firewall rules interface.) They should appear in the firewall list as follows:

[Image: Google Cloud firewall list]
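If you prefer the command line, the same rules can be created with gcloud. This is a sketch; the rule names here and the use of the default network are assumptions:

```shell
$ gcloud compute firewall-rules create elasticsearch --allow tcp:9200 --source-ranges 0.0.0.0/0
# Open Elasticsearch's HTTP port to all source addresses

$ gcloud compute firewall-rules create kibana --allow tcp:5601 --source-ranges 0.0.0.0/0
# Open Kibana's web UI port
```

Note that 0.0.0.0/0 opens these ports to the entire internet, which is fine for a proof of concept but should be narrowed for anything real.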

Now, you can begin to set up the actual ELK Stack. The first step is to set up an instance in Google Cloud Platform. Once the instance is set up, you can SSH into it using $ gcloud compute ssh #{instancename}.
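For reference, creating and connecting to the instance from the CLI looks roughly like this (the instance name, zone, and machine type are placeholders, not prescriptions):

```shell
$ gcloud compute instances create elk-instance --zone us-central1-a --machine-type n1-standard-1
# Create a VM for the stack; pick a zone close to you and enough RAM for Elasticsearch

$ gcloud compute ssh elk-instance --zone us-central1-a
# SSH into the new instance once it is running
```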

Here, you will install Java, Elasticsearch, Logstash, and Kibana. Using the following command sets, you will install each on your instance. (This assumes that you have gone through the steps with the CLI to establish a connection.)

Install Java

Java installation is simple and straightforward. Use the following command to make sure that you have a current, stable version of Java on the instance:

$ sudo apt-get install default-jre
# Installs Java and ensures we are using an up-to-date version; the ELK Stack requires Java 1.8+

Install Elasticsearch

$ sudo apt-get install apt-transport-https
$ wget -qO - https://packages.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
# Enable HTTPS package sources and add the Elastic GPG signing key

$ echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-5.x.list
$ sudo apt-get update
# Add the Elastic package source and refresh the package index

$ sudo apt-get install elasticsearch
# This will complete the installation

You will need to adjust the Elasticsearch configuration to ensure that the network host is correct. To do so, edit the elasticsearch.yml file:

$ sudo vi /etc/elasticsearch/elasticsearch.yml

Find the line referring to the network.host setting. It will be commented out. Uncomment it and make it read network.host: 0.0.0.0, and just be sure to save the file before exiting.

Back in the console, restart Elasticsearch:

$ sudo service elasticsearch restart
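To confirm Elasticsearch came back up and is reachable, you can query it from the instance (it may take a few seconds to start listening):

```shell
$ curl http://localhost:9200
# A healthy node answers with a small JSON document containing the node name and version
```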

Install Logstash

Your next step is to install Logstash. To do so, use the following commands:

$ sudo apt-get install apt-transport-https
# Lets apt fetch packages over HTTPS (a no-op if it is already installed)

$ echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-5.x.list
# This establishes the Elastic package source used for Logstash

$ sudo apt-get update
$ sudo apt-get install logstash
# Refresh the package index and install Logstash

$ sudo service logstash start
# Start the Logstash service so we can start shipping logs
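Logstash itself won't process anything until it has a pipeline. As a minimal sketch (the file name and the Beats port 5044 are conventional choices, not requirements), a configuration at /etc/logstash/conf.d/02-beats.conf could look like:

```conf
input {
  beats {
    port => 5044                  # listen for events from Beats shippers
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]   # forward events to the local Elasticsearch
  }
}
```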

Install Kibana

Finally, do the same for the Kibana service so that you will get nice visualizations of your logs:

$ echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-5.x.list
# Kibana 5.x ships from the same Elastic package source as Elasticsearch and Logstash

$ sudo apt-get update
$ sudo apt-get install kibana
# Refresh the package index and install Kibana

Similar to Elasticsearch, Kibana will need some configuration adjustments to work. Take a look at kibana.yml to make these changes:

$ sudo vi /etc/kibana/kibana.yml

Find the lines referring to server.port and server.host and ensure they read server.port: 5601 and server.host: "0.0.0.0". It should only be necessary to uncomment these lines. It may also be necessary to set elasticsearch.ssl.verificationMode: none if you are seeing SSL problems once everything is up and running.
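After the edits, the relevant portion of kibana.yml should read something like:

```yaml
server.port: 5601
server.host: "0.0.0.0"

# Only if you run into SSL verification errors against Elasticsearch:
# elasticsearch.ssl.verificationMode: none
```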

Once this is done, you can start the Kibana service:

$ sudo service kibana start

Success! Well…mostly.

If everything works as planned, you should be able to browse to the server's IP on port 5601 to see whether your setup was successful:

[Image: configuring an index pattern in Kibana]

You've come a long way and things are looking up, but you still need to pipe in some data. For this example, we are going to install Metricbeat to snag system metrics. The installation steps are as follows:

$ sudo apt-get install metricbeat
# Similar installation to the other components of the ELK Stack

$ sudo service metricbeat start
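By default, Metricbeat ships its data straight to Elasticsearch. If you need to point it at your instance explicitly, the relevant section of /etc/metricbeat/metricbeat.yml looks like this:

```yaml
output.elasticsearch:
  hosts: ["localhost:9200"]   # where Metricbeat sends the collected system metrics
```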

Once Metricbeat is up and running, set it as an index pattern in the Kibana management screen as metricbeat-*. Data will begin shipping, and you will have your ELK Stack set up:

[Image: shipping logs to Google Cloud]

There are other Beats that can ship other kinds of data, and you can run several Beats together. But for our purposes here, we have shown you how to successfully collect logs on the Google Cloud Platform with the ELK Stack.



Published at DZone with permission of PJ Hagerty, DZone MVB. See the original article here.

