Using Filebeat to Send ElasticSearch Logs to Logsene
Using Logsene and Filebeat to look at and analyze ElasticSearch logs
One of the nice things about our log management and analytics solution Logsene is that you can talk to it using various log shippers. You can use Logstash, syslog-capable tools like rsyslog, or simply push your logs through the ElasticSearch API, just as you would send data to a local ElasticSearch cluster. And like any good DevOps team, we like to play with all the tools ourselves, so we thought the timing was right to make Logsene work as a final destination for data sent using Filebeat.
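To illustrate that last point, here is a minimal sketch of pushing a single log event to Logsene through its ElasticSearch-compatible API. The token is a placeholder, and the `/log` type name is just an example; the actual network call is commented out since it needs a real token:

```python
import json
import urllib.request

# Placeholder Logsene app token -- use your own.
TOKEN = "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"

# Logsene speaks the ElasticSearch API; the app token plays the role
# of the index name in the URL.
event = {"@timestamp": "2016-01-05T12:00:00Z", "message": "hello from the ES API"}
req = urllib.request.Request(
    "https://logsene-receiver.sematext.com/%s/log" % TOKEN,
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # uncomment with a real token to actually ship the event
print(req.full_url)
```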
With that in mind, let’s see how to use Filebeat to send log files to Logsene. In this post we’ll ship ElasticSearch logs, but Filebeat can tail and ship logs from any log file, of course.
The first step is the easiest: go to the Filebeat download page and get the package for your operating system. For the purposes of this article we've used Filebeat 1.0.1.
After you download the package you need to unpack it into a directory of your choice.
If you already have your Logsene application created, great! If not, sign up for Logsene first. You will need your application token, which you can find on the Logsene integrations page, in the top-right menu.
Once you have the Logsene app token you are ready to configure Filebeat. To do that you first need to create a new configuration file called logsene.yml and put in it a configuration snippet similar to the one below:
```yaml
filebeat:
  prospectors:
    -
      paths:
        - /opt/elasticsearch/2.1.0/logs/*.log
      input_type: log
output:
  elasticsearch:
    hosts: ["https://logsene-receiver.sematext.com:443"]
    index: "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"
logging:
  to_files: true
  files:
    path: /var/log/filebeat/
    name: filebeat_eslogs.log
    rotateeverybytes: 10485760
  level: info
```
A brief comment on the above configuration: the first section describes which log files should be read and sent to Logsene. In this example we are shipping logs from files ending with .log in the /opt/elasticsearch/2.1.0/logs/ directory.
The second section, called output, tells Filebeat to send data to ElasticSearch. Yes — to ElasticSearch — because Logsene provides the ElasticSearch API. You’ll need to provide two properties here:
- The first one, called hosts, needs to point to https://logsene-receiver.sematext.com on port 443. You'll want to use SSL, but if you don't, you can also send data over plain HTTP to port 80.
- The second option is the index, and you'll need to specify your Logsene app token here. The example above shows a token of aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee; be sure to use your own.
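If you do opt for plain HTTP, only the hosts entry changes. A sketch of that variant of the output section (same placeholder token as above):

```yaml
output:
  elasticsearch:
    # Plain HTTP on port 80 -- use only if SSL is not an option.
    hosts: ["http://logsene-receiver.sematext.com:80"]
    index: "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"
```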
The third section tells Filebeat to log what it is doing and to save that information to a file.
Running Filebeat With Your Configuration
You can now run Filebeat and use your configuration. To do that, run the following command:
$ ./filebeat -c logsene.yml
This tells Filebeat to use the configuration file you’ve created and send ElasticSearch log files to Logsene.
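Because Logsene also exposes the ElasticSearch search API, you can check that the logs arrived without leaving the terminal. This is a hedged sketch assuming the app token doubles as the index name for searches, just as it does for indexing; the token is a placeholder and the live call is commented out:

```python
import json
import urllib.request

TOKEN = "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"  # placeholder -- use your own

# An illustrative match_all query against the app's "index".
query = {"query": {"match_all": {}}, "size": 5}
req = urllib.request.Request(
    "https://logsene-receiver.sematext.com/%s/_search" % TOKEN,
    data=json.dumps(query).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# with urllib.request.urlopen(req) as resp:   # uncomment with a real token
#     print(json.load(resp)["hits"]["total"])
print(req.full_url)
```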
You can now go to your Logsene application and look at the logs you’ve sent:
Isn’t that fast and easy? ;)
Once your logs are in Logsene you can build all kinds of reports with the integrated Kibana, set up alerts based on the data in your logs, invite your teammates so everyone has access to all your logs in one place, and so on.
If you're not using Logsene yet, you can have a 30-day trial up and running in minutes; just sign up for a free account! There's no commitment and no credit card required. And drop us an email or hit us on Twitter with suggestions, questions, or comments about this post.
Published at DZone with permission of Rafał Kuć, DZone MVB. See the original article here.