
Logstash - Quirky 'multiline'


Ever since I decided to use the ELK stack to parse application logs (for example, a Java stack trace), I have faced many hurdles in making it work. After multiple iterations and explorations, I believe I have found the proper way to use the ‘multiline’ feature of Logstash. This article is not a claim of original invention; it is an attempt to document a widely used and frequently queried feature of the Logstash tool.
My initial experience with ‘multiline’ led me to stop trying to make it work. Instead, I ended up treating each line of input separately, an experience that I described in the article titled ‘Using Multiple Grok Statements to Parse a Java Stack Trace’ (http://java.dzone.com/articles/using-multiple-grok-statements).

A couple of days ago, I was forced to reconsider my view of ‘multiline’ and revisit the feature. An extract of the input that I used is given below.

ERROR, 2015-04-09 06:08:42, Type: Application, Error message: Failed to execute the command 'UpdateCommand' for table 'CS_Item'; the transaction was rolled back. Ensure that the command syntax is correct.

ERROR OCCURRED FOR: at SynchronizeToDCS() method in using block.. BusinessUnit = 223417 and Source =

ERROR DETAILS STACK:    at Microsoft.Synchronization.Data.ChangeHandlerBase.CheckZombieTransaction(String commandName, String table, Exception ex)
   at Microsoft.Synchronization.Data.SqlServer.SqlChangeHandler.ExecuteCommand(IDbCommand cmd, DataTable applyTable, DataTable failedRows)
   at Microsoft.Synchronization.KnowledgeSyncOrchestrator.Synchronize()
   at Microsoft.Synchronization.SyncOrchestrator.Synchronize()
   at MyStoreSyncPullService.SynchronizeToDCS(DataRow dr, String SessionId)

ERROR SOURCE: Microsoft.Synchronization
The Problem: Using multiline in the file block

As I had tried earlier, and as suggested by multiple forums on the Internet, I attempted to parse the data using a ‘multiline’ codec in the ‘file’ block, placed in the ‘input’ section of the script. The script worked, though only after some hiccups caused by the way Logstash handles files on Windows. The ‘file’ block is shown below.

file {
  path => "application-log.txt"
  type => "application_log"
  start_position => "beginning"
  codec => multiline {
    pattern => "%{GREEDYDATA:appLogLevel}, %{TIMESTAMP_ISO8601:appTimestamp}, %{GREEDYDATA:appLogDetails}"
    negate => true
    what => "previous"
  }
}

This script block successfully parses data from the said input file. But if the same multiline codec is placed inside a ‘tcp’ block, we get unpredictable results. This is because the ‘tcp’ input does not respect line breaks; it splits incoming data at arbitrary positions (apparently determined by buffer size). As a result, the input gets broken at places that cannot be predicted, which defeats the multiline logic.
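For illustration, the problematic placement looks like this. This is a sketch, not the configuration from my setup; the port number is an assumption, and the pattern is abbreviated.

```
input {
  tcp {
    port => 5000   # assumed port
    type => "application_log"
    # Unreliable: the TCP input delivers data in arbitrary chunks,
    # so this pattern may be tested against partial lines.
    codec => multiline {
      pattern => "%{GREEDYDATA}, %{TIMESTAMP_ISO8601}, %{GREEDYDATA}"
      negate => true
      what => "previous"
    }
  }
}
```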

The Solution: Using multiline in the ‘filter’ block

The solution to overcome this problem is to move the multiline feature from the ‘input’ block into the ‘filter’ block. The advantage of placing the ‘multiline’ block in the ‘filter’ section of the script is that it works for file input as well as TCP/IP input. While making this change, it is important to note that ‘multiline’ is a ‘codec’ in the ‘file’ block of the ‘input’ section, while it is a block by itself in the ‘filter’ section.
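Structurally, the change looks like this. The sketch below only shows the shape of the two placements; the elided settings (marked ‘...’) and the TCP port are assumptions.

```
# Before: multiline as a codec inside the 'file' input
input {
  file {
    ...
    codec => multiline { ... }
  }
}

# After: multiline as a block of its own in the 'filter' section,
# which works for file input as well as TCP/IP input
input {
  file { ... }
  tcp { port => 5000 }   # assumed port
}
filter {
  multiline { ... }
  grok { ... }
}
```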

Parsing the data

Once multiple lines of data have been merged by the ‘multiline’ block, we need to follow it up with a suitable ‘grok’ block, which will allow us to parse the data and split it into the relevant fields for further processing. An extract from a script that illustrates this concept is given below.

multiline {
    patterns_dir => "./patterns"
    pattern => "%{CUSTOM_ERROR_LABEL_2}%{SPACE}%{TIMESTAMP_ISO8601:logTimestamp}%{SPACE}%{GREEDYDATA:logDetails}"
    negate => true
    what => "previous"
}

grok {
    patterns_dir => "./patterns"
    match => [ "message", "%{CUSTOM_ERROR_LABEL_2}%{SPACE}%{TIMESTAMP_ISO8601:logTimestamp}%{SPACE}%{GREEDYDATA:logDetails}" ]
    add_field => { "subType" => "error" }
}
In Summary

By changing the placement of the ‘multiline’ block and using the proper patterns, I was able to parse application log input that spans multiple physical lines and finally crack this puzzle. But one problem still remains: how do we ensure that the blocks that follow are not inadvertently concatenated with the current block?
