Sunday, April 28, 2019

Logstash - Send Data To Multiple Elastic Clusters

It is possible to send the same data from one Logstash server to multiple Elasticsearch clusters. It is easy to configure: you just need multiple output definitions for the same events. However, it is not easy to find a clear example on the internet, so I've decided to write this blog post.

My example Logstash configuration file:

input {
  file {
    path => ["/path/to/json/file"]
    start_position => "beginning"
    # a /dev/null sincedb means the file is re-read from the beginning on every run (handy for testing)
    sincedb_path => "/dev/null"
    exclude => "*.gz"
  }
}

filter {
  mutate {
    # strip embedded newlines so each event is a single line of JSON
    gsub => [ "message", "\n", "" ]
  }
  # only parse events that look like a JSON object
  if [message] =~ /^{.*}$/ {
    json { source => "message" }
  }
}

output {
  # each event is sent to every output block defined below
  elasticsearch {
    cacert => "/path/to/cert"
    hosts => "https://elasticsearch1.com:9243"
    index => "test-index-%{+YYYY.MM}"
    user => "logstash"
    password => "logstash"
  }

  elasticsearch {
    cacert => "/path/to/cert"
    hosts => "https://elasticsearch2.com:9243"
    index => "test-index-%{+YYYY.MM}"
    user => "logstash"
    password => "logstash"
  }
}
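
To try this out, you can check the configuration syntax and then run Logstash in the foreground against the file. The configuration path below is just an example; adjust it to wherever you saved the file:

bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/multi-output.conf
bin/logstash -f /etc/logstash/conf.d/multi-output.conf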

Example JSON file:

{"foo":"bar", "bar": "foo"}
{"hello":"world", "goodnight": "moon"}

Note that each JSON document in the file needs to be on a single line (one object per line).
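
If your source file is pretty-printed JSON, a quick way to flatten it is jq's compact output mode (assuming jq is installed; the file names here are just placeholders):

jq -c . pretty.json > /path/to/json/file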

This setup is all or nothing: if one of the outputs is down, the other stops receiving data as well, because a failing output blocks the whole pipeline. You should also ask whether you really need this setup in the first place. You now have two copies of the data that you need to keep in sync; sending the output to a single cluster and relying on role-based access control may be a better option.
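
If you do keep both outputs, one way to check that the two copies stay in sync is to compare document counts with the _count API on each cluster (credentials and certificate path as in the example configuration above):

curl -u logstash:logstash --cacert /path/to/cert "https://elasticsearch1.com:9243/test-index-*/_count?pretty"
curl -u logstash:logstash --cacert /path/to/cert "https://elasticsearch2.com:9243/test-index-*/_count?pretty"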
