Sunday, April 28, 2019

Logstash - AWS S3 Bucket As Data Input

You can use the "S3 input plugin" to stream events from files in an AWS S3 bucket. Each line in each file generates an event. Files ending in ".gz" are handled as gzipped files, and files in the Glacier storage class are skipped.

Logstash version: 6.7.1
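
The S3 input plugin ships with the default Logstash distribution, so no separate install should be needed. If you want to confirm it is present (and which version you have), you can filter the plugin list, as in this sketch:

bin/logstash-plugin list --verbose s3

This should print a line for "logstash-input-s3" along with its version number.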

Here is a basic configuration for streaming data:

input {
  s3 {
    # AWS credentials (placeholder values; in production, prefer an AWS
    # credentials file or environment variables over hardcoding)
    access_key_id => "1234"
    secret_access_key => "secret"
    # Bucket and region to poll for new objects
    bucket => "logstash-test-aws-s3-bucket"
    region => "us-east-1"
    # Extra options passed through to the AWS SDK S3 client
    additional_settings => {
      "force_path_style" => true
      "follow_redirects" => false
    }
    # Only object keys starting with this prefix are read
    prefix => "logstash-"
    # Tag events so they can be filtered downstream
    type => "s3"
  }
}

output {
  elasticsearch {
    # CA certificate used to verify the cluster's TLS certificate
    cacert => "/path/to/cert"
    hosts => ["https://elasticsearch1.com:9243"]
    # Write to a monthly index, e.g. "test-index-2019.04"
    index => "test-index-%{+YYYY.MM}"
    user => "logstash"
    password => "logstash"
  }
}


Files in the "logstash-test-aws-s3-bucket" AWS S3 bucket whose keys start with "logstash-" will match. Since the prefix is compared against the full object key, this also covers folders whose names start with "logstash-".
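
To see which objects would match, you can list the bucket with the AWS CLI. The key names in the comments below are hypothetical examples, just to illustrate how the prefix is applied:

aws s3 ls s3://logstash-test-aws-s3-bucket/ --recursive

# Example keys and whether the "logstash-" prefix matches:
#   logstash-app.log              -> matched
#   logstash-archive/app.log.gz   -> matched (folder name starts with the prefix)
#   app/logstash-app.log          -> not matched (prefix is checked against the full key)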

Start Logstash with "logstash -f config/s3-input-logstash.conf", and you should start seeing data coming into your Elasticsearch cluster.
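
To verify that events are actually being indexed, you can check the document count for the monthly indices. This is a minimal sketch that reuses the host, credentials, and index pattern from the output section above:

curl --cacert /path/to/cert -u logstash:logstash \
  "https://elasticsearch1.com:9243/test-index-*/_count?pretty"

A growing "count" value means the S3 files are being read and shipped.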
