Convert timestamp timezone in Logstash for output index name


Solution 1

Here is an optimized config; please give it a try and test the performance.

You don't need the mutate and date plugins; use the ruby plugin directly.

input {
    stdin {
    }
}

filter {
    ruby {
        code => "
            event['index_day'] = event['@timestamp'].localtime.strftime('%Y.%m.%d')
        "
    }
}

output {
    stdout { codec => rubydebug }
}

Example output:

{
       "message" => "test",
      "@version" => "1",
    "@timestamp" => "2015-03-30T05:27:06.310Z",
          "host" => "BEN_LIM",
     "index_day" => "2015.03.29"
}
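
To use the field for the index name, reference it in the output the same way Solution 3 does below; a minimal sketch for the 1.x-era elasticsearch output (the host value and index prefix are placeholders):

output {
    elasticsearch {
        host => "localhost"
        index => "syslog-%{index_day}"
    }
}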

Solution 2

In Logstash version 5.0 and later, you can use this:

filter {
    ruby {
        code => "event.set('index_day', event.get('[@timestamp]').time.localtime.strftime('%Y%m%d'))"
    }
}
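
For a complete picture, here is a minimal end-to-end sketch of the same approach for Logstash 5.x and later; the hosts value and index prefix are placeholder assumptions:

input {
    stdin {
    }
}

filter {
    ruby {
        # 5.x removed direct hash access on events; use the get/set API
        # and unwrap the LogStash::Timestamp with .time before localtime
        code => "event.set('index_day', event.get('@timestamp').time.localtime.strftime('%Y.%m.%d'))"
    }
}

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "syslog-%{index_day}"
    }
}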

Solution 3

In version 1.5.0, you can convert the timestamp to the local timezone for the index name. Here is my configuration:

filter {
    ruby {
        code => "event['index_day'] = event.timestamp.time.localtime.strftime('%Y.%m.%d')"
    }
}
output {
    elasticsearch {
        host => "localhost"
        index => "thrall-%{index_day}"
    }
}

Comments

  • Davmrtl, almost 2 years ago:

    In my scenario, the "timestamp" of the syslog lines Logstash receives is in UTC, and we use the event's "timestamp" in the Elasticsearch output:

    output {
        elasticsearch {
            embedded => false
            host => "localhost"
            port => 9200
            protocol => "http"
            cluster => 'elasticsearch'
            index => "syslog-%{+YYYY.MM.dd}"
        }
    }
    

    My problem is that at UTC midnight, Logstash starts sending logs to the next day's index before the day has ended in our timezone (GMT-4, America/Montreal), so the current index receives no logs after 20:00 (8 PM) because the "timestamp" is in UTC.

    We've built a workaround to convert the timezone, but it causes significant performance degradation:

    filter {
        mutate {
            add_field => {
                # Create a new field with string value of the UTC event date
                "timestamp_zoned" => "%{@timestamp}"
            }
        }
    
        date {
            # Parse UTC string value and convert it to my timezone into a new field
            match => [ "timestamp_zoned", "yyyy-MM-dd HH:mm:ss Z" ]
            timezone => "America/Montreal"
            locale => "en"
            remove_field => [ "timestamp_zoned" ]
            target => "timestamp_zoned_obj"
        }
    
        ruby {
            # Output the zoned date to a new field
            code => "event['index_day'] = event['timestamp_zoned_obj'].strftime('%Y.%m.%d')"
            remove_field => [ "timestamp_zoned_obj" ]
        }
    }
    
    output {
        elasticsearch {
            embedded => false
            host => "localhost"
            port => 9200
            protocol => "http"
            cluster => 'elasticsearch'
            # Use of the string value
            index => "syslog-%{index_day}"
        }
    }
    

    Is there a way to optimize this config?

  • Davmrtl, about 9 years ago:
    Ohh thanks! I didn't know that localtime could be called without any parameters.
  • Xin Chen, over 8 years ago:
    Why am I getting the error: Ruby exception occurred: undefined method `localtime'?
  • Admin, over 7 years ago:
    filter { ruby { code => "event['index_day'] = event.timestamp.time.localtime.strftime('%Y.%m.%d')" } }
  • Aaron, almost 6 years ago:
    Update for more modern versions of Logstash: event.set('localdate', event.get('@timestamp').time.localtime.strftime('%Y-%m-%dT%H:%M:%S.%3N%z'))
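
As the comments above illustrate, Ruby's Time#localtime converts a time to the system's local timezone when called without arguments, and since Ruby 1.9 it also accepts an explicit UTC offset. A plain-Ruby sketch (the offset is an example for America/Montreal in summer, EDT):

utc = Time.utc(2015, 3, 30, 2, 0, 0)                # 02:00 UTC, just past UTC midnight
puts utc.strftime('%Y.%m.%d')                       # => 2015.03.30 (UTC day)

# Explicit offset argument (Ruby >= 1.9):
puts utc.localtime('-04:00').strftime('%Y.%m.%d')   # => 2015.03.29 (local day)

# No argument: uses the system's local timezone
puts Time.now.utc.localtime.strftime('%Y.%m.%d')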