logstash grok TIMESTAMP_ISO8601 type?
You can capture it with grok, but you need the date filter to convert the captured string into a date type.
filter {
  date {
    match => [ "timestamp", "ISO8601" ]
  }
}
The usual usage is to set the @timestamp field this way. But if you want to apply it to another field and leave @timestamp alone:
filter {
  date {
    match => [ "timestamp", "ISO8601" ]
    target => "target_timestamp"
  }
}
This will give you a field named target_timestamp with the Elasticsearch date data type you're looking for.
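Putting the two pieces together with the grok pattern from the question, a complete filter block might look like this (a sketch, assuming the field names from the question's pattern; adjust to your own log format):

    filter {
      grok {
        # Capture the leading ISO8601 timestamp into a field named "timestamp"
        match => { "message" => "^%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:name} %{WORD:level} %{SPACE} %{GREEDYDATA:message}$" }
        overwrite => [ "message" ]
      }
      date {
        # Parse the captured string; with no target, this sets @timestamp
        match => [ "timestamp", "ISO8601" ]
      }
    }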
clay
Updated on June 13, 2022

Comments
-
clay almost 2 years
I have a simple logstash grok filter:
filter {
  grok {
    match => { "message" => "^%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:name} %{WORD:level} %{SPACE} %{GREEDYDATA:message}$" }
    overwrite => [ "message" ]
  }
}
This works and parses my logs, but according to Kibana, the timestamp values are output with data type string. The logstash @timestamp field has data type date. The grok documentation says you can specify a data type conversion, but only int and float are supported:
If you wish to convert a semantic’s data type, for example change a string to an integer then suffix it with the target data type. For example %{NUMBER:num:int} which converts the num semantic from a string to an integer. Currently the only supported conversions are int and float.
That suggests I'm supposed to leave it as a string. However, if the index supports datetime values, why would you not want it properly stored and sortable as a datetime?
-
clay about 7 years: Thanks! Can you show me the usual usage with setting @timestamp and not creating a new field? That's what I want.
-
sysadmin1138 about 7 years: @clay That's my first example. The target => "@timestamp" is implicit.
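In other words, the first example relies on the date filter's default target. Written out explicitly it would be (a sketch of the same behavior, assuming the default has not been changed elsewhere in your pipeline):

    filter {
      date {
        match => [ "timestamp", "ISO8601" ]
        # This is the default, so the line below is redundant:
        target => "@timestamp"
      }
    }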