Logstash: update a document in Elasticsearch
The Elasticsearch output performs an index action by default; to have it do anything else, you need to set the action option explicitly.
elasticsearch {
  hosts       => ["localhost:9200"]
  index       => "logstash-data-monitor"
  action      => "update"
  document_id => "%{GEOREFID}"
}
This should probably be wrapped in a conditional to ensure you only update records that need updating. There is another option, though: doc_as_upsert.
elasticsearch {
  hosts         => ["localhost:9200"]
  index         => "logstash-data-monitor"
  action        => "update"
  doc_as_upsert => true
  document_id   => "%{GEOREFID}"
}
This tells the plugin to insert the document if it is new, and to update it if it already exists.
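If updates should be skipped when the incoming field is empty, the update output can be wrapped in a conditional. A sketch (the field names and the "-" placeholder for empty values are taken from the question below; adjust to your own schema):

```
output {
  # Only run the update when D_COUNTRY carries a real value,
  # not the "-" placeholder this data uses for empty fields.
  if [D_COUNTRY] and [D_COUNTRY] != "-" {
    elasticsearch {
      hosts         => ["localhost:9200"]
      index         => "logstash-data-monitor"
      action        => "update"
      doc_as_upsert => true
      document_id   => "%{GEOREFID}"
    }
  }
}
```

Events that fail the test simply don't reach this output, so the existing document is left untouched.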
However, you're attempting to use two inputs to define a single document, which complicates things. You also haven't provided both inputs, so I'll improvise. To get different output behavior per source, you will need to define two outputs.
input {
  file {
    path => "/var/log/xmlhome.log"
    [other details]
  }
  file {
    path => "/var/log/jsonhome.log"
    [other details]
  }
}
filter { [some stuff] }
output {
  if [path] == '/var/log/xmlhome.log' {
    elasticsearch {
      [XML file case]
    }
  } else if [path] == '/var/log/jsonhome.log' {
    elasticsearch {
      [JSON file case]
      action => "update"
    }
  }
}
Setting it up like this lets you change the Elasticsearch behavior based on where the event originated.
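Because an update action sends every field present on the event as part of the partial document, one way to keep each source from clobbering the other source's fields is to drop the foreign fields in the filter before output. A sketch using the column names from the question below (which fields belong to which source is an assumption here; verify against your data):

```
filter {
  # Assumed: XML events should only update the D_* fields,
  # JSON events only the G_* fields (names from the question).
  if [path] == '/var/log/xmlhome.log' {
    mutate { remove_field => ["G_COUNTRY", "G_UPDATE", "G_DELETE"] }
  } else if [path] == '/var/log/jsonhome.log' {
    mutate { remove_field => ["D_COUNTRY", "D_UPDATE", "D_DELETE"] }
  }
}
```

With the foreign fields removed, the partial document sent by the update output contains only that source's fields, so the other half of the document survives the update.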
banu
Updated on June 27, 2022

Comments
-
banu, almost 2 years ago:
I'm trying to update a specific field in Elasticsearch through Logstash. Is it possible to update only a set of fields through Logstash?
Please find the code below:
input {
  file {
    path => "/**/**/logstash/bin/*.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    type => "multi"
  }
}
filter {
  csv {
    separator => "|"
    columns => ["GEOREFID", "COUNTRYNAME", "G_COUNTRY", "G_UPDATE", "G_DELETE", "D_COUNTRY", "D_UPDATE", "D_DELETE"]
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-data-monitor"
    query => "GEOREFID:%{GEOREFID}"
    fields => [["JSON_COUNTRY","G_COUNTRY"], ["XML_COUNTRY","D_COUNTRY"]]
  }
  if [G_COUNTRY] {
    mutate {
      update => { "D_COUNTRY" => "%{D_COUNTRY}" }
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-data-monitor"
    document_id => "%{GEOREFID}"
  }
}
When we use the above configuration, fields with null values get removed instead of the null-value update being skipped.
Data comes from two different sources: one an XML file and the other a JSON file.
XML log format:  GEO-1|CD|23|John|892|Canada|31-01-2017|QC|-|-|-|-|-
JSON log format: GEO-1|AS|33|-|-|-|-|-|Mike|123|US|31-01-2017|QC
When the first log is read, a new document is created in the index. When the second log file is read, the existing document should be updated. The update should touch only the first 5 fields if the log file is XML, and only the last 5 fields if it is JSON. Please suggest how to do this in Logstash.
I tried the code above. Can anyone help with how to fix this?
-
banu, about 7 years ago: I have tried the above option; the document is not getting updated, and the null-value field is removed instead of the null-value update being skipped.
elasticsearch {
  hosts         => ["localhost:9200"]
  index         => "logstash-data-monitor"
  action        => "update"
  doc_as_upsert => true
  document_id   => "%{GEOREFID}"
}
-
banu, about 7 years ago: Any suggestions on the above comment, @sysadmin1138?
-
sysadmin1138, about 7 years ago: @banu If the logic you're looking for is "IF NOT null value THEN update", then I suggest you wrap that update output in an if conditional that tests for that null value.
-
banu, about 7 years ago: Could you please provide a sample conditional for this?
-
sysadmin1138, about 7 years ago: @banu I've updated the answer with an example of a conditional.
-
banu, about 7 years ago: @sysadmin1138 Please see the updated question above for your conditional.
-
Bala venkatesh, about 4 years ago: @sysadmin1138 How does Logstash know if I update data in MongoDB? Does it automatically identify the change and update it in the ES server?