Date field is being presented as integer


Solution 1

I have figured it out: What you need to do is use a filter plugin in logstash, specifically the date plugin.

Here is the snippet that I added to my logstash config:

filter {
  date {
    match => [ "dateAdded", "UNIX_MS" ]
    target => "dateAddedCorrected"
  }
}
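
The parsed value ends up in the new dateAddedCorrected field, which comes through as a date, while @timestamp is left alone. If you instead wanted dateAdded itself to drive the event timestamp (see the comments below), the same filter without a target should do it, since the date plugin writes to @timestamp by default:

filter {
  date {
    # no target set: the parsed date overwrites @timestamp
    match => [ "dateAdded", "UNIX_MS" ]
  }
}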

Solution 2

If I read the Elasticsearch documentation correctly (https://www.elastic.co/guide/en/elasticsearch/reference/current/date.html):

JSON doesn’t have a date datatype, so dates in Elasticsearch can either be:

strings containing formatted dates, e.g. "2015-01-01" or "2015/01/01 12:10:30".
a long number representing milliseconds-since-the-epoch.
an integer representing seconds-since-the-epoch. 

So your dateAdded field being represented as a "number" data type is logical: Elasticsearch simply translated a JSON number into an ES number.
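
To get a "date" there instead, the mapping for that field has to say so explicitly. A minimal sketch of what the field mapping would need to contain, assuming the dateAdded values are milliseconds-since-the-epoch as described in the question (epoch_millis is the matching built-in format):

"dateAdded": {
  "type": "date",
  "format": "epoch_millis"
}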

Looking at my own ELK instance, I found that the "timestamp" field is represented as a "date" data type. That is done automatically by Logstash.

Behind the scenes, Logstash manages a 'mapping template' that defines the ES field data types. In your case it naively carries over the JSON number type for dateAdded, whereas for the timestamp it knows it is a date and explicitly defines it as one.

So what you need to do is define a mapping template and use logstash to push it to ES with your data.

The ES mapping documentation is at https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping.html, and Logstash can manage the template through the manage_template and template options of the elasticsearch output (https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-template). An introduction to ES mapping: https://www.elastic.co/blog/found-elasticsearch-mapping-introduction.
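
As a rough sketch of how those pieces could fit together: a template file on disk plus an elasticsearch output pointing at it. The file path, template name, and index pattern below are made up for illustration, and the _default_ mapping type only applies to the older Elasticsearch versions current when this question was asked, so adjust the JSON to your ES version.

{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "dateAdded": {
          "type": "date",
          "format": "epoch_millis"
        }
      }
    }
  }
}

output {
  elasticsearch {
    hosts              => ["localhost:9200"]
    manage_template    => true
    # hypothetical path to the template file shown above
    template           => "/etc/logstash/bullhorn-template.json"
    template_name      => "bullhorn"
    template_overwrite => true
  }
}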

You can also look at the mapping actually in use with:

curl -XGET 'localhost:9200/<index>/_mapping?pretty'
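
Before a template is applied, the relevant part of the response should look something like this (index name, document type, and the other fields are trimmed and invented here; the point is the "long" type that dateAdded gets by default):

{
  "logstash-2017.02.01" : {
    "mappings" : {
      "logs" : {
        "properties" : {
          "dateAdded" : {
            "type" : "long"
          }
        }
      }
    }
  }
}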

Comments

  • Elliot Huffman
    Elliot Huffman over 1 year

    I have an Elastic Stack server (on Hyper-v) that is ingesting data via a logstash exec command and performing analytics on it. Everything is working great except a date field that is being displayed as a number.

    How do I get Logstash, Elasticsearch or Kibana to recognize the field as a date instead of a number?

    The data is Unix epoch time in milliseconds.


    Code:

    The data output by the Python file is in JSON format. No real processing takes place until it hits Elasticsearch.

    Logstash config:

    input {
      exec {
        command => "/home/elliot/BullhornConnector.py JobOrder isOpen,webResponses,submissions,sendouts,interviews,placements,address,numOpenings,employmentType,owner,title,clientCorporation"
        interval => 60
        codec => json
        tags => ["JobOrder"]
      }
      exec {
        command => "/home/elliot/BullhornConnector.py Lead owner,leadSource,firstName,lastName,status,dateAdded"
        interval => 60
        codec => json
        tags => ["Lead"]
      }
      exec {
        command => "/home/elliot/BullhornConnector.py Opportunity owner,isOpen,dealValue,weightedDealValue,clientCorporation,status"
        interval => 60
        codec => json
        tags => ["Opportunity"]
      }
    }
    
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
      }
      stdout { codec => rubydebug }
    }
    

    Screenshots:

    Here is a screenshot of the raw data: Raw data display

    Index pattern overview page: Data overview on index patterns page

    Detailed view of the field: Setting does not allow me to change it.

    Thanks!

  • Elliot Huffman
    Elliot Huffman almost 7 years
    Unfortunately I do not know how to do that. Thanks for your input :-)
  • Elliot Huffman
    Elliot Huffman almost 7 years
    ok, will try. Thanks for the info. Can you provide an example of a template?
  • Elliot Huffman
    Elliot Huffman almost 7 years
  • Elliot Huffman
    Elliot Huffman almost 7 years
    Thank you for pointing me in the right direction. It did not work for me. You can see what worked for me in my answer. Thank you again for your guidance!
  • daks
    daks almost 7 years
    In fact, this is the way to go if you want dateAdded to be the logstash event timestamp. If it's just a date field but not your timestamp, you must use a mapping template.
  • Elliot Huffman
    Elliot Huffman almost 7 years
    It turns out that this filter plugin can output to a field other than the timestamp; see the target option? That outputs to a different field that ends up with the date type.