logstash - How to extract fields from the Elasticsearch _source so they can be indexed -
I am using Logstash, Elasticsearch, and Kibana to collect logs. The log file contains JSON documents like this:
{"_id":{"$oid":"5540afc2cec7c68fc1248d78"},"agentid":"0000000bab39a520","handler":"susicontrol","sensorid":"/gpio/gpio00/level","ts":{"$date":"2015-04-29T09:00:00.846Z"},"vhour":1}
{"_id":{"$oid":"5540afc2cec7c68fc1248d79"},"agentid":"0000000bab39a520","handler":"susicontrol","sensorid":"/gpio/gpio00/dir","ts":{"$date":"2015-04-29T09:00:00.846Z"},"vhour":0}
And this is the configuration I use in Logstash:
input {
  file {
    type => "log"
    path => ["/home/data/1/1.json"]
    start_position => "beginning"
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    embedded => true
  }
  stdout {
    codec => rubydebug
  }
}
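As a side note, with this configuration @timestamp records the ingestion time rather than the event time. If the event's own timestamp in ts.$date should be used instead, a date filter can be added to the filter section. A minimal sketch, assuming the nested field is reachable with Logstash's [field][subfield] syntax and the value is ISO8601-formatted:

```
filter {
  json {
    source => "message"
  }
  # Set @timestamp from the event's own ts.$date value
  # instead of the ingestion time (assumes ISO8601 format).
  date {
    match  => ["[ts][$date]", "ISO8601"]
    target => "@timestamp"
  }
}
```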
Then the output in Elasticsearch is:
{
  "_index": "logstash-2015.06.29",
  "_type": "log",
  "_id": "au5ag7kahwya2bfnpjo0",
  "_version": 1,
  "_score": 1,
  "_source": {
    "message": "{\"_id\":{\"$oid\":\"5540afc2cec7c68fc1248d7c\"},\"agentid\":\"0000000bab39a520\",\"handler\":\"susicontrol\",\"sensorid\":\"/gpio/gpio05/dir\",\"ts\":{\"$date\":\"2015-04-29T09:00:00.846Z\"},\"vhour\":1}",
    "@version": "1",
    "@timestamp": "2015-06-29T16:17:03.040Z",
    "type": "log",
    "host": "song-lenovo-ideapad",
    "path": "/home/song/soft/data/1/average.json",
    "_id": {
      "$oid": "5540afc2cec7c68fc1248d7c"
    },
    "agentid": "0000000bab39a520",
    "handler": "susicontrol",
    "sensorid": "/gpio/gpio05/dir",
    "ts": {
      "$date": "2015-04-29T09:00:00.846Z"
    },
    "vhour": 1
  }
}
But the fields parsed out of the JSON file end up in _source as object fields, which are not indexed for analysis, so I can't analyze them in Kibana. For these _source object fields, Kibana shows "Analysis is not available for object fields."
How can I solve this problem?
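One approach that may help: Kibana cannot aggregate on object-typed fields such as _id and ts, so the nested values can be promoted to top-level fields before indexing. A sketch of the filter section using the mutate filter's rename and remove_field options, assuming the [field][subfield] reference syntax; the target field names mongo_id and ts_date are made up for illustration:

```
filter {
  json {
    source => "message"
  }
  mutate {
    # Promote the nested values to top-level, analyzable fields.
    rename => {
      "[_id][$oid]" => "mongo_id"
      "[ts][$date]" => "ts_date"
    }
    # Drop the now-empty parent objects so no object fields remain.
    remove_field => ["_id", "ts"]
  }
}
```

With the nested objects flattened this way, each value is indexed as an ordinary leaf field, which Kibana can then use for visualizations and aggregations.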