I am new to ELK. My Logstash 7.8.x pipeline loops indefinitely and inserts only one row into Elasticsearch!
Here is my minimal example:
Sample log:
2021-07-18 09:15:30,000 INFO Sample log message 01
2022-07-18 10:20:45,111 ERROR Sample log message 02
2023-07-18 11:20:45,222 DEBUG Sample log message 03
2024-07-18 12:20:45,333 WARN Sample log message 04
Conf file:
input {
  file {
    path => "/home/sample.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    close_older => "1 second"
  }
}
filter {
  grok {
    match => { "resource" => "%{TIMESTAMP_ISO8601:log_timestamp} %{LOGLEVEL:log_level} %{GREEDYDATA:log_message}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sample_logs"
    document_id => "%{my_doc_for_logs}"
    document_type => "_doc"
    codec => "json"
  }
  stdout { codec => rubydebug }
}
Running Logstash as follows:
bin/logstash -f /etc/logstash/conf.d/sshd.conf --log.level debug
1- Only one line is inserted into the Elasticsearch index sample_logs.
2- I have to press Ctrl+C to stop Logstash's endless loop; see the image below:
(debug output screenshot)
Checking the Elasticsearch data: only 1 row instead of 4 is inserted!
curl -XGET "http://localhost:9200/sample_logs/_search?size=1000" | jq .
(Elasticsearch query result screenshot)
The problem is the document_id:
document_id => "%{my_doc_for_logs}"
Your events have no field named my_doc_for_logs, so Logstash cannot resolve the sprintf reference and uses the literal string %{my_doc_for_logs} as the ID. Elasticsearch therefore indexes all the documents with one and the same _id, so each new event overwrites the previous one, which is why only a single row ever shows up.
Try deleting the document_id option; Elasticsearch will then index with auto-generated IDs.
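For example, a minimal sketch of your output block with document_id removed (document_type is deprecated in 7.x and the codec setting is not needed here, since the elasticsearch output serializes events itself, so both are dropped as well):
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sample_logs"
  }
  stdout { codec => rubydebug }
}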
If you want to assign a value yourself, you can build the ID from event fields using a pattern like this. Example:
mutate { add_field => { "[@metadata][id]" => "%{[host][name]}_%{some_field}" } }
document_id => "%{[@metadata][id]}"
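To be explicit about placement, the mutate belongs in the filter block and the document_id in the elasticsearch output. A sketch (some_field is a placeholder for any field that uniquely identifies a line, e.g. the log_timestamp extracted by the grok):
filter {
  # ... your existing grok ...
  mutate { add_field => { "[@metadata][id]" => "%{[host][name]}_%{some_field}" } }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sample_logs"
    document_id => "%{[@metadata][id]}"
  }
}
Fields under [@metadata] are never sent to Elasticsearch, so the helper field does not end up in your documents. Note that the same literal-ID trap applies here: if a referenced field is missing on the event, the sprintf reference stays literal and all documents collide on one ID again, so make sure the fields actually exist. After re-running, the curl from the question should return four hits.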