My Kafka message value is a raw byte array, not text or JSON.
How can I get Logstash to convert this binary data to a hex string after consuming it from Kafka?
My input config:
input {
  beats {
    port => 5044
  }
  tcp {
    port => 50000
  }
  kafka {
    bootstrap_servers => "192.168.1.5:9092"
    group_id => "logstash"
    topics => ["xx","xxx"]
    decorate_events => true
    id => "log"
    codec => plain {
      charset => "BINARY"
      ecs_compatibility => "disabled"
    }
  }
}

filter {
  mutate {
    copy => { "[@metadata][kafka]" => "kafka" }
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "logstash_internal"
    password => "${LOGSTASH_INTERNAL_PASSWORD}"
  }
}
But with this config the message ends up as a UTF-8 encoded string.
I want to view the message as a hex string in Kibana.
What do I need to change to achieve this?
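One approach I have been considering (a sketch, not verified against this setup) is to hex-encode the raw bytes with Logstash's ruby filter plugin, using Ruby's `String#unpack1("H*")`. The target field name `message_hex` here is my own choice, not anything from the config above:

filter {
  ruby {
    code => '
      raw = event.get("message")
      # unpack1("H*") turns each byte into two hex digits, e.g. "\x01\xAB" => "01ab"
      event.set("message_hex", raw.unpack1("H*")) unless raw.nil?
    '
  }
}

Would something along these lines work, or does the plain codec with `charset => "BINARY"` already mangle the bytes before the filter stage runs?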