SDC log4j format for filebeat agent

asked 2020-05-05 14:52:50 -0500 by Peter Delaney

We are monitoring our StreamSets pipeline applications with ELK, using a Filebeat agent to read the SDC log files. I created a simple pipeline to test how Filebeat handles reading SDC logs, and these are the errors Filebeat is producing:

2020-05-05T15:46:13-04:00 ERR Error decoding JSON: json: cannot unmarshal number into Go value of type map[string]interface {}

Does anyone know whether SDC logs work out of the box with Filebeat, or do I need to alter the sdc-log4j.properties file to get the correct output? I am new to ELK/Filebeat; do I need to change what it is interpreting?
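For reference, the input I am testing with is essentially this minimal log configuration (the path is a placeholder for my environment):

filebeat.inputs:
- type: log
  paths:
    - /path/to/sdc/log/sdc.log   # placeholder; wherever your SDC writes sdc.log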


1 Answer


answered 2020-05-08 23:07:44 -0500 by wilson shamim

I was able to get log data into the ELK stack using filebeat --> elastic, as well as filebeat --> logstash --> elastic, without any issue. I am using Elastic version 7.6.
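For the filebeat --> elastic path, a minimal filebeat.yml along these lines worked for me. The SDC log path and the multiline settings for stack traces are assumptions; adapt them to your install:

filebeat.inputs:
- type: log
  paths:
    - /opt/streamsets-datacollector/log/sdc.log   # adjust to your SDC log location
  # SDC log4j events begin with a timestamp; join continuation lines
  # (e.g. stack traces) onto the preceding event.
  multiline.pattern: '^\d{4}-\d{2}-\d{2}'
  multiline.negate: true
  multiline.match: after

output.elasticsearch:
  hosts: ["localhost:9200"]   # your Elasticsearch host

For the filebeat --> logstash --> elastic path, replace the output section with:

output.logstash:
  hosts: ["localhost:5044"]   # your Logstash host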

Additionally, I was able to split the log message into its respective fields using the pattern below in Logstash:

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\s+%{NOTSPACE:username}\s+%{NOTSPACE:pipelinename}*" }
  }
}
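For completeness, here is a minimal sketch of the whole Logstash pipeline around that filter; the beats port and Elasticsearch host are assumptions:

input {
  beats {
    port => 5044   # matches Filebeat's output.logstash setting
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\s+%{NOTSPACE:username}\s+%{NOTSPACE:pipelinename}*" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]   # your Elasticsearch host
  }
}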

Sample output in Kibana:

pipelinename: [pipeline:Send_KAFKA/SendKAFKAa23341da-9cf3-4e2e-b155-6dab74c2c1f0]
timestamp: 2020-05-09 03:53:53,971
username: [user:*user1]
