I saw a few other questions of this type, but none of them solved my issue.
I am trying to tail Spring Boot logs from files, parse out the useful fields, send the result to Elasticsearch, and ultimately read it in Kibana. My fluentd.conf looks like this:
<source>
  type tail
  read_from_head true
  path /path/to/log/
  pos_file /path/to/pos_file
  format /^(?<date>[0-9]+-[0-9]+-[0-9]+\s+[0-9]+:[0-9]+:[0-9]+.[0-9]+)\s+(?<log_level>[Aa]lert|ALERT|[Tt]race|TRACE|[Dd]ebug|DEBUG|[Nn]otice|NOTICE|[Ii]nfo|INFO|[Ww]arn?(?:ing)?|WARN?(?:ING)?|[Ee]rr?(?:or)?|ERR?(?:OR)?|[Cc]rit?(?:ical)?|CRIT?(?:ICAL)?|[Ff]atal|FATAL|[Ss]evere|SEVERE|EMERG(?:ENCY)?|[Ee]merg(?:ency)?)\s+(?<pid>[0-9]+)\s+---\s+(?<message>.*)$/
  tag my.app
</source>
<match my.app>
  type stdout
</match>
<match my.app>
  type elasticsearch
  logstash_format true
  host myhosthere
  port 9200
  index_name fluentd-app
  type_name fluentd
</match>
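For reference, fluentd routes each event to the first `<match>` whose pattern fits the tag, so two separate `<match my.app>` blocks do not both receive events; the usual way to fan one tag out to stdout and Elasticsearch together is the built-in copy output plugin. A sketch of the equivalent combined block (same parameters as above) would be:

```
<match my.app>
  type copy
  <store>
    type stdout
  </store>
  <store>
    type elasticsearch
    logstash_format true
    host myhosthere
    port 9200
    index_name fluentd-app
    type_name fluentd
  </store>
</match>
```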
Given a typical Spring Boot log line:
2015-07-16 19:20:04.074 INFO 16649 --- [ main] {springboot message}
By also writing to stdout as a test, I can see that my parser produces:
{
  "date": "2015-07-16 19:20:04.074",
  "log_level": "INFO",
  "pid": "16649",
  "message": "[ main] {springboot message}"
}
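The same parse can be reproduced outside fluentd with a quick Python check. Note that Python names groups with `(?P<name>...)` rather than `(?<name>...)`, and the long log-level alternation is abbreviated here to the uppercase levels for readability:

```python
import re

# Simplified version of the fluentd format regex above;
# the log_level alternation is shortened to the common uppercase forms.
pattern = re.compile(
    r"^(?P<date>\d+-\d+-\d+\s+\d+:\d+:\d+\.\d+)\s+"
    r"(?P<log_level>TRACE|DEBUG|INFO|WARN|ERROR|FATAL)\s+"
    r"(?P<pid>\d+)\s+---\s+(?P<message>.*)$"
)

line = "2015-07-16 19:20:04.074  INFO 16649 --- [           main] {springboot message}"
m = pattern.match(line)
# log_level -> "INFO", pid -> "16649"
print(m.groupdict())
```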
However, when this gets written to Elasticsearch, all that results is:
{
  _index: "fluentd-app-2015.07.16",
  _type: "fluentd",
  _id: "AU6YT5sjvkxiJXWCxeM8",
  _score: 1,
  _source: {
    message: "2015-07-16 19:20:04.074 INFO 16649 --- [ main] {springboot message}",
    @timestamp: "2015-07-16T19:20:04+00:00"
  }
},
From what I had read about fluent-plugin-elasticsearch, I expected _source to contain all of the parsed fields that I see on stdout. I have also tried the grok parser, but the issue seems to lie in my understanding of the fluentd Elasticsearch plugin. How do I get the fields I parsed to persist to Elasticsearch?