Hi,

When setting up syslog-ng -> ELK, the Logstash portion should not be needed, since syslog-ng writes directly to an ES node or remote "transport" client. My understanding is that Logstash would post-parse a log at a given interval and ship it to ES, much like logrotate does; that is not a solution for a syslog-ng real-time model.

So my ELK stack is built, but I don't seem to be able to get the connection or data format right between syslog-ng 3.8 and ES 2.x. I can see syslog-ng writing data to the local log file destination, and the syslog-ng and ES logs indicate the TCP connection was successful on startup. But I see no data in Kibana, nor do I see any activity in the syslog-ng or ES logs, using either of the templates below.

NOTE: Removing syslog-ng -> ES and running the flat logs through Logstash does populate data, but that is not the solution I am after.

destination d_es {
    elasticsearch2(
        index("syslog-ng_${YEAR}.${MONTH}.${DAY}")
        type("syslog-ng")   # the type of the index, e.g. type("test")
        template("$(format-json --scope rfc5424 --exclude DATE --key ISODATE @timestamp=${ISODATE})")
        #template("$(format-json --scope rfc3164 --scope nv-pairs --exclude R_DATE --key ISODATE)\n")
        port("9300")
        server("localhost")
        flush_limit("5000")
        client_mode("node")
        cluster("syslog-ng")
        custom_id("syslog")
        resource("/etc/elasticsearch/elasticsearch.yml")
        client_lib_dir("/usr/share/elasticsearch/lib")
        concurrent_requests("100")
    );
};

# sending logs to the ES destination
log {
    source(s_net);
    parser(pattern_db);
    destination(d_es);
    flags(flow-control);
};

/etc/elasticsearch/elasticsearch.yml:

cluster.name: syslog
node.name: node-1
path.home: /usr/share/elasticsearch
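
One thing I noticed while writing this up: the cluster() option in my destination says "syslog-ng", but cluster.name in elasticsearch.yml is "syslog". As far as I understand, in node (and transport) client mode these names have to match, or the client never actually joins the cluster even though the TCP connection succeeds. I plan to retest with the names aligned, roughly like this (a minimal sketch only; server/port are the ES 2.x transport defaults, and I have dropped the options that should not matter for the join):

destination d_es {
    elasticsearch2(
        index("syslog-ng_${YEAR}.${MONTH}.${DAY}")
        type("syslog-ng")
        template("$(format-json --scope rfc5424 --exclude DATE --key ISODATE @timestamp=${ISODATE})")
        server("localhost")
        port("9300")
        client_mode("transport")
        cluster("syslog")   # must match cluster.name in elasticsearch.yml
        client_lib_dir("/usr/share/elasticsearch/lib")
    );
};

Does that sound like a plausible cause, or is something else needed on the ES 2 side?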