Hi,

My initial thought is that your pattern matching is simply failing. I had an issue where an oddly formatted log source was not populating the program part of the syslog message correctly, causing syslog-ng not to match what I thought it should.

You might try using some of the built-in macros (without JSON initially) to see what is currently being parsed into which macros. You can play with this a bit to figure out what is happening. One way is to define templates in a test destination, and then you can see what is being parsed. Here is an example from when I was unsure of what was being parsed into the date:

destination d_local_template {
    file("/var/log/template_messages"
        template("date ${ISODATE} host ${HOST} program ${PROGRAM} message ${MESSAGE}\n")
    );
};

Please look at the admin guide for other macros you can play with, but this should give you some basic tools to work with.

Also - have you looked at patterndb? Here are a couple of snippets that are working well for me:

parser p_proxy {
    db-parser(file("/usr/local/etc/patterndb.d/proxy.xml"));
};

destination d_redis {
    redis(
        host("localhost")
        command("LPUSH", "logstash", "$(format-json
            type=proxy
            proxy_time=${PROXY.TIME}
            proxy_time_taken=${PROXY.TIME_TAKEN}
            proxy_c_ip=${PROXY.C_IP}
            proxy_sc_status=${PROXY.SC_STATUS}
            proxy_s_action=${PROXY.S_ACTION}
            proxy_sc_bytes=int64(${PROXY.SC_BYTES})
            proxy_cs_bytes=int64(${PROXY.CS_BYTES})
            proxy_cs_method=${PROXY.CS_METHOD}
            proxy_cs_uri_scheme=${PROXY.CS_URI_SCHEME}
            proxy_cs_host=${PROXY.CS_HOST}
            proxy_cs_uri_port=${PROXY.CS_URI_PORT}
            proxy_cs_uri_path=${PROXY.CS_URI_PATH}
            proxy_cs_uri_equery=${PROXY.CS_URI_EQUERY}
            proxy_cs_username=${PROXY.CS_USERNAME}
            proxy_cs_auth_group=${PROXY.CS_AUTH__GROUP}
            proxy_s_supplier_name=${PROXY.S_SUPPLIER_NAME}
            proxy_content_type=${PROXY.CONTENT_TYPE}
            proxy_referrer=${PROXY.REFERRER}
            proxy_user_agent=${PROXY.USER_AGENT}
            proxy_filter_result=${PROXY.FILTER_RESULT}
            proxy_cs_categories=${PROXY.CS_CATEGORIES}
            proxy_x_virus_id=${PROXY.X_VIRUS_ID}
            proxy_s_ip=${PROXY.S_IP}
            proxy_any=${PROXY.ANYREST})\n")
    );
};

log { source(s_network); parser(p_proxy); destination(d_redis); };

And from redis I (currently) use logstash to pull events and feed elasticsearch, with redis acting as a very fast buffer between syslog-ng and elasticsearch.

Hope this helps.

Jim

---- Russell Fulton <r.fulton@auckland.ac.nz> wrote:
Hi
Thanks to various folks I managed to build 3.7b2 with JSON support. I am now trying to convert my ELSA config to produce parsed output in JSON that I can feed into Elasticsearch. Having failed to make my full config work (I had to try ;) I tried a basic one based on https://www.balabit.com/sites/default/files/documents/syslog-ng-ose-latest-g...

[ Aside: minor syntax error on this page - inside log {} you cannot name parser elements ]

Here is my conf:

@version: 3.7
source s_json { network(port(1514) flags(no-parse)); };
destination d_json {
    file("/data/russell/test.json"
        template("$(format-json --scope dot-nv-pairs)\n"));
};
log { source(s_json); parser { json-parser(prefix(".json.")); }; destination(d_json); };

I get no output, and 'stats' shows:

[rful011@secmgrprd01 ~]$ sudo /usr/local/syslog-ng/sbin/syslog-ng-ctl stats
SourceName;SourceId;SourceInstance;State;Type;Number
src.none;;;a;processed;0
src.none;;;a;stamp;0
source;s_json;;a;processed;19375
global;payload_reallocs;;a;processed;25710
global;msg_clones;;a;processed;0
destination;d_json;;a;processed;0
center;;queued;a;processed;0
global;sdata_updates;;a;processed;0
center;;received;a;processed;19375
global;internal_queue_length;;a;processed;19378

This is the same as I get with my full config with lots of patterns. As usual I am missing something basic!

______________________________________________________________________________
Member info: https://lists.balabit.hu/mailman/listinfo/syslog-ng
Documentation: http://www.balabit.com/support/documentation/?product=syslog-ng
FAQ: http://www.balabit.com/wiki/syslog-ng-faq
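For reference, the named-parser form that the aside above refers to would look like the sketch below: the parser is defined at the top level with a name and only referenced inside log {}. This is an untested rearrangement of Russell's own config, and the parser name p_json is an arbitrary choice, not anything from the original post:

```
@version: 3.7
source s_json { network(port(1514) flags(no-parse)); };

# Named parser defined at the top level, not inside log {}
parser p_json { json-parser(prefix(".json.")); };

destination d_json {
    file("/data/russell/test.json"
        template("$(format-json --scope dot-nv-pairs)\n"));
};

# The log path only refers to the parser by name
log { source(s_json); parser(p_json); destination(d_json); };
```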