log path with two destinations - flow control problem
Hello to all,

I'm trying to set up syslog-ng to store sendmail messages in PostgreSQL tables: one simple table (tableA) which stores all messages, and another table (tableB) which stores only some relevant information.

All messages are parsed by my parser, then logged into tableA, then checked against a filter, and those messages which pass are logged into tableB. It all looks like this:

    log {
        source(pipe_load);
        filter(f_sendmail);               #-- basic filter for the sendmail program
        parser(maillog);                  #-- parse message details
        destination(maillog_base);        #-- store the message in tableA - basic info only
        filter(f_sendmail_from_or_to);    #-- keep only messages which have a "from" or "to" address defined
        destination(maillog_detail);      #-- store the detailed info in tableB
        flags(flow-control);
    };

The filtering itself works fine, but I have a problem with flow control. Messages are sent from a file (an old archived maillog) into the syslog-ng pipe source. The problem is that although flow control is enabled, messages get dropped at the maillog_base destination.

I'm doing "cat old_maillog > syslog-pile", where old_maillog has approx. 1,800,000 records, but only about 400,000 records get logged into tableA. Flow control does not seem to work in this scenario.

When I comment out the lines with filter(f_sendmail_from_or_to) and destination(maillog_detail), effectively disabling logging into the second table, all messages from the input file are processed and stored in tableA. Flow control works fine in that case.

Is there something wrong with filtering like this and storing only some messages in the second destination? I also thought about having two different log paths, one for each table, but then the parsing of the message would have to be done twice, which I want to avoid.

My question is whether it is OK to use filtering like in my example:
filter - then destination - then filter again - then another destination + flow control.

Many thanks in advance

--
Tomáš Novosad
LinuxBox.cz, s.r.o.
28. října 168, 709 00 Ostrava
tel.: +420 591 166 221    mobil: +420 737 238 655
email: tomas.novosad@linuxbox.cz    jabber: novosad@linuxbox.cz
www.linuxbox.cz
mobil servis: +420 737 238 656    email servis: servis@linuxbox.cz
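For context, the filter and destination blocks referenced above are not shown in the post. A minimal sketch of how they might look; the column names, database credentials, and the MAIL.* parser macros are assumptions for illustration, not the poster's actual definitions:

    # Illustrative only -- the actual definitions are not shown in the post.
    filter f_sendmail { program("sendmail"); };

    # Assumes the maillog parser stores addresses in MAIL.FROM / MAIL.TO.
    filter f_sendmail_from_or_to {
        match(".+" value("MAIL.FROM")) or match(".+" value("MAIL.TO"));
    };

    destination maillog_base {
        sql(type(pgsql)
            host("localhost") username("syslog") password("secret")
            database("maillog") table("tableA")
            columns("datetime", "host", "message")
            values("${R_DATE}", "${HOST}", "${MESSAGE}"));
    };

    destination maillog_detail {
        sql(type(pgsql)
            host("localhost") username("syslog") password("secret")
            database("maillog") table("tableB")
            columns("datetime", "queue_id", "from_addr", "to_addr")
            values("${R_DATE}", "${MAIL.QUEUEID}", "${MAIL.FROM}", "${MAIL.TO}"));
    };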
A while back I recall something related to multiple destinations in the same log statement and flow-control issues. Try this:

    log {
        source(pipe_load);
        parser(maillog);                      #-- parse message details

        log {
            filter(f_sendmail);               #-- basic filter for the sendmail program
            destination(maillog_base);        #-- store the message in tableA - basic info only
            flags(flow-control);
        };

        log {
            filter(f_sendmail_from_or_to);    #-- keep only messages which have a "from" or "to" address defined
            destination(maillog_detail);      #-- store the detailed info in tableB
            flags(flow-control);
        };
    };

Evan.
--
Evan Rempel                        erempel@uvic.ca
Senior Systems Administrator       250.721.7691
Data Centre Services, University Systems, University of Victoria
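Splitting the paths does not change how flow control itself works: it can only throttle the source as far as the source window and the destination queues allow, which matters for a bulk replay like this one. A sketch of the knobs involved; the pipe path and the sizes are illustrative values, not taken from the thread:

    # Illustrative values only -- tune to the actual setup.
    source pipe_load {
        pipe("/var/spool/syslog-pipe"
            log_iw_size(20000)       # initial window shared by the flow-controlled paths
            log_fetch_limit(100));   # messages read from the pipe per poll
    };

On the destination side, log_fifo_size() on the sql() driver sets how many messages may queue in front of the database; the usual guidance is to keep it at least as large as the source's log_iw_size() when flow control is enabled.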
The multiple embedded log statements are the expected solution, as Evan wrote. You can even move common processing elements upwards, before the first embedded log statement, so they get applied to both paths.

The dst, filter, dst sequence should also work, though. I'll check that.
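One way to confirm whether the destination is really dropping messages (rather than the writer losing lines before they reach the pipe) is to watch syslog-ng's internal counters during the replay. A sketch, assuming syslog-ng-ctl is installed and uses its default control socket:

    # Dropped/processed counters for the two sql destinations.
    syslog-ng-ctl stats | grep -E 'maillog_(base|detail)'

    # Replay the archive; with a named pipe, cat blocks when syslog-ng stops
    # reading, which is exactly how flow control pushes back on the writer.
    cat old_maillog > syslog-pile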
Many thanks for the replies. I've just tested the proposed solution and it seems to work as expected. Thanks again, and I wish you all a nice day.

--
Tomáš Novosad
LinuxBox.cz, s.r.o.
28. října 168, 709 00 Ostrava
tel.: +420 591 166 221    mobil: +420 737 238 655
email: tomas.novosad@linuxbox.cz    jabber: novosad@linuxbox.cz
www.linuxbox.cz
mobil servis: +420 737 238 656    email servis: servis@linuxbox.cz
participants (3)
- Balazs Scheidler
- Evan Rempel
- Tomáš Novosad