Hi all,

I have a problem with squid and my loghost (the access log is fed to it
via "tail | logger"): apparently each entry is duplicated on the same line.

Log output:
--------------------
11:42:46 arapacu.epm.br [user.notice] squid_access: 1122475366.008 220 172.16.160.185 TCP_MISS/200 437 GET http://kh.google.com/flatfile? - DIRECT/216.239.39.93 application/octet-stream squid_access: 1122475366.008 220 172.16.160.185 TCP_MISS/200 437 GET http://kh.google.com/flatfile? - DIRECT/216.239.39.93 application/octet-stream
--------------------

syslog-ng configuration (summary):
--------------------------------------------------------
options {
        sync(0);
        use_fqdn(yes);
        keep_hostname(yes);
        chain_hostnames(no);
        long_hostnames(no);
        create_dirs(yes);
        group(syslog);
        dir_group(syslog);
        perm(0640);
        dir_perm(0750);
        stats(3600);
        time_reopen(10);
        log_msg_size(10240);
        log_fifo_size(4096);
        dns_cache(yes);
        use_dns(yes);
        mark(600);
        bad_hostname("gconfd");
};

source src {
        unix-dgram("/var/run/log");
        unix-dgram("/var/run/logpriv" perm(0600));
        internal();
        file("/dev/klog");
};

source s_stunnel {
        tcp(ip("127.0.0.1") port(514) keep-alive(yes) max-connections(300));
};

source s_udp {
        udp();
};

destination r_squida {
        file("/syslog/clients/$HOST/squid_access/$YEAR.$MONTH.$DAY"
             template("$HOUR:$MIN:$SEC $HOST [$FACILITY.$PRIORITY] $MSG $MSG\n")
             template_escape(no));
};

filter f_squida {
        match("squid_access");
};

log {
        source(src);
        source(s_udp);
        source(s_stunnel);
        filter(f_squida);
        destination(r_squida);
        flags(final);
};
--------------------------------------------------------

Thanks in advance.

Best regards,
Ricardo A. Reis
UNIFESP - SENAI
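
Note: the r_squida destination above uses "$MSG $MSG" in its template, and
syslog-ng expands a template macro once for each occurrence, so that alone
would print every message twice on one line. As a minimal sketch (assuming
the doubled $MSG is unintentional and the rest of the destination stays as
posted; not a confirmed fix for the duplication), the destination with a
single $MSG would be:

--------------------------------------------------------
destination r_squida {
        # same file path and options as above, but $MSG appears only once,
        # so each squid_access entry is written a single time per line
        file("/syslog/clients/$HOST/squid_access/$YEAR.$MONTH.$DAY"
             template("$HOUR:$MIN:$SEC $HOST [$FACILITY.$PRIORITY] $MSG\n")
             template_escape(no));
};
--------------------------------------------------------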