On Tue, 30 Nov 2004, Russell Adams wrote:
That was going to be my next question! ;]
So what do you use to parse all that data?
I'm using Logmuncher which has proven to be quite flexible and meets my needs.
On Tue, Nov 30, 2004 at 12:21:26PM -0800, Bill Nash wrote:
I'm chucking 15 gigs of syslog per day, to the tune of:

[syslog]$ du
15289740        ./11-29

[11-29]$ cat * | nice wc -l
40784743
On top of that, the entirety of it is thrown to the mercy of a perl-based log analyzer (single-threaded, no less), which in turn filters and logs to a db, at an average rate of 472 lines per second.
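That 472 lines/second average is consistent with the daily line count quoted above; a quick sanity check, assuming the counter covers a full 24-hour day:

```python
# Daily syslog line count quoted above, averaged over a 24-hour day.
lines_per_day = 40_784_743
seconds_per_day = 24 * 60 * 60  # 86400

avg_rate = lines_per_day / seconds_per_day
print(round(avg_rate))  # prints 472
```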
I use a custom POE-based near-real-time analyzer I built myself. Syslog-ng forks the incoming streams, logging to disk and also relaying (unfiltered) to a TCP target, which is a socket on my analyzer daemon. The daemon breaks up each incoming packet by facility and severity, runs a set of rules against it, and performs any number of functions on the backend depending on the rule matched (logging to mysql, alarms, etc). It's essentially a perl version of Netcool that uses perl's regexp engine. - billn
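The syslog-ng fan-out described above (one copy to disk, one unfiltered copy to the analyzer's TCP socket) could be sketched with a config fragment like the following; the source definition, file path, and port 5140 are illustrative assumptions, not taken from the original post:

```
# Hypothetical syslog-ng configuration sketching the described fan-out.
source s_net { udp(ip(0.0.0.0) port(514)); };

# One copy of every message goes to disk...
destination d_disk { file("/var/log/syslog/all.log"); };

# ...and an unfiltered copy is relayed to the analyzer daemon's socket.
destination d_analyzer { tcp("127.0.0.1" port(5140)); };

log { source(s_net); destination(d_disk); destination(d_analyzer); };
```

On the analyzer side, the facility/severity split mentioned above falls out of the syslog priority value: per RFC 3164, a message's PRI encodes facility * 8 + severity, so facility = PRI div 8 and severity = PRI mod 8.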