I thought I'd share something with the list that I discovered recently. I had been archiving logs for about 70 hosts using syslog-ng for the last year, and recently we discovered that our ssh gateway had been rooted. I archived the logs by host like this:

    destination hosts {
        file("/var/log/HOSTS/$HOST/$FACILITY/$YEAR/$MONTH/$DAY/$FACILITY$YEAR$MONTH$DAY"
            owner(root) group(root) perm(0600) dir_perm(0700) create_dirs(yes));
    };
    log { source(src); destination(hosts); };

It's easy to use find and grep to dig up log messages relating to particular events, but since I was only logging into directories organized by host, I couldn't easily search ALL logs for a certain date. I could write scripts to look for certain dates in filenames, but that's not as fast or efficient as just searching a single directory holding logs from all hosts for that day. So I added this as well:

    destination times {
        file("/var/log/TIMES/$YEAR/$MONTH/$DAY/$HOST/$FACILITY$YEAR$MONTH$DAY"
            owner(root) group(root) perm(0600) dir_perm(0700) create_dirs(yes));
    };
    log { source(src); destination(times); };

What do people think? I mounted /var/log/HOSTS and /var/log/TIMES each on their own 18 GB Ultra2 SCSI 10k RPM drive, on a machine with a single PIII 550 and 768 MB of RAM. Performance and disk space aren't an issue, and I can always add another processor and more disks (hot-swap SCA SCSI). I just want to hear whether people think I'm wasting disk space and should script something up instead, or whether they like the idea of fast searches by either date or hostname.

TIA,

-- Nate
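
P.S. In case it helps to see what I mean by fast searches by date, here's roughly what looking for one day's sshd activity across every host looks like under each layout. The date and search string are just examples, and the commands are only a sketch of how I'd go about it:

    # TIMES layout: everything for that day sits under one directory,
    # so a single glob covers every host and every facility file
    grep sshd /var/log/TIMES/2002/06/15/*/*

    # HOSTS layout: find has to walk every host/facility tree looking
    # for that date before grep can run over the matching files
    find /var/log/HOSTS -type f -path '*/2002/06/15/*' -print | xargs grep sshd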