<div>
<blockquote class="gmail_quote" style="PADDING-LEFT: 1ex; MARGIN: 0px 0px 0px 0.8ex; BORDER-LEFT: #ccc 1px solid">> On 6/5/07, Alexander Clouter <<a href="mailto:ac56@soas.ac.uk">ac56@soas.ac.uk</a>> wrote:<br>> >
<br>> >If you are logging to a file and having Perl munch on the log file, you can<br>> >remove the Perl script and pipe from the whole works. syslog-ng can be<br>> >configured to read a log file directly; it effectively does a 'tail -f'
<br>> >on the file.<br>><br>> Except in cases where you need the Perl script to post-process the log file<br>> to produce a normalized or custom syslog message format.<br>><br>Agreed; however, a better design would be:
<br><br>&lt;Log&gt; ---> &lt;Perl&gt; ---> &lt;Log2&gt;<br><br>&lt;Log2&gt; --[using 'tail' function]--> syslog-ng -> syslog-ng<br><br>These would be disconnected and you would not run into any nasty problems.
<br>Another advantage is that if the Perl code soaks up CPU cycles and the<br>logging is not urgent (needed at the final destination ASAP), you can nice<br>down the priority of the process to keep it from stomping on syslog-ng's<br>performance.</blockquote>
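<div>The quoted two-stage design could be sketched as a syslog-ng configuration along these lines. This is a minimal sketch only: the file path, hostname, and source/destination names are hypothetical, and the syntax follows syslog-ng 2.x (check your version's file() source options):</div>

```
# Follow the Perl script's output file, much as 'tail -f' would.
source s_log2 {
    file("/var/log/app/normalized.log" follow_freq(1));
};

# Relay to the central syslog-ng server.
destination d_remote {
    udp("loghost.example.com" port(514));
};

log { source(s_log2); destination(d_remote); };
```

<div>The Perl stage runs independently of this, so it can be started under "nice" without touching syslog-ng's scheduling.</div>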
<div> </div>
<div>But that scenario only supports logs that are "syslog friendly". If you have to do any normalization, post-processing, or handling of multi-line records in a log file, you need a post-processor such as a Perl script. Not all logs contain human-readable text, and the post-processing Perl script can translate each log record into something more useful.
</div>
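<div>The thread's post-processor of choice is Perl; as a language-neutral sketch of the multi-line case described above, here is a minimal normalizer in Python. The timestamp convention, the sample records, and the " | " join are hypothetical assumptions for illustration, not anything from the thread:</div>

```python
import re

# Hypothetical convention: a new record starts with an ISO-style date;
# anything else (e.g. a stack-trace line) continues the current record.
RECORD_START = re.compile(r"^\d{4}-\d{2}-\d{2} ")

def normalize(lines):
    """Collapse multi-line log events into one syslog-friendly line each."""
    record = []
    for line in lines:
        line = line.rstrip("\n")
        if RECORD_START.match(line) and record:
            yield " | ".join(record)
            record = []
        record.append(line)
    if record:
        yield " | ".join(record)

raw = [
    "2007-06-05 10:00:01 ERROR something broke\n",
    "  at Foo.bar(Foo.java:42)\n",
    "2007-06-05 10:00:02 INFO recovered\n",
]
for event in normalize(raw):
    print(event)
```

<div>Each emitted line is a single record that syslog-ng can then forward unmodified.</div>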
<div> </div>
<div>The CPU cost of Perl parsing is minimal; it is a common myth that Perl is a CPU hog. I have Perl scripts processing billions of log events per week.</div><br> </div>