[syslog-ng] splitting logs on central log host?

Jeffrey Forman syslog-ng@lists.balabit.hu
Mon, 16 Feb 2004 16:17:37 -0600



Bazsi,

I'd like the client machines to log to their own file structure
under /var/log and send copies to the loghost; that is why I have the
filtering on the client side. As for your second suggestion, setting up
separate ports per machine would be a logistical and security
nightmare, since we will be adding more and more machines as time
goes on. I tried specifying just a directory on the loghost via
/mnt/scratch/logs/$HOST, but that didn't quite work: it either complained
that no file or directory was found, or it logged everything into one file.
Any suggestions?
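
For reference, here is a minimal sketch of the kind of per-host,
per-program destination I have been aiming for on the loghost (the
$PROGRAM macro and the exact path layout are an assumption on my part,
not something I have verified):

# loghost: one file per client host and per program,
# creating the directory tree on demand (sketch only)
destination d_perhost {
    file("/mnt/scratch/logs/$HOST/$PROGRAM.log"
        owner(root) group(root) perm(0600) dir_perm(0700)
        create_dirs(yes));
};

log { source(src); destination(d_perhost); };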

-Jeffrey

On Mon, 2004-02-16 at 16:05, Balazs Scheidler wrote:
> On Mon, 2004-02-16 at 22:51, Jeffrey Forman wrote:
> > I have just set up syslog-ng to log about 15 machines to a central log
> > host, and I absolutely love it, but I have a question. Right now I'm
> > splitting the logs like this on the loghost:
> > destination hosts {
> >    file("/mnt/backups/logs/$HOST/$YEAR/$MONTH/$FACILITY$YEAR$MONTH"
> >    owner(root) group(root) perm(0600) dir_perm(0700) create_dirs(yes));
> > };
> >
> > but that only splits the logs by facility. There are more logs being
> > written on the client machines, which are configured like this:
> >
> > destination messages { file("/var/log/messages"); };
> > destination ssh { file("/var/log/ssh.log"); };
> > destination cron { file("/var/log/cron.log"); };
> > destination auth { file("/var/log/auth.log"); };
> > destination syslog { file("/var/log/syslog.log"); };
> > destination xinetd { file("/var/log/xinetd.log"); };
> > destination rsync { file("/var/log/rsync.log"); };
> > destination cfengine { file("/var/log/cfengine.log"); };
> >
> > filter f_ssh { program("sshd"); };
> > filter f_cron { program("cron"); };
> > filter f_auth { program("su") or program("sudo"); };
> > filter f_syslog { program("syslog-ng"); };
> > filter f_xinetd { program("xinetd"); };
> > filter f_rsync { program("rsyncd"); };
> > filter f_cfengine { program("cfengine"); };
> > filter f_messages { ...}; // with messages getting everything else
> >
> > log { source(src); filter(f_ssh); destination(ssh); };
> > log { source(src); filter(f_cron); destination(cron); };
> > log { source(src); filter(f_auth); destination(auth); };
> > log { source(src); filter(f_syslog); destination(syslog); };
> > log { source(src); filter(f_xinetd); destination(xinetd); };
> > log { source(src); filter(f_rsync); destination(rsync); };
> > log { source(src); filter(f_cfengine); destination(cfengine); };
> > log { source(src); filter(f_messages); destination(messages); };
> >
> > What I would like to do is have the logs written on the loghost the
> > same way they are written locally on the client machines. Do I have to
> > change the destination? How do I get the same file names on the
> > loghost that are being used locally? Any help is greatly appreciated.
>
> I'm not sure I understand you correctly, but I think you have two
> options:
>
> 1) move the filtering/destinations to the loghost
>
> 2) create separate destinations for each of your filters, which send the
> log information to a specific port (e.g. send all ssh logs to port
> 10001, cron to 10002, auth to 10003), and use that port information on
> the loghost to sort messages into the various destinations (rough
> sketch below)
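>
> A rough sketch of option 2 (the port numbers, host name and file paths
> here are only examples, not taken from your configs):
>
> # client side: one network destination per filter
> destination d_ssh_remote  { tcp("loghost" port(10001)); };
> destination d_cron_remote { tcp("loghost" port(10002)); };
> log { source(src); filter(f_ssh);  destination(d_ssh_remote); };
> log { source(src); filter(f_cron); destination(d_cron_remote); };
>
> # loghost side: one source per port, each feeding its own file
> source s_ssh  { tcp(ip(0.0.0.0) port(10001)); };
> source s_cron { tcp(ip(0.0.0.0) port(10002)); };
> destination d_ssh  { file("/mnt/backups/logs/$HOST/ssh.log" create_dirs(yes)); };
> destination d_cron { file("/mnt/backups/logs/$HOST/cron.log" create_dirs(yes)); };
> log { source(s_ssh);  destination(d_ssh); };
> log { source(s_cron); destination(d_cron); };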
