Connect syslog-ng to HDFS
Hi,

I'm trying to connect syslog-ng 3.14.1 to HDFS to store syslog messages. syslog-ng starts without errors and is able to write to a local file; however, the logs are not written to HDFS. As there is no error at all, I'm not sure how to troubleshoot this.

I have installed syslog-ng from this repo: https://copr.fedorainfracloud.org/coprs/czanik/syslog-ng314/repo/epel-7/czanik-syslog-ng314-epel-7.repo

Installed Packages:
syslog-ng.x86_64            3.14.1-4.el7.centos   @czanik-syslog-ng314
syslog-ng-java.x86_64       3.14.1-4.el7.centos   @czanik-syslog-ng314
syslog-ng-java-deps.noarch  1.0-2                 @czanik-syslog-ng314

This is the message from /var/log/messages:

Log statistics; processed='src.internal(s_sys#0)=1', stamp='src.internal(s_sys#0)=1534491834', processed='destination(d_spol)=0', processed='destination(d_mlal)=0', processed='center(received)=2', processed='destination(d_mesg)=1', processed='destination(d_mail)=0', processed='destination(d_auth)=0', processed='destination(d_cron)=0', processed='destination(d_hdfs)=1', processed='center(queued)=3', queued='global(scratch_buffers_count)=0', processed='source(remote_log)=1', dropped='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://x.x.x.x:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=0', processed='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://x.x.x.x:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=1', queued='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://x.x.x.x:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=1', processed='global(payload_reallocs)=0', processed='src.journald(journal)=0', stamp='src.journald(journal)=0', processed='global(sdata_updates)=0', queued='global(scratch_buffers_bytes)=0', processed='destination(d_boot)=0', processed='destination(d_kern)=0', processed='source(s_sys)=1', processed='destination(remote)=1', processed='global(internal_queue_length)=0', processed='global(msg_clones)=0'

Does anyone have an idea of how I should proceed with troubleshooting?
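A note for readers following along: the same counters can be queried on demand with syslog-ng-ctl instead of waiting for the periodic stats line. A minimal sketch, assuming the default control socket location:

    # Dump live statistics and filter for the HDFS destination's counters
    syslog-ng-ctl stats | grep hdfs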
Hi,

As https://copr.fedorainfracloud.org/coprs/czanik/syslog-ng314/ also notes: "Java is enabled, but JAR dependencies are not provided in package, except for Elasticsearch http mode." The syslog-ng-java-deps.noarch package contains build-time dependencies only. Probably I should rename the package to syslog-ng-java-build-deps...

Check the documentation at https://www.syslog-ng.com/technical-documents/doc/syslog-ng-open-source-edit... on how to download and configure the HDFS-related JAR dependencies.

Bye,

Peter Czanik (CzP) <peter.czanik@balabit.com>
Balabit / syslog-ng upstream
https://syslog-ng.com/community/
https://twitter.com/PCzanik
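For reference, the documented approach is to download the Hadoop client JARs separately and point the destination at them with the client-lib-dir() option. A minimal sketch, with placeholder paths and namenode address (not the poster's actual values):

    destination d_hdfs {
        hdfs(
            client-lib-dir("/opt/hadoop/lib")          # directory containing the Hadoop client JARs
            hdfs-uri("hdfs://namenode.example.com:8020")
            hdfs-file("/user/syslog/${HOST}.log")
        );
    };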
Hello!

The statistics show that the log message is queued but never sent to the HDFS server:

dropped='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://x.x.x.x:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=0'
processed='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://x.x.x.x:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=1'
queued='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://x.x.x.x:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=1'

Generally, on a write error there should be an exception that results in an error message. Try debugging it by running syslog-ng in the foreground (-F), forwarding internal logs to stderr (-e), and turning debug/verbose mode on (-dv). Alternatively, in service mode, use the internal() source in your config and connect it to a destination you prefer (e.g. file()).

You could turn on debug messages on the Java side too, using jvm_options() in the syslog-ng config and configuring the log4j logging service, e.g.:

options {
    jvm_options("-Dlog4j.configuration=file:/etc/hadoop/log4j.properties -Dlog4j.debug=true");
};

Regards,
Gabor
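Spelled out, those two suggestions look roughly like this (binary and config paths are the usual defaults and may differ on your system):

    # Foreground run with internal logs on stderr and debug + verbose messages
    /usr/sbin/syslog-ng -F -e -d -v

    # Service-mode alternative: route syslog-ng's own messages to a file
    source s_internal { internal(); };
    destination d_debug { file("/var/log/syslog-ng-debug.log"); };
    log { source(s_internal); destination(d_debug); };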
Hi Gabor,

I have run it; it seems to stop on the HDFS side:

[2018-08-21T10:07:51.212015] Worker thread started; driver='d_hdfs#0'
[2018-08-21T10:07:51.212499] Running application hooks; hook='1'
[2018-08-21T10:07:51.212516] Running application hooks; hook='3'
[2018-08-21T10:07:51.212595] syslog-ng starting up; version='3.14.1'
[2018-08-21T10:07:51.214113] Opening hdfs;
[2018-08-21T10:08:01.215622] Opening hdfs;
(the "Opening hdfs;" line repeats every ten seconds until 10:11:51)
^C[2018-08-21T10:11:59.672252] syslog-ng shutting down; version='3.14.1'
Exception in thread "" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
        at org.syslog_ng.hdfs.HdfsDestination.open(HdfsDestination.java:92)
        at org.syslog_ng.LogDestination.openProxy(LogDestination.java:65)
[2018-08-21T10:11:59.774895] Worker thread finished; driver='d_hdfs#0'
[2018-08-21T10:11:59.775384] Closing log transport fd; fd='13'
[2018-08-21T10:11:59.775508] Deinitialize hdfs destination;
[2018-08-21T10:11:59.776534] Java machine free;
[2018-08-21T10:11:59.778421] Running application hooks; hook='4'

Any idea what should be checked further?

Thank you.
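org/apache/hadoop/conf/Configuration ships in the hadoop-common JAR, so a NoClassDefFoundError for it almost always means the Hadoop client JARs are not visible in the directory given to client-lib-dir(). A quick sanity check, using the /opt/hadoop/lib path that appears later in this thread:

    # The client-lib-dir() directory should contain hadoop-common and its dependencies
    ls -l /opt/hadoop/lib | grep -i hadoop-common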
Hi,

Can you tell me which version of the Hadoop library you use with syslog-ng, please? And can you share your syslog-ng configuration, mainly the hdfs part?

Regards,
Gabor
Hello,

I remember seeing this very error message when syslog-ng was not able to find the HDFS libraries in the directory I specified in client-lib-dir(). Then I realized my directory was called "libs", while syslog-ng was looking for them under "lib".

Regards,
János

--
Janos SZIGETVARI
RHCE, License no. 150-053-692 <https://www.redhat.com/rhtapps/verify/?certId=150-053-692>
LinkedIn: linkedin.com/in/janosszigetvari
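A hedged way to catch the same mismatch on your own system: compare the directory named in the config against what is actually on disk (the config path and lib path below are this thread's examples):

    # Show the configured directory, then verify it exists and holds the JARs
    grep -o 'client[_-]lib[_-]dir("[^"]*")' /etc/syslog-ng/syslog-ng.conf
    ls -ld /opt/hadoop/lib && ls /opt/hadoop/lib/*.jar | head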
Hi János,

Thanks for the reminder. The directory is correct, but I realized the JAR files inside the lib directory were read-only. After I changed their permissions to executable, the error messages are different and it no longer hangs at opening HDFS.

Thank you.
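For the record, the JVM only needs read access to the JAR files themselves, plus execute (traverse) permission on the directory, for whichever user syslog-ng runs as. A sketch of the equivalent permission fix, using this thread's path:

    # Readable JARs, traversable directory
    chmod 644 /opt/hadoop/lib/*.jar
    chmod 755 /opt/hadoop/lib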
Hi Gabor,

After changing the lib permissions as János reminded me, there is now a different error. Any idea whether this is a Kerberos issue or still a Hadoop lib issue? The Hadoop I'm using is Huawei FusionInsight; the Hadoop HDFS version should be 2.7.2.

This is the HDFS part of syslog-ng.conf:

destination d_hdfs {
    hdfs(
        client_lib_dir("/opt/hadoop/lib")
        hdfs_uri("hdfs://x.x.x.x:25000")
        kerberos-keytab-file("/etc/syslog.keytab")
        kerberos-principal("syslog@HADOOP.COM")
        hdfs_file("/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log")
        template(t_cgnat)
        hdfs-append-enabled(true)
    );
};

These are the messages I get when starting in debug mode:

[2018-08-23T09:40:09.210168] Running application hooks; hook='1'
[2018-08-23T09:40:09.210192] Running application hooks; hook='3'
[2018-08-23T09:40:09.210501] syslog-ng starting up; version='3.14.1'
[2018-08-23T09:40:09.213049] Worker thread started; driver='d_hdfs#0'
[2018-08-23T09:40:09.214922] Opening hdfs;
[2018-08-23T09:40:09.548286] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)]);
[2018-08-23T09:40:09.561345] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)]);
[2018-08-23T09:40:09.561608] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups]);
[2018-08-23T09:40:09.562485] UgiMetrics, User and group related metrics;
[2018-08-23T09:40:09.604037] Failed to detect a valid hadoop home directory;
[2018-08-23T09:40:09.687386] setsid exited with exit code 0;
[2018-08-23T09:40:09.715804] Creating new Groups object;
[2018-08-23T09:40:09.717743] Trying to load the custom-built native-hadoop library...;
[2018-08-23T09:40:09.718065] Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path;
[2018-08-23T09:40:09.718095] java.library.path=//usr/lib64/syslog-ng;
[2018-08-23T09:40:09.718117] Unable to load native-hadoop library for your platform... using builtin-java classes where applicable;
[2018-08-23T09:40:09.718418] Falling back to shell based;
[2018-08-23T09:40:09.718997] Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping;
[2018-08-23T09:40:09.753615] Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback\; cacheTimeout=300000\; warningDeltaMs=5000;
[2018-08-23T09:40:09.905060] hadoop login;
[2018-08-23T09:40:09.906493] hadoop login commit;
[2018-08-23T09:40:09.907300] using kerberos user:syslog@HADOOP.COM;
[2018-08-23T09:40:09.907333] Using user: "syslog@HADOOP.COM" with name syslog@HADOOP.COM;
[2018-08-23T09:40:09.907592] User entry: "syslog@HADOOP.COM";
[2018-08-23T09:40:09.908005] Login successful for user syslog@HADOOP.COM using keytab file syslog.keytab;
[2018-08-23T09:40:10.104386] dfs.client.use.legacy.blockreader.local = false;
[2018-08-23T09:40:10.104436] dfs.client.read.shortcircuit = false;
[2018-08-23T09:40:10.104450] dfs.client.domain.socket.data.traffic = false;
[2018-08-23T09:40:10.104461] dfs.domain.socket.path = ;
[2018-08-23T09:40:10.121280] Sets dfs.client.block.write.replace-datanode-on-failure.replication to 0;
[2018-08-23T09:40:10.144892] multipleLinearRandomRetry = null;
[2018-08-23T09:40:10.168901] rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcProtobufRequest, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@4379f1a4;
[2018-08-23T09:40:10.173738] getting client out of cache: org.apache.hadoop.ipc.Client@5b52b909;
[2018-08-23T09:40:10.238466] finalize() called.;
[2018-08-23T09:40:10.238705] finalize() called.;
[2018-08-23T09:40:10.526021] Both short-circuit local reads and UNIX domain socket are disabled.;
[2018-08-23T09:40:10.532037] DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection;
[2018-08-23T09:40:10.555581] The ping interval is 60000 ms.;
[2018-08-23T09:40:10.556336] Connecting to /x.x.x.x:25000;
[2018-08-23T09:40:10.572385] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:10.613823] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:10.723447] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:10.723514] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:10.730466] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:10.758296] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:10.759031] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:10.759531] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
(the same connect-and-fail sequence repeats at 09:40:14, twice at 09:40:17, and at 09:40:21, each ending in "No common protection layer between client and server")
[2018-08-23T09:40:26.095299] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:26.096234] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:26.096391] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:26.096411] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:26.097116] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:26.101823] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:26.101960] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:26.102283] Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.102365] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:java.io.IOException: Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.103844] closing ipc connection to /x.x.x.x:25000: Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.103994] IPC Client (1766482975) connection to /x.x.x.x:25000 from syslog@HADOOP.COM: closed;

Thank you.
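For readers hitting the same wall: "No common protection layer between client and server" is a SASL quality-of-protection mismatch, meaning the namenode demands a QOP (hadoop.rpc.protection, typically "privacy" on secured FusionInsight clusters) that the client side never configured; this matches the earlier "no QOP found in configuration" debug line. A hedged sketch of the client-side setting, assuming the value is made to match the cluster's and that the file is actually picked up by the client (for example via the hdfs-resources() option of the hdfs() destination, if your syslog-ng version provides it):

    <!-- core-site.xml on the syslog-ng host; the value must mirror the cluster's setting -->
    <property>
      <name>hadoop.rpc.protection</name>
      <value>privacy</value>
    </property>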
Hi,
Can you tell ne what is the version of the hadoop lib you use with syslog-ng, please?
Can you share your syslog-ng configuration, mainly the hdfs part, please?
Regards, Gabor
On Tue, Aug 21, 2018 at 4:15 AM Lee Keng Ket <kengket@gmail.com> wrote:
Hi, Gabor
I have run it, seems like it stops at the HDFS side.
[2018-08-21T10:07:51.212015] Worker thread started; driver='d_hdfs#0' [2018-08-21T10:07:51.212499] Running application hooks; hook='1' [2018-08-21T10:07:51.212516] Running application hooks; hook='3' [2018-08-21T10:07:51.212595] syslog-ng starting up; version='3.14.1' [2018-08-21T10:07:51.214113] Opening hdfs; [2018-08-21T10:08:01.215622] Opening hdfs; [2018-08-21T10:08:11.216050] Opening hdfs; [2018-08-21T10:08:21.226340] Opening hdfs; [2018-08-21T10:08:31.236589] Opening hdfs; [2018-08-21T10:08:41.240623] Opening hdfs; [2018-08-21T10:08:51.250879] Opening hdfs; [2018-08-21T10:09:01.261172] Opening hdfs; [2018-08-21T10:09:11.271410] Opening hdfs; [2018-08-21T10:09:21.281685] Opening hdfs; [2018-08-21T10:09:31.290765] Opening hdfs; [2018-08-21T10:09:41.301098] Opening hdfs; [2018-08-21T10:09:51.311362] Opening hdfs; [2018-08-21T10:10:01.321152] Opening hdfs; [2018-08-21T10:10:11.321818] Opening hdfs; [2018-08-21T10:10:21.330114] Opening hdfs; [2018-08-21T10:10:31.340413] Opening hdfs; [2018-08-21T10:10:41.350654] Opening hdfs; [2018-08-21T10:10:51.354016] Opening hdfs; [2018-08-21T10:11:01.364267] Opening hdfs; [2018-08-21T10:11:11.374516] Opening hdfs; [2018-08-21T10:11:21.384761] Opening hdfs; [2018-08-21T10:11:31.395017] Opening hdfs; [2018-08-21T10:11:41.402256] Opening hdfs; [2018-08-21T10:11:51.404097] Opening hdfs; ^C[2018-08-21T10:11:59.672252] syslog-ng shutting down; version='3.14.1' Exception in thread "" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration at org.syslog_ng.hdfs.HdfsDestination.open(HdfsDestination.java:92) at org.syslog_ng.LogDestination.openProxy(LogDestination.java:65) [2018-08-21T10:11:59.774895] Worker thread finished; driver='d_hdfs#0' [2018-08-21T10:11:59.775384] Closing log transport fd; fd='13' [2018-08-21T10:11:59.775508] Deinitialize hdfs destination; [2018-08-21T10:11:59.776534] Java machine free; [2018-08-21T10:11:59.778421] Running application hooks; hook='4'
Any idea what to be checked further?
Thank you.
On Fri, Aug 17, 2018 at 4:45 PM Nagy, Gábor <gabor.nagy@oneidentity.com> wrote:
Hello!
In the statistics it can be seen that the log message is not sent to the HDFS server: dropped='dst.java(d_hdfs#0 java_dst hdfs hdfs://x.x.x.x:25000 /user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=0' processed='dst.java(d_hdfs#0 java_dst hdfs hdfs://x.x.x.x:25000 /user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=1' queued='dst.java(d_hdfs#0 java_dst hdfs hdfs://x.x.x.x:25000 /user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=1'
Well, generally on write error there should be an exception that results in an error message.
You should try debugging it either in running syslog-ng in foreground (-F option), forwarding internal logs to stderr (-e) and with debug mode (-dv) on. Or in service mode use the internal() source in your config and connect it to a destination (e.g. file()) which you prefer.
You could turn on debug messages on java side too using jvm_options() in syslog-ng config and configuring the log4j logging service, e.g.: options {
jvm_options("-Dlog4j.configuration=file:/etc/hadoop/log4j.properties -Dlog4j.debug=true"); };
Regards, Gabor
Hi,

As I was searching for an answer, what I saw was that "SaslException: No common protection layer between client and server" can occur if the configuration differs between the client and server side.

You should configure the "hadoop.rpc.protection" value in core-site.xml on the client side to the same value as on the server side. To use that XML with hdfs, put its filename in the syslog-ng config, in the hdfs destination block:

destination d_hdfs {
    ...
    hdfs-resources("/path/to/xmlconfig/core-site.xml");
};

Regards, Gabor
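For reference, a minimal client-side core-site.xml sketch; "privacy" is only a placeholder, the value must match whatever the HDFS server uses (authentication, integrity, or privacy):

<?xml version="1.0"?>
<configuration>
  <property>
    <name>hadoop.rpc.protection</name>
    <!-- assumption: replace with the server's actual setting -->
    <value>privacy</value>
  </property>
</configuration>

On Thu, Aug 23, 2018 at 3:53 AM Lee Keng Ket <kengket@gmail.com> wrote: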
Hi, Gabor
After I changed the libs to be executable, as Janos reminded me, there is now a different error. Any idea whether this is a Kerberos issue or still a Hadoop lib issue?
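Presumably the permission fix mentioned above was something along these lines (a guess; the point is that the user syslog-ng runs as must be able to read the client JARs):

chmod 755 /opt/hadoop/lib/*.jar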
The Hadoop that I'm using is Huawei FusionInsight. Hadoop HDFS version should be 2.7.2.
This is the syslog-ng.conf for the HDFS part:

destination d_hdfs {
    hdfs(
        client_lib_dir("/opt/hadoop/lib")
        hdfs_uri("hdfs://x.x.x.x:25000")
        kerberos-keytab-file("/etc/syslog.keytab")
        kerberos-principal("syslog@HADOOP.COM")
        hdfs_file("/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log")
        template(t_cgnat)
        hdfs-append-enabled(true)
    );
};
These are the messages I get when starting in debug mode:
[2018-08-23T09:40:09.210168] Running application hooks; hook='1'
[2018-08-23T09:40:09.210192] Running application hooks; hook='3'
[2018-08-23T09:40:09.210501] syslog-ng starting up; version='3.14.1'
[2018-08-23T09:40:09.213049] Worker thread started; driver='d_hdfs#0'
[2018-08-23T09:40:09.214922] Opening hdfs;
[2018-08-23T09:40:09.548286] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)]);
[2018-08-23T09:40:09.561345] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)]);
[2018-08-23T09:40:09.561608] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups]);
[2018-08-23T09:40:09.562485] UgiMetrics, User and group related metrics;
[2018-08-23T09:40:09.604037] Failed to detect a valid hadoop home directory;
[2018-08-23T09:40:09.687386] setsid exited with exit code 0;
[2018-08-23T09:40:09.715804] Creating new Groups object;
[2018-08-23T09:40:09.717743] Trying to load the custom-built native-hadoop library...;
[2018-08-23T09:40:09.718065] Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path;
[2018-08-23T09:40:09.718095] java.library.path=//usr/lib64/syslog-ng;
[2018-08-23T09:40:09.718117] Unable to load native-hadoop library for your platform... using builtin-java classes where applicable;
[2018-08-23T09:40:09.718418] Falling back to shell based;
[2018-08-23T09:40:09.718997] Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping;
[2018-08-23T09:40:09.753615] Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback\; cacheTimeout=300000\; warningDeltaMs=5000;
[2018-08-23T09:40:09.905060] hadoop login;
[2018-08-23T09:40:09.906493] hadoop login commit;
[2018-08-23T09:40:09.907300] using kerberos user:syslog@HADOOP.COM;
[2018-08-23T09:40:09.907333] Using user: "syslog@HADOOP.COM" with name syslog@HADOOP.COM;
[2018-08-23T09:40:09.907592] User entry: "syslog@HADOOP.COM";
[2018-08-23T09:40:09.908005] Login successful for user syslog@HADOOP.COM using keytab file syslog.keytab;
[2018-08-23T09:40:10.104386] dfs.client.use.legacy.blockreader.local = false;
[2018-08-23T09:40:10.104436] dfs.client.read.shortcircuit = false;
[2018-08-23T09:40:10.104450] dfs.client.domain.socket.data.traffic = false;
[2018-08-23T09:40:10.104461] dfs.domain.socket.path = ;
[2018-08-23T09:40:10.121280] Sets dfs.client.block.write.replace-datanode-on-failure.replication to 0;
[2018-08-23T09:40:10.144892] multipleLinearRandomRetry = null;
[2018-08-23T09:40:10.168901] rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcProtobufRequest, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@4379f1a4;
[2018-08-23T09:40:10.173738] getting client out of cache: org.apache.hadoop.ipc.Client@5b52b909;
[2018-08-23T09:40:10.238466] finalize() called.;
[2018-08-23T09:40:10.238705] finalize() called.;
[2018-08-23T09:40:10.526021] Both short-circuit local reads and UNIX domain socket are disabled.;
[2018-08-23T09:40:10.532037] DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection;
[2018-08-23T09:40:10.555581] The ping interval is 60000 ms.;
[2018-08-23T09:40:10.556336] Connecting to /x.x.x.x:25000;
[2018-08-23T09:40:10.572385] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:10.613823] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:10.723447] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:10.723514] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:10.730466] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:10.758296] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:10.759031] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:10.759531] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:14.446925] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:14.447824] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:14.447982] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:14.448001] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:14.449087] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:14.455070] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:14.455190] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:14.455476] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:17.206928] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:17.207978] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:17.208115] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:17.208133] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:17.208877] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:17.214382] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:17.214536] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:17.214845] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:17.535313] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:17.536419] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:17.536615] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:17.536634] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:17.537679] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:17.546375] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:17.546587] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:17.546963] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:21.891382] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:21.892476] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:21.909267] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:21.909306] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:21.910125] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:21.915324] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:21.915540] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:21.915889] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:26.095299] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:26.096234] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:26.096391] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:26.096411] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:26.097116] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:26.101823] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:26.101960] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:26.102283] Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.102365] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:java.io.IOException: Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.103844] closing ipc connection to /x.x.x.x:25000: Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.103994] IPC Client (1766482975) connection to /x.x.x.x:25000 from syslog@HADOOP.COM: closed;
Thank you.
On Tue, Aug 21, 2018 at 6:48 PM Nagy, Gábor <gabor.nagy@oneidentity.com> wrote:
Hi,
Can you tell me which version of the Hadoop lib you use with syslog-ng, please?
Can you share your syslog-ng configuration, mainly the hdfs part, please?
Regards, Gabor
Hi, Gabor

I am able to send the syslog to HDFS after adding the hdfs-resources() option. Thanks a lot for your help.

Thank you.

Regards, KK
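Putting the thread together, the working destination presumably ended up as the earlier config plus hdfs-resources(); a sketch, with the same placeholder paths as above:

destination d_hdfs {
    hdfs(
        client_lib_dir("/opt/hadoop/lib")
        hdfs_uri("hdfs://x.x.x.x:25000")
        kerberos-keytab-file("/etc/syslog.keytab")
        kerberos-principal("syslog@HADOOP.COM")
        hdfs_file("/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log")
        template(t_cgnat)
        hdfs-append-enabled(true)
        hdfs-resources("/path/to/xmlconfig/core-site.xml")
    );
};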
Hi,

I'm able to send the syslog to HDFS, but it does not seem to be real time. The file in HDFS does not grow until I stop the syslog-ng process; then all the messages go into HDFS and we can see the file size increase. Any idea how frequently syslog-ng sends the syslog to HDFS? If the incoming syslog volume is high, is any fine-tuning required?

syslog-ng also writes a log file on the local server, and I can see that file grow by 40MB in 20 minutes. However, the file in HDFS stays the same size until I stop the syslog-ng process; once I stop the process, the log is written into HDFS.

These are the log statistics:

Log statistics; processed='src.internal(s_sys#1)=4', stamp='src.internal(s_sys#1)=1535352937', processed='destination(d_mlal)=0', processed='center(received)=30694', processed='destination(d_spol)=0', processed='destination(d_mesg)=4', processed='destination(d_mail)=0', processed='destination(d_auth)=0', processed='destination(d_cron)=0', processed='destination(d_hdfs)=30690', processed='center(queued)=61384', queued='global(scratch_buffers_count)=0', processed='source(remote_log)=30690', dropped='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://CN03:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=0', processed='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://CN03:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=30690', queued='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://CN03:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=0', processed='global(payload_reallocs)=0', processed='global(sdata_updates)=0', queued='global(scratch_buffers_bytes)=0', processed='src.journald(s_sys#0,journal)=0', stamp='src.journald(s_sys#0,journal)=0', processed='destination(d_boot)=0', processed='destination(d_kern)=0', processed='source(s_sys)=4', processed='destination(remote)=30690', processed='global(internal_queue_length)=0', processed='global(msg_clones)=0'

Thank you.

Regards, KK
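For what it's worth, HDFS itself usually makes appended data visible only when a block fills up or the file is closed, so the delay may come from HDFS rather than from how often syslog-ng sends. If your syslog-ng version supports it for the hdfs() destination, closing idle files with time-reap() might help; a hedged sketch, to be verified against the 3.14 admin guide:

destination d_hdfs {
    hdfs(
        ...
        time-reap(30)  # assumption: close the HDFS file after 30 seconds of inactivity
    );
};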
Hi, Gabor
I am able to send the syslog to HDFS after adding in the hdfs-resources option. Thanks a lot for your help.
Thank you.
Regards, KK
On Fri, Aug 24, 2018 at 10:29 PM Nagy, Gábor <gabor.nagy@oneidentity.com> wrote:
Hi,
As I was searching for answer what i saw was that "SaslException: No common protection layer between client and server;" can occur if the configuration differs on client-server side.
You should configure the "hadoop.rpc.protection" value in core-site.xml on the client side to the same one as on the server side. To use that xml with hdfs, you should put the filename in syslog-ng config in the hdfs destination block: destination d_hdfs { ... hdfs-resources("/path/to/xmlconfig/core-site.xml"); };
Regards, Gabor
On Thu, Aug 23, 2018 at 3:53 AM Lee Keng Ket <kengket@gmail.com> wrote:
Hi, Gabor
After I change the lib to executable as reminded by Janos, now there is different error. Any idea if this is Kerberos issue or still the Hadoop lib issue?
The Hadoop that I'm using is Huawei FusionInsight. Hadoop HDFS version should be 2.7.2.
This is the syslog-ng.conf for HDFS part: destination d_hdfs { hdfs(client_lib_dir("/opt/hadoop/lib") hdfs_uri("hdfs://x.x.x.x:25000") kerberos-keytab-file("/etc/syslog.keytab") kerberos-principal("syslog@HADOOP.COM") hdfs_file("/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log") template(t_cgnat) hdfs-append-enabled(true) ); };
There are the messages I get when starting in debug:
[2018-08-23T09:40:09.210168] Running application hooks; hook='1' [2018-08-23T09:40:09.210192] Running application hooks; hook='3' [2018-08-23T09:40:09.210501] syslog-ng starting up; version='3.14.1' [2018-08-23T09:40:09.213049] Worker thread started; driver='d_hdfs#0' [2018-08-23T09:40:09.214922] Opening hdfs; [2018-08-23T09:40:09.548286] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)]); [2018-08-23T09:40:09.561345] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)]); [2018-08-23T09:40:09.561608] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups]); [2018-08-23T09:40:09.562485] UgiMetrics, User and group related metrics; [2018-08-23T09:40:09.604037] Failed to detect a valid hadoop home directory; [2018-08-23T09:40:09.687386] setsid exited with exit code 0; [2018-08-23T09:40:09.715804] Creating new Groups object; [2018-08-23T09:40:09.717743] Trying to load the custom-built native-hadoop library...; [2018-08-23T09:40:09.718065] Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path; [2018-08-23T09:40:09.718095] java.library.path=//usr/lib64/syslog-ng; [2018-08-23T09:40:09.718117] Unable to load native-hadoop library for your platform... 
using builtin-java classes where applicable;
[2018-08-23T09:40:09.718418] Falling back to shell based;
[2018-08-23T09:40:09.718997] Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping;
[2018-08-23T09:40:09.753615] Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback\; cacheTimeout=300000\; warningDeltaMs=5000;
[2018-08-23T09:40:09.905060] hadoop login;
[2018-08-23T09:40:09.906493] hadoop login commit;
[2018-08-23T09:40:09.907300] using kerberos user:syslog@HADOOP.COM;
[2018-08-23T09:40:09.907333] Using user: "syslog@HADOOP.COM" with name syslog@HADOOP.COM;
[2018-08-23T09:40:09.907592] User entry: "syslog@HADOOP.COM";
[2018-08-23T09:40:09.908005] Login successful for user syslog@HADOOP.COM using keytab file syslog.keytab;
[2018-08-23T09:40:10.104386] dfs.client.use.legacy.blockreader.local = false;
[2018-08-23T09:40:10.104436] dfs.client.read.shortcircuit = false;
[2018-08-23T09:40:10.104450] dfs.client.domain.socket.data.traffic = false;
[2018-08-23T09:40:10.104461] dfs.domain.socket.path = ;
[2018-08-23T09:40:10.121280] Sets dfs.client.block.write.replace-datanode-on-failure.replication to 0;
[2018-08-23T09:40:10.144892] multipleLinearRandomRetry = null;
[2018-08-23T09:40:10.168901] rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcProtobufRequest, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@4379f1a4;
[2018-08-23T09:40:10.173738] getting client out of cache: org.apache.hadoop.ipc.Client@5b52b909;
[2018-08-23T09:40:10.238466] finalize() called.;
[2018-08-23T09:40:10.238705] finalize() called.;
[2018-08-23T09:40:10.526021] Both short-circuit local reads and UNIX domain socket are disabled.;
[2018-08-23T09:40:10.532037] DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection;
[2018-08-23T09:40:10.555581] The ping interval is 60000 ms.;
[2018-08-23T09:40:10.556336] Connecting to /x.x.x.x:25000;
[2018-08-23T09:40:10.572385] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:10.613823] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:10.723447] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:10.723514] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:10.730466] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:10.758296] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:10.759031] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:10.759531] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:14.446925] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:14.447824] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:14.447982] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:14.448001] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:14.449087] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:14.455070] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:14.455190] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:14.455476] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:17.206928] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:17.207978] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:17.208115] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:17.208133] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:17.208877] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:17.214382] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:17.214536] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:17.214845] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:17.535313] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:17.536419] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:17.536615] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:17.536634] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:17.537679] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:17.546375] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:17.546587] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:17.546963] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:21.891382] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:21.892476] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:21.909267] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:21.909306] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:21.910125] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:21.915324] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:21.915540] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:21.915889] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:26.095299] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:26.096234] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:26.096391] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:26.096411] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:26.097116] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:26.101823] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:26.101960] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:26.102283] Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.102365] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:java.io.IOException: Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.103844] closing ipc connection to /x.x.x.x:25000: Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.103994] IPC Client (1766482975) connection to /x.x.x.x:25000 from syslog@HADOOP.COM: closed;
Thank you.
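A note on reading this log: the Kerberos login itself succeeds ("Login successful for user syslog@HADOOP.COM"), so the keytab is not the problem. The failure is the repeated "SaslException: No common protection layer between client and server", which indicates a SASL quality-of-protection (QOP) mismatch between the HDFS client and the NameNode rather than an authentication failure; the hadoop.rpc.protection fix discussed later in the thread addresses exactly this. One quick way to see what the cluster expects, assuming the Hadoop client configuration lives in the usual /etc/hadoop/conf directory (an assumption; adjust the path to your install):

    grep -A 1 'hadoop.rpc.protection' /etc/hadoop/conf/core-site.xml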
Hi,

As far as I know, we have no control over flushing the file in HDFS. I've checked our admin guide; let me quote it:
You cannot set when log messages are flushed. Hadoop performs this action automatically, depending on its configured block size, and the amount of data received. There is no way for the syslog-ng OSE application to influence when the messages are actually written to disk. This means that syslog-ng OSE cannot guarantee that a message sent to HDFS is actually written to disk.
You can reload syslog-ng periodically. During a reload the files in use are closed, and HDFS should flush them to the datanodes. Do you use append in your HDFS environment?

Regards, Gabor

On Mon, Aug 27, 2018 at 9:13 AM Lee Keng Ket <kengket@gmail.com> wrote:
Hi,
I'm able to send the syslog to HDFS, but it seems it is not real time. The file in HDFS does not grow until I stop the syslog-ng process; then all the messages go into HDFS and we can see the file size increase. Any idea how frequently syslog-ng sends the syslog to HDFS? If the volume of incoming syslog is high, is any fine-tuning required? syslog-ng also writes a log file on the local server, and I can see that file grow by 40 MB in 20 minutes. However, the file in HDFS stays the same size until I stop the syslog-ng process. Once I stop the process, the log is written into HDFS.
This is log stats: Log statistics; processed='src.internal(s_sys#1)=4', stamp='src.internal(s_sys#1)=1535352937', processed='destination(d_mlal)=0', processed='center(received)=30694', processed='destination(d_spol)=0', processed='destination(d_mesg)=4', processed='destination(d_mail)=0', processed='destination(d_auth)=0', processed='destination(d_cron)=0', processed='destination(d_hdfs)=30690', processed='center(queued)=61384', queued='global(scratch_buffers_count)=0', processed='source(remote_log)=30690', dropped='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://CN03:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=0', processed='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://CN03:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=30690', queued='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://CN03:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=0', processed='global(payload_reallocs)=0', processed='global(sdata_updates)=0', queued='global(scratch_buffers_bytes)=0', processed='src.journald(s_sys#0,journal)=0', stamp='src.journald(s_sys#0,journal)=0', processed='destination(d_boot)=0', processed='destination(d_kern)=0', processed='source(s_sys)=4', processed='destination(remote)=30690', processed='global(internal_queue_length)=0', processed='global(msg_clones)=0'
Thank you.
Regards, KK
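A note on the statistics above: queued='dst.java(d_hdfs#0,...)=0' means syslog-ng has already handed every processed message over to the Java HDFS client, so the buffering observed here happens inside the Hadoop client library, not in syslog-ng itself. If the data has to reach HDFS within a bounded delay, Gabor's reload suggestion can be automated; a minimal sketch, assuming syslog-ng runs under systemd and that closing and reopening the HDFS file on every reload is acceptable for your setup:

    # hypothetical /etc/cron.d/syslog-ng-hdfs-flush:
    # reload syslog-ng every 15 minutes so its HDFS files are closed
    # and the buffered data is flushed to the datanodes
    */15 * * * * root systemctl reload syslog-ng

On a non-systemd host, sending SIGHUP to the syslog-ng process (kill -HUP "$(pidof syslog-ng)") triggers the same configuration reload.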
On Mon, Aug 27, 2018 at 10:41 AM Lee Keng Ket <kengket@gmail.com> wrote:
Hi, Gabor
I am able to send the syslog to HDFS after adding the hdfs-resources option. Thanks a lot for your help.
Thank you.
Regards, KK
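Once the destination connects, it may also be worth verifying on the cluster side that messages really land in the expected file; a quick check, assuming the same principal and keytab as in the destination above (the file name is shown here only as a pattern, not a literal path):

    kinit -kt /etc/syslog.keytab syslog@HADOOP.COM
    hdfs dfs -ls /user/syslog/
    hdfs dfs -tail /user/syslog/<host>-<day>-<month>-<year>.log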
On Fri, Aug 24, 2018 at 10:29 PM Nagy, Gábor <gabor.nagy@oneidentity.com> wrote:
Hi,
While searching for an answer, what I saw was that "SaslException: No common protection layer between client and server" can occur if the configuration differs between the client and the server side.
You should set the "hadoop.rpc.protection" value in core-site.xml on the client side to the same value as on the server side. To make hdfs use that XML, put the file name in the syslog-ng config, inside the hdfs destination block:

destination d_hdfs {
    ...
    hdfs-resources("/path/to/xmlconfig/core-site.xml");
};
Regards, Gabor
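For reference, a minimal client-side core-site.xml carrying just that property might look like the sketch below. The value "privacy" is only an example; the possible settings are authentication, integrity and privacy, and the one that works is whichever the cluster itself is configured with:

    <configuration>
      <property>
        <name>hadoop.rpc.protection</name>
        <value>privacy</value>
      </property>
    </configuration>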
Hi, Gabor

Yes, I see this note as well, so I'm monitoring how long it takes to write to the disk. I'm using append in HDFS, but I'm not really sure what the difference between append and no append is.

Thank you.

Regards, KK
Hi,
As I know we have no control over flushing the file in HDFS. I've checked our admin guide, let me quote it:
You cannot set when log messages are flushed. Hadoop performs this action automatically, depending on its configured block size, and the amount of data received. There is no way for the syslog-ng OSE application to influence when the messages are actually written to disk. This means that syslog-ng OSE cannot guarantee that a message sent to HDFS is actually written to disk.
You can reload syslog-ng periodically. During reload the used files are closed and HDFS should flushes them to the datanodes. Do you use append in your HDFS environment?
Regards, Gabor
On Mon, Aug 27, 2018 at 9:13 AM Lee Keng Ket <kengket@gmail.com> wrote:
Hi,
I'm able to send the syslog to HDFS, but it seems not real time. The file in HDFS does not increase size, until I stop the syslog-ng process, then all the messages go into HDFS and we can see the file size increase. Any idea how frequent the syslog-ng send the syslog to HDFS? If the syslog-ng incoming syslog is a lot, is there any fine-tuning required? The syslog-ng also write log file in local server, I can see the file size increase by 40MB in 20min. However, the file in HDFS remains same size until I stop the syslog-ng process. Once I stop the process, the log is written into the HDFS.
This is log stats: Log statistics; processed='src.internal(s_sys#1)=4', stamp='src.internal(s_sys#1)=1535352937', processed='destination(d_mlal)=0', processed='center(received)=30694', processed='destination(d_spol)=0', processed='destination(d_mesg)=4', processed='destination(d_mail)=0', processed='destination(d_auth)=0', processed='destination(d_cron)=0', processed='destination(d_hdfs)=30690', processed='center(queued)=61384', queued='global(scratch_buffers_count)=0', processed='source(remote_log)=30690', dropped='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://CN03:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=0', processed='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://CN03:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=30690', queued='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://CN03:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=0', processed='global(payload_reallocs)=0', processed='global(sdata_updates)=0', queued='global(scratch_buffers_bytes)=0', processed='src.journald(s_sys#0,journal)=0', stamp='src.journald(s_sys#0,journal)=0', processed='destination(d_boot)=0', processed='destination(d_kern)=0', processed='source(s_sys)=4', processed='destination(remote)=30690', processed='global(internal_queue_length)=0', processed='global(msg_clones)=0'
Thank you.
Regards, KK
On Mon, Aug 27, 2018 at 10:41 AM Lee Keng Ket <kengket@gmail.com> wrote:
Hi, Gabor
I am able to send the syslog to HDFS after adding in the hdfs-resources option. Thanks a lot for your help.
Thank you.
Regards, KK
On Fri, Aug 24, 2018 at 10:29 PM Nagy, Gábor <gabor.nagy@oneidentity.com> wrote:
Hi,
As I was searching for answer what i saw was that "SaslException: No common protection layer between client and server;" can occur if the configuration differs on client-server side.
You should configure the "hadoop.rpc.protection" value in core-site.xml on the client side to the same one as on the server side. To use that xml with hdfs, you should put the filename in syslog-ng config in the hdfs destination block: destination d_hdfs { ... hdfs-resources("/path/to/xmlconfig/core-site.xml"); };
Regards, Gabor
On Thu, Aug 23, 2018 at 3:53 AM Lee Keng Ket <kengket@gmail.com> wrote:
Hi, Gabor
After I change the lib to executable as reminded by Janos, now there is different error. Any idea if this is Kerberos issue or still the Hadoop lib issue?
The Hadoop that I'm using is Huawei FusionInsight. Hadoop HDFS version should be 2.7.2.
This is the syslog-ng.conf for HDFS part: destination d_hdfs { hdfs(client_lib_dir("/opt/hadoop/lib") hdfs_uri("hdfs://x.x.x.x:25000") kerberos-keytab-file("/etc/syslog.keytab") kerberos-principal("syslog@HADOOP.COM") hdfs_file("/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log") template(t_cgnat) hdfs-append-enabled(true) ); };
There are the messages I get when starting in debug:
[2018-08-23T09:40:09.210168] Running application hooks; hook='1' [2018-08-23T09:40:09.210192] Running application hooks; hook='3' [2018-08-23T09:40:09.210501] syslog-ng starting up; version='3.14.1' [2018-08-23T09:40:09.213049] Worker thread started; driver='d_hdfs#0' [2018-08-23T09:40:09.214922] Opening hdfs; [2018-08-23T09:40:09.548286] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)]); [2018-08-23T09:40:09.561345] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)]); [2018-08-23T09:40:09.561608] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups]); [2018-08-23T09:40:09.562485] UgiMetrics, User and group related metrics; [2018-08-23T09:40:09.604037] Failed to detect a valid hadoop home directory; [2018-08-23T09:40:09.687386] setsid exited with exit code 0; [2018-08-23T09:40:09.715804] Creating new Groups object; [2018-08-23T09:40:09.717743] Trying to load the custom-built native-hadoop library...; [2018-08-23T09:40:09.718065] Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path; [2018-08-23T09:40:09.718095] java.library.path=//usr/lib64/syslog-ng; [2018-08-23T09:40:09.718117] Unable to load native-hadoop library for your platform... 
using builtin-java classes where applicable; [2018-08-23T09:40:09.718418] Falling back to shell based; [2018-08-23T09:40:09.718997] Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; [2018-08-23T09:40:09.753615] Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback\; cacheTimeout=300000\; warningDeltaMs=5000; [2018-08-23T09:40:09.905060] hadoop login; [2018-08-23T09:40:09.906493] hadoop login commit; [2018-08-23T09:40:09.907300] using kerberos user:syslog@HADOOP.COM; [2018-08-23T09:40:09.907333] Using user: "syslog@HADOOP.COM" with name syslog@HADOOP.COM; [2018-08-23T09:40:09.907592] User entry: "syslog@HADOOP.COM"; [2018-08-23T09:40:09.908005] Login successful for user syslog@HADOOP.COM using keytab file syslog.keytab; [2018-08-23T09:40:10.104386] dfs.client.use.legacy.blockreader.local = false; [2018-08-23T09:40:10.104436] dfs.client.read.shortcircuit = false; [2018-08-23T09:40:10.104450] dfs.client.domain.socket.data.traffic = false; [2018-08-23T09:40:10.104461] dfs.domain.socket.path = ; [2018-08-23T09:40:10.121280] Sets dfs.client.block.write.replace-datanode-on-failure.replication to 0; [2018-08-23T09:40:10.144892] multipleLinearRandomRetry = null; [2018-08-23T09:40:10.168901] rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcProtobufRequest, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@4379f1a4 ; [2018-08-23T09:40:10.173738] getting client out of cache: org.apache.hadoop.ipc.Client@5b52b909; [2018-08-23T09:40:10.238466] finalize() called.; [2018-08-23T09:40:10.238705] finalize() called.; [2018-08-23T09:40:10.526021] Both short-circuit local reads and UNIX domain socket are disabled.; [2018-08-23T09:40:10.532037] DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection; [2018-08-23T09:40:10.555581] The ping interval is 60000 ms.; [2018-08-23T09:40:10.556336] Connecting to /x.x.x.x:25000; [2018-08-23T09:40:10.572385] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771); [2018-08-23T09:40:10.613823] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal); [2018-08-23T09:40:10.723447] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/ hadoop.hadoop.com@HADOOP.COM; [2018-08-23T09:40:10.723514] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com; [2018-08-23T09:40:10.730466] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB; [2018-08-23T09:40:10.758296] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server; [2018-08-23T09:40:10.759031] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694); [2018-08-23T09:40:10.759531] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server; [2018-08-23T09:40:14.446925] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771); [2018-08-23T09:40:14.447824] Get kerberos info 
proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal); [2018-08-23T09:40:14.447982] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/ hadoop.hadoop.com@HADOOP.COM; [2018-08-23T09:40:14.448001] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com; [2018-08-23T09:40:14.449087] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB; [2018-08-23T09:40:14.455070] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server; [2018-08-23T09:40:14.455190] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694); [2018-08-23T09:40:14.455476] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server; [2018-08-23T09:40:17.206928] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771); [2018-08-23T09:40:17.207978] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal); [2018-08-23T09:40:17.208115] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/ hadoop.hadoop.com@HADOOP.COM; [2018-08-23T09:40:17.208133] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com; [2018-08-23T09:40:17.208877] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB; [2018-08-23T09:40:17.214382] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server; [2018-08-23T09:40:17.214536] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694); [2018-08-23T09:40:17.214845] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server; [2018-08-23T09:40:17.535313] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771); [2018-08-23T09:40:17.536419] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal); [2018-08-23T09:40:17.536615] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/ hadoop.hadoop.com@HADOOP.COM; [2018-08-23T09:40:17.536634] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com; [2018-08-23T09:40:17.537679] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB; [2018-08-23T09:40:17.546375] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server; [2018-08-23T09:40:17.546587] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) 
from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694); [2018-08-23T09:40:17.546963] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server; [2018-08-23T09:40:21.891382] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771); [2018-08-23T09:40:21.892476] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal); [2018-08-23T09:40:21.909267] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/ hadoop.hadoop.com@HADOOP.COM; [2018-08-23T09:40:21.909306] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com; [2018-08-23T09:40:21.910125] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB; [2018-08-23T09:40:21.915324] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server; [2018-08-23T09:40:21.915540] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694); [2018-08-23T09:40:21.915889] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server; [2018-08-23T09:40:26.095299] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771); [2018-08-23T09:40:26.096234] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal); [2018-08-23T09:40:26.096391] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/ hadoop.hadoop.com@HADOOP.COM; [2018-08-23T09:40:26.096411] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com; [2018-08-23T09:40:26.097116] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB; [2018-08-23T09:40:26.101823] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server; [2018-08-23T09:40:26.101960] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694); [2018-08-23T09:40:26.102283] Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000; [2018-08-23T09:40:26.102365] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:java.io.IOException: Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000; [2018-08-23T09:40:26.103844] closing ipc connection to /x.x.x.x:25000: Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000; [2018-08-23T09:40:26.103994] IPC Client (1766482975) connection to /x.x.x.x:25000 from syslog@HADOOP.COM: closed;
Thank you.
On Tue, Aug 21, 2018 at 6:48 PM Nagy, Gábor < gabor.nagy@oneidentity.com> wrote:
Hi,
Can you tell ne what is the version of the hadoop lib you use with syslog-ng, please?
Can you share your syslog-ng configuration, mainly the hdfs part, please?
Regards, Gabor
On Tue, Aug 21, 2018 at 4:15 AM Lee Keng Ket <kengket@gmail.com> wrote:
> Hi, Gabor > > I have run it, seems like it stops at the HDFS side. > > [2018-08-21T10:07:51.212015] Worker thread started; driver='d_hdfs#0' > [2018-08-21T10:07:51.212499] Running application hooks; hook='1' > [2018-08-21T10:07:51.212516] Running application hooks; hook='3' > [2018-08-21T10:07:51.212595] syslog-ng starting up; version='3.14.1' > [2018-08-21T10:07:51.214113] Opening hdfs; > [2018-08-21T10:08:01.215622] Opening hdfs; > [2018-08-21T10:08:11.216050] Opening hdfs; > [2018-08-21T10:08:21.226340] Opening hdfs; > [2018-08-21T10:08:31.236589] Opening hdfs; > [2018-08-21T10:08:41.240623] Opening hdfs; > [2018-08-21T10:08:51.250879] Opening hdfs; > [2018-08-21T10:09:01.261172] Opening hdfs; > [2018-08-21T10:09:11.271410] Opening hdfs; > [2018-08-21T10:09:21.281685] Opening hdfs; > [2018-08-21T10:09:31.290765] Opening hdfs; > [2018-08-21T10:09:41.301098] Opening hdfs; > [2018-08-21T10:09:51.311362] Opening hdfs; > [2018-08-21T10:10:01.321152] Opening hdfs; > [2018-08-21T10:10:11.321818] Opening hdfs; > [2018-08-21T10:10:21.330114] Opening hdfs; > [2018-08-21T10:10:31.340413] Opening hdfs; > [2018-08-21T10:10:41.350654] Opening hdfs; > [2018-08-21T10:10:51.354016] Opening hdfs; > [2018-08-21T10:11:01.364267] Opening hdfs; > [2018-08-21T10:11:11.374516] Opening hdfs; > [2018-08-21T10:11:21.384761] Opening hdfs; > [2018-08-21T10:11:31.395017] Opening hdfs; > [2018-08-21T10:11:41.402256] Opening hdfs; > [2018-08-21T10:11:51.404097] Opening hdfs; > ^C[2018-08-21T10:11:59.672252] syslog-ng shutting down; > version='3.14.1' > Exception in thread "" java.lang.NoClassDefFoundError: > org/apache/hadoop/conf/Configuration > at > org.syslog_ng.hdfs.HdfsDestination.open(HdfsDestination.java:92) > at > org.syslog_ng.LogDestination.openProxy(LogDestination.java:65) > [2018-08-21T10:11:59.774895] Worker thread finished; > driver='d_hdfs#0' > [2018-08-21T10:11:59.775384] Closing log transport fd; fd='13' > [2018-08-21T10:11:59.775508] Deinitialize hdfs destination; > [2018-08-21T10:11:59.776534] Java machine free; > [2018-08-21T10:11:59.778421] Running application hooks; hook='4' > > Any idea what to be checked further? > > Thank you. > > On Fri, Aug 17, 2018 at 4:45 PM Nagy, Gábor < > gabor.nagy@oneidentity.com> wrote: > >> Hello! >> >> In the statistics it can be seen that the log message is not sent >> to the HDFS server: >> dropped='dst.java(d_hdfs#0 java_dst hdfs hdfs://x.x.x.x:25000 >> /user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=0' >> processed='dst.java(d_hdfs#0 java_dst hdfs hdfs://x.x.x.x:25000 >> /user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=1' >> queued='dst.java(d_hdfs#0 java_dst hdfs hdfs://x.x.x.x:25000 >> /user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=1' >> >> Well, generally on write error there should be an exception that >> results in an error message. >> >> You should try debugging it either in running syslog-ng in >> foreground (-F option), forwarding internal logs to stderr (-e) and with >> debug mode (-dv) on. >> Or in service mode use the internal() source in your config and >> connect it to a destination (e.g. file()) which you prefer. 
>> >> You could turn on debug messages on java side too using >> jvm_options() in syslog-ng config and configuring the log4j logging >> service, e.g.: >> options { >> >> jvm_options("-Dlog4j.configuration=file:/etc/hadoop/log4j.properties >> -Dlog4j.debug=true"); >> }; >> >> Regards, >> Gabor >> >> On Fri, Aug 17, 2018 at 10:34 AM Czanik, Péter < >> peter.czanik@balabit.com> wrote: >> >>> Hi, >>> >>> As https://copr.fedorainfracloud.org/coprs/czanik/syslog-ng314/ >>> also writes: "Java is enabled, but JAR dependencies are not provided in >>> package, except for Elasticsearch http mode." The >>> syslog-ng-java-deps.noarch contains build time dependencies. Probably I >>> should rename the package to syslog-ng-java-build-deps... >>> >>> Check the documentation at >>> https://www.syslog-ng.com/technical-documents/doc/syslog-ng-open-source-edit... >>> on how to download and configure HDFS related JAR dependencies. >>> >>> Bye, >>> >>> Peter Czanik (CzP) <peter.czanik@balabit.com> >>> Balabit / syslog-ng upstream >>> https://syslog-ng.com/community/ >>> https://twitter.com/PCzanik >>> >>> On Fri, Aug 17, 2018 at 10:22 AM, Lee Keng Ket <kengket@gmail.com> >>> wrote: >>> >>>> Hi, >>>> >>>> I'm trying to connect syslog-ng 3.14.1 to HDFS to store the >>>> syslog messages. The syslog-ng can start without error, and it's able to >>>> write into local file. However, the log is not written to the HDFS. As >>>> there is no single error, I'm not sure how I should troubleshoot on this. >>>> >>>> I have installed the syslog-ng from this repo, >>>> https://copr.fedorainfracloud.org/coprs/czanik/syslog-ng314/repo/epel-7/czan... >>>> >>>> Installed Packages >>>> syslog-ng.x86_64 >>>> 3.14.1-4.el7.centos >>>> @czanik-syslog-ng314 >>>> syslog-ng-java.x86_64 >>>> 3.14.1-4.el7.centos >>>> @czanik-syslog-ng314 >>>> syslog-ng-java-deps.noarch >>>> 1.0-2 >>>> @czanik-syslog-ng314 >>>> >>>> This is the message from /var/log/message: >>>> Log statistics; processed='src.internal(s_sys#0)=1', >>>> stamp='src.internal(s_sys#0)=1534491834', >>>> processed='destination(d_spol)=0', processed='destination(d_mlal)=0', >>>> processed='center(received)=2', processed='destination(d_mesg)=1', >>>> processed='destination(d_mail)=0', processed='destination(d_auth)=0', >>>> processed='destination(d_cron)=0', processed='destination(d_hdfs)=1', >>>> processed='center(queued)=3', queued='global(scratch_buffers_count)=0', >>>> processed='source(remote_log)=1', >>>> dropped='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://x.x.x.x:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=0', >>>> processed='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://x.x.x.x:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=1', >>>> queued='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://x.x.x.x:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=1', >>>> processed='global(payload_reallocs)=0', >>>> processed='src.journald(journal)=0', stamp='src.journald(journal)=0', >>>> processed='global(sdata_updates)=0', >>>> queued='global(scratch_buffers_bytes)=0', >>>> processed='destination(d_boot)=0', processed='destination(d_kern)=0', >>>> processed='source(s_sys)=1', processed='destination(remote)=1', >>>> processed='global(internal_queue_length)=0', >>>> processed='global(msg_clones)=0' >>>> >>>> Anyone has any idea how should I proceed the troubleshooting? 
I asked about append to find out whether you will write to the same file again.

On Mon, Aug 27, 2018 at 10:02 AM Lee Keng Ket <kengket@gmail.com> wrote:
Hi, Gabor
Yes, I see this note as well, so I'm monitoring how long it takes to write to the disk.
I'm using append in hdfs, but I'm not really sure what the difference is between append and no append.
Thank you.
Regards, KK
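For reference, the option being discussed is the one from the destination config quoted later in this thread. The comments below reflect my reading of the syslog-ng OSE admin guide and should be verified against your version:

    destination d_hdfs {
        hdfs(
            ...
            # true: syslog-ng reopens and appends to the existing file
            #       (requires append support to be enabled on the HDFS side)
            # false: syslog-ng never reopens a file; instead it creates a new
            #        file with a unique identifier appended to the name
            hdfs-append-enabled(true)
        );
    };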
On Mon, 27 Aug 2018, 15:41 Nagy, Gábor, <gabor.nagy@oneidentity.com> wrote:
Hi,
As far as I know, we have no control over flushing the file in HDFS. I've checked our admin guide; let me quote it:
You cannot set when log messages are flushed. Hadoop performs this action automatically, depending on its configured block size, and the amount of data received. There is no way for the syslog-ng OSE application to influence when the messages are actually written to disk. This means that syslog-ng OSE cannot guarantee that a message sent to HDFS is actually written to disk.
You can reload syslog-ng periodically. During a reload the open files are closed, and HDFS should flush them to the datanodes. Do you use append in your HDFS environment?
Regards, Gabor
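A minimal sketch of the periodic-reload workaround, assuming a systemd-managed syslog-ng as on CentOS 7; the 15-minute interval and the cron file name are arbitrary examples:

    # /etc/cron.d/syslog-ng-hdfs-flush
    # A reload closes the HDFS files, which lets HDFS flush them to the datanodes
    */15 * * * * root /usr/bin/systemctl reload syslog-ng

    # On systems without systemd, sending SIGHUP triggers the same reload:
    #   kill -HUP "$(pidof syslog-ng)"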
On Mon, Aug 27, 2018 at 9:13 AM Lee Keng Ket <kengket@gmail.com> wrote:
Hi,
I'm able to send the syslog to HDFS, but it seems not to be real time. The file in HDFS does not increase in size until I stop the syslog-ng process; then all the messages go into HDFS and we can see the file size increase. Any idea how frequently syslog-ng sends the syslog to HDFS? If there is a lot of incoming syslog, is any fine-tuning required? syslog-ng also writes a log file on the local server, and I can see that file grow by 40MB in 20 minutes. However, the file in HDFS remains the same size until I stop the syslog-ng process. Once I stop the process, the log is written into HDFS.
This is log stats: Log statistics; processed='src.internal(s_sys#1)=4', stamp='src.internal(s_sys#1)=1535352937', processed='destination(d_mlal)=0', processed='center(received)=30694', processed='destination(d_spol)=0', processed='destination(d_mesg)=4', processed='destination(d_mail)=0', processed='destination(d_auth)=0', processed='destination(d_cron)=0', processed='destination(d_hdfs)=30690', processed='center(queued)=61384', queued='global(scratch_buffers_count)=0', processed='source(remote_log)=30690', dropped='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://CN03:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=0', processed='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://CN03:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=30690', queued='dst.java(d_hdfs#0,java_dst,hdfs,hdfs://CN03:25000,/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log)=0', processed='global(payload_reallocs)=0', processed='global(sdata_updates)=0', queued='global(scratch_buffers_bytes)=0', processed='src.journald(s_sys#0,journal)=0', stamp='src.journald(s_sys#0,journal)=0', processed='destination(d_boot)=0', processed='destination(d_kern)=0', processed='source(s_sys)=4', processed='destination(remote)=30690', processed='global(internal_queue_length)=0', processed='global(msg_clones)=0'
Thank you.
Regards, KK
On Mon, Aug 27, 2018 at 10:41 AM Lee Keng Ket <kengket@gmail.com> wrote:
Hi, Gabor
I am able to send the syslog to HDFS after adding the hdfs-resources() option. Thanks a lot for your help.
Thank you.
Regards, KK
On Fri, Aug 24, 2018 at 10:29 PM Nagy, Gábor <gabor.nagy@oneidentity.com> wrote:
Hi,
While searching for an answer, I saw that "SaslException: No common protection layer between client and server" can occur if the configuration differs between the client and the server side.
You should configure the "hadoop.rpc.protection" value in core-site.xml on the client side to the same value as on the server side. To make hdfs use that XML, put the filename in the syslog-ng config, inside the hdfs destination block:

destination d_hdfs {
    ...
    hdfs-resources("/path/to/xmlconfig/core-site.xml");
};
Regards, Gabor
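A sketch of the relevant core-site.xml entry; "privacy" is only an assumed example here, and the real value (authentication, integrity, or privacy) has to be copied from the server-side configuration:

    <configuration>
      <property>
        <name>hadoop.rpc.protection</name>
        <!-- must match the value configured on the NameNode -->
        <value>privacy</value>
      </property>
    </configuration>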
On Thu, Aug 23, 2018 at 3:53 AM Lee Keng Ket <kengket@gmail.com> wrote:
Hi, Gabor
After I changed the libs to executable, as Janos reminded me, I now get a different error. Any idea whether this is a Kerberos issue or still a Hadoop lib issue?
The Hadoop that I'm using is Huawei FusionInsight. Hadoop HDFS version should be 2.7.2.
This is the HDFS part of syslog-ng.conf:

destination d_hdfs {
    hdfs(client_lib_dir("/opt/hadoop/lib")
         hdfs_uri("hdfs://x.x.x.x:25000")
         kerberos-keytab-file("/etc/syslog.keytab")
         kerberos-principal("syslog@HADOOP.COM")
         hdfs_file("/user/syslog/$HOST-$DAY-$MONTH-$YEAR.log")
         template(t_cgnat)
         hdfs-append-enabled(true)
    );
};
These are the messages I get when starting in debug mode:
[2018-08-23T09:40:09.210168] Running application hooks; hook='1'
[2018-08-23T09:40:09.210192] Running application hooks; hook='3'
[2018-08-23T09:40:09.210501] syslog-ng starting up; version='3.14.1'
[2018-08-23T09:40:09.213049] Worker thread started; driver='d_hdfs#0'
[2018-08-23T09:40:09.214922] Opening hdfs;
[2018-08-23T09:40:09.548286] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)]);
[2018-08-23T09:40:09.561345] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)]);
[2018-08-23T09:40:09.561608] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups]);
[2018-08-23T09:40:09.562485] UgiMetrics, User and group related metrics;
[2018-08-23T09:40:09.604037] Failed to detect a valid hadoop home directory;
[2018-08-23T09:40:09.687386] setsid exited with exit code 0;
[2018-08-23T09:40:09.715804] Creating new Groups object;
[2018-08-23T09:40:09.717743] Trying to load the custom-built native-hadoop library...;
[2018-08-23T09:40:09.718065] Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path;
[2018-08-23T09:40:09.718095] java.library.path=//usr/lib64/syslog-ng;
[2018-08-23T09:40:09.718117] Unable to load native-hadoop library for your platform... using builtin-java classes where applicable;
[2018-08-23T09:40:09.718418] Falling back to shell based;
[2018-08-23T09:40:09.718997] Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping;
[2018-08-23T09:40:09.753615] Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback\; cacheTimeout=300000\; warningDeltaMs=5000;
[2018-08-23T09:40:09.905060] hadoop login;
[2018-08-23T09:40:09.906493] hadoop login commit;
[2018-08-23T09:40:09.907300] using kerberos user:syslog@HADOOP.COM;
[2018-08-23T09:40:09.907333] Using user: "syslog@HADOOP.COM" with name syslog@HADOOP.COM;
[2018-08-23T09:40:09.907592] User entry: "syslog@HADOOP.COM";
[2018-08-23T09:40:09.908005] Login successful for user syslog@HADOOP.COM using keytab file syslog.keytab;
[2018-08-23T09:40:10.104386] dfs.client.use.legacy.blockreader.local = false;
[2018-08-23T09:40:10.104436] dfs.client.read.shortcircuit = false;
[2018-08-23T09:40:10.104450] dfs.client.domain.socket.data.traffic = false;
[2018-08-23T09:40:10.104461] dfs.domain.socket.path = ;
[2018-08-23T09:40:10.121280] Sets dfs.client.block.write.replace-datanode-on-failure.replication to 0;
[2018-08-23T09:40:10.144892] multipleLinearRandomRetry = null;
[2018-08-23T09:40:10.168901] rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcProtobufRequest, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@4379f1a4;
[2018-08-23T09:40:10.173738] getting client out of cache: org.apache.hadoop.ipc.Client@5b52b909;
[2018-08-23T09:40:10.238466] finalize() called.;
[2018-08-23T09:40:10.238705] finalize() called.;
[2018-08-23T09:40:10.526021] Both short-circuit local reads and UNIX domain socket are disabled.;
[2018-08-23T09:40:10.532037] DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection;
[2018-08-23T09:40:10.555581] The ping interval is 60000 ms.;
[2018-08-23T09:40:10.556336] Connecting to /x.x.x.x:25000;
[2018-08-23T09:40:10.572385] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:10.613823] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:10.723447] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:10.723514] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:10.730466] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:10.758296] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:10.759031] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:10.759531] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:14.446925] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:14.447824] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:14.447982] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:14.448001] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:14.449087] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:14.455070] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:14.455190] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:14.455476] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:17.206928] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:17.207978] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:17.208115] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:17.208133] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:17.208877] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:17.214382] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:17.214536] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:17.214845] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:17.535313] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:17.536419] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:17.536615] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:17.536634] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:17.537679] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:17.546375] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:17.546587] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:17.546963] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:21.891382] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:21.892476] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:21.909267] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:21.909306] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:21.910125] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:21.915324] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:21.915540] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:21.915889] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:26.095299] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:26.096234] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:26.096391] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:26.096411] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:26.097116] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:26.101823] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:26.101960] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:26.102283] Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.102365] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:java.io.IOException: Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.103844] closing ipc connection to /x.x.x.x:25000: Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.103994] IPC Client (1766482975) connection to /x.x.x.x:25000 from syslog@HADOOP.COM: closed;
Thank you.
On Tue, Aug 21, 2018 at 6:48 PM Nagy, Gábor <gabor.nagy@oneidentity.com> wrote:
> Hi,
>
> Can you tell me what is the version of the hadoop lib you use with syslog-ng, please?
>
> Can you share your syslog-ng configuration, mainly the hdfs part, please?
>
> Regards,
> Gabor
Hi, Gabor

Yes, it will write to the same file until the next day, as the file name is date-based.

Thank you.

Regards, KK

On Mon, 27 Aug 2018, 16:09 Nagy, Gábor, <gabor.nagy@oneidentity.com> wrote:
I asked about append to find out whether you will write to the same file again.
Hi, Peter

I have put all the jar files from the Hadoop client into /opt/hadoop/lib and declared this in the conf file. This should solve the JAR dependencies, right? Please correct me if I'm wrong.

Thank you.

Regards, KK
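A sketch of what that setup can look like end to end. The source paths below follow the usual layout of a Hadoop client installation and are assumptions; client_lib_dir() matches the config quoted earlier in the thread:

    # Copy the Hadoop client JARs somewhere syslog-ng's JVM can load them
    cp /path/to/hadoop-client/share/hadoop/common/*.jar     /opt/hadoop/lib/
    cp /path/to/hadoop-client/share/hadoop/common/lib/*.jar /opt/hadoop/lib/
    cp /path/to/hadoop-client/share/hadoop/hdfs/*.jar       /opt/hadoop/lib/

    # Then point the destination at that directory:
    destination d_hdfs {
        hdfs(client_lib_dir("/opt/hadoop/lib") ... );
    };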
Hi,

Yes, that's right. Seeing syslog-ng-java-deps installed, I assumed that you had skipped that step.

Bye,

Peter Czanik (CzP) <peter.czanik@balabit.com>
Balabit / syslog-ng upstream
https://syslog-ng.com/community/
https://twitter.com/PCzanik
Hi Team,

We are capturing Unix events from native syslog clients on a syslog-ng server, and it's a legacy version, 3.0.4. How do we capture the actual time zone information of the syslog messages that the native syslog clients send to the syslog-ng server?

Regards,
Sathish
Hi Sathish!

Can you share some examples of what those Unix events look like, please? Capturing the time zone information depends on how the native syslog client sends the messages to syslog-ng, so can you also explain what those native syslog clients are, please? It may turn out that without modifying the configuration of those clients, little can be done on the server side.

Also, can you share details about your syslog-ng configuration, please?

Regards,
Gabor
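(A note on the mechanics: legacy BSD-syslog timestamps carry no zone information at all, so for such clients the server can only assume a zone. Below is a minimal sketch of that server-side assumption, using a placeholder zone name; recv-time-zone() tells syslog-ng which zone to interpret incoming timestamps in.)

    options {
        # BSD-syslog timestamps have no zone field; assume the clients' zone here.
        # "Asia/Kolkata" is only a placeholder -- substitute the clients' actual zone.
        recv-time-zone("Asia/Kolkata");
        keep-timestamp(yes);   # keep the client-sent timestamp rather than arrival time
    };

If the clients can instead be configured to send RFC 5424 (IETF syslog protocol) messages, the timestamp itself carries the UTC offset and no server-side assumption is needed.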
Hi,

Did you read the following blog post about Java troubleshooting? https://www.syslog-ng.com/community/b/blog/posts/troubleshooting-java-suppor...
Hi, Fabien

Yes, I have set LD_LIBRARY_PATH, and syslog-ng starts without any issue.

Thank you.
participants (6)

- Czanik, Péter
- Fabien Wernli
- Lee Keng Ket
- Nagy, Gábor
- Sathish Sundaravel
- SZIGETVÁRI János