These are the messages I get when starting syslog-ng in debug mode:
[2018-08-23T09:40:09.210168] Running application hooks; hook='1'
[2018-08-23T09:40:09.210192] Running application hooks; hook='3'
[2018-08-23T09:40:09.210501] syslog-ng starting up; version='3.14.1'
[2018-08-23T09:40:09.213049] Worker thread started; driver='d_hdfs#0'
[2018-08-23T09:40:09.214922] Opening hdfs;
[2018-08-23T09:40:09.548286] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)]);
[2018-08-23T09:40:09.561345] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)]);
[2018-08-23T09:40:09.561608] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups]);
[2018-08-23T09:40:09.562485] UgiMetrics, User and group related metrics;
[2018-08-23T09:40:09.604037] Failed to detect a valid hadoop home directory;
[2018-08-23T09:40:09.687386] setsid exited with exit code 0;
[2018-08-23T09:40:09.715804] Creating new Groups object;
[2018-08-23T09:40:09.717743] Trying to load the custom-built native-hadoop library...;
[2018-08-23T09:40:09.718065] Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path;
[2018-08-23T09:40:09.718095] java.library.path=//usr/lib64/syslog-ng;
[2018-08-23T09:40:09.718117] Unable to load native-hadoop library for your platform... using builtin-java classes where applicable;
[2018-08-23T09:40:09.718418] Falling back to shell based;
[2018-08-23T09:40:09.718997] Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping;
[2018-08-23T09:40:09.753615] Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback\; cacheTimeout=300000\; warningDeltaMs=5000;
[2018-08-23T09:40:09.905060] hadoop login;
[2018-08-23T09:40:09.906493] hadoop login commit;
[2018-08-23T09:40:09.907300] using kerberos user:syslog@HADOOP.COM;
[2018-08-23T09:40:09.907333] Using user: "syslog@HADOOP.COM" with name syslog@HADOOP.COM;
[2018-08-23T09:40:09.907592] User entry: "syslog@HADOOP.COM";
[2018-08-23T09:40:09.908005] Login successful for user syslog@HADOOP.COM using keytab file syslog.keytab;
[2018-08-23T09:40:10.104386] dfs.client.use.legacy.blockreader.local = false;
[2018-08-23T09:40:10.104436] dfs.client.read.shortcircuit = false;
[2018-08-23T09:40:10.104450] dfs.client.domain.socket.data.traffic = false;
[2018-08-23T09:40:10.104461] dfs.domain.socket.path = ;
[2018-08-23T09:40:10.121280] Sets dfs.client.block.write.replace-datanode-on-failure.replication to 0;
[2018-08-23T09:40:10.144892] multipleLinearRandomRetry = null;
[2018-08-23T09:40:10.168901] rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcProtobufRequest, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@4379f1a4;
[2018-08-23T09:40:10.173738] getting client out of cache: org.apache.hadoop.ipc.Client@5b52b909;
[2018-08-23T09:40:10.238466] finalize() called.;
[2018-08-23T09:40:10.238705] finalize() called.;
[2018-08-23T09:40:10.526021] Both short-circuit local reads and UNIX domain socket are disabled.;
[2018-08-23T09:40:10.532037] DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection;
[2018-08-23T09:40:10.555581] The ping interval is 60000 ms.;
[2018-08-23T09:40:10.556336] Connecting to /x.x.x.x:25000;
[2018-08-23T09:40:10.572385] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:10.613823] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:10.723447] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:10.723514] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:10.730466] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:10.758296] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:10.759031] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:10.759531] Exception encountered while connecting to the server : javax.security.sasl.SaslException: No common protection layer between client and server;
[... the same retry sequence (PrivilegedAction, SASL GSSAPI client creation, PrivilegedActionException: No common protection layer between client and server) repeats four more times, at 09:40:14, 09:40:17 (twice), and 09:40:21 ...]
[2018-08-23T09:40:26.095299] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:771);
[2018-08-23T09:40:26.096234] Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal);
[2018-08-23T09:40:26.096391] RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/hadoop.hadoop.com@HADOOP.COM;
[2018-08-23T09:40:26.096411] Creating SASL GSSAPI(KERBEROS) client to authenticate to service at hadoop.hadoop.com;
[2018-08-23T09:40:26.097116] Use KERBEROS authentication for protocol ClientNamenodeProtocolPB;
[2018-08-23T09:40:26.101823] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: No common protection layer between client and server;
[2018-08-23T09:40:26.101960] PrivilegedAction as:syslog@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:694);
[2018-08-23T09:40:26.102283] Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.102365] PrivilegedActionException as:syslog@HADOOP.COM (auth:KERBEROS) cause:java.io.IOException: Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.103844] closing ipc connection to /x.x.x.x:25000: Couldn't setup connection for syslog@HADOOP.COM to /x.x.x.x:25000;
[2018-08-23T09:40:26.103994] IPC Client (1766482975) connection to /x.x.x.x:25000 from syslog@HADOOP.COM: closed;
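
As far as I can tell, the Kerberos keytab login itself succeeds, and it is the SASL negotiation with the NameNode that fails: from what I've read, "No common protection layer between client and server" means the two sides cannot agree on a SASL QOP, i.e. the NameNode requires an RPC protection level that my client side does not offer. The earlier debug line "no QOP found in configuration for dfs.data.transfer.protection" seems to point the same way. If I understand the Hadoop docs correctly, the client configuration on the syslog-ng host would need something like the sketch below (the "privacy" value is my assumption; it has to match whatever the cluster is actually configured with):

    <!-- core-site.xml on the syslog-ng host; QOP for Hadoop RPC.
         Must match hadoop.rpc.protection on the NameNode;
         "privacy" is my guess, not confirmed. -->
    <property>
      <name>hadoop.rpc.protection</name>
      <value>privacy</value>
    </property>

    <!-- hdfs-site.xml; QOP for the data transfer path mentioned in
         the debug output. Again assuming the DataNodes use "privacy". -->
    <property>
      <name>dfs.data.transfer.protection</name>
      <value>privacy</value>
    </property>

Does that sound right, or am I looking in the wrong place?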
Thank you.