balabit/syslog-ng:latest container and kafka output
Hi all,

I'm trying to push logs from files to Kafka using the syslog-ng container provided here: https://hub.docker.com/r/balabit/syslog-ng/
I also followed: https://www.syslog-ng.com/technical-documents/doc/syslog-ng-open-source-edit...

The ERROR: when I run the container with the Kafka configuration included, I get:

[2018-11-30T12:20:51.324773] WARNING: With use-dns(no), dns-cache() will be forced to 'no' too!;
[2018-11-30T12:20:51.478631] Can't find class; class_name='org.syslog_ng.kafka.KafkaDestination'
[2018-11-30T12:20:51.478667] If you have reloaded syslog-ng, the JVM is not used anymore, but it is still running. If you want to stop JVM, stop syslog-ng and then start syslog-ng again;
[2018-11-30T12:20:51.478680] Error initializing message pipeline; plugin_name='java', location='#buffer:4:3'

Here are my configuration files.

/etc/syslog-ng/syslog-ng.conf:

@version: 3.18
@include "scl.conf"

# Syslog-ng configuration file, compatible with default Debian syslogd
# installation.

# First, set some global options.
options {
    chain_hostnames(off);
    flush_lines(0);
    #use_dns(yes);
    #use_fqdn(yes);
    use_dns(no);
    use_fqdn(no);
    #owner("root");
    #group("adm");
    perm(0640);

    # The default action of syslog-ng is to log a STATS line
    # to the file every 10 minutes. That's pretty ugly after a while.
    # Change it to every 12 hours so you get a nice daily update of
    # how many messages syslog-ng missed (0).
    stats_freq(0);
    stats_level(1);

    # The default action of syslog-ng is to log a MARK line
    # to the file every 20 minutes. That seems high for most
    # people, so turn it down to once an hour. Set it to zero
    # if you don't want the functionality at all.
    mark_freq(0);

    bad_hostname("^gconfd$");
    threaded(yes);
    log_msg_size(16384);
    log_fifo_size(8000000);

    #
    # Time management:
    #
    # The goal is to keep in the logs the original date and time set by
    # the syslog instance which actually receives the message, and to make
    # sure this timestamp is not changed on its way to the database.
    #
    # To achieve that, all clients are configured with keep_timestamp(no),
    # so syslog-ng stamps the message when it receives it. This is done
    # globally because the system() and internal() sources do not both
    # support the keep_timestamp option, while the syslog()/network() source
    # used for networked syslog does. The option is forced back to "yes"
    # in the remote source().
    keep_timestamp(no);

    # Time precision is increased to microseconds.
    frac_digits(6);

    # And the format is set to "iso": YYYY-MM-DDThh:mm:ss.012345+02:00,
    # where "T" is the separator between date and time
    # and +02:00 is the GMT offset (as an example).
    ts_format("iso");
};

########################
# Sources
########################
# This is the default behavior of the sysklogd package.
# Logs may come from unix stream, but not from another machine.
#
source s_src { system(); internal(); };

# Log from any hosts around
source s_net {
    network(
        transport("tcp")
        port(601)
        flags("syslog-protocol", "threaded")
        keep_timestamp(yes)
        max-connections(250)
        log_iw_size(25000)
        log_fetch_limit(100)
        flags("threaded")
    );
};

source s_app_logs {
    file(/opt/test.log, flags(no-parse), follow_freq(1), keep-timestamp(yes));
};

#################################################################################################
# Destinations
#################################################################################################
destination d_messages { file("/var/log/messages"); };
destination d_clients { file("/var/log/syslog_clients/$HOST"); };
destination d_clients_all { file("/var/log/syslog_clients/all_logs"); };

@include "kafka_destination.conf"

#################################################################################################
# Log paths
#################################################################################################
log {
    source(s_net);
    # destination(d_clients);
};

log {
    source(s_src);
    # destination(d_messages);
};

#################################################################################################
# Filters
#################################################################################################
filter f_no_debug { not level(debug); };

---------------------------------------------------------------------------------------------

kafka_destination.conf:

@module mod-java
@define java-module-dir "/usr/lib/syslog-ng/3.18/java-modules:/usr/lib/syslog-ng/3.18/java-modules/:/usr/lib/syslog-ng/3.18/java-modules/*.jar"

destination d_kafka {
    kafka(
        client-lib-dir("/usr/lib/syslog-ng/3.18/java-modules:/usr/lib/syslog-ng/3.18/java-modules/:/usr/lib/syslog-ng/3.18/java-modules/*.jar:/etc/syslog-ng/kafka_libs/libs/:/etc/syslog-ng/kafka_libs/libs/*.jar:/etc/syslog-ng/kafka_libs/libs")
        #client-lib-dir("/usr/lib/syslog-ng/3.18/java-modules/:/usr/lib/x86_64-linux-gnu/")
        kafka-bootstrap-servers("10.11.12.129:9093,10.11.12.130:9093,10.11.12.131:9093,10.11.12.132:9093,10.11.12.133:9093,10.11.12.134:9093")
        topic("roma")
        properties-file("/etc/syslog-ng/kafka_producer.properties")
    );
};

log {
    source(s_app_logs);
    destination(d_kafka);
};

---------------------------------------------------------------------------------------------

The directory /etc/syslog-ng/kafka_libs/libs is in fact the "libs" directory from kafka_2.11-2.1.0.tgz, which comes from http://kafka.apache.org/downloads.
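In case it helps the diagnosis, here is a rough sketch of how the paths above could be checked for the class that the error message complains about. This is only a diagnostic idea of mine, and it assumes unzip is available inside the container; the jar that should contain the class is exactly what I don't know, hence the loop over everything:

# look for org.syslog_ng.kafka.KafkaDestination in every jar on the configured paths
for jar in /usr/lib/syslog-ng/3.18/java-modules/*.jar /etc/syslog-ng/kafka_libs/libs/*.jar; do
    if unzip -l "$jar" 2>/dev/null | grep -q 'org/syslog_ng/kafka/KafkaDestination.class'; then
        echo "KafkaDestination found in: $jar"
    fi
done

So far this kind of search has not turned up the class for me.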
Here is the content of this libs directory:

ls kafka_libs/libs/ -1
activation-1.1.1.jar  aopalliance-repackaged-2.5.0-b42.jar  argparse4j-0.7.0.jar
audience-annotations-0.5.0.jar  commons-lang3-3.5.jar  compileScala.mapping
compileScala.mapping.asc  connect-api-2.1.0.jar  connect-basic-auth-extension-2.1.0.jar
connect-file-2.1.0.jar  connect-json-2.1.0.jar  connect-runtime-2.1.0.jar
connect-transforms-2.1.0.jar  guava-20.0.jar  hk2-api-2.5.0-b42.jar
hk2-locator-2.5.0-b42.jar  hk2-utils-2.5.0-b42.jar  jackson-annotations-2.9.7.jar
jackson-core-2.9.7.jar  jackson-databind-2.9.7.jar  jackson-jaxrs-base-2.9.7.jar
jackson-jaxrs-json-provider-2.9.7.jar  jackson-module-jaxb-annotations-2.9.7.jar
javassist-3.22.0-CR2.jar  javax.annotation-api-1.2.jar  javax.inject-1.jar
javax.inject-2.5.0-b42.jar  javax.servlet-api-3.1.0.jar  javax.ws.rs-api-2.1.1.jar
javax.ws.rs-api-2.1.jar  jaxb-api-2.3.0.jar  jersey-client-2.27.jar
jersey-common-2.27.jar  jersey-container-servlet-2.27.jar  jersey-container-servlet-core-2.27.jar
jersey-hk2-2.27.jar  jersey-media-jaxb-2.27.jar  jersey-server-2.27.jar
jetty-client-9.4.12.v20180830.jar  jetty-continuation-9.4.12.v20180830.jar
jetty-http-9.4.12.v20180830.jar  jetty-io-9.4.12.v20180830.jar  jetty-security-9.4.12.v20180830.jar
jetty-server-9.4.12.v20180830.jar  jetty-servlet-9.4.12.v20180830.jar
jetty-servlets-9.4.12.v20180830.jar  jetty-util-9.4.12.v20180830.jar  jopt-simple-5.0.4.jar
kafka_2.11-2.1.0.jar  kafka_2.11-2.1.0.jar.asc  kafka_2.11-2.1.0-javadoc.jar
kafka_2.11-2.1.0-javadoc.jar.asc  kafka_2.11-2.1.0-scaladoc.jar  kafka_2.11-2.1.0-scaladoc.jar.asc
kafka_2.11-2.1.0-sources.jar  kafka_2.11-2.1.0-sources.jar.asc  kafka_2.11-2.1.0-test.jar
kafka_2.11-2.1.0-test.jar.asc  kafka_2.11-2.1.0-test-sources.jar  kafka_2.11-2.1.0-test-sources.jar.asc
kafka-clients-2.1.0.jar  kafka-log4j-appender-2.1.0.jar  kafka-streams-2.1.0.jar
kafka-streams-examples-2.1.0.jar  kafka-streams-scala_2.11-2.1.0.jar  kafka-streams-test-utils-2.1.0.jar
kafka-tools-2.1.0.jar  log4j-1.2.17.jar  lz4-java-1.5.0.jar  maven-artifact-3.5.4.jar
metrics-core-2.2.0.jar  osgi-resource-locator-1.0.1.jar  plexus-utils-3.1.0.jar
reflections-0.9.11.jar  rocksdbjni-5.14.2.jar  scala-library-2.11.12.jar
scala-logging_2.11-3.9.0.jar  scala-reflect-2.11.12.jar  slf4j-api-1.7.25.jar
slf4j-log4j12-1.7.25.jar  snappy-java-1.1.7.2.jar  validation-api-1.1.0.Final.jar
zkclient-0.10.jar  zookeeper-3.4.13.jar  zstd-jni-1.3.5-4.jar

----------------------------------------------------------------------------------------------

Here is "syslog-ng --version" inside the container:

root@e1ca5fdc292a:/# syslog-ng --version
syslog-ng 3 (3.18.1)
Config version: 3.18
Installer-Version: 3.18.1
Revision: 3.18.1-1
Compile-Date: Oct 24 2018 10:04:41
Module-Directory: /usr/lib/syslog-ng/3.18
Module-Path: /usr/lib/syslog-ng/3.18
Available-Modules: add-contextual-data,afamqp,affile,afmongodb,afprog,afsmtp,afsocket,afsql,afstomp,afuser,appmodel,basicfuncs,cef,confgen,cryptofuncs,csvparser,date,dbparser,disk-buffer,examples,geoip-plugin,geoip2-plugin,graphite,hook-commands,http,json-plugin,kvformat,linux-kmsg-format,map-value-pairs,mod-java,mod-python,pacctformat,pseudofile,redis,riemann,sdjournal,snmptrapd-parser,stardate,syslogformat,system-source,tags-parser,tfgetent,xml
Enable-Debug: off
Enable-GProf: off
Enable-Memtrace: off
Enable-IPv6: on
Enable-Spoof-Source: on
Enable-TCP-Wrapper: on
Enable-Linux-Caps: on
Enable-Systemd: on
root@e1ca5fdc292a:/#

mod-java is there, but that alone is apparently not enough, and I can't understand what I am missing.
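To see what the Java side of the installation actually ships, I suppose something like the following could be run inside the container. The syslog-ng-mod-java package name is only a guess on my part (it may be bundled differently on this image); the module directory path is the one reported by --version above:

# list the jars under the syslog-ng Java module directory
ls -1 /usr/lib/syslog-ng/3.18/java-modules/
# and, if such a package exists on this image, the jars it installs
dpkg -L syslog-ng-mod-java 2>/dev/null | grep '\.jar$'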
-------------------------------------------

Packages:

root@e1ca5fdc292a:/# dpkg -l | grep kafka
ii  librdkafka-dev:amd64  0.8.5-2   amd64  library implementing the Apache Kafka protocol (development headers)
ii  librdkafka1:amd64     0.8.5-2   amd64  library implementing the Apache Kafka protocol
ii  syslog-ng-mod-kafka   3.18.1-1  amd64  Enhanced system logging daemon (Kafka destination)
root@e1ca5fdc292a:/#

If any information is missing, please tell me. If you have any clue, that would be really appreciated.

Best regards,

mathias