Hmmm - I have done some testing but not settled on the perfect solution (yet).

- I have multi-line logs being parsed by a program() destination (a Perl parser) and then written to MongoDB
- I have started testing the mongodb() destination directly, but getting the right data into the right fields seems important
- I have done very basic testing with syslog-ng --> redis, which I plan to then feed --> elasticsearch
   (I may need to put Logstash between redis and elasticsearch)
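For the curious, the syslog-ng --> redis leg above is just the built-in redis() destination. A minimal sketch (the list key "logstash", the JSON template, and the source name s_net are my own choices, not anything canonical):

```
destination d_redis {
  redis(
    host("127.0.0.1")
    port(6379)
    # push each message onto a list as a JSON document
    command("RPUSH", "logstash",
            "$(format-json --scope selected-macros --scope nv-pairs)")
  );
};

log { source(s_net); destination(d_redis); };
```

Using format-json here means whatever parsers ran upstream (patterndb, csv-parser, etc.) have their name-value pairs carried along into redis, which is what you want if Elasticsearch is the final stop.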

In general, any relational database (SQL) wins when you can represent the relationships in the schema.

"nosql" stores like mongo typically win with unstructured data, but at a space penalty (the JSON field names get stored with every document)

If I had to pick one *right now* I would probably use syslog-ng --> redis --> logstash --> elasticsearch --> kibana
The (R)ELK stack has a lot of support and development behind it, and is pretty close to a free Splunk.
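The redis --> logstash --> elasticsearch hop in that pipeline is then just a Logstash config with a redis input and an elasticsearch output, roughly like this (host and key names are placeholders, and option names shift a bit between Logstash versions):

```
input {
  redis {
    host      => "127.0.0.1"
    key       => "logstash"
    data_type => "list"     # pop off the same list syslog-ng pushes to
  }
}
output {
  elasticsearch {
    host => "localhost"     # newer releases spell this "hosts"
  }
}
```

Redis in the middle is mostly a buffer, so Logstash being down for a while just means the list grows instead of logs being dropped.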

I can see a sharded/replicated mongodb having some basic advantages, but I have not (yet) found a good way to do ad-hoc queries against the store.

Good luck (and report back!)

Thanks,
Jim


On 08/18/2014 05:06 PM, VMI X wrote:
Hi,
Currently we're logging everything to text files for a few LAN clients.
We're considering using a database instead and have a few questions to help us decide:
  • Would a database be a good option to replace the existing text files for long-term storage, considering storage space?
  • Would MongoDB or MySQL be better suited for storing system logs?
I understand the answers can vary depending on the specific use case, but I'm seeking a general recommendation: what's typically used, and what the most stable/supported options would be.


--
Nullius In Verba


______________________________________________________________________________
Member info: https://lists.balabit.hu/mailman/listinfo/syslog-ng
Documentation: http://www.balabit.com/support/documentation/?product=syslog-ng
FAQ: http://www.balabit.com/wiki/syslog-ng-faq