My 2 cents (what works for you depends on your infrastructure, resources, and capabilities):
I like the model where syslog-ng does all the following:
- writes the raw data to text files (that way, whatever your search head is, it can re-ingest the files later using basically the same parsers)
- filters out highly false-positive-prone data so it is never forwarded
- parses out data elements (using patterndb or whatever) and sends specific information to a search engine (like Elasticsearch)
- forwards specific data (based on security use cases) to a SIEM
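To make that concrete, here is a minimal sketch of the archive/filter/forward layout. The source name (s_net), file paths, the noise pattern, and the SIEM hostname are all illustrative assumptions, not a drop-in config:

destination d_archive {
    # raw text archive, one file per host per day, easy to re-ingest later
    file("/var/log/archive/${HOST}/${YEAR}-${MONTH}-${DAY}.log"
         template("${ISODATE} ${HOST} ${MSGHDR}${MSG}\n"));
};

# drop a known false-positive-prone message before it goes anywhere downstream
filter f_noise { not message("known-noisy-pattern"); };

# security-relevant events go on to the SIEM
destination d_siem { syslog("siem.example.com" transport("tcp") port(514)); };

log { source(s_net); destination(d_archive); };
log { source(s_net); filter(f_noise); destination(d_siem); };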
Whether you use Elasticsearch, MongoDB, Splunk, or something else is really up to you and your budget.
That said, I find that syslog-ng writing directly to Elasticsearch, with Kibana as the front end, is *very* scalable as a search engine.
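If you go that route, a reasonably recent syslog-ng (3.21 or later) can post straight to Elasticsearch over HTTP with the elasticsearch-http() destination. This is only a sketch; the URL, index name, and JSON template are assumptions to adapt:

destination d_elastic {
    elasticsearch-http(
        url("http://localhost:9200/_bulk")
        index("syslog-${YEAR}.${MONTH}.${DAY}")
        type("")
        # ship parsed name-value pairs as JSON so Kibana can facet on them
        template("$(format-json --scope rfc5424 --scope nv-pairs)")
    );
};

log { source(s_net); destination(d_elastic); };  # s_net is an assumed source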
As far as a SIEM goes - that's really up to you.
Good luck,
Jim
---- Ivan Adji - Krstev <akivanradix@gmail.com> wrote:
Hi all,
What is the best practice for storing all those logs in one central
environment? I have one Linux box running syslog-ng with LogAnalyzer and
MongoDB (for now). Is it better to configure and use it with MongoDB or
with MariaDB (MySQL)? I once installed MySQL, but it got very slow as the
logs grew bigger and bigger (over one week). Now I have set it up with
MongoDB (still testing), but I have a problem: LogAnalyzer does not show me
the real picture. I have no Date info, no Facility, no Severity, Host, or
syslogtag; I just have ProcessID.
Any hints on this?
I have the following configuration in syslog-ng.cfg:
destination d_mongodb {
    mongodb(
        servers("localhost:27017")
        database("logs")
        # uri('mongodb://localhost/syslog-ng')
        collection("syslog")
        value-pairs(
            scope("selected-macros" "nv-pairs" "sdata")
        )
    );
};
Kind regards
Ivan