On Jul 22, 2009, at 3:17 PM, Clayton Dukes wrote:
> Hi folks,
>
> I'm thinking about completely rewriting my app (php-syslog-ng) in order to provide high-end scalability and a better (AJAX) front end.
>
> At the base of it, my biggest concern is scalability. I want to write the backend to be able to handle many thousands of messages per second.
>
> In your opinion, what is the best way to accomplish this? Should I:
>
> 1. Log everything to MySQL directly? If so, can MySQL handle that insert rate?
It can. I have Perl-based analyzers that eat a feed from syslog-ng and insert into MySQL.
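(For illustration, a per-message insert of that kind might look like the sketch below; the logs table and its columns are hypothetical, not php-syslog-ng's actual schema.)

    INSERT INTO logs (received_at, host, facility, priority, msg)
    VALUES (NOW(), 'web01', 'daemon', 'info', 'connection from 10.0.0.5');

As noted further down, issuing one such statement per message is exactly what chokes at high rates, so at thousands of messages per second you want the bulk forms sketched after the reply below.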
> 2. Log everything to disk and use flat files to do analysis and reporting?
> 3. Log everything to disk and use MySQL's load_data_infile?
> 4. Other suggestions?
>
> Also, is anyone out there using MySQL at this level of insert rate? Do you have any recommendations on table structures, indexes, etc.?
I'm at 16 million rows for today already, and it's only 10 AM here.

Look into bulk insert formats. Keep in mind that MySQL updates indexes after every insert completes, so row-by-row inserts will choke. Also, insert times increase slightly as the table grows, so you'll see a performance hit late in the day. You want to keep indexes as light as possible for this.

- billn
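To make that advice concrete, here is a rough sketch of a lean table with light indexes, a multi-row bulk INSERT, and the load_data_infile approach Clayton asked about. The logs table, its columns, and the file path are illustrative assumptions, not php-syslog-ng's actual schema; adjust names, types, and storage engine to your own setup.

    -- Lean table: one light secondary index, so each insert does
    -- little work beyond writing the row itself.
    CREATE TABLE logs (
        id          BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
        received_at DATETIME        NOT NULL,
        host        VARCHAR(64)     NOT NULL,
        facility    VARCHAR(16)     NOT NULL,
        priority    VARCHAR(16)     NOT NULL,
        msg         TEXT            NOT NULL,
        PRIMARY KEY (id),
        KEY idx_received_at (received_at)
    );

    -- Bulk insert format: many rows per statement, so indexes are
    -- updated once per statement rather than once per row.
    INSERT INTO logs (received_at, host, facility, priority, msg) VALUES
        (NOW(), 'web01', 'daemon', 'info',   'connection from 10.0.0.5'),
        (NOW(), 'web02', 'auth',   'notice', 'session opened for user foo'),
        (NOW(), 'db01',  'daemon', 'err',    'slow query logged');

    -- Or batch messages to a tab-separated file on disk first and load
    -- them in one pass; /var/log/batch.tsv is a hypothetical file
    -- written by the collector.
    LOAD DATA INFILE '/var/log/batch.tsv'
        INTO TABLE logs
        (received_at, host, facility, priority, msg);

Keeping only the timestamp indexed is just one possible trade-off: every additional index speeds up some query pattern but adds work to every one of those thousands of inserts per second.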