mysql, fluentd

MySQL general log generates so much data that it burdens the server


My IT team recently activated the MySQL general log to meet regulatory requirements. However, we’ve encountered significant performance issues:

  • After 20+ hours of activation, the general log generated over 3 million rows.
  • Today, when my team attempted to query the general log data, our application failed to load the logs.
  • Our Grafana dashboard showed 100% CPU usage during the issue.

Our database server has the following hardware specifications:

  • 1 vCPU
  • 2 GB RAM

It’s clear that the current hardware is insufficient to handle the load caused by the general log. I’ve explored alternatives like using Fluentd to forward logs, but I’m unsure if this is the right approach.

Here is my current MySQL general log configuration:

general_log_file       = /var/log/mysql/mysql.log
general_log            = 1
log_output             = 'table'

Here are my questions:

  • Should I consider upgrading the server hardware, or is there a better way to manage the general log efficiently?
  • Are there any best practices or tools that can help offload and process the MySQL general log data without impacting server performance?

Solution

  • I don't recommend saving the general log into a table, as it has a significant impact on MySQL's performance. Set log_output to FILE so the general log is written to a file instead, then use a log collection tool to ship and analyze it in real time rather than querying the mysql.general_log table. Both reading from and writing to mysql.general_log consume I/O and CPU on the database server and therefore degrade MySQL's performance.
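
A minimal sketch of the corresponding my.cnf change, reusing the log file path from the question, would look like this:

general_log_file       = /var/log/mysql/mysql.log
general_log            = 1
log_output             = FILE

If you go the Fluentd route mentioned in the question, a common pattern is an in_tail source on the log file plus a forward output that ships events to a separate aggregation host, so the parsing and querying happen off the database server. The pos_file path and aggregator host below are placeholders, not values from the original setup:

<source>
  @type tail
  path /var/log/mysql/mysql.log
  pos_file /var/lib/fluentd/mysql-general.pos    # hypothetical position file location
  tag mysql.general
  <parse>
    @type none                                   # forward raw lines; parse on the aggregator
  </parse>
</source>

<match mysql.general>
  @type forward
  <server>
    host logs.example.com                        # hypothetical aggregator host
    port 24224
  </server>
</match>

Remember to rotate or truncate the file-based general log regularly as well, since it grows as quickly as the table did.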