
Poor man’s query logging


Occasionally there is a need to see what queries reach MySQL. The database provides several ways to share that information with you. One is the general log, activated with the log (general_log in MySQL 5.1+) start-up parameter. It writes every query executed by MySQL to a file, along with a limited amount of additional information. The other is the slow log, enabled by the log-slow-queries parameter (slow_query_log in MySQL 5.1+), which was designed to store poorly performing queries that run for at least 2 seconds. Percona actually extended the slow log to, among other things, include any query regardless of its execution time.
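For reference, a minimal sketch of how these logs could be enabled at startup in my.cnf (MySQL 5.1 option names; the file paths and the threshold value are assumptions):

```ini
[mysqld]
# General log: every statement the server executes
general_log      = 1
general_log_file = /var/log/mysql/general.log
# Slow log: statements running longer than long_query_time seconds
slow_query_log      = 1
slow_query_log_file = /var/log/mysql/slow.log
long_query_time     = 2
```

Both logs still have to be configured before the server starts, which is exactly the limitation discussed next.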

The problem is that both require preparation in advance: either enabling the logging before starting the database instance or, even more work, applying the patch and rebuilding the entire database from sources.

I know that many databases out there run with neither of these enabled, so it would require a restart to get the logging in place and possibly another restart to disable it when it is no longer necessary (though the slow log can actually be disabled by simply setting the long_query_time variable high enough).

So what can be done when you really need to see the queries, but can’t afford any downtime?

If you are a privileged user (i.e. root), you can use tcpdump on a database server to take a peek into a network stream and filter for packets that go to MySQL. Those packets contain queries. Here’s my quick one-liner which I will write in multiple lines:
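A minimal sketch of such a one-liner, split across multiple lines (the interface name eth0, the standard port 3306, and the keyword list are assumptions):

```shell
# Capture traffic going to MySQL (port 3306 assumed) on eth0,
# write the raw packets to stdout, extract printable strings from
# the payloads, and keep only lines starting with a SQL keyword.
# Requires root privileges to sniff the interface.
tcpdump -i eth0 -s 0 -l -w - dst port 3306 \
  | strings \
  | grep -iE '^(SELECT|INSERT|UPDATE|DELETE|REPLACE|SET|BEGIN|COMMIT|ROLLBACK)'
```

Stop the capture with Ctrl-C when you have seen enough.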

The output may contain a little garbage, but it can easily be filtered out.

Obviously this method works only when applications communicate with MySQL over TCP sockets. When localhost (not to be confused with 127.0.0.1) is used as the MySQL host, this will not work, since all traffic goes through a Unix socket file.

It’s most definitely not a MySQL log replacement, but it can be very useful if you just need a few minutes’ worth of a dump.



Maciej Dobrzanski

Maciek is a former Percona employee, an IT consultant focusing primarily on the performance and scalability of systems, databases, and application stacks. He is an expert in open source technologies such as Linux, BSD, Apache, nginx, MySQL, and many more, and co-author of dba square, a blog about how to manage, scale, and optimize MySQL performance!


Insight for DBAs

  • These issues are generally solved in 5.1.

    I have a little sql script that I run on occasion when I want to see what’s going on:

    TRUNCATE TABLE mysql.general_log;
    SET GLOBAL log_output = 'TABLE';
    SET GLOBAL general_log = 'ON';
    SELECT SLEEP(60); -- capture window; duration is adjustable
    SET GLOBAL general_log = 'OFF';

    If I need to narrow or widen the logging session, I just adjust the sleep time accordingly. I prefer to log to table instead of file because I don’t have to have file access on the server to see the log. I can just log into a mysql shell with a SUPER account and search the log with standard SQL syntax.


  • There is also MySQL Query Sniffer. Has anyone tried it?

    It looks like it was not updated for a while.


  • This Perl script is working fine, Peter. I have been using it for some time because of the reasons described in the post (production environment with general query logging disabled, no downtime possible, poor granularity of the slow query log on unpatched MySQL servers). You just need libpcap and some Perl modules (easily installable via CPAN).

    By the way, thank you a lot for all the information you are sharing on this blog!


  • Thanks Charles,

    Some software does indeed keep working well even if it has not been updated for a while :)


  • There is also a much more fully-featured TCP protocol interpreter in Maatkit’s mk-query-digest. It understands the compressed protocol, the binary protocol, the status flags, and a lot of other stuff. You can use its --print output to print out the TCP traffic in slow-query-log format, and then analyze it as you usually might with mk-query-digest.


  • Hi Maciej,

    I need to log superuser logins with a timestamp in MySQL; if you know of something, kindly share it.

