I recently had to do some customer work involving the McAfee MySQL Audit Plugin and would like to share my experience in this post.
Auditing user activity in MySQL has traditionally been challenging. Most of the data can be obtained from the slow or general log, but this means wading through a lot of data you don’t need, and isn’t flexible at all. The specific problem of logging failed connection attempts was discussed in a previous post on our blog.
Starting with MySQL 5.1, the plugin API gives us more flexibility by allowing users to extend the server’s functionality with their own code, and this is what the McAfee plugin does.
Installation and configuration are straightforward following the available instructions. The only extra step I had to take was to extract the offsets for the Percona Server version I was using for the test (5.5.28-29.1). This is needed because the plugin relies on the offsets of some internal MySQL data structures that, according to the plugin’s authors, aren’t exposed by a consistent API. If you also need to do this, the details are clearly explained here.
The plugin writes its output in JSON format and supports writing it directly to a file or to a Unix socket, which means you can write a script that listens on this socket and processes the audit records as you wish.
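To illustrate the socket option, here is a minimal sketch of such a listener in Python. The socket path is a placeholder that must match whatever path you configure the plugin to write to, and I’m assuming the plugin connects to an already-listening socket and emits one JSON record per line; adjust to taste:

```python
import json
import os
import socket

# Assumption: this path must match the socket path the plugin is configured with.
SOCKET_PATH = "/tmp/mysql-audit.sock"


def parse_audit_record(line):
    """Decode one newline-delimited JSON audit record into a dict."""
    return json.loads(line)


def watch(path=SOCKET_PATH):
    """Listen on the Unix socket and print a summary of each record.

    The plugin acts as the client here, so we create and listen on
    the socket ourselves, then read newline-delimited JSON from it.
    """
    if os.path.exists(path):
        os.unlink(path)
    server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    server.bind(path)
    server.listen(1)
    conn, _ = server.accept()
    buf = b""
    while True:
        chunk = conn.recv(4096)
        if not chunk:
            break
        buf += chunk
        # Process every complete line; keep any partial record in the buffer.
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            rec = parse_audit_record(line)
            print(rec.get("user"), rec.get("cmd"), rec.get("query"))


if __name__ == "__main__":
    watch()
```

From here it is easy to filter records, forward them to a SIEM, or trigger alerts on failed connections instead of just printing them.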
Performance-wise, I ran basic tests on the VM I was working in and didn’t see significant differences between the two output options, or between using the plugin and enabling the general log. Bear in mind these were basic tests (just a few mysqlslap runs at increasing concurrency levels), but my initial impression is that the plugin’s advantage is its flexibility, not its performance, which seems to be on par with having the general log enabled.
The flexibility comes from the three variables that can be set to control what is logged by the plugin:
– audit_record_cmds : The list of commands you want written to the log (all the lists in these variables are comma-separated). As pointed out here, anything that would generate a write to the general log is sent to the plugin, and you can control whether it gets written or not with this list. I tested this with “connect,Quit” to log successful and failed connections. Yes, it had to be a capital Q in Quit for that to work, and no, my code-fu was not enough to understand why that is the case. Maybe someone more knowledgeable in MySQL internals can enlighten me here.
– audit_record_objs : List of database objects (tables, according to the docs) for which you want events written to the log.
– audit_whitelist_users : This one is undocumented on the wiki at the time of writing; it is a list of users for which you do not want events written to the log.
Just for reference, these are the lines I had to add to my config file for the plugin to work (plus one commented line for switching between file and socket for output):
audit_offsets=6464, 6512, 4072, 4512, 104, 2584
Notice the audit_offsets line, which, as mentioned above, had to be extracted manually because this Percona Server version isn’t covered by the plugin binary.
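For context, a fuller my.cnf fragment along these lines is what I ended up with. The variable names below reflect my reading of the plugin’s documentation and may differ between plugin versions, and the library name comes from its install instructions, so treat this as a sketch rather than a copy-paste recipe:

```ini
# load the plugin (library name per the plugin's install instructions)
plugin-load=AUDIT=libaudit_plugin.so

# offsets extracted for Percona Server 5.5.28-29.1
audit_offsets=6464, 6512, 4072, 4512, 104, 2584

# log successful and failed connections (note the capital Q in Quit)
audit_record_cmds=connect,Quit

# write the JSON records to a file...
audit_json_file=ON
# ...or uncomment this to send them to a Unix socket instead
#audit_json_socket=ON
```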
And here are a few sample output lines generated by the plugin with this configuration:
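Records of roughly this shape are what you can expect to see. The field names below follow the plugin’s documented JSON output, but the values are made up for illustration:

```json
{"msg-type":"activity","date":"1359378002756","thread-id":"12","query-id":"60","user":"test_user","priv_user":"test_user","host":"localhost","cmd":"Connect","query":"Connect"}
{"msg-type":"activity","date":"1359378004101","thread-id":"12","query-id":"63","user":"test_user","priv_user":"test_user","host":"localhost","cmd":"Quit","query":"Quit"}
```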
In conclusion, the plugin API opens new possibilities for extending MySQL’s behavior in a way that, once set up, is transparent to users, and the McAfee MySQL Audit Plugin is only one example of what can be achieved with it. It is a very good one in my view, since proper audit trail support has long been an important missing feature in the server. That gap has made using MySQL in PCI- or SOX-compliant environments, to name just two, artificially complicated, as one had to rely on too much information (the general log) or external help (Snort or a similar IDS).