large files with mk-query-digest

  • #1

    I have two slow query log files, one 6GB and the other 20GB, that I'd like to run through mk-query-digest. Any suggestions? I've already had one run going for over an hour.

    Thanks

  • #2
    You can split up the source files using the 'split' command, run mk-query-digest on the split files to produce one report per file, then use http://www.maatkit.org/doc/mk-merge-mqd-results.html to merge the reports together.
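
    For example, the first two steps might look roughly like this (the log name and chunk size are placeholders, not taken from the thread):

      # Split the big slow log into smaller pieces
      split -b 1G slow-query.log chunk_

      # Produce one mk-query-digest report per piece
      for f in chunk_*; do
          mk-query-digest "$f" > "$f.report"
      done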

    • #3
      Thanks, the split command worked well.

      I now have 76 files of about 250MB each.

      I processed each of them through mk-query-digest with the --save-result option, which generated 76 .gz files.

      I then ran mk-merge-mqd-results res1.gz ... res76.gz, but it failed with this error:

      Error merging class/sample: undefined min value at ./mk-merge-mqd-results line 2788

      I guess there are problems with some of the files, but the merged report is still generated in the end.
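
      In shell terms, that sequence would look roughly like this (the file names are placeholders, and the exact name of the save option, written here as --save-results, should be checked against the mk-query-digest documentation):

        # Digest each split file, saving its per-file results as a .gz file
        for f in chunk_*; do
            mk-query-digest --save-results "$f.gz" "$f" > "$f.report"
        done

        # Merge the saved per-file results into one combined report
        mk-merge-mqd-results chunk_*.gz > combined.report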

      • #4
        That seems to be a bug. Can you generate a reproducible test case and submit it to Maatkit's bug tracker at http://code.google.com/p/maatkit/issues/list ?

        • #5
          I'm not sure how to put together a test case. I have 76 files, and the merge fails on some of them. I have identified one specific file that fails and can reproduce the error when merging up to that file.
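
          A rough sketch of that kind of check (the file names are placeholders, and it assumes the "Error merging" message is written to stderr):

            # Add one saved result file at a time and report the first file
            # whose addition makes the merge print the "Error merging" message.
            files=""
            for f in res*.gz; do
                files="$files $f"
                if mk-merge-mqd-results $files 2>&1 >/dev/null | grep -q 'Error merging'; then
                    echo "error first appears when $f is included"
                    break
                fi
            done

          A minimal test case for the bug report might then be just one known-good result file plus the file that first triggers the error.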
