

Archiving data over date range.



    Hello Everyone,

    I am looking for a better solution for archiving large amounts of data over a date range. At the moment I am using a script I wrote that creates a temporary table and inserts data into it with the following query:

    INSERT INTO temp_candata SELECT * FROM candata WHERE timestamp BETWEEN '2011-02-01 00:00:00' AND '2011-03-01 00:00:00'

    Then I use mysqldump to dump the table, tar it and then delete it.

    I know this method is not elegant or optimal. The time it takes to archive is much longer than I think it could be.

    I had a quick look at mk-archiver but I am unsure of how to get the data from a specific date range.

    Can anyone suggest a way I could do this?

    Thanks in advance
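    For reference, the manual flow described above can be sketched as a short shell script. The database name, credentials, and archive file names here are placeholders, not details from this thread:

    ```shell
    #!/bin/sh
    # Sketch of the manual archive flow: copy a date range into a scratch
    # table, dump and compress it, then delete the originals.
    # Assumed names: database "mydb", table "candata" with a "timestamp" column.
    set -e

    START='2011-02-01 00:00:00'
    END='2011-03-01 00:00:00'

    # 1. Copy the date range into a temporary table.
    mysql mydb -e "CREATE TABLE temp_candata LIKE candata;
    INSERT INTO temp_candata SELECT * FROM candata
    WHERE timestamp BETWEEN '$START' AND '$END'"

    # 2. Dump and compress the copy.
    mysqldump mydb temp_candata > temp_candata.sql
    tar czf candata_2011-02.tar.gz temp_candata.sql

    # 3. Remove the archived rows and the scratch objects.
    mysql mydb -e "DELETE FROM candata WHERE timestamp BETWEEN '$START' AND '$END';
    DROP TABLE temp_candata"
    rm temp_candata.sql
    ```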

  • #2
    http://www.maatkit.org/doc/mk-archiver.html
    As you can see, there is a --where option that can be set to "timestamp between '2011-02-01 00:00:00' and '2011-03-01 00:00:00'". Try --dry-run first if you are experimenting.
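    Putting that together, a command along these lines should work. The DSN values are the ones from the original post, the --file pattern is one possible output mode, and --dry-run only prints what would be done; check the manual page for the exact behaviour of your version:

    ```shell
    # Hedged sketch: write one month of rows to a file, then remove them
    # from the source table. Drop --dry-run to actually execute.
    mk-archiver \
      --source h=localhost,D=xxxx,t=candata \
      --file '/archive/%Y-%m-%d-%D.%t' \
      --where "timestamp BETWEEN '2011-02-01 00:00:00' AND '2011-03-01 00:00:00'" \
      --dry-run
    ```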



    • #3
      Thank you for the reply.

      When I run

      xxxx@xxxx:~/mysql_files$ mk-archiver --source h=localhost,D=xxxx,t=datavalue --purge --where 'timestamp < '2011-03-01 00:00:00''

      I get the output:


      Usage: /usr/bin/mk-archiver --source DSN --where WHERE

      Errors in command-line arguments:
      * Unrecognized command-line options 00:00:00

      mk-archiver nibbles records from a MySQL table. The --source and --dest
      arguments use DSN syntax; if COPY is yes, --dest defaults to the key's value
      from --source. For more details, please use the --help option, or try 'perldoc
      /usr/bin/mk-archiver' for complete documentation.



      • #4
        Your string ends at the space after < because the second ' closes the quote there. Look into escaping.
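        You can see what the shell actually passes by printing each argument with printf (a generic shell-quoting illustration, not specific to mk-archiver):

        ```shell
        # Broken: the inner ' ends the quoted string after "< ", so the shell
        # splits the clause and mk-archiver sees "00:00:00" as a stray option.
        printf '[%s] ' --where 'timestamp < '2011-03-01 00:00:00''; echo
        # → [--where] [timestamp < 2011-03-01] [00:00:00]

        # Fixed: double quotes keep the whole clause as one argument, with the
        # single quotes preserved for MySQL.
        printf '[%s] ' --where "timestamp < '2011-03-01 00:00:00'"; echo
        # → [--where] [timestamp < '2011-03-01 00:00:00']
        ```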



        • #5
          That worked.

          Thank you
