How to restore xbstream compressed backup

  • How to restore xbstream compressed backup

    I have a backup created with the stream method to xbstream, and compressed. The output is of course a single xbstream file, and the whole thing can be extracted via xbstream -x. But several things seem to be missing here:

    1. Is there any way to simply get a listing of all the files in the archive?
    2. After running xbstream, the output is all of the compressed files, each of which can be extracted via qpress. But let's say all I have is the xbstream file. Is there any way I can restore a single file from it, without first running xbstream and then qpress on one of the many extracted files?

    This seems much less convenient than tar, but maybe there are some methods I am missing here.
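
    For what it's worth, the only workflow I know of so far is the two-step one: unpack everything with xbstream, then decompress the one file you care about with qpress. Roughly like this (the file names here are just made-up examples):

    # unpack the whole stream into an empty directory
    mkdir restore && cd restore
    xbstream -x < /backups/full.xbstream

    # then decompress a single table file with qpress
    qpress -d mydb/mytable.ibd.qp mydb/

    That still extracts the entire stream first, which is exactly the step I was hoping to skip.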

  • #2
    I'm curious if you've figured this one out. Restoring a compressed backup (independent of xbstream) couldn't be this painful. Please let me know if you've either figured this out or if anyone has responded to you.

    Thanks!


    • #3
      Have not heard a word from anyone. It seems WAY less useful.


      • #4
        This, combined with tar failing above 15 GB, has made xtrabackup 2.0.0 a real pain for me. I can't seem to get responses from anyone either.


        • #5
          I wish I had an answer to your problem. But as soon as I ran into the issue with tar failing on such a large database, I tried the compress option without streaming.

          So what I am using is this:

          innobackupex --slave-info --parallel=2 --compress --compress-threads=2 /backups/

          I am having trouble finding an easy way to restore this other type of compressed backup as well. The compressed files are compressed using an archiver called qpress, which I had never heard of before playing around with XtraBackup. Why qpress? gzip, anyone?

          The --copy-back option fails because it can't find the uncompressed versions of the files when it goes to copy them back, so you have to "unqpress" them before copying them back, which is not a very clean task. This also defeats the purpose of keeping compressed files on a separate mount location or across the network.

          So does anyone have a good recipe yet for compressed backups of databases larger than 15G with XtraBackup?
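
          The closest thing I have to a recipe so far is to decompress every .qp file first, and only then prepare and copy back. Something like this; it is only a sketch, and the backup path is a placeholder:

          # decompress each .qp file next to its compressed copy, then drop the .qp
          for f in $(find /backups/2012-05-01_full -name "*.qp"); do
              qpress -d "$f" "$(dirname "$f")" && rm "$f"
          done

          # only after everything is decompressed, prepare and restore as usual
          innobackupex --apply-log /backups/2012-05-01_full
          innobackupex --copy-back /backups/2012-05-01_full

          It works, but it is hardly the clean --copy-back experience I was hoping for.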


          • #6
            Without xbstream it really isn't much different; xbstream merely adds one simple step to the restore, namely xbstream -x. That leaves you with the same situation you are trying to resolve with qpress, which appears too limited.

            So, I had just done a restore to a machine. I had to:

            1. xbstream -x
            2. Write a small script to visit each directory and qpress -d each .qp file. Of course, I had to FIND qpress first, and download and install it.
            3. apply-log, since this whole process is way too hard and time-consuming to do daily once the backup is compressed.
            4. Stop MySQL
            5. copy-back
            6. Copy out stuff I did not restore (mysql database) since the directory must be empty
            7. chown everything in the MySQL datadir
            8. Start MySQL

            Worked fine. Ideally, I'll need to write a script to do the whole thing so the restore is a single command. Definitely way harder than a simple tar -xzf.
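
            Roughly the kind of wrapper I have in mind is below. Treat it as a sketch only; the backup file, working directory, datadir and init script are placeholders for whatever your setup uses, and step 6 (putting back anything that was not in the backup) is left out because it is site-specific.

            #!/bin/sh
            set -e
            BACKUP=/backups/full.xbstream   # placeholder: the streamed, compressed backup
            WORKDIR=/restore/full           # placeholder: scratch space for the extracted copy
            DATADIR=/var/lib/mysql          # placeholder: the MySQL datadir

            # 1. unpack the stream
            mkdir -p "$WORKDIR"
            cd "$WORKDIR"
            xbstream -x < "$BACKUP"

            # 2. decompress every .qp file and drop the compressed copy
            find . -name "*.qp" | while read f; do
                qpress -d "$f" "$(dirname "$f")" && rm "$f"
            done

            # 3. apply the logs
            innobackupex --apply-log "$WORKDIR"

            # 4 and 5. stop MySQL, then copy the files back into the empty datadir
            /etc/init.d/mysql stop
            innobackupex --copy-back "$WORKDIR"

            # 7 and 8. fix ownership and start MySQL again
            chown -R mysql:mysql "$DATADIR"
            /etc/init.d/mysql start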

            I'll have to revisit this process and see if I can avoid the compress (and thus qpress), though when you send backups off-site the compression is sort of nice, since the transfer is faster. I can see there might be ways to stream to gzip or use other such techniques; I will have to try them one day.


            • #7
              I've fallen back to xtrabackup 1.6.5 using tar piped into gzip. It takes a bit longer to back up, but the resulting tarball is smaller and it is a single command to extract. Once they get these issues worked out in a newer release, I might upgrade.
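
              For reference, this is roughly what that looks like (paths are placeholders). The one catch on the restore side is that the tar stream from innobackupex needs tar's -i flag:

              # backup: stream as tar, gzip on the fly
              innobackupex --stream=tar /tmp | gzip > /backups/full.tar.gz

              # restore: a single command, but note the -i
              tar -xizf /backups/full.tar.gz -C /restore/full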

              I'd love to see if someone who has a support contract can get a response out of Percona regarding these issues. It's pretty frustrating.

              The tar stream issue is known in 2.x, but it won't be fixed until the next release, which is due in June.


              • #8
                Statula, your steps are exactly the same as mine. I have a 60G database (and growing by around 1G a week), so the compression is very favorable when saving backups off to an alternate backup mount.

                The thing about qpress which isn't ideal is that it apparently doesn't replace the compressed file when it decompresses, the way gzip does. So in your script you will also have to run an additional command to rm the *.qp file once it's successfully decompressed. I used a command like the following:

                find $QPLOCATION -name "*.qp" -exec qpress -dv {} `dirname {}` \; -exec rm {} \;

                This command has to be run for each $QPLOCATION (wherever there are any .qp files).
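
                One thing to watch out for: the backticks around dirname get expanded by the shell before find ever runs, which is part of why the command has to be run from inside each $QPLOCATION. If your find supports -execdir (GNU find does), something along these lines should work from anywhere, though treat it as a sketch:

                # -execdir runs qpress inside the directory that contains each .qp file
                find "$QPLOCATION" -name "*.qp" -execdir qpress -dv {} . \; -execdir rm {} \;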


                • #9
                  More importantly, in a disk-space-constrained environment this is simply unworkable. On average, I also found that the qpress-compressed output was roughly 15-20% larger than gzip's. That said, the qpress method was significantly faster because it could be parallelized.


                  • #10
                    I am going to experiment with the stream-between-machines method, gzipping before sending across the net and gunzipping after, and see how fast that is. That way the network traffic would still be compressed, and it might avoid the tar issue as well, though I'm not sure.
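
                    Something along these lines is what I plan to try; the host name and paths are just placeholders, and I have not verified it end to end yet:

                    # skip --compress entirely; gzip only covers the trip over the wire
                    innobackupex --stream=xbstream /tmp | gzip | \
                        ssh backup@remotehost "cd /backups/incoming && gunzip | xbstream -x"

                    Since nothing would be qpress-compressed, what lands on the other side should be a plain backup directory, ready for apply-log.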
