Retrieving Akeeba Backups from S3 on a space-limited website

Nicholas Dionysopoulos' Akeeba Backup is a fantastic tool, but what do you do if you're short of space on your hosting? If you're using 4GB of a 5GB limit, you won't have space to store the generated backup file.

The issue of configuring Akeeba to back up to S3 is well covered (hint: if space is a real issue, enable 'Process each part immediately' and 'Delete archive after processing'), but what happens when you want to retrieve the backup for use elsewhere (perhaps to test jUpgrade on a localhost)? If your backup is 3.8GB you won't be able to use the 'Fetch back to server' option, as you lack the space. Do you really want to sit and click each of the Part links to download the files?

This documentation details an easier route to downloading all the parts in one fell swoop (so easy, in fact, it's incredibly obvious when you think about it!).

If you haven't already, follow my instructions on Syncing your files with an S3 account on Linux.
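
If you'd rather skip that article for now, the short version (a minimal sketch, assuming you're using s3cmd, which the commands below rely on) is to install s3cmd and run its interactive setup; the install command will vary by distro:


# Install s3cmd (Debian/Ubuntu shown; use your distro's package manager otherwise)
sudo apt-get install s3cmd

# Interactive setup: stores your AWS access key and secret in ~/.s3cfg
s3cmd --configure
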

Now, it's pretty simple to retrieve the files. We'll assume the following (if you want to double-check these values against your own bucket, see the listing command just after this list):

  • S3 Bucket: mybucket
  • Directory: site-backups
  • Filename format: site-www.bentaskercouk-{date}.jpa
  • Directory to retrieve to: /var/www
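
If you're not sure of the exact date stamp, or how many parts the backup produced, it's worth listing the bucket contents first (assuming the bucket and directory above):


# List everything under site-backups to see the full set of part files (.jpa, .j01, .j02, ...)
s3cmd ls s3://mybucket/site-backups/
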

Depending on the part size you've opted to use, you probably have tens if not hundreds of backup files to download. Thankfully, we can retrieve them all with a single wildcarded command.


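# Move into the web root, then fetch every part of the backup set in one go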
cd /var/www
s3cmd get 's3://mybucket/site-backups/site-www.bentaskercouk-20120627.*'
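
# Grab Akeeba Kickstart (saved as kickstart-core-3.5.2.zip) and unpack it alongside the backup parts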
wget -O kickstart-core-3.5.2.zip http://bit.ly/L1VEo4 && unzip kickstart-core-3.5.2.zip

You may need to alter ownership/permissions so that the user running your Apache (or whatever) instance can read and write; bear in mind Kickstart will need to write the extracted files into this directory too


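# Give the web-server user ownership; apache:apache is distro-specific (Debian/Ubuntu use www-data, for example)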
chown apache:apache *

Finally, browse to kickstart.php on your server with a web browser (e.g. http://localhost/kickstart.php), let Kickstart extract the archive, and away you go!


Note: If you do enable 'Process each part immediately' and 'Delete archive after processing', Akeeba may report your backup sizes incorrectly. You might panic briefly when you see a 10MB backup listed instead of a 4GB one, but it's nothing to worry about!