For a while now, I have been using my shared hosting to back up and store numerous image files. Retrieving those files over the internet has always been a pain, especially when there are hundreds of files in a given folder. The end goal is to be able to download a single archived folder to my computer for easy viewing.
So, I took the time to code something up. Here’s the pseudo code:
- zip folder
- rename zip to include date stamp
- move zip to another destination
- delete contents of folder for future backups
- give myself a high five
Of course, I Googled and looked for an easy way out, to see if someone had already posted the exact code I was looking for. No such luck, but the links further down were handy.
What I learned:
- Cron Jobs on shared web hosting give us the ability to run automated shell-level scripts.
- Cron Job commands are essentially Unix shell commands. Nice, I can code Unix!
Once I got started, I did run into a few Cron Job specifics:
- you have to escape the ‘%’ sign with a ‘\’ in a cron job, http://www.webhostingtalk.com/showthread.php?t=678062
- bonus: && operator, helped me reduce my script to one line, http://forums.freebsd.org/showthread.php?t=30964
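A quick illustration of that `&&` bonus. Nothing here is cron-specific; it is plain shell behavior, with throwaway example paths:

```shell
# The command after && runs only when the one before it exits successfully (status 0).
mkdir -p /tmp/demo && echo "mkdir worked, so this prints"

# Here ls fails on a nonexistent path, so the echo on the right never runs.
ls /no/such/path && echo "this never prints"
```

That short-circuit is exactly why chaining the zip and the delete with `&&` is safe: the delete is skipped whenever the zip fails.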
Nevertheless, here’s the script that will hopefully help someone else looking to compress and back up a whole directory and then delete its contents:
zip /destination/of/directory/with/date/label/$(date +\%Y\%m\%d).zip /directory/to/be/zipped/* && rm /directory/to/be/zipped/*
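To actually schedule it, that one-liner goes into a crontab entry. The schedule below (2:30 a.m. daily) is just an example I picked for illustration; note the escaped % signs, which crontab requires:

```shell
# min hour day-of-month month day-of-week  command
30 2 * * * zip /destination/of/directory/with/date/label/$(date +\%Y\%m\%d).zip /directory/to/be/zipped/* && rm /directory/to/be/zipped/*
```

Most shared hosts let you paste a line like this into their cron panel, or you can add it yourself with crontab -e.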