So I was setting up this new VM and spent almost my whole day copying files from my previous installation over to the new one… Now, as for me, I am a little curious about trying new things to see how they play out. Sometimes I don’t even notice that I am using my actual machine to try that stuff out… And my VMs end up crashing because I tinker with those config files at regular intervals. So, I wanted a system where, whenever I install a new VM, I could easily just copy my previous machine’s state over to the new one.
Pro Tip: Yeaps, I tend to use a lot of VMs rather than set up a whole new system, and I have some constant data of more than 100GB including games, torrents, tools etc. F. 100GB backups to the cloud!! If those 100GB could just stay constant on the host, then crashing VMs would not be a problem for me lol.
Also, there may be a feature in VMware to auto-create a snapshot of a VM at regular intervals, but I am pretty sure I am not aware of it (not yet). And I am okay with not working with a whole file system image and carrying it over to my new devices via pendrives, HDDs etc. (coz again… uploading common OS files (/bin, /run, /proc) to the cloud is just a waste of data and cloud storage).
And hence, I now schedule automatic backups of my important files to a private repo on GitHub.
The 1337 Stuff
Start by creating a private repo on GitHub (it’s entirely your choice if you want it to be public). I created one for me…
Then clone the repo to your previous installation, so you can copy the backup files into it. I use the /opt directory by default to keep my backups and other stuff.
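The clone step would look something like this — I’m assuming the repo is named j4x0n (which is what mine is called); swap in your own username and repo name:

```shell
# clone the private backup repo into /opt
# (the repo name j4x0n and the placeholder username are assumptions)
cd /opt
git clone https://github.com/your-username/j4x0n.git
```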
As you can see, it asked for a username and password because it is a private repo on my GitHub.
Then move/copy all your important files/directories into the cloned directory… I generally copy my config files, other automation scripts, and TODO lists (just in case I have a bunch of tasks on my plate and my VM suddenly crashes with no way to recover them). So that’s that.
Great, step 1 complete… Now it’s time to code our backup script. For ease of reading, I won’t go into detail on each step inside the script (most of it is self-explanatory).
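A minimal sketch of what such a script could look like is below — the /opt/j4x0n path and the branch name main are assumptions, so adjust them to your setup:

```shell
#!/bin/bash
# auto-backup.sh — a minimal sketch of an auto-backup script
# (the /opt/j4x0n path and the 'main' branch name are assumptions)

BACKUP_DIR="/opt/j4x0n"

cd "$BACKUP_DIR" || exit 1

# stage all changes, including deleted files
git add -A

# name the commit by the sync date and time
git commit -m "Sync: $(date '+%Y-%m-%d %H:%M:%S')"

# push to the private repo (relies on stored credentials)
git push origin main
```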
I just like to name every commit to a private repo with the sync date and time. And that’s it for my auto-backup script. Make sure to give it executable permissions:
chmod +x auto-backup.sh
***Note: The first time you push to a new repo on GitHub via git push, you need to enter your username and password (or rather a personal access token — GitHub no longer accepts account passwords for Git operations). So before running this script, tell git to store those credentials on the first push so it can default to them for future logins, and set your commit identity. You can do that by running the following commands***
git config --global credential.helper store
git config --global user.name 'Your username'
git config --global user.email 'Your email'
After running the above commands… all you gotta do is STEP 3: add your script to crontab so that it executes at every reboot. To add a crontab entry for a user, run:
crontab -e -u username
If this is your first time, you might be asked to choose which editor to use. Choose your pick, and add an entry like the following.
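The entry itself uses cron’s @reboot shortcut — something along these lines, assuming you saved the script at /opt/auto-backup.sh:

```shell
# run the backup script once at every boot (script path is an assumption)
@reboot /opt/auto-backup.sh
```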
This will run auto-backup as soon as the system boots up… To be a bit kinder on resources, you can add a sleep to the crontab entry so the backup doesn’t pile on while the system is still booting.
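A delayed variant of that entry might look like this — the 60-second delay and the script path are arbitrary choices of mine:

```shell
# wait a minute after boot before backing up (delay is an arbitrary choice)
@reboot sleep 60 && /opt/auto-backup.sh
```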
You can list the cron jobs for your user with the following command:
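That would be the standard listing flag (add -u username to check another user’s crontab):

```shell
# list the current user's crontab entries
crontab -l
```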
That’s it, and…
Now, all I have to do is copy the files I need auto-backed up into the /opt/j4x0n folder, and they will be backed up at every reboot.
If you still have any queries, feel free to reach out to me. I’ll be happy to help ;) Also, let me know if you want more blog posts like this on automation scripts or my home lab setup (software only). I’ll be glad to share that too. Thanks for reading.