Photos are now digital items of great value because they are unique. Nowadays we are flooded by the number of photos we take, and maybe you never look at them again. But knowing your memories are somewhere on your hard drive is reassuring: you can look at them whenever you want. And if you are a professional photographer, you can't afford to lose your hard work, because every hard drive will fail eventually; it's just a question of time. I'm not saying my backup strategy for my photo collection is the best one, but here are some ideas and good practices that work well for me.
First, I try to respect the three basic backup rules, also known as the 3-2-1 rule: keep at least three copies of your data, on at least two different storage devices, with at least one copy offsite.
For the first two rules, I keep my original photos and videos on a NAS (network-attached storage) and a local copy on USB drives attached to it or on another self-built NAS. I use a Synology NAS for my live work and to manage my backups locally, and an old desktop tower with multiple hard-drive bays to store local backups and manage remote backups. On that tower I installed OpenMediaVault, an open-source Linux NAS solution.
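To give you an idea of what the "local copy" step amounts to, here is a minimal sketch using plain rsync, which is available on any Linux NAS (I use dedicated backup apps for this, described below, but the underlying idea is the same mirroring operation; both paths here are examples, not my real ones):

```shell
# Mirror the photo share to a USB drive mounted on the backup NAS.
# --archive preserves permissions and timestamps; --delete keeps the
# destination an exact mirror by removing files deleted from the source.
rsync --archive --delete /srv/photos/ /media/usb-backup/photos/
```

The trailing slash on the source matters: it copies the contents of the folder rather than the folder itself.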
Now for a more technical part: I use Western Digital WD Red hard drives, as I have had only one disk failure in seven years of 24/7 use, and that was after an unplanned power cut. On the Synology I use the Hyper Backup tool for my local backups, as it makes planning and managing backups very easy. I don't use it for remote backups in the cloud because its cloud options are too limited. On the OpenMediaVault NAS, I use Duplicati, an open-source backup tool with many cloud options.
Now, let's talk about my remote backups in the cloud. I told you I use Duplicati, and one of its cloud backend options is Rclone, a command-line Linux utility with countless cloud options. So I use Duplicati as a kind of GUI for Rclone. In many cases, Rclone is more efficient than the clouds' proprietary protocols. Duplicati is not as easy as Hyper Backup for managing backups, but it's not too difficult, and its AES-256 encryption is very easy to set up. Now, which clouds do I use for these backups? First, I have a pCloud 2 TB lifetime plan. I advise you to wait for a special offer like Black Friday to buy this plan, as the normal price is quite expensive. I also use Mega.nz as an additional cloud. Living in China, I have a 10 TB cloud plan with Baidupan, and it's really the best offer on the market for that capacity, but I can't use my backup tools with it directly. I'm now testing the 5 TB Polarbackup solution; I will update this page when I have more data on it.
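Under the hood, all Duplicati needs is an Rclone remote to exist on the machine. As a rough sketch (the remote name "pcloud-backup" and the token value are placeholders, not my real configuration), a remote created with `rclone config` ends up in `~/.config/rclone/rclone.conf` looking something like this:

```ini
# ~/.config/rclone/rclone.conf -- written by "rclone config";
# the section name and token below are placeholders
[pcloud-backup]
type = pcloud
token = {"access_token":"REDACTED","expiry":"0001-01-01T00:00:00Z"}
```

Once the remote exists, you can check it outside Duplicati with `rclone lsd pcloud-backup:` (list top-level folders) or mirror a folder manually with `rclone sync /srv/photos pcloud-backup:photos`.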
Now for my backup schedule: I run an incremental backup daily, at night. For my most recent work, I also synchronize to another computer and to the cloud, managed by my Synology with the Cloud Station Server app (local) and the Cloud Sync app (cloud); the latter can encrypt data and supports Baidupan.
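On a Linux NAS like OpenMediaVault, a nightly job like this is ultimately just a cron entry. A minimal sketch (the script path, log path and 02:30 schedule are examples, not my actual setup):

```
# /etc/cron.d/photo-backup -- run the incremental backup nightly at 02:30
# (script name and schedule are placeholders)
30 2 * * * root /usr/local/bin/photo-backup.sh >> /var/log/photo-backup.log 2>&1
```

In practice, both Hyper Backup and Duplicati have their own built-in schedulers, so you only need cron if you script things yourself.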
So that's it for my regular backups. But as I'm a little paranoid with my data, I also keep another backup on hard drives that have already had failures (instead of throwing them away)! Yes, really, but only if they don't have too many bad sectors, and I keep them offline in a box. I first repair them with the EaseUS Partition Master app, which marks the bad sectors so they can't be accessed anymore. After this I create my backup archives with WinRAR, adding a recovery record of 10 percent for data protection. Normally I will only use these disks in a very unlikely disaster-recovery scenario, so they will be read maybe only once after the data is written. Even if some sectors turn bad over time, I hope the 10 percent recovery record is enough.
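If you prefer to script this instead of using the WinRAR GUI, the rar command-line tool can add the recovery record directly with the `-rr` switch (the archive and folder names here are examples):

```shell
# "a" adds files to an archive; -rr10% appends a 10% recovery record,
# which lets rar repair the archive later if some sectors go bad.
# Archive name and source folder are placeholders.
rar a -rr10% photos-2017.rar photos/2017/
```

You can later verify such an archive with `rar t photos-2017.rar`, and repair a damaged one with `rar r`.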
I know all of this may seem very technical for people who are not used to it, so if you have questions, don't hesitate to contact me.