There are few things sadder than getting an email from someone who has just lost all their photos because of a hard drive crash, lost laptop, fire, or even the proverbial lightning strike. By then it’s usually too late to do much about it. The time to deal with how you store and protect your digital media and memories is before something bad happens. Those of us who photograph professionally also have a business need for both accessible photo storage and reliable backups. Best practices change with the technology, and are affected by the size of your image library and your budget, but there are a number of basic principles and techniques that are well worth sharing.
Think of Your Image Library as a Multi-level Waterfall
Every image library has certain common elements. First, there is working storage: typically the hard drive or drives where you keep the images and videos you’re currently working on. Second, there should be at least one level of reliable local backups. Third, for disaster protection you need an offsite copy of your library. We’ll walk through each of those, and some tools for managing them, in turn.
Waterfall Level 1: Your Local “Working” Storage
In the simplest case, your working storage is the hard drive in your computer. Ideally you want to use an SSD for performance. However, as your image library grows, an SSD may be too expensive to house all your images. That leaves you with a couple of options: you can live with the performance of a traditional hard drive, or manually pull images over to your local SSD and move them back to a larger drive when you’re finished.
Recently, though, a third option has become available. With the advent of high-speed RAID solutions, you can actually get near-SSD speeds from a network-attached storage (NAS) unit or a directly attached equivalent. If you’re going over the network, you’ll need a NAS that supports 10 Gigabit Ethernet. I’m able to get impressive performance with a Synology DS1517+. QNAP’s new entry-level 10-Gigabit units are another option.
If you’re connecting locally, you’ll want a high-speed Thunderbolt connection or something equivalent. This works because your NAS isn’t tying up your CPU doing disk access, and it can read data from several disks at once, speeding up transfers. You can configure a RAID locally to get a similar effect, but it will put additional workload on the same CPU you’re using to catalog and edit your images. Note that dual Gigabit LAN ports don’t really help here: even if they are bonded, a single client-to-server connection still tops out at 1Gbps. Since 10-Gigabit switches are still fairly expensive, I’ve got my high-speed NAS hooked up point-to-point over a private network to my main workstation, with the NAS using a separate Gigabit link to connect to everything else.
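To see why the link speed matters, it helps to run the numbers. This sketch estimates transfer times for a batch of RAW files over 1 Gigabit versus 10 Gigabit Ethernet; the 50GB batch size and the ~70% effective-throughput figure are illustrative assumptions, not measurements of any particular NAS.

```python
# Rough transfer-time estimate for a batch of image files over the network.
# Assumptions: real-world throughput is about 70% of the link's rated speed,
# and the disks on both ends can keep up with the wire.

def transfer_minutes(size_gb: float, link_gbps: float, efficiency: float = 0.7) -> float:
    """Minutes to move size_gb gigabytes over a link_gbps link at the given efficiency."""
    size_gigabits = size_gb * 8          # convert gigabytes to gigabits
    effective_gbps = link_gbps * efficiency
    return size_gigabits / effective_gbps / 60

for link in (1, 10):
    print(f"{link:>2} GbE: {transfer_minutes(50, link):.1f} min for a 50 GB batch")
```

Under these assumptions a 50GB shoot moves in under a minute over 10GbE versus nearly ten minutes over a single Gigabit link, which is why bonded Gigabit ports (still 1Gbps per connection) don’t close the gap.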
Waterfall Level 2: A Reliable Copy of Your Images
If you have a limited-size image library you may be perfectly happy simply adding it to your normal system backup routine (and you do have a system backup routine, don’t you?!). However, as libraries get larger, the overhead of backup software packaging all the files and storing them in its own format is painful. Instead, it is usually more practical to simply clone the library file-by-file to another location. That has the big additional advantage that you can easily browse the backup or restore individual files without having to crack open proprietary backup files.
There are a number of tools to help you do this. Two that I use on Windows are Allway Sync and GoodSync. Both are reasonably priced, well maintained, and offer plenty of options. You will need to make some decisions, though. First is whether you want the cloning to be automated. If it is, you don’t have to worry about forgetting. But you also risk having the system propagate your mistake if you accidentally mess up a file or folder. Second, you can either do a living clone, so deletions are propagated, or you can just let everything pile up on the backup system, so you can always find even images you’ve since deleted.
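The core idea behind all of these tools is simple enough to sketch in a few lines. This is a minimal illustration, assuming source and destination are plain directories; real products like Allway Sync and GoodSync add verification, scheduling, filtering, and conflict handling on top of this basic loop. The `propagate_deletes` flag corresponds to the "living clone" versus "let everything pile up" decision above.

```python
# Minimal file-by-file clone sketch. Copies new or changed files from src
# to dst, and optionally mirrors deletions ("living clone" mode).
import shutil
from pathlib import Path

def clone(src: Path, dst: Path, propagate_deletes: bool = False) -> None:
    """Copy new/changed files from src to dst; optionally mirror deletions."""
    for f in src.rglob("*"):
        if f.is_file():
            target = dst / f.relative_to(src)
            # Copy only if the file is missing or the source is newer
            # (a cheap change check based on modification time).
            if not target.exists() or f.stat().st_mtime > target.stat().st_mtime:
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(f, target)  # copy2 preserves timestamps
    if propagate_deletes:
        for f in dst.rglob("*"):
            if f.is_file() and not (src / f.relative_to(dst)).exists():
                f.unlink()  # remove files that were deleted at the source
```

Because the backup is just an ordinary directory tree, you can browse it or restore a single image with any file manager, which is exactly the advantage over proprietary backup formats.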
If you are using a NAS from a vendor such as Synology or QNAP, it will typically include sync utilities that work with a variety of client systems. Because my image library is quite important to me, I actually use two NAS units for my local backup copies. The first is automatically synced with my working NAS, giving me additional redundancy. The second is synced only manually, which I do when I’m fairly sure the library is in good shape. Essentially that gives me Waterfall Levels 2a and 2b. The good news for your budget is that performance isn’t important for these units, so you can go with a low-cost NAS and inexpensive hard drives.
A Quick Note About RAID Arrays
Any place you have a hard drive with valuable information, it’s worth considering making it part of a RAID array. However, not all RAID levels offer protection from a drive failure. RAID 0, for example, simply spreads data across multiple disks for performance. So make sure whichever type of RAID you use provides the level of fault tolerance you’re looking for. NAS vendors have also developed their own RAID versions that often offer additional capabilities like dynamic expansion of an existing array, and are usually more user friendly than the traditional RAID levels.
There is also a newer file system, btrfs, that offers better error detection than the traditional ext4, and can even repair corrupted data when it has a redundant copy to work from. It isn’t available on all NAS models, but it is something to look for. I’ve just started to work with it, but it appears to offer help with the “bit rot” problem that plagues large image libraries.
Waterfall Level 3: Offsite Storage
The simplest kind of offsite storage is hand-carrying a drive or drives with all your images to a friend’s house every once in a while. Keep in mind that RAID drive sets (other than mirrored drives) are proprietary to a particular vendor and model, so you probably don’t want to rely on a set of them being readable long-term. Drobo promises its drives can be used in different models, so I’ve used a set of Drobo drives when my image library wouldn’t fit on a single drive. These days you can get 10TB hard drives, so for almost everyone a single drive is plenty.
Using the Cloud for Offsite Storage
There are a number of cloud vendors who will be happy to take your money to store your images. Most recently, Adobe has gotten into the game by emphasizing Creative Cloud storage in the new version of Lightroom. Figure on spending about $100 per terabyte per year. That adds up, but there are some ways to work around it. One I’ve been experimenting with is Amazon Drive. For $60 per year you get not just 1TB of storage, but unlimited photo storage (including RAW files). That’s pretty cool. We’ll see how long it lasts, though, as one problem with the cloud is that the plans keep changing. Microsoft and Google have both offered generous storage plans in the past, only to discontinue them later.
Of course, you need to get your images to the cloud to take advantage of one of these systems. You’ll want to do the math on how long it will take to upload your entire library, and you should also check whether your ISP has a data cap. I learned the hard way that Comcast has put a 1TB-per-month cap on most of its users, with some hefty charges if you go over it. My workaround is to use the Cloud Sync utility Synology provides for my NAS, with a bandwidth limit set low enough to stay under 1TB per month. The bad news is that means it will take nearly a year for all my images to be uploaded. I hope they don’t change their rate plan by then! In the meantime, I simply keep a set of drives offsite. Some cloud vendors will let you seed your storage by mailing them a drive, but that often comes with its own price tag.
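That "nearly a year" estimate is easy to reproduce. This sketch figures out how many months a cap-limited upload takes; the 8TB library size and the 0.3TB of headroom reserved for normal household traffic are illustrative assumptions, not figures from my own setup.

```python
# Back-of-envelope: months to push an image library to the cloud while
# staying under a monthly ISP data cap. Library size and headroom for
# everyday traffic are assumed values for illustration.

def months_to_upload(library_tb: float, monthly_cap_tb: float, headroom_tb: float = 0.3) -> float:
    """Months needed if uploads may use (monthly_cap_tb - headroom_tb) per month."""
    budget = monthly_cap_tb - headroom_tb  # TB per month left for backup uploads
    if budget <= 0:
        raise ValueError("the cap leaves no room for uploads")
    return library_tb / budget

print(f"{months_to_upload(8, 1.0):.1f} months")  # an 8 TB library under a 1 TB/month cap
```

With these numbers, an 8TB library needs over eleven months to upload, which is why keeping a physical set of drives offsite in the meantime is worthwhile.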
Check out our ExtremeTech Explains series for more in-depth coverage of today’s hottest tech topics.
[Image credit: David Cardinal]