
Protecting your files

This article was prompted by a couple of friends – friends, that is, who’d lost image files – asking me to describe how I avoid this happening. It was considerably easier to write it down than to explain it verbally. I hope the following description is useful and helps you formulate a strategy for reducing risk and protecting your own photographs from the dangers of various forms of accidental loss.

The requirement
One of the benefits of digital photography over film, and one which seems to be mentioned rarely, is the ability to protect the photograph, relatively easily and from the point of capture, against loss or damage. This article is about how to achieve that. More precisely, the aim is to consider how image files can become ‘lost’, in various meanings of that word, and the type of thing you should consider doing to avoid this happening.

My own file protection system is rather comprehensive; arguably, it’s excessive, yet it:
1. protects me against the majority of incidents which could cause loss;
2. is largely automated;
3. costs very little compared to the degree of upset I’d feel at losing my files.

It amounts to multiple copies of any given image, on multiple devices, in multiple places and in multiple formats. These are created at key points in the cycle, and with a minimum of what could be described as ‘effort’; after all, it’s not as if this is fun, is it? I’ve set my system up because I wouldn’t like to lose something which turns out to have been ‘important’, or perhaps ‘good’ – and it’s pretty certain that anything I lost would at least seem to have been either important, good, or both!

What’s the point?
I consider losing files to be a very bad thing; you might be less concerned, or you might be less bothered by certain types of risk. For example, there’s a big difference between losing every single photo you’ve ever taken and losing the image from the last time you pressed the shutter button. Both those risks, and everything in between, can be reduced to near zero, but only you can decide whether the expense involved, in terms of money, setting up time, and any added steps in the whole ‘make a photograph’ process, is worth it.

So, this article largely considers the risks, with some suggestions on how to mitigate them; it’s not prescriptive as to exactly what you should do to achieve this, though I do explain my own system for the sake of illustration. Also, it deals with digital capture, though everything relating to how to handle digital files applies to scanned negatives and transparencies too.

How to lose photographs
It’s way too easy to lose images, and the best way to work out what you, personally, need to do to avoid it is to work out what could happen, and how likely it is. Here’s a non-exhaustive list:

  • Physical loss or complete failure of the medium they’re stored on – the data card, the camera, the computer, or any intermediate devices.
  • Data corruption – possibly caused by a physical problem (dropping the camera into water, for example), but essentially involving an unreadable file which is therefore useless. This can happen to data cards with no intervention on your part whatsoever.
  • Mistakes – over-writing things, deleting the wrong file or directory; all whilst manipulating the images or reorganising them on a computer, or even on the camera.
  • Saving in a format which can’t later be read – it may be difficult, at best, to convert RAW files in a decade or so; it’s likely to be at least inconvenient, and quite possibly expensive too.

The outcome of all that is that you need to make your own list and work out what you need to do to mitigate, or completely avoid, each of the risks you’ve identified. Having done that, you then need to consider the effort and cost of doing each, and whether it’s worthwhile to you.

I strongly recommend that the answer to the ‘effort’ issue is to automate as much as possible, thereby removing said effort; and the answer to the cost question is to really sit back and imagine how upset, or even out of pocket, you’d be if something did go horribly wrong. The classic example of this is a ‘never again’ trip: relying solely on memory cards, with no backup whilst ‘in the field’, is probably a rather bad economy!

How to protect against physical loss and data corruption
Apart from the rather obvious solution of simply avoiding losing your camera, cards, computer, etc., this boils down to the relatively simple idea of making sure that you have more than one copy of everything, preferably more than two, preferably in more than one place, and that the second copy comes into being as close as possible to the moment the image is captured.

When out making photographs, I use a camera with two cards, and I copy the files from those cards onto a purpose-designed portable backup device every so often, particularly when I’m not going to be near my laptop in the near future. I then try to avoid keeping the device in my camera bag. That last point sorts out the ‘camera bag falls over cliff or into water’ scenario, at least in terms of not losing already-captured images. If I fall over a cliff myself, with the bag and the backup, I don’t imagine I’ll be focusing too heavily on the loss of images.

At home, my laptop has two hard drives and the files are copied to both in the first instance, then automatically copied to several other hard drives scattered around my home network. Automatically is the critical word here, and the other critical word is immediately. The problem with anything which is not at the very least automatic, and ideally immediate, is that there will be a time when you – well, I, certainly – will forget to do whatever it is that you’ve decided to do to protect your image data. Manually copying everything to an external hard drive is fine, so long as you remember to always do it. I don’t trust myself in that respect, so I have a network-attached storage device, or NAS, on my home network. This is basically an enclosure containing two hard drives which mirrors all the image files on my laptop. Each of those NAS drives contains the same files, which protects against one of them failing.
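
To give a flavour of what that automation looks like, here’s a minimal sketch of the kind of one-way mirroring involved, written in Python. The paths are made-up examples, and in practice the software supplied with a NAS, or a scheduled rsync, does this job for you:

    # Minimal one-way mirror: copy new or changed files from the laptop's
    # photo folder to a NAS share. The paths are hypothetical examples.
    import os
    import shutil

    SOURCE = "/home/mike/photos"   # laptop photo folder (assumed path)
    DEST = "/mnt/nas/photos"       # mounted NAS share (assumed path)

    for root, _dirs, files in os.walk(SOURCE):
        for name in files:
            src = os.path.join(root, name)
            rel = os.path.relpath(src, SOURCE)
            dst = os.path.join(DEST, rel)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            # Copy only files which are missing or have changed.
            if (not os.path.exists(dst)
                    or os.path.getmtime(src) > os.path.getmtime(dst)
                    or os.path.getsize(src) != os.path.getsize(dst)):
                shutil.copy2(src, dst)  # copy2 preserves timestamps

Scheduled to run every few minutes, something along these lines gives you ‘automatic’, and something close to ‘immediate’, with no ongoing effort at all.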

The above means that there are copies of my RAW files on at least four, and often five, hard drives, plus two memory cards, before I have the chance to edit them at all. That may be a little excessive, but it costs me very little, either in terms of money or in terms of time. The only ‘effort’ involved is making copies to the portable backup device ‘in the field’.

As well as being backups of backups, these drives build up various levels of history over time, so that I can return to partially edited versions of images as they progress through the various bits of software I use for post-processing. That’s all automatic; I can’t imagine being able to do it consistently without automation. This sort of system is reasonably easy to set up and automate using a NAS and the software which comes with it. To make things even safer, the NAS itself copies changes over to another drive every few hours, which guards against the NAS enclosure itself failing, rather than one of the drives inside it.
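
Those ‘levels of history’ are nothing more sophisticated than periodic snapshots. As a sketch of the idea (again with made-up paths; real NAS software does this incrementally rather than with wasteful full copies):

    # Periodic snapshot: copy the current state of the photo tree into a
    # dated folder, building up a history over time. Paths are examples.
    import shutil
    import time

    SOURCE = "/mnt/nas/photos"
    SNAPSHOT_ROOT = "/mnt/backup/snapshots"

    stamp = time.strftime("%Y-%m-%d_%H%M")
    shutil.copytree(SOURCE, f"{SNAPSHOT_ROOT}/{stamp}")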

How to make sure you can find those files which you’ve not lost, but could have mislaid
I should add that the transfer process includes renaming and adding additional location information to the RAW files, then automatically putting them into a structured filing system that I readily understand. One way of ‘losing’ files is the literal one of being unable to find them, even though they’re ‘there’ on a computer; adding all this metadata (descriptive information about the photo) is invaluable to avoid that. Lots of transfer software can perform renaming, and there are also numerous tools for adding tags to images. Do it early on in the process so that all the files created from the originals inherit all this useful information. Once the RAW files are tagged with metadata, desktop search tools can find them quickly using keywords and dates, and the downstream, edited versions should inherit the same metadata and be similarly searchable.
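
As an illustration of that renaming-and-filing step, here’s a sketch which names each RAW file by its capture time and files it into a year/month/day folder. It assumes the Python exifread library and made-up paths; most transfer software offers the same facility built in:

    # Ingest sketch: name each RAW file by its EXIF capture time and
    # file it into a year/month/day folder structure.
    import os
    import shutil
    import exifread

    CARD = "/media/card/DCIM"       # memory card mount (assumed path)
    ARCHIVE = "/home/mike/photos"   # destination archive (assumed path)

    for root, _dirs, files in os.walk(CARD):
        for name in files:
            src = os.path.join(root, name)
            with open(src, "rb") as f:
                tags = exifread.process_file(f, details=False)
            if "EXIF DateTimeOriginal" not in tags:
                continue  # skip anything without a capture timestamp
            taken = str(tags["EXIF DateTimeOriginal"])  # '2011:06:25 09:41:00'
            date, clock = taken.split()
            y, m, d = date.split(":")
            dest_dir = os.path.join(ARCHIVE, y, m, d)
            os.makedirs(dest_dir, exist_ok=True)
            new_name = f"{y}{m}{d}_{clock.replace(':', '')}_{name}"
            shutil.copy2(src, os.path.join(dest_dir, new_name))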

Don’t rely solely on the library system of particular software
Leading on from the previous recommendation, I think it’s very important, especially over the long term, to have a physical filing system which can cope with any new files you might want to store in it, and which is not dependent on a particular piece of software. Software changes over time, and some cataloguing systems allow you to create what looks like an ordered filing system while in fact leaving the physical files themselves scattered over the disk. At some point, when that software changes or you replace it with something else, that type of system can be more than a little problematic to recover!

In my case, my files are named by date and time, and they exist in a structure based on date. Whilst I can’t go directly to, say, ‘South America’, I can search for all files tagged with ‘South America’, which is almost as good. In parallel, I keep a record of where I was on what dates, so the combination means that I can easily find anything I’m looking for.

Protect against ‘disasters’
Finally, in terms of making copies of the images, you may decide that you need an ‘off-site backup’. This protects against a disaster occurring to your main storage area (fire, flood, etc.). It could be a physical disk which you keep somewhere other than in your house, but to be honest that’s a major amount of effort in my terms, and I don’t imagine I’d be able to maintain the rigour to do it regularly and reliably. It also presupposes that you have a physical ‘somewhere else’ to store your backups. Further, it either means lots of CDs or DVDs, or more than one hard drive, all of which is neither cheap, reliable, nor terribly convenient, not to mention that such backups could become obsolete.

My solution involves every RAW file being copied up to a storage area in ‘the cloud’, i.e. to some heavily backed-up hard drive out on the internet somewhere. At least that way, if something in the form of a disaster happens to my house, all my original files are still available. I don’t anticipate this being a huge consolation at the time, but it has the additional benefit of allowing me to get to my files from pretty much anywhere.

There are many services which provide this function, and they’re increasing in number all the time. I use Zumodrive, which has the major advantage of appearing exactly like a local hard drive on your computer, so there’s no need to know much about it once it’s set up in the first place; the files just find their way ‘there’ automatically.
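
Because the storage appears as a local drive, no special tooling is involved: backing up to it is just another file copy – something like this, or the mirroring sketch shown earlier with the (hypothetical) cloud mount point as its destination:

    # The cloud storage behaves like any other folder, so a backup to it
    # is an ordinary file copy. The mount point below is hypothetical.
    import shutil

    CLOUD = "/media/zumodrive/photos"  # assumed mount point
    shutil.copy2("/home/mike/photos/2011/06/25/20110625_094100_1234.NEF", CLOUD)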

Don’t edit originals!
This is really obvious, but worth stating. Keep the unaltered files from the camera forever, and edit copies. Further, lots of software uses ‘sidecar’ files to store information about what has been done to an image: file these somewhere so that you can potentially reproduce a sequence of edits at some future point.

I use only two primary programs to process images: DxO Optics Pro, followed, sometimes, by Photoshop for dodging and burning. In each case, the intermediate files and the ‘sidecar’ files are copied to some or all of the hard drives which lurk around my house. That way, I can go back to any stage of editing and start there, avoiding the need to return to the unedited RAW file; I never touch the original beyond reading a copy into DxO, which saves the processed file as both a TIFF and a JPEG.
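
One small detail worth automating: whenever a RAW file is copied, take its sidecar files along with it. A sketch of the idea (the extensions are typical examples: .xmp for Adobe tools, .dop for DxO):

    # When backing up a RAW file, copy any 'sidecar' files (edit
    # recipes) along with it, so edits can be reproduced later.
    import os
    import shutil

    SIDECAR_EXTENSIONS = (".xmp", ".dop")

    def copy_with_sidecars(raw_path, dest_dir):
        shutil.copy2(raw_path, dest_dir)
        stem, _ext = os.path.splitext(raw_path)
        for ext in SIDECAR_EXTENSIONS:
            # Sidecars are named either 'IMG_1234.xmp' or 'IMG_1234.NEF.xmp'.
            for candidate in (stem + ext, raw_path + ext):
                if os.path.exists(candidate):
                    shutil.copy2(candidate, dest_dir)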

Keep JPEGs, or TIFFs, or both
Right now, and for the next year or several, it’s unlikely that you’ll have a problem reading and editing a file you’ve saved in any current format. Looking further ahead, it’s relatively likely that JPEG format images will be readable by then-current software in, say, five years from now; that’s probably true for a ten-year outlook too; the same is true of TIFFs. The further into the future you go, however, the more likely it is that RAW format files created in 2011 will be effectively useless, or at least difficult to process without acquiring copies of old software and making it run on then-current hardware. It’s therefore a very good idea to save JPEG or TIFF copies of ‘finished’ images. I do both, and they end up both as local copies on my laptop and in ‘the cloud’, where I anticipate they will remain accessible for a very long time with minimal cost and minimal, if any, intervention on my part (I’m remarkably lazy when it comes to mundane tasks such as housekeeping files).
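
My TIFF and JPEG copies simply come out of DxO, and yours will come from whatever editor you use, but purely as an illustration of batch-generating such archival copies, here’s a sketch assuming the Python rawpy and imageio libraries:

    # Illustration only: develop a RAW file into TIFF and JPEG copies
    # using default settings.
    import rawpy
    import imageio

    def save_archival_copies(raw_path):
        with rawpy.imread(raw_path) as raw:
            rgb = raw.postprocess()  # demosaic with default settings
        base = raw_path.rsplit(".", 1)[0]
        imageio.imwrite(base + ".tiff", rgb)
        imageio.imwrite(base + ".jpg", rgb)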

Treat SD and CF cards carefully
Returning to the earliest stage of the capture process, I’m always surprised at how reliable data cards are, but they do fail: either individual files or the whole card can become corrupt. Using a camera with two cards comes close to preventing any problems from card failure, since the chances of both failing at once are rather small, but you can also minimise the risk by avoiding things like manipulating images whilst they’re on the card. Writing to a card a great deal increases the risk of some form of corruption. I minimise that risk by formatting the cards, in the camera, as soon as their content has been copied to the various backup media. Freshly formatted cards are, in theory and anecdotally, less likely to fail than those from which ‘old’ images have merely been deleted.
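
Before formatting, it’s worth being certain that the copies really are good. A sketch of that check, comparing checksums between a file on the card and its backup copy:

    # Confirm a file on the card has a byte-identical backup copy by
    # comparing SHA-256 checksums before the card is formatted.
    import hashlib

    def sha256(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def safe_to_format(card_file, backup_file):
        return sha256(card_file) == sha256(backup_file)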

Conclusion
I believe the above gives some ideas about how to protect yourself from the vast majority of eventualities which could cause you to lose files. Once again, it’s not prescriptive though. If you haven’t given this much thought to date, beyond occasionally copying your files to an external hard drive, I suggest you think about which risks you’re still exposed to, and whether it’s worth doing a bit more to mitigate them.

And finally, a very important aspect of any file protection strategy: test it! It’s very easy to set up a whole lot of copying of files to all sorts of places, but you really need to make sure that you can get them back; after all, there’s not much point going to the trouble of making copies in the first place if those copies are unreadable for some reason. So, when you first set up the process, imagine that each problem you’re protecting against has happened, work out what you can do about it, and confirm that whatever backup you’re recovering from is what you expect it to be. The time to discover that you’ve introduced a systematic error in your set-up should not be the first time you come to use it in earnest!

And test again every so often too; apart from anything else, if you’ve automated lots of things, then reminding yourself of what you’ve automated, and what files, in what state, are where, is a necessary, if occasional, task. That’s why I initially wrote this piece: to remind myself of what I’d done and to document how it works to some degree – just in case…
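
A periodic test can be as simple as pulling a random file back from each backup location and confirming it’s byte-identical to the original. A sketch, with made-up paths:

    # Occasional restore test: pick a random file from the archive and
    # check the copy in each backup location is byte-identical.
    import filecmp
    import os
    import random

    ARCHIVE = "/home/mike/photos"  # assumed paths throughout
    BACKUPS = ["/mnt/nas/photos", "/media/zumodrive/photos"]

    all_files = [os.path.join(r, n)
                 for r, _d, names in os.walk(ARCHIVE) for n in names]
    sample = random.choice(all_files)
    rel = os.path.relpath(sample, ARCHIVE)
    for backup in BACKUPS:
        copy = os.path.join(backup, rel)
        ok = os.path.exists(copy) and filecmp.cmp(sample, copy, shallow=False)
        print(f"{copy}: {'OK' if ok else 'MISSING OR DIFFERENT'}")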

Caveat
I definitely don’t claim that these suggestions are one hundred per cent exhaustive, although I do think they’re pretty comprehensive. If there are any gaping holes, you’d be doing me a favour by commenting to that effect, since it means either that I’ve forgotten to mention something, in which case I’ll fix the article, or that my own file protection strategy is lacking, in which case I will very probably fix that too! Thanks.
