
Cory's Blog


May 03
Digital Photography Workflow

As promised, I have had an opportunity to look at some photography workflow stuff and am now going to share my findings with the world. I have a tremendous number of digital photos. As of mid-2011, this collection stood at over 200 gigabytes of image data and something like 24,000 individual files. I didn't touch my camera very often in 2012, but as of the past few weeks, I have somewhere in the neighborhood of twenty or thirty gigabytes of images to add to this collection, and in less than a month, I'll be on vacation.

Background

At the moment, I'm using Adobe Bridge + Photoshop CS4. I moved to this setup after establishing my existing photography organization system in one of my courses at NAU. This system has carried me through three or four computers, and images from three cameras have gone into it.

It was initially established on one of the early Intel-based iMacs – in particular, a 17-incher with a 2.16GHz Core 2 Duo processor, 2 gigabytes of memory, and a 128MB ATi Radeon X1600 graphics processor. This system ran Photoshop CS3 almost as soon as that initial Universal Binary version of the venerable image editing application was released.

In due time, I moved to a ThinkPad R61i, which had a 1.5GHz Core 2 Duo and GMA X3100 graphics, but 3 gigabytes of memory. This system had Creative Suite 4 installed. I later migrated to a ThinkPad T400, still using Creative Suite 4. This system has a 2.53GHz Core 2 Duo, 4 gigabytes of memory, a 256MB ATi Radeon HD 3450, and dual hard disk drives, along with other niceties.

The system has images from three main cameras in it, which have produced a variety of file types – a Kodak EasyShare CX6230 (JPEG), a Nikon D50 (JPEGs, then RAWs), and a Nikon D300 (all RAW).

However, with the purchase of my newest camera, the EOS M, it is clear that it's time to think about what direction this setup should take. Additionally, in the near future, I intend to take a vacation where I'll be generating a significant amount of image data. I have approximately 192GB of flash memory capacity and the new camera will be shooting exclusively RAW, plus 1080p video. Add this to an increased interest in leaving the house and documenting the world around me and it becomes clear that an image storage, organization, and processing system that's efficient and flexible is necessary.

Computer Performance

From 2008 to today, I used the same camera, a Nikon D300. This (along with the bulk, "set it and let it rip" workflow I describe below, in which the time it takes to process an individual image doesn't matter much) has meant that since my purchase of the ThinkPad T400 in 2009, an increase in the amount of computing horsepower (or even disk speed) hasn't specifically been necessary.

The ThinkPad T400 was very high-end when I purchased it, but it's clear it's not feasible to do everything at once on it. In a recent experiment, conducted all too hastily, it stalled out intermittently while using Lightroom 5's "Copy as DNG" function, virtualizing Windows 7 (with 2 gigabytes of its own RAM), and playing music on Pandora. It should remain fine for photography tasks specifically, but it is also old enough that I doubt its display will hold a calibration for too terribly long.

Just for fun, I've already tested Lightroom 5 on one of the slower computers I still have around, a ThinkPad T42p with a 1.8GHz Pentium M and 1.5 gigabytes of RAM, plus a relatively slow spinning disk. I can't tell whether the poor performance was related to the low memory and processor performance of this particular machine, or more to the fact that I had to store the very large library file and original images on a disk connected via USB 2.0.

I currently own two far faster endpoint computers: a Mac mini (a 2011 model with a 2.3GHz Core i5 processor, 5 gigabytes of memory, and a gigabit network connection to my file server, where I've been playing with Aperture 3) and a Sony Vaio laptop, which has a very similar 2.3GHz Sandy Bridge-based Core i5 processor, 8 gigabytes of memory, and a reasonably powerful GPU, along with USB 3.0 and gigabit Ethernet, providing for some interesting workflow possibilities.

More computing horsepower (and very fast storage, either via USB 3.0 or via gigabit Ethernet) will become important as I look at solutions that let me integrate my old work and my new images, which are coming from a variety of sources, such as my phone, my old Nikon D300, and the new Canon EOS M. (Which, it bears mentioning, has a sensor that produces notably more pixels in files almost twice the size of those coming out of the D300.)

Any of these solutions can (and will) be deployed on computing hardware I already have, but several of them I'm also considering in terms of purchasing the software for, and using it on, a new computer (likely a stationary machine).

Existing Workflow

Today, I follow a multi-step workflow.

  1. Copy images from a memory card to the "import" folder at D:\all_image_files\import\
  2. Use Adobe Bridge to rename these images to wiegersma_YYYYMMDD_$CAMERASEQUENCE.ext
  3. Use Adobe Bridge + Photoshop CS4 to convert these files to DNG, typically moving them to another folder, unsorted, on the way.
  4. Use Adobe Bridge to bulk apply base metadata (typically, copyright & contact information) to the newly imported images.
  5. Tag and rate these images. (In theory, this happens during import, meaning that as you move forward you do not have to retroactively tag anything. Needless to say, this hasn't happened at all.)
  6. Sort these photos into approximately four-gigabyte buckets, named dng_$SEQUENCENUMBER
  7. Enter the dng_ folders, select an image to work with, open it, adjust exposure in Camera Raw, and then open it in Photoshop proper.
  8. Save this file in a psd_ folder and do edits to it.
  9. Sharpen and save for export.
  10. Repeat as necessary.
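Steps 1 and 2 of the list above could be scripted. This is only a sketch: the naming scheme is the post's, but the `import_card` helper is my own hypothetical stand-in, and it uses the file's modification time in place of real EXIF capture dates.

```python
import shutil
from datetime import datetime
from pathlib import Path

def import_card(card_dir, import_dir, photographer="wiegersma"):
    """Copy images off a memory card into the import folder, renaming
    them to photographer_YYYYMMDD_sequence.ext (steps 1-2 above)."""
    dest = Path(import_dir)
    dest.mkdir(parents=True, exist_ok=True)
    copied = []
    for src in sorted(Path(card_dir).iterdir()):
        # Only pick up image files; skip sidecar and miscellaneous files.
        if src.suffix.lower() not in {".jpg", ".nef", ".cr2", ".dng"}:
            continue
        # Stand-in for the capture date: the file's modification time.
        stamp = datetime.fromtimestamp(src.stat().st_mtime).strftime("%Y%m%d")
        seq = src.stem[-4:]  # trailing camera sequence, e.g. DSC_1234 -> 1234
        new_name = f"{photographer}_{stamp}_{seq}{src.suffix.lower()}"
        shutil.copy2(src, dest / new_name)  # copy2 preserves timestamps
        copied.append(new_name)
    return copied
```

Step 3's DNG conversion would still happen in Bridge/Photoshop (or the stand-alone DNG converter); this covers only the copy-and-rename stage.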

Needless to say, I typically skip step 5 entirely, and then never do anything beyond step 6. This was particularly true after a harrowing data-related event in 2009, in which I almost lost it all, but it is certainly true when my goal is just to ingest images into "the system" so I can free up memory card capacity.

The file system that came out of this is pretty interesting; it looks like this:

  • D:\all_image_files\
    • dng_files\
      • dng_001\
        • 4.0GB of images; each image holds its own metadata, such as keywords and exposure information.
      • dng_054\
    • export_JPEG\
      • jpg_001\
        • 4.0GB of JPEGs for export. I lost all of mine in 2009, and have since been putting small export batches elsewhere.
    • import\
      • These are fresh off the memory card; sometimes, in the interest of efficiency, I'll pile several shoots' or cards' worth of images in before actually continuing beyond step 1 of the import process.
    • psd_files\
      • psd_001\
        • 4.0GB of in-progress and finished shots that have been worked on in Photoshop.
    • unsorted\
      • These are renamed and have had metadata applied, but aren't yet filed into the dng_ buckets, mainly because of keywording. Hypothetically, this is where I add ratings and keywords.
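The bucket-filling in step 6 could also be automated along these lines. A sketch, assuming files move out of the unsorted folder in name (i.e. chronological) order; the `fill_buckets` helper and its resume logic are hypothetical, not part of the original workflow.

```python
from pathlib import Path

BUCKET_LIMIT = 4 * 1000**3  # ~4 GB, sized to leave headroom on a 4.7 GB DVD

def fill_buckets(unsorted_dir, dng_root, limit=BUCKET_LIMIT):
    """Move files from the unsorted folder into dng_NNN buckets, starting
    a new bucket whenever the current one would exceed the size limit."""
    root = Path(dng_root)
    root.mkdir(parents=True, exist_ok=True)
    # Resume filling the highest existing bucket, or start at dng_001.
    existing = sorted(d for d in root.iterdir()
                      if d.is_dir() and d.name.startswith("dng_"))
    bucket = existing[-1] if existing else root / "dng_001"
    bucket.mkdir(exist_ok=True)
    used = sum(f.stat().st_size for f in bucket.iterdir() if f.is_file())
    for src in sorted(Path(unsorted_dir).iterdir()):
        if not src.is_file():
            continue
        size = src.stat().st_size
        if used + size > limit:
            # Open the next sequentially numbered bucket.
            next_num = int(bucket.name.split("_")[1]) + 1
            bucket = root / f"dng_{next_num:03d}"
            bucket.mkdir()
            used = 0
        src.rename(bucket / src.name)
        used += size
```

The same function would cover larger media simply by passing a bigger limit.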

The system is exceedingly logical and also very portable. It mostly relies on lowercase lettering, no spaces in filenames, and a file-naming scheme that lets you sort your files by name on any filesystem and get them in chronological order.
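The chronological-by-name property is easy to demonstrate: because the date portion is zero-padded YYYYMMDD, a plain lexicographic sort orders files by capture date. The filenames below are invented examples following the scheme.

```python
names = [
    "wiegersma_20110315_0042.dng",
    "wiegersma_20081201_0007.dng",
    "wiegersma_20100704_1234.dng",
]
# A plain string sort puts zero-padded YYYYMMDD dates in date order.
assert sorted(names) == [
    "wiegersma_20081201_0007.dng",
    "wiegersma_20100704_1234.dng",
    "wiegersma_20110315_0042.dng",
]
```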

This system is designed with the presumption that you could, if you wanted to, migrate over to Linux or from Mac to PC and back again without much friction. If you plan your moves well, it's also not hard to find and take your Bridge keyword library with you, and of course, building a good keyword library is almost as much an art form as building a good image library is on its own.

The four-gigabyte buckets are very much a product of when I adopted this system. By 2008, DVD burners were in every single system I owned, I had an external one as well, and 4.7-gigabyte DVDs were sold in hundred-stacks for something like $70. From 2008 to 2011, I ended up burning yearly copies of my entire system out to DVDs as archives and labeling the discs things like "dng_007 // 2009-09-10" so I could tell what content was on each and when that disc had been burned. The idea behind reburning my archives was sort of related to file versioning – while most of the stuff in the dng_ folders wasn't going to change, the stuff in the psd_ folders could have. It also would have served to capture keywords I'd added to older dng_ folders, which was almost always on my to-do list.

The main issue with this system is that it currently encourages laziness – hundreds, nay, thousands of images would build up in the import folder, and I'd only get around to actually doing anything about them (like batch renaming and converting to DNG) when I had time to let my main mobile computer spend a whole weekend converting files. Keywording was then never done, in part because it was difficult and in part because the task of adding comprehensive, useful keywords to a few thousand images at once was very daunting.

Aperture 3

Aperture is just fun to use. Images import off the card and into the library quickly, Aperture's adjustments are powerful and make a lot of sense, and it helps that the machine I'm doing this all on is incredibly speedy for its size, and is almost silent, to boot.

Somebody for whom Macintoshes are at the center of the universe could certainly do far worse than Aperture for managing photos.

The worst part of Aperture is its odd limitations. The first of these is, of course, that projects can only hold so many items. This means Apple encourages Aperture users to limit the number of items in these organizational bins by making it very easy to split them up – by events, by dates, or however you like, really.

The next limitation, and to me by far the more annoying one, is that Aperture just cannot handle having its libraries (or your originals) stored on a network. I would almost understand it if Apple wanted you to use a true Mac OS X Server computer as your network storage, but even that doesn't work. The solution I've used, which is certainly a kludge but works better than I expected it to, is to create a 500-gigabyte sparse DMG file, which I then mounted before copying over the Aperture Library. This will, of course, be very messy in the next few years when I acquire more than 500 gigabytes of digital media.

The advantage here is that I do get the advantages of storing these files on TECT, and potentially of the backup and archival system it will eventually have. The disadvantage, of course, is that I don't know how any backup software I may use on TECT will handle a very large sparse DMG file, and of course, I don't know what it's like to expand sparse DMGs after they've been created. It may be that as I move forward I'll curse myself for having only made the DMG a half a terabyte, as I create a new one at some ridiculously large size and move the library therein.

The library is another sore point for Aperture and myself. When I first started the current workflow as listed above, I had been importing my photos and then deleting them from other media, as though I believed I'd be using Aperture forever. (To be fair, it has always been a nice enough product.) Getting my images out of Aperture was, to put it lightly, "interesting," and I ended up swearing to myself I'd never go back.

And I might not, because most (100%) of my existing photos are stored in Adobe's open "digital negative" (or DNG) format, and Apple dislikes Adobe enough that they've incorrectly implemented support for it in Aperture, iPhoto, and Mac OS X at large. A move to Aperture therefore implies keeping a whole computer or application around just to get at these photos the "old" or "archival" way, since they can't be migrated into Aperture.

This is sad because, after running the past several dozen gigabytes' worth of photos from both the D300 and the EOS M through it, I've found Aperture to be smooth and, maybe more importantly, fun. It's my sincere hope that my trials with Lightroom over the next week or so prove that it can be fun too.

Lightroom 5

The thing that struck me almost immediately about Lightroom 5 is that it successfully rolls most of the steps in the "Existing Workflow" into a single step. Images fly directly off the card and into a folder structure that Lightroom has established for you. Unlike Aperture, everything is held in standard folders on the filesystem, named however you tell Lightroom to name them. In my case, I've used wiegersma_YYYYMMDD_$CAMERASEQUENCENUMBER.ext, just like in my old system.

Lightroom wants a lot of display space, however, in order to show all of its interface elements at once. I've tried it so far on computers with display resolutions of 1366x768, 1440x900, 1680x1050, and 1600x1200. So far, the computer with the 1680x1050 display (a "regular" 15-inch unibody MacBook Pro) offered the best mix of computing horsepower and display size. Having a full 1200 pixels of display height to work with is certainly good, but not strictly necessary, I found. Even after having "learned" the Lightroom interface on a big display and applied that knowledge on smaller displays, it bears mentioning that Lightroom still wants a large display in a way that Creative Suite, with its far more flexible interface setup, and Aperture, with its smaller interface that uses floating palettes, just don't.

One of the things all of those pixels are used for is tagging and rating the photos. Tagging photos is important because it can help you find images in a collection of tens of thousands (or potentially hundreds of thousands, but I haven't reached that point yet). I have been inconsistent at tagging in the past, but all of the tags I did have previously came through to Lightroom, with their structure intact – probably because it complies with whatever metadata standards Adobe may or may not have helped invent.

Filtering photos in Lightroom is actually really fun and is a) a good way to waste a lot of CPU time and make an innocent laptop spin its fans up a lot, and b) the complete realization of why I tagged my images to begin with.

Once the images have been imported and then tagged and rated (which, because Lightroom views an entire photo collection as a single body, can be a continuous process anyway), they move into the other "modules" of the program for refinement, localized editing and filtering, and then export and sharing or printing. The single-program integrated workflow is certainly appealing and it's something I'm going to be taking a detailed look at on my various computers as I both go through my old photos to tag and rate them, as well as capture new images and bring them into the program and work on tagging and rating them during that ingestion process.

Lightroom has a lot of other nifty functionality, such as creating "picture packages" for printing, sending photos to various online services, and almost anything in the program can be stored as a preset. Want to make contact sheets of the last set of imported images? If there's not already a preset for that, you can make it. Lightroom, like Aperture, also has functionality for exporting images to web sites. Unlike Aperture, it doesn't hook directly into iWeb, but this is probably not a disadvantage at this point, just because although I find iWeb incredibly interesting, it's not really particularly practical for anything I want to do at the moment.

Bridge + Photoshop CS6

This (or its variant, "use the stand-alone DNG converter") is the laziest option and the one least likely to produce a setup where anything changes from my existing workflow, in which I'm not particularly interested in processing photos because it's always a daunting task.

It was pointed out to me that I could even adapt my workflow, either with CS4 or with a new version of Photoshop and Bridge, in clever new ways that acknowledge this is 2013, rather than 2008. The prime example of this is that I own Blu-ray burners now and could build my "buckets" in 23-gigabyte chunks to fit on that media. However, I think it would be remiss of me to try to think about preserving my photos separately from preserving all of my data, and I'm going to look closely at where the image data should be stored in order to make accessing it, and preserving it, quick and easy.
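A quick back-of-the-envelope comparison of the two bucket sizes, using figures from this post (the 250 GB library size is the upper end of my own estimate):

```python
# Usable bucket sizes from the post: ~4 GB for 4.7 GB DVDs,
# ~23 GB for 25 GB Blu-ray discs.
library_gb = 250
dvd_bucket_gb = 4
bd_bucket_gb = 23

# Ceiling division: partial buckets still need a disc.
dvds_needed = -(-library_gb // dvd_bucket_gb)
bds_needed = -(-library_gb // bd_bucket_gb)

print(dvds_needed, bds_needed)  # 63 discs vs. 11
```

An annual archive burn drops from a spindle of DVDs to a handful of Blu-rays, which makes the yearly-reburn habit far less painful.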

The other issue with the buckets, which I haven't yet worked out, is whether you can specify a single parent folder from which to find images using metadata. In fact, in the Creative Suite, I often found myself applying keywords only to never use them again. If I liked a particular image, I might go through my folders one by one and look for rated images, or I might just have written the particular image I was looking for or wanted to remember down on paper.

In fact, in the earliest form of this workflow, I had books of printed contact sheets of photos, which I would thumb through and then mark on the pages, or flag with sticky notes, to indicate which image I wanted.

What to Do

It's hard to choose a photography workflow at the drop of a hat. The one I'm using today is the result of a reasonable amount of refining and is probably the third or fourth system I've had anyway. The hardest part is choosing a system that'll accommodate the thousands of images I already have. In some ways, it's tempting to tear it all down and start over completely; in others, it's tempting to either continue as I have, or adopt the "new" solution that aligns most closely with what I already have, to the point of being from the same vendor and using a lot of the same conventions, just by default.

The unfortunate fact is that Aperture and Lightroom are each more fun than working on images in the traditional Creative Suite workflow, which means I am more likely to actually do something with them. It has been almost two years that I've been working my images doubly, in both Creative Suite and Aperture, and well over a year since I've actually done anything in Creative Suite. Those images I've just left in the Import or Unsorted folder and done nothing with.

I believe that in the interest of actually doing things with the images I capture, I'm willing to take a hit on the "technical correctness" of my setup.

A Quick Thought about Data Storage

My photography collection is one of the first things that really got me thinking about what it means to have original data that's worth protecting, and as such is one of the few datasets I have that's really organized with preservation and archival in mind. The preservation I currently do on it is to burn the whole thing to DVD once each year, which I have been a little lax on this year and last. The archival element is that hypothetically, I could take some of the oldest folders which are as "complete" as they'll ever get (in terms of tagging and finding new images worth working, exporting, and sharing) out of the active rotation. This means that I could hypothetically dump the folders dng_001, dng_002, and dng_003 (just as examples) out onto DVDs or Blu-rays (or a tape, whatever) and delete them from my hard disk drive. This is, I presume, how folks using this system on laptops or other systems with smaller disks cope.

Lightroom and Aperture are almost certainly capable of similar methods of "archival" – Lightroom even uses a function called Smart Previews to let you keep a "good enough" copy of the image that only takes a few megabytes (compared to 20 or 30) while storing your original DNG files offline (or on a file server). My only problem with this is that it makes multi-computer workflows more difficult, and today, items like four-terabyte external hard disks for active work, or backup and archival, are veritable commodities. (Plus, my library is "only" 200-250 or so gigabytes at the moment – I've got a long way to go before the limit of a four-terabyte single disk is an issue.)
