Which Version Do I Color Correct?

(Update from September 2nd: Rarevision has released beta software, 5DtoRGB, that addresses many of the issues discussed in this article.  If you’re interested in learning more about 5DtoRGB, please read my follow-up post on the subject, “5DtoRGB Color Tests.”)

When shooting with an HDSLR, capturing high-quality footage onto those CompactFlash cards can sometimes feel like a monumental task: which lens to use, how to light, how to maintain focus, etc.  But, in a way, working through color space issues in post-production is often just as challenging.  Getting footage to look the way we thought it appeared on set is easier said than done, and it’s a good idea to put one person in charge of the footage from the time it’s shot to the time it’s encoded for DVD or uploaded online — otherwise there could be unacceptable color shifts at some point along the way.

We’ve all heard the stories: it didn’t look like that when I shot it, there’s some sort of gamma shift going on, it looks one way in YUV color space and another in RGB, the H.264 client review file is much brighter than the original, the DVD is too dark.  These anecdotes are so commonplace in the video business, at every budget level, that it’s a wonder no company has yet stepped up to own the entire workflow color issue from production to final delivery.

As far as I can tell, on Macs at least, HDSLR footage will usually look one way in RGB (in Quicktime 10, for instance) and another way when the Final Cut color compatibility preference is selected in Quicktime 7 (last preference on the bottom):

When Canon released its Final Cut Pro plug-in recently, I had high hopes for it.  Maybe, I thought, colors would finally get baked in, and we could feel secure that the footage we were seeing in our Final Cut timelines could serve as the master from which to color correct.  But after trying it out a few times, I realized it was simply converting to ProRes through Quicktime, while at the same time retaining some additional timecode data.  The timecode is appreciated, but a standard YUV export is not really a big deal (I know Canon is going to be rolling out an actual new codec soon, and we’ll see if they make it available to HDSLR users).

In this post, I’ll go over a few color issues I’ve dealt with recently, and mention some possible workarounds.  In general, what I would say about HDSLR footage is that it’s crucial to know in advance how you will be delivering it to your clients, because if you leave any major problem-solving to the last minute, you’re certain to miss your deadlines.

Know your workflow, from beginning to end.  When you are planning and budgeting a project with your client, be aware that you will need to show them cross-platform compatible review files, for instance, and you’ll want the colors in those files to accurately reflect what the source footage looks like.  Are you delivering a DVD, exporting to FLV, uploading to YouTube?  Don’t wait until the last minute to see how your footage will look in all applicable formats. If you notice that your DVD version is coming out dark at around the same time that the FedEx store is about to close, you’ll need to choose between missing a deadline and sending out a DVD that looks too dark — and that’s not a choice you should have to make.

Another part of post-production that tends to confuse the issue is brand loyalty to Apple.  Who doesn’t love a smokin’ fast MacBook Pro loaded up with Final Cut and Compressor?  But will somebody please remind me again why ProRes footage can’t play nice with RGB, or why exporting to H.264 still results in a gamma shift?  There’s nothing that I’ve seen in Snow Leopard, with its new system-wide switch to gamma 2.2, that seems to have done anything to remedy these issues.  I suppose I’m grateful that Macs and PCs finally share the same default gamma levels, but trying to navigate the myriad of remaining unresolved color issues can be maddening.
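If you’re wondering why a simple gamma mismatch shows up as such a visible brightness change, here’s a toy calculation.  To be clear, this is just illustrative math, not the actual code path inside Quicktime or Final Cut: it simply shows what happens when a pixel is encoded with one gamma in mind and displayed with another.

```python
# A toy illustration of the 1.8-vs-2.2 gamma mismatch -- not any app's actual
# pipeline. The same 8-bit code value produces different light depending on
# which gamma the display (or the decoder) assumes.

def luminance(code_value, gamma):
    """Convert an 8-bit code value to displayed luminance for a given gamma."""
    return (code_value / 255.0) ** gamma

mid_gray = 128
intended = luminance(mid_gray, 1.8)   # what a gamma-1.8 workflow expects to see
shown    = luminance(mid_gray, 2.2)   # what a gamma-2.2 display actually produces
print(f"intended: {intended:.3f}, shown: {shown:.3f}")
# intended: 0.289, shown: 0.220 -- the untouched file reads visibly darker in the mids
```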

By the way, speaking of H.264, what’s often overlooked in the current debate over whether the internet should dump Flash in order to accommodate the iPad is that H.264, Apple’s recommended alternative until HTML5 has matured, is notoriously inaccurate when it comes to color.  It’s fine to feel the love for those awesome folks from Cupertino, but the fact remains that if we needed to rely exclusively on Apple to meet all our post-production needs, we’d be in a whole heap of trouble.  If not for MPEG Streamclip, Innobits’ BitVice, and BitJazz’s Sheer codec, for instance, I’m not sure I’d be able to maintain consistent and accurate colors throughout the entire post-production process.

Let’s begin with a few assumptions: you’ve chosen a Canon picture style that you’re happy with on your 5DmkII or 7D; you’ve white-balanced, either using the temperature dial or by some other custom method; and you’ve exposed your shot correctly.  Sounds straightforward enough in theory, but real-life conditions often intervene.  Such was the case on a recent shoot I had in Portland, Oregon, with members of the band Caguama.  The interview took place at the Alberta Street Pub, a fantastic but extremely underlit bar.  Walking around the location, I quickly determined that the best background for the interview would be the stage, which was lit with low-wattage orange lights.

I lit the foreground with two diffused, daylight-balanced Lowel Omnis, white balanced to a card I’ve been using for a zillion years, and managed to get a look on set I was happy with.  The orange lights in the background served as an interesting visual element, and the walls above the paneling had a pleasant warm feel.

A few more assumptions, now on the post-production side: you’ve color calibrated your editing monitor, or at least you’ve chosen a preset that works for you (in case you’re wondering, I use NTSC 1953).  Whether you’re working in gamma 1.8 or 2.2, you feel comfortable that the footage you’re sending to clients generally looks the same to them as it does to you.

Okay, so after returning to Boston, I went through my usual routine when it comes to post: I dumped the footage into the batch window in MPEG Streamclip and exported it to ProRes HQ 422.  I’ve been using ProRes for about two years now, and have gradually come to rely on it (unfortunately, I haven’t had a chance to use the new 4444 codec yet).  It allows for many realtime effects in Final Cut, and, for the most part, handles color well.  But, as with any video file on a Mac that needs to jump from YUV color space to RGB, or vice versa, you need to pay close attention to your end results.

For as long as I’ve been shooting on the 7D, I’ve noticed a few things about the footage: the RGB version in Quicktime 10, before it’s been converted to ProRes, is usually a bit drab and desaturated:

and the “YUV” version in Quicktime 7 (with FCP compatibility selected) is always warmer and more vivid…

…though it never ends up looking exactly the same once it’s been imported into the Final Cut timeline.  I put YUV in quotes because H.264 footage from the Canons isn’t truly YUV until it’s been converted to ProRes.
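To give a sense of why that conversion step matters, here’s a rough sketch of the math involved in moving a single pixel from Y'CbCr to RGB.  This uses the standard Rec. 709 formulas, and it is only an illustration, not the actual code inside Quicktime or Final Cut, but it shows how a single decision, video range versus full range, changes what lands on screen.

```python
# A sketch of a Y'CbCr-to-RGB conversion using standard Rec. 709 math.
# Illustrative only -- not Quicktime's or Final Cut's actual code -- but it
# shows how the video-range vs. full-range decision alone changes the picture.

def ycbcr709_to_rgb(y, cb, cr, video_range=True):
    if video_range:
        # Video ("studio") range: Y' in 16-235, Cb/Cr in 16-240
        y_n, cb_n, cr_n = (y - 16) / 219.0, (cb - 128) / 224.0, (cr - 128) / 224.0
    else:
        # Full range: Y' in 0-255, Cb/Cr centered on 128
        y_n, cb_n, cr_n = y / 255.0, (cb - 128) / 255.0, (cr - 128) / 255.0
    r = y_n + 1.5748 * cr_n
    g = y_n - 0.1873 * cb_n - 0.4681 * cr_n
    b = y_n + 1.8556 * cb_n
    clamp = lambda v: max(0, min(255, round(v * 255)))
    return clamp(r), clamp(g), clamp(b)

shadow_pixel = (30, 128, 128)  # a dark, neutral pixel straight from the decoder
print(ycbcr709_to_rgb(*shadow_pixel, video_range=True))   # (16, 16, 16)
print(ycbcr709_to_rgb(*shadow_pixel, video_range=False))  # (30, 30, 30) -- lifted blacks
```

Neither answer is wrong in isolation; the trouble starts when one program in the chain assumes one convention and the next program assumes the other.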

For the bulk of these tests, I created a test clip out of a piece of original, unaltered H.264 footage.  I opened the original clip in Quicktime 7, deleted all but a second or two of it, and, using the “Save As” function, renamed it “Test Clip.”

Regarding the original Canon footage as it appears in both Quicktime 10 and Quicktime 7: I’m not saying that a desaturated look isn’t pleasing.  As a matter of fact, some people may prefer a desaturated look over a warm look.  But that’s something you will want to have the flexibility to decide on at a later time, during color correction.  At this stage, all I’m interested in is figuring out how to get footage that will continue to match as it travels from Quicktime to Final Cut to After Effects and finally DVD and the internet.  At this point, it almost doesn’t matter whether the image is too warm or the exposure could be boosted — what matters is that later, when I color correct in Final Cut, Color, or After Effects, I will be able to trust that the changes I make to the footage will be maintained across all the various programs.  There’s no point in color correcting in Final Cut if the footage will look different once it gets back to Quicktime, or in color correcting in After Effects if it will look different in Final Cut.  We need a workflow that works for all our destinations.

So the next step is obviously to convert to ProRes HQ.  Here’s where it gets interesting.  First, I’ll show you the Quicktime 7 version, which accommodates YUV color space:

Nothing’s changed.  So far so good.

But here’s the RGB version in Quicktime 10:

It’s darker and even warmer.  You may think that as long as you work in Final Cut, how a clip looks without YUV compatibility is irrelevant.  But what you will soon see is that the way footage looks in pure RGB color space (Quicktime 10) is often a precursor to how other RGB programs, such as After Effects, or even, bizarrely, Apple’s own Compressor, will receive the clip.

Here’s another point to keep in mind regarding Quicktime 10: Quicktime 7 (or Quicktime Pro, as I still call it) doesn’t even come factory-installed on new Macs anymore.  You need to manually install it from your Snow Leopard DVD.  Either Apple is trying to phase it out, or they think the only people who require the export functions of Quicktime Pro are video professionals, not the average casual user.

So your client may never watch a single video in Quicktime 7, and, even if they did, would they really have selected FCP compatibility in the preferences?  Let’s take it a step further: many editors (including myself) are using the NTSC (1953) preset to color calibrate their work monitors.  But you can be certain that your client’s monitors aren’t calibrated this way, and, anyway, the colors you’re seeing in your ProRes footage haven’t yet been baked in.  Change the color calibration preset on your monitor and you’ll change the temperature and gamma of your footage.

Anyway, let’s keep going.  The next step is to bring your new ProRes clip into Final Cut (I should mention that I’m still working with Suite 2.  If the Suite 3 upgrade solves all these issues for you, then now’s a good time to stop reading this post):

Wait.  What happened?  I thought we enabled FCP compatibility in Quicktime 7 — why should it look any different in Final Cut?  It’s too dark and too yellow now.  So, just to keep score, there’s the way it looks in QT 7, the way it looks in QT 10, and now the way it looks in Final Cut.  Somebody is seriously trying to mess with us.

Maybe there’s a solution to this and maybe there isn’t.  I did some more reading online and slowly came to realize that it’s perfectly normal for something to look one way in Quicktime and another way in Final Cut.  But which version should you consider your master?  Which version should you color correct?

Ready for more?

Import your ProRes clip into After Effects:

Too dark and warm again, like what we were seeing in Quicktime 10 when we opened the ProRes clip (it’s actually somewhere in between the ProRes QT 10 version and the Final Cut version).  And it doesn’t help that you’ll need to export this footage from an RGB program back to ProRes, to complete the round-trip journey back to Final Cut.

In case you think I didn’t try choosing NTSC (1953) color space in After Effects…

…think again:

It’s definitely an improvement, but it’s not tracking back to any of my other versions.  In other words, I still can’t color correct in one program and be certain that my look will carry over to the next.  When it comes to HDSLR ProRes footage that needs to return to Final Cut, you simply can’t color correct in After Effects with any degree of certainty.  Remember, at no time during this process have I color corrected this clip in any way whatsoever.  These shifts are happening behind the scenes, whether I want them to or not.
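Incidentally, if you’d rather put numbers on these shifts than keep squinting at screenshots, one approach is to export the same frame as a still from each program and compare the channel averages.  Here’s a small diagnostic sketch; it assumes you have Python with Pillow and NumPy installed, and the filenames are just placeholders:

```python
# Compare the same frame exported as a still from two different programs.
# Purely a diagnostic sketch; the filenames below are placeholders.
from PIL import Image
import numpy as np

def channel_means(path):
    """Average R, G, B values across the whole frame."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
    return img.reshape(-1, 3).mean(axis=0)

a = channel_means("frame_quicktime7.png")
b = channel_means("frame_final_cut.png")
print("version A (R, G, B):", a.round(1))
print("version B (R, G, B):", b.round(1))
print("difference (B - A): ", (b - a).round(1))
# All three channels lower reads as "darker"; red up and blue down reads as "warmer."
```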

Care to make an MPEG-2 in Compressor?  Your client needs a DVD, right?

It looks similar to that first After Effects version, before trying the switch over to NTSC color space.  Now let’s see how it looks encoded in BitVice (with Gamma 2.2 compensation checked):

Same deal.

Now let’s go to H.264 for client review and YouTube upload versions.

Wait, this can’t be right.  Didn’t we leave Quicktime 10’s drab look behind when we left Canon’s H.264?  The last time we looked at this clip in Quicktime 10, after it had been converted to ProRes, it was too warm and dark, and now, simply by returning to H.264, it’s been compromised.

If you’re looking for an alternative to H.264 for YouTube versions, I happen to be a big fan of the AVI exports that come out of MPEG Streamclip.  (This only seems to work with HDSLR footage: with most other types of video, you’ll likely get a slight gamma shift on AVI exports viewed in Quicktime 10.  The truth is, there aren’t really any good options for lower-res Quicktime exports on the Mac side.  AVI file sizes will far exceed those of MP4s and WMVs, which look great if you can live with the slight brightening in QT 10.  Maybe one day I’ll pick up Parallels or some other PC virtualization software, along with some kind of PC video export tool, and try exporting to WMV and opening the file in QT 10.)  Here are my 1280×720 settings:

The file size will be huge (you could experiment with the quality slider to bring the size down), but, under YouTube’s ten-minute running time limit, you should be okay.  Let’s look at an AVI export from a ProRes clip:

This AVI matches the colors of the ProRes version that I most preferred in Quicktime 7 (not how the same ProRes clip appeared in the Final Cut timeline).  I definitely believe that, for HDSLR footage at least, AVIs are worth considering as client review files (maybe at 640×360, 50% quality) and YouTube uploads.  With most other types of footage, as I mentioned, your clip will probably still look a tad brighter in QT 10, but so far I’ve been able to maintain my colors with AVIs coming out of footage shot on the 7D.
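About that “huge” file size I mentioned a moment ago: some quick back-of-the-envelope math gives a sense of what you’re in for.  The bitrate below is purely an assumption for the sake of illustration, since what you actually get depends on the codec, the quality slider, and the footage itself.

```python
# Rough file-size math for a high-bitrate AVI export. The 40 Mbps figure is
# only an assumption for illustration; your actual data rate will vary.
def file_size_gb(bitrate_mbps, duration_minutes):
    bits = bitrate_mbps * 1_000_000 * duration_minutes * 60
    return bits / 8 / 1_000_000_000  # decimal gigabytes

print(file_size_gb(40, 10))  # 3.0 -- about 3 GB for a ten-minute clip
```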

Just out of curiosity, let’s see what happens to the AVIs if they are run through Compressor and BitVice.

Compressor:

BitVice:

So, even though these were encoded from the AVI version, not the ProRes version, Compressor returned the color information to what we first saw in After Effects, while BitVice was able to handle the same information more accurately.

Okay, let’s return to the central issue here.  Which version of this clip should serve as the master from which to color correct?  The short answer is: none of the above.

By the way, in case you’re wondering about attaching an NTSC (1953) ColorSync profile filter from Quicktime 7 to your ProRes footage, don’t bother.  Here’s the filter in question:

You will drive yourself even crazier trying to juggle any additional variables.  Seriously, color correcting ProRes 422 footage may be hazardous to your health.

My advice, for whatever it’s worth, is not to color correct until the client has approved the final cut.  Then, once the picture is locked, get out of YUV.  Export your ProRes timeline to an RGB codec such as Sheer’s 8-bit RGB, or, if you’re more courageous, use Media Manager to export the actual edited clips, and then batch convert those from within MPEG Streamclip to Sheer.  The resulting files will be more than twice the size of their ProRes counterparts, but the results are worth it.
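If you’re curious where that “more than twice the size” figure comes from, the arithmetic is straightforward.  ProRes HQ targets roughly 220 Mbps at 1080p30 (Apple’s published ballpark), while uncompressed 8-bit RGB at the same frame size and rate is around 1.5 Gbps.  How much a lossless codec like Sheer shaves off of that depends entirely on the footage, so the 2:1 ratio below is just an assumption for illustration:

```python
# Ballpark data rates behind the "more than twice the size" claim.
width, height, fps = 1920, 1080, 30
uncompressed_mbps = width * height * 3 * 8 * fps / 1_000_000  # 3 channels x 8 bits
assumed_lossless_ratio = 0.5      # an assumption, not a published spec for Sheer
sheer_estimate_mbps = uncompressed_mbps * assumed_lossless_ratio
prores_hq_mbps = 220              # Apple's rough figure for 1080p at 29.97
print(round(uncompressed_mbps), round(sheer_estimate_mbps),
      round(sheer_estimate_mbps / prores_hq_mbps, 1))
# 1493 746 3.4 -- even a 2:1 lossless squeeze is still over three times ProRes HQ
```

The exact numbers will vary with the footage, but the general point stands: going to lossless RGB costs real disk space.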

Here are my RGB 8-bit settings:

It’s true that, as we saw with H.264, Sheer footage will also get washed out slightly in Quicktime 10:

But H.264 is a delivery codec.  Sheer is an editing codec.  You will probably never send your client a Sheer-encoded file to review.

What I care about is how it looks in Final Cut…

Hey, that looks pretty good.  Here’s the original Quicktime 7 ProRes file again for comparison:

It’s not an exact match, but it’s close (and you could easily compensate for the difference in saturation).  Also, you can change the sequence settings in Final Cut to always render in RGB for a specific sequence:

After Effects, as you might expect from an RGB program, is no problem, though you will need to select “Preserve RGB” in the Interpret Footage Window:

And once you’ve matched your colors in After Effects, the door to Photoshop Image Sequences is open to you as well:

AVI?  Still good:

Care to see the H.264 export from a Sheer file?  Okay, but don’t say I didn’t warn you.  Cover your eyes if you’re squeamish.

Quicktime 10, desaturated and drab, now with dreaded gamma shift:

Even Quicktime 7 can’t escape the gamma bug.

So don’t go from Sheer to H.264 if you can help it.

Compressor?

Interesting.  Dark again, and warm.  Whatever benefits we gained from going to Sheer’s RGB color space seem to have been lost.

The BitVice compression fares better as far as color is concerned, but it’s also too dark:

Try this as a workaround: make your DVD in BitVice from your 1280×720 AVI version, not your 1920×1080 Sheer version:

That’s more like it.

Obviously, you’ll need to check your work on a PC and an HD television, though I think you’ll be happy with the results.  If you tried this procedure and something didn’t turn out the way you expected, please let me know about it.  I did my best to catch any mistakes on my end, but it’s possible I may have missed something.

To repeat, at no time in this process have I done any color correction to this clip.  I simply converted it from the camera’s original H.264 to ProRes HQ, and from there to Sheer 8-bit RGB and AVI.  It’s a repeatable HDSLR workflow that hopefully will give you consistent, accurate colors from the rough cut all the way to the DVD and HD deliverables.

Sure, life would be a lot easier if we never had to worry about these behind-the-scenes color fluctuations — if we could shoot, import into post-production software, and encode for clients without ever seeing our colors shift — but that’s just not the world we live in.  For now, I’m going to stick with this workflow, until the inevitable day arrives when it fails me completely.

19 responses to “Which Version Do I Color Correct?”

  1. Pingback: Color correcting DSLR footage on a Mac is a clustercuss « NoFilmSchool

  2. Pingback: Tweets that mention Color correcting HDSLR video footage can be a bitch and half. Here's one person's solution to the gamma shift -- Topsy.com

  3. Thanks for a detailed breakdown of these issues. I’ve been amazed that they’re still such a problem and that there are so few solutions.

    Have you tried the free x264 encoder for your web outputs? I need to deliver H.264 for the web and I’ve found this one doesn’t encode the gamma-shift switch, leading to more consistent gamma online.

    Also, I wonder how Adobe’s Premiere CS5 workflow through AE and Encore handles all this. Any thoughts?

    • I’ve tried x264 in the past, but not in this context. I will try it and see what it does. Regarding Premiere CS5, it is definitely intriguing. As a matter of fact, I did install the 30 day demo while writing this post. The problem is that I quickly came to realize I wasn’t anywhere near ready to write about it. I think I need to play with Premiere for a few more hours at least and report back. What I was seeing, though, is that the gamma was spot-on. The color seemed to get a bit yellowish, but maybe there is a setting or preference that would address this. I’ll probably revisit this issue in the next few months, and this will be one of the remedies I consider. Thanks for your comment!

  4. I found things improved with Snow Leo and FCP7. It’s too complex to figure out what changed exactly.

    Having said that, I also felt like I could see the future: all the gamma issues are solved; my master looks incredible, the H264 viewing copy looks great; and then my heart is broken when I see it on Vimeo. Or someone’s POS “HD-Ready” TV.* “If only you could see what I see,” my little heart screams.

    Well, no one can and they never will. Compressing the crap out of a file and then compressing the crap out of it again and then viewing it in SD on a POS computer display that was selected by price by ‘most people,’ who don’t know or care will result in disappointment. That’s how it is. Practice some Zen.

    It’s a very personal observation. I combat obsessive compulsive tendencies :-)

    *At work we have a very big HD-ready plasma TV that I would NEVER use to show anyone video on. There is no workaround to it stretching, and it’s not HD. It shrinks the image, then blows it up and distorts it. People think it’s good because it’s big, but it’s really the same as projecting an iPad display about 5 feet across.

  5. Pingback: correction » Twitter Trends

  6. Jerome, thanks for the interesting article. Did you ever feed your FCP signal to a properly calibrated broadcast monitor via an I/O device (AJA card, Blackmagic card, Matrox, etc)? That’s the only way I trust to evaluate any video.

    • Thanks Jay. I don’t really come from a broadcast delivery background. All my clients want DVD or internet versions of their videos. What would you say is the better investment from a color standpoint: an external device or the FCP 7 upgrade?

      • Jerome, it’s my opinion that from a color standpoint, an I/O device and a broadcast monitor are certainly a better investment. Of course, like any investment, you have to ensure a return. Keep in mind that even a modest monitoring setup – the Matrox MXO and an Apple Cinema Display acting as a broadcast monitor, for example – will cost several thousand dollars.

        Whether or not that cost is “worth it” is an individual choice each editor must make.

        I still use a broadcast monitor for my non-broadcast work. That means, though, occasionally having to tweak color for the ultimate delivery format, be it Quicktime, Flash, YouTube, etc. I think of this tweaking in the same way I think about the sound mix – there isn’t necessarily one that works across the board (i.e., I’d mix the same show differently for theatrical release than for the DVD release, differently again for a web release, etc.).

        In the end, though, if the clients are happy that’s really all that matters.

        Thanks again for the interesting article.

      • Thanks for the input, Jay. That is a pretty sizable investment. For the time being, I’ll keep using my workarounds, but at some point I will need to make the leap to a broadcast monitor and I/O device.

  7. I would question the use of NTSC as a working monitoring space for these files. Firstly, all HD material is normally in REC709 color space unless it’s more high-end and has a wider gamut designed for film-out.
    By monitoring in NTSC you are asking each program to apply a gamut transform to squeeze one gamut and gamma into another. As each will have its own approach to solving this problem, each comes out with slightly different results. Also, few computer monitors can actually display the full NTSC gamut, which tends to have more saturated colors. On most monitors what you see will be an approximation of what you might see on an NTSC monitor, but shown within the more limited range of an sRGB display.
    There is also a reason the old joke of NTSC standing for “Never Twice the Same Color” came into being.
    sRGB and REC709 share pretty much the same color space / gamut and I think were designed to be close enough to work together well. I would only suggest working in NTSC if you actually have a monitor and output device capable of handling it; you would need this for final DVD authoring to avoid nasty surprises.

    At the end of the day, though, we put in all this work to give something a certain look end to end, and it gets displayed on an end user’s uncalibrated computer monitor or TV. We just have to do the best we can where we can.

  8. Pingback: Is Apple Abandoing the Professional Creative Commuity?
