Friday, May 16, 2014

What's the best setting for Gamma with Rec.709 video?

I've been calibrating TV displays for around twenty-five years and have always leaned on the BBC practice of setting peak white at 80 cd/m² and the white point at 6504 K. For all the decades of standard-definition work a display gamma of 2.2 was used. This is needed because television cameras don't have a linear transfer characteristic: to make a black-to-white ramp match (i.e. it looks the same on the monitor as it does to the naked eye) the display has to invert the gamma encoding applied in the camera.
If you get this wrong the blacks and whites still match but all the shades in between don't: pictures either look washed out or have all the detail crushed out of the dark areas, depending on which way the mismatch goes.
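To make that concrete, here's a minimal Python sketch of the round trip, assuming an idealised pure power law (real broadcast camera curves add a small linear segment near black, which I'm ignoring here):

```python
# Why the display gamma must invert the camera's encoding (idealised power law).

def camera_encode(scene_linear: float, gamma: float = 2.2) -> float:
    """Camera applies roughly the inverse of the display gamma (~1/2.2)."""
    return scene_linear ** (1.0 / gamma)

def display_decode(signal: float, gamma: float = 2.2) -> float:
    """Display raises the signal to its gamma, restoring scene-linear light."""
    return signal ** gamma

# A grey ramp only round-trips back to its original values when the display
# gamma matches the gamma the camera encoding assumed.
for scene in (0.1, 0.25, 0.5, 0.75, 1.0):
    signal = camera_encode(scene)
    print(f"scene {scene:0.2f} -> signal {signal:0.3f} -> "
          f"display@2.2 {display_decode(signal):0.3f}, "
          f"display@2.4 {display_decode(signal, 2.4):0.3f}")
```

Decoding the same signal at 2.4 instead of 2.2 pulls the mid-greys down, which is the crushed-shadows look described above; decoding at a lower gamma than the camera assumed gives the washed-out look.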


When Rec.709 came along in the early nineties (for HD television) a few things changed: new colour primaries, a new luma transfer function and so on. It also made a subtle change to the way gamma is handled. Like sRGB (the standard colour space for computer graphics work) and Rec.601, a "scene-referred" encoding equivalent to a gamma of roughly 2.2 is specified at the camera (in 1992 TV cameras were still tube-based devices), but the display-side gamma was left loosely defined and is now governed by a more complicated transfer law which many people have taken to be closer to a gamma of 2.4. However, a straight power curve of 2.4 is correct only if the display has a zero black level and an infinite contrast ratio, which no real-world display has. The BT.1886 specification (from 2011) is more involved, and its precise recommendations vary depending upon the white level, and especially the black level, of the display.
However, if you don't want to bother with a precise BT.1886 calculation, a best approximation of it is an effective gamma of around 2.2 at the low end of the curve rising towards 2.4 at the high end. In fact it's even messier than that: the curve is effectively linear over roughly the first 10% of the range.
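As a rough illustration (not text from the standard; the 100 cd/m² white and 0.01 cd/m² black below are just example values, and the "effective gamma" measure is my own crude comparison), here is the BT.1886 reference EOTF in Python and the power-law exponent it behaves like at a few signal levels:

```python
import math

GAMMA = 2.4  # exponent used by the BT.1886 reference EOTF

def bt1886_eotf(v: float, lw: float = 100.0, lb: float = 0.01) -> float:
    """Map a normalised video signal V (0..1) to screen luminance in cd/m^2.

    lw = luminance at peak white, lb = luminance at black (example values).
    """
    lw_root = lw ** (1.0 / GAMMA)
    lb_root = lb ** (1.0 / GAMMA)
    a = (lw_root - lb_root) ** GAMMA        # overall gain
    b = lb_root / (lw_root - lb_root)       # black-level lift
    return a * max(v + b, 0.0) ** GAMMA

def effective_gamma(v: float, lw: float = 100.0, lb: float = 0.01) -> float:
    """The pure power-law exponent that would give the same relative
    luminance (L/Lw) at this signal level -- a crude comparison metric."""
    return math.log(bt1886_eotf(v, lw, lb) / lw) / math.log(v)

# Only a perfect zero black level collapses the curve to a straight 2.4 power:
print(bt1886_eotf(0.5, lb=0.0))    # == 100 * 0.5 ** 2.4, about 18.9 cd/m^2
print(bt1886_eotf(0.5, lb=0.01))   # a little higher: shadows are lifted

# With a real (non-zero) black level the effective exponent is lowest in the
# shadows (about 2.2 here) and climbs towards 2.4 up the scale:
for v in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(f"V = {v:0.2f}  effective gamma ~ {effective_gamma(v):0.2f}")
```

Run the same thing with a higher black level, say 0.05 cd/m² for a mediocre LCD, and the effective exponent drops noticeably further in the shadows, which is why the standard insists on using the display's measured white and black levels rather than a single fixed curve.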
So, what's a colour-calibration guy to do? Once again Charles Poynton comes to the rescue - he's forgotten more about colour than most of us ever knew and I take his advice whenever possible.
Colors change appearance depending upon absolute luminance, and upon their surroundings. A very dark surround at mastering will “suck” color out of a presentation previously viewed in a light surround. A colorist will dial-in an increase in colorfulness (for example, by increasing chroma gain). The intended appearance for an HD master is obtained through a 2.4-power function, to a display having reference white at 100 cd/m² – but that appearance will not be faithfully presented in different conditions! The key point concerning the monitor's gamma is this: What we seek to maintain at presentation is the appearance of the colors at program approval, not necessarily the physical stimuli. If the display and viewing conditions differ from those at mastering, we may need to alter the image data to preserve appearance. In a grading environment, you might set the consumer's display to 100 cd/m², matching the approval luminance. However, ambient conditions in an editing environment are somewhat lighter than typically used for mastering today. The lighter conditions cause a modest increase in contrast and colorfulness, beyond that witnessed at content creation.

So, it seems the choice is this: a 2.4 gamma (or the full BT.1886 curve) in well-controlled grading/mastering conditions, particularly if your monitor has a good dynamic range (an OLED or Dolby reference monitor), and 2.2 in brighter editing rooms with lower-end LCDs.
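If you wanted to bake that rule of thumb into a calibration script, it might look something like this; the function name, the environment labels and the values are purely illustrative, not from any standard:

```python
# Hypothetical helper reflecting the conclusion above: pick a display
# calibration target from the kind of room the display lives in.

def target_gamma(environment: str) -> float:
    """Return a target gamma for the given viewing environment."""
    if environment == "grading":   # dim, well-controlled mastering suite,
        return 2.4                 # good-contrast display (or full BT.1886)
    if environment == "edit":      # brighter editing room, ordinary LCD
        return 2.2
    raise ValueError(f"unknown environment: {environment!r}")

print(target_gamma("grading"), target_gamma("edit"))   # 2.4 2.2
```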
