
Thread: Gamma: Overrated?

  1. #1

    Join Date
    Nov 2010
    Location
    Toronto Canada
    Posts
    27
    Real Name
    Jack

    Gamma: Overrated?

    While waiting for the elusive Gamma tutorial I have been doing some reading on the subject. There are two statements that crop up periodically that I find misleading for a properly color-managed post-processing workflow and that I would like to better understand with your help (be forewarned, long and technical).

    1. ‘Linear processed raw captures look very dark.’ This quote, found in different forms in textbooks and specialist sites alike, comes from a six-year-old Adobe white paper ( http://www.adobe.com/digitalimag/pdfs/linear_gamma.pdf ). I took it for granted for several years, and it may even appear to be true in a non-color-managed workflow - but it is misleading: raw captures processed in a linear-gamma color space look just as good (probably better; see point 2 below) if properly displayed by a color-managed system. Of course, if your system is unaware that the image is linear, it will pass it as-is to your monitor which, like the vast majority of monitors on sale today, will distort it through its electronics to the tune of a 2.2 power function (gamma), displaying it darker than your data intended. If, on the other hand, your system is properly color managed, it will recognize that you are working in a linear color space and, knowing that your monitor will darken your image, it will apply a compensating inverse power function of 1/2.2 to your image data just before passing it to the monitor. So when the monitor DOES distort it, the end result is a properly displayed picture, proportional to the underlying image data: 1/2.2 x 2.2 = 1. The linear processed data of raw captures therefore is not, and does not look, ‘very dark’. It looks just like the captured scene, if properly displayed. It’s the monitor that makes it dark.
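
    Jack's cancellation argument (1/2.2 x 2.2 = 1) can be sketched in a few lines of Python. This is purely illustrative; the function names are my own stand-ins for the monitor's response and the color management step:

```python
# Illustrative sketch: why linear data displays correctly on a
# color-managed system. Function names are hypothetical stand-ins.

def monitor_response(v, gamma=2.2):
    """The monitor darkens its input by a ~2.2 power function."""
    return v ** gamma

def color_managed_compensation(v, gamma=2.2):
    """Color management pre-distorts linear data by 1/2.2 before output."""
    return v ** (1.0 / gamma)

linear = 0.5  # a mid-grey linear value, normalized to 0..1

# Unmanaged: the monitor's gamma darkens the data on its way to the screen.
unmanaged = monitor_response(linear)  # 0.5 ** 2.2, roughly 0.22: too dark

# Managed: compensation and monitor gamma cancel, (v**(1/2.2))**2.2 == v.
managed = monitor_response(color_managed_compensation(linear))

print(unmanaged, managed)
```

    The managed value comes back as 0.5, exactly proportional to the underlying data, while the unmanaged value lands around 0.22: it is the monitor, not the data, that makes the image look dark.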

    2. ‘You need to work in a gamma-corrected color space (i.e. not linear) because it minimizes posterization/banding.’ I found this here (http://www.dpreview.com/learn/?/Glos...nearity_01.htm ), but there are countless examples of similar statements everywhere - this too is, IMHO, misleading. The statement would be true if you were working with a camera sensor with a non-linear response in its analog stage, one that produced the equivalent of gamma-corrected raw data out of the box in the raw file. That way you would actually CAPTURE more detail in the shadows. But if you start with a set of linear(ized) raw data, as is the case with virtually all commercial DSLRs today, you do not have that benefit: you do not GENERATE more detail in the shadows by applying a gamma correction. All you do is (unnecessarily and prematurely?) shift your EXISTING linear data around, expanding it in some regions and compressing it in others. Any time you do this in post-processing it may result in banding, gamma or not. So how is posterization minimized by utilizing a gamma-corrected color space on existing linear raw data? Of course you do not have a choice if, on the other hand, you are working on a JPEG image (sRGB, 8 bits).
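
    The "shifting existing data around" point can be illustrated numerically (my own sketch; the specific values are illustrative, not from any particular camera):

```python
# Gamma-encoding already-linear 12-bit raw values only redistributes the
# levels the sensor captured; it cannot create new shadow detail.

MAX_12BIT = 4095  # top code of a 12-bit linear raw file

def encode(v, g=2.2):
    """Gamma-encode a linear 12-bit level back into 12 bits."""
    return round(((v / MAX_12BIT) ** (1 / g)) * MAX_12BIT)

# Two adjacent deep-shadow levels, the finest distinction the sensor recorded:
print(encode(4), encode(5))  # 175 and 194: the 18 codes in between stay empty
```

    The two neighbouring input levels are pushed some 19 output codes apart, but every code in between stays empty: the shadows have been expanded (and the highlights compressed) without any new detail being generated.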

    So if these premises are correct, in these days of 16 bit color and terabyte hard drives, why exactly do we post process our raw captures in working color spaces with gamma other than one?

  2. #2
    Administrator
    Join Date
    Apr 2008
    Location
    California, USA
    Posts
    1,473
    Real Name
    Sean

    Re: Gamma: Overrated?

    Quote Originally Posted by Jack Hogan View Post
    While waiting for the elusive Gamma tutorial I have been doing some reading on the subject. There are two statements that crop up periodically that I find misleading for a properly color-managed post-processing workflow and that I would like to better understand with your help (be forewarned, long and technical).
    Hi Jack, that tutorial is very very close to being ready. Give it just a few weeks and there'll be an announcement in the newsletter.

    Quote Originally Posted by Jack Hogan View Post
    1. ‘Linear processed raw captures look very dark’.
    Yes, you're right -- the image will only appear dark if the monitor assumes that a gamma of 2.2 needs to be applied on output (which almost all monitors assume). A display gamma of 1.0 would make the image appear just fine, for example. However, pushing your monitor too far from its native gamma can create problems, most notably with posterization / tonal levels. With CRTs, the electron gun has a native gamma which is roughly 2.5, so little correction is needed to get it to a standard 2.2 gamma. Pushing this to 1.0 would require a much bigger correction.
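
    The cost of pushing a display far from its native gamma can be sketched numerically (my own illustration; the LUT model is simplified to pure power functions and the numbers are not measurements):

```python
# Correcting a native gamma-2.5 CRT through an 8-bit lookup table:
# count how many distinct output codes survive the correction.

def lut_levels(native=2.5, target=2.2):
    """Distinct 8-bit codes left after a gamma-correcting LUT."""
    exponent = target / native
    return len({round(255 * (c / 255) ** exponent) for c in range(256)})

mild = lut_levels(target=2.2)     # small correction: most of the 256 survive
drastic = lut_levels(target=1.0)  # large correction: many codes collide
print(mild, drastic)
```

    The mild 2.5-to-2.2 correction preserves the vast majority of the 256 levels, while forcing the display all the way to gamma 1.0 collapses a large fraction of them - which is exactly the posterization risk described above.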

    Quote Originally Posted by Jack Hogan View Post
    2. ‘You need to work in a gamma corrected color space (i.e. not linear) because it minimizes posterization/banding’.
    ...
    So if these premises are correct, in these days of 16 bit color and terabyte hard drives, why exactly do we post process our raw captures in working color spaces with gamma other than one?
    With a low-bit file, such as a standard 8-bit JPEG or TIFF, then this is true. Gamma encoding reallocates bits / tonal levels so that they're perceptually uniform, which allows files to get away with only 8 bits when they would otherwise require around 11 bits (to avoid visible posterization). However, as you point out, if the file remains in 12-14+ bits (from the RAW capture), then converting to a gamma-corrected color space doesn't really help with posterization, since you're just shifting the existing bits around. This is why RAW development software in fact performs all of its processing on the linear data, as you suggest -- not the gamma-corrected data. For example, Adobe Camera Raw and Lightroom apply all their curves/levels/etc in a linear-gamma color space (a modified version of ProPhoto RGB, just without its encoding gamma of 1.8).
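
    The bit-reallocation argument can be made concrete with a small sketch (my own illustration, using a pure 2.2 power function rather than any particular color space's exact curve):

```python
# Compare the relative luminance jump between neighbouring 8-bit codes
# at a deep-shadow scene luminance (1% of maximum), for linear encoding
# vs gamma 1/2.2 encoding. Our eyes notice relative steps of roughly 1%.

L = 0.01  # scene luminance, normalized to 0..1

# Linear 8-bit: codes are proportional to luminance.
code_lin = round(L * 255)                          # lands on code 3
step_lin = (code_lin + 1) / code_lin - 1           # jump to the next code

# Gamma-encoded 8-bit: codes are proportional to L ** (1/2.2).
code_gam = round((L ** (1 / 2.2)) * 255)           # lands around code 31
step_gam = ((code_gam + 1) / code_gam) ** 2.2 - 1  # jump in decoded luminance

print(f"linear 8-bit shadow step: {step_lin:.0%}")  # ~33%: coarse, visible
print(f"gamma  8-bit shadow step: {step_gam:.1%}")  # ~7%: far finer
```

    In the shadows, gamma encoding spends several times more codes per stop than linear encoding does, which is why 8-bit gamma-encoded files hold up where 8-bit linear files would band.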

    Sorry everyone else for getting a little technical here . . .
    Last edited by McQ; 10th November 2010 at 10:51 PM.

  3. #3

    Join Date
    Nov 2010
    Location
    Toronto Canada
    Posts
    27
    Real Name
    Jack

    Re: Gamma: Overrated?

    Thanks, Sean, that makes sense. A couple of clarifications, if I may trouble the forum, and more apologies for getting even more technical :-). In a properly color-managed working environment:

    1. Just to make sure, when you say “the image will only appear dark if the monitor assumes that a gamma of 2.2 needs to be applied on output” do you mean that the relative luminance of the displayed image will not appear proportional to the raw data unless it compensates for the monitor’s intrinsic gamma? If this is so, in order to minimize performing more distorting operations on our data than needed, shouldn’t we apply to it the inverse of the monitor’s intrinsic NATIVE gamma? And if, due to the physics of the device, a CRT’s intrinsic native gamma is around 2.5, what is the intrinsic native gamma of the current crop of wide gamut LCD’s?

    2. Posterization/Banding. OK, so a gamma power function can act as a lossy compressor of visual data, just as MP3 acts as a lossy encoder of audio data. We take the 12/14-bit linear data of our image and encode it into 8 bits in a perceptually meaningful way, using extra bits 9-12/14 to fill in some of the spaces (posterization) created by applying a gamma power function to the linear data, and throwing away some perceptually meaningless bits in the process. On the other hand, in our original 12/14-bit linear data all of the information is there at maximum resolution and no bits have been thrown away. If we stayed in 12/14 (or even better, 16) bits, applying a gamma power function to the data would only compress it lossily (potentially CREATING visible posterization upon decompression) without giving us other benefits. Worse, it would force us to work with data upon which a lossy transform had been applied. So is it fair to say that, far from minimizing banding, applying a gamma power function to linear data in a modern color-managed workflow makes it worse? That if you start with 12/14 linear bits and work in 16 bits, as most of us do today, applying a gamma correction as part of your working color space does not help banding one bit :-), not even when saving to JPEG or displaying your image on an 8-bit monitor or printer?

    And if these premises are correct, why do the majority of working color spaces today apply effective gamma corrections of around 2.2 to our linear data? Is it because they erroneously assume that the image data you are feeding them is already gamma-compressed? That may have been true in the not-so-distant past, but it is no longer true today. And what does this say about the choice of a working color space with a gamma other than 1 today, for photographers who deal mainly in 16 bits with their own images, from raw to Photoshop? Why aren’t we minimizing the negative effects of gamma compression by using a linear working color space throughout our work-flow, only converting to a gamma corrected color space at the very end of post processing - and then only when so imposed by the output device?

    Apologies again for the length of the post, I hope at least it may have been a little stimulating.

  4. #4
    Administrator
    Join Date
    Apr 2008
    Location
    California, USA
    Posts
    1,473
    Real Name
    Sean

    Re: Gamma: Overrated?

    Quote Originally Posted by Jack Hogan View Post
    1. Just to make sure, when you say “the image will only appear dark if the monitor assumes that a gamma of 2.2 needs to be applied on output” do you mean that the relative luminance of the displayed image will not appear proportional to the raw data unless it compensates for the monitor’s intrinsic gamma? If this is so, in order to minimize performing more distorting operations on our data than needed, shouldn’t we apply to it the inverse of the monitor’s intrinsic NATIVE gamma? And if, due to the physics of the device, a CRT’s intrinsic native gamma is around 2.5, what is the intrinsic native gamma of the current crop of wide gamut LCD’s?
    Yes, a linear image will appear dark because of the monitor's gamma of 2.2. There's an imperceptible difference between (1) applying an encoding gamma of 1/2.5 and outputting to a gamma 2.5 monitor vs (2) applying an encoding gamma of 1/2.2, then applying a gamma correction of 2.2/2.5 within the monitor and outputting it to the gamma 2.5 monitor. The standard 8-bit image (0-255) actually has slightly more tonal levels than we can perceive with our eyes (at standard contrast ratios), so we also have some tonal levels to spare. Furthermore, when given the choice between a gamma 1/2.2 and 1/2.5 encoded file, 1/2.2 is closer to how our eyes perceive differences in tonal levels. A gamma 1/2.2 will therefore allocate bits a little more efficiently, and will record these tones in a way that is perceptually uniform.
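
    The "imperceptible difference" between the two pipelines can be checked numerically (my own sketch; the monitor and its LUT are idealized as pure power functions):

```python
# Compare displayed luminance for (1) encoding at 1/2.5 straight for a
# native gamma-2.5 monitor, vs (2) the standard 1/2.2 encoding followed
# by an 8-bit monitor LUT that re-maps gamma 2.2 to the native 2.5.

def pipeline_direct(v):
    code = round(255 * v ** (1 / 2.5))       # encode straight for the monitor
    return (code / 255) ** 2.5               # monitor's native response

def pipeline_standard(v):
    code = round(255 * v ** (1 / 2.2))       # standard gamma-2.2 encoding
    lut = round(255 * (code / 255) ** (2.2 / 2.5))  # monitor LUT: 2.2 -> 2.5
    return (lut / 255) ** 2.5                # monitor's native response

# Worst-case displayed-luminance difference over a sweep of linear values:
diff = max(abs(pipeline_direct(i / 1000) - pipeline_standard(i / 1000))
           for i in range(1001))
print(diff)  # under 1% of full brightness: at most one output code apart
```

    The two routes never end up more than a single output code apart, so the extra LUT step in the standard pipeline costs essentially nothing visible.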

    Ideally all display devices would have the same native gamma, but as with most things, this isn't true in practice. LCDs can vary substantially depending on their display technology (IPS, etc), so they all rely heavily on their own internal (nonlinear) look-up tables (LUTs). It's much easier (and cheaper) for the industry to standardize around a specific gamma, and to have this effectively hard-coded into how the LCD displays tones -- as opposed to having a high bit depth LUT accessible by the computer on an image by image basis.

    Quote Originally Posted by Jack Hogan View Post
    2. Posterization/Banding. Ok, so a gamma power function can act as a lossy compressor of visual data, just like MP3 acts like a lossy encoder of audio data. We take the 12/14-bit linear data of our image and encode it into 8 bits in a perceptually meaningful way, using extra bits 9-12/14 to fill in some of the spaces (posterization) created by applying a gamma power function to the linear data, and throwing away some perceptually miningless bits in the process.
    Yes, the MP3/lossy compression analogy is reasonable, but we need to be careful not to take it too far. A low bit rate MP3 (say 96kbps) is audibly different from one at a high bit rate (say 320kbps). On the other hand, gamma encoding a 12-14 bit linear RAW is by itself *visually* lossless. That being said...

    Gamma encoding is also often pretty darn close to being numerically lossless as well -- if this is performed within a high bit depth (16-bit) working space. For example, if you capture a 12-bit linear RAW and apply a gamma of 2.2 within a 16-bit working space, then you're not really creating any posterization, since the working space has 16X* as many tonal levels as the original file. This leaves plenty of room to shift tones around without causing posterization due to rounding errors. Furthermore, if any bits are lost due to rounding, the lion's share will be lost in areas which matter least (the brighter tones) and were recorded in excess to begin with. Extra bits aren't needed, as you suggest.
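
    This near-losslessness claim is easy to verify numerically under simple assumptions (a pure 2.2 power curve and straightforward rounding, my own sketch):

```python
# Round-trip every level of a 12-bit linear capture through a gamma-2.2
# encoding in a 16-bit working space, and count the levels that fail to
# come back exactly.

GAMMA = 2.2
IN_MAX, WORK_MAX = 4095, 65535  # 12-bit source, 16-bit working space

lost = 0
for v in range(IN_MAX + 1):
    encoded = round(((v / IN_MAX) ** (1 / GAMMA)) * WORK_MAX)
    decoded = round(((encoded / WORK_MAX) ** GAMMA) * IN_MAX)
    if decoded != v:
        lost += 1

print(lost)  # 0: all 4096 original levels survive the round trip
```

    With 16 times as many working levels as source levels, the rounding error never grows large enough to merge two original levels, so the encode/decode round trip is numerically exact here.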

    *yes, I realize that 16 bit files in Photoshop are actually more like 15 bit, but this doesn't change the nature of the example.

    Quote Originally Posted by Jack Hogan View Post
    On the other hand, in our original 12/14 bit linear data all of the information is there at maximum resolution and no bits have been thrown away. If we stayed in 12/14 (or even better, 16) bits, applying a gamma power function to the data would only compress it lossily (potentially CREATING visible posterization upon decompression) without giving us other benefits. Worse, it would force us to work with data upon which a lossy transform had been applied.
    I don't see the problem here, since this is already addressed by modern RAW processing workflows. One isn't forced to work with data after a lossy transform has been applied. All editing within RAW development software is performed on the unaltered linear data. I also disagree with your assertion that applying a gamma power function to a 12/14 bit RAW file would create *visible* posterization (on its own). No visual study has ever confirmed something like this. Our eyes can only see relative tonal differences of about 1%, so there's still tons of extra tonal levels even after gamma encoding.

    The above answers should address all of your subsequent questions as well...
    Last edited by McQ; 13th November 2010 at 05:45 PM.

  5. #5

    Join Date
    Nov 2010
    Location
    Toronto Canada
    Posts
    27
    Real Name
    Jack

    Re: Gamma: Overrated?

    Thanks for your thorough answers, Sean. I see what you mean about LCDs, and about the fact that the extra headroom - from the 12/14 captured bits to most post-processing software's internal 16 bits - will make gamma compression virtually invisible in a single round trip.

    What I meant by gamma compression ‘potentially creating … posterization upon decompression’ is this: let’s say, as it is likely, that the next crop of high-end digital cameras will produce raw data with 16 bit depth. In this case, would you say that it would be a good idea for our post-processing software to apply gamma to linear data before it was needed (i.e. either never or only if required by the output)?

    I think in that situation it would be a bad idea to apply gamma upon opening a linear file because, not having any headroom, it would throw away some data and it certainly would not IMPROVE posterization/banding - if anything it would make it worse. So, if applying gamma to the data before it is needed will not make things better and in fact it requires MORE bit depth for the same result, why use a gamma corrected color space (ProPhoto, Melissa or whatever) to save our originally linear data and post-process them with it (say in Photoshop)? If it ain’t broke, don’t fix it?
    Last edited by Jack Hogan; 13th November 2010 at 04:02 PM. Reason: clarification

  6. #6

    Re: Gamma: Overrated?

    Looking forward to the Gamma tutorial. I had just got my head around the concept of the raw data having to have gamma encoding applied because our eyes do not see or record light the same way a digital sensor does. That makes sense to me; I can picture it, so to speak.

    I always thought the image sensor would "see" the captured image as very dark and high contrast. If you are using DPP and tick the linear box the image goes very dark; this is how I thought the sensor had captured the image. The RGB histogram is stacked heavily to the left, but when the linear box is unclicked the values in the histogram are distributed as we have come to expect.

    My understanding is when I open a raw file in DPP or ACR the image I am faced with is what the software manufacturer thinks is a good starting point. It is then up to me to start applying changes to the linear data to get the image to how I want it. Only when I save the file to tiff or jpeg is the gamma encoding applied.

    What I have not thought about is the gamma needed for the monitor. Looking forward the tutorial.

  7. #7

    Join Date
    Nov 2010
    Location
    Toronto Canada
    Posts
    27
    Real Name
    Jack

    Re: Gamma: Overrated?

    Quote Originally Posted by Andrew Mckay View Post
    I always thought the image sensor would "see" the captured image as very dark and high contrast.
    Hi Andrew, I do not mean to add to the confusion, but the sensor receives exactly the same radiant energy (for simplicity let's talk about luminance) as a co-located eye looking at the same scene. If the camera then stored this relative luminance in a raw file and the photographer post-processed the image following a proper color-managed workflow, the relative luminance presented to another eye by the photographer's computer monitor would ideally be exactly the same as the relative luminance hitting the sensor (and the eye) at the scene: same relative luminance = same perception. Of course, behind the scenes, your color-managed software would have counteracted the monitor's 2.2 gamma distortion by pre-distorting the linear data to the tune of a 1/2.2 gamma at some point.

    Quote Originally Posted by Andrew Mckay View Post
    If you are using DPP and tick the linear box the image goes very dark
    If you have a properly color managed system, ticking the linear box simply shows you how your monitor is distorting your image data.

    Quote Originally Posted by Andrew Mckay View Post
    when the linear box is unclicked the values in the histogram are distributed as we have come to expect.
    You can represent data with any scale you desire. Because we are photographers and we think in stops, it makes sense and it is more representative to scale a histogram of image data logarithmically.

    Quote Originally Posted by Andrew Mckay View Post
    Only when I save the file to tiff or jpeg is the gamma encoding applied.
    I DO see the benefit when compressing precious 12/14-bit linear data down to 8 bits, to maximise the perceived quality of the images in JPEG.

    But if you are a photographer that deals mainly with raw or 16 bit TIFF data, the question is WHY any gamma encoding is applied. IMHO there are no benefits in such a scenario - which represents my case 99% of the time - only disadvantages. Perhaps minor, but disadvantages nevertheless: degraded data in the highlights and amplified noise in the shadows come to mind.
    Last edited by Jack Hogan; 15th November 2010 at 02:11 PM. Reason: Additions and clarifications

  8. #8
    Administrator
    Join Date
    Apr 2008
    Location
    California, USA
    Posts
    1,473
    Real Name
    Sean

    Re: Gamma: Overrated?

    Quote Originally Posted by Jack Hogan View Post
    . . . the next crop of high-end digital cameras will produce raw data with 16 bit depth. In this case, would you say that it would be a good idea for our post-processing software to apply gamma to linear data before it was needed (i.e. either never or only if required by the output)?
    While we're on the topic of future image formats/cameras, I suspect that the trend will be to move towards using floating points instead of integers. In that case many of our above arguments would be either irrelevant or vastly different.

    I think that if the image doesn't need to be edited substantially, then sure, applying encoding gamma is just fine. This will be imperceptible and provides much more efficient image storage. Even though we tend to get caught up with the details regarding maximum image quality, it's good to keep this in perspective -- the average camera user generally doesn't touch their images and simply saves them as highly compressed JPEG files. Otherwise, yes, it's always best to avoid large in-camera tonal transformations prior to performing any major editing. I guess the mentality here is very similar to image sharpening.

    Quote Originally Posted by Jack Hogan View Post
    So, if applying gamma to the data before it is needed will not make things better and in fact it requires MORE bit depth for the same result, why use a gamma corrected color space (ProPhoto, Melissa or whatever) to save our originally linear data and post-process them with it (say in Photoshop)? If it ain’t broke, don’t fix it?
    Yes, I often advocate just archiving the original RAW files along with an XMP sidecar file (which records the RAW processing steps which were applied). Storing edited images as ProPhotoRGB TIFF files uses much more space, and as you say, it does sacrifice quality.

  9. #9
    Administrator
    Join Date
    Apr 2008
    Location
    California, USA
    Posts
    1,473
    Real Name
    Sean

    Re: Gamma: Overrated?

    Quote Originally Posted by Andrew Mckay View Post
    I had just got my head around the concept of the raw data having to having gamma encoding applied because our eyes do not see or record light the same way a digital sensor does.
    Yes, as Jack mentioned, the only reason that linear images look dark is because the display also applies a gamma of 2.2. If this display gamma weren't applied (and the monitor worked linearly like the camera), then you could effectively just send RAW data to the display and the tones would appear as intended (presuming proper calibration, etc).

    Hopefully this will all become clearer with the gamma tutorial...

  10. #10

    Join Date
    Nov 2010
    Location
    Toronto Canada
    Posts
    27
    Real Name
    Jack

    Re: Gamma: Overrated?

    Quote Originally Posted by McQ View Post
    I suspect that the trend will be to move towards using floating points instead of integers. In that case many of our above arguments would be either irrelevant or vastly different.
    Yes indeed, this is merely a passing phase based on a 70-year-old habit.

    Quote Originally Posted by McQ View Post
    Even though we tend to get caught up with the details regarding maximum image quality, it's good to keep this in perspective
    Of course you are correct. Just curious, though: should I start looking for a gamma=1 color space for most accurate post-processing in Photoshop :-)?

  11. #11

    Re: Gamma: Overrated?

    Thanks, Sean and Jack. There certainly seems to be more to the picture than meets the eye.
