
Thread: Monitor

  1. #21
    dje's Avatar
    Join Date
    May 2011
    Location
    Brisbane Australia
    Posts
    4,636
    Real Name
    Dave Ellis

    Re: Monitor

    Quote Originally Posted by george013 View Post
    What is the difference between desktop colour depth and output colour depth?

    George
    Yes that's what I'd like to know too!

    Dave

  2. #22
    MilosVukmirovic's Avatar
    Join Date
    Jan 2016
    Location
    Belgrade
    Posts
    19
    Real Name
    Milos

    Re: Monitor

    Great question, George.
    Desktop color depth means that only 16.7 million colors are available when displaying the desktop and basically any non-color-managed app (or parts of apps).

    Remember that when you calibrate your display you assign an ICC profile that maps your monitor's gamut, so color-managed apps know what to do with the colors in your images in relation to your monitor's color coverage.
    In a non-color-managed environment, though, pure red on your 1.07-billion-color display will be far more saturated than pure red in the 16.7-million-color space. The signal for pure red goes to your monitor's purest red instead of to pure red in the 16.7-million-color space, so the tonal range ends up oversaturated: with no color management to map it to the correctly saturated colors, the desktop and various software look overly saturated.
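    If it helps, here is a rough Python sketch of that effect (my own illustration, using the published sRGB and AdobeRGB (1998) primaries; I use green rather than red because sRGB and AdobeRGB happen to share their red and blue primaries, so green shows the shift most clearly). The same unmanaged pixel value lands on a visibly more saturated primary on a wide-gamut panel:

    Code:
    # What an unmanaged app's "pure green" does on a wide-gamut panel:
    # the same triplet is sent to the screen, but the light produced
    # depends on the panel's primaries. Standard RGB-to-XYZ matrices
    # for sRGB and AdobeRGB (1998), both with D65 white point.
    SRGB_TO_XYZ = [[0.4124, 0.3576, 0.1805],
                   [0.2126, 0.7152, 0.0722],
                   [0.0193, 0.1192, 0.9505]]
    ADOBE_TO_XYZ = [[0.5767, 0.1856, 0.1882],
                    [0.2973, 0.6274, 0.0753],
                    [0.0270, 0.0707, 0.9911]]

    def to_xyz(matrix, rgb_linear):
        return [sum(m * c for m, c in zip(row, rgb_linear)) for row in matrix]

    def chromaticity(X, Y, Z):
        s = X + Y + Z
        return round(X / s, 2), round(Y / s, 2)

    green = [0.0, 1.0, 0.0]  # linear "pure green", sent with no colour management
    print(chromaticity(*to_xyz(SRGB_TO_XYZ, green)))   # (0.3, 0.6)   intended sRGB green
    print(chromaticity(*to_xyz(ADOBE_TO_XYZ, green)))  # (0.21, 0.71) what the panel emits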

  3. #23
    James G's Avatar
    Join Date
    Dec 2009
    Location
    Birmingham UK
    Posts
    1,471
    Real Name
    James Edwards

    Re: Monitor

    I need to think this out... I'm developing a brain freeze!

  4. #24
    Moderator Manfred M's Avatar
    Join Date
    Mar 2012
    Location
    Ottawa, Canada
    Posts
    21,925
    Real Name
    Manfred Mueller

    Re: Monitor

    Quote Originally Posted by James G View Post
    George, my somewhat limited understanding is that modern inkjet printers can handle a wider colour gamut than monitors. So arguably with a greater colour depth it is possible to print a wider range of tones?

    I now wait to be corrected

    I should say immediately that the printer might be capable of this enhanced tonal range but I'm not sure I have the vision to appreciate it!
    In one sense it matters little to me since I have been getting excellent quality prints from my Epson SC-P600. I'll be reprinting a couple of images tomorrow and will be interested to see if I can detect any differences.
    James - my understanding is the same as yours. Modern photo inkjet printers can exceed the AdobeRGB (1998) spec, which is already considerably wider than sRGB (1996). When the AdobeRGB spec came out, the intent was to produce an RGB colour space that more or less mapped over the CMYK colours that could be produced by the printers of the day.

    Modern colour photo printers with their multiple additional ink colours do exceed what a basic CMYK printer can produce, especially when printing vibrant colours. If I recall correctly, sRGB covers around 33% of the CIE Lab colour space, AdobeRGB around 50%, and ProPhoto around 90% (and probably 100% of "real world" colours). ProPhoto is the colour space I use when I print from Photoshop.

  5. #25

    Join Date
    May 2014
    Location
    amsterdam, netherlands
    Posts
    3,182
    Real Name
    George

    Re: Monitor

    Quote Originally Posted by James G View Post
    George, my somewhat limited understanding is that modern inkjet printers can handle a wider colour gamut than monitors. So arguably with a greater colour depth it is possible to print a wider range of tones?

    I now wait to be corrected

    I should say immediately that the printer might be capable of this enhanced tonal range but I'm not sure I have the vision to appreciate it!
    In one sense it matters little to me since I have been getting excellent quality prints from my Epson SC-P600. I'll be reprinting a couple of images tomorrow and will be interested to see if I can detect any differences.
    That screenshot you posted is from the video card.

    George

  6. #26

    Join Date
    May 2014
    Location
    amsterdam, netherlands
    Posts
    3,182
    Real Name
    George

    Re: Monitor

    Quote Originally Posted by MilosVukmirovic View Post
    Great question, George.
    Desktop color depth means that only 16.7 million colors are available when displaying the desktop and basically any non-color-managed app (or parts of apps).

    Remember that when you calibrate your display you assign an ICC profile that maps your monitor's gamut, so color-managed apps know what to do with the colors in your images in relation to your monitor's color coverage.
    In a non-color-managed environment, though, pure red on your 1.07-billion-color display will be far more saturated than pure red in the 16.7-million-color space. The signal for pure red goes to your monitor's purest red instead of to pure red in the 16.7-million-color space, so the tonal range ends up oversaturated: with no color management to map it to the correctly saturated colors, the desktop and various software look overly saturated.
    Desktop color depth being 8 bits per channel gives 16.7 million nuances. I presume that what is meant is what the monitor does.
    But what is output color depth?

    In my thinking, a monitor produces colour within a certain gamut, expressed in wavelengths of light. The signal of each individual colour can be manipulated in 256 steps, i.e. 8 bits. For the 3 colours that gives 16.7 million nuances.
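    For what it's worth, that arithmetic is easy to check in a couple of lines of Python (including the 10-bit case that comes up below):

    Code:
    # Levels per channel and total "nuances" at 8 and 10 bits per channel
    print(2 ** 8, (2 ** 8) ** 3)     # 256  16777216   (16.7 million)
    print(2 ** 10, (2 ** 10) ** 3)   # 1024 1073741824 (1.07 billion)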

    The second part is what to do when my image is recorded with a different gamut. Then I must make some corrections for the demanded output colour. That's where colour management comes in. But that's not related to the bit depth.

    I'm still struggling with colours.

    George

  7. #27
    Moderator Manfred M's Avatar
    Join Date
    Mar 2012
    Location
    Ottawa, Canada
    Posts
    21,925
    Real Name
    Manfred Mueller

    Re: Monitor

    Quote Originally Posted by george013 View Post
    Desktop color depth being 8 bits per channel gives 16.7 million nuances. I presume that what is meant is what the monitor does.
    But what is output color depth?
    George, that is only correct if you are using an sRGB screen. There your bit depth (8 bits per colour channel) is what is required to produce that many colours / shades. Most computer screens are marketed as being sRGB compliant, and many of them use Twisted Nematic (TN) panel technology. sRGB represents approximately 1/3 of the colours that humans can see, as defined by the CIE Lab colour space (which encompasses ALL colours that the human eye can see).

    If you are using an AdobeRGB compliant screen, you need 10 bits per colour channel (and a graphics card that can produce that output). This means this type of screen can output a larger range of wavelengths of light. These screens are more expensive and use a different panel construction, usually some form of IPS (in-plane switching) technology. I'm not aware of any screen that is 100% AdobeRGB compliant, but there are quite a number on the market that are 99% or more AdobeRGB compliant. AdobeRGB can show about 50% of all the colours a human can see.


    Quote Originally Posted by george013 View Post
    In my thoughts a monitor produces colour with a certain gamut and expressed in wavelength of the light. The signal of the individual colour can be manipulated in 256 steps, 8 bit. For the 3 colours that gives 16.7 million nuances.
    Again, this is only true for sRGB screens. Once you move to AdobeRGB compliant screens you are looking at 1024 steps (10-bit) and about 1.07 billion nuances. Again, the screen must be capable of outputting those wavelengths.

    Quote Originally Posted by george013 View Post
    The second part is what to do when my image is recorded with a different gamut. Than I must make some corrections for the demanded output colour. That's where colour management comes in. But that's not related to the bitdepth.

    I'm still struggling with colours.

    George
    First of all, if you shoot raw, there is no gamut; one is assigned during the raw conversion process. If you are shooting jpeg in camera, then you have a choice of selecting either the sRGB or the AdobeRGB colour space. If you are shooting raw, your raw converter will usually let you choose whichever colour space you want to convert the raw data into. Commonly a third colour space, ProPhoto, is also available. This is a wide-gamut colour space that can show about 90% of the CIE Lab colour space (though no computer display or printer can reproduce all of it yet). Because of the large number of colours ProPhoto can represent, it should be used with 16-bit integers to reduce the risk of banding. 8-bit is always okay with sRGB and usually fine for AdobeRGB.

    A colour-managed system will be able to identify the colour space of the image and can do an accurate mapping of those colours. If a specific colour is outside the range the device can reproduce, then the rendering intent in the screen or printer driver will remap it to one that can be reproduced. In photography we usually use either the Relative Colorimetric or the Perceptual rendering intent. These two intents handle out-of-gamut colours slightly differently: Relative remaps out-of-gamut colours to colours at the edge of the gamut, whereas Perceptual remaps all colours to be in gamut (which preserves the gradations, but ends up changing the in-gamut colours too).
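    As a concrete (hedged) illustration of choosing a rendering intent in software, here is a minimal Python sketch using Pillow's ImageCms module (a littleCMS wrapper). The destination profile filename is a placeholder of my own, not a real file:

    Code:
    # Convert an sRGB image to a (hypothetical) wide-gamut monitor profile,
    # choosing how out-of-gamut colours are handled via the rendering intent.
    from PIL import Image, ImageCms

    im = Image.open("photo.jpg")                      # assumed to be sRGB
    src = ImageCms.createProfile("sRGB")              # built-in sRGB profile
    dst = ImageCms.getOpenProfile("wide_gamut.icc")   # placeholder profile path

    # INTENT_PERCEPTUAL squeezes everything into gamut, preserving gradations;
    # INTENT_RELATIVE_COLORIMETRIC clips out-of-gamut colours to the gamut edge.
    out = ImageCms.profileToProfile(
        im, src, dst, renderingIntent=ImageCms.INTENT_PERCEPTUAL)
    out.save("photo_converted.jpg")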

    When Adobe Lightroom first came out, it used a variant of ProPhoto internally, although later versions allow the user to choose other colour spaces.

    I hope I haven't added to the confusion too much...

  8. #28

    Join Date
    May 2014
    Location
    amsterdam, netherlands
    Posts
    3,182
    Real Name
    George

    Re: Monitor

    Quote Originally Posted by GrumpyDiver View Post
    George, that is only correct if you are using an sRGB screen. There your bit depth (8 bits per colour channel) is what is required to produce that many colours / shades. Most computer screens are marketed as being sRGB compliant, and many of them use Twisted Nematic (TN) panel technology. sRGB represents approximately 1/3 of the colours that humans can see, as defined by the CIE Lab colour space (which encompasses ALL colours that the human eye can see).

    If you are using an AdobeRGB compliant screen, you need 10 bits per colour channel (and a graphics card that can produce that output). This means this type of screen can output a larger range of wavelengths of light. These screens are more expensive and use a different panel construction, usually some form of IPS (in-plane switching) technology. I'm not aware of any screen that is 100% AdobeRGB compliant, but there are quite a number on the market that are 99% or more AdobeRGB compliant. AdobeRGB can show about 50% of all the colours a human can see.




    Again, this is only true for sRGB screens. Once you move to AdobeRGB compliant screens you are looking at 1024 steps (10-bit) and about 1.07 billion nuances. Again, the screen must be capable of outputting those wavelengths.



    First of all, if you shoot raw, there is no gamut; one is assigned during the raw conversion process. If you are shooting jpeg in camera, then you have a choice of selecting either the sRGB or the AdobeRGB colour space. If you are shooting raw, your raw converter will usually let you choose whichever colour space you want to convert the raw data into. Commonly a third colour space, ProPhoto, is also available. This is a wide-gamut colour space that can show about 90% of the CIE Lab colour space (though no computer display or printer can reproduce all of it yet). Because of the large number of colours ProPhoto can represent, it should be used with 16-bit integers to reduce the risk of banding. 8-bit is always okay with sRGB and usually fine for AdobeRGB.

    A colour-managed system will be able to identify the colour space of the image and can do an accurate mapping of those colours. If a specific colour is outside the range the device can reproduce, then the rendering intent in the screen or printer driver will remap it to one that can be reproduced. In photography we usually use either the Relative Colorimetric or the Perceptual rendering intent. These two intents handle out-of-gamut colours slightly differently: Relative remaps out-of-gamut colours to colours at the edge of the gamut, whereas Perceptual remaps all colours to be in gamut (which preserves the gradations, but ends up changing the in-gamut colours too).

    When Adobe Lightroom first came out, it used a variant of ProPhoto internally, although later versions allow the user to choose other colour spaces.

    I hope I haven't added to the confusion too much...

    Manfred,

    There is no relation between color gamut and bit depth. Those are two different issues. One can have sRGB or aRGB or whatever gamut with as little as 2-bit color depth (not very usable, though), or with 100 or more bits of color depth.

    I still don't know what the output color depth is.

    George

  9. #29
    MilosVukmirovic's Avatar
    Join Date
    Jan 2016
    Location
    Belgrade
    Posts
    19
    Real Name
    Milos

    Re: Monitor

    The output color depth is the color bit depth of the signal going from the GPU through the cable to the monitor; in this case it goes as 1024 (2^10) steps of tonal range, from 0 to 100% strength, per primary channel.
    The problem with 32-bit (i.e. 8 bits per channel) desktop color depth is that there is no way for the computer to properly map the 0 to 255 tonal-range steps it reads from software and the desktop into the vast space of 1024 steps your monitor has.
    Your GPU still sends 10 bits per color; it's just the lack of instructions that keeps the computer from properly showing colors from the desktop.
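    A tiny Python sketch of that mapping problem (my own illustration): a naive 8-to-10-bit expansion can only ever land on 256 of the 1024 available output steps.

    Code:
    # Naive expansion of an 8-bit desktop value to a 10-bit output value
    def expand_8_to_10(v8):
        return round(v8 * 1023 / 255)

    print(expand_8_to_10(0), expand_8_to_10(128), expand_8_to_10(255))  # 0 514 1023
    print(len({expand_8_to_10(v) for v in range(256)}))  # 256 of the 1024 steps used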

  10. #30

    Join Date
    May 2014
    Location
    amsterdam, netherlands
    Posts
    3,182
    Real Name
    George

    Re: Monitor

    Quote Originally Posted by MilosVukmirovic View Post
    The output color depth is the color bit depth of the signal going from the GPU through the cable to the monitor; in this case it goes as 1024 (2^10) steps of tonal range, from 0 to 100% strength, per primary channel.
    The problem with 32-bit (i.e. 8 bits per channel) desktop color depth is that there is no way for the computer to properly map the 0 to 255 tonal-range steps it reads from software and the desktop into the vast space of 1024 steps your monitor has.
    Your GPU still sends 10 bits per color; it's just the lack of instructions that keeps the computer from properly showing colors from the desktop.
    I did some searching and came to a conclusion like this: it's a hardware setting. In this example the monitor is still 8 bits per channel. I also read that it's very difficult to bypass that limit under Windows because all the APIs are written for 8 bits; special software has to be used. It seems to be no problem under Linux. I don't know if this is true.

    There is also something strange with the output range: with "limited" range, only values between 16 and 235 are passed. Do you know why? It seems to be related to HDMI output.
    https://pcmonitors.info/articles/cor...-and-amd-gpus/
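    From what I can tell, the "limited range" in that article is just a linear squeeze of the full 0-255 range into 16-235, still within 8 bits; a quick sketch:

    Code:
    # Full-range values squeezed into "limited" range (16 = black, 235 = white)
    def full_to_limited(v):
        return round(16 + v * 219 / 255)

    print(full_to_limited(0), full_to_limited(255))  # 16 235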

    George

  11. #31
    James G's Avatar
    Join Date
    Dec 2009
    Location
    Birmingham UK
    Posts
    1,471
    Real Name
    James Edwards

    Re: Monitor

    Quote Originally Posted by george013 View Post
    That screenshot you posted is from the video card.
    Correct. Milos pointed out that to exploit the full potential of the monitor, a video card capable of 10 bit output was needed, and it had to be connected via the Display Port (which can handle 10 bit signals).

    The significant point to me was that he said there are reports of banding being generated if the Display Port was not used.

    I was using the DVI port to connect the monitor and had experienced no banding issues, but when I checked the NVidia Control panel it was clear the output from the card was 8 bit.
    I was simply curious to see what the card would output if I switched to the Display Port, and as Milos suggested, the output is 10 bit.

    The general discussion about 10-bit technology interests me, but my real interest was the banding possibility, which as I said I had not (yet?) experienced.

    The fact that technology has not yet caught up with 10-bit output, and that arguably we cannot fully exploit it yet, is no surprise to me. It has always been so; 4K video is a recent example. I just wish they would standardise on a port configuration that will last. Display Port was supposed to be the 'new' best; I'm willing to bet it is not. Then we will have 5 slots on a vid card!
    Last edited by James G; 19th August 2016 at 10:30 AM.

  12. #32
    MilosVukmirovic's Avatar
    Join Date
    Jan 2016
    Location
    Belgrade
    Posts
    19
    Real Name
    Milos

    Re: Monitor

    Quote Originally Posted by george013 View Post
    I did some searching and came to a conclusion like this: it's a hardware setting. In this example the monitor is still 8 bits per channel. I also read that it's very difficult to bypass that limit under Windows because all the APIs are written for 8 bits; special software has to be used. It seems to be no problem under Linux. I don't know if this is true.

    There is also something strange with the output range: with "limited" range, only values between 16 and 235 are passed. Do you know why? It seems to be related to HDMI output.
    https://pcmonitors.info/articles/cor...-and-amd-gpus/

    George
    Luckily we have special software for our images and color management. I can't speak for anything on Linux as I have no experience with it.

    The problem shown in the article you linked is very easily solved today, as there are dedicated options in the GPU control panels letting you control limited/full range. It has its roots in anything connected over HDMI being identified as a TV. What I don't like is that the article claims that on Nvidia this can happen even on a DP connection, which is absurd and says a lot about the effort Nvidia puts into their drivers.

  13. #33

    Join Date
    May 2014
    Location
    amsterdam, netherlands
    Posts
    3,182
    Real Name
    George

    Re: Monitor

    Quote Originally Posted by MilosVukmirovic View Post
    Luckily we have special software for our images and color management. I can't speak for anything on Linux as I have no experience with it.

    The problem shown in the article you linked is very easily solved today, as there are dedicated options in the GPU control panels letting you control limited/full range. It has its roots in anything connected over HDMI being identified as a TV. What I don't like is that the article claims that on Nvidia this can happen even on a DP connection, which is absurd and says a lot about the effort Nvidia puts into their drivers.
    I don't understand the advantage of limited range. The article mentions "limited range 16-235"; 8 bits are still needed.

    George

  14. #34
    Moderator Manfred M's Avatar
    Join Date
    Mar 2012
    Location
    Ottawa, Canada
    Posts
    21,925
    Real Name
    Manfred Mueller

    Re: Monitor

    Quote Originally Posted by george013 View Post
    Manfred,

    There is no relation between color gamut and bit depth. Those are two different issues. One can have sRGB or aRGB or whatever gamut with as little as 2-bit color depth (not very usable, though), or with 100 or more bits of color depth.

    I still don't know what the output color depth is.

    George

    For computer screens the answer is yes, as that is how they are designed. If you have an sRGB screen, then 8 bits per channel is what will be used; if you have an AdobeRGB screen, then 10 bits are required.

    In general, for colour spaces, you are absolutely correct: colour spaces are defined by their envelope, and the bit depth simply describes the tonal range values found within that envelope.
    Last edited by Manfred M; 19th August 2016 at 09:02 PM.

  15. #35

    Join Date
    May 2014
    Location
    amsterdam, netherlands
    Posts
    3,182
    Real Name
    George

    Re: Monitor

    Quote Originally Posted by GrumpyDiver View Post
    For computer screens the answer is yes, as that is how they are designed. If you have an sRGB screen, then 8 bits per channel is what will be used; if you have an AdobeRGB screen, then 10 bits are required.

    In general, for colour spaces, you are absolutely correct: colour spaces are defined by their envelope, and the bit depth simply describes the tonal range values found within that envelope.
    I think you're mixing up what is preferred and what the hardware is.
    There are monitors that specify an AdobeRGB gamut and a maximum bit depth of 8 bpc. They are two completely different issues, but when you mix them up you make it more difficult than it is. So when you say "if you have an AdobeRGB screen, then 10 bits are required", it's not required but preferred: because the gamut is wider, if the bit depth stayed at 8 bits the differences between the steps would be bigger. With 10 bits they will be about equal.
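    A back-of-the-envelope sketch of that step-size argument (the 1.4x gamut-width figure below is purely illustrative, not a spec value):

    Code:
    # Same bit depth spread over a wider gamut means coarser steps
    srgb_width, adobe_width = 1.0, 1.4   # illustrative relative gamut widths

    print(srgb_width / 256)    # sRGB at 8 bits (baseline step size)
    print(adobe_width / 256)   # AdobeRGB at 8 bits: ~40% coarser steps
    print(adobe_width / 1024)  # AdobeRGB at 10 bits: finer than sRGB at 8 bits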

    George

  16. #36
    MilosVukmirovic's Avatar
    Join Date
    Jan 2016
    Location
    Belgrade
    Posts
    19
    Real Name
    Milos

    Re: Monitor

    The roots of limited range are in video and broadcasting and their color spaces, where 16-235 are your safe levels; levels 0-15 and 236-255 would still be recorded in the file information in case you need to bring them up. These levels are also called blacker than black and whiter than white.

    The advantage of limited range, even in video, is debatable today, but it is the current standard and practice.
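    A quick Python sketch of the decode direction (my own illustration), with the "blacker than black" and "whiter than white" overshoot clipped off when expanding back to full range:

    Code:
    # Expand limited-range (16-235) video levels back to full range 0-255
    def limited_to_full(v):
        return max(0, min(255, round((v - 16) * 255 / 219)))

    print(limited_to_full(16), limited_to_full(235))  # 0 255
    print(limited_to_full(8), limited_to_full(240))   # 0 255 (overshoot clipped)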

    Your last post is true, George: how large a monitor's color gamut (or space) is depends on how rich (or saturated) its pure RGB primaries are; how many colors you can show inside that space depends on the color bit depth.

  17. #37
    Moderator Manfred M's Avatar
    Join Date
    Mar 2012
    Location
    Ottawa, Canada
    Posts
    21,925
    Real Name
    Manfred Mueller

    Re: Monitor

    Quote Originally Posted by george013 View Post
    I think you're mixing up what is preferred and what the hardware is.
    There are monitors that specify an AdobeRGB gamut and a maximum bit depth of 8 bpc. They are two completely different issues, but when you mix them up you make it more difficult than it is. So when you say "if you have an AdobeRGB screen, then 10 bits are required", it's not required but preferred: because the gamut is wider, if the bit depth stayed at 8 bits the differences between the steps would be bigger. With 10 bits they will be about equal.

    George
    Not necessarily preferred. With my video card / screen I can set 8-bit or 10-bit output. The colours are different, depending on which mode I select, so there seems to be some level of "intelligence" in the video driver that recognizes the hardware I am using. Certain images are more vibrant in 10-bit mode, so it's not just an "envelope" issue. If I throw on an old VGA adapter, the 10-bit option does not appear in the settings menu. I can definitely select rendering intent as well.

  18. #38

    Join Date
    May 2014
    Location
    amsterdam, netherlands
    Posts
    3,182
    Real Name
    George

    Re: Monitor

    Quote Originally Posted by GrumpyDiver View Post
    Not necessarily preferred. With my video card / screen I can set 8-bit or 10-bit output. The colours are different, depending on which mode I select, so there seems to be some level of "intelligence" in the video driver that recognizes the hardware I am using. Certain images are more vibrant in 10-bit mode, so it's not just an "envelope" issue. If I throw on an old VGA adapter, the 10-bit option does not appear in the settings menu. I can definitely select rendering intent as well.
    If the colors are different when viewing a jpg (8-bit) after changing between 8 and 10 bits, with the same gamut, then there's something going on in your hardware or in your operation of the card. The colors should stay the same; it's certainly not down to the definition of gamut and/or bit depth.
    It would be a strange situation to pay so much attention to calibration and then have the colors change like that.

    I don't know what you mean by "I can definitely select rendering intent as well".

    George

  19. #39

    Join Date
    Sep 2015
    Location
    Virginia - USA
    Posts
    884
    Real Name
    Sam

    Re: Monitor

    Quote Originally Posted by James G View Post
    I just switched my BenQ screen to the Display Port (£20 for a new cable)

    NVidia Control panel now shows the following.... output color depth now 10bpc

    I am also looking to eventually upgrade my monitor. The BenQ mentioned here looks to be a possible option. From reading this post it sounds like one needs to have a 10bpc video card to realize the monitor's potential. But from my research it seems that even if you have this option in the NVidia Control panel, you may not really be experiencing this increase in color depth. From the NVidia website:

    "NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. A small number of monitors support 10-bit per color with Quadro graphics cards over DVI." http://nvidia.custhelp.com/app/answe...a-geforce-gpus

    As a result, I am still uncertain as to whether I will need to upgrade my video card (GTX 970) to obtain 10bpc using Lightroom within Windows 10.

  20. #40
    DanK's Avatar
    Join Date
    Dec 2011
    Location
    New England
    Posts
    8,625
    Real Name
    Dan

    Re: Monitor

    Quote Originally Posted by James G View Post
    George, my somewhat limited understanding is that modern inkjet printers can handle a wider colour gamut than monitors. So arguably with a greater colour depth it is possible to print a wider range of tones?

    I now wait to be corrected
    The bit depth of the monitor won't affect the gamut of the printer. The point is that a narrow-gamut monitor (such as mine) can't represent the full gamut of a good modern printer, making it hard to edit some images precisely and to see what you will get from the printer.

