Thread: Monitor

  1. #41

    Join Date: May 2014
    Location: Amsterdam, Netherlands
    Posts: 3,182
    Real Name: George

    Re: Monitor

    Quote Originally Posted by DanK View Post
    The bit depth of the monitor won't affect the gamut of the printer. The point is that a narrow-gamut monitor (such as mine) can't represent the full gamut of a good modern printer, making it hard to edit some images precisely and to see what you will get from the printer.
    I want to add to that.
    The gamut is a property of the output device.
    The bit depth and the associated color value tell you how much of the maximum value is used. So 50% in 8 bits is 128, and in 10 bits it is 512.
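    To put quick numbers on that (just the arithmetic, in a short Python sketch):

        # 50% of full scale at different bit depths
        for bits in (8, 10):
            full_scale = 2 ** bits                      # 256 levels at 8 bit, 1024 at 10 bit
            print(bits, "bit:", int(0.5 * full_scale))  # 8 bit: 128, 10 bit: 512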

    Sam,
    I came to the same conclusion. What I'm still puzzling over are James's settings. The BenQ SW2700 claims to produce 1.07 billion colors, which means 10 bits per channel. Why not call that 10 bits? The video card says 10-bit output, but the monitor says "highest 32 bits", which is 8 bits per channel.
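    For what it's worth, this is how I read those two numbers (a rough sketch; the packing behind the "32 bits" figure is my assumption, not something BenQ or the driver states):

        print(2 ** (3 * 10))  # 1073741824 -> the "1.07 billion colors" of 10 bits per R, G and B
        print(2 ** (3 * 8))   # 16777216   -> the "16.7 million colors" of 8 bits per channel
        print(4 * 8)          # 32         -> "32 bits" read as 8 bits each for R, G, B plus 8 alpha/unused bits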

    George

  2. #42
    Moderator Manfred M
    Join Date: Mar 2012
    Location: Ottawa, Canada
    Posts: 21,978
    Real Name: Manfred Mueller

    Re: Monitor

    Quote Originally Posted by george013 View Post
    What I'm still puzzling over are James's settings. The BenQ SW2700 claims to produce 1.07 billion colors, which means 10 bits per channel. Why not call that 10 bits? The video card says 10-bit output, but the monitor says "highest 32 bits", which is 8 bits per channel.
    That's because we are looking at a screenshot of the video card interface, and I suspect that card is only capable of delivering an 8-bit output. That would be my first guess.

    For a 10-bit display to work properly, the video card must deliver 10-bit data (R, G and B channels + luminosity = 40 bits). The video card driver will only show what the computer screen is capable of accepting; generally these screens only show "valid" options.
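    As a rough illustration of why the whole chain matters (plain arithmetic, not a description of any particular driver): if any stage is limited to 8 bits, the extra 10-bit levels are simply discarded.

        # A 10-bit channel has 1024 levels; pushing it through an 8-bit stage keeps only 256 of them
        levels_10bit = range(1024)
        after_8bit_stage = {value >> 2 for value in levels_10bit}   # drop the two least significant bits
        print(len(levels_10bit), "->", len(after_8bit_stage))       # 1024 -> 256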

  3. #43

    Join Date: May 2014
    Location: Amsterdam, Netherlands
    Posts: 3,182
    Real Name: George

    Re: Monitor

    Quote Originally Posted by GrumpyDiver View Post
    That's because we are looking at a screenshot of the video card interface, and I suspect that card is only capable of delivering an 8-bit output. That would be my first guess.

    For a 10-bit display to work properly, the video card must deliver 10-bit data (R, G and B channels + luminosity = 40 bits). The video card driver will only show what the computer screen is capable of accepting; generally these screens only show "valid" options.
    I've asked this before in this thread: what is the difference between "desktop color depth" and "output color depth"? My assumption is that the desktop color depth shows the status of the monitor and the output color depth shows the status of the card. If you or anybody else can tell me what these mean, I and many others will be pleased.

    George

  4. #44
    MilosVukmirovic
    Join Date: Jan 2016
    Location: Belgrade
    Posts: 19
    Real Name: Milos

    Re: Monitor

    Quote Originally Posted by Sam W View Post
    I am also looking to eventually upgrade my monitor. The BenQ mentioned here looks to be a possible option. From reading this post it sounds like one needs a 10-bpc video card to realize the monitor's potential. But from my research it seems that even if you have this option in the NVIDIA Control Panel you may not really be getting this increase in color depth. From the NVIDIA website:

    "NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. A small number of monitors support 10-bit per color with Quadro graphics cards over DVI." http://nvidia.custhelp.com/app/answe...a-geforce-gpus

    As a result, I am still uncertain whether I will need to upgrade my video card (GTX 970) to obtain 10 bpc using Lightroom within Windows 10.
    I have been searching for reliable, factual information about the differences and general benefits of workstation cards, something that would explain how things function and where (in the code, in the hardware, etc.) they function. I have found nothing, or too little to lead me to a solid conclusion. IMO it is the joint fault of the manufacturers and the developers of the software and hardware we use; they are almost silent about these things.


    From my practical experience, 10 bits per color channel enabled in the GPU control panel + good calibration (and consequently color management) + the correct cable and connection type + a good (wide-gamut) monitor makes a great environment for photo editing.

    Now, if I had enough money I would use a workstation card (primarily for the peace of mind of using the most dedicated option for my workflow) and would probably test the differences myself and try to draw conclusions from that, but sadly this is not the case for now.
