Results 1 to 20 of 26

Thread: Monitor Gamut Question

  1. #1

    Join Date
    Nov 2009
    Location
    Cape Coral, Florida
    Posts
    15

    Monitor Gamut Question

    CRTs are great: they have a wide gamut and are easy to calibrate. I bought an Apple Cinema Display from CompUSA before it went out of business (another company bought it and kept the name). I cannot get it to work. I updated the video card to allow for the right resolution, 1900x1200. If it is defective I will not send it in, because I understand that Apple charges as much for repairs as for a new one.

    My question is which LCD screen works best. Most of what I have seen only has a 70% NTSC gamut, while I understand the HDMI standard is 130% of NTSC. I have a PC and will be running Windows 7 64-bit; I currently have XP Professional x64.

    Thanks.
    Last edited by Colin Southern; 12th November 2009 at 07:02 PM.

  2. #2

    Join Date
    Dec 2008
    Location
    New Zealand
    Posts
    17,660
    Real Name
    Have a guess :)

    Re: Monitor Gamut Question

    Quote Originally Posted by cienfuegos View Post
    CRTs are great: they have a wide gamut and are easy to calibrate. I bought an Apple Cinema Display from CompUSA before it went out of business (another company bought it and kept the name). I cannot get it to work. I updated the video card to allow for the right resolution, 1900x1200. If it is defective I will not send it in, because I understand that Apple charges as much for repairs as for a new one.
    Do you mean that you can't get anything on the screen at all? Have you got the right input selected? Can you use a different cable and try another input?

    My question is which LCD screen works best. Most of what I have seen only has a 70% NTSC gamut, while I understand the HDMI standard is 130% of NTSC. I have a PC and will be running Windows 7 64-bit; I currently have XP Professional x64.

    Thanks.
    I haven't heard of "NTSC gamuts"; most monitors are designed to reproduce the sRGB gamut, and seem fairly similar (personally, I find that a non-reflective screen surface and a wide viewing angle are about the only key things of importance in an external monitor).

    Gamuts will always be an issue (unless you want to fork out for an Adobe RGB gamut class monitor like an Eizo); at the end of the day monitors excel at red, green, and blue, whereas printers excel at cyan and magenta (our eyes aren't as sensitive to yellow) - so if you have highly saturated colours in one, you risk being out of gamut on the other.

  3. #3

    Join Date
    Oct 2009
    Location
    USA - California
    Posts
    445

    Re: Monitor Gamut Question

    I don't know where you read that computer monitors have fewer colors than the NTSC standard. In fact, they have far superior color capabilities to NTSC. NTSC, for those who don't know, is the format standard-definition television is broadcast in across North America, at 60Hz (most other countries use PAL or SECAM, the 50Hz standards). The "HDMI" figure isn't an HDMI standard; what you are referring to is the US/North America standard for broadcast HD (slightly different standards exist for 50Hz-based countries). Again, this has more to do with television than it does computers (unless you're doing video editing).

    Basically, the cheapest monitors won't have great color. Television monitors will not show colors, text, or detail anywhere near as well as a computer monitor will (this is why you don't use a TV as a computer monitor! The hookups are only good for playing videos through your computer!). Most mid-range monitors are fairly equal in regards to color. If you go for a mid-range or even a (non-specialized) high-end monitor, then you should make your decision based on other factors, such as viewing angles, response time, pixel pitch, contrast ratio, and brightness.

    If you are very color-critical then you will be looking at high-end monitors specifically designed for color, photo and video editing. These displays sport 10-bit or 12-bit color support (maybe even more these days). With that said, you will need a video card that can put out that high a quality of signal. Consumer video cards won't do it; you will need something along the lines of an nVidia Quadro FX. Moreover, the software has to support it, and very few products do. nVidia makes plugins for a lot of Adobe products to make use of high-bit output, the flagship being Adobe After Effects (Photoshop for video) with a Quadro FX running an SDI daughterboard. SDI is the professional (and digital) version of component video cables. Again, this is very high-end stuff, used only for professional color-critical applications such as television broadcast or movie production. And for those of you who will say you can't tell the difference - hold your comments until you actually see how beautiful a 10-bit display is
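
    A quick aside on what those bit depths mean in raw numbers (my own illustration, not from any spec sheet) - each extra bit per channel doubles the number of distinct tones per channel, and the total color count is the cube of that:

```python
# Rough illustration: how bits per channel translate into
# the number of distinct displayable colors.
for bits_per_channel in (8, 10, 12):
    levels = 2 ** bits_per_channel  # distinct tones per R/G/B channel
    colors = levels ** 3            # all R/G/B combinations
    print(f"{bits_per_channel}-bit: {levels} levels/channel, {colors:,} colors")
```

    An 8-bit panel gives 16.7 million colors; at 10 bits that jumps to the "over one billion colors" figure you sometimes see quoted for Deep Color.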

    For a very long time LCDs were of lower quality than CRTs, especially in regards to color. This has changed in recent years. Add in the benefit of per-pixel accuracy from the digital signal, and the fact that LCD pixels stay on and flicker off (as opposed to CRT phosphors, which are off and flicker on), and LCD screens are the way to go these days.

    Check first whether you still have a manufacturer warranty on the display before looking for new monitors. I have replied to several posts recently with details on what to look for in choosing a monitor: what the specs mean and why you should pay attention to them.

    Without knowing your price range, I will go out on a limb and recommend a mid-range monitor from a major brand in the size of your choice, in combination with a calibration/profiling device. This will get you the most out of your monitor.

    Happy to answer any more questions you might have, let me know.

  4. #4
    Amberglass's Avatar
    Join Date
    Jul 2009
    Location
    Massachusetts
    Posts
    343

    Re: Monitor Gamut Question

    I have two apple cinema monitors and have been absolutely happy with them. I prefer the matte screen to edit with over the glossy. You can purchase apple care warranty for your monitor if it's having issues.

  5. #5

    Join Date
    Dec 2008
    Location
    New Zealand
    Posts
    17,660
    Real Name
    Have a guess :)

    Re: Monitor Gamut Question

    Quote Originally Posted by KentDub View Post
    NTSC, for those who don't know, is the format standard-def television is broadcast in North America
    Or as we like to say here in New Zealand (PAL), NTSC stands for "Never Twice Same Colour"

  6. #6

    Join Date
    Oct 2009
    Location
    USA - California
    Posts
    445

    Re: Monitor Gamut Question

    Quote Originally Posted by Colin Southern View Post
    Or as we like to say here in New Zealand (PAL), NTSC stands for "Never Twice Same Colour"
    Lol

    Messed up thing is there's some truth to that!

  7. #7
    agaace's Avatar
    Join Date
    Sep 2009
    Location
    WA
    Posts
    183

    Re: Monitor Gamut Question

    I've got questions, plenty of them! Thanks KentDub for all your help in all these topics. I'll move my questions here, it's a more relevant topic.

    You said a high contrast ratio is better, but... isn't that true only to some extent? Like with adjusting contrast in Photoshop... a photo generally looks better with more contrast, but too much contrast is just as bad as not enough. Eizo say (http://www.eizo.com/global/products/.../CMS/02.html):
    Quote Originally Posted by Eizo
    A higher contrast ratio is not necessarily better. If the contrast ratio is too high, sometimes shadow colors can be too firm, making images appear stiff and rendering it difficult to match colors with output. The ability to adjust contrast ratio can also be used as a criterion for choosing a calibration monitor.
    Is it just marketing blah blah? In fact, their ColorEdge specs look way "worse" than those of monitors 4x cheaper (apart from covering over 90% of Adobe RGB).

    I read a bit about the differences between the TN, VA and IPS technologies, and I'd like to go for an IPS myself, mainly because of the viewing angle. I don't mind slower response times; I'm not planning to use my screen for playing games or watching action movies. The only problem is... H-IPS panels are fricking expensive. You scared me with this video card compatibility thing. I was originally planning to use my laptop for editing, but then I figured out it only has a VGA output. So I either buy a cheap monitor (no point getting excited about gamuts if I don't even have DVI), or I invest in my workstation computer (which is pretty old now and needs a MAJOR upgrade). Ouch.

    Been looking at Eizos recently and decided I can't afford what I dream of (either this http://www.eizo.com/global/products/...43w/index.html or this http://www.eizo.com/global/products/...62w/index.html). Their prices are sick. So I went to the NEC website - my intuition was: since they make the panels themselves, maybe they're the brand to go for. I looked at their SpectraView II series. They are pretty expensive, but you get color calibration "built in", so you can deduct the $200-300 you'd probably otherwise spend on a color calibration solution. Does anyone know if their solution is decent, or is it better to go for something else?

    KentDub, my budget is around $1000 (hehe, the advantage of coming from Europe and shopping in the US - for me it will be only around €500-600, so still quite cheap until I start earning in dollars :P). I want a widescreen (EXTREMELY useful in software like Photoshop, Visual Studio, Expression Blend, Expression Designer - I spend 99% of my time in these four). I want it to be IPS, preferably, and around 24" (is there any point in going for 26" if the resolution is the same as a 24"? Actually most 22" models also have the same res). I'm not planning to change it anytime soon, so it has to be a bit more than I need at the moment, so that in 5 years it's still sufficient.

    I was thinking of this beauty: http://www.necdisplay.com/Products/P...0-9af4c4de3fe5 What do you think? Any advice will be extremely helpful.

    As to NTSC, I saw it in the specs of many business-class monitors. Some even claim 120-130% NTSC coverage! I guess it's a new marketing gimmick: 75% of sRGB sounds bad, 125% of NTSC sounds better! (Eizo's guide mentioned above says the main difference is in vivid greens and yellows - that's where sRGB falls short of NTSC.)
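
    For what it's worth, those percentage claims can be roughly reproduced from the published CIE 1931 xy chromaticity primaries of each colour space. This is my own back-of-the-envelope sketch - comparing triangle areas in xy is a crude metric, but it's where the "% of NTSC" marketing numbers come from:

```python
# Gamut triangle area in CIE 1931 xy, via the shoelace formula,
# using the published primaries of each colour space.
def triangle_area(primaries):
    (xr, yr), (xg, yg), (xb, yb) = primaries
    return abs(xr * (yg - yb) + xg * (yb - yr) + xb * (yr - yg)) / 2

ntsc_1953 = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]  # R, G, B
srgb      = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
adobe_rgb = [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)]

ntsc_area = triangle_area(ntsc_1953)
print(f"sRGB      ~ {triangle_area(srgb) / ntsc_area:.0%} of NTSC")
print(f"Adobe RGB ~ {triangle_area(adobe_rgb) / ntsc_area:.0%} of NTSC")
```

    This lands at roughly 71% of NTSC for sRGB and about 96% for Adobe RGB - which suggests the "~70% NTSC" panels mentioned earlier are really just full-sRGB panels described in NTSC terms.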

    And at the end of the day... if my photos end up on websites, they need to be converted back to sRGB anyway. 85% of viewers will see them on cheap uncalibrated screens; maybe 1% will actually have a full sRGB screen. If I end up printing my photos... again sRGB, because I don't know if there are any Adobe RGB printers out there? (Actually I'm planning to find a company that makes big prints in Seattle; I prefer an occasional pro print in large format to put on the wall rather than investing in a printer, inks and papers and having all the hassle myself.) So... what's the point of having an Adobe RGB gamut monitor? I know, you use it during processing to see finer details etc. But in the end, for printing or websites, you convert back to sRGB, because nobody else will be able to see the finer details you might see on your screen. Even movie production and broadcasting... they use all this color-sensitive equipment worth tens of thousands of dollars, but millions of people at home watch it on cheap screens anyway

    An out-of-the-blue question: are the new LED monitors any good for photo editing? How do they compare to TN/VA/IPS in terms of gamuts and viewing angles and stuff?

  8. #8

    Join Date
    Nov 2009
    Location
    Cape Coral, Florida
    Posts
    15

    Re: Monitor Gamut Question

    There were several questions.

    The Cinema Display requires a PCI input plus FireWire and USB 2.0. I get nothing on the screen at all. The little light blinks three short blips, which supposedly signifies an incorrect resolution. I will try 1900x1200 as soon as I can find the disk.

    The 70% gamut figure comes from one of the latest issues of Maximum PC, reviewing about 8 screens including LG. Also, I will see if I can transfer an Adobe article from the University of Washington on the subject of gamut. I don't have the other references handy, but the gamuts graphed were pitiful. They do not come close to the Adobe RGB (1998) or wide-gamut profiles; they barely handle sRGB. CRTs, on the other hand, have the capacity for those.

    I have considered an HD television for a monitor instead; they have both HDMI inputs as well as RGB.

    I will look into the NECs mentioned. FYI, I believe LG makes the Cinema Display. However, LG monitors don't seem to match the Cinema Display.

    Thank you for your help. I am still looking at your posts.

  9. #9

    Join Date
    Nov 2009
    Location
    Cape Coral, Florida
    Posts
    15

    Re: Monitor Gamut Question

    Sorry, I meant DVI

  10. #10

    Join Date
    Nov 2009
    Location
    Cape Coral, Florida
    Posts
    15

    Re: Monitor Gamut Question

    HDMI 1.3:

    * Higher speed: HDMI 1.3 increases its single-link bandwidth to 340 MHz (10.2 Gbps) to support the demands of future HD display devices, such as higher resolutions, Deep Color and high frame rates. In addition, built into the HDMI 1.3 specification is the technical foundation that will let future versions of HDMI reach significantly higher speeds.
    * Deep Color: HDMI 1.3 supports 10-bit, 12-bit and 16-bit (RGB or YCbCr) color depths, up from the 8-bit depths in previous versions of the HDMI specification, for stunning rendering of over one billion colors in unprecedented detail.
    * Broader color space: HDMI 1.3 adds support for “x.v.Color™” (which is the consumer name describing the IEC 61966-2-4 xvYCC color standard), which removes current color space limitations and enables the display of any color viewable by the human eye.
    * New mini connector: With small portable devices such as HD camcorders and still cameras demanding seamless connectivity to HDTVs, HDMI 1.3 offers a new, smaller form factor connector option.
    * Lip Sync: Because consumer electronics devices are using increasingly complex digital signal processing to enhance the clarity and detail of the content, synchronization of video and audio in user devices has become a greater challenge and could potentially require complex end-user adjustments. HDMI 1.3 incorporates automatic audio synching capabilities that allows devices to perform this synchronization automatically with total accuracy.
    * New HD lossless audio formats: In addition to HDMI’s current ability to support high-bandwidth uncompressed digital audio and all currently-available compressed formats (such as Dolby® Digital and DTS®), HDMI 1.3 adds additional support for new lossless compressed digital audio formats Dolby TrueHD and DTS-HD Master Audio™.

    That is a copied text on the HDMI 1.3 standard.
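
    To put some arithmetic behind the 10.2 Gbps figure quoted above (my own sketch - the blanking intervals for the example mode are an assumption, based on typical CEA timings for 1080p60):

```python
# HDMI 1.3 single-link raw capacity: 340 MHz TMDS clock, 3 data
# channels, 10 link bits per channel per clock (8b/10b coding).
tmds_clock_hz = 340e6
link_gbps = tmds_clock_hz * 3 * 10 / 1e9
print(f"HDMI 1.3 single-link: {link_gbps:.1f} Gbps")

def needed_gbps(pixels_per_frame_with_blanking, refresh_hz, bits_per_pixel):
    # 10 link bits carry 8 payload bits, hence the 10/8 overhead factor.
    return (pixels_per_frame_with_blanking * refresh_hz
            * bits_per_pixel * 10 / 8 / 1e9)

# 1080p60 with standard blanking (2200 x 1125 total pixels per frame):
print(f"1080p60 at 24 bpp: {needed_gbps(2200 * 1125, 60, 24):.2f} Gbps")
print(f"1080p60 at 36 bpp: {needed_gbps(2200 * 1125, 60, 36):.2f} Gbps")
```

    Even 1080p60 at 12-bit-per-channel Deep Color (about 6.7 Gbps on the link) fits comfortably inside the 10.2 Gbps single-link budget, which is why the extra headroom was pitched at future resolutions and frame rates.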

  11. #11

    Join Date
    Oct 2009
    Location
    USA - California
    Posts
    445

    Re: Monitor Gamut Question

    Quote Originally Posted by agaace View Post
    You said a high contrast ratio is better, but... isn't that true only to some extent? Like with adjusting contrast in Photoshop... a photo generally looks better with more contrast, but too much contrast is just as bad as not enough. Eizo say (http://www.eizo.com/global/products/.../CMS/02.html):

    Is it just marketing blah blah? In fact, their ColorEdge specs look way "worse" than those of monitors 4x cheaper than these (apart from covering over 90% of Adobe RGB).
    Yes, a higher contrast ratio is always better. You want the calibration/profiling to limit the contrast of your display; the calibration/profiling should not have to push your contrast to the maximum. CRT displays used to lose contrast pretty dramatically as they aged. LCDs also lose contrast, but not nearly as much.

    Quote Originally Posted by agaace View Post
    You scared me with this video card compatibility thing. I was originally planning to use my laptop for editing, but then I figured out it only has a VGA output. So I either buy a cheap monitor (no point getting excited about gamuts if I don't even have DVI), or I invest in my workstation computer (which is pretty old now and needs a MAJOR upgrade). Ouch.
    I was talking about *very* high-end stuff. I have used the SDI solution many times for video post production, and it is in a different league than consumer technologies.

    Quote Originally Posted by agaace View Post
    Been looking at Eizos recently and decided I can't afford what I dream of (either this http://www.eizo.com/global/products/...43w/index.html or this http://www.eizo.com/global/products/...62w/index.html). Their prices are sick. So I went to NEC website - my intuition was - since they make panels themselves, maybe they're the brand to go for. I looked at their SpectraView II series. They are pretty expensive but you get color calibration "built-in", so you can deduct $200-300, what you'd probably spend on a color calibration solution. Does anyone know if their solution is decent or it's better to go for something else?
    I don't have much time to review them for you at the moment (I'm at work), so I'll take a look later this evening. HP has a series of high-bit monitors for color-critical applications that you might break into.

    Quote Originally Posted by agaace View Post
    KentDub, my budget is around $1000 (hehe the advantage of coming from Europe and shopping in the US - for me it will be only around €500-600, so still quite cheap until I start earning in dollars :P).
    Lucky! I would really look at trying to push it by another $200 to $300US. You will break into the color-critical series of monitors at that point.

    Quote Originally Posted by agaace View Post
    I want a widescreen (EXTREMELY useful in software like Photoshop, Visual Studio, Expression Blend, Expression Designer - I spend 99% of my time in these four).
    Wow, you live in the same programs I do! I assume you are a .NET web developer? Maybe even Silverlight, if you're lucky?

    Quote Originally Posted by agaace View Post
    I want it to be IPS, preferably, and around 24" (is there any point in going for 26" if the resolution is the same as a 24"? Actually most 22" models also have the same res). I'm not planning to change it anytime soon, so it has to be a bit more than I need at the moment, so that in 5 years it's still sufficient.
    I have a 28" Viewsonic (27.5" viewable). I absolutely love the large display. I might go a bit smaller, but not smaller than 24" viewable. Yes, there is a very big difference: as you increase the size of the monitor, you need to increase the resolution as well. Windows systems are 96 DPI by default; you can increase this DPI, but not decrease it. This ties into dot pitch (aka pixel pitch) - DPI is really more a term used for printers. The lower the dot pitch (the higher the DPI), the sharper the display. Read this: http://en.wikipedia.org/wiki/Dot_pitch - there is a link at the bottom to a calculator. If you can get a monitor at 96dpi or just over, you'll love it (this is easier to achieve with smaller displays). There is no advantage to getting something drastically over 96dpi - you would have to increase the DPI scaling of Windows to compensate, and not all applications play nice when you do that. You will also need to consider how far you are from the screen - the closer you are, the more important dot pitch and DPI become.
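
    The calculator linked above boils down to one line of arithmetic - pixel density is the diagonal pixel count divided by the diagonal size in inches (the sizes below are just illustrative examples, not specific models):

```python
import math

# Pixel density (pixels per inch) from native resolution and
# diagonal screen size.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

for w, h, diag in [(1920, 1200, 24.0),   # typical 24" widescreen
                   (1920, 1200, 28.0),   # same resolution on a 28" panel
                   (1920, 1080, 50.0)]:  # "Full HD" 50" television
    print(f'{w}x{h} at {diag}": {ppi(w, h, diag):.0f} ppi')
```

    This also shows why a 50" "Full HD" television makes a poor desktop monitor: at roughly 44 ppi it has less than half the density of a 24" panel at the same resolution.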

    Quote Originally Posted by agaace View Post
    I was thinking of this beauty: http://www.necdisplay.com/Products/P...0-9af4c4de3fe5 What do you think? Any advice will be extremely helpful.
    Will take a look this evening.

    Quote Originally Posted by agaace View Post
    As to NTSC, I saw it in the specs of many business-class monitors. Some even claim 120-130% NTSC coverage! I guess it's a new marketing gimmick: 75% of sRGB sounds bad, 125% of NTSC sounds better! (Eizo's guide mentioned above says the main difference is in vivid greens and yellows - that's where sRGB falls short of NTSC.)
    Again, NTSC is for video. You should be looking for sRGB or AdobeRGB.

    Quote Originally Posted by agaace View Post
    And at the end of the day... if my photos end up on websites, they need to be converted back to sRGB anyway. 85% of viewers will see them on cheap uncalibrated screens; maybe 1% will actually have a full sRGB screen.
    This is very true - but think of it this way: 99.99% of audio devices "color" the sound; that is, they don't have a perfectly flat frequency response curve (aka calibrated). Studios still have to be calibrated; if they weren't, you would end up with double-corrections or cancellations (two waveforms stacked on top of each other). Photos and video are the same. Even though it is all digital, it is still subject to how waveforms behave. Since devices "color" audio/video because they are uncalibrated, if the source comes from a calibrated environment, it will *consistently* "color" the signal on the end-user's device. I know this isn't the greatest explanation, but are you following me here?

    Quote Originally Posted by agaace View Post
    If I end up printing my photos... again sRGB, because I don't know if there are any Adobe RGB printers out there? (Actually I'm planning to find a company that makes big prints in Seattle; I prefer an occasional pro print in large format to put on the wall rather than investing in a printer, inks and papers and having all the hassle myself.) So... what's the point of having an Adobe RGB gamut monitor? I know, you use it during processing to see finer details etc. But in the end, for printing or websites, you convert back to sRGB, because nobody else will be able to see the finer details you might see on your screen. Even movie production and broadcasting... they use all this color-sensitive equipment worth tens of thousands of dollars, but millions of people at home watch it on cheap screens anyway
    See above for why you want a high-performance, calibrated environment. My Canon 8-ink printer prints in AdobeRGB - even 16-bit! While some will argue "the technology is coming" and others will counter "when? 20 years?", just look at the high-end stuff out there right now; it's just around the corner for consumers. AdobeRGB/16-bit printing from a home printer was unheard of just a couple of years ago. My display can't show all of the colors that my camera and printer can provide - but they are there, and in 5 years, when I'm in the market for a high-end monitor (like you are now), I'll finally have a display that lets me view them in their full glory without having to print them.

    Quote Originally Posted by agaace View Post
    An out-of-the-blue question: are the new LED monitors any good for photo editing? How do they compare to TN/VA/IPS in terms of gamuts and viewing angles and stuff?
    Yes and no. I can't be definitive here because I haven't used an LED computer monitor, but I can tell you some of the advantages of LED. First off, LED is just the backlighting technology used - it is not the display itself. LEDs use a fraction of the power, and will last 10x longer than traditional LCD backlighting. The most important feature LED screens give us is a consistent backlight. Try this little test: with your monitor on, display a full-screen image that is 100% black. Look closely at the screen; you will see that the edges where the backlight bulbs sit are slightly brighter (on smaller screens this is sometimes just two sides; on larger screens it's all four). It is almost reverse vignetting! LED-backlit monitors do not have this issue. So I would have to say that, as long as the panels in front of the backlights are equal, LED screens are better.
    Last edited by KentDub; 13th November 2009 at 07:13 PM. Reason: Had written 'higher' instead of 'lower' when referring to dot pitch

  12. #12
    Amberglass's Avatar
    Join Date
    Jul 2009
    Location
    Massachusetts
    Posts
    343

    Re: Monitor Gamut Question

    Have you considered this monitor for your needs? http://accessories.us.dell.com/sna/p...9&sku=223-4890

    The Apple Cinema Displays are designed for Macs, not any other brand, btw.

  13. #13

    Join Date
    Oct 2009
    Location
    USA - California
    Posts
    445

    Re: Monitor Gamut Question

    Quote Originally Posted by cienfuegos View Post
    HDMI 1.3:

    * Higher speed: HDMI 1.3 increases its single-link bandwidth to 340 MHz (10.2 Gbps) to support the demands of future HD display devices, such as higher resolutions, Deep Color and high frame rates. In addition, built into the HDMI 1.3 specification is the technical foundation that will let future versions of HDMI reach significantly higher speeds.
    * Deep Color: HDMI 1.3 supports 10-bit, 12-bit and 16-bit (RGB or YCbCr) color depths, up from the 8-bit depths in previous versions of the HDMI specification, for stunning rendering of over one billion colors in unprecedented detail.
    * Broader color space: HDMI 1.3 adds support for “x.v.Color™” (which is the consumer name describing the IEC 61966-2-4 xvYCC color standard), which removes current color space limitations and enables the display of any color viewable by the human eye.
    * New mini connector: With small portable devices such as HD camcorders and still cameras demanding seamless connectivity to HDTVs, HDMI 1.3 offers a new, smaller form factor connector option.
    * Lip Sync: Because consumer electronics devices are using increasingly complex digital signal processing to enhance the clarity and detail of the content, synchronization of video and audio in user devices has become a greater challenge and could potentially require complex end-user adjustments. HDMI 1.3 incorporates automatic audio synching capabilities that allows devices to perform this synchronization automatically with total accuracy.
    * New HD lossless audio formats: In addition to HDMI’s current ability to support high-bandwidth uncompressed digital audio and all currently-available compressed formats (such as Dolby® Digital and DTS®), HDMI 1.3 adds additional support for new lossless compressed digital audio formats Dolby TrueHD and DTS-HD Master Audio™.

    That is a copied text on the HDMI 1.3 standard.
    Specifications are often developed years before the technology is actually delivered to consumers. This is geared more towards the future of television.

    Computers had a brief visit with HDMI, but are quickly moving to the new DisplayPort connectors. This is because computers have consistently required higher resolutions (and thus more bandwidth) than televisions. Video card manufacturers will not release consumer cards capable of more than 8-bit color for a long time - they will force users onto their professional cards for some time yet. Even if they did, or if you purchased the professional video card, you'd still have to address the software issue - most (and by most I mean nearly all) software won't put out the higher-quality signal. Most of the people using the technology are running custom-built specialized software (such as military or industrial applications), or are using the SDI daughterboard so they can connect into existing production pipelines (television broadcast and movie production).

    Stay away from televisions used as computer monitors. They are built for different things; just because they can accept each other's signals doesn't mean you should! A lot of this has to do with pixel pitch and DPI. As Colin argues with printed images, as the size of the print increases, so does the viewing distance. If you put a "Full HD" signal out to a 50" television from a computer but sit 4ft away, it will look like garbage because the DPI is horridly low. If you stand 10ft to 15ft away, it will look acceptable. Computer monitors are more expensive than televisions on a size basis for a reason: the dot pitch and DPI are much, much higher on a computer screen - that is what gives us the sharpness needed to read small (really, normal-sized) fonts.

  14. #14

    Join Date
    Dec 2008
    Location
    New Zealand
    Posts
    17,660
    Real Name
    Have a guess :)

    Re: Monitor Gamut Question

    Quote Originally Posted by KentDub View Post
    Yes, higher contrast ratio is always better. You want the calibration/profiling to limit the contrast of your display.
    Not really - the "ideal" is for users to calibrate the monitor so that the profile doesn't have to do much; the more the profile has to do, the more levels are wasted. I'll give you an example ...

    Suppose you have a REALLY bright monitor, with mucked-up contrast to make things even worse. Say it goes to black at "level 50", "level 255" is full brightness, but level 128 gets it as bright as it needs to be.

    In this scenario when you profile the screen the profile has to translate a potential range of 256 values into a range of 78 - and obviously that's too big a step between levels and will be visibly obvious on gradients.

    If you can calibrate the screen so that 0 = black and 255 is the ideal brightness then the profile doesn't have to do anything and the full 256 levels are available.
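
    Colin's example can be sketched in a few lines (my own illustration of the arithmetic, not any real profiling tool):

```python
# If a badly calibrated screen only spans hardware levels 50..128,
# the profile must squeeze all 256 input values into that span.
lo, hi = 50, 128
mapped = {round(lo + v * (hi - lo) / 255) for v in range(256)}
print(f"distinct output levels: {len(mapped)} of 256")
```

    Only 79 distinct levels (the 78-step range in Colin's example) survive, so smooth gradients turn into visible bands; calibrate the screen properly first and the profile leaves all 256 levels intact.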

  15. #15

    Join Date
    Oct 2009
    Location
    USA - California
    Posts
    445

    Re: Monitor Gamut Question

    Quote Originally Posted by Colin Southern View Post
    Not really - the "ideal" is for users to calibrate the monitor so that the profile doesn't have to do much; the more the profile has to do, the more levels are wasted. I'll give you an example ...

    Suppose you have a REALLY bright monitor, and with mucked up contrast to make things even worse. Say it goes to black at "level 50" and "level 255" is full brightness, but level 128 gets it to as bright as it needs to be.

    In this scenario when you profile the screen the profile has to translate a potential range of 256 values into a range of 78 - and obviously that's too big a step between levels and will be visibly obvious on gradients.

    If you can calibrate the screen so that 0 = black and 255 is the ideal brightness then the profile doesn't have to do anything and the full 256 levels are available.
    That is true - thank you for the correction, Colin. I was in the mindset of high-end equipment (equipment capable of >8-bit processing). Note that the signal doesn't necessarily have to be more than 8bpp to take advantage of >8bpp processing.

  16. #16

    Join Date
    Dec 2008
    Location
    New Zealand
    Posts
    17,660
    Real Name
    Have a guess :)

    Re: Monitor Gamut Question

    Quote Originally Posted by KentDub View Post
    That is true, thank you for the correction, Colin. I was in the mindset of highend equipment (equipment capable of >8bit processing). Note that the signal dosn't necessarily have to be more than 8bpp to take advantage of >8bpp processing.
    Call me "old", call me a "fool" ... just don't call me an "old fool" (hey, I still have some pride left!) - but I reckon just about any LCD monitor* with a wide viewing angle and matte finish is just fine for 99% of photographic enthusiasts

    *when calibrated and profiled!

  17. #17

    Join Date
    Oct 2009
    Location
    USA - California
    Posts
    445

    Re: Monitor Gamut Question

    Quote Originally Posted by Colin Southern View Post
    Call me "old", call me a "fool" ... just don't call me an "old fool" (hey, I still have some pride left!) - but I reckon just about any LCD monitor* with a wide viewing angle and matt finish is just fine for 99% of photographic enthusiasts

    *when calibrated and profiled!
    I disagree. I think the deciding factor (should be, at least) is whether the end-user can tell the difference (feel free to argue that 99% of people can't see the difference - most probably can't!). A $150 camera and a $3000 camera both take pictures, but do it quite differently, and with dramatic quality differences. A $150 monitor and a $3000 monitor both display images, but do it differently and with dramatic quality differences.

    "Dramatic" is a completely subjective word - so like I said, it's only worth it if you can see it. Lots of us buy accessories to maintain the highest optical quality on our cameras, oftentimes purchasing quality we can't see. Why go through all of that trouble if you are showing it on a cheap monitor? I think "you get what you pay for" still applies here.

    If you care about that quality, and more importantly have an eye that can see the difference, then researching and purchasing a mid-range to high-end monitor is very rewarding.

  18. #18

    Join Date
    Dec 2008
    Location
    New Zealand
    Posts
    17,660
    Real Name
    Have a guess :)

    Re: Monitor Gamut Question

    Quote Originally Posted by KentDub View Post
    I disagree. I think the deciding factor (should be, at least) is whether the end-user can tell the difference
    No disagreement there

    (feel free to argue that 99% of people can't see the difference - most probably can't!).
    You got it

    A $150 camera and a $3000 camera both take pictures - but do it quite differently, and with dramatic quality differences.
    I think you'd be unpleasantly surprised. In the latest edition of Real World Image Sharpening they've printed six images of the same scene shot with cameras including an iPhone, a 450D, a 50D (I think), a 1Ds3, and a Hasselblad ... and you really can't see any difference (obviously printed small to compensate for the differences in megapixels). I'm not saying that there aren't differences, but what differences there are come down more to differing performance in extreme situations, and that doesn't apply to a monitor.

    A $150 monitor and a $3000 monitor both display images, but do it differently and with dramatic quality differences.
    Let me put it this way - I reckon that if you displayed images on a calibrated and profiled run-of-the-mill monitor and a top-of-the-line Eizo, then most people wouldn't be able to tell the difference.

  19. #19

    Join Date
    Oct 2009
    Location
    USA - California
    Posts
    445

    Re: Monitor Gamut Question

    Quote Originally Posted by Colin Southern View Post
    Not really - the "ideal" is for users to calibrate the monitor so that the profile doesn't have to do much; the more the profile has to do, the more levels are wasted. I'll give you an example ...

    Suppose you have a REALLY bright monitor, and with mucked up contrast to make things even worse. Say it goes to black at "level 50" and "level 255" is full brightness, but level 128 gets it to as bright as it needs to be.

    In this scenario when you profile the screen the profile has to translate a potential range of 256 values into a range of 78 - and obviously that's too big a step between levels and will be visibly obvious on gradients.

    If you can calibrate the screen so that 0 = black and 255 is the ideal brightness then the profile doesn't have to do anything and the full 256 levels are available.
    So I was walking around thinking about this, and yes, higher contrast ratios are ALWAYS better. What you are referring to is contrast itself, not the device's contrast ratio. Ask yourself this: how black is your black? Most monitors look dark grey when showing a pure black screen. This is because their limited contrast ratios will merge all blacks together if they are below a certain brightness (usually set by the backlight of the screen).

    The "ideal" is a calibrated monitor; all my statements are made on the assumption that the monitor is correctly calibrated. Two calibrated monitors with different contrast ratios will appear visually different if placed side by side (and 9 out of 10 people will prefer the higher contrast ratio monitor).

    The error in my original statement was suggesting that profiling/calibrating limits the capabilities of the display, rather than setting the monitor to achieve its maximum potential.

    The job of calibrating the device is to achieve the maximum contrast the device can output WITHOUT clipping any detail - in other words, to maximize the dynamic range of the monitor. Calibrating the device will never reduce the performance of the monitor; it simply sets it correctly according to its physical capabilities.

    A high-contrast image displayed on a monitor with a low contrast ratio will end up with blacks merging together and whites merging together visually. A higher contrast ratio (i.e. dynamic range) will handle this situation with much more grace.
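To put rough numbers on that (the luminance figures below are hypothetical, chosen only for illustration): contrast ratio is simply white luminance divided by black luminance, so a panel whose backlight leaks more light through "black" has proportionally less dynamic range to work with.

```python
# Contrast ratio = white luminance / black luminance.
# The cd/m^2 figures below are hypothetical examples.
import math

def contrast_ratio(white_cd_m2, black_cd_m2):
    return white_cd_m2 / black_cd_m2

good = contrast_ratio(300, 0.3)  # deep blacks at a 300 cd/m^2 white point
poor = contrast_ratio(300, 1.5)  # backlight leaking five times as much light

print(round(good))                 # a 1000:1-class panel
print(round(poor))                 # a 200:1-class panel
print(round(math.log2(good), 1))  # roughly 10 stops of on-screen range
```

Same white point in both cases; only the black level differs, which is why the lower-contrast panel shows that grey "film" over dark scenes.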

    The situation you describe, Colin, is similar to how I had my monitor set before I bought a calibration system. It was very crisp, but I was crushing a lot of my blacks to get that very nice look (simulating a higher contrast ratio at the expense of black detail). Now that it is properly calibrated, there is almost a grey film over my monitor, due to its limited contrast ratio. If my monitor had a decent contrast ratio (around 1000:1), then I would not have the "film".

    Please note, for those who don't completely understand: contrast ratio is completely different from cranking up the contrast slider in your monitor's OSD (which is closer to what Colin is talking about).

    So in the end, a higher contrast ratio = blacks are "blacker", and whites are "whiter".
    Last edited by KentDub; 13th November 2009 at 09:48 PM.
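Colin's hypothetical scenario quoted above (panel reaching black at input level 50, ideal white at level 128) can be sketched numerically: the profile has to squeeze all 256 input codes into the 50-128 window, so only 79 distinct output codes survive, and the coarse steps show up as banding on gradients.

```python
# Colin's hypothetical badly-set monitor: input 50 maps to black and
# 128 is already the desired white point, so the profile must compress
# the full 0..255 input range into the 50..128 output window.

LOW, HIGH = 50, 128  # usable output window from the scenario above

def profile(v):
    """Map an input code 0..255 into the usable output window."""
    return LOW + round(v / 255 * (HIGH - LOW))

out = [profile(v) for v in range(256)]
print(len(set(out)), "distinct levels left out of 256")
```

Calibrating the monitor first (so 0 is black and 255 is the target white) makes `profile` an identity mapping and keeps all 256 levels, which is the point Colin was making.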

  20. #20

    Join Date
    Oct 2009
    Location
    USA - California
    Posts
    445

    Re: Monitor Gamut Question

    I think at some point I'll write a FAQ/tutorial about all of this, complete with references for those who are interested in the subject. Re-reading everything, I find my thoughts are all over the place, and I imagine it's hard for others to follow!
