
Thread: Hardware Calibration of Dell Ultrasharp

  1. #1

    Hardware Calibration of Dell Ultrasharp

    Six months ago I got an exceptional deal on a new-in-box, direct from Dell, U2413 wide gamut monitor. I plugged it into my Mac mini and liked it immediately. I already knew that to calibrate the monitor's internal LUT, Dell forces you to use the X-Rite i1Display Pro. I soon found out that I had to buy a Mini DisplayPort to DVI adapter to hook the monitor up to the Thunderbolt port to even get wide gamut output, so I hooked up the old monitor, a Dell 2320L, with the HDMI cable I had been using, and now have a dual display setup, which I also like very much except for one annoying occasional problem: every once in a while the Dock moves to the second monitor. I don't know why, and I don't know why it also moves back at some point. The HDMI port on a Mac only puts out Rec. 709 color with its limited dynamic range of 16 to 235 in each color channel, so you actually can't even calibrate sRGB.
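    To put a number on what that limited range costs, here is a minimal sketch (plain Python; the 16 and 235 endpoints are the standard 8-bit video levels, nothing Mac- or Dell-specific) of the full-range to limited-range mapping:

# Sketch: map full-range 8-bit code values (0-255) to the limited "video"
# range (16-235) used for Rec. 709 video levels. Illustrative only.

def full_to_limited(v):
    """Map a full-range 8-bit value into the limited 16-235 range."""
    return round(16 + 219 * v / 255)

outputs = {full_to_limited(v) for v in range(256)}
print(len(outputs))                              # 220 distinct codes instead of 256
print(full_to_limited(0), full_to_limited(255))  # 16 235

    That is 36 fewer distinct levels per channel before any calibration curve is even applied.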
    Anyway, I set the monitor to Adobe RGB, 6500, and 2.2 for starters, and the improvement in images with embedded Adobe RGB profiles was noticeable. The factory default Adobe RGB looked good and the spec sheet for the factory calibration showed really good numbers, so I wasn't sure I'd bother with getting an X-Rite. I did find a decent price and bought one. Googling for how-tos, I collected a bunch of web articles and found a forum called Photography Life, which has a thread titled How To Properly Calibrate A Dell U2413. The beginning of the thread has a detailed procedure with pictures and links, followed by many comments from readers. There is at least one error: it says to set the monitor to "Native", but the U2413 doesn't have Native, so you have to guess that 6500 is close enough for government work. If you expend a lot of effort you can find the Dell UltraSharp Color Calibration Solution software for the Mac on the Dell Support page; the current version is 1.5.3. I'll return to that in a bit.
    The thread comments contain quite a bit of misinformation, confusion, and Windows advocacy, including a digression about Windows being the only way to go because it has 10-bit-per-channel color and the Mac is not capable of that. Yes, Windows 7 and above can be set for 10 bit with the right video card, but you might have to dedicate that machine to Photoshop, because it breaks most other apps. 8 bits does mean that you are only setting 256 points of a 16,384-point LUT, but even if you have a 10-bit pipeline, the U2413 is actually an 8-bit + FRC panel, so you are feeding an interpolation to an interpolation, for whatever that is worth. It is doubtful that you would notice 10 bit anyway; humans can only distinguish at most about six million colors if you include luminosity. At best, 10 bit eliminates any real possibility of banding in wide color spaces.
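    To put the "interpolation of an interpolation" point in concrete terms, here is a minimal sketch (plain Python, with a made-up gamma-tweak curve standing in for a real calibration) of a 256-entry correction being stretched across a 16,384-entry, 14-bit monitor LUT:

# Sketch: an 8-bit pipeline only lets you set 256 control points; the
# monitor's 14-bit LUT (2**14 = 16384 entries) interpolates the rest.

NUM_POINTS = 256                 # control points deliverable over an 8-bit link
LUT_SIZE = 2 ** 14               # entries in a 14-bit LUT

# Hypothetical correction curve: a mild gamma tweak, one value per 8-bit code.
correction = [(i / (NUM_POINTS - 1)) ** (2.2 / 2.35) for i in range(NUM_POINTS)]

def stretch(curve, size):
    """Linearly interpolate a short curve up to 'size' entries."""
    out = []
    for j in range(size):
        x = j * (len(curve) - 1) / (size - 1)
        lo, frac = int(x), x - int(x)
        hi = min(lo + 1, len(curve) - 1)
        out.append(curve[lo] * (1 - frac) + curve[hi] * frac)
    return out

monitor_lut = stretch(correction, LUT_SIZE)
print(len(monitor_lut), LUT_SIZE // NUM_POINTS)  # 16384 entries, roughly 64 per measured point

    Whether the panel behind that table is true 10-bit or 8-bit + FRC, the great majority of those entries are interpolated rather than measured.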
    To return to the Dell calibration software: its folder contains a readme which says "Some Mac video cards may cause the computer to crash and fail to reboot to the desktop. Connect a second monitor and reboot, then reconnect the primary monitor."
    Great. All you need is an ambiguous FUD statement before using a new device with a new monitor.
    I contacted Bruce Wright at X-Rite on the recommendation of a thread commenter and told him I had tried Dell support in the past and it was a nightmare. He asked me to try them again and, if they didn't help me, to get a case number so that X-Rite could deal with it at the company level. After half an hour of penetrating the Dell support structure, I got another rep in the Concierge area who appeared not to know that Dell makes monitors, would not transfer me to someone who does, and would not give me a case number. I reported this to Bruce, who assured me they would take it from there.
    He said this today, so it will probably be a week before I can find out anything.
    What kind of an idiot would write a non-specific warning that assumes you have a second monitor handy to recover from a crash?
    Last edited by Richard Lundberg; 4th December 2014 at 05:06 AM.

  2. #2
    ajohnw's Avatar
    Join Date
    Aug 2012
    Location
    S, B'ham UK
    Posts
    3,337
    Real Name
    John

    Re: Hardware Calibration of Dell Ultrasharp

    Did the statement about the problem have a date on it, Richard? The problem, if it exists, may have gone away.

    Any software I have written never has any problems, but software people are sometimes placed in an awkward position. There might be a million Mac users who are perfectly happy and haven't had any problems at all. A couple may have - who knows why. It might just be down to some other software installed on the machine. It might even be a near act of God - something happened at the only instant when it could cause a problem. This sort of thing is particularly pertinent to PCs.

    Personally, as you do have two monitors, I would give it a go. You can also still soft-calibrate the monitor if you wish. My recollection is that it doesn't meet the usual recommendations as it comes; either method will fix that.

    John
    -

  3. #3
    yauman's Avatar
    Join Date
    May 2013
    Location
    Martinez, CA, USA
    Posts
    47
    Real Name
    Yau-Man Chan

    Re: Hardware Calibration of Dell Ultrasharp

    OK, let me try to clear things up a little, as I own two Dell monitors on two Mac systems as well as a Dell Optiplex.

    I have a Dell U2713H (27”) attached to my 15" MacBook Pro 2012 Retina via a Belkin Thunderbolt hub. I also have its little brother, the Dell U2413 (24”), attached to my Mac Pro 3,1 (Early 2008). Except for the size, the U2713H and U2413 are pretty much identical units (and the difference in screen area is smaller than the diagonal figures suggest). Because of the high resolution, if you are using the DVI port, make sure you are using a Dual-Link DVI cable with it, otherwise you see fuzz! I like them very much - obviously, since I got the U2713H first and then bought the U2413 a year later.

    Here’s the low down on these monitors and the Mac:
    1. Dell does not want to, or know how to, support Macs, and they don't care. When first released, the U2713H could not deal with Mac video output without an add-on to the Mac’s display profile - you can google it and find the script. That problem seems to have been fixed, but anyone who tried to call Dell for help was basically told straight out that they don't support the Mac (...and they don't care!!!)
    2. Yes, the monitors are capable of hardware calibration - you can calibrate and update the two internal LUTs (Look-Up Tables) - but you can only do it from a Windows PC. Dell's program runs on the PC only, and you HAVE to use the i1Display Pro calibrator from X-Rite. The Dell “SetUp” program for calibration (which comes with the monitor) is basically the i1Display Pro program lobotomized (printer and projector calibration options removed), with default parameters customized for the Dell monitor. I have the i1Display Pro and have used it to hardware-calibrate both monitors from my Dell PC.
    3. You can run the full-version i1Display Pro software from X-Rite on the Mac - they just updated it for the latest Mac OS. It will soft-calibrate - i.e., build a color profile which your Mac can be set to use whenever it sees the Dell display (under System Preferences > Displays > Color). I calibrate my monitors once every couple of months, and it's quite easy to do with the i1Display Pro.

    So there you have it. I love the Dell U2713H and U2413 and think they are much better than the Apple Thunderbolt Display. If you use the Apple display you don't need to buy a Thunderbolt hub - it has one built into the display - but when it comes to color and faithfulness to a profile, these two Dell UltraSharps beat the Apple hands down. The Dell U2713H was about the same price as the Apple when I got it; it's now about US$100 to US$200 less than Apple's. I got the U2413 on eBay for only US$300 - new, though it looked like a floor model when it arrived, a bit dusty, but for that price I'm not complaining.

  4. #4
    Shadowman's Avatar
    Join Date
    Dec 2009
    Location
    WNY
    Posts
    36,717
    Real Name
    John

    Re: Hardware Calibration of Dell Ultrasharp

    Does anyone calibrate their tablets? Is it even possible?

  5. #5

    Re: Hardware Calibration of Dell Ultrasharp

    I am using the DVI cable that came with the monitor, whatever that is, and I don't see any fuzz.
    Your comment 2 is incorrect. The Dell Support page, if you can penetrate its mysteries, has a Mac hardware calibration tool, the Dell UltraSharp Calibration Solution ver. 1.5.3, specifically for the Mac. It, like the Windows version, requires the i1Display Pro.
    The problem is that, according to the readme file, "some Mac video cards will crash and will not reboot to the desktop; a second monitor must be used to reboot, after which the U2413 can be reconnected." Which Macs? This is the question my contact at X-Rite, Bruce Wright, is having addressed by the X-Rite Dell liaison. I highly recommend Bruce if you have any X-Rite needs.
    I still question the efficacy of a 256-point correction to a 16,384-point LUT from a Mac, and further question the efficacy of a 1024-point correction from a 10-bit Windows machine driving an 8-bit + FRC display, but we'll see. I may go through the whole exercise and end up just using the factory default Adobe RGB.
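    As a rough sanity check on that, here is a sketch (plain Python, assuming a simple gamma-style correction; a real calibration curve is messier) of how many of the 16,384 LUT codes an 8-bit versus a 10-bit pipeline can actually address:

# Sketch: count the distinct 14-bit LUT codes a monotonic correction curve
# can set when sampled at 256 (8-bit) vs 1024 (10-bit) input steps.

LUT_CODES = 2 ** 14              # 16384 addressable codes in a 14-bit LUT

def codes_used(steps, gamma=2.2):
    """Apply a gamma-style correction and count distinct 14-bit outputs."""
    return len({round((i / (steps - 1)) ** (1 / gamma) * (LUT_CODES - 1))
                for i in range(steps)})

print(codes_used(256))    # 256  -> the 8-bit path touches about 1.6% of the table
print(codes_used(1024))   # 1024 -> the 10-bit path still touches only about 6%

    Either way the monitor fills in the vast majority of the table itself, which is the earlier point about feeding an interpolation to an interpolation.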

  6. #6
    Loose Canon's Avatar
    Join Date
    Jun 2010
    Location
    Missouri, USA
    Posts
    2,454
    Real Name
    Terry

    Re: Hardware Calibration of Dell Ultrasharp

    I appreciate this thread.

    I'm taking the 2413 off my "wish list". No way I want to deal with this mess and I wasn't close enough to pulling the trigger on one yet to get deeper into the research.

    I have dealt with Dell before. By the time I was done I wanted to kill everyone with an axe! Swore I would never purchase another Dell product no matter how good it was.

    Add to that the fact that I am an Apple user!

  7. #7
    ajohnw's Avatar
    Join Date
    Aug 2012
    Location
    S, B'ham UK
    Posts
    3,337
    Real Name
    John

    Re: Hardware Calibration of Dell Ultrasharp

    I haven't asked Dell about anything for a long time, but when I did it wasn't via a phone call. They had a type-it-out-on-a-web-form route to support. It was OK - it could be that the question was passed to the right person. Call centre people are often rather cheap and can only answer questions that most users could sort out for themselves, though there are exceptions at the user end too. I'd guess it's not a fun job for many, which can alter attitudes as well.

    I seem to remember that Richard installed the profiling and calibration software I use. An XYZ LUT calibration will easily get errors down to insignificant levels by installing a LUT on the video card. The only problem is that Firefox doesn't like the channels being swapped, as it doesn't check which is which - it should. A plain, ordinary ICC profile will in all probability do the same on what is basically a good-quality monitor. All the hardware in-monitor LUTs do is work to a deeper bit depth. If the delta Es are all low with an 8-bit LUT, so what; the extra bit depth doesn't offer any real advantage other than not losing single bits due to its range, and a 14-bit LUT still has to be rounded. The other problem with LUTs is that they are calculated at specific colour settings, so the overall response may not be smooth. An ICC file will be smooth even if only a few colour patches are used, but may have greater errors.
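    For anyone not used to delta E figures, it is just a distance in Lab colour space. A minimal sketch of the simple CIE76 form, in plain Python with made-up target and measured values:

import math

def delta_e_76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical values: target neutral mid grey vs what the probe measured.
target = (50.0, 0.0, 0.0)
measured = (50.4, 0.3, -0.5)
print(round(delta_e_76(target, measured), 2))   # 0.71

    A delta E of around 1 is commonly quoted as roughly the threshold of visibility, which is why "all low" is what matters, whatever bit depth was used to get there.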

    10-bit aRGB or even deeper? No comment, but there are implications.

    The only problem I had is probably down to Dell's marketing people, but the way software has gone of late it could also be down to the people who write it. When I balanced RGB to the target colour temperature previously I achieved an exact balance; the controls went 0-255, giving nice small steps. Now they go 0-100, as numbers like 255 confuse people, or so it's believed. There's probably little point, but I got round it by shifting the target colour temperature a bit - not enough to be concerned about. The contrast and brightness settings from TFT Central's database save time as well.
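    To put a number on how much coarser the new controls are - simple arithmetic in Python, nothing monitor-specific:

# Sketch: one click on a 0-255 gain control vs one click on a 0-100 control,
# expressed as a fraction of full scale.

step_fine = 1 / 255      # about 0.39% of full scale per click on the old 0-255 controls
step_coarse = 1 / 100    # 1.0% of full scale per click on the new 0-100 controls
print(round(step_coarse / step_fine, 2))   # 2.55 -> each new click is about 2.5x coarser

    That is why nudging the target colour temperature instead is a reasonable workaround.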

    John
    -

  8. #8
    Moderator Manfred M's Avatar
    Join Date
    Mar 2012
    Location
    Ottawa, Canada
    Posts
    21,942
    Real Name
    Manfred Mueller

    Re: Hardware Calibration of Dell Ultrasharp

    I have followed this thread with a bit of interest and find it a bit amusing that Dell, rather than Apple, is getting the blame.

    My experience (through a video co-op I belong to) suggests the finger should be pointed at Apple, not Dell.

    Apple has a long history of using its "walled garden" approach to hardware and software. If Apple sells the gear or software, they will support it very well. On the other hand, if you buy third party hardware or software you are on your own. The Apple help folks point you straight at your third party supplier.

    If you speak to the third party, you will find that they are just as frustrated by the lack of support from Apple. We had issues with both a high-end third party video monitor card and some well known third party video software. Both suppliers told us the same thing: they got no help at all from Apple to resolve the issue. Microsoft, on the other hand, has a large group dedicated to working with third party suppliers to resolve issues.

  9. #9
    ajohnw's Avatar
    Join Date
    Aug 2012
    Location
    S, B'ham UK
    Posts
    3,337
    Real Name
    John

    Re: Hardware Calibration of Dell Ultrasharp

    Quote Originally Posted by GrumpyDiver View Post
    I have followed this thread with a bit of interest and find it a bit amusing that Dell, rather than Apple, is getting the blame.

    My experience (through a video co-op I belong to) suggests the finger should be pointed at Apple, not Dell.

    Apple has a long history of using its "walled garden" approach to hardware and software. If Apple sells the gear or software, they will support it very well. On the other hand, if you buy third party hardware or software you are on your own. The Apple help folks point you straight at your third party supplier.

    If you speak to the third party, you will find that they are just as frustrated by the lack of support from Apple. We had issues with both a high-end third party video monitor card and some well known third party video software. Both suppliers told us the same thing: they got no help at all from Apple to resolve the issue. Microsoft, on the other hand, has a large group dedicated to working with third party suppliers to resolve issues.
    That has probably improved now, Manfred, down to Apple basing OS X on a Unix-like foundation. I have read that Macs now ship with bash (the Bourne Again Shell) - a horrific thing if you like concise shells; DOS is a shell and is much more concise, too much so. This should in principle make it easier for developers to get into it and do things with it, and it probably has. You are correct about their attitude, though, and I suspect that has seriously limited sales. PCs are a little different. IBM produced their technical reference manual, which effectively laid the entire hardware at people's feet, and it went from there - anybody could make them, for a start. MS stuff has its odd aspects designed to make life difficult for others, but there have always been clear routes into it.

    Apple still play games. AppleTalk, an open and rather insecure protocol for bits of kit to talk to each other, changed recently, making life difficult for some; changes like this usually do. iPads might work with clone card readers at one point but might not the next time the software is upgraded - a bit like Canon and AF-confirm adapters.

    I'm inclined to agree that this area may relate to Dell's problems: lack of info, and a supplier not willing to provide it. There are, however, now a lot of cross-platform dev kits available that can generate software to run on all sorts of things, and they are getting pretty robust these days.

    John
    -

  10. #10

    Re: Hardware Calibration of Dell Ultrasharp

    Dell is just basically impossible to deal with. The Apple support community can usually provide a clue to solve most problems, but for something like this you would have to pay, and I prefer to let X-Rite hash it out with Dell.
    Microsoft has this problem with Windows: they tell you exactly what to do when you install or try something, but they don't tell you what you absolutely must do BEFORE you install or try something.

  11. #11

    Re: Hardware Calibration of Dell Ultrasharp

    FWIW, here is a bunch of stuff from various sources about 10-bit output. Andrew Rodney (digitaldog.com) says he doesn't see much value in it:


    http://www.ronmartblog.com/2011/07/g...bit-color.html




    D Fosse - Re: Link or advice to setup new Eizo monitor, Reply #6, September 07, 2014:
    AFAIK the CS230 has an 8-bit panel. That's still a whole lot better than the 6-bit + frame rate control of most consumer grade displays.

    The advantage of a 10-bit pipeline is to eliminate all traces of banding, with 1024 discrete levels instead of 256. Whether you will see a real difference is questionable as long as the unit is hardware calibrated to an internal 16-bit LUT. This is a more urgent issue with software calibration to the video card, where banding rapidly becomes a real problem.

    My CG246 does have a 10 bit panel, but I still haven't upgraded to a 10-bit capable video card. It's just not a high priority.

    The thing is to distinguish between "impressive" display and real usefulness as a proofing device. I go for the latter.
    Czornyj - Re: Link or advice to setup new Eizo monitor, Reply #7, September 07, 2014:
    Actually CS230 has 6+2HiFRC bit panel. There's virtually no visible difference between 10bit vs 8bit pipeline in real world images, and it only works in PS + Fire Pro/Quadro + Windows configurations.
    D Fosse - Re: Link or advice to setup new Eizo monitor, Reply #8, September 07, 2014:
    Well, according to Eizo we're both wrong. They claim 10-bit capability in the CS230:

    http://www.eizo.com/global/products/...230/index.html
    Czornyj - Re: Link or advice to setup new Eizo monitor, Reply #9, September 07, 2014:
    It's 6bit + HiFRC panel, 100%.
    D Fosse - Re: Link or advice to setup new Eizo monitor, Reply #10, September 07, 2014:
    Not to argue (you usually know what you're talking about) - but how do you know? What specific panel are they using? Is it different from the NEC P232?
    brntoki - Re: Link or advice to setup new Eizo monitor, Reply #11, September 08, 2014:
    Quote from: Czornyj on September 07, 2014, 02:08:00 PM
    Actually CS230 has 6+2HiFRC bit panel. There's virtually no visible difference between 10bit vs 8bit pipeline in real world images, and it only works in PS + Fire Pro/Quadro + Windows configurations.

    Thanks to you both. I didn't know for sure if the advantage was only with the FirePro/Quadro cards, or if any recent card with a DisplayPort output would offer 10-bit throughput. Eizo wasn't clear about that . . . Oops! They were:

    "Using the DisplayPort input, the monitor offers 10-bit simultaneous color display* from a 16-bit look-up table which means it can show more than one billion colors simultaneously. This is 64 times as many colors as you get with 8-bit display which results in even smoother color gradations and reduced Delta-E between two adjacent colors.
    *A graphics board and software which support 10-bit output are also necessary for 10-bit display."

    Well, looking forward to getting started with this monitor in the next day or two.

    Thanks again to you both for your guidance.
    Chris Mak - Re: The mystery of Apple against 10-bit graphics, in reply to theswede, Aug 27, 2013:

    theswede wrote:
    Chris Mak wrote:
    Well, Mavericks is about to be released, the new Mac Pro is here, and still not the remotest sign of 10-bit color support on the Mac. The latest mid-class monitors, like the Eizo GC223W that I use myself (already a few years old), all support 10-bit color, and it is slowly advancing in the professional graphics world as the new standard for serious image, 3D and video editing.
    Slowly being the keyword here. I have not heard of a single professional studio switching to 10 bit display chains across the board, and those few people who run 10 bit systems have a lot of issues with them. For one, basic Windows functionality gets compromised.

    I use a mid 2011 Macbook pro with the Eizo on the thunderbolt port, and I know I will need a new computer in time, should I want to use the 10-bit color option, which should have noticeable benefits especially in calibrating with e.g. the latest i1 display pro.
    You will not find that 10 bit display chain provides much, if any, benefit in calibrating over a monitor with a 14 bit LUT which can be calibrated. In fact, the numbers speak for the 8 bit chain with a 14 bit LUT beating out the 10 bit chain in color accuracy by a fairly wide margin. And if you get a monitor with a 16 bit 3D LUT which can be calibrated a 10 bit system has nowhere near the ability to adjust correctness. Unless it too uses the same LUT, in which case the displays will be identical for every image out there except those artificially generated to show banding in 8 bit.

    Naturally this is moot. Even the most well trained human eyes will be hard pressed to tell the difference between an 8 bit image chain with a 16 bit 3D LUT monitor and a 10 bit chain. The difference is the 8 bit chain is well tested, well supported, and good enough for everything we use computers for in photography and video.

    Apart from whether you fancy 10-bit color or not, the hardware is there, and on the Windows side, so is the software, windows 7/8 both support 10-bit color.
    For rather limited values of "support". The operating system will start with a 10 bit GPU with the correct driver and a 10 bit monitor attached. Most display systems in the OS will be disabled. Aero will not work. Font smoothing is weird. And the image viewing and manipulation tools delivered with Windows know nothing about 10 bit display and will not make use of it.

    A few applications support 10 bit editing, such as Photoshop, but there are a lot of glitches. Photoshop will display some parts of an image in 8 bit, some in 10 bit, and there are a lot of glitches with the LUT when this happens. It's more of a proof of concept or special purpose application at this stage.

    Beyond Photoshop support is EXTREMELY limited. Browsers will start showing images in strange ways. Flash will look weird. Forget about games running in 10 bit mode.

    The nVidia quadro k600 is very affordable and has 10-bit color output.
    It's a CAD card. Yes, it will do 10 bit output, but the rest of the software chain is so limited in 10 bit support it's hardly worth it.

    But where is Apple? What about the Mac being the best option for graphic designers/professionals and serious amateurs?
    Graphic designers and professionals work on deadlines. They are less interested in gimmicks than in getting a good enough final image out the door. They have no need for glitchy 10 bit support, and arguably not for any 10 bit support at all. They have 16 bit 3D LUT's in their hardware calibrated monitors and will see no difference at all if they switch to a 10 bit system.

    Serious amateurs have more of a choice, since they do not work under deadline. Some will punish themselves by going 10 bit. Others will get a professional level high bit LUT monitor. Others, like me, do not have the eyesight to make use of anything nearly so fancy but rely on a Dell hardware calibrated 14 bit 3D LUT monitor and are happy with that.

    Plus, staying at 8 bit allows for using all available software and not just a subset of Photoshop.

    Is Apple anno 2013 so much concerned with selling as many Iphones or Ipads as possible, that they don't have time to comply to the latest professional standards in the graphics world anymore?
    The latest professional standards in the graphics world are easy to manage 3D LUT's. Apple supports those very well.

    http://documentation.apple.com/en/co...3%26tasks=true

    Or have they decided to just leave that to the windows powered workstation? You can build a véry decent workstation or buy a HP 220 for around 1300,- euros. It will have top-notch graphics
    It will have as good graphics as your skill in managing it provides. If you want accurate colors you need calibration, LUT management and a good workflow, just like you do with your Apple.

    whereas my 2500,- euro Macbook pro with mountain lion is a color management mess with a wide gamut high end screen
    It's a professional tool. That means you have to know how to use it to get good results from it.
    The issues with Mountain Lion ColorSync on wide gamut screens are not down to the user. The "strange look" of e.g. the Mountain Lion Dock on wide-gamut displays (wildly saturated icons) has been widely debated. Also, issues like making proper printer calibration near impossible by stripping out the ability to print (a target) without color management, as well as constant changes in the handling of untagged images leading to soft-proof problems in e.g. Adobe Acrobat, are well known and widely discussed. That simply undercuts any professional reliability of Macs regarding color management.

    and no prospect on 10-bit support for upcoming replacement Mac's.
    And what's the point of 10 bit support? You're just stating that it's required as if that's a given.

    So much for Apple's "pro" image.
    If the result of your work is bad, blame the tools. It's what all the "pro's" do.

    Any thoughts on this mystery?
    The mystery is who told you that 10 bit is such a panacea?

    Jesper
    Thanks for your detailed thoughts on this. I have not yet used 10-bit myself, but I am well familiar with a similar discussion about 12-bit versus 14-bit raw camera files, with many arguing there is no visible difference - but there is when you do a lot of image editing. 10-bit color on monitors won't influence the quality of your images itself, like with 14-bit raw; it's purely meant to let you see more accurately what the result of the editing is. I can follow your line of thought and you may be right on a number of things, but by the same line of argument there are still people who feel that sRGB should be the only color space used, when the Eizo displays clearly show that it gives tremendous problems reproducing vivid colors in e.g. the yellow spectrum, if you work from raw files from a high-quality 14-bit camera.

    Changing the infrastructure to accommodate higher-quality hardware and software always seems premature, I guess. I don't think Apple should lag here; they should simply offer support for 10-bit color and leave it to the customer to decide whether to make use of it or not. I must admit that I started distrusting Apple when they cut out the "print with no color management" option, forcing me to jump through hoops to properly calibrate my printer. I feel that a professional attitude is not to cut out, leave out, and decide for the customer like Apple seems to be doing now, but to simply comply with modern professional standards and offer support for modern hardware.

    Chris


    Re: Link or advice to setup new Eizo monitor, Reply #12, September 08, 2014:

    Quote from: brntoki on September 08, 2014, 09:23:32 AM
    *A graphics board and software which support 10-bit output are also necessary for 10-bit display."
    And an OS and application that support it. If you're on a Mac, you're out of luck.


    Re: NEC PA322UHD and good news for european NEC users, Reply #7:

    Quote from: narikin
    Well, if it is the same Sharp-developed panel, which it seems to be, then I don't see how the gamut can be any different, though of course NEC's control of the display may very well be better. I can believe that. Asus's version of this same panel is now under $1500 in the US, so it had better be a lot better to be worth double the money!

    I would respectfully disagree with you about 10-bit pipelines; it makes a quite surprising difference onscreen when you are doing critical color balancing, gradients, shadow tones, etc. A 10/30-bit display chain from graphics card to OS to monitor works extremely well, and I doubt I could ever go back to just 8-bit now. Quite shocking that Apple does not implement it.

    It's not really a problem to tell the difference between a normal and a wide gamut display, and I have an i1Pro 2 + i1Display Pro sensor to support my observations. Trust me - although I didn't see the Asus, I saw the Sharp panel, and it's not in the same league as the NEC PA322UHD.

    I've only made tests with colour-critical NEC displays on a 10-bit pipeline, and frankly didn't see an obvious difference - even on the only real 10-bit panel (the NEC PA302W) - and I'm an experienced graphic designer with perfect color vision on the FM 100 hue test.

  12. #12
    ajohnw's Avatar
    Join Date
    Aug 2012
    Location
    S, B'ham UK
    Posts
    3,337
    Real Name
    John

    Re: Hardware Calibration of Dell Ultrasharp

    Maybe you read and worry too much, Richard. Personally I would just get on with what I'd decided to do. The web is a better place for looking for solutions WHEN YOU HAVE A PROBLEM, not for looking for problems before you have even had them. Lots of what is out there is way, way out of date. Misguided at times also springs to mind. There are also a fair few fabrications.

    John
    -
