
Thread: Will 8-bit Images Become Obsolete?

  1. #1

    Will 8-bit Images Become Obsolete?

    I have a question that has been bothering me for a while now, and to which nobody on the internet seems to have a ready answer, even though it might greatly affect each and every photography enthusiast.

    If I understand correctly, the not-too-distant future will bring us HDR monitors that offer contrast ratios in excess of 100,000:1, roughly 7 stops more than we enjoy today.

    Now, I am worried that 8-bit/channel images, such as JPGs, will look quite terrible on such displays. If their meagre 256 steps of luminance are stretched out over so wide a dynamic range, it seems unavoidable (to me) that their tonality will suffer greatly, leading to 'blocky' images with ugly patches of monotone colour, and that compression artefacts, noise and banding will become much more visible (the sketch at the end of this post puts rough numbers on this).

    For that reason, I'm already recommending to friends and fellow forumers that they at least keep RAW or 16-bit TIFF copies of their most precious images, even if they only use JPG versions today, so that those images will better stand the test of time.

    Some people are much less concerned, though, and foresee no problems; they say that JPG files have plenty of gamut to look good on HDR displays and that it isn't worth all the extra bother of shooting raw+jpg.
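
    To put rough numbers on the worry above, here is a small back-of-the-envelope sketch in Python. The 1,000:1 figure for today's displays and the assumption that the 256 codes are spread roughly evenly per stop are illustrative guesses on my part, not measurements of any particular screen.

    Code:
      import numpy as np

      # Assumed contrast ratios: ~1,000:1 for a current LDR display (a guess for
      # illustration) and 100,000:1 for the HDR displays mentioned above.
      ldr_stops = np.log2(1_000)      # ~10.0 stops
      hdr_stops = np.log2(100_000)    # ~16.6 stops
      print(f"extra range: {hdr_stops - ldr_stops:.1f} stops")   # ~6.6, i.e. roughly 7

      # If the same 256 levels have to cover the wider range (spread roughly
      # evenly per stop), each stop gets noticeably fewer codes:
      print(f"codes per stop, LDR: {256 / ldr_stops:.0f}")   # ~26
      print(f"codes per stop, HDR: {256 / hdr_stops:.0f}")   # ~15

      # Banding demo: quantise a smooth gradient to 8 and 16 bits and count the
      # distinct steps that survive; the coarse 8-bit staircase is what becomes
      # visible banding once each step is stretched further apart on screen.
      gradient = np.linspace(0.0, 1.0, 100_000)
      print(len(np.unique(np.round(gradient * 255))))     # 256 steps
      print(len(np.unique(np.round(gradient * 65535))))   # 65536 steps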

  2. #2
    Administrator | Sean | California, USA | Joined Apr 2008 | 1,473 posts
    I am going to start by partially sidestepping the question to suggest that people should almost always have a RAW version of their photos, regardless of whether they are doing this in light of recent HDR advancements. Flash memory and hard disk space have become so inexpensive that the trade-off of having larger files is not as big a deal. There are of course exceptions, such as achieving higher frame rates for rapid-fire action shots, but overall this is the best way to protect and "future proof" the images.

    As far as HDR goes, you are right in that 8-bit gradations could potentially become much more apparent on an HDR display, just as with larger vs smaller color gamuts. On the other hand, these could always be displayed on the HDR display using a tonal mapping that produces the same look as the LDR display, so in this sense these photos are not necessarily going to become worse all of a sudden; they are just not necessarily as good as 16-bit or HDR.

    Another important distinction is the bit depth of the image versus the output bit depth of the monitor, which is what would need to increase dramatically in the case of an HDR display. Perhaps the distinction between these two bit depths is getting blurred. Showing an 8-bit image would not be much of a problem with an HDR display that can select pixel values from a 16-bit palette of colors. The problem arises when the HDR monitor is still 8-bit, since many of those bits have to be distributed across colors/intensities outside a normal LDR display's range, leaving fewer bits than would otherwise be available within an LDR monitor's gamut (the sketch at the end of this post makes this concrete).

    I know this is not really a definitive yes or no, but hopefully it sheds some more light on the issue.
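
    To make the two cases above concrete, here is a small numerical sketch. The 10-stop and 17-stop figures and the roughly even allocation of output codes per stop are illustrative assumptions, not properties of any particular display.

    Code:
      import numpy as np

      # An 8-bit source image: every possible code value 0..255.
      src_codes = np.arange(256)

      # Case 1: HDR display with a 16-bit output palette. The 8-bit image can be
      # tone-mapped into the wider range so it reproduces the same tones it had on
      # an LDR display, and each of its 256 levels keeps a distinct output code.
      out_16bit = np.round(src_codes / 255.0 * 65535).astype(np.uint16)
      print("16-bit output, distinct levels:", len(np.unique(out_16bit)))   # 256

      # Case 2: a hypothetical HDR display that is still limited to 8-bit output.
      # Its 256 codes must now span ~17 stops instead of ~10, so (assuming codes
      # are spread roughly evenly per stop) only a fraction of them are left for
      # the old LDR portion of the range.
      ldr_stops, hdr_stops = 10, 17
      codes_left_for_ldr = int(256 * ldr_stops / hdr_stops)
      print("8-bit HDR output, codes left for the LDR range:", codes_left_for_ldr)   # ~150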
