Welcome to the MacNN Forums.



Still confused about HDR screen
Ham Sandwich
Guest
Status:
Nov 13, 2018, 01:48 PM
 
[...deleted...]
( Last edited by Ham Sandwich; Apr 23, 2020 at 10:07 AM. )
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 13, 2018, 03:28 PM
 
None of Apple’s displays are HDR.

All current Apple displays have 256 steps between light and dark. That's 8-bit color, which is standard dynamic range, and has been the standard for decades.

The gamut is how wide a range of colors a display can show. More colors isn’t more dynamic range. Dynamic range is about light and dark.
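The 256-step figure is just 2^8 levels per channel; a quick illustrative sketch (my own, not from the thread):

```python
# Levels per channel at common panel bit depths (6-, 8- and 10-bit).
levels = {bits: 2 ** bits for bits in (6, 8, 10)}
print(levels)  # {6: 64, 8: 256, 10: 1024}
```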


Edit: also, 95% of the time, when someone talks about an HDR image, it isn’t HDR, it’s something called tone mapping.

Edit2: what is an HDR image, though, is most camera RAW files... which have to be converted before display on Apple products, because all Apple displays only have standard dynamic range.
( Last edited by subego; Nov 13, 2018 at 03:41 PM. )
     
Spheric Harlot
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Nov 13, 2018, 03:50 PM
 
So why does the TV app offer to have me watch a few of the iTunes-purchased movies "in HDR (High Dynamic Range) on this iPhone" (Xs), as opposed to on the 2017 iPad Pro?

Is the image processor just that much faster and can do real-time tone mapping on a movie to gain a higher effective dynamic range, while the iPad Pro can't and is stuck in a single colour space?
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 13, 2018, 04:27 PM
 
Investigating.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 13, 2018, 08:38 PM
 
Originally Posted by Spheric Harlot View Post
Is the image processor just that much faster and can do real-time tone mapping on a movie to gain a higher effective dynamic range, while the iPad Pro can't and is stuck in a single colour space?
Essentially, yes.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 13, 2018, 10:37 PM
 
Originally Posted by And.reg View Post
So, Apple bragged about the Xs having an HDR screen. But it also has a P3 color gamut. But so have the MacBook Pros since late 2016, and the iPhones since the 7. And the iPad Pros since always. And......

So, does P3 make it HDR as well?
Which of Apple’s P3 screens are HDR screens vs. are not HDR screens?
I didn’t read this post as carefully as I should have the first time, and because of it, I was kind of a dick. I apologize for that. Let me give this another swing.

The only screens Apple calls HDR screens are the X, XS, and XS Max.

Apple hasn’t said what HDR screen means.

Whatever it means, the display architecture of those phones is SDR. They can’t do anything with genuine HDR content unless it gets squeezed into SDR first.
     
Spheric Harlot
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Nov 14, 2018, 12:32 AM
 
Originally Posted by subego View Post
Essentially, yes.
Source?
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 14, 2018, 04:58 AM
 
https://arstechnica.com/gadgets/2017...e-future/6/#h1

Originally Posted by Ars
Apple claims that some kind of software wizardry is in play to use the 10-bit information in a file and render it on an 8-bit display in a way that is superior to just working with 8-bit content. I’m not sure what to make of that; there’s only so much you can do when the hardware is limited to 8-bit color. But Apple hasn’t gone public with details here.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Nov 14, 2018, 08:41 AM
 
HDR means different things to different people. One thing it can mean is the total gamut - i.e., how red a 100% red pixel is. This is related to the P3 designation. Another thing it can mean is 10-bit color, meaning that there are 2^10 possible brightness levels for each color, as opposed to 2^8 or even 2^6. It is sometimes stated that if you have an extended gamut, you should also have 10-bit color to avoid the steps between two colors becoming too large. I find this reasoning flawed because the difference in gamut is not that large, and most of the market was fine with 6-bit precision on laptops for sRGB until quite recently, but I suppose I can see the logic.

Apple has shipped displays with DCI-P3 and 10-bit color in the 15" MBP at least, but never guaranteed that, and not all of them have it. There is a panel lottery.

Having P3 gamut is only good - you can show more colors than you could otherwise. Having a 10-bit panel has a massive downside in that it kills graphics performance in some circumstances. My external display supports 10-bit color (although it is only sRGB, and I suspect that it's an 8-bit panel that uses temporal dithering for the other bits) and I always turn it off when I can.
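A minimal sketch of the temporal-dithering trick mentioned above (my own illustrative Python; the 4-frame cycle is an assumption, real panels vary):

```python
def dither_frames(v10):
    """Approximate a 10-bit level on an 8-bit panel by alternating
    between two adjacent 8-bit levels over a 4-frame cycle."""
    base, extra = divmod(v10, 4)  # split off the two extra bits
    return [base + (1 if i < extra else 0) for i in range(4)]

frames = dither_frames(517)          # [130, 129, 129, 129]
average = sum(frames) / len(frames)  # 129.25, i.e. 517 / 4
```

Averaged over the cycle, the eye sees the in-between level the panel can't show in a single frame.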
( Last edited by P; Nov 14, 2018 at 08:51 AM. )
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Nov 14, 2018, 08:43 AM
 
Sounds like they're doing temporal dithering.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
Ham Sandwich
Guest
Status:
Nov 14, 2018, 09:51 AM
 
[...deleted...]
( Last edited by Ham Sandwich; Apr 23, 2020 at 10:07 AM. )
     
Spheric Harlot
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Nov 14, 2018, 11:02 AM
 
Read more carefully:

Apple has shipped displays with DCI-P3 and 10-bit color in the 15" MBP at least, but never guaranteed that, and not all of them have it. There is a panel lottery.
Nothing contradictory.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 14, 2018, 11:02 AM
 
Originally Posted by P View Post
HDR means different things to different people.
Dynamic range, in regards to visual media, originated from black and white photography.

Obviously, color’s not involved there, so if we translate the concept to display architecture, color won’t be involved in the translation either.

With photography, dynamic range is the range between lightest and darkest. In a digital display architecture, this range is fundamentally determined by bit-depth.

So, the straightforward, accurate translation is dynamic range means bit-depth. If 8-bit is the standard, then “high” is, well... higher.

This isn’t necessarily the “right” definition, but I submit any other definition is an expansion of this one. Maybe there’s a reason to make it mean more than this, but I can’t see the result being anything other than making it all more complicated and confusing.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 14, 2018, 11:04 AM
 
Originally Posted by P View Post
Sounds like they're doing temporal dithering.
I was going to mention this until I realized that’s just a form of tone mapping, and I didn’t want to look nitpicky.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Nov 14, 2018, 12:27 PM
 
Originally Posted by subego View Post
Dynamic range, in regards to visual media, originated from black and white photography.

Obviously, color’s not involved there, so if we translate the concept to display architecture, color won’t be involved in the translation either.

With photography, dynamic range is the range between lightest and darkest. In a digital display architecture, this range is fundamentally determined by bit-depth.

So, the straightforward, accurate translation is dynamic range means bit-depth. If 8-bit is the standard, then “high” is, well... higher.

This isn’t necessarily the “right” definition, but I submit any other definition is an expansion of this one. Maybe there’s a reason to make it mean more than this, but I can’t see the result being anything other than making it all more complicated and confusing.
I don't know what's right, I'm just saying that the term is used to mean different things. For displays it is either bit depth or gamut. For photography, it is taking multiple photos with different exposure settings and combining them. For gaming some ten years ago, it was about using light sources with a brightness over 100% so that their reflections would look natural (works great for outdoor settings). For gaming now, it goes back to the display thing. It is a term that is so overused as to become almost useless, so I'm taking a very careful approach whenever I see it.

Right now, my MBP has a P3 display with 8-bit color, and the external display it is connected to is sRGB with 10-bit color. By your definition, the external display is the HDR one. That seems wrong to me.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 15, 2018, 12:27 AM
 
Originally Posted by P View Post
That seems wrong to me.
In all contexts, dynamic range is defined as a ratio.

A gamut is a ratio of what to what?
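For reference, the ratio in question is usually quoted either directly (e.g. 1000:1) or in photographic stops; a tiny illustrative sketch with made-up luminances:

```python
import math

white_nits, black_nits = 500.0, 0.5  # hypothetical panel luminances
contrast = white_nits / black_nits   # 1000.0, i.e. a 1000:1 contrast ratio
stops = math.log2(contrast)          # ~9.97 stops of dynamic range
```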
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Nov 15, 2018, 04:43 AM
 
Originally Posted by subego View Post
In all contexts, dynamic range is defined as a ratio.

A gamut is a ratio of what to what?
First of all, gamut is not a ratio of two numbers, because we are talking about colors. Instead, you should think in terms of coverage in color space. There are different coordinate systems in color space, so quantifying differences in gamut only makes sense after you have chosen a coordinate system. To understand that, have a look at this picture:



[image: CIE chromaticity diagram with gamut triangles; original image and source link not preserved]

In a given color space, you can now picture e. g. a device's color gamut. For sRGB this is the area inside the triangle. If you compare this to Adobe RGB, then sRGB is a proper subset of Adobe RGB and you can compare the areas of the two triangles (sRGB covers about 86 % of Adobe RGB). However, not all color gamuts need to be stacked inside one another. The other big standard that comes from the world of video, DCI P3, also has a larger gamut than sRGB, but compared to Adobe RGB it extends in different directions. That means Adobe RGB covers only a fraction of DCI P3 and vice versa, even though in both cases they share the common subset sRGB and are larger than that. (sRGB covers about 80 % of DCI P3.)

So now, does DCI P3 have a larger or a smaller gamut than Adobe RGB? I'm afraid there is no easy answer. You can measure the area of the corresponding triangles in color space, but you should keep the meaning of it all in mind: human perception matters, and the human eye is best at distinguishing shades of green, for example. So different colors matter in different ways to the human eye.

Note that this is quite different from color depth: here you discretize your triangles into a collection of tiles. The number of tiles is decided by the color depth.
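The coordinate-system caveat can be made concrete. Using the published CIE 1931 xy chromaticities of the primaries, a simple triangle-area comparison gives a coverage number that differs from the commonly quoted one, precisely because that one is computed in a different coordinate system (illustrative Python sketch):

```python
def tri_area(p1, p2, p3):
    """Shoelace area of a gamut triangle in CIE 1931 xy coordinates."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Published xy chromaticities of the R, G, B primaries.
srgb  = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
adobe = [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)]

ratio = tri_area(*srgb) / tri_area(*adobe)  # ~0.74 in xy coordinates
```

In xy the ratio comes out near 74 %, not 86 %; neither number is wrong, they are just measured in different coordinate systems, which is the point.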
I don't suffer from insanity, I enjoy every minute of it.
     
Ham Sandwich
Guest
Status:
Nov 15, 2018, 10:11 AM
 
[...deleted...]
( Last edited by Ham Sandwich; Apr 23, 2020 at 10:07 AM. )
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 15, 2018, 12:29 PM
 
Originally Posted by OreoCookie View Post
First of all, gamut is not a ratio of two numbers, because we are talking about colors.
That’s my point.

Since dynamic range is a ratio of two numbers, it cannot apply to a gamut.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Nov 15, 2018, 05:59 PM
 
The ratio of a gamut to the total color space humans can perceive?

I don’t really care what is the correct answer, because people use the term to mean all sorts of things. It’s like complaining that people say Frankenstein when they mean the monster.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Nov 15, 2018, 08:11 PM
 
@P
That would be one reasonable way to do this. However, human color perception is logarithmic whereas sensors and film are linear, so that would add further to the complexity here.

@subego
Well, yes, in that you are right. But you also claimed earlier that “dynamic range is bit depth”, which is not correct. Bit depth tells you the number of gradations (e. g. shades of gray), but that could be across a very narrow dynamic range or a very wide dynamic range.
I don't suffer from insanity, I enjoy every minute of it.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 16, 2018, 02:37 AM
 
Originally Posted by OreoCookie View Post
Well, yes, in that you are right. But you also claimed earlier that “dynamic range is bit depth”, which is not correct. Bit depth tells you the number of gradations (e. g. shades of gray), but that could be across a very narrow dynamic range or a very wide dynamic range.
Telling me I’m wrong without touching on an alternative definition is extremely frustrating and unhelpful.
( Last edited by subego; Nov 16, 2018 at 04:27 AM. )
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Nov 16, 2018, 07:14 AM
 
It can be the dynamic range between 100% red and completely black, plus the same for blue and green? Which makes it a vector I suppose, but still, it is a possible definition.

(I just think that 10-bit color is an utterly useless thing to care about for the vast majority of users who were happy with 6-bit for a very long time. I can see a wider gamut no problem, and it really pops, but greater granularity in the colors? 1 billion colors instead of 17 million? No.)
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 16, 2018, 10:11 AM
 
Originally Posted by P View Post
It can be the dynamic range between 100% red and completely black, plus the same for blue and green? Which makes it a vector I suppose, but still, it is a possible definition.

(I just think that 10-bit color is an utterly useless thing to care about for the vast majority of users who were happy with 6-bit for a very long time. I can see a wider gamut no problem, and it really pops, but greater granularity in the colors? 1 billion colors instead of 17 million? No.)
If what we define as dynamic range sits at the end of the display architecture pipeline, it makes more sense to define it the way Apple appears to have here. An OLED screen is what makes it HDR because it has a significantly higher contrast ratio than the standard.

That’s not to knock a wider gamut, in fact, it’s the opposite. Why cram a two-dimensional foot, with such a colorfully qualitative long axis, into a shabby, one-dimensional, black and white shoe?

Whichever one, the “problem” is if HDR refers to something like a wide gamut or a high contrast ratio, then there’s no such thing as HDR content. Both gamut and contrast ratio work their magic regardless of what content gets sent to it.

This is why my definition exists. Without a redesign of digital imaging from the ground up, the only way to demonstrably improve the content is to increase the resolution or the bit depth.

HDR “means” higher bit-depth because an option to call HDR content something other than higher bit-depth literally doesn’t exist.

To be clear, I’m not questioning whether this is a shitty definition. It quite clearly is, and you’re now dumber for knowing it.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Nov 17, 2018, 12:52 AM
 
Originally Posted by subego View Post
Telling me I’m wrong without touching on an alternative definition is extremely frustrating and unhelpful.
I'm not quite sure why you respond so aggressively. I thought I had answered that question in the preceding post:
Originally Posted by OreoCookie
Note that this is quite different from color depth: here you discretize your triangles into a collection of tiles. The number of tiles is decided by the color depth.
That means dynamic range (defined as the ratio between the brightest white and the “darkest black”, for example) has nothing to do with color depth; color depth dictates how many shades of gray you have interpolating between white (= brightest white) and black (= least light).

HDR does the following: it compresses a larger dynamic range (where the information is taken from multiple exposures) into a smaller one. That's why many of the early, less tastefully done HDRs have an unnatural look to them with lots of halos and all.
Originally Posted by subego View Post
If what we define as dynamic range sits at the end of the display architecture pipeline, it makes more sense to define it the way Apple appears to have here. An OLED screen is what makes it HDR because it has a significantly higher contrast ratio than the standard.
No, you have to really think about it as a problem in color management. Have a look at my post in this thread, and you will see why.
Originally Posted by subego View Post
Whichever one, the “problem” is if HDR refers to something like a wide gamut or a high contrast ratio, then there’s no such thing as HDR content. Both gamut and contrast ratio work their magic regardless of what content gets sent to it.
It isn't magic, HDR is the compression of a larger dynamic range from one or several source files into a single file. You can use a combination of higher bit depth source files and several different exposures. The former allows you to find different non-linear interpolations so that you can recover detail in the shadows and highlights. The latter gives you a larger dynamic range.

Indeed, modern DSLRs and large-sensor cameras produce 14-bit RAW files, which is a much larger bit depth than the output devices (8 or 10 bit). That is why you can recover a lot of detail when you shoot RAW.
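One common way to do that compression is a global tone-mapping curve such as the Reinhard operator L/(1+L); a sketch of the idea (my own illustration, not necessarily what any particular camera or app does):

```python
def tone_map(lum):
    """Reinhard operator: compress unbounded linear luminance into [0, 1)."""
    return lum / (1.0 + lum)

def to_8bit(lum):
    return round(tone_map(lum) * 255)

# Two highlights four stops apart would both clip in a linear 8-bit
# encoding; after tone mapping they keep distinct values.
print(to_8bit(4.0), to_8bit(64.0))  # 204 251
```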
( Last edited by OreoCookie; Nov 17, 2018 at 08:51 AM. )
I don't suffer from insanity, I enjoy every minute of it.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 17, 2018, 03:59 AM
 
Originally Posted by OreoCookie View Post
I'm not quite sure why you respond so aggressively.
While I’m responding to the rest...

If “this post is frustrating and unhelpful” counts as aggressive, I want to know what internet you’ve been spending time on. It sounds like a much nicer place than the one I got in the mail.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Nov 17, 2018, 05:40 AM
 
Originally Posted by subego View Post
If “this post is frustrating and unhelpful” counts as aggressive, I want to know what internet you’ve been spending time on. It sounds like a much nicer place than the one I got in the mail.
I have thicker skin than that, but I was just a little taken aback. In any case, don’t worry about it.

Out of curiosity: was my second reply and the link more helpful?
I don't suffer from insanity, I enjoy every minute of it.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 17, 2018, 05:57 AM
 
Absolutely!

I’m working up a reply.
     
Ham Sandwich
Guest
Status:
Nov 17, 2018, 11:05 AM
 
[...deleted...]
( Last edited by Ham Sandwich; Apr 23, 2020 at 10:07 AM. )
     
Spheric Harlot
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Nov 17, 2018, 12:58 PM
 
Originally Posted by And.reg View Post
Is this the case for both photos and HDR movies, or just photos?
Also, does this definition apply to media and displays, or just the media?
     
Brien
Professional Poster
Join Date: Jun 2002
Location: Southern California
Status: Offline
Nov 18, 2018, 12:50 AM
 
Originally Posted by subego View Post
If what we define as dynamic range sits at the end of the display architecture pipeline, it makes more sense to define it the way Apple appears to have here. An OLED screen is what makes it HDR because it has a significantly higher contrast ratio than the standard.

That’s not to knock a wider gamut, in fact, it’s the opposite. Why cram a two-dimensional foot, with such a colorfully qualitative long axis, into a shabby, one-dimensional, black and white shoe?

Whichever one, the “problem” is if HDR refers to something like a wide gamut or a high contrast ratio, then there’s no such thing as HDR content. Both gamut and contrast ratio work their magic regardless of what content gets sent to it.

This is why my definition exists. Without a redesign of digital imaging from the ground up, the only way to demonstrably improve the content is to increase the resolution or the bit depth.

HDR “means” higher bit-depth because an option to call HDR content something other than higher bit-depth literally doesn’t exist.

To be clear, I’m not questioning whether this is a shitty definition. It quite clearly is, and you’re now dumber for knowing it.
It doesn’t help that HDR photos and HDR video mean drastically different things.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Nov 18, 2018, 04:08 AM
 
Originally Posted by And.reg View Post
Is this the case for both photos and HDR movies, or just photos?
As far as I understand the idea of HDR, it applies to just images; whether it is a single image or a series of images shouldn't matter: you take image data with a larger dynamic range and combine it in a way that compresses the dynamic range. Put simply, you take the shadow detail from an overexposed photo and the details in the highlights from an underexposed photo. The mid-range is obtained from a properly exposed photo. A classic example would be a sunset or a sunrise, where you would like to capture the detail of what is beneath the sky (say, a skyline or a forest). Of course, with video, you are more limited in how much data you have.
I don't suffer from insanity, I enjoy every minute of it.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 18, 2018, 12:47 PM
 
Originally Posted by OreoCookie View Post
That means dynamic range (defined as the ratio between the brightest white and the “darkest black”, for example) has nothing to do with color depth; color depth dictates how many shades of gray you have interpolating between white (= brightest white) and black (= least light).
Here’s the best way I can put this.

Digital signal processing fundamentally involves the rendering of a continuous set into a countable set.

If my signal covers range X, it needs to be divided into Y samples for it to retain the appearance of continuity.

If my signal covers range 2X, but I only use Y samples, I haven’t rendered a 2X signal. I’ve rendered an X signal and stretched it. Half of the rendering is quantization error.

The number of available samples (shades of grey in this case) is what ultimately determines the range of the signal.
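In other words, with a fixed number of samples the quantization step grows with the range; a minimal illustrative sketch:

```python
def step_size(lo, hi, bits=8):
    """Spacing between adjacent quantization levels over [lo, hi]."""
    return (hi - lo) / (2 ** bits - 1)

# Same 8 bits stretched over twice the range: each step (and the
# worst-case rounding error, half a step) doubles.
assert step_size(0.0, 2.0) == 2 * step_size(0.0, 1.0)
```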
( Last edited by subego; Nov 18, 2018 at 01:22 PM. )
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 18, 2018, 01:37 PM
 
It’s the same thing with gamuts.

sRGB has 256 steps between white and red.

Let’s say we have a much wider gamut. We’ll call it “supergamut”.

If that has only 256 steps between white and red, there will be colors it can’t display that sRGB can, because sRGB is more granular over its narrower range.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 18, 2018, 01:55 PM
 
Originally Posted by OreoCookie View Post
HDR does the following: it compresses a larger dynamic range...
I understand where this definition comes from, but it’s nonsensical.

If HDR is compressing a large dynamic range, what’s the term for an image which actually has a large dynamic range?

As I mentioned earlier, compressing dynamic range is tone mapping.
( Last edited by subego; Nov 18, 2018 at 02:11 PM. )
     
Brien
Professional Poster
Join Date: Jun 2002
Location: Southern California
Status: Offline
Nov 18, 2018, 02:08 PM
 
Originally Posted by OreoCookie View Post
As far as I understand the idea of HDR, it applies to just images; whether it is a single image or a series of images shouldn't matter: you take image data with a larger dynamic range and combine it in a way that compresses the dynamic range. Put simply, you take the shadow detail from an overexposed photo and the details in the highlights from an underexposed photo. The mid-range is obtained from a properly exposed photo. A classic example would be a sunset or a sunrise, where you would like to capture the detail of what is beneath the sky (say, a skyline or a forest). Of course, with video, you are more limited in how much data you have.
That is still tone mapping / HDR photography. HDR video does actually have more dynamic range. Waaay brighter highlights and deeper blacks.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 18, 2018, 02:22 PM
 
Originally Posted by OreoCookie View Post
defined as the ratio between the brightest white and the “darkest black”, for example
FWIW, I consider this to be the correct definition of dynamic range.

Which is why I’m resisting the idea of including color, because both terms of this ratio are achromatic.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 18, 2018, 02:42 PM
 
Originally Posted by Spheric Harlot View Post
Also, does this definition apply to media and displays, or just the media?
The definition isn’t quite right, but the answer is both. To use a digital audio analogy, you won’t notice a low sample rate if your speaker is shit.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Nov 18, 2018, 03:05 PM
 
@Oreo

I also want to add I completely get the point you were making that as a data construct, the range between black and white is a...

That’s the thing. There isn’t a word for it. I searched for hours. I asked people with graduate degrees in math. Nothing.

The closest I found were proper fraction, bounded set, and unit interval. None of those are really right. I’d say this falls in the realm of dimensional analysis, but I wasn’t able to dig up anything specific.

Of course, whatever this property is, it’s the basis for the “one louder” joke from Spinal Tap.
     
Spheric Harlot
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Nov 18, 2018, 06:10 PM
 
Originally Posted by subego View Post
The definition isn’t quite right, but the answer is both. To use a digital audio analogy, you won’t notice a low sample rate if your speaker is shit.
What I’m asking about is whether there is a difference between hi-res content and hi-res playback equipment.

Not the speaker. You can have 24-bit content at 96kHz, but if your DAC is capable of reproducing only standard-resolution 16-Bit audio at 44.1 kHz, you’re not going to measure a difference.

My question is whether HDR content has a wider resolution that only an HDR display is capable of displaying?

The answer so far seems to be confusing.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Nov 18, 2018, 08:43 PM
 
Originally Posted by subego View Post
@Oreo

I also want to add I completely get the point you were making that as a data construct, the range between black and white is a...

That’s the thing. There isn’t a word for it. I searched for hours. I asked people with graduate degrees in math. Nothing.
All you had to do was follow the link to my Macrumors forum post where I explain all of this and give references. I’m quite sure the confusion would then subside. It is very simple: even for black and white, you need to go through color, because white is a mixture of colors. You really need to think in terms of color spaces, and mathematically, the name is quite apt.

Take an (input or output) device. For argument’s sake, let’s say it is a monitor. First, you can use a colorimeter to completely map all the colors and intensities the monitor can display. These are “real-world” colors with no reference as to how the colors were mixed together (via red, green and blue on a monitor, or via inks of different colors and shades). Here, you are measuring the gamut of the device.

Note that white is defined as the color of a black-body radiator at a given temperature; this is usually referred to as the color temperature (say 6500 Kelvin). Note that ~6000 Kelvin is the surface temperature of the sun, and this is what we usually perceive as white in everyday life. Once you have picked a color temperature, you can also define a black-and-white scale.

Then you need to pick a coordinate system for your colors such as RGB, although there are others in use. This converts real-life colors into a bunch of numbers, and is a mathematical representation of all the colors. However, the actual color coordinates are independent of the gamut; they just provide a numerical representation. With the above color calibration procedure, you get a table that tells you: if the monitor displays R:128, G:45, B:89, you get this particular, measured color at this particular intensity (yes, you are also measuring intensity!). It is actually a discretization of a continuum of colors and coordinates. But that does not change the gamut (up to some unimportant fuzziness right at the edges). How is black-and-white contrast represented here? You take the ratio of the intensities at (255,255,255) and (0,0,0) (I’m assuming 8 bit per channel).

This map between actual colors and intensities on the one hand and points in a color manifold on the other is a color profile. Input devices also have color profiles, but here the process is in reverse: you take known input colors and measure the signal from your camera, scanner or other device. When you take a camera and shoot pictures at different exposures, it is important to note that you are no longer measuring absolute brightness, because cutting the exposure time in half cuts the number of photons hitting your sensor in half, too.

Color management tools such as ColorSync will now mediate between different devices, because they will have different gamuts and different capabilities, translating between them so as to get consistent colors. You need at least two color profiles, one from the input and one from the output device. Assuming the color is in the gamut of both devices, ColorSync will take an input, translate that to a specific color at a specific intensity, and then use the look-up table for the output device to accurately reproduce the output color.

Now you will complain that I explained color management to you even though you wanted to know about HDR. Well, along the way we have clarified what gamut means (the total set of reproducible colors and intensities) and what dynamic range means (the ratio of the brightest to the least bright version of a specific color; black-to-white is the particular case covered here, because white is defined as a color mixture parametrized by temperature).

Let’s get to HDR: color management has to solve the problem of, e. g., colors that are in the gamut of one device but outside the gamut of another. How should ColorSync handle that? HDR primarily deals with the analogous case where the lack of overlap is in the intensities rather than the colors: the dynamic range (say, black to white) of a particular scene exceeds the gamut of any output device you want to view it on. For simplicity, let’s pretend the input device’s gamut is large enough, so we can avoid talking about taking multiple exposures and combining them. HDR algorithms then compress this large dynamic range into the dynamic range of the output device. (Often you would use color spaces such as sRGB or Adobe RGB as stand-ins for the gamut, because many monitors are designed so as to cover all of sRGB and 90+ % of Adobe RGB.)

That poses the question of how to interpolate colors and intensities in between so that the result looks “natural”, and what does and does not look natural is not easy to say. When HDR was first developed, the algorithms were quite horrible, and I remember lots of candy colors and halos in the processed images. (Some made that into a look, though: an artistic choice rather than an artifact.) Nowadays most HDRs are quite tasteful in that they do what they were developed to do: they exploit the fact that human vision perceives intensities on a logarithmic rather than a linear scale, so adding more detail in the shadows and the highlights can actually look natural even though the translation between color and intensity of in- and output is no longer linear. In exchange, the mid-tones have to be compressed without washing out.
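A minimal global tone-mapping operator illustrating the logarithmic compression described above (real operators, such as Reinhard’s, are considerably more sophisticated; the luminance bounds here are arbitrary toy values): scene luminances spanning 100,000:1 are squeezed into 8-bit output codes.

```python
import math

def tone_map(luminance, l_min=0.01, l_max=1000.0):
    """Map a scene luminance onto 0..255 on a logarithmic scale."""
    t = (math.log(luminance) - math.log(l_min)) / (math.log(l_max) - math.log(l_min))
    return round(255 * min(max(t, 0.0), 1.0))

print(tone_map(0.01))    # 0   (deepest shadow)
print(tone_map(1000.0))  # 255 (brightest highlight)
print(tone_map(3.16))    # lands near mid-gray, despite the huge input range
```

Because the mapping is logarithmic rather than linear, equal *ratios* of luminance get equal numbers of output steps, which matches how we perceive brightness.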

Bit depth helps in two ways: on the input side, it increases the amount of information, because if you can no longer differentiate between two colors or intensities, you won’t be able to interpolate between them. On the output side, it gives you a larger range of shades between which you can interpolate.
I don't suffer from insanity, I enjoy every minute of it.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Reply With Quote
Nov 18, 2018, 09:25 PM
 
Perhaps because it was on mobile, your link sent me to the whole thread.

So I read all of your posts.

Originally Posted by OreoCookie View Post
Let’s get to HDR: with HDR you are solving one problem that is part and parcel when you deal with e. g. colors that are in the gamut of one device but outside of the gamut of another.
This is not HDR. This is tone mapping.

In an HDR system the screen has a high dynamic range, and it displays an image with high dynamic range. No compression of dynamic range is taking place between the two.

I genuinely appreciate the effort to explain color management, but it’s at most tangentially related to the topic.
( Last edited by subego; Nov 18, 2018 at 09:44 PM. )
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Reply With Quote
Nov 18, 2018, 09:57 PM
 
Originally Posted by OreoCookie View Post
On the output side, it gives you a larger range of shades between which you can interpolate.
What is the difference between a screen with a 1:500 contrast ratio, and one with a 1:20,000 contrast ratio?

One has more range between black and white than the other.

What is more range between black and white if not more shades?
     
Ham Sandwich
Guest
Status:
Reply With Quote
Nov 18, 2018, 10:06 PM
 
[...deleted...]
( Last edited by Ham Sandwich; Apr 23, 2020 at 10:07 AM. )
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Nov 18, 2018, 10:08 PM
 
Originally Posted by subego View Post
This is not HDR. This is tone mapping.
Tone mapping is one technique to implement HDR. Here is a blog post on that subject by Skylum who make Aurora HDR (emphasis mine):
Originally Posted by Skylum
Photographers commonly confuse HDR and tone mapping. While these two techniques are definitely related, they aren’t the same thing. HDR stands for High Dynamic Range and is a process through which multiple images are combined to increase the final image’s overall dynamic range using an HDR editor like Aurora HDR.

[...]

Dynamic tone mapping is used to make flat HDR images look punchy and full of detail. Tone mapping deals with reducing the tonal values within an image to make them suitable to be viewed on a digital screen.
Originally Posted by subego View Post
In an HDR system the screen has a high dynamic range, and it displays an image with high dynamic range. No compression of dynamic range is taking place between the two.
No, that's not correct. From the aforementioned Skylum blog post:
For example, an HDR photo that has a 100,000:1 dynamic range needs to undergo tone mapping so that the tonal values fall between 1 and 255. [...] These displays simply cannot reproduce the high dynamic range that your file may end up with after merging multiple photos, so dynamic tone mapping is a vital step to reduce the tonal variation in such photos.
Consistent with what I wrote, the dynamic range of the input device (or, combining several input files to obtain input data with a much larger dynamic range) is much larger than the output device. They are very careful not to claim that the dynamic range of the output is 255:1 or something like that, they just say that they need to map an input with a huge dynamic range using only 256 luminance values.
Originally Posted by subego View Post
I genuinely appreciate the effort to explain color management, but it’s at most tangentially related to the topic.
No, I think it is quite central to your question, since you wanted a mathematical definition of things like dynamic range and gamut. I just didn't know how to do that without explaining the basics of color management alongside to fix the important notions.
I don't suffer from insanity, I enjoy every minute of it.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Nov 18, 2018, 10:15 PM
 
Originally Posted by subego View Post
What is the difference between a screen with a 1:500 contrast ratio, and one with a 1:20,000 contrast ratio?
Contrast is measured between black and white. To do a proper measurement, you additionally need to fix the color temperature of these monitors. Then the definition is the ratio of the intensity at white (implemented as R:255, G:255, B:255 if you use 8 bit per channel) to the intensity at R:0, G:0, B:0. For the first monitor, a pixel set to R:255, G:255, B:255 is 500 times brighter than one set to R:0, G:0, B:0. For the second, the white pixel is 20,000 times brighter than a black pixel.

While I give all these values for 8 bits, the bit depth does not matter: you could have a 1-bit black-and-white display where white is 1 and black is 0, and the contrast ratio would be exactly the same.
Originally Posted by subego View Post
What is more range between black and white if not more shades?
No, that's not correct. Imagine driving the two displays with 1:500 and 1:20,000 contrast ratios in pure black and white. The dynamic range would be exactly the same as if you were driving them at 10 bit per channel. A larger bit depth gives you more freedom for tone mapping, though, and more input data.
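To make the point concrete (toy luminance values, and a linear code-to-luminance mapping chosen purely for simplicity): the contrast ratio depends on the luminances the panel can physically produce, not on how many code values address them.

```python
# Toy panel: black at 0.05 cd/m^2, white at 25 cd/m^2 -> 500:1 contrast.
BLACK, WHITE = 0.05, 25.0

def code_to_luminance(code, bits):
    """Spread the 2**bits code values linearly between BLACK and WHITE."""
    max_code = 2 ** bits - 1
    return BLACK + (WHITE - BLACK) * code / max_code

for bits in (1, 8, 10):
    ratio = code_to_luminance(2 ** bits - 1, bits) / code_to_luminance(0, bits)
    print(bits, round(ratio, 3))  # ~500 regardless of bit depth
```

More bits means finer steps between the same endpoints, not brighter whites or darker blacks.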
I don't suffer from insanity, I enjoy every minute of it.
     
Brien
Professional Poster
Join Date: Jun 2002
Location: Southern California
Status: Offline
Reply With Quote
Nov 18, 2018, 10:42 PM
 
Does anyone here have an HDR television, or seen a film in Dolby Vision/HDR10? Because I am getting the impression that people are conflating HDR photography with actual HDR.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Reply With Quote
Nov 19, 2018, 12:07 AM
 
Originally Posted by Brien View Post
Does anyone here have an HDR television, or seen a film in Dolby Vision/HDR10? Because I am getting the impression that people are conflating HDR photography with actual HDR.
My TV has HDR10, and I’ve watched HDR content on it, but I haven’t really done an active comparison, and the contrast ratio of my TV is definitely standard.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Nov 19, 2018, 02:04 AM
 
Originally Posted by Brien View Post
Does anyone here have an HDR television, or seen a film in Dolby Vision/HDR10? Because I am getting the impression that people are conflating HDR photography with actual HDR.
Confusingly, photo HDR ≠ video HDR. Blame the marketing people.
Originally Posted by cnet
HDR for TVs aims to show you a more realistic image, one with more contrast, brightness and color than before.

An HDR photo isn't "high dynamic range" in this sense. The image doesn't have the dynamic range possible in true HDR. It's still a standard dynamic range image, it just has some additional info in it due to the additional exposures.
(Emphasis mine.)
I don't suffer from insanity, I enjoy every minute of it.
     
Spheric Harlot
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Reply With Quote
Nov 19, 2018, 08:27 AM
 
Good, so now that we’ve finally clarified that the subject here has absolutely nothing to do with HDR photography: is really nobody able to explain what makes a display like in the X and Xs „HDR“?
     