Basically: if I set my display to ~75% brightness and open the video, the whites in the video are 100% brightness, way brighter than #FFF interface white on the rest of my screen.
But if I increase my display brightness to 100%, the whites in the video are the same as the interface white, because it obviously can't go any brighter.
If I decrease my display brightness to 50%, the whites in the video are no longer at maximum 100% brightness, maybe more like 75%.
But it's also kind of buggy -- after messing around with system brightness a bit, the video stops being brighter and I have to quit QuickTime and reopen it for the effect to come back. Also, opening and closing the video file makes my cursor disappear for a couple of seconds!
I'm wondering if it switches from hardware dimming to software dimming when the video is opened and closed, and whether that switch is what makes the cursor disappear. If so, though, it's flawlessly undetectable in terms of brightness -- the interface white and grays don't change at all.
Interestingly, confirming it: taking a screenshot of the video while the screen is at 75% brightness shows massive brightness clipping in the video, since it's "overexposed" relative to the interface's color range. But taking a screenshot at 100% screen brightness shows no clipping, because the video is no longer "overexposed" relative to the interface.
I'm just so surprised -- I had absolutely no idea macOS worked like this. I'd never heard of this feature until now.
In order to pull this off, you need to know exactly how many nits bright the display is, and you need complete software control of the actual hardware brightness. On Windows, you have neither. Enabling HDR mode completely throws off your current colors and desktop brightness: you have to reset your physical monitor settings and then dial in the new desktop white point with a software slider Microsoft buried in the advanced HDR settings (which almost nobody knows how to use), hoping to land somewhere in the vicinity of what you had before.
When it comes to display technology, vertical integration is a huge benefit. Look at high-DPI: the state of the art on Windows in 2020 is nowhere near as good, in software implementation or actual user experience, as Apple's was on day one when it introduced Retina MacBooks back in 2012.
On Windows the systemwide "color management" basically consists of assigning a default color profile that applications can choose to use -- which generally only professional design/photo/video software does, not the desktop or most "normal" apps.
I wonder what happens if you try this on the LG 5K hooked up to a Mac. It is physically the same panel that's in the iMac, so in theory it can present the same range. But if the OS needs to know the exact physical abilities of the display, it might not be able to detect that LG display. Or maybe Apple does detect it, because they partnered with LG.
> The operating system is complicit in this trickery, so the Digital Color Meter eyedropper shows “white” as 255, as do screenshots.
Digital Color Meter shows both UI white and video white as exactly #fff despite the video white being much brighter!
Even at full screen brightness, the video white was noticeably brighter
It's not technically HDR, but if I view HDR videos with it in SDR mode with the brightness turned up to maximum, there is definitely a bit of an "HDR effect", similar to what Apple has achieved.
However, under Windows, without support from the operating system, this doesn't really work. The colours are shifted and the brightness of the content is way too low.
Microsoft could add something similar, but this is a company that thinks that colour is a feature in the sense of:
Colour: Yes.
We're rapidly approaching 2021. I suspect we'll have to wait until 2030 before Microsoft gets its act together and copies even half of what Apple has in its display technology now.
https://appleinsider.com/articles/20/08/03/what-hdr-hdr10-an...
That plus P3 gamut means a video playing on a display is closer to what a filmmaker intended.
https://en.wikipedia.org/wiki/DCI-P3
What's cool is: 1) 4-year-old Macs have become HDR laptops, and 2) the implementation is subtle: you get full brightness in the HDR video without the rest of the UI blasting to full brightness.
That video can have a very bright sky, for instance. On other OSes you'd get both a bright sky and a blinding text box, or neither.
It’s also a very bright display in its own right, with 1600 nits vs the 300-400 of a regular one. And 1,000,000:1 contrast as well.
If my screen is already at 100% brightness then there's no HDR effect. 100% brightness is true 100% brightness, the max capability of my backlight.
The only difference is that if my screen is at less than 100% brightness, the HDR content can be brighter than the rest of the screen because it has the headroom.
Does that make sense?
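To make the idea concrete, here's a tiny back-of-envelope sketch. The 500-nit figure and the function name are my own inventions, not Apple's API -- just the arithmetic behind "headroom":

```python
# Hypothetical sketch (not Apple's actual API): how much "extra" brightness
# EDR content can use depends on where the backlight currently sits.
MAX_NITS = 500.0  # assumed panel maximum

def edr_headroom(brightness_slider: float) -> float:
    """Ratio of panel max to current SDR white (1.0 = no headroom)."""
    sdr_white_nits = MAX_NITS * brightness_slider
    return MAX_NITS / sdr_white_nits

print(edr_headroom(1.0))   # 1.0 -> no HDR effect at full brightness
print(edr_headroom(0.75))  # ~1.33x headroom
print(edr_headroom(0.5))   # 2.0x headroom
```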
My impression is that PMing this is really hard. And then each of the other guys is going to have an opinion that this shouldn't be done because it's so rare, etc.
Something must be organizationally right for engineering this capable to have been spent on such a barely noticeable feature.
I love it when products casually have cool things like this. Not quite the same scope, but IntelliJ's subpixel hinting option has each element of the dropdown render with the hinting it describes. You don't have to pick an option to see it; you can just preview each one directly.
https://hbr.org/2020/11/how-apple-is-organized-for-innovatio...
Most frameworks don't let applications touch pixel data without jumping through some hoops, because by restricting it you can implement things like lazy loading, GPU JPEG decoding, GPU resizing, etc.
It's easier when you control all parts of the stack. No way they could have pulled that one off with NVidia, who "famously" broke with Apple years ago when Apple demanded to write the drivers themselves... for valid reasons, when one looks at the quality of NVidia's Windows and Linux drivers. The Windows ones are a hell of a lot buggier, and the Linux ones barely integrate with Linux because NVidia refuses to follow standards.
It really is beyond belief that an organization with so many employees can fail to adhere to a uniform vision and standard and to focus on getting the details right.
I'm a lifelong Windows and Android user. But honestly, seeing articles like this, and how smooth the macOS UI is and how uniformly they apply new updates and UI changes, makes me extremely jealous and resentful that Microsoft is so bad at something so basic.
Features are great, but the UI is the first thing users interact with. They need to fix that before anything else.
Now they want to give you the option to run Android apps on Windows through emulation. This is just going to create a bigger jumbled mess.
The latest Windows 10 iteration is by far the snappiest OS I've used in a long time, since it uses GPU acceleration for the desktop window manager. You can check this in Task Manager. The icing on the cake, if you have a laptop with two GPUs (Optimus), is that you can run a demanding 3D app like a game in windowed mode in parallel with other stuff like watching videos on YouTube, and you can see in Task Manager how Windows uses the external GPU to render the game and the integrated GPU to accelerate your web browser, all running buttery smooth.
>In fact, their OS is in such shambles and is a disoriented mess with respect to UI consistency.
True, but that's what you get with 30 years' worth of built-in backwards compatibility. I can run a copy of Unreal Tournament 1999 that was just copied off an old PC with no sweat, right after ripping and tearing in Doom Eternal. Can you run 20-year-old software on current Apple hardware without emulation? Apple can afford to innovate in revolutionary ways because it dumps older baggage whenever it feels like it and starts from a fresh drawing board without looking back -- see the Intel to Apple Silicon transition. In 2 years x86 apps will be considered legacy/obsolete on Mac hardware. Microsoft can't really do this with Windows, so yeah, it's a mess of new GUI elements for the simple stuff and Windows 2000-era GUI elements for the deep pro settings. The advantage is that if you're an old-time Windows user you can easily find your way using the "old" settings, and if you're new to Windows you can do most configuration through the "new" GUI without touching the scary-looking "old" settings.
Uh, Windows started doing that in Vista.
I worked at Facebook for years and it now has a similar problem. Developers are evaluated every six months on their 'impact', which results in many dropping boring work and joining teams that are doing new things, even if they aren't needed.
Windows still struggles to correctly display Adobe RGB (wide gamut but not HDR) JPEG images I downloaded in 1999, now over 21 years ago and counting.
They'll get there... eventually. Maybe next decade, by which I mean 2030, not 2021.
If you’re not spent time in a huge company, this might seem to be the case. But really, Apple’s uniform standard really is the exception. I’m sure there are other organizational costs for this, such as it being harder to take risks with products or execute quickly.. but gosh they are good at producing a cohesive, mostly consistent set of products. I deeply appreciate their attention to detail and long term commitment.
This is a bit misleading. The backlight isn’t at a higher level than necessary for sRGB content all the time, just whenever any HDR encoded videos or EDR apps are open. When you open an HDR video you can see the highlights getting brighter as the backlight gets pushed.
Yup, I think this has to be the case. But the crazy thing is, I can't perceive any shift in UI brightness whatsoever, not even a flicker, when I open/close an EDR video.
I would have thought that there would be some slight mismatch at the moment the backlight is brightened and pixels are darkened -- whether it would be a close but not perfect brightness match, or a flicker while they're not synced. But nothing.
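A toy model of what I imagine the handoff looks like -- every number here is invented, and the real pipeline is surely more complicated, but it shows why SDR content can look identical before and after the switch:

```python
# Toy model (all numbers made up): when EDR kicks in, the backlight is
# boosted and SDR pixel values are scaled down by the same factor, so
# SDR content renders at the same nits before and after the switch.
def rendered_nits(pixel: float, backlight_nits: float) -> float:
    return pixel * backlight_nits

backlight = 250.0          # user's ~50% brightness setting
sdr_pixel = 1.0            # UI white before EDR engages

boost = 2.0                # EDR engages: push the backlight up 2x...
backlight *= boost
sdr_pixel /= boost         # ...and darken SDR pixels by the same 2x

assert rendered_nits(sdr_pixel, backlight) == 250.0  # UI white unchanged
hdr_pixel = 1.0            # HDR white can now use the full range
assert rendered_nits(hdr_pixel, backlight) == 500.0  # brighter than UI white
```

If the boost and the pixel rescale aren't applied in the exact same frame, you'd expect the flicker you describe -- which makes the seamlessness all the more impressive.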
As I mentioned in another comment, the only giveaway is that my cursor (mouse pointer) disappears for a second or two. I have to guess that adjusting its brightness happens at a different layer in the stack that can't be so precisely synced.
So that's why Lunar[1] reads a much higher brightness and makes all external monitors as bright as ten thousand suns when HDR content is played.
I wonder if there's any way to detect this and read or compute the SDR brightness instead.
But what about blacks? If you have a dark scene with bright highlights (e.g. a campfire at night), do the black parts of the scene get blown out because of backlight bleed?
What I like about conclusions of this piece is it points to how strategic Apple is thinking in its leverage due to the breadth of distribution of advanced hardware and software.
Apple is able to set entire new standards of expectation for customers that are _very_ hard for competitors to follow.
While competitors fixate on some single feature like night photo quality, Apple is also subtly chipping away at something like this.
The cheaper version of this display has a price tag of $5k, the more expensive one $6k.
I never spent even remotely as much money on a display, so I cannot speak from first-hand experience. But a quick search for graphics-design displays yields several competitor products that have similar or better specs, at generally even lower price tags.
Apple does many things, but certainly not bleeding-edge innovation. Of course it likes to sell its products as such, but I guess that's just how marketing works.
Translation: I don't understand this display.
> But a quick search for graphics-design displays yields several competitor products that have similar or better specs, at generally even lower price tags.
Translation: Despite my lack of knowledge, I feel qualified to say Apple sucks.
Typical HN.
What? Having been in the inner sanctum of engineering within Apple for years, that's exactly the engineering priority for groups I saw and worked within. I'm genuinely curious why you assert otherwise, find it surprising.
The monitors comparable to the Pro Display XDR are the ASUS PA32UCX ($4,500) and the EIZO CG319X ($6,000), which usually require full recalibration after a certain amount of usage.
As for bleeding edge, Apple has pioneered plenty of technology. It's true that it builds upon foundations of technical designs and scientific discoveries by others but that applies to every other company as well. Very few organizations are capable of going straight from invention in a science lab to large scale commercial product all by themselves. If you judge by how much "new" technology has actually reached consumers though, Apple is clearly leading the field.
The closest competitor I’ve found is the PA32UCX at $4500, and supposedly its fan is noisy enough that I’m hesitant to buy it.
> it’s strange and new, and possibly unique to Apple.
But that is exactly how Windows 10 does it with HDR displays, too. So not really unique to Apple. To the article's benefit, it did say "possibly" :)
I'm using Win 10 myself with an HDR display, and HDR white appears brighter than "desktop" white just like in the article photo.
There is also a slider in Win10 HDR settings that allows you to bring SDR/"desktop" white up if you wish to oversaturate SDR.
When you toggle HDR on Windows, the desktop becomes dull gray and desaturated precisely because they pull the previous desktop brightness down to something less than 255. So you then have to adjust your monitor's brightness up to compensate. The monitor's brightness effectively sets the upper cap of the HDR brightness, so if your brightness was set at 50% before, now you've got to fiddle with the monitor to boost the screen brightness to 100% to let HDR function and to recover your previous desktop white brightness (you'll probably also have to adjust the software "desktop white" point slider you mentioned, since MS has no clue what the correct monitor brightness and SDR pull-down amount should be, so good luck matching your previous desktop colors and brightness). In my experience very few people successfully manage to set up their Windows HDR correctly, and even if you do, there's no way to "seamlessly" switch between the two modes (which you have to do, since tons of stuff on Windows doesn't work properly when HDR mode is enabled). I haven't checked Surface or other MS hardware; perhaps they're able to do something more clever there?
What Apple does, is that when your display brightness is 50% and you display HDR content, the HDR content will seamlessly appear at a brightness somewhere between 75% to 100% of the maximum screen brightness. That is a seamless HDR effect, giving you the whiter than white experience next to your other windows that just works.
Though I remember having read that the Windows HDR stuff works slightly differently for internal monitors (e.g. in laptops), is your experience with those?
You have the HDR/SDR brightness balance to set and the monitor's own contrast adjustments.
Macs and its apps have been properly color managed for decades. That's why the transition from SDR to HDR monitors has been painless. Apps have been ready for it for a long time.
No, apps on Windows HDR look normal unless they are HDR-aware and use the "extra brightness".
In particular https://gitlab.freedesktop.org/swick/wayland-protocols/-/blo... was linked which discussed how a wayland compositor would have to display mixed "HDR" and SDR content on the same display. This document even has references to EDR. Ultimately this would end up achieving a similar result as what's described in the blog post here.
If you're interested in the technical details on what may be necessary to achieve something like this, the wayland design document might be a good read.
The first thing you think is, how was I OK with this terrible standard video to begin with? The HDR version just looks SO MUCH better and the standard looks so flat next to it. Like comparing an old non HDR photo with an HDR one.
What is the source for this? I don't see any justification for this claim in the article. There are plenty of ways to implement this feature that don't involve permanently throwing out dynamic range on all your SDR panels. I'm not even convinced from reading this that they aren't HDR panels to begin with - the idea of an iPhone having a 9-bit or 10-bit panel in it isn't that strange to me, and while that wouldn't be enough for like Professional-Grade HDR it's enough that you could pair it with dynamic backlight control and convince the average user that it's full HDR.
Considering Apple controls the whole stack and uses a compositor, there's nothing stopping them from compositing to a 10:10:10:2 or 11:11:10 framebuffer and then feeding that higher-precision color data to the panel while controlling the backlight. Since they control the hardware, they can know how bright it will be (in nits) at various levels.
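For illustration, packing one pixel into such a higher-precision word might look like the sketch below. The bit layout is an assumption (real formats vary by API and endianness); the point is that 10 bits per channel gives 4x the tonal resolution of 8 bits in the same 32-bit word:

```python
# Sketch of packing one pixel into a 10:10:10:2 (RGB10_A2-style) 32-bit
# word: 10 bits each for R, G, B and 2 bits of alpha. Layout assumed.
def pack_1010102(r: float, g: float, b: float, a: int = 3) -> int:
    """r, g, b in [0, 1]; a is 2-bit alpha (0-3). Returns a 32-bit word."""
    q = lambda v: min(1023, max(0, round(v * 1023)))  # quantize to 10 bits
    return (a << 30) | (q(b) << 20) | (q(g) << 10) | q(r)

word = pack_1010102(1.0, 0.5, 0.0)
assert word & 0x3FF == 1023          # red channel: full 10-bit value
assert (word >> 10) & 0x3FF == 512   # green channel: ~half scale
```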
I've also gotten into the habit of using uBlock Origin to kill all of the "Accept Cookies" type popups without ever clicking accept. Those are just as intolerable as ads to me.
Another story: I advised my junior colleague to install an ad blocker in her browser, and her reply was: "How can I watch the commercials then?" Umm... okay.
My assumption:
1) people just didn't know better or
2) they actually loved those flashing ads all along
> But at key parts of the story, certain colors eek outside of that self-imposed SDR container, to great effect. In a very emotional scene, brilliant pinks and purples explode off the screen — colors that not only had been absent from the film before that moment, but seemed altogether outside the spectrum of the story’s palette. Such a moment would not be possible without HDR.
I think the author knows that this is a special case of the effect where you limit color palette to some range of colors for a duration of the film and then exceed that range in places—no HDR in particular fundamentally needed to make this possible.
True, HDR can give a greater effect in absolute colorimetric terms when full palette is revealed, but the perceived magnitude depends on how restricted the original palette was prior to the reveal, and how masterfully the effect is used in general.
It seems that the new MacBook Air with M1 can't play HDR content on external displays. :(
The MacBook Air with M1 has two USB 4 ports. USB 4 is Thunderbolt 3 + some extra bits.
See https://support.apple.com/en-us/HT201736 for information.
1) the software needs to render with the correct gamut profile (typically "P3.display")
2) the OS needs to have a reasonable color profile for the display that knows about the higher gamut
3) the display needs to be in the correct mode to interpret and render in the correct gamut.
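Step 1 in that list boils down to undoing the transfer function before doing any gamut math. As a sketch, here's the standard sRGB EOTF (Display P3 uses the same curve; only the primaries differ):

```python
# Standard sRGB transfer function (IEC 61966-2-1): encoded value -> linear
# light. Display P3 shares this curve, so step 1 looks the same for both.
def srgb_to_linear(c: float) -> float:
    """Encoded value in [0, 1] -> linear light in [0, 1]."""
    if c <= 0.04045:
        return c / 12.92                      # linear toe segment
    return ((c + 0.055) / 1.055) ** 2.4       # power-law segment

assert srgb_to_linear(0.0) == 0.0
assert abs(srgb_to_linear(1.0) - 1.0) < 1e-12
print(srgb_to_linear(0.5))  # ~0.214: mid gray is only ~21% linear light
```

Only after linearizing do you apply the 3x3 matrix that maps between the sRGB and P3 primaries, then re-encode for the display's actual mode (step 3).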
My LG HDR400 display only has one obvious setting for HDR as a "quick" setting, but it behaves like what you said, and that just drops gamma.
Monitors that are IPS (i.e. most monitors) with HDR are mostly fake.
If you want real HDR600, try the Samsung Odyssey G7; but even that does not have full-array local dimming (which you'd expect to spend $2k for).
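For anyone wondering what full-array local dimming buys you, here's a 1-D toy sketch (zone sizes and values invented): each backlight zone follows its brightest pixel, and pixel values are compensated, so dark zones get genuinely deeper blacks than a single global backlight could give:

```python
# Toy 1-D model of full-array local dimming: per-zone backlight tracks the
# brightest pixel in the zone, and pixel values are rescaled to compensate.
def dim_zones(pixels, zone_size):
    """pixels: linear values in [0, 1]. Returns (zone_levels, compensated)."""
    zones, out = [], []
    for i in range(0, len(pixels), zone_size):
        zone = pixels[i:i + zone_size]
        level = max(max(zone), 1e-3)          # backlight follows the peak
        zones.append(level)
        out.extend(p / level for p in zone)   # rescale pixels to compensate
    return zones, out

# Campfire-at-night case: a dark zone next to a bright one
zones, comp = dim_zones([0.01, 0.02, 0.9, 1.0], 2)
assert zones == [0.02, 1.0]  # dark zone's backlight drops -> deeper blacks
```

The blooming complained about upthread happens exactly at zone boundaries, where one zone's high backlight level leaks into a neighboring dark zone.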
For the 5-6k that this display from Apple costs, I would certainly expect it to be an OLED display at LEAST. But it's not.
How is this the best we can get and it's NOT OLED? Dimming zones for 6k? I don't understand what's going on, I just want a nice OLED monitor that will fit on my monitor arm. I'll even pay the same price that you can get an AMAZING 55" OLED TV from LG for; 1500-2k.
Also the amount of bright colors used in computer interfaces would cause some significant discomfort.
And I'm not sure how legitimate this claim is, but I've read before that OLED suffers from dead pixels at a higher rate than other screen types, but don't take that too seriously without proof.
I see this repeated a lot. Do you have any numbers/images on actual burn-in in OLED screens? It would be interesting to know how long an OLED screen remains usable when used as a PC monitor.
Or to put it another way: burn-in does not seem to be enough of a concern to prevent Samsung et al. from putting OLED screens into phones.
> Also the amount of bright colors used in computer interfaces would cause some significant discomfort.
The entire point of Apple's solution here is that UI's max brightness is not the display's true max brightness.
White is white. White on a display is the brightest point the display can display.
What Apple is doing, as the article explained, is showing regular white as gray. That's not cool, that's just stupid. It's exactly what TVs at Best Buy do when showing SD vs HD content: they ruin the regular image just so you can see the difference.
The issue is that my monitor is not a demo display, it’s what I use sometimes in daylight, and I’d very much appreciate that extra brightness that Apple takes away from me.
You know what this means for you? Everything you see and watch on your computer is not as bright as it could be. On an LCD screen that's a big deal, because suddenly your blacks are brighter (because the backlight is at 100%) and your whites are dimmer (because Apple holds back brightness on the off chance that you have HDR content).
#ffffff is L=100%. What is L=800%? It exists in HDR content, and we can’t just make the web color #ffffff a dim gray to the eye.
We must start thinking in terms of HSL or LAB or even RGBL, and consider that L > 100% is where HDR peak brightness lives.
HDR’s color space exceeds the luminosity that sRGB hex triplets can represent, and remapping HDR color spaces into sRGB hex gives you horrendous banding and requires complex gamma functions. The CSS colors spec is finalizing on this, but essentially we’re at the last days of hex codes being a great way to express color on the web. They’ll remain good as a last resort, but it’s time to move a step forward.
Apple is pinning sRGB hex #ffffff to “paper white” brightness because the hex color specification can’t encompass the full range of monitors anymore. The difference between #ffffff and #fefefe can be enormous on a display with 1800 nits of peak brightness, and if you map #ffffff to peak brightness, you burn out people’s eyes with every single web page on today’s legacy sRGB-color Internet (including Hacker News!). That’s why HDR calibration routines put “paper white” at around 400 nits.
So, then, sRGB hex colors have no way to express “significantly brighter than paper white #ffffff”, and UI elements have little reason to use this extended opportunity space - but HDR content does, and it’s nice to see Apple allowing it through to the display controller.
But there’s no way to make use of HDR in web content -- other than embedded images and videos -- if we continue thinking of color in terms of hex codes. This insistence that we remap hex codes into thousands of nits of spectrum is why web colors in Firefox on an HDR display make your eyes hurt (such as the HN topbar): it’s rescaling the web to peak brightness rather than to paper white, and the result is physically traumatic to our vision system. Human eyes are designed for splashes of peak brightness, but when every web page is pouring light out of your monitor at full intensity, it causes eye strain and fatigue. Don’t be like Firefox in this regard.
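For a sense of scale on the "complex gamma functions" involved: the PQ curve (SMPTE ST 2084) that HDR10 video uses maps a [0, 1] signal to absolute nits, and it is so steep at the top that even adjacent 8-bit code values land hundreds of nits apart -- nothing like sRGB's gentle gamma:

```python
# SMPTE ST 2084 (PQ) EOTF: nonlinear signal in [0, 1] -> absolute nits.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(n: float) -> float:
    p = n ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(round(pq_to_nits(0.5)))   # 92 -> half the signal is still quite dim
print(round(pq_to_nits(1.0)))   # 10000 -> full signal is 10,000 nits
# the top two 8-bit codes are hundreds of nits apart:
print(pq_to_nits(1.0) - pq_to_nits(254 / 255))
```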
“But how do we conceive of color, if not in hex codes?” is a great question, and it’s a complicated question. In essence you select color and brightness independent of each other, and then make sure that it looks good when peak brightness is low, and doesn’t sear your eyes when peak brightness is high.
If this interests you, and you’d like to start preparing for a future where colors can be dimmer or brighter than sRGB hex #FFFFFF, here are a couple useful links to get you started:
https://news.ycombinator.com/item?id=15534622
https://news.ycombinator.com/item?id=22467744
As a final note, there are thermal reasons why peak brightness can be so much higher than paperwhite: your display can only use so much power for its thermal envelope. Yes, HDR displays have thermal envelopes. So overusing peak white, such as scaling #ffffff to the wrong brightness, can actually cause the total brightness of the display to drop when it hits thermal protections, while simultaneously wasting battery and hurting your users’ eyes.
It's gonna be five minutes before everybody's extra important call-to-action buttons and probably ads are max brightness too
I've always wondered what it'll take for the major OS manufacturers to implement an anti-seizure filter for the content they transmit to their screens, and I'd bet that a flickering ad at HDR max brightness causing seizures worldwide one day will finally compel them to do so.
(If HDR display is connected, that option is replaced with a HDR on/off switch.)
This is simply not true; it's a lie. Everyone in Hollywood and the professional print world who needed color-accurate displays that could be calibrated was using HP DreamColor displays or similar products from companies like BenQ
( For example: https://nofilmschool.com/2017/12/benq-sw271-color-management... )
No movie studio does professional color grading on anything else.