macOS/iPadOS cranks the backlight brightness up but then adds a black filter to the non-bright content to dim it back to “normal” levels.
Also given how prevalent the issue of QR code brightness is for many use cases this could actually prove useful.
??? It's just an all-white video, with an overlay of black QR code applied by CSS. It's already "fully automated."
Both (the video and the overlay) are small enough to be base64'd right into the HTML.
Once again webdevs creating another UX nightmare.
Isn't that the sole purpose of HDR!?
You might be thinking of wide-gamut displays?
https://webkit.org/blog/10042/wide-gamut-color-in-css-with-d...
I might be wrong however.
Display-P3 is only the color space (and maybe the EOTF). This video uses the "absolute brightness" feature of HDR to increase brightness.
I’ve noticed that you cannot long press the brighter code to read it.
This works on the regular code next to it (tested in safari mobile).
Not sure why they've done this, perhaps something about the HDR approach requires both the black and white to be a part of the same video render.
Also - I'm not sure if it's a browser default or some other website CSS but imo there's no real reason long press shouldn't work on a video anyway... videos need accessibility too.
odd choice indeed, I wasn’t thinking clearly when I experimented and wrote the article.
putting a black and transparent image atop the video sounds like a much better idea — i will try it and see if it works. thanks for the suggestion!
I'm assuming there's some problem with screen-rendered QR codes that this solves, but my quick google search mostly just resulted in listicle spam.
What's the problem this solves?
I notice some airline apps and Google Wallet set screen brightness to maximum when you view a boarding pass.
When I look at the two codes side-by-side with normal (full) brightness, the HDR one looks quite a bit brighter. If I dim my screen brightness substantially, the HDR one also gets a lot dimmer. Not sure if this solves that problem, though it could be an issue with my particular screen?
It may make a difference with laser scanners or other more commodity hardware than smartphone cameras, which are pretty amazingly good these days.
The only thing I've seen it do is override my brightness setting to make my screen go to full brightness when I've set it lower.
One is that HDR displays can usually get MUCH brighter. An iPhone 5 could do 500 nits. An iPhone 14 Pro can do 1,000 - 2,000 nits.
That’s what’s used here. You get all the same colors as before, but max brightness is way brighter.
The other thing is bit depth. The standard we had for a long time was 24-bit color, so ~16.7 million colors. Instead of having 8 bits per channel, screens may now have 10 or 12 bits per channel. I can’t find just how many a modern iPhone has.
This means there are more shades between black and 100% bright red. There are more variations between blueish green and greenish blue. Gradients can be smoother. Objects that are mostly one color (yellow corn, a red apple, etc.) now have more options they can use to provide definition and detail.
In a very dark scene, there are now more dark shades to show things with. When looking at bright clouds, they don’t have to be all white and washed out.
And combined with the increased brightness a scene can show definition in both dark and bright areas without having to wash everything out.
I have one Dell made a few years ago. It supports the additional colors of P3 but isn’t any brighter than any other quality normal display of the time.
Every (compliant) screen will produce that range, when given numbers between the minimum and maximum. It's nowhere near every color we can see (the dark gray area), but it is a decent manufacturing target and it covers a very practical area of our vision.
Screens could just produce their maximum range all the time... but then you'd get screwed up colors, with every display being different than the others (some redder, some greener, etc.). Hence standard color ranges, like sRGB, so bananas used for color scale always look the same shade of yellow. That triangle looks the same on my screen as it does on yours, assuming the manufacturer cared at all about consistency. (remember the stupidly red OLEDs when they were new? they looked awful, humans looked like Oompa Loompas)
So screens are capable of more than they normally display, sometimes by a pretty large margin. Somehow you have to tell the screen to go beyond its normal range, to show yellower yellows for super-bananas - that's HDR.
It does give you more range.
A lot of modern screens literally can't run at their full dynamic range, they would draw too much power and/or get too hot (some professional displays I use at work have a power cable as thick as your thumb and radiate heat like an oven with the door open. They also require following warm-down procedures to prevent the electronics from failing if they cool down too quickly when you power them off).
They can, however, run at full brightness for a small section of the display or brief periods of time. HDR videos are one way to instruct the screen to do that.
(obviously, how much range it can give you depends on what screen technology you have - Apple's software has supported this for a long time, but even some current-model displays they sell can't really do HDR properly - though many that "can't" do HDR are able to make a best effort using complex software tricks)
It is usually combined with new display technologies that can emit unusually high brightness levels. On a traditional display there is obviously no point.
I can understand the increased granularity, if you're making monitors that go much brighter, then you get more light levels, but why define the old maximum to be less than the new maximum?
Whether the effect is significant depends on a lot of stuff; how the display is built, how the codec works, your viewing conditions, and not least of all, whether the content "looks better" artistically with that much more range.
Just like how dynamic range compression in audio clips the range of music, and how some pieces of music sound more expressive when the range of amplitudes is used well for artistic effect.
It gives you more range.
> so what does it actually do?
If you have done a little C programming, you will know (or can check) that a 32-bit `int` represents values from -2147483648 to 2147483647 while a 32-bit `float` goes from -340282346638528859811704183484516925440. to 340282346638528859811704183484516925440. - clearly more "range". This is almost exactly what is happening here.
(If you haven't done a little C programming, you should!)
> The only thing I've seen it do is override my brightness setting to make my screen go to full brightness when I've set it lower.
That is too bad. My experience is very good: only the QR code becomes much more striking, while the rest of the display remains the same. Perhaps it is implemented by increasing the brightness on your display to compensate for the reduced range available due to your brightness setting, but if that's what is happening here, something else is also compensating the colours to make the non-HDR areas of my display darker so that I cannot notice.
What’s happening here is using more bits per channel than SDR's 8 (a byte) - typically 10 or 12, not quite a full 16-bit short. At 10, 12, or 16 bits per channel, you can represent 2^6, 2^12, or 2^24 times as many values as SDR.
Basically if srgb content was just naively stretched to rec2020 it'd look garish and oversaturated instead of what the designer/photo/videographer intended. If it was additionally stretched to HDR, it'd look garish and eye-searingly bright.
Because the differences between HDR and SDR displays are so dramatic, it is forcing everyone to scramble to do color management, as the result would be pretty unusable otherwise.
Look at the second example (red images) and click the Display P3 option.
Do you see the symbol inside? No? Well then your display / monitor isn’t capable of showing a colour range higher than sRGB.
HDR is similar in respect to brightness and contrast.
If your device supports HDR, then there’s more steps in the brightness scale that your monitor can display.
Basically, the only observable difference I'm noticing between the two laptops is that the non-HDR QR code looks dimmer on the HDR-supporting machine. I'm sure this is some kind of optical illusion, but it's certainly not doing much to make me value the existence of HDR.
So, yeah, both QR codes probably are brighter on your old Pro, if its display brightness is set higher. And, if you adjust display brightness on the Air, you'll see them converge to the same brightness there, too (on my M2 Air, it seems that Apple leaves a little headroom for the HDR "white" to appear a bit brighter than UI white at full brightness; not sure if the M1 Air behaves the same or not as it has an older display).
Displays with local dimming (and OLEDs) can display, in small areas, colors that are brighter than the maximum full-screen brightness. That's where HDR content gets really compelling-- if all you can do is dim the whole screen, you're unlikely to see much of anything that you couldn't do just by bumping up the brightness with SDR content.
<qrcode>Some content!</qrcode>
With necessary browser/device level support to automatically do this HDR thing.
You must have an HDR or EDR-capable screen and there must be an HDR video playing to activate the HDR context (can be <video> playing somewhere in a webpage).
Lunar app's website used this trick! Check it out. https://lunar.fyi/#xdr
Chrome also seems to support HDR. It's just Firefox that doesn't support this yet. There's an open issue for it: https://bugzilla.mozilla.org/show_bug.cgi?id=1793091
More interesting is whether CSS will be fixed to support this. There seems to be some recent work on this: https://w3c.github.io/ColorWeb-CG/
So, this hack might long term not be needed if this gets addressed properly.
I would not have thought that QR codes could become a hot topic in 2023, but combining this with the new Stable Diffusion ControlNet-generated QR codes [1] could actually be pretty interesting.
[1] https://www.linkedin.com/posts/gschwandtner-michael_qr-stabl...
So in terms of hex digits, three sets (Y, Cb, Cr) of three, with not all values representing valid colors.
Compared to SDR standards like sRGB, HDR formats also typically use larger color spaces (Rec. 2020[1] is typical) and far more extreme transfer ("gamma") functions (PQ[2] or HLG[3]).
Finally, note that it is common for the encoded values to represent colors and intensities that far exceed the capabilities of most, if not all, display hardware, so the mapping from encoded values to actual displayed pixels can be rather complicated. Google "HDR tone mapping" for more than you ever wanted to know.
[1] https://en.wikipedia.org/wiki/Rec._2020
The "bright" one has metadata that tells the operating system to render white at the maximum possible brightness, instead of whatever brightness it would normally render white at.
It's broadly supported on Apple devices, though how well it works depends on the hardware you have.
For that matter, I thought AVIF supported HDR images.
See https://w3c.github.io/ColorWeb-CG/#png and the linked ML thread for some details.
I think Apple is trying to keep HDR for native apps only, like many other platform exclusive features like faceid, fingerprint reading, etc - which are all unavailable to webapps. Video is probably an exception.
Maybe there should be some CSS extensions for using HDR directly on web pages, though, if the developer really wants to.
Works on my iPad Pro 2020 iOS14