It didn't take very long to learn, and it turned out to be extremely important in the work I did during the early days at Waymo and later at Motional.
I wanted to pass along this fun video from several years ago that discusses HDR: https://www.youtube.com/watch?v=bkQJdaGGVM8 . It's short and fun, I recommend it to all HN readers.
Separately, if you want a more serious introduction to digital photography, I recommend the lectures by Marc Levoy from his Stanford course: https://www.youtube.com/watch?v=y7HrM-fk_Rc&list=PL8ungNrvUY... . I believe he runs his own group at Adobe now after leading a successful effort at Google that made their Pixel cameras the best in the industry for a couple of years. (And then everyone more-or-less caught up, just like with most tech improvements in the history of smartphones.)
This gets to a gaming rant of mine: Our natural vision can handle these things because our eyes scan sections of the scene with constant adjustment (light-level, focus) while our brain is compositing it together into what feels like a single moment.
However, certain effects in games (e.g. "HDR" and depth of field) instead reduce the fidelity of the experience. These features limp along only while our gaze is aimed at the exact spot the software expects. If you glance anywhere else around the scene, you instead perceive an unrealistically wrong coloration or blur that frustratingly persists no matter how much you squint. These problems will remain until gaze-tracking support becomes standard.
So ultimately these features reduce the realism of the experience. They make it less like being there and more like you're watching a second-hand movie recorded on flawed video-cameras. This distinction is even clearer if you consider cases where "film grain" is added.
It's crazy that post is 15 years old. Like the OP and this post get at, HDR isn't really a good description of what's happening. HDR often means one or more of at least 3 different things (capture, storage, and presentation). It's just a sticker slapped on for advertising.
Things like lens flares, motion blur, film grain, and shallow depth of field are mimicking cameras and not what being there is like--but from a narrative perspective we experience a lot of these things through tv and film. It's visual shorthand. Like Star Wars or Battlestar Galactica copying WWII dogfight footage even though it's less like what it would be like if you were there. High FPS television can feel cheap while 24fps can feel premium and "filmic."
Often those limitations are in place so the experience is consistent for everyone. Games will have you set brightness and contrast--I had friends who would crank everything up to avoid jump scares and to clearly see objects intended to be hidden in shadows. Another reason for consistent presentation is to prevent unfair advantages in multiplayer.
Ignoring film grain, our vision has all these effects all the same.
Look in front of you and only a single plane will be in focus (and only your fovea produces any sort of legibility). Look towards a bright light and you might get flaring from just your eyes. Stare out the side of a car or train when driving at speed and you'll see motion blur, interrupted only by brief clarity if you intentionally try to follow the motion with your eyes.
Without depth of field simulation, the whole scene is just a flat plane with completely unrealistic clarity, and because it's comparatively small, too much of it is smack center on your fovea. The problem is that these are simulations that do not track your eyes, and make the (mostly valid!) assumption that you're looking near or in front of whatever you're controlling.
Maybe motion blur becomes unnecessary given a high enough resolution and refresh rate, but depth of field either requires actual depth or foveal tracking (which only works for one person). Tasteful application of current techniques is probably better.
> High FPS television can feel cheap while 24fps can feel premium and "filmic."
Ugh. I will never understand the obsession with this effect. There is no such thing as a "soap opera effect," as people like to call it, only a slideshow effect.
The history behind this is purely a series of cost-cutting measures, entirely unrelated to the user experience or artistic qualities. 24 fps came to be because audio was slapped onto the film, and 24 was the slowest speed at which the audio track was acceptably intelligible, saving costly film stock - the sole priority of the time. Before that, we used to record content at variable frame rates but play it back at 30-40 fps.
We're clinging on to a cost-cutting measure that was a significant compromise from the time of hand cranked film recording.
</fist-shaking rant>
Such a blast from the past, I used to spend so much time just clicking that button!
If you have a good display (eg an OLED) then the brights are brighter and simultaneously there is more detail in the blacks. Why do you think that is worse than SDR?
HDR in games would frequently mean clipping highlights and adding bloom. Prior to "HDR," the exposure looked rather flat.
It may not be obvious, but film has a visual language. If you look at early film, it wasn't obvious if you cut to something that the audience would understand what was going on. Panning from one object to another implies a connection. It's built on the visual language of still photography (things like rule of thirds, using contrast or color to direct your eye, etc). All directing your eye.
Stereo film has its own limitations that were still being explored. In a regular film, you would do a rack focus to connect something in the foreground to the background. In stereo, when there's a rack focus people don't follow the camera the same way. In regular film, you could show someone's back in the foreground of a shot and cut them off at the waist. In stereo, that looks weird.
When you're presenting something you're always directing where someone is looking--whether its a play, movie, or stereo show. The tools are just adapted for the medium.
I do think it worked way better for movies like Avatar or How to Train Your Dragon and was less impressive for things like rom coms.
Sure, you need a good HDR-capable display and a native HDR-game (or RTX HDR), but the results are pretty awesome.
We’ve had HDR formats and HDR capture and edit workflows since long before HDR displays. The big benefit of HDR capture & formats is that your “negative” doesn’t clip super bright colors and doesn’t lose color resolution in super dark colors. As a photographer, with HDR you can re-expose the image when you display/print it, where previously that wasn’t possible. Previously when you took a photo, if you over-exposed it or under-exposed it, you were stuck with what you got. Capturing HDR gives the photographer one degree of extra freedom, allowing them to adjust exposure after the fact. Ansel Adams wasn’t using HDR in the same sense we’re talking about, he was just really good at capturing the right exposure for his medium without needing to adjust it later. There is a very valid argument to be made for doing the work up-front to capture what you’re after, but ignoring that for a moment, it is simply not possible to re-expose Adams’ negatives to reveal color detail he didn’t capture. That’s why he’s not using HDR, and why saying he is will only further muddy the water.
All mediums have a range, and they've never all matched. Sometimes we've tried to calibrate things to match, but anyone watching SDR content for the past many years probably didn't do so on a color-calibrated and brightness calibrated screen - that wouldn't allow you to have a brightness slider.
HDR on monitors is about communicating content brightness and monitor capabilities, but then you have the question of whether to clip the highlights or just map the range when the content is mastered for 4000 nits but your monitor manages 1000-1500 and only in a small window.
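The two options look roughly like this in code (a naive sketch; real displays use standardized roll-off curves like BT.2390, and the knee value here is invented):

```python
# Two naive ways to show 4000-nit-mastered content on a 1000-nit panel:
# hard-clip the highlights, or roll them off smoothly (a crude tone map).
# Illustrative only; the 600-nit knee is an arbitrary choice.

CONTENT_PEAK = 4000.0   # nits the content was mastered for
DISPLAY_PEAK = 1000.0   # nits the panel can actually produce

def hard_clip(nits: float) -> float:
    """Everything above the panel's peak is simply lost."""
    return min(nits, DISPLAY_PEAK)

def soft_rolloff(nits: float, knee: float = 600.0) -> float:
    """Pass dark/mid tones through unchanged, then compress highlights
    so detail between the knee and 4000 nits survives, squeezed into
    the knee-to-1000 range instead of being clipped."""
    if nits <= knee:
        return nits
    # map [knee, CONTENT_PEAK] linearly onto [knee, DISPLAY_PEAK]
    t = (nits - knee) / (CONTENT_PEAK - knee)
    return knee + t * (DISPLAY_PEAK - knee)

for v in (100.0, 800.0, 2000.0, 4000.0):
    print(v, hard_clip(v), round(soft_rolloff(v), 1))
```

Clipping keeps mid-tones accurate but destroys highlight detail; mapping keeps the detail but dims highlights that the grade intended to be searing. Neither is "right," which is exactly the problem.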
That said, there is one important part that is often lost. One of the ideas behind HDR, sometimes, is to capture absolute values in physical units, rather than relative brightness. This is the distinguishing factor that film and paper and TVs don’t have. Some new displays are getting absolute brightness features, but historically most media display relative color values.
Owning a display that can do 1300+ nits sustained across a 100% window has been the biggest display upgrade I think I have ever had. It's given me a tolerance for LCD, a technology I've hated since the death of CRTs, and turned me away from OLED.
There was a time I would have said i'd never own a non OLED display again. But a capable HDR display changed that logic in a big way.
Too bad the motion resolution on it, especially compared to OLED, is meh. Again, at one point, motion was the most important aspect to me (it's why I still own CRTs), but this level of HDR... transformative, for lack of a better word.
I came here to point this out. You have a pretty high dynamic range in the captured medium, and then you can use the tools you have to darken or lighten portions of the photograph when transferring it to paper.
That isn't what the article claims. It says:
"Ansel Adams, one of the most revered photographers of the 20th century, was a master at capturing dramatic, high dynamic range scenes."
"Use HDR" (your term) is vague to the point of not meaning much of anything, but the article is clear that Adams was capturing scenes that had a high dynamic range, which is objectively true.
https://www.kimhildebrand.com/how-to-use-the-zone-system/
where my interpretation is colored by the experience of making high quality prints and viewing them under different conditions, particularly poor illumination quality but you could also count "small handheld game console", "halftone screened and printed on newsprint" as other degraded conditions. In those cases you might imagine that the eye can only differentiate between 11 tones so even if an image has finer detail it ought to connect well with people if colors were quantized. (I think about concept art from Pokémon Sun and Moon which looked great printed with a thermal printer because it was designed to look great on a cheap screen.)
In my mind, the ideal image would look good quantized to 11 zones but also has interesting detail in texture in 9 of the zones (extreme white and black don't show texture). That's a bit of an oversimplification (maybe a shot outdoors in the snow is going to trend really bright, maybe for artistic reasons you want things to be really dark, ...) but Ansel Adams manually "tone mapped" his images using dodging, burning and similar techniques to make it so.
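That quantization idea is easy to play with (a toy sketch; the real zone system is defined in stops of exposure, not a linear split of luminance like this):

```python
# Map relative luminance onto Adams' 11 zones (0 through X).
# The image "reads" if it still communicates when reduced to these
# 11 tones, with texture surviving in the middle 9 zones.

def to_zone(luminance: float) -> int:
    """Quantize relative luminance in [0, 1] to a zone number 0..10."""
    luminance = max(0.0, min(1.0, luminance))
    return min(10, int(luminance * 11))

# A smooth gradient collapses to the 11-step scale:
print([to_zone(v / 10) for v in range(11)])
```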
These are all related things. When you talk about color, you can be talking about color cameras, color image formats, and color screens, but the concept of color transcends the implementation.
> The claim that Ansel Adams used HDR is super likely to cause confusion, and isn’t particularly accurate.
The post never said Adams used HDR. I very carefully chose the words, "capturing dramatic, high dynamic range scenes."
> Previously when you took a photo, if you over-exposed it or under-exposed it, you were stuck with what you got. Capturing HDR gives the photographer one degree of extra freedom, allowing them to adjust exposure after the fact.
This is just factually wrong. Film negatives have 12-stops of useful dynamic range, while photo paper has 8 stops at best. That gave photographers exposure latitude during the print process.
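For scale, each stop doubles the contrast ratio, so the gap is bigger than "12 vs 8" sounds (quick sketch):

```python
# Stops are powers of two, so the negative-vs-paper gap is larger than
# the raw numbers suggest: a 4096:1 range squeezed onto a 256:1 medium.

def contrast_ratio(stops: float) -> float:
    """Contrast ratio covered by a given number of stops."""
    return 2.0 ** stops

print(f"negative: {contrast_ratio(12):.0f}:1")
print(f"paper:    {contrast_ratio(8):.0f}:1")
```

Those 4 "spare" stops are the latitude the printer gets to play with in the darkroom.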
> Ansel Adams wasn’t using HDR in the same sense we’re talking about, he was just really good at capturing the right exposure for his medium without needing to adjust it later.
There's a photo of Ansel Adams in the article, dodging and burning a print. How would you describe that if not adjusting the exposure?
No, that’s not inherently true. AA used 12 zones, that doesn’t mean every negative stock has 12 stops of latitude. Stocks are different, you need to look at the curves.
But yes most modern negatives are very forgiving. FP4 for example has barely any shoulder at all iirc.
> The post never said Adams used HDR. I very carefully chose the words
Hey I’m sorry for criticizing, but I honestly feel like you’re being slightly misleading here. The sentence “What if I told you that analog photographers captured HDR as far back as 1857?” is explicitly claiming that analog photographers use “HDR” capture, and the Ansel Adams sentence that follows appears to be merely a specific example of your claim. The result of the juxtaposition is that the article did in fact claim Adams used HDR, even if you didn’t quite intend to.
I think you’re either misunderstanding me a little, or maybe unaware of some of the context of HDR and its development as a term of art in the computer graphics community. Film’s 12 stops is not really “high” range by HDR standards, and a little exposure latitude isn’t where “HDR” came from. The more important part of HDR was the intent to push toward absolute physical units like luminance. That doesn’t just enable deferred exposure, it enables physical and perceptual processing in ways that aren’t possible with film. It enables calibrated integration with CG simulation that isn’t possible with film. And it enables a much wider range of exposure push/pull than you can do when going from 12 stops to 8. And of course non-destructive digital deferred exposure at display time is quite different from a print exposure.
Perhaps it’s useful to reflect on the fact that HDR has a counterpart called LDR that’s referring to 8 bits/channel RGB. With analog photography, there is no LDR, thus zero reason to invent the notion of a ‘higher’ range. Higher than what? High relative to what? Analog cameras have exposure control and thus can capture any range you want. There is no ‘high’ range in analog photos, there’s just range. HDR was invented to push against and evolve beyond the de-facto digital practices of the 70s-90s, it is not a statement about what range can be captured by a camera.
Reminded me of the classic "HDR in games vs HDR in photography" comparison[0]
[0] https://www.realtimerendering.com/blog/thought-for-the-day/
The good thing about digital is that it can deal with color at decent tonal resolutions (if we assume 16 bits, not the limited 14 bit or even less) and in environments where film has technical limitations.
As for tone mapping, I think the examples they show tend way too much towards flat low-local-contrast for my tastes.
I'm a huge fan of Helldivers 2, but playing the game in HDR gives me a headache: the muzzle flash of weapons at high RPMs on a screen that goes to 240hz is basically a continuous flashbang for my eyes.
For a while, No Man's Sky in HDR mode was basically the color saturation of every planet dialed up to 11.
The only game I've enjoyed at HDR was a port from a console, Returnal. The use of HDR brights was minimalistic and tasteful, often reserved for certain particle effects.
I stopped playing that game for several years, and when I went back to it, the color and brightness had been wrecked to all hell. I have heard that it's received wisdom that gamers complain that HDR modes are "too dark", so perhaps that's part of why they ruined their game's renderer.
Some games that I think currently have good HDR:
* Lies of P
* Hunt: Showdown 1896
* Monster Hunter: World (if you increase the game's color saturation a bit from its default settings)
Some games that had decent-to-good HDR the last time I played them, a few years ago:
* Battlefield 1
* Battlefield V
* Battlefield 2042 (If you're looking for a fun game, I do NOT recommend this one. Also, the previous two are probably chock-full of cheaters these days.)
I found Helldivers 2's HDR mode to have blacks that were WAY too bright. In SDR mode, nighttime in forest areas was dark. In HDR mode? It was as if you were standing in the middle of a field during a full moon.
Also (mostly) on Windows, or on videos for your TV: a lot of cheap displays that say they are HDR are a range of hot garbage.
The end result is complete chaos. Every piece of the pipeline does something wrong, and then the software tries to compensate for it by emitting doubly wrong data, without even having reliable information about what it needs to compensate for.
https://docs.google.com/document/d/1A__vvTDKXt4qcuCcSN-vLzcQ...
But HDR, it's a minefield of different display qualities, color spaces, standards. It's no wonder that nobody gets it right and everyone feels confused.
HDR on a display that has peak brightness of 2000 nits will look completely different than a display with 800 nits, and they both get to claim they are HDR.
We should have a standard equivalent to color spaces. Set, say, 2000 nits as 100% of HDR. Then a 2000 nit display gets to claim it's 100% HDR. An 800 nit display gets to claim 40% HDR, etc. A 2500 nit display could even use 125% HDR in its marketing.
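The arithmetic behind that proposal is trivial, which is sort of the point (the 2000-nit reference is, of course, just my made-up number):

```python
# Proposed labeling: peak brightness as a percentage of a fixed
# reference, so "40% HDR" vs "125% HDR" is directly comparable.
# The 2000-nit reference is an arbitrary choice for the example.

HDR_REFERENCE_NITS = 2000.0

def hdr_percent(peak_nits: float) -> float:
    """Peak panel brightness as a percentage of the reference."""
    return 100.0 * peak_nits / HDR_REFERENCE_NITS

for nits in (2000.0, 800.0, 2500.0):
    print(f"{nits:.0f} nits -> {hdr_percent(nits):.0f}% HDR")
```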
It's still not perfect - some displays (OLED) can only show peak brightness over a portion of the screen. But it would be an improvement.
Everything is flattened, contrast is eliminated, lights that should be "burned white" for a cinematic feel are brought back to "reasonable" brightness with HDR, really deep blacks are turned into flat greys, etc. The end result is the flat and washed out look of movies like Wicked. It's often correlated to CGI-heavy movies, but in reality it's starting to affect every movie.
Because HDR wasn’t natively supported on most displays and software, for a long time it was just “hacked in there” by squashing the larger dynamic range into a smaller one using a mathematical transform, usually a log function. When viewed without the inverse transform this looks horribly grey and unsaturated.
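Roughly the shape of that transform, sketched in Python (not any real camera log curve like Log-C or S-Log, just enough to see why the un-inverted image looks grey and lifted):

```python
import math

# Squash linear light with a log curve, undo with the inverse.
# A is an arbitrary constant controlling how hard highlights are squashed.
A = 50.0

def log_encode(linear: float) -> float:
    """Compress linear light in [0,1] into a [0,1] signal. Viewed raw,
    without the inverse, mid grey lands well above 0.5: flat and grey."""
    return math.log1p(A * linear) / math.log1p(A)

def log_decode(encoded: float) -> float:
    """Inverse transform: recover linear light from the log signal."""
    return math.expm1(encoded * math.log1p(A)) / A

# 18% mid grey gets lifted to roughly 0.59 by the encode, which is why
# ungraded log footage looks washed out on a normal display.
print(round(log_encode(0.18), 2))
```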
Directors and editors would see this aesthetic day in, day out, with the final color grade applied only after a long review process.
Some of them got used to it and even liking it, and now here we are: horribly washed out movies made to look like that on purpose.
It feels like to some photographers/cinematographers/game designers, HDR is a gimmick to make something look more splashy/eye catching. The article touches on this a bit, with some of the 2000s HDR examples in photography. With the rise of HDR TVs, it feels like that trend is just happening again.
It's late night here so I was reading this article in dark mode, at a low display brightness - and when I got to the HDR photos I had to turn down my display even more to not strain my eyes, then back up again when I scrolled to the text.
For fullscreen content (games, movies) HDR is alright, but for everyday computing it's a pretty jarring experience as a user.
For context: YouTube automatically edits the volume of videos that have an average loudness beyond a certain threshold. I think the solution for HDR is similar penalization based on log luminance or some other reasonable metric.
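The analogous metric for images might look something like this (the geometric-mean choice and the 300-nit threshold are mine, purely illustrative):

```python
import math

# Sketch of the suggested penalty: flag content whose average log
# luminance exceeds a threshold, analogous to loudness normalization.

def avg_log_luminance(nits: list[float], eps: float = 1e-4) -> float:
    """Geometric-mean luminance, the usual 'key' measure in tone mapping.
    eps guards against log(0) on pure black pixels."""
    return math.exp(sum(math.log(max(n, eps)) for n in nits) / len(nits))

def needs_penalty(nits: list[float], limit_nits: float = 300.0) -> bool:
    """True if the frame is, on average, obnoxiously bright."""
    return avg_log_luminance(nits) > limit_nits

# A mostly-dim frame with a few bright highlights should pass:
print(needs_penalty([100.0] * 90 + [1000.0] * 10))
```

The geometric mean matters here: it lets small, genuinely bright highlights through while still catching frames that are blasted bright everywhere.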
I don't see this happening on Instagram any time soon, because bad HDR likely makes view counts go up.
As for the HDR photos in the post, well, those are a bit strong to show what HDR can do. That's why the Mark III beta includes a much tamer HDR grade.
For anyone else who was confused by this, it seems to be a client-side audio compressor feature (not a server-side adjustment) labeled as "Stable Volume". On the web, it's toggleable via the player settings menu.
https://support.google.com/youtube/answer/14106294
I can't find exactly when it appeared but the earliest capture of the help article was from May 2024, so it is a relatively recent feature: https://web.archive.org/web/20240523021242/https://support.g...
I didn't realize this was a thing until just now, but I'm glad they added it because (now that I think about it) it's been awhile since I felt the need to adjust my system volume when a video was too quiet even at 100% player volume. It's a nice little enhancement.
I completely understand the desire to address the issue of content authors misusing or intentionally abusing HDR with some kind of auto-limiting algorithm similar to the way the radio 'loudness wars' were addressed. Unfortunately, I suspect it will be difficult, if not impossible, to achieve without also negatively impacting some content applying HDR correctly for artistically expressive purposes. Static photos may be solvable without excessive false positive over-correction but cinematic video is much more challenging due to the dynamic nature of the content.
As a cinephile, I'm starting to wonder if maybe HDR on mobile devices simply isn't a solvable problem in practice. While I think it's solvable technically and certainly addressable from a standards perspective, the reality of having so many stakeholders in the mobile ecosystem (hardware, OS, app, content distributors, original creators) with diverging priorities makes whatever we do from a base technology and standards perspective unlikely to work in practice for most users. Maybe I'm too pessimistic, but as a high-end home theater enthusiast I'm continually dismayed how hard it is to correctly display diverse HDR content from different distribution sources even in a less complex ecosystem where the stakeholders are more aligned and the leading standards bodies have been around for many decades (SMPTE et al).
Another related parallel trend recently is that bad AI images get very high view and like counts, so much so that I've lost a lot of motivation for doing real photography because the platforms cease to show them to anyone, even my own followers.
I set my screen brightness to a certain level for a reason. Please don’t just arbitrarily turn up the brightness!
There is no good way to disable HDR on photos for iPhone, either. Sure, you can turn off HDR on photos on your iPhone. But then, when you cast to a different display, the TV tries to display the photos in HDR, and it won't look half as good.
You might be on to something there. Technically, HDR is mostly about profile signaling and therefore about interop. To support it in mpeg dash or hls media you need to make sure certain codec attributes are mentioned in the xml or m3u8 but the actual media payload stays the same.
Any bit or bob being misconfigured or misinterpreted in the streaming pipeline will result in problems ranging from a slightly suboptimal experience to nothing working at all.
Besides HDR, "spatial audio" formats like Dolby Atmos are notorious for interop issues.
On both Android & iOS/MacOS it's not that HDR is ignoring your screen brightness, but rather the brightness slider is controlling the SDR range and then yes HDR can exceed that, that's the singular purpose of HDR to be honest. All the other purported benefits of HDR are at best just about HDR video profiles and at worst just nonsense bullshit. The only thing HDR actually does is allow for brighter colors vs. SDR. When used selectively this really enhances a scene. But restraint is hard, and most forms of HDR content production are shit. The HDR images that newer iPhones and Pixel phones are capturing are generally quite good because they are actually restrained, but then ironically both of them have horrible HDR video that's just obnoxiously bright.
In contrast, my TV will change brightness modes to display HDR content and disables some of the brightness adjustments when displaying HDR content. It can be very uncomfortably bright in a dark room while being excessively dim in a bright room. It requires adjusting settings to a middle ground resulting in a mixed/mediocre experience overall. My wife’s laptop is the worst of all our devices, while reviews seem to praise the display, it has an overreactive adaptive brightness that cannot be disabled (along with decent G2G response but awful B2W/W2B response that causes ghosting).
Some games also have a separate slider https://i.imgur.com/wenBfZY.png for adjusting "paper white", which is the HDR white one might normally associate with matching to SDR reference white (100 nits when in a dark room according to the SDR TV color standards, higher in other situations or standards). Extra note: the peak brightness slider in this game (Red Dead Redemption 2) is the same knob as the brightness slider in the above Battlefield V screenshot)
I think it's because no one wants it.
Let the whole experience be HDR and perhaps it won't be jarring.
This is also true for consumers. I don't own a single 4k or HDR display. I probably won't own an HDR display until my TV dies, and I probably won't own a 4k display until I replace my work screen, at which point I'll also replace one of my home screens so I can remote into it without scaling.
I think the industry is strangling itself putting "DisplayHDR 400" certification on edgelit/backlit LCD displays. In order for HDR to look "good" you either need high resolution full array local dimming backlighting (which still isn't perfect), or a panel type that doesn't use any kind of backlighting like OLED.
Viewing HDR content on these cheap LCDs often looks worse than SDR content. You still get the wider color gamut, but the contrast just isn't there. Local dimming often loses all detail in shadows whenever there is something bright on the screen.
I absolutely loathe consuming content on a mobile screen, but the reality is that the vast majority are using phones and tablets most of the time.
The problem starts with sending HDR content to SDR-only devices, or even just other HDR-standards. Not even talking about printing here.
This step can inherently only be automated so much, because it's also a stylistic decision about what information to keep or emphasize. This is an editorial process, not something you want to burden casual users with. What works for some images won't work for others. Even with AI, the preference would still need to be aligned.
[edit]
Some googling suggested I check in the Netflix app; at least Netflix thinks my phone does not support HDR. (Unihertz Jelly Max)
I also have a screen which has a huge gamut and blows out colors in a really nice way (a bit like the aftereffects of hallucinogens, it has colors other screens just don't) and you don't have to touch any settings.
My OLED TV has HDR and it actually seems like HDR content makes a difference while regular content is still "correct".
Don't feel like you have to. I bought a giant fancy TV with it, and even though it's impressive, it's kinda like ultra-hifi-audio. I don't miss it when I watch the same show on one of my older TVs.
If you ever do get it, I suggest doing for a TV that you watch with your full attention, and watching TV / movies in the dark. It's not very useful on a TV that you might turn on while doing housework; but very useful when you are actively watching TV with your full attention.
Like a lot of things, it’s weird how some people are more sensitive to visual changes. For example:
- At this point, I need 120hz displays. I can easily notice when my wife’s phone is in power saver mode at 60hz.
- 4k vs 1080p. This is certainly more subtle, but I definitely miss detail in lower res content.
- High bitrate. This is way more important than 4k vs 1080p or even HDR. But it’s so easy to tell when YouTube lowers the quality setting on me, or when a TV show is streaming at a crappy bitrate.
- HDR is tricky, because it relies completely on the content creator to do a good job producing HDR video. When done well, the image basically sparkles, water looks actually wet, parts of the image basically glow… it looks so good.
I 100% miss this HDR watching equivalent content on other displays. The problem is that a lot of content isn’t produced to take advantage of this very well. The HDR 4k Blu-ray of several Harry Potter movies, for example, has extremely muted colors and dark scenes… so how is the image going to pop? I’m glad we’re seeing more movies rely on bright colors and rich, contrasty color grading. There are so many old film restorations that look excellent in HDR because the original color grade had rich, detailed, contrasty colors.
On top of that, budget HDR implementations, ESPECIALLY in PC monitors, just don’t get very bright. Which means their HDR is basically useless. It’s impossible to replicate the “shiny, wet look” of really good HDR water if the screen can’t get bright enough to make it look shiny. Plus, it needs to be selective about what gets bright, and cheap TVs don’t have a lot of backlighting zones to make that happen very well.
So whereas I can plug in a 4k 120hz monitor and immediately see the benefit in everything I do for normal PC stuff, you can’t get that with HDR unless you have good source material and a decent display.
Also in my country (Italy) TV transmissions are 1080i at best, and a lot are still 576i (PAL resolution). Streaming media can be 4K (if you have enough bandwidth to stream it at that resolution, which I don't have at my house). Sure, if you download pirated movies you can find them in 4K, if you have the bandwidth for it.
But even there, a well-done 1080p movie is sometimes better than a hyper-compressed 4K one, since you can see the compression artifacts.
To me 1080p, and maybe even 720p, is enough for TV viewing. Well, sometimes I miss CRT TVs: they were low resolution but had, for example, much better picture quality than most modern 4K LCD TVs, where black scenes are gray (I know there is OLED, but it's too expensive and has other issues).
People in the HN echo chamber over-estimate hardware adoption rates. For example, there are millions of people who went straight from CDs to streaming, without hitting the iPod era.
A few years ago on HN, there was someone who couldn't wrap their brain around the notion that even though VCRs were invented in the early 1960's that in 1980, not everyone owned one, or if they did, they only had one for the whole family.
Normal people aren't magpies who trash their kit every time something shiny comes along.
Point of clarification: While the technology behind the VCR was invented in the '50s and matured in the '60s, consumer-grade video tape systems weren't really a thing until Betamax and VHS arrived in 1975 and 1976 respectively.
Early VCRs were also incredibly expensive, with prices ranging from $3,500 to almost $10,000 after adjusting for inflation. Just buying into the VHS ecosystem at the entry level was a similar investment to buying an Apple Vision Pro today.
Who?
There was about a decade there where everyone who had the slightest interest in music had an mp3 player of some kind, at least in the 15-30 age bracket.
This. I can always tell when someone "gets" software development when they either understand (or don't) that computers can't read minds or infer intent like a person can.
About HDR on phones, I think they are the blight of photography. No more shadows and highlights. I find they are good at capturing family moments, but not as a creative tool.
Slide film has probably a third the dynamic range of negative film and is meant as the final output, fit for display by projection.
HDR is what enables you to capture both the darkest shadow detail and the brightest highlight detail.
With SDR, one or both are often simply just lost. It might come down to preference — if you're an "auto" shooter and like the effect of the visual information at the edges of the available dynamic range being truncated, SDR is for you.
Some people prefer to capture that detail and have the ability to decide whether and how to diminish or remove it, with commensurately more control over the artistic impact. For those folks, HDR is highly desirable.
I still use it myself but I need to redo the build system and release it with an updated LibRaw... not looking forward to that.
The utility of HDR (as described in the article) is without question. It's amazing looking at an outdoors (or indoors with windows) scene with your Mk-1 eyeballs, then taking a photo and looking at it on a phone or PC screen. The pic fails to capture what your eyes see for lighting range.
HDR full screen content: Yes.
HDR general desktop usage: No. In fact you'll probably actively dislike it to the point of just turning it off entirely. The ecosystem just isn't ready for this yet, although with things like the "constrained-high" concepts ( https://www.w3.org/TR/css-color-hdr-1/#the-dynamic-range-lim... ) this might, and hopefully does, change and improve to a more pleasing result.
Also this is assuming an HDR monitor that's also a good match for your ambient environment. The big thing nobody really talks about with HDR is that it's really dominated by how dark you're able to get your surrounding environment such that you can push your display "brightness" (read: SDR whitepoint) lower and lower. OLED HDR monitors, for example, look fantastic in SDR and fantastic in HDR in a dark room, but if you have typical office lighting and so you want an SDR whitepoint of around 200-300 nits? Yeah, they basically don't do HDR at all anymore at that point.
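To put rough numbers on that tradeoff (my own illustration, not any standard formula): the HDR headroom a panel has left is the ratio of its peak luminance to the SDR white point you've chosen, expressed in stops.

```python
import math

def hdr_headroom_stops(panel_peak_nits: float, sdr_white_nits: float) -> float:
    """Stops of highlight headroom remaining above the chosen SDR white point."""
    return math.log2(panel_peak_nits / sdr_white_nits)

# A 1000-nit panel in a dark room (SDR white at 100 nits) keeps ~3.3 stops
# of headroom; the same panel at a 300-nit SDR white keeps only ~1.7 stops.
```

At an office-lighting white point of 200-300 nits, even a bright panel has very little room left for highlights, which is exactly the effect described above.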
The difference is absolutely stunning in some games.
In MS Flight Simulator 2024, going from SDR to HDR goes from looking like the computer game it is to looking life-like. Deeper shadows with brighter highlights makes the scene pop in ways that SDR just can't do.
EDIT: You'll almost certainly need an OLED monitor to really appreciate it, though. Local dimming isn't good enough.
I use a mini-LED monitor, and it's quite decent (except for starfields). It's very usable even in bright conditions, and HDR video still looks better in bright conditions than the equivalent SDR video.
Top tip: If you have HDR turned on for your display in Windows (at least; macOS not tested) and then share your screen in Teams, your display will look weirdly dimmed for everyone not using HDR on their display—which is everyone.
Like totally washed out
As others here have said, OLED monitors are generally excellent at reproducing an HDR signal, especially in a darker space. But they're terrible for productivity work because they'll suffer burn-in from images that don't change a lot. They're fantastic for movies and gaming, though.
There are a few good non-OLED HDR monitors, but not many. I have an AOC Q27G3XMN; it's a 27" 1440p 180 Hz monitor that is good for entry-level HDR, especially in brighter rooms. It has over 1000 nits of brightness and no major flaws. It only has 336 backlight zones, though, so you might notice some blooming around subtitles or other fine details where dark and light content sit close together. (VA panels are better than IPS at suppressing that, though.) It's also around half the price of a comparable OLED.
Most of the other non-OLED monitors with good HDR support have some other deal-breaking flaws or at least major annoyances, like latency, screwing up SDR content, buggy controls, etc. The Monitors Unboxed channel on YouTube and rtings.com are both good places to check.
My current monitor is an OLED and HDR in games looks absolutely amazing. My previous was an IPS that supported HDR, but turning it on caused the backlight to crank to the max, destroying black levels and basically defeating the entire purpose of HDR. Local dimming only goes so far.
I have a 2018 LG OLED that has some Minecraft hearts burned in because of that — not from Minecraft itself, but from just a few hours of Minecraft YouTube videos at those settings in the built-in YouTube client. Virtually no other detectable issues after years of heavy use with static content.
You only see them with fairly uniform colors as a background where color banding would usually be my bigger complaint.
So burn-ins definitely happen, but they are far from being a deal breaker over the obvious benefits you get vs other types of displays.
And driving everything possible in dark mode (white text on dark background) on those displays is even the logical thing to do. Then you don't need much max brightness anyway, and you even save some energy.
You can lie as much as you like as an advertiser. At the beginning, HD meant 1080p. Then 1080p was rebranded as Full HD, so that 720p could become "HD" and make HD digital television possible. (And SD digital television is worse than the original analog.)
You know the 0-10 brightness slider you have to pick at the start of a game? Imagine setting it to 0 and still being able to spot the faint dark spot. The dynamic range of things you can see is so much expanded.
Early HDR screens were very limited (few dimming zones, buggy implementations), but if you get one post-2024 (especially the OLED ones) they are quite decent. However, it needs to be supported at many layers: not just the monitor, but also the operating system and the content. There are not many games with proper HDR implementations, and even when there is one, it may be bad and look worse. The OS can hijack the rendering pipeline and provide an HDR map for you (Nvidia RTX HDR), which is a gamble: it may look bleh, but sometimes also better than the native HDR implementation the game has.
But when everything works properly, wow it looks amazing.
Note that HDR only actually changes how bright things can get; there's zero difference in the dark regions. This is made confusing because HDR video marketing often claims otherwise, but it isn't so: HDR monitors do not, in general, have any advantage over SDR monitors in terms of the darks. Local dimming zones improve dark contrast. OLED improves dark contrast. Dynamic contrast improves dark contrast. But HDR doesn't.
0-10 is so old century. Now it should be 0-2 (0 - default, 1 dark, 2 light,) /s
If you have say a 400 nits display the HDR may actually look worse than SDR. So it really depends on your screen.
Given that monitors report information about their HDR minimum and maximum panel brightness capabilities to the machine they are connected to, any competently-built HDR renderer (whether that be for games or movies or whatever) will be able to take that information and adjust the picture appropriately.
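As a sketch of what such a renderer might do with the reported capabilities (my own toy curve, not any particular standard's): keep luminance linear up to a knee, then roll off smoothly so nothing exceeds the display's reported peak.

```python
def rolloff_to_display(nits: float, display_peak: float, knee: float = 0.75) -> float:
    """Map scene luminance (nits) into [0, display_peak], linear below the knee."""
    k = knee * display_peak
    if nits <= k:
        return nits  # midtones pass through untouched
    overshoot = nits - k
    headroom = display_peak - k
    # Asymptotically approaches display_peak, so highlights compress
    # gracefully instead of clipping hard.
    return k + headroom * overshoot / (overshoot + headroom)
```

Real pipelines (HDR10+ dynamic metadata, Dolby Vision, HDMI SBTM) are far more sophisticated, but the core move is the same: use the display's reported peak to pick the curve.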
I'd also be interested in hearing whether it makes sense to look into OLED HDR 400 screens (Samsung, LG) or is it really necessary to get an Asus ProArt which can push the same 1000 nits average as the Apple XDR display (which, mind you, is IPS).
Also the maximum brightness isn't even that bright at 800 nits, so no HDR content really looks that different. I think newer OLEDs are brighter though. I'm still happy with the screen in general, even in SDR the OLED really shines. But it made me aware not all HDR screens are equal.
Also, in my very short experiment using HDR for daily work I ran into several problems, the most serious of which was the discovery that you can no longer just screenshot something and expect it to look the same on someone else's computer.
To be pedantic, this has always been the case... Who the hell knows what bonkers "color enhancement" your recipient has going on on their end?
But (more seriously) it's very, very stupid that most systems out there will ignore color profile data embedded in pictures (and many video players ignore the same in videos [0]). It's quite possible to tone-map HDR stuff so it looks reasonable on SDR displays, but color management is like accessibility in that nearly no one who's in charge of paying for software development appears to give any shits about it.
[0] A notable exception to this is MPV. I can't recommend this video player highly enough.
The high-end LCD monitors (with full-array local dimming) barely make any difference, while you'll get a lot of downsides from bad HDR software implementations that struggle to get the correct brightness/gamma and saturation.
IMHO HDR is only worth viewing on OLED screens, and requires a dimly lit environment. Otherwise either the hardware is not capable enough, or the content is mastered for wrong brightness levels, and the software trying to fix that makes it look even worse.
And that's now, while all the LEDs are still fresh. I can't imagine how bad it will be in a few years.
Also, a lot of software doesn't expect the subpixel arrangement, so text will often look terrible.
For desktop work, don't bother unless your work involves HDR content.
On my MacBook Pro it only activates when it needs to, but honestly I've only seen one video [1] that impressed me with it; the rest was completely meh. Not sure if it's because it's mostly iPhone photography you see in HDR, which is overall pretty meh-looking anyway.
[1] https://www.youtube.com/watch?v=UwCFY6pmaYY I understand this isn't a true HDR process but someone messing with it in post, but it's the only video I've seen that noticeably shows you colors you can't see on a screen otherwise.
You have to spend really good money to get a display which does HDR properly.
Second, the HDR effect seems to be implemented in a very crude way, which causes the whole Android UI (including the Android status bar at the top) to become brighter when HDR content is on screen. That's clearly not right. Though, of course, this might also be some issue of Android rather than Chrome, or perhaps of the Qualcomm graphics driver for my Adreno GPU, etc.
If I enable HDR the Firefox ones become a gray mess vs the lights feeling like actual lights in Safari.
edit: Ah, nevermind. It seems Firefox is doing some sort of post-processing (maybe bad tonemapping?) on-the-fly as the pictures start out similar but degrade to washed out after some time. In particular, the "OVERTHROW BOXING CLUB" photo makes this quite apparent.
That's a damn shame Firefox. C'mon, HDR support feels like table stakes at this point.
edit2: Apparently it's not table stakes.
> Browser support is halfway there. Google beat Apple to the punch with their own version of Adaptive HDR they call Ultra HDR, which Chrome 14 now supports. Safari has added HDR support into its developer preview, then it disabled it, due to bugs within iOS.
at which point I would just say to `lux.camera` authors - why not put a big fat warning at the top for users with a Firefox or Safari (stable) browser? With all the emphasis on supposedly simplifying a difficult standard, the article has fallen for one of its most famous pitfalls.
"It's not you. HDR confuses tons of people."
Yep, and you've made it even worse for a huge chunk of people. :shrug: Great article n' all just saying.
"we finally explain what HDR actually means"
Then spends 2/3rds of the article on a tone mapping expedition, only to not address the elephant in the room, that is the almost complete absence of predictable color management in consumer-grade digital environments.
UIs are hardly ever tested in HDR: I don't want my subtitles to burn out my eyes in actual HDR display.
This is where you, the consumer, are as vulnerable to bright content in a properly darkened movie-watching environment as you are when raising the window curtains on a bright summer morning. (That brightness abuse by content is actually discussed here.)
Dolby Vision and Apple have the lead here as closed platforms; on the web it's simply not predictably possible yet.
Best hope is the efforts of the Color on the Web Community Group from my impression.
No. Because it's written for the many casual photographers we've spoken with who are confused and asked for an explainer.
> Then spends 2/3rds of the article on a tone mapping expedition, only to not address the elephant in the room, that is the almost complete absence of predictable color management in consumer-grade digital environments.
That's because this post is about HDR and not color management, which is different topic.
To be fair, it would be pretty weird if you found your own post off-putting :P
It's about HDR from the perspective of still photography, in your app, on iOS, in the context of hand-held mobile devices. The post's title ("What Is HDR, Anyway?"), content level and focus would be appropriate in the context of your company's social media feeds for users of your app - which is probably the audience and context it was written for. However in the much broader context of HN, a highly technical community whose interests in imaging are diverse, the article's content level and narrow focus aren't consistent with the headline title. It seems written at a level appropriate for novice users.
If this post was titled "How does Halide handle HDR, anyway?" or even "How should iOS photo apps handle HDR, anyway?" I'd have no objection about the title's promise not matching the content for the HN audience. When I saw the post's headline I thought "Cool! We really need a good technical deep dive into the mess that is HDR - including tech, specs, standards, formats, content acquisition, distribution and display across content types including stills, video clips and cinematic story-telling and diverse viewing contexts from phones to TVs to cinemas to VR." When I started reading and the article only used photos to illustrate concepts best conveyed with color gradient graphs PLUS photos, I started to feel duped by the title.
(Note: I don't use iOS or your app but the photo comparison of the elderly man near the end of the article confused me. From my perspective (video, cinematography and color grading), the "before" photo looks like a raw capture with flat LUT (or no LUT) applied. Yet the text seemed to imply Halide's feature was 'fixing' some problem with the image. Perhaps I'm misunderstanding since I don't know the tool(s) or workflow but I don't see anything wrong with the original image. It's what you want in a flat capture for later grading.)
On the HN frontpage, people are likely thinking of one of at least three things:
HDR as display tech (hardware)
HDR as wide gamut data format (content)
HDR as tone mapping (processing)
...
So when the first paragraph says we finally explain what HDR actually means, it set me off on the wrong foot; it comes across pretty strongly for a term that's notoriously context-dependent, especially in a blog post that reads like a general explainer rather than a direct Q&A response when it isn't coming through your app's channels.
The follow-up, "The first HDR is the 'HDR mode' introduced to the iPhone camera in 2010," is what caused me to write the comment.
For people over 35 with even the faintest interest in photography, the first exposure to the HDR acronym probably didn't arrive with the iPhone in 2010; HDR was equivalent to Photomatix-style tone mapping starting in 2005, as even mentioned later. The ambiguity of the term is a given now. I think it's futile to insist on or police one meaning over the other in non-scientific, informal communication; just use more specific terminology.
So the correlation between what HDR means, or what sentiment it evokes, and people's age group and self-assessed photography skill might be something worthwhile to explore.
The post gets a lot better after that. That said, I really did enjoy the depth: the dive into the classic dodge and burn, and the linked YouTube piece. One explainer at a time makes sense, and tone mapping is a good place to start. Even tone mapping is fine in moderation :)
Bad HDR boils down to poor taste and the failure of platforms to rein it in. You can't fix bad HDR by switching encodings any more than you can fix global warming by switching from Fahrenheit to Celsius.
I predict HDR content on the web will eventually be disabled or mitigated on popular browsers similarly to how auto-playing audio content is no longer allowed [1]
Spammers and advertisers haven't caught on yet to how abusively attention grabbing eye-searingly bright HDR content can be, but any day now they will and it'll be everywhere.
1. https://hacks.mozilla.org/2019/02/firefox-66-to-block-automa...
High dynamic range has always been about tone mapping. Post-sRGB color profile support is called "wide color" these days, has been available for twenty years or more on all DSLR cameras (such as ProPhoto RGB, supported in-camera on my old Nikon D70), and has nothing to do with the dynamic range and tone mapping of the photo. It's convenient that we don't have to use EXR files anymore, though!
An HDR photo in sRGB will have the same defects beyond peak saturation at any given hue point, as an SDR photo in sRGB would, relative to either in DCI-P3 or ProPhoto. Even a two-bit black-or-white “what’s color? on or off pixels only” HyperCard dithered image file can still be HDR or SDR. In OKLCH, the selected luminosity will also impact the available chroma range; at some point you start spending your new post-sRGB peak chroma on luminosity instead; but the exact characteristic of that tradeoff at any given hue point is defined by the color profile algorithm, not by whether the photo is SDR or HDR, and the highest peak saturation possible for each hue is fixed, whatever luminosity it happens to be at.
Color management and handling HDR in UIs is probably a bit out of scope.
The photo capture HDR is good. That's a totally different thing and shouldn't have had its name stolen.
My hypotheses are the following:
- Increase display lighting to increase peak white point + use a black ink able to absorb more light (can Vantablack-style pigments be made into ink?) => increase dynamic range of a printable picture
- Alternatively, have the display lighting include visible light + invisible UV light, and have the printed picture include an invisible layer of UV ink that shines white : the pattern printed in invisible UV-ink would be the "gain map" to increase the peak brightness past incident visible light into HDR range.
What do you folks think?
Alternatively, use transparent film and a bright backlight.
Haven't you ever been to a photo exhibition ?
Around this, a bunch of practical tooling surfaced (e.g., hybrid log approaches to luminance mapping) to extend the thinking from 8-bit gamma-mapped content presenting ~8 stops of dynamic range to where we are now. If we get away from just trying to label everything "HDR", there are some useful things people should familiarize themselves with:
1. Color primaries: examples - SDR: Rec. 601, Rec. 709, sRGB. HDR: Rec. 2020, DCI-P3. The new color primaries expand the chromatic representation capabilities. This is pretty easy to wrap our heads around: https://en.wikipedia.org/wiki/Rec._2020
2. Transfer functions: examples - SDR: sRGB, BT.1886. HDR: Rec. 2100 Perceptual Quantizer (PQ), HLG. The big thing in this space to care about is that SDR transfer functions had reference peak luminance but were otherwise relative to that peak luminance. By contrast, Rec. 2100 PQ code points are absolute, in that each code value has a defined meaning in measurable luminance, per the PQ EOTF transfer function. This is a big departure from our older SDR universe and from Hybrid Log Gamma approaches.
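The absoluteness of PQ is easy to see in code. Here's a minimal sketch of the Rec. 2100 PQ EOTF, with the constants as defined in SMPTE ST 2084:

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_eotf(code: float) -> float:
    """Map a normalized PQ code value in [0, 1] to absolute luminance in nits."""
    e = code ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

# Code 1.0 is defined as exactly 10,000 nits on every display: the code
# value means an absolute luminance, unlike a relative SDR gamma curve.
```

A display that can't physically reach the encoded luminance then has to tone-map it down, which is where item 3 comes in.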
3. Tone mapping: In SDR, we had the comfort of camera and display technologies roughly lining up in the video space, so living in a gamma/inverse-gamma universe was fine. We just controlled the eccentricity of the curve. Now, with HDR, we have formats that can carry tone-mapping information and transports (e.g., HDMI) that can bidirectionally signal display target capabilities, allowing things like source-based tone mapping. Go digging into HDR10+, Dolby Vision, or HDMI SBTM for a deep rabbit hole. https://en.wikipedia.org/wiki/Tone_mapping
So HDR is everything (and nothing), but it's definitely important. If I had to emphasize one thing that is non-obvious to most new entrants into the space, it's that there are elements of description of color and luminance that are absolute in their meaning, rather than relative. That's a substantial shift. Extra points for figuring out that practical adaptation to display targets is built into formats and protocols.
https://www.dpreview.com/news/7452255382/sigma-brings-hdr-br...
I wonder if there’s an issue in Windows tonemapping or HDR->SDR pipeline, because perceptually the HDR image is really off.
It’s more off than if I took an SDR picture of my iPhone showing the HDR image and showed that SDR picture on the said Windows machine with an IPS panel. Which tells me that the manual HDR->SDR “pipeline” I just described is better.
I think Windows showing HDR content on a non-HDR display should just pick an SDR-sized section of that long dynamic range and show it normally. Without trying to remap the entire large range to a smaller one. Or it should do some other perceptual improvements.
Then again, I know professionally that Windows HDR is complicated and hard to tame. So I’m not really sure the context of remapping as they do, maybe it’s the only way in some contingency/rare scenario.
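The "pick an SDR-sized section" idea could be sketched as a simple clip at a chosen reference white (203 nits is the BT.2408 graphics-white convention; which window to pick is the judgment call):

```python
def window_to_sdr(nits: float, sdr_white: float = 203.0) -> float:
    """Clip absolute HDR luminance to an SDR-sized window and normalize to
    [0, 1], instead of compressing the whole HDR range into SDR."""
    return min(nits, sdr_white) / sdr_white
```

Highlights above the window blow out, but midtones keep their intended brightness, which is often less objectionable than a globally flattened, washed-out image.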
<video src="https://www.lux.camera/content/media/2025/03/new-york-skyline-hdr.mp4" poster="https://img.spacergif.org/v1/4032x3024/0a/spacer.png" width="4032" height="3024" loop="" autoplay="" muted="" playsinline="" preload="metadata" style="background: transparent url('https://www.lux.camera/content/media/2025/03/new-york-skyline-hdr_thumb.jpg') 50% 50% / cover no-repeat;"></video>Look for the word video.
All this aside, HDR and high brightness are different things - HDR is just a representational thing. You can go full send on your SDR monitor as well, you'll just see more banding. The majority of the article is just content marketing about how they perform automatic tonemapping anyways.
That’s a consequence of https://en.wikipedia.org/wiki/Adaptation_(eye). If you look at 1000 nits on a display in bright sunlight, with your eyes adapted to the bright surroundings, the display would look rather dim.
Hopefully HN allows me to share an App Store link... this app works best on Pro iPhones, which support ProRAW, although I do some clever stuff on non-Pro iPhones to get a more natural look.
Not having before-and-after comparisons is mostly down to my being concerned about whether that would pass App Review; the guidelines indicate that the App Store images are supposed to be screenshots of the app, and I'm already pushing that rule with the example images for filters. I'm not sure a hubristic "here's how much better my photos are than Apple's" image would go over well. Maybe in my next update? I should at least have some comparisons on my website, but I've been bad at keeping that updated.
There's no Live Photo support, though I've been thinking about it. The reason is that my current iPhone 14 Pro Max does not support Live Photos while shooting in 48-megapixel mode; the capture process takes too long. I'd have to come up with a compromise such as only having video up to the moment of capture. That doesn't prevent me from implementing it for other iPhones/cameras/resolutions, but I don't like having features unevenly available.
I really appreciate the article. I could feel that they also have a product to present, because of the many references, but it was also very informative besides that.
Creative power is still in your hands versus some tone mapper's guesses at your intent.
Can people go overboard? Sure, but that's something they will do regardless of any HDR or lack thereof.
On an aside, it's still rough that just about every site that touches gain-map HDR images (adaptive HDR, as this blog calls them) will lose that metadata if it needs to scale, recompress, or otherwise transform the images. It's led me to just make my own site, but also to handle what files a client gets a bit smarter. For instance, if a browser doesn't support .jxl or .avif images, I'm sure it won't want an HDR JPEG either; that's easy to handle on a webserver.
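That server-side heuristic might look something like this (a sketch of the idea, not any particular server's API; the MIME strings are the registered ones for AVIF and JPEG XL):

```python
def client_likely_handles_hdr(accept_header: str) -> bool:
    """Heuristic: a browser advertising AVIF or JXL support in its Accept
    header is modern enough to likely render gain-map HDR JPEGs too."""
    accepted = {part.split(";")[0].strip() for part in accept_header.split(",")}
    return bool({"image/avif", "image/jxl"} & accepted)

# A recent Chrome-style Accept header matches; a bare "*/*" does not.
```

It's only a proxy (Accept headers say nothing about HDR directly), but it's cheap to evaluate per request at the webserver.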
https://docs.krita.org/en/general_concepts/colors/bit_depth....
https://docs.krita.org/en/general_concepts/colors/color_spac...
https://docs.krita.org/en/general_concepts/colors/scene_line...
A lot of these design flaws are fixed by Firefox's picture-in-picture option, but for some reason, with the way you coded it, the prompt to pop it out as PiP doesn't show up.
Sidebar: I kinda miss when Halide's driving purpose was rapid launch and simplicity. I would almost prefer a zoom function to all of this HDR gymnastics (though, to be clear, Halide is my most-used and most-liked camera app).
EDIT: Ah, I see, it's a Mark III feature. That is not REMOTELY clear in the (very long) post.
My understanding is most SDR TVs and computer screens have a peak brightness of about 200-300 nits (aka cd/m²). Is that the correct measure of the range of the display? Is the brightest white 300 nits brighter than the darkest black?
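For what it's worth, a display's dynamic range is conventionally quoted as a ratio (brightest white divided by darkest black), often expressed in stops, rather than as a difference in nits. A quick illustration:

```python
import math

def dynamic_range_stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range as photographic stops (doublings between black and peak)."""
    return math.log2(peak_nits / black_nits)

# A 300-nit panel with 0.3-nit blacks is a 1000:1 contrast ratio, ~10 stops.
```

This is why black level matters as much as peak brightness: halving the black floor buys you a full extra stop, the same as doubling the peak.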
The hardest part of it, by far, was taking hundreds upon hundreds of pictures of a blank piece of paper in different lighting conditions with different settings.
https://blog.adobe.com/en/publish/2023/10/10/hdr-explained
Greg Benz Photography maintains a list of software here:
https://gregbenzphotography.com/hdr-display-photo-software/
I'm not sure what FOSS options there are; it's difficult to search for given that "HDR" can mean three or four different things in common usage.
Glad all this "Instagram influences searing eyeballs with bright whites" is news to me. All I know about is QR code mode doing that.
Literal snort.
I love when product announcements and ADS in general are high value works. This one was good education for me. Thank you for it!
I had also written about my plasma and CRT displays and how misunderstandings about HDR made things generally worse and how I probably have not seen the best these 10 bit capable displays can do.
And finally, I had written about 3D TV and how fast, at least 60Hz per eye, 3D in my home made for amazing modeling and assembly experiences! I was very sad to see that tech dead end.
3D for technical content creation has a lot of legs... if only more people could see it running great...
Thanks again. I appreciate the education.
It’s infuriating.
e.g. Open this in macOS Chrome: https://www.youtube.com/watch?v=Gq7H6PI4JF8