Now Samsung has released Super HDR, without any information about the standard or whether it relates to Google's Ultra HDR. Sigh.
Edit: I forgot about WebP/WebP2, which was also developed by Google as a JPEG replacement.
Apple's HEIC is very annoying since it's not really supported by anything non-Apple. Would certainly be nice to see that go away.
JPEG XL is officially blessed by JPEG but otherwise irrelevant here, even though non-progressive JPEG 1 files can by design be losslessly recompressed into JPEG XL. The main role of JPEG here was to specify explicit goals for JPEG XL proposals [1]. Interestingly enough, the JPEG 1 recompression was not part of that call for proposals back then. JPEG XL is otherwise a completely different format with much better compression algorithms, so it should be the image format for pretty much all uses once popularized.
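If you want to see the lossless recompression for yourself, here's a minimal round-trip sketch, assuming libjxl's cjxl/djxl command-line tools are installed (cjxl transcodes JPEG input losslessly by default instead of re-encoding the pixels; the file names are placeholders):

    import subprocess

    # Transcode the JPEG's coefficients into JPEG XL (no pixel re-encode),
    # then reconstruct the original file from the .jxl.
    subprocess.run(["cjxl", "photo.jpg", "photo.jxl"], check=True)
    subprocess.run(["djxl", "photo.jxl", "roundtrip.jpg"], check=True)

    # For files cjxl can transcode losslessly, the round trip is byte-identical.
    with open("photo.jpg", "rb") as a, open("roundtrip.jpg", "rb") as b:
        assert a.read() == b.read()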
Ultra HDR [2] is an extension to the JPEG 1 format that depends on XMP and the CIPA Multi-Picture Format (a rough detection sketch follows the links below). It's not the first extension of this kind; even the JPEG group itself had a similar one, JPEG XT, which never caught on! (If you don't know much about JPEG: APNG was a similarly designed extension to PNG that eventually became part of PNG proper.) Naturally, Ultra HDR files cannot be smaller than ordinary JPEG 1 files, so they can't fulfill Samsung's needs.
[1] https://jpeg.org/downloads/jpegxl/jpegxl-cfp.pdf
[2] https://developer.android.com/media/platform/hdr-image-forma...
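As for the detection sketch mentioned above: because an Ultra HDR file is just a JPEG 1 with extra metadata, you can spot one by walking the marker segments and looking for the MPF directory plus the gain-map XMP. This is a heuristic, not a validator; the "MPF\0" APP2 signature comes from the CIPA spec, and I'm assuming the XMP payload contains the hdr-gain-map namespace string from the Android docs:

    import struct

    def iter_jpeg_segments(data):
        """Yield (marker, payload) for each segment before the scan data."""
        if data[:2] != b"\xff\xd8":                     # SOI
            return
        pos = 2
        while pos + 4 <= len(data) and data[pos] == 0xFF:
            marker = data[pos + 1]
            if marker == 0xDA:                          # SOS: entropy-coded data follows
                return
            length = struct.unpack(">H", data[pos + 2:pos + 4])[0]
            yield marker, data[pos + 4:pos + 2 + length]
            pos += 2 + length

    def looks_like_ultra_hdr(path):
        with open(path, "rb") as f:
            data = f.read()
        has_mpf = has_gain_map = False
        for marker, payload in iter_jpeg_segments(data):
            if marker == 0xE2 and payload.startswith(b"MPF\x00"):
                has_mpf = True                          # CIPA Multi-Picture Format directory
            if marker == 0xE1 and b"hdr-gain-map" in payload:
                has_gain_map = True                     # XMP gain-map namespace
        return has_mpf and has_gain_map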
"Apple's"?
HEIC is an HEVC-encoded image in a HEIF container; it's defined in ISO/IEC 23008-12 and was created by the Moving Picture Experts Group (MPEG).
Furthermore, it's supported by Windows 10/11, Android 10, and Ubuntu 20.04.
Or did you mean, it's not really supported by browsers other than Safari?
Whose fault is that?
> Would certainly be nice to see that go away.
Why, exactly?
Did you also wish for H.264 playback support to "go away" when Safari supported it, but neither Chrome nor Firefox did?
It's really weird that Google deprecated the format despite contributing engineers to help build JPEG XL. I guess it's office politics.
https://www.fsf.org/blogs/community/googles-decision-to-depr...
Note how no one is asking Mozilla why Firefox won't support it or actually building websites using it.
> Note how no one is asking Mozilla why Firefox won't support it or actually building websites using it.
People do ask why Firefox isn't supporting it, but the answer is obvious: because Chrome dropped it, and Firefox has what, 5% market share?
And people do use it on websites, you just didn't notice because companies like Nike don't tend to write up blog articles about how they're using a cool new image format.
Also they have a separate JPEG XL article: https://r2.community.samsung.com/t5/CamCyclopedia/JPEG-XL-Im...
Apple, for example, "supports" the JPEG XL format, but decodes it to sRGB SDR irrespective of the source image's gamut.
As of today, Adobe Lightroom running on an Apple iDevice can edit a RAW camera image in HDR, can export the result in three formats… none of which can be viewed as HDR on the same device.
Windows 11 with all the latest updates can open basically nothing, and half the time it will show garbage when it can open the new formats.
Linux is still stuck in the teletype era and will catch up to $399 Aldi televisions from China any decade now.
I should create an “Are we HDR yet?” page and track this stuff.
These are trillion dollar companies acting like children fighting over a toy.
“No! Use my format! I don’t want to play with your format! It’s yucky!”
I also verified this by transferring the images to my NAS and then pulling them up on my Pixel 5 at the time, and also on the Pixel Fold I use now. On both you could tell immediately when the HDR transform kicks in: there's a split second as the image shows up on screen where you can see it tone-mapping the image or engaging the HDR display mode or something.
And I know we aren't talking about video, but way back when Doom Eternal came out I recorded a full playthrough of it using a capture card that lets me capture in H.265 with proper HDR metadata. It was a messy setup: my only HDR monitor is my TV, and my computer's in my living room, so I had to string an HDMI cable from my TV to the capture box input, then another HDMI cable to my computer monitor, along with the USB-C cable to my computer. So I beat the entire game in the AVerMedia preview window, then edited each level in DaVinci Resolve and exported it with all the correct settings, after reading the roughly 4,000-page manual just to make sure I was doing it right. The entire time I was editing, I wasn't sure it would come out right, because my computer monitor is a 6-bit panel that dithers up to 8-bit and isn't an HDR monitor at all.

But in the end, my M1 Mac mini could play it on YouTube in HDR in 4K. My TCL 4K HDR TV could play it using the built-in YouTube app. Basically anything I had that could attach to a screen and enable HDR mode played that video correctly, including the Pixel 5. And I moved things around in my living room just to check my Windows 10/11 machine (I say that because I was on Insider Preview around the transition, so it was kind of a hybrid of both), which was also able to play the video correctly, natively and on YouTube.
I think things are more compatible than the compatibility listings suggest. If you run a modern operating system on a computer whose parts are even 7 years old, and, this is the kicker, have a screen with a true 10-bit or 12-bit panel rated at 1000 nits, then you have something that can legitimately display the minimum standard for most HDR technical specifications.
If you're trying to look at HDR content and it doesn't seem to work correctly, you might not actually have a monitor that meets the video or film industry's minimum standard. It's okay to have a cheap TV like mine, an 8-bit panel that uses advanced dithering to approximate 10-bit (for whatever reason mine could even go up to 12-bit in Windows). But you need that 10-bit minimum, Rec. 2020 with the ST 2084 (PQ) transfer function, DCI-P3 D65 coverage, and 1000 nits of peak brightness.
Some gamer monitor calling itself an HDR display at something like 600 nits isn't meeting a standard; it's a marketing term that company made up so it could say "HDR," because all laypeople think HDR means is brighter. What's the point of having 1,024 levels of brightness per color if your screen's brightness can't show that full dynamic range?
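For a sense of the numbers: with the ST 2084 (PQ) transfer function, those 1,024 code values span an absolute range of 0 to 10,000 nits, and 1,000 nits only lands around code 770. A quick sketch of the PQ EOTF, with constants taken straight from the spec:

    # SMPTE ST 2084 (PQ) EOTF: map a 10-bit code value to absolute luminance.
    m1 = 2610 / 16384           # 0.1593017578125
    m2 = 2523 / 4096 * 128      # 78.84375
    c1 = 3424 / 4096            # 0.8359375
    c2 = 2413 / 4096 * 32      # 18.8515625
    c3 = 2392 / 4096 * 32      # 18.6875

    def pq_code_to_nits(code, bits=10):
        e = code / (2 ** bits - 1)                      # normalized signal, 0..1
        p = e ** (1 / m2)
        return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

    for code in (0, 512, 769, 1023):
        print(code, round(pq_code_to_nits(code), 1))    # 0, ~92.7, ~998.5, 10000

A 600-nit panel tops out around code 712, so roughly the top 30% of the 10-bit code range is physically out of reach, which is the whole point of the 1,000-nit minimum.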
Now send it to anyone, in any way. iMessage? Doesn’t work. Received on an iPhone? Won’t work. Android? Definitely not. Etc…
There is no one file format that works across ecosystems. Apple is even internally fragmented, with some formats working on macOS that don't on iOS.
> Of course it looked different
That is broken!!
This is precisely what I mean: HDR is often incorrectly decoded as SDR in Apple operating systems. This is worse than just sending an SDR JPG because the HDR-to-SDR conversion is unpredictable, and making the HDR image was a waste of time and bits.
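To illustrate the unpredictability: two decoders that pick different tone-mapping operators will produce visibly different SDR images from the same HDR data. A toy comparison follows; neither operator is what any Apple OS actually does, they're just two plausible choices:

    # Two plausible HDR->SDR operators applied to the same linear values,
    # with 1.0 == SDR reference white (~100 nits).
    def reinhard(x):
        return x / (1 + x)          # compresses highlights smoothly

    def hard_clip(x):
        return min(x, 1.0)          # discards everything above reference white

    for nits in (100, 400, 1000):
        x = nits / 100
        print(nits, round(reinhard(x), 2), round(hard_clip(x), 2))
    # 100 -> 0.5 vs 1.0; 1000 -> 0.91 vs 1.0: same input, very different SDR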
Note that I didn’t say HDR video! I meant specifically HDR still images, the type that JPEG XL can encode.
Right now, in 2024, if I want to send someone HDR anything, the only robust method is to make it into a video and send them a YouTube link to it.
That’s a sad state of affairs.
> Additionally, storage capacity has been reduced while maintaining image quality by providing JPEG XL format.