Some history for people who are not aware: Theora became somewhat popular in free software / open source circles because, at the time, it was the best codec that was believed to be either free of patents or covered by patents explicitly opened up for free use. So if you were concerned about patents and their impact on free software, you'd use it. But Theora was never a great codec, which we always knew; it was just the best we had before Google bought and opened up VP8.
It's an interesting tradeoff. Theora was never particularly popular, so probably only a small number of sites will be impacted. But we kinda have a tradition that the web platform rarely breaks things. You can mostly still use HTML from 20-30 years ago, GIF will probably stay supported in browsers forever, and I don't think there are many examples of media formats being deprecated in browsers. Even odd things like BMP are still supported.
Dropping e.g. HTML/CSS features is much harder, since there likely won't be any workaround other than running an older version of the browser.
Obviously only if I'd be certain some other browser could still open it, or maybe some emulator would be able to display it. My earliest websites are 27 years old now (and live on a floppy disk); I'd hate to find that they're no longer readable in any software.
But I'm perfectly fine with my Firefox, with which I spend hours a day on the modern web, dropping support for that, when they need to shed some cruft or weight.
It's a shame Theora never "made it". The peak of its popularity is long past.
So was VP8, which Google ended up pushing at nearly the same time they were rejecting Theora as an inferior H.264 clone. The irony being that VP8 wasn't that different either.[1]
[1] https://web.archive.org/web/20150301015756/http://x264dev.mu...
I tried to use VP9 in the past, but the library is something like 20-40 MB as a DLL. The smallest I could find was dav1d, and it's still around 4 MB for the DLL; encoding AV1 and getting a good compression rate was not trivial either.
I was wondering about this too. https://www.osnews.com/story/24954/us-patent-expiration-for-... seems to think 12/2027.
Would it be surprising if H.264 was replaced by something else by that point? We have multiple subsequent standards, and it seems like everyone producing or providing content would want improved codecs by then.
I considered libtheora; the library size is good, but the compression/visual quality is awful compared to the alternatives.
This feels like a gross gross misoptimization that is actively harmful to 99.999999999% of user experiences.
...and in fact it doesn't need to be; I can say so from having written an MPEG-2 (and MPEG-1) decoder myself, whose binary turned out to be less than 16 KB.
When one hears about a codec being dozens of MB, the natural instinct should be "for what?" and not "who cares?" The latter attitude is responsible for why software has gotten so much more inefficient, and serves only to line the pockets of hardware manufacturers.
Small inefficiencies add up and in the end you have the janky laggy mess that is modern software.
I can just play those as they are in any browser except Safari. Painfully, macOS actually supports Vorbis, but only in a CAF (Core Audio Format) container instead of an Ogg container. Still hoping, because shipping an entire Ogg decoder in the browser via WASM works, but it's ugly.
Of course, I could also just re-encode them with a microservice, but it's just… bleh.
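For what it's worth, whichever fallback you end up shipping, the choice between sources can at least be feature-detected with the standard HTMLMediaElement.canPlayType() probe rather than hard-coded per browser. A minimal sketch; the pickPlayableSource helper and the file names are hypothetical, not from any comment here:

```typescript
// canPlayType() returns "probably", "maybe", or "" per the HTML spec.
type CanPlayType = (mime: string) => string;

interface Source {
  url: string;
  mime: string;
}

// Hypothetical helper: return the first source the probe reports as
// playable (any non-empty answer), or null if none match.
function pickPlayableSource(
  canPlayType: CanPlayType,
  sources: Source[],
): Source | null {
  for (const s of sources) {
    if (canPlayType(s.mime) !== "") return s;
  }
  return null;
}

// In a browser you would pass the real probe, e.g.:
//   const audio = new Audio();
//   const src = pickPlayableSource((t) => audio.canPlayType(t), [
//     { url: "clip.ogg", mime: 'audio/ogg; codecs="vorbis"' },
//     { url: "clip.caf", mime: "audio/x-caf" },   // Safari-side fallback
//     { url: "clip.m4a", mime: "audio/mp4" },     // re-encoded last resort
//   ]);
//   if (src) audio.src = src.url;
```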
From the FF announcement linked elsewhere in these comments.
VP9 on Safari? Sure, on desktop. On mobile? Oh yeah, only via WebRTC (why???).
Want to import FLACs into Apple Music? Nope, only inferior ALAC is supported for lossless.
AV1? Only just added on the iPhone 15 Pro series (not the regular 15, because of the older SoC), and still not on MacBooks.
HEVC? Oh yeah, of course we use it as HEIC for photos and support HEVC playback in Safari, how could we not?
It was a "nifty, but who really needs it?" idea.
But even then, in a modular environment the potential pool of engineering resources is much broader.
My laptop doesn't have a 1080p screen, but trying 1080p anyway, it could handle some videos better than others.
In this one I only dropped 2 frames:
https://www.youtube.com/watch?v=m1jY2VLCRmY&list=PLAMlLc3Zgg...
I'm sorry for being ambiguous. I meant encoding, not decoding. In my opinion, an average PC should be able to encode at least a few minutes of video in a reasonable time.
Anyway, having to resort to frame dropping (even to a negligible degree) means 100% CPU or I/O load has already been reached, doesn't it? 100% load on 720p playback sounds bizarre. I've been accustomed to video playback taking just a few percent on any old computer with Intel graphics.
I mean ANY phone. My phone is a $160 model from 2019 and it can play 1440p AV1 on YouTube (with a few frame drops, but nothing noticeable), even though it has a 720p screen. I'd have to go back to my OnePlus 3 (2016) to get stuttering at 1440p.
Note that the OnePlus 3 would stutter on 1440p H.264 as well; it's just old.
"Chrome will deprecate and remove support for the Theora video codec in desktop Chrome due to emerging security risks. Theora's low (and now often incorrect) usage no longer justifies support for most users. "
(Not that it would make any sense to implement ffmpeg on top of WebCodecs on top of ffmpeg. Just needed an example.)
I hope I have not completely missed the train by focusing on other areas outside of web.
(it's actually usable to some extent)