Thinking back on it now, I just did a little trial and error until I found something that worked. But what would I search for if I were trying to find data on how "streamable" an encoding is?
If curious, I got my proof of concept working, but it was unpleasantly slow. I blindly chunked the incoming stream into megabyte-sized chunks, registered the chunks on IPFS, then used IPFS pubsub to announce each chunk to any watchers. The watcher would monitor the pubsub channel for announcements, download the chunks, and try to reassemble them in order and play them. One neat side effect I found: when the stream was done, if I had stored all the IPFS addresses, I could then generate a whole IPFS file structure you could use to download the stream at a later date.
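The core of that scheme can be sketched without the IPFS plumbing: split the stream into fixed-size numbered chunks, and on the watcher side buffer anything that arrives out of order (pubsub gives no ordering guarantee) until the next chunk in sequence shows up. This is a minimal sketch, assuming hypothetical names; the actual `ipfs add` / `ipfs pubsub pub` calls are only hinted at in comments:

```python
import io

CHUNK_SIZE = 1024 * 1024  # 1 MiB, matching the megabyte-sized chunks above

def chunk_stream(stream, chunk_size=CHUNK_SIZE):
    """Yield (sequence_number, chunk_bytes) pairs from a byte stream.
    In the real setup, each chunk would be added to IPFS ('ipfs add')
    and its CID announced on a pubsub topic ('ipfs pubsub pub')."""
    seq = 0
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        yield seq, chunk
        seq += 1

class Reassembler:
    """Watcher-side buffer: holds chunks that arrive out of order and
    releases them strictly in sequence for playback."""
    def __init__(self):
        self.pending = {}
        self.next_seq = 0

    def add(self, seq, chunk):
        """Accept one announced chunk; return the chunks now playable in order."""
        self.pending[seq] = chunk
        ready = []
        while self.next_seq in self.pending:
            ready.append(self.pending.pop(self.next_seq))
            self.next_seq += 1
        return ready
```

Keeping the list of (sequence, CID) pairs around is also exactly what makes the neat side effect possible: at the end of the stream, that list is the manifest you'd use to build the IPFS file structure for later download.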
Having recently written my own fragmented-MP4 remuxing library, I felt this pain too, and my soon-to-be-published writeup has very similar things to say about the ISO's paywalling practices.
I think one of the hardest parts of ISO-BMFF, aside from spec availability, is that it's pretty hard to implement "cleanly", making existing code confusing to use as reference. (My own implementation is certainly not clean either)
Would be curious to hear what goals you had with writing a muxer yourself as well, given that most people just use LibAV/GStreamer/GPAC and call it a day.
> I think one of the hardest parts of ISO-BMFF, aside from spec availability, is that it's pretty hard to implement "cleanly", making existing code confusing to use as reference. (My own implementation is certainly not clean either)
I certainly wouldn't call the OBS implementation "clean" either. It's very much inspired by the FFmpeg/LibAV implementation since that one is fairly straightforward (not a lot of abstraction), and gets the job done (and also is GPL/LGPL so not a huge concern looking at it).
"Library" is perhaps an overstatement, it does the things I need and not much more.
I always forget about GStreamer, but I think I have a perfect application for it. Hopefully it's easier to use as a library than Media Foundation or FFmpeg.
Would love to see MP4 Hybrid supported in popular packages like mp4-muxer [1] and mp4box [2] someday.
1: https://github.com/Vanilagy/mp4-muxer 2: https://github.com/gpac/mp4box.js
Very cute easter egg. Moof is what dogcows say: http://clarus.chez-alice.fr/history.php
> 2. They are slow to access on HDD or network drives, as each fragment's header needs to be read to get the complete metadata of the file and start playback
Huh? That's not right. The whole point of fragmented MP4 is that you can access any fragment without having to read the headers of the other fragments. That's why adaptive streaming is built around fragmented MP4.
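To illustrate why fragments stand alone: a fragmented MP4 is a flat sequence of top-level boxes, and each `moof`+`mdat` pair carries the timing and offset metadata for its own samples, so a player can start at any pair. A minimal sketch of walking top-level box headers (toy boxes built by hand for the demo; a real parser must also handle size == 1 with a 64-bit largesize, and size == 0 meaning "to end of file"):

```python
import struct

def top_level_boxes(data):
    """Yield (box_type, offset, size) for each top-level ISO-BMFF box.
    Assumes plain 32-bit box sizes for simplicity."""
    offset = 0
    while offset + 8 <= len(data):
        size, box_type = struct.unpack_from(">I4s", data, offset)
        yield box_type.decode("ascii"), offset, size
        offset += size

def box(box_type, payload=b""):
    """Build a minimal (empty-ish) box purely for demonstration."""
    return struct.pack(">I4s", 8 + len(payload), box_type) + payload

# Toy fragmented layout: ftyp + moov up front, then
# self-contained moof/mdat pairs, one per fragment.
fmp4 = (box(b"ftyp") + box(b"moov") +
        box(b"moof") + box(b"mdat", b"frame1") +
        box(b"moof") + box(b"mdat", b"frame2"))

types = [t for t, _, _ in top_level_boxes(fmp4)]
# A player can seek straight to either moof and decode from there,
# without reading the other fragment's headers.
```

This is precisely the property adaptive streaming (DASH/HLS) relies on: segments map onto these independent fragment pairs.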
I have 20 years of professional experience, and my conclusion, if someone asked what IT boils down to: pain.
The pain is what filters who succeeds and who fails. Can you endure hunting a bug for 7 hours in your chair? Can you fix problem after problem to get a system running? Everything that can fail will fail, and you have to deal with it.
MP4 has been able to hold multiple video streams for quite some time. One of the very first advanced MP4 authoring tools I saw in the early 00s allowed for this, and we used it to make a few advanced files to demo the "new" MP4 format. Much like multi-angle DVDs, this was a niche feature that did not gain much traction. I can see why someone not around at that time might think this is a new feature, but it's not.