How?
1) Light fields are four-dimensional images, and light field video is a five-dimensional stream. Handling that extra dimensionality is a basic requirement for hand- or head-tracked images, as in headsets or AR devices.
2) We're just getting to the point where real-time ray tracing is truly economical, and OTOY is all-in on it. Up until now, rendering has been a "bag of tricks" approach, where you try to paint sophisticated paintings onto polygons. Many of these tricks fall apart when you try to do the predictive modeling required for 6DOF streaming: you can see that the reflections are painted onto the countertop. Ray tracing actually simulates light.
3) They've fully embraced the cloud. They're offering everything they do as cloud services, which means it can work on every device, for a minimal cost, with no need for customers or users to be on the latest hardware.
4) Open formats. They're not trying to build a portal the way Oculus or Valve is; they're inventing the content pipeline and getting it integrated everywhere they can. I am skeptical that any of the closed content stores will win. We saw how big the web became, so I think a better bet is that the metaverse will be more like the web than the App Store, and that's the bet OTOY is making.
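To make the dimensionality claim in point 1 concrete, here's a toy sketch using the standard two-plane (u, v, s, t) parameterization of a light field (the function names and array sizes are mine, nothing OTOY-specific):

```python
import numpy as np

# Two-plane parameterization: a ray is identified by where it crosses
# the camera plane (u, v) and the focal plane (s, t). A light field is
# therefore a 4D function L(u, v, s, t); adding time makes it 5D.
U, V, S, T = 8, 8, 64, 64                # 8x8 grid of 64x64 views (toy sizes)
light_field = np.zeros((U, V, S, T, 3))  # RGB radiance per ray

def render_view(lf, u, v):
    """Extract the 2D image seen from camera-plane position (u, v).

    Head tracking amounts to re-sampling (u, v) as the viewer moves,
    which is exactly what a flat 2D video cannot provide for 6DOF.
    """
    return lf[u, v]

frame = render_view(light_field, 3, 4)
print(frame.shape)  # (64, 64, 3): an ordinary 2D image per viewpoint
```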
Orbach has been working relentlessly on this vision behind the scenes. Not a lot has been coming out of the company, but I've been watching him lay the groundwork for the whole next generation of content distribution for 5 years now, and he's killing it. Release after release of core building blocks.
From OTOY.com's page: "For optimal performance and smooth interactive use, Windows 7 64bit, 8GB system RAM, a modern Quad core CPU, and a GTX brand NVidia graphics card (like GTX 560 / 570 / 580 / 590) with 1536MB VRAM or more is recommended."
Sounds like a minimum hardware spec using fairly recent hardware to me.
Here is one of the two ORBX docs from MPEG 119, the other (which has the full container schema) I'll post shortly.
https://home.otoy.com/wp-content/uploads/2017/08/m41018-An-i...
They seem to have a render farm, a light field format and renderer, and a streaming video format, which they always mix together in their demos.
"Look at these amazing renders. It runs on a phone!"*
*it's actually just streaming video to the phone.
I'm pretty excited about this stuff but I keep finding myself frustrated trying to crack through OTOY's marketing to get my hands on something I can try myself.
Can someone please break my incredulity?
1: streaming a Unity/Unreal game into a surface texture
2: packaging an entire Unity project into an ORBX file
So... am I understanding this right: that ORBX can contain not just a light field, but all the assets and logic for a game, compiled to LuaJIT, which the ORBX player (or orbx.js) will then play? And Unity can target this for output?
2) Yes, it works as shown in the video, using the standard Unity samples unmodified, to prove viability of the system on GearVR and Samsung Internet. We also have this working now on PC/Rift and Daydream.
ORBX.js is a subset of ORBX.lua running on top of LuaJIT. But an ORBX file can be flattened down to a cloud stream if the user agent can only play the ORBX.js feature set.
ORBX is a container holding the render graph of the content in XML/JSON, plus assets. Just like a web page, it can be cached to an archive (a .orbx file) or streamed from a URL or URI over raw UDP/TCP, or over the web via WSS or HTTPS.
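A minimal sketch of what a "render graph + assets" manifest like that might look like; the field names here are illustrative guesses, not the actual ORBX schema:

```python
import json

# Illustrative render-graph manifest: a scene graph plus asset
# references, serializable to JSON, so it can either be cached in an
# archive or streamed incrementally over a socket.
manifest = {
    "scene": {
        "root": {
            "type": "group",
            "children": [
                {"type": "mesh", "asset": "assets/table.obj",
                 "material": "materials/wood"},
                {"type": "light", "kind": "area", "intensity": 5.0},
            ],
        }
    },
    "assets": ["assets/table.obj", "materials/wood"],
}

blob = json.dumps(manifest, indent=2)
# The same bytes could be written into an archive file or served over
# HTTPS/WSS; the player reconstructs the graph on the other end.
restored = json.loads(blob)
print(restored["assets"][0])  # assets/table.obj
```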
There are open-source implementations of various MPEG and JPEG codecs, but they still carry a licensing fee to use. Most people who don't use Linux don't realize this (installing Linux and trying to deal with AV codecs is a quick education in the difference).
Also: how much processing is required to get video light fields from a video recording (with an array of cameras, I suppose)? I mean: does it scale?
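Some back-of-envelope arithmetic suggests why scale is the hard part even before any processing happens. The rig below is hypothetical (an 8x8 array of uncompressed 1080p cameras), just to get a sense of the raw data rate:

```python
# Raw data rate for a hypothetical capture rig: an 8x8 array of
# 1080p cameras at 30 fps, 3 bytes per pixel (uncompressed RGB).
cameras = 8 * 8
width, height, fps, bytes_per_px = 1920, 1080, 30, 3

per_camera = width * height * fps * bytes_per_px   # bytes per second
total = per_camera * cameras
print(f"{per_camera / 1e9:.2f} GB/s per camera")   # 0.19 GB/s per camera
print(f"{total / 1e9:.1f} GB/s for the whole array")  # 11.9 GB/s for the whole array
```

So even a modest array produces on the order of tens of gigabytes per second uncompressed, before any light-field reconstruction work starts, which is presumably why this gets pushed to a render farm.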
It's a true plenoptic (light-field) camera.