Lytro releases the first footage shot by its VR camera
“Light field” camera maker Lytro is finally pulling back the curtain a bit on Immerge, the massive orb-shaped virtual reality rig that the company announced almost a year ago. For the first time it’s showing footage captured by Immerge in the form of a minute-long short that’s been broken up into three clips, each of which shows how the final, processed VR footage comes together.

Light field photography is different from traditional photography because the cameras can measure the geometry of the light that strikes the image sensor — instead of just capturing it straight on. With enough computing power, Lytro’s software can then reconstruct the scene that was captured in three dimensions. The company previously tried to make and sell consumer light field cameras that let users adjust the focus of any photograph after it was shot. Those attempts largely failed, but in 2015 Lytro received a big investment and pivoted toward becoming a virtual reality company — a seemingly more natural fit for light field capture.

It’s an important step in Lytro’s shift to VR

The three videos released today all center on a short film shot with Immerge that’s called Moon. Moon — not to be confused with Duncan Jones’ feature film — features an actor on a soundstage who plays an astronaut descending from a lunar lander. Halfway through, he approaches the camera before being told to start again by a nearby director (placed behind the viewer).

It’s a short and simple clip that’s supposed to tease what Lytro believes Immerge can and will be used for, but those who want to immerse themselves in the experience will have to wait — all three clips Lytro released today are two-dimensional. Two of them (which you can see below) handle the technical details, while the third (above) intercuts parts of Moon with a shot of a woman watching the clip on a VR headset, to illustrate the viewing experience.

Lytro CEO Jason Rosenthal stopped by The Verge’s New York office earlier this month to let us see the test footage as it was intended to be seen — in VR. With an Oculus Rift and a PC in tow, the company gave a demo of the Moon footage in two stages.

This video shows the raw light field capture and how it was blended together with some of Moon’s CGI elements.

First, Rosenthal showed the raw footage that was captured on set by Immerge. This first clip was just a limited 90-degree view of the soundstage, and the scene was full of what Rosenthal called “depth artifacts” — stray squares and pixels of image data that popped in and out of sight, especially around the parts of the scene that were moving (like the astronaut). That made it messy, but Rosenthal said that this unpolished footage is what will be immediately available to the director or cinematographer on set when using Immerge. While it’s rough around the edges, this real-time footage offers the people in charge of the shoot a way to interactively preview what they’ve shot.

That’s important, because Immerge uses Lytro’s light field technology to capture footage in multiple directions and with depth information, too. And while the raw footage wasn’t pixel perfect, I still had what is referred to as “six degrees of freedom,” or the ability to move side to side, up and down, and forward and back, and all of those movements affected what I saw in the headset.

In this respect, Immerge doesn’t feel that far off from some of the crazy hardware solutions that Industrial Light and Magic, James Cameron, or Peter Jackson have used to preview their own breed of special effects in real time — the kind of stuff that changed the direction of Hollywood. “That’s exactly what Moon is meant to be,” Tim Milliron, Lytro’s vice president of engineering, tells me over the phone. “It’s the first ever piece of six degrees of freedom 360 live-action content.”

The final composite footage was seamless

Second, Lytro showed a post-processed version of Moon, where the raw footage had been cleaned up and composited with computer-generated elements and another 3D shot of the accompanying soundstage (placed behind the viewer in 360-degree space). It was a near-final version of the footage released today, and it was impressive in its seamlessness. Lytro, and the crew that worked on the short, were able to bring all those elements — a fake surface of the moon, a CGI lunar lander, a real actor in a costume, and the sights and sounds of a Hollywood soundstage — together to create a coherent VR clip that showed off what a preproduction version of Immerge is capable of.

Now, it didn’t feel like I was standing on the Moon in the headset, or even the soundstage re-creation of it. But believability wasn’t really the point. With Immerge — and with Lytro Cinema, the company’s other relatively new, professional, and incredibly capable light field camera — Lytro is still trying to show off why production companies, filmmakers, and Hollywood studios would even want to start playing around with light field VR capture in the first place.

The simplest reason for those parties to be champing at the bit over Immerge is because creating immersive environments that take advantage of the freedom of motion offered by the Oculus Rift or the HTC Vive requires backbreaking work. Immerge looks like it will provide a shortcut through a lot of that work.

The Moon footage as seen in Nuke, which is a leading compositing program in the visual effects world. “The sphere in the center of the screen represents the viewing volume that you can experience Moon from,” Tim Milliron, Lytro’s VP of engineering, says. “What you’re seeing everywhere else is our camera data sort of re-projected back out into the world based on our scene reconstruction.”

Of course, it will take a while for real-world results produced outside Lytro to pour in. Milliron couldn’t say which companies will be the first to work with Immerge, though Within (formerly known as VRSE), Felix & Paul Studios, and others have been featured in early promo material. For now, though, there’s not much to go on.

The first real-world Immerge footage likely won’t be seen until 2017

“What I can say is definitely in Q1 of 2017 you should be seeing several kinds of these kinds of experiences out in the real world from other content producers that we’re working with today,” Milliron says. In that sense, Moon serves as a stake in the ground for Lytro, something the company could one day look back on and say: “this is where it really started.” Right now, though, it’s little more than a call to action for companies that are looking for new ways to capture virtual reality content.

While the camera rig is a flashy physical object, it will probably be the things behind the scenes of Immerge — the software, the cloud connections, and other hardware that powers it all — that will turn Lytro into a VR industry mainstay, if it happens at all. As Milliron describes it, the tech that makes Immerge what it is can really be configured in any number of ways based on the needs of the client.

“I like to tell my team, ‘rigs are cheap, we’ll build a different rig depending on what the needs are,’” he says. “I think that anybody who thinks that they know exactly how VR productions are going to be shot a year and a half or two years from now is probably kidding themselves.”
