Russ Fairley recently asked Tim Dashwood some very interesting questions about 360VR post. The article can be read in two parts at
http://desktopvideo.about.com/od/desktopeditinghardware/fl/Talking-360deg-Video-With-Tim-Dashwood-Part-1.htm (part 1) and
http://desktopvideo.about.com/od/desktopeditinghardware/fl/Talking-360deg-Video-With-Tim-Dashwood-Part-2.htm (part 2)
We’ve also reprinted both parts of the interview here for easier reading. There may even be some hints in this interview of things to come from us!
Working with 360° will change how we think of video. Learn how it’s done!
by Russell Fairley
Updated February 29, 2016.
With VR headsets, 360° video cameras and 3D production in general at an early growth phase, it’s important for video enthusiasts and pros alike to understand what this new medium is and how to work with it. Workflows as we’re used to them are likely going to change, and the hardware and software we use daily most likely will as well.
As a newcomer to the world of 360° video, I thought our best bet would be to ask the expert of experts, Tim Dashwood. As a developer of award-winning software tools for the film industry, Dashwood created the post-production tools Stereo3D Toolbox, Editor Essentials, Secret Identity, and the award-winning calibration/analysis/tracking software Stereo3D CAT.
ABOUT.COM: With the imminent explosion of VR headsets and 360 degree/3D cameras onto the industry, what are some considerations for video editors interested in working with 360 degree footage?
TIM DASHWOOD: There are huge technical and aesthetic considerations for any traditional 2D editor taking on the job of editing cinematic 360° video. The technical side can be learned easily (e.g. working with the 2:1 aspect ratio equirectangular “lat-long” format, manipulating footage on three separate axes, using specially designed plugins for 360VR, etc.) but we are still developing the craft of how to present a good story as a compelling 360° experience. Do we need slower pacing than a traditional ‘flat’ video? Do dissolves work better than cuts? Do we try to fill all of the surrounding volume with interesting content, or just let the viewer organically experience the story with the potential of looking the “wrong way?” Should it even be possible to look the “wrong way?” 360° VR video is an art form unto itself, and I feel it is too early in this new medium to set any sort of rules governing preferred editing practices.
As content creators continue to experiment with different techniques, a unique storytelling language will surely emerge.
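To make the “lat-long” format Dashwood mentions concrete: a 2:1 equirectangular frame maps longitude (yaw) across its width and latitude (pitch) down its height. A minimal sketch of that mapping, with a hypothetical function name not taken from any of the tools discussed here:

```python
def latlong_to_pixel(yaw_deg, pitch_deg, width, height):
    """Map a viewing direction (yaw/pitch, in degrees) to pixel
    coordinates in a 2:1 equirectangular ("lat-long") frame.
    Yaw -180..180 spans the full width; pitch +90 (zenith) down
    to -90 (nadir) spans the height."""
    x = (yaw_deg + 180.0) / 360.0 * width
    y = (90.0 - pitch_deg) / 180.0 * height
    return x, y

# The centre of view (yaw 0, pitch 0) lands mid-frame:
print(latlong_to_pixel(0, 0, 4096, 2048))  # (2048.0, 1024.0)
```

This simple, uniform mapping is also why the format distorts heavily near the poles: every row of pixels covers the full 360° of longitude, whether it sits at the equator or at the zenith.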
ADC: When it comes to workflow, how does working with 360° cameras differ?
TD: There are a few different workflows for managing 360° video projects. The one I like best for large projects is to first ingest all the footage from the camera arrays into the NLE system and organize them using standard multi-cam NLE functions like those in Avid, Premiere Pro or Final Cut Pro X.
This way content can be pre-selected, even loosely edited, and then consolidated before being sent off for “selects” stitching. This can save a lot of time in post because rendering stitched footage is so processor intensive, and there is no point wasting time stitching all of your dailies. Luckily there are self-contained professional 360 cameras just around the corner that will make the manual stitching process unnecessary and therefore let us go directly to ingest and edit.
Stabilization of 360° footage is a little more complicated because rotation on each axis affects the complete spherical scene.
The good thing is that we never lose pixels when stabilizing 360° footage, we just change the orientation of the sphere on a frame-by-frame basis. I developed a plugin called “Reorient Sphere” for my 360VR Toolbox suite that can be used in conjunction with your favorite tracking software (After Effects, Motion, Mocha, etc.) to stabilize and level footage.
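The “no pixels lost” point is easiest to see for the yaw axis: rotating the sphere around its vertical axis corresponds to a pure horizontal shift of the equirectangular frame, with pixels wrapping from one edge to the other. A minimal NumPy sketch of that idea (the function name is illustrative, not part of the Reorient Sphere plugin; pitch and roll corrections need a full spherical resample and are omitted):

```python
import numpy as np

def reorient_yaw(frame, yaw_deg):
    """Counter-rotate an equirectangular frame around the vertical
    axis. In equirectangular space a yaw rotation is a horizontal
    shift with wrap-around, so the sphere is simply re-pointed and
    no pixels are discarded."""
    h, w = frame.shape[:2]
    shift = int(round(-yaw_deg / 360.0 * w)) % w
    return np.roll(frame, shift, axis=1)

frame = np.arange(8).reshape(1, 8)   # tiny 1x8 "panorama"
print(reorient_yaw(frame, 90))       # content shifted left by a quarter width
```

Applying such a correction per frame, driven by tracking data, levels and stabilizes the footage without the cropping that 2D stabilizers require.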
Another big concern in post is the application of filters that manipulate the position of pixels. Standard filters like blur, glow, noise reduction and sharpening cannot be applied to 360° video because they will not affect the whole scene equally (the effect will lessen near the zenith and nadir) and they will create a seam where the equirectangular projection should seamlessly wrap.
To solve this problem I had to develop new “seamless” filters for blur, glow, noise reduction and sharpening that can be applied to 360° video footage. These were also made available as part of 360VR Toolbox.
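One way to see where the seam problem comes from, and the basic idea behind fixing it: a conventional filter treats the left and right edges of the equirectangular frame as image boundaries, even though they are adjacent on the sphere. Padding the frame with wrapped pixels before filtering removes the seam. A minimal sketch (hypothetical helper, not the actual 360VR Toolbox implementation; a production filter would also widen the kernel toward the poles to keep the effect even, which is omitted here):

```python
import numpy as np

def seamless_horizontal_blur(frame, radius):
    """Box-blur each row of an equirectangular frame without creating
    a seam: pad the left/right edges with wrapped pixels first, so the
    filter sees the frame as the continuous 360-degree band it is."""
    padded = np.concatenate(
        [frame[:, -radius:], frame, frame[:, :radius]], axis=1)
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    out = np.empty_like(frame, dtype=float)
    for i, row in enumerate(padded):
        # 'valid' convolution over the padded row yields exactly
        # one output sample per original column
        out[i] = np.convolve(row, kernel, mode="valid")
    return out
```

With this padding, a bright pixel at the left edge bleeds into the right edge exactly as it would across any other part of the sphere, so the wrap point stays invisible.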
ADC: How has the Oculus Rift changed the viability of this type of footage/videography?
TD: I think that the Oculus Rift as a “crowd-funded to corporate” success story has got a lot of people interested in moving into this field, but as a product, it is very gamer-oriented, and in my opinion not really targeted at consumers of 360° video. I think the ‘headset’ that has really made this medium viable is Google Cardboard. And I don’t just mean the actual Google-branded Cardboard, but the ecosystem created around low-cost headsets (like the re-imagined View-Master) that function with the user’s own smart phone. The other key player on the distribution side is YouTube’s (and to a lesser extent Facebook’s) support for 360° video on smart phones, HMDs, or just in a browser on a computer. We are also seeing 360 video apps now on set-top boxes like AppleTV.
Of course, we’ve had technologies like QuickTime VR since the ’90s (great for real-estate listings) and Google’s Street View. 360° video was a natural evolution for virtual tourism, and it is now evolving into something much more creative and rewarding for the viewer.
ADC: Your tool, 360 VR Toolbox, enables editors to use an Oculus Rift VR headset to edit 360 degree footage interactively, being able to actually look around at the entirety of a shot as if they were inside the camera. Excuse the joke, but how has this capability changed your perspective regarding video production?
TD: It is important to understand while editing how the end-user may be experiencing the immersive scene. Since the equirectangular “lat-long” format used to store a complete 360° image is flat, it is difficult to edit in a standard NLE and still visualize the immersive effect. It didn’t make sense to me to constantly render and export every few minutes just to load the footage into a 360° player so I could view on a head-mounted display (HMD) like the Oculus Rift.
It was a ridiculous waste of time for all that trial and error, and time is everything for an editor. I got so frustrated with the issue that I just sat down one day and wrote a plugin and accompanying app that allowed me to monitor the live output of the Premiere Pro or Final Cut Pro timeline in the Oculus Rift. My first attempt was functional within a few hours, but then I decided to take it further by actually capturing and displaying the editing timeline as a heads-up-display within the HMD so I could edit and see the effects of my editing choices in real-time.
The overall poor resolution and chromatic aberration inherent in the current generation of HMDs is not ideal for displaying timelines like we are used to on a Retina Display, but it was a huge time-saving step in the editing of cinematic 360° video.
Unfortunately the frame rate of my first version was only about 15fps. I asked a friend of mine, Anton Marini, who is an expert on optimizing code for GPUs, to work on the code for the plugin so we could reduce the latency and increase the frame rate. He managed to get the frame rate up to 60 fps on any modern Mac with a dedicated GPU, and amazingly, got the latency down to 0 frames. It was the first, and only, solution to the problem so we made it available as the 360VR HMD Viewer in the summer of 2015 and it is now in use all over the world. (Note: only available for After Effects, Premiere Pro and Final Cut Pro X on Mac.)
ADC: What should editors be striving for, finished product-wise, when it comes to 360 degree productions? We all know how standard 2D video can, should and will look edited, but we may not know just what to do with 3D footage. Is there a precedent?
TD: The sky’s the limit when it comes to editing with 2D “flat” video, but 360° video editors should always consider the heightened sense of immersion and potential sensitivity of the end-user to “VR sickness.” This is especially true with stereoscopic 3D 360° video, where too much parallax in a scene can actually cause eye strain and pain to the user.
With that said, I feel like we are at the beginning of motion picture filmmaking and The Great Train Robbery is coming soon, but Eisenstein won’t be showing Battleship Potemkin for a few years. There is room to grow in the medium, push the boundaries, and the great ‘immersionmakers’ will emerge. They probably won’t even come from a filmmaking background. In fact, I expect theatre is where the next generation of 360° storytellers will originate. Directing a 360° production is very much like setting the scene for a play and then just letting the actors do their thing while the crew is in another room hidden from sight.
ADC: How is 360 VR Toolbox a game-changing tool for those interested in doing more with their footage than just shooting and playing it back?
TD: I poured all of my personal experience into the plugin solutions contained in 360VR Toolbox, and I continue to work to find better solutions to common issues as I encounter them. I still consider myself a DP and editor/post-supervisor first, and a software developer only when necessary. The 360VR Toolbox plugins for Premiere Pro, After Effects and FCP simply don’t exist anywhere else (although there are some other plugin companies doing their best to duplicate the functionality on other platforms). For example, I have just finished an extensive development and testing period on a new type of plugin that can stereoscopically render 2D logos, footage (or particles) into a stereoscopic 3D 360° environment. Imagine trying to re-create the opening scroll for Star Wars in 3D 360°. It’s now as easy as dropping the plugin onto a particle-emitted starfield and pointing to a text layer with the scrolling text. It’s plugins like this that help those of us in post concentrate on creativity instead of wasting hours trying to overcome technical hurdles.
I don’t have any corporate backers so sales of 360VR Toolbox fund future development (translation: if I can afford to turn down editing gigs then I can continue to develop these plugins 24/7) but I understand that consumers just dabbling in 360° video, who may have purchased a $300 consumer 360 camera, have no need for a complete professional suite of plugins. Therefore I am releasing a stripped-down version of 360VR Toolbox, called 360VR Express, at a price anyone can afford. It will allow for some essential functions of manipulating 2D 360° video, as well as free access to the basic 2D functions of the 360VR HMD Viewer. I’m hoping the availability of consumer cameras and 360VR Express will spur creativity of anyone with an immersive story idea.
About Tim Dashwood:
Tim Dashwood is the founder of 11 Motion Pictures Limited and its sister companies Dashwood Cinema Solutions and the Toronto-based stereoscopic 3D production company Stereo3D Unlimited Inc.
An accomplished director/cinematographer, editor and stereographer, Dashwood’s diverse range of credits includes numerous music videos, commercials, documentaries and feature films, as well as S3D productions for 3net, DirecTV, Discovery Channel and the National Film Board of Canada. He also specializes in the previsualization of live-action fight/stunt scenes for large-scale productions such as Kick-Ass, Scott Pilgrim vs. The World and Pacific Rim. His recent cinematography credits include the films Air Knob, Bob, The Killer and Oscar-winner Chris Landreth’s Subconscious Password.
As a developer of award-winning software tools for the film industry, Dashwood created the post-production tools Stereo3D Toolbox, Editor Essentials, Secret Identity, and the award-winning calibration/analysis/tracking software Stereo3D CAT.
Considered a thought-leader in the disciplines of camera and stereoscopic 3D production technologies, Dashwood is a contributor at DV Info Net and has spoken at The National Association of Broadcasters Conference (NAB), International Broadcasting Convention (IBC), South By Southwest (SXSW), University of Texas at Austin, Canadian Cinema Editors (CCE), Creative Content World (CCW), AENY, DVExpo, Reel Asian Film Festival, ProFusion, Banff Centre, TFCPUG, CPUG Supermeet, The Toronto Worldwide Short Film Festival and the Toronto International Stereoscopic Conference.
Dashwood graduated with Honours from Sheridan College’s Media Arts program in 1995 and returned to its Advanced Television and Film program in 2000/2001 to study with veteran cinematographer Richard Leiterman CSC. Dashwood is also a member of the Canadian Society of Cinematographers.