Stereoscopic Style: ‘Future Fashion’ and Production’s Next Dimension

Originally published in Videography Magazine, March 2010.
by Jon Silberg

There’s been buzz about 3D for several years, of course, but it seems that just after Avatar’s first weekend grosses came out, the focus for everyone from studios to corporate clients shifted from abstract interest to actual 3D work. Toronto-based designer Nada Shepherd was among them.

Fascinated by the potential of 3D to grab an audience in new ways, she wanted to find a way to incorporate 3D presentation into a fashion show. When she and Tim Dashwood, co-founder with former stuntman Paul Rapovski of the Toronto-based Stereo3D Unlimited, got together at the beginning of this year, they set to work planning what is now the seven-minute “Fashion Future/Future Fashion,” a 3D short that combines live action and CGI and is playable in any RealD-compatible theater.

Fortunately for Dashwood, his was among the companies that had been preparing for this moment for some time. In the year and a half prior to Avatar’s auspicious opening, Dashwood had worked out the details of a two-camera 3D rig, put together a motion capture stage and developed the Stereo3D Toolbox plug-in, which lets a user edit left- and right-“eye” imagery and adjust stereoscopic issues such as convergence within Apple Final Cut Pro.

Taking the place of the traditional runway show, “Fashion Future” stars “fembots” clad in NADA interacting in a virtual 3D computer environment designed by Pheinixx Paul of Pencil. The film was directed by Grant Padley and produced by Kevin Hedley, both of Atomic Clock Cinematic Arts, in conjunction with Stereo3D Unlimited’s Rapovski, with Dashwood serving as 3D cinematographer and post supervisor. The short was created with software from Dashwood Cinema Solutions, PHYX and Noise Industries.

The concept for “Future Fashion” involves placing the audience inside a world that looks and feels like an immersive videogame in which we see two models challenging each other to various futuristic battles within a totally CGI background. “It became about the models choosing as their ‘weapons’ a particular hairstyle or article of clothing,” Dashwood summarizes. “It’s a perfect way to showcase her collections.”

For the four-day shoot, Stereo3D Unlimited would provide the stage on which the two flesh-and-blood models would interact, the camera rig, the 3D CGI backgrounds and the tools to show clients 3D-composited shots in real time.

The camera rig, Dashwood explains, was based on two Iconix Video 2K cameras from Sim Video placed parallel to one another on a mount of Dashwood’s design. An advantage of using the tiny Iconix cameras over larger bodies is that he was able to get the two lenses as close as 35mm apart; since the people and objects would get very close to the lenses, a small interocular distance was necessary to make the 3D effect work.

“When you shoot with a 65mm interocular, that’s the same as the way humans see things,” Dashwood says. “But we have swords coming out at the audience as close as 4 or 5 feet away, so we needed the lenses closer together.”
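
The geometry behind that choice is straightforward enough to sketch. For a parallel rig, the parallax a subject produces on the screen scales with the interaxial separation and grows rapidly as the subject approaches the lenses. The rough Python calculation below illustrates the effect; the focal length, sensor width, screen width and convergence distance are illustrative assumptions, not figures from this production.

```python
# Back-of-the-envelope screen parallax for a parallel (non-converged) rig whose
# convergence is set later in post by shifting the images horizontally.
# Every constant below is an illustrative assumption, not a figure from the shoot.

FOCAL_LENGTH_MM = 8.0    # assumed lens focal length
SENSOR_WIDTH_MM = 6.6    # assumed active sensor width of a small HD camera head
SCREEN_WIDTH_M = 10.0    # assumed theater screen width
CONVERGENCE_M = 3.0      # assumed distance placed at the screen plane in post

def screen_parallax_cm(interaxial_mm: float, subject_m: float) -> float:
    """Parallax on the theater screen for a subject at subject_m metres.
    Negative values mean the subject appears in front of the screen."""
    # Disparity on the sensor, measured relative to the convergence plane.
    sensor_mm = FOCAL_LENGTH_MM * interaxial_mm * (1.0 / CONVERGENCE_M - 1.0 / subject_m) / 1000.0
    # Magnify up to the projection screen and convert to centimetres.
    return sensor_mm / SENSOR_WIDTH_MM * SCREEN_WIDTH_M * 100.0

subject = 4.0 * 0.3048   # a prop roughly 4 feet from the lenses, in metres
for interaxial in (65.0, 35.0):   # eye-spacing baseline vs. the narrower rig
    print(f"{interaxial:.0f} mm interaxial: "
          f"{screen_parallax_cm(interaxial, subject):.0f} cm of screen parallax")
```

Roughly halving the interaxial roughly halves the parallax on those close-in sword shots, which is the extra headroom the narrower rig buys.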

The cameras were positioned exactly parallel, rather than toed in toward or away from the center, in order to avoid keystoning and to provide the cleanest starting point when adjusting the two images in post. Dashwood notes that this project benefited from the use of a computer-generated set; the advantage of a CGI background over a real set is that it allows significantly more latitude to manipulate parallax than if foreground and background are all captured live. The cameras each recorded a full 1080p signal, which was laid down on a single Sony SRW-1 HDCAM SR deck in dual-stream mode, in which the tape runs at twice the speed and records two discrete 1080/24p 4:2:2 channels.
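
Because the cameras stayed parallel, convergence is set later by sliding the left- and right-eye images horizontally against each other, a step usually called horizontal image translation. The short sketch below illustrates the operation in generic Python/NumPy terms; it is not the Stereo3D Toolbox plug-in itself, only an illustration of what the adjustment does.

```python
import numpy as np

def set_convergence(left: np.ndarray, right: np.ndarray, shift_px: int):
    """Horizontal image translation (HIT): slide the two eyes apart (or together)
    so that a chosen depth plane lands at zero parallax. A positive shift_px
    pushes the scene back toward and behind the screen plane. This is a generic
    sketch of the operation, not the Stereo3D Toolbox plug-in."""
    def slide(img: np.ndarray, px: int) -> np.ndarray:
        out = np.zeros_like(img)
        if px > 0:
            out[:, px:] = img[:, :-px]      # shift image content to the right
        elif px < 0:
            out[:, :px] = img[:, -px:]      # shift image content to the left
        else:
            out = img.copy()
        return out
    half = shift_px // 2
    # Shift the left eye left and the right eye right, splitting the total shift.
    return slide(left, -half), slide(right, shift_px - half)

# Example: two 1080p frames (random stand-ins) converged with a 24-pixel shift.
left_eye = np.random.rand(1080, 1920, 3)
right_eye = np.random.rand(1080, 1920, 3)
left_adj, right_adj = set_convergence(left_eye, right_eye, 24)
```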

In order to cover the live-action performance with a moving camera, Dashwood made use of his company’s Vicon motion capture system. The system combines multiple small cameras around the greenscreen stage, which pick up tracking markers, with processors that translate the markers’ positions into 3D space. Dashwood placed markers on the camera rigs so that the Vicon system would always know where the cameras were and where they were pointing within the stage.
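
In practice, tracking data like this gets turned into a camera transform that a CG package can consume. The snippet below is a generic illustration of that step, building a 4x4 pose matrix from a tracked position and orientation; it is an assumption-laden sketch, not the Vicon or MotionBuilder pipeline used on the shoot.

```python
import numpy as np

def camera_pose_matrix(position_m, yaw_pitch_roll_deg):
    """Build a 4x4 world transform for a virtual camera from a tracked position
    (metres) and orientation (degrees). A generic sketch of how optical tracking
    data can drive a CG camera, not the actual Vicon pipeline."""
    yaw, pitch, roll = np.radians(yaw_pitch_roll_deg)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    # Rotation order: yaw about Y, then pitch about X, then roll about Z.
    r_yaw = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    r_pitch = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    r_roll = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    pose = np.eye(4)
    pose[:3, :3] = r_yaw @ r_pitch @ r_roll
    pose[:3, 3] = position_m
    return pose

# Example: rig tracked 1.6 m up, 2 m back from the origin, panned 15 degrees.
virtual_cam = camera_pose_matrix([0.0, 1.6, -2.0], [15.0, 0.0, 0.0])
```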

The game-like 3D CGI backgrounds were built in Autodesk 3ds Max, and Dashwood’s setup used Autodesk MotionBuilder (employed primarily by game developers) to render real-time proxies, combining the motion capture data about camera position with the 3D CGI models of the various environments that had been built prior to production. This tool, in combination with a newly modified version of Dashwood’s Stereo3D Toolbox, spat out dual-stream 3D images of the models and CGI backgrounds onto a 46-inch JVC stereoscopic display, on which the clients, wearing passive polarized 3D glasses, could see foreground and background together live in 3D.
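
Conceptually, the live preview amounts to compositing the keyed foreground over the CGI background once per eye and then packing the two streams into whatever format the monitor expects. The sketch below shows one common packing, row interleaving for a passive polarized panel; the actual signal path on this shoot isn’t described beyond dual-stream output, so treat the details as assumptions.

```python
import numpy as np

def over(fg_rgb: np.ndarray, fg_alpha: np.ndarray, bg_rgb: np.ndarray) -> np.ndarray:
    """Standard 'over' composite of a keyed foreground onto a CGI background."""
    return fg_rgb * fg_alpha[..., None] + bg_rgb * (1.0 - fg_alpha[..., None])

def interleave_rows(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Pack the two eyes into one frame by alternating scan lines, a common input
    format for passive polarized displays. (The monitor feed on this shoot is not
    described beyond 'dual-stream', so this packing is an assumption.)"""
    packed = left.copy()
    packed[1::2] = right[1::2]
    return packed

# Stand-in frames: keyed foreground over the CGI arena, once per eye.
h, w = 1080, 1920
fg_l, fg_r = np.random.rand(h, w, 3), np.random.rand(h, w, 3)
a_l, a_r = np.random.rand(h, w), np.random.rand(h, w)
bg_l, bg_r = np.random.rand(h, w, 3), np.random.rand(h, w, 3)
monitor_frame = interleave_rows(over(fg_l, a_l, bg_l), over(fg_r, a_r, bg_r))
```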

The final comps would be refined, but this setup allowed Dashwood and company to offer clients a 3D production in the timeframe and with the flexibility people expect of a 2D production.

What should clients expect to pay if they want to do their production in 3D? “Our ballpark for 3D jobs is what it would cost to do the same thing in 2D, plus 12 to 15 percent,” he says. “Considering what you get, it really costs less than most people expect.”
