3D Field Monitoring

The recent resurgence of 3D theatrical movies and the emergence of consumer-friendly large-screen 3D TVs with 3D Blu-ray players have ramped up the demand for 3D content. Economical prosumer 3D cameras and aftermarket dual-camera rigs are either on the market by this printing or well on the way. Mainstream big-budget content like James Cameron’s AVATAR has no doubt spurred demand and is selling home 3D systems. Along the way, new opportunities for independent video producers will materialize. But there is a catch.

A Little History Lesson

3D movie imaging is just about as old as the invention of motion pictures itself. It may be a surprise to learn that much of the still photography of the Civil War was done in 3D. In fact, the first documented public exhibition of a motion picture was L’arrivée d’un Train à La Ciotat, shown to a Parisian audience in the late 1800s by the Lumière brothers. Shortly after, it was reshot and projected in 3D!

Hollywood didn’t embrace 3D in a big way until the early 1950s, but despite many financial successes, the format all but died out before the 1960s. The point here is that 3D is nothing new. What is new is the affordable technology to make good 3D video and get it to the consumer’s eyes. This is not about the dual-color anaglyphic (red and blue filtered paper spectacles) process that only works on monochromatic images.

2D or Not 2D

Which brings us to the issue of monitoring. Top-of-the-line 3D camera systems are very expensive and either come with or have access to in-field 3D monitoring. These high-end systems are not the subject of this article. Most semi-professional 3D cameras are reasonably priced but somewhat restrictive, and they rarely have an on-camera ability to display a 3D image to the camera operator. Consumer-grade cameras are even less accommodating. So what to do?

In the past, a steady diet of experience in the form of trial and error led to skilled technicians who could be counted on to bring home good 3D scenes. The early 1950s 3D films like Bwana Devil and House of Wax were shot with very heavy dual 35mm camera rigs and were impossible to monitor in 3D. Even Creature From the Black Lagoon was expertly shot in 3D underwater without any means to monitor the images recorded on film. Then, in the late 1960s, Chris Condon invented a single camera and projection system called Magnavision (later known as StereoVision) that made 3D filmmaking efficient and economical. The system placed the left and right images side-by-side in the same four-perforation negative space as conventional single-strip motion picture production. The system used common conventional film cameras and a single projector in the theater. With his newly invented system, Chris produced The Stewardesses, which ran in theaters around the world for months and grossed almost $30 million in 1970. The only way to monitor the 3D quality back then was to shoot the film and screen it the next day in a 3D screening theater (dailies). But even Chris couldn’t check his images in 3D at the time of shooting.

Here and Now

Today there are several ways to check and confirm your 3D images while shooting. Why is this important? Well, for one thing, the nature of the 3D effect is not readily apparent in a 2D viewfinder or monitor. With experience, a videographer can learn to shoot and compare the left and right images in a superimposed monitor and get a good sense of what the end product will look like. A camera (or lens) that has an adjustable interocular control can set the ideal dimensional position of the subject/object in the frame. Always remember that the most pleasing 3D emulates the human perception experience. This means most subjects should be positioned unnaturally close to the 3D taking lens, and the lens should be wide angle. Shooters new to 3D often make the mistake of framing with longer lenses and focusing on subjects too far from the camera, mainly because they can’t see the 3D image until much later. Even so, ESPN has been experimenting with long-lens sports coverage in 3D. At first glance this would seem to be an oxymoron, since we humans don’t see with telescopic eyes.
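
A bit of simple trigonometry illustrates why close subjects and wide lenses matter. The sketch below (an illustration only, not tied to any camera system) computes the convergence angle between the two lines of sight for a typical human interocular distance and shows how quickly it falls off with subject distance:

```python
import math

INTEROCULAR_M = 0.065  # typical human interocular distance, about 2.5 inches (65 mm)

def convergence_angle_deg(subject_distance_m: float) -> float:
    """Angle between the left and right lines of sight converging on the subject."""
    return math.degrees(2 * math.atan((INTEROCULAR_M / 2) / subject_distance_m))

# The angle (and hence the depth cue) shrinks rapidly with distance.
for d in (1, 3, 10, 100):
    print(f"{d:>4} m: {convergence_angle_deg(d):.3f} degrees")
```

At one meter the eyes converge by several degrees; at 100 meters the angle is a few hundredths of a degree, which is why distant, long-lens subjects carry almost no stereoscopic depth.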

Mark Cholis at Grass Valley says that “many of the productions that use our Kayenne or Kalypso production switchers for live 3D events will have a single 3D monitor in the truck for the director and TD to work from, another for the Stereographer and then strategically place one outside the truck to help keep unnecessary foot traffic and interruptions down while doing the production.”

Some 3D camera systems feature dual viewfinders, one for the left-eye camera and one for the right-eye camera. On-camera or aftermarket binocular finders work well and can easily be viewed during daylight shooting. However, these types of finders may not be particularly useful on cameras that have their interocular convergence preset at the factory. (Interocular refers to the distance between human eyes, about 2.5 inches.) More elaborate dual-camera systems offer a flexible adjustment for interocular convergence. Having variable convergence is desirable when the 3D is to be “pushed” for effect. That said, it is possible to adjust convergence somewhat in post production, which will be addressed later. Another adaptation is dual video spectacles: a heads-up display that optically delivers a separate image to each eye and can be wired to the two camera feeds to show 3D. Consumer versions are presently not HD capable but can serve the purpose. By the time you read this, consumer HD versions may be commercially available.

The other way to monitor 3D is with electronic liquid-crystal shuttered eyeglasses and a dual-feed SD video alternating-field interlace accessory. There are numerous incarnations of this process, and finding one demands research into what is currently available. These devices have been around for more than 20 years. They take discrete video feeds from the two camera sources and alternate the left and right images between field one and field two in standard-definition NTSC. The new 3D Blu-ray TVs use a descendant of this technology. This method can confirm that your 3D is on point for the audience even if you’re shooting HD and would otherwise only see 2D. The main drawback to this technique is observing the monitor on location in daylight. This can be somewhat overcome by shading or tenting the monitor. Of course, this also requires electronic liquid-crystal shuttered eyeglasses that either run on batteries or are cabled to the interlace device. Fortunately, there are some new systems available for this purpose that are reasonably priced.
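
The principle of the interlace accessory is easy to model. This is a toy sketch only (frames reduced to lists of scan lines, names invented for illustration), not any vendor’s implementation: the left camera supplies one field of the interlaced frame and the right camera supplies the other, so shutter glasses synchronized to the fields deliver a different image to each eye.

```python
# Toy model of field-sequential 3D: left camera fills field one (even lines),
# right camera fills field two (odd lines) of a single interlaced frame.
def interleave_fields(left_frame, right_frame):
    """Merge two same-height frames into one field-alternating frame."""
    assert len(left_frame) == len(right_frame)
    merged = []
    for i in range(len(left_frame)):
        merged.append(left_frame[i] if i % 2 == 0 else right_frame[i])
    return merged

left = ["L0", "L1", "L2", "L3"]
right = ["R0", "R1", "R2", "R3"]
print(interleave_fields(left, right))  # ['L0', 'R1', 'L2', 'R3']
```

Each eye effectively sees only half the vertical resolution, which is one reason this remained an SD monitoring technique rather than a finishing format.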


Off the Shelf

Perhaps the best way to monitor in the field is to carry a 3D consumer TV monitor, or a smaller professional monitor that can display the Blu-ray 3D format, along with a 3D post system that can burn test discs on location. Working this way will affect the budget, as any system that requires time to post and burn test discs will delay the shooting schedule. However, consider the consequences of coming back from the field with unusable material and then facing a re-shoot. (Why is it there’s never enough money to do it right the first time, but there’s always enough money to do it over again?)

Another option is to set up a dedicated screening room equipped with two video projectors using simple polarizing filters. Although not very practical for location work, a studio facility might find this a viable way to use older equipment on hand or the new small video projectors that are readily available. By the way, a not-so-dirty little secret is to do your edit in 2D, then digitize only the 3D content actually used in the finished project.

After the Fact

There is also the possibility of adding 3D to an existing 2D project, but that is labor intensive and costly, requiring very careful advance preparation to work. Tim Burton’s Alice in Wonderland was done that way with great success! This aspect of video producing is very much a moving target, as the popular editing programs are all adding 3D post features. It is important to understand that any post convergence adjustment of original footage will no doubt cost in both time and quality. There are some who think this is preferable to setting convergence on set. My opinion is that it is better to get it right the first time.
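
Why does post convergence adjustment cost quality? Shifting one eye’s image horizontally moves the apparent convergence point, but the shifted-in edge has no picture data, so it must be cropped or filled. This illustrative sketch (invented names, scan lines modeled as pixel lists, not from any editing package) makes the trade-off concrete:

```python
# Shifting one eye's image sideways in post changes convergence, but the
# exposed edge must be padded (here with None) or cropped away entirely.
def shift_convergence(row, pixels):
    """Shift one scan line right by `pixels`, padding the exposed edge."""
    if pixels <= 0:
        return row[:]
    return [None] * pixels + row[:-pixels]  # None marks lost picture area

right_eye_row = list(range(8))  # an 8-pixel-wide scan line: 0..7
print(shift_convergence(right_eye_row, 2))
# [None, None, 0, 1, 2, 3, 4, 5] -- two pixels of picture are gone
```

Every pixel of convergence adjustment narrows the usable frame for both eyes, which is why setting convergence correctly on set is cheaper than fixing it later.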

Case Study

Silverado Systems, based in Folsom, California, is a RED and Apple reseller and service facility. Torrey Loomis, President and CEO, is testing a new prototype 3D HD camera system developed by GunMetal to produce 3D content. This camera is not yet on the market, but promises to be a valuable addition to any 3D production. The camera easily fits in the palm of your hand and is totally self-contained. The fixed lenses produce a picture that emulates about a 14mm lens in the DSLR format. It has been used to test-shoot skydiving, so that should say something about its size and weight. As with any small field camera system, monitoring is limited, and in this case presently not possible. The camera has fixed convergence, so how does Silverado address the monitoring issue? They shoot a convergence test marker at the desired distance from the taking lens for each scene. Remember, convergence cannot be adjusted on this camera. In post, the test marker can be adjusted to a minor degree to optimize the 3D effect. Display is done through a little palm-sized box by AJA that takes in two discrete sources from the SCRATCH workflow solution and puts out HDMI 1.3 (3D). This is only one step away from field monitoring: the gap will close when the GunMetal cameras can produce two composite side-by-side video feeds, or carry an integrated onboard hybrid of the AJA 3D combiner already mentioned. Chad Cruchley, CEO of GunMetal, says that is in the works. He also reports that the big manufacturers have polarized flat-screen TVs just around the corner that will use passive, non-electronic polarized eyeglasses.

What If?

An intriguing thought is the prospect of setting up a small 3D projection system using the new crop of mini camera/projectors like the 3M Shoot & Share CP40. By using polarized filters 90-degrees out of phase with polarized eyeglasses, it may be possible to view your 3D efforts in the field. This of course would require a darkened room of some kind and a small silver lenticular screen. You will have to use your imagination if you want to try this one for size. The trick would be getting the two camera/projectors to synchronize frame accurately in playback. Some enterprising individual or aftermarket accessory company might step up to this plate.

What Not to Do?

A recent press release from an unnamed company announced that it seemed to have solved the 3D conundrum by pairing two DSLRs in a mechanical rig. To achieve the effect, one of the cameras was zoomed in to match the other camera’s framing. The result was a dismal failure. No matter how you enter the 3D arena, make sure both cameras are perfectly matched with identical focal-length lenses. Anything short of that simply will not work.

Not Uninvented Here

The 3D genie is out of Aladdin’s lamp and probably not going back. The potential market that 3D production offers the independent videographer is growing daily and is not confined to entertainment. There has long been demand for remote 3D viewing in industries like medicine, education and space exploration. With the Space Shuttle being retired this year, most of us are probably not going to Mars any time soon. But having a working knowledge of 3D production and monitoring techniques might just come in handy when a client asks the question: “Can we shoot this project in 3D?”

Gary Tomsic is an independent writer and indie producer.


