Q. Recently, my brother and I completed a 60-second commercial for a client in Florida. We used a digital video camera (Digital8) and remained digital throughout the entire editing process (via Miro’s DV200 and Premiere), then output back to Digital8. We produced an S-VHS tape, since we thought it would be an acceptable format for TV stations. We discovered that not only do these stations not accept S-VHS, but they take nothing but Betacam and U-matic 3/4-inch. Is this simply an example of an attempt to keep the video-producing industry in the hands of the networks? Or does broadcast quality actually mean more than lines of resolution, since 3/4-inch is supposedly only 250 lines of resolution? Confused am I…
A. Well Sam, the main reason has less to do with quality than with the equipment that local TV stations happen to have on hand. Up until five years ago, many small-market TV stations didn’t even support Betacam. One-inch and U-matic 3/4 were the formats producers had to choose from, and most stations went to Beta kicking and screaming. The cost of embracing a different format is a major economic consideration (especially when you are talking about numerous machines), so broadcasters tend to be the last to get onto the techno wave. Look at how long HDTV has been in the works.
At a TV station’s Master Control, tone-cued machines insert commercials into the programming. When new ads come in, they are placed into rotation on a master tape that holds the commercial play-list. Replacing spots requires locked machines capable of frame-accurate insert edits, and if a station uses 3/4-inch or Beta decks for that, then that’s the format it will accept.
Although stations have now moved to hard-disk digital systems that serve up the commercial menus, they still use the same old decks to input the ads. Your best bet is to find a production house in town that will let you bring in your Digital8 and make a Beta dub via the Y/C connection. If you purchase your own five-minute reconditioned Beta tapes (approximately $3 to $5 each), the process shouldn’t be cost-prohibitive.
Q. Is Bit Depth Crucial?
I am currently in pre-production on a movie I’m shooting in DV, which I hope to later transfer to 35mm film. I have a question about something I read under the "maximum bit depth" heading of your capture card buyer’s guide. Is bit depth something I should be concerned with to get a better film transfer? Since my images will ultimately be projected onto the big screen, should I strive for more bits per pixel to achieve more accurate color shading? Others I’ve called told me it was a marketing myth; they said it might help with graphics, but not with video image quality. I would greatly appreciate any help.
A. The answer to your question is no, bit depth will not help your color separation or image quality. What will help make your DV image the best it can be is using a three-CCD camera.
Component video comprises three colors: red, green and blue (RGB). In a digital image, each color gets 8 bits, for a total of 24 bits. In a 32-bit system, the additional 8 bits go to an alpha channel, which in essence defines each pixel’s ratio of opacity to transparency. Support for this extra channel allows, among other things, keying effects: the user defines which portions of the image are made transparent and which are kept opaque, based on luminance (lightness/darkness) or chrominance (color value).
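To make the alpha-channel idea concrete, here is a minimal Python sketch (an illustration only, not code from any particular editing package) showing how the extra 8 bits control opacity when a 32-bit RGBA pixel is composited over an opaque 24-bit RGB background:

```python
def blend_over(fg, bg):
    """Composite a foreground (r, g, b, a) pixel over an opaque
    background (r, g, b) pixel. Alpha 255 means fully opaque;
    alpha 0 means fully transparent."""
    r, g, b, a = fg
    alpha = a / 255.0  # normalize the 8-bit alpha to 0.0-1.0
    return tuple(round(alpha * f + (1 - alpha) * back)
                 for f, back in zip((r, g, b), bg))

# A half-transparent white pixel over black comes out mid-gray.
print(blend_over((255, 255, 255, 128), (0, 0, 0)))
```

A keyer works the same way, except that it computes each pixel's alpha automatically from its luminance or chrominance instead of reading it from a stored channel.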
Inside a three-CCD camera, a prism block behind the lens splits the incoming light into its red, green and blue components, and a dedicated CCD captures each one. This separation improves color accuracy and, in turn, image quality. Also, if you plan to transfer to film, talk extensively with the people who will do the transfer. Taking DV to film is a less-than-ideal scenario unless you do it for a specific production purpose. In any case, discuss your goals with the transfer folks for any hints they might have on lighting and shooting for that purpose.