On January 25, 1997, representatives from six of the leading nonlinear editing companies joined the Nonlinear Editing Panel at the Videomaker Expo. They discussed the state of nonlinear editing hardware today and tomorrow. Videomaker's Executive Editor, Stephen Muratore, moderated the lively discussion. Following are the names of the panelists and a transcript of their discussion.

  • Joyce Chung, Product Manager, Adobe Systems
  • Linda Frager, Product Manager, Apple Computer
  • Christian Jorgensen, Product Manager, Fast Electronics
  • Eric Kloor, President, DraCo Systems
  • Jan Piros, Product Manager, miro Computer
  • Steve Stautzenbach, Sales Director, Ulead Systems

Stephen Muratore: Now, let’s kick it off. What are the pros and cons of analog and digital video? I’ll throw it to anybody who wants to take it.

Christian Jorgensen: I think the reality for all of us is that it’s a mixed world. There’s no pure, simple, one-way only, "it’s only linear, it’s only analog, it’s only nonlinear, it’s only digital." It’s not that simple–the lines are very blurred. And we’re in a transition process. The future is digital.

Jan Piros: Nonlinear editing is very convenient. In the past, before the computer was used, people were doing traditional cut-and-paste stuff in publishing. The computer, just by adding the "Undo" feature, has really helped a lot of people in different ways. In nonlinear editing, you can go in and do a whole bunch of cuts and edits, and experiment with your videotapes and do a lot of production things that in a linear sense would be very expensive. Digital has a lot more flexibility to do all kinds of experimenting, processing and manipulation. But the quality of the video that actually passes through the computer on its way out to a recorder and a final product–that’s what matters, and it depends on the quality of the capture board itself. Whereas, in the analog world, if you do a straight dub, the quality is there almost every time.

Linda Frager: I think the other issue for consumers is the ease of use and trying to take some of the frustrations out of making your own movies or creating videos. When you’re in the analog video world, once you put your video clips in order, that’s the order you have to keep unless you start all over. When you have nonlinear and digital, you are able to rearrange the order of your clips, and that takes some of the frustration out of dealing with video.

Steve Stautzenbach: I think that flexibility also extends to the delivery of video material, where digital allows you to edit for tape, for CD, for the Internet, for whatever else is next. This is one of the great benefits of the digital world.

Joyce Chung: I would agree with all of those statements. One other practical thing if you are going to do special effects like A/B rolls: you really need only one deck, or one video source, because once it’s digitized, you can do whatever you want with it. You don’t have to have an A deck and a B deck and then a record deck.

Christian Jorgensen: I really want to say to all of you in the linear world that it’s really important to bridge these two worlds. It’s important that we, the manufacturers, give you the bridge between the world you’re in today and the world you’re going to be in tomorrow or five years from now.

Stephen Muratore: Perhaps a couple of definitions are in order here. When we’re referring to nonlinear editing, we are fundamentally referring to video editing that’s done from the hard drive. When we are talking about linear video, we are talking about tape-to-tape editing. The latter is called linear editing because all of the shots are laid on the tape in a row. When they’re on a hard drive, you have random access to any shot in any order. There is another breed in the middle which we’ve been referring to as hybrid editing systems. This has both linear and nonlinear aspects. A hybrid system incorporates both the random-access interface and the advantages of tape-to-tape editing.

Moving on: would any of you care to address the impact that DV (the new Digital Videocassette format) has had on the nonlinear editing industry, and on your company in particular?

Christian Jorgensen: It just so happens we [Fast Electronics] have a new DV product. We’ve been working very closely with Sony on the development of some products, and one of the things that DV represents–and it’s really remarkable–is that it’s the first time we have the ability to go right from the camcorder’s tape and keep the signal digital all the way through the entire process of editing.

How many of you have a DV camcorder? The projections are that in 1997, we’ll go from the established installed base of 100,000 fitted with Firewire, to over a million. One thing I want to caution you about is to make sure that you look at DV as part of an integrated system–that you make sure that your existing technology, your existing investment in footage, camcorders, decks, and all the rest–becomes part of this new system. But for all of you, make sure that you get a DV camcorder, that’s first and foremost, because it is going to be your most flexible tool.

Joyce Chung: With a system like Premiere–which is an open environment–you can plug in new codecs that come along, like DV. Cinepak, Motion JPEG, and now this new DV format can be plugged in through QuickTime or Video for Windows. You don’t have to throw away your software or your system–you can get these new cards that plug into your system that have the Firewire connector. Now you can bypass the step of digitizing, going from analog to digital, because it is already digital. You are not going to lose any quality, and there’s no generation loss. In fact, DPS has a DV product, Fast and miro are both going to have DV products, and other companies are jumping on the bandwagon as well.

Jan Piros: It is nice to think about DV and the quality that you get, but it’s pretty pricey to get into DV. Because it is emerging technology, a lot of the components are hard to get. You’ve got Sony with the Firewire data port. They just introduced their DV VCR at this show. Three of us are making cards that will be coming on the market pretty quickly. But at the same time, Sony has such a hold on the DV technology that a lot of the cards are going to be very expensive. Right now, for instance, the Motion JPEG board is under $1000. A board based on a hardware DV codec is a little bit more than that. That’s the whole problem with it–it’s an emerging technology now–and I know a lot of you are asking, "Should I jump both feet into DV right now?" The camcorders are a perfect investment. It’s good to get right away because then you’re set up for the future; you can either use the analog output or the Firewire output, but when you get into using DV and the computer–it’s an emerging technology.

Some of the drawbacks of DV are that it is not a scalable compression ratio–it’s set at 5:1 all the time. Also, the video window that can be displayed on the computer is set at a particular resolution, so there are a number of things that we as hardware people have to really watch out for.

Steve Stautzenbach: Just to echo Joyce, your PC-based software products are all open architecture, for the most part, so your investment will just move forward. When our partners at Fast and miro deliver you a board, we at Ulead will recognize the codec. There are actually some additional things from the software side that we can move forward with as this technology matures. A number of you have used two technologies in your discussion–DV and 1394 Firewire–which are two technologies moving forward in parallel.

Jan Piros: Software is actually a very important thing coming up now, because–if any of you went to MacWorld–there was an introduction of a 500MHz machine. Intel wants to put everything on their CPU–they don’t want to have board makers come out that do specific applications; they want to sell you CPUs every six months. So what does that mean for us as digital-video people? It means that a lot of the chips being made now do compression in Motion-JPEG or DV compression using a hardware codec. A lot of that can be transferred into software and processed by the host CPU. This is another thing that you have to watch out for. The software is going to be a very important component. You won’t have to have a specific piece of hardware; you just have to get video into your computer, and the host CPU can do all of the compression and all of the processing.

Christian Jorgensen: Understand that in DV, the video is compressed at the camcorder, and once it goes on to the cassette or down the Firewire–which is nothing more than a high-speed digital transfer device–if you do a cuts-only edit, it stays compressed. What people need to be clear about is that once you add transitions, effects, titles, filters–you need to recompress the video once it’s done. I don’t think in the industry that we are being honest enough about telling you that before you get that DV board home, you’ll need the efficiency and the speed of an extremely fast gigahertz CPU, or, more realistically, a hardware codec.

Stephen Muratore: Let’s pause for another definition: Firewire and 1394. If you look at any of the Sony DV cameras, you’ll notice that there is a new kind of jack that you have never seen before, a small rectangular black jack–the Firewire output or the 1394 jack. Through that jack, you can transfer digital video to or from the camera. It’s called a digital camera because it records a digital video signal onto tape, as opposed to the analog signal recorded by most camcorders. If you want digital video to come out of the camera, it has to come out through the Firewire jack. Otherwise, if you use the regular jacks, the RCA plugs or the S-video jack, what you are getting is analog video out of a digital camera. It’s high quality, but it is still not digital. The importance of Firewire is that you can edit digitally all the way if you are able to connect that Firewire jack either to a recording deck, such as the Sony deck, which is itself a digital video deck, or through a Firewire board that you would plug into your computer, then directly onto a hard drive.

We’ve hashed over DV pretty well. How about a bit on MMX? What does MMX mean to the industry, to nonlinear editing, to your company, and to videomakers now and in the future? Eric, do you want to try first?

Eric Kloor: The MMX chip means nothing to us. We [DraCo Systems] built a little box that just edits digital video from a hard drive; it’s not a computer, and it doesn’t use any Intel hardware. We use an older Motorola chip, the 68040. In the future, we will have the DV Firewire interface board. We’re working with Sony with their DVK1 board, so we will have that codec in and out of the Casablanca. Our target market is maybe a little bit different. We get some interesting faxes from people that say, "I don’t want to throw away my computer; I’m confused by all of this." Our target market is just people who want to edit video, who want to crank out video, to make money doing productions. We target people that have work to do, and don’t want to mess with hardware installations, endless upgrades, and all that.

Linda Frager: I think the question on a lot of Macintosh owners’ minds is, "What does MMX mean for Macintosh, and what does it mean for Apple?" The Apple Macintosh has always been tailored and optimized for multimedia creation, whether it’s audio, video, or creating your own CDs and titles. So while Intel and that platform are now moving into multimedia creation, Apple’s been a multimedia creation platform for quite some time. Having said that, it’s obviously necessary to address MMX from the Macintosh side, and there are a number of new architectures and products that we’ll be announcing over the next several months that will help to address MMX.

Christian Jorgensen: I’m very excited about MMX. We talked a little here about the idea of one nice, closed video appliance box, versus a tad bit more complexity. By using the modular approach of the computer technology, and software partners and hardware partners that make the process easier, you’re going to benefit on a daily basis. We’re at the point now where we’re throwing away CPUs. It appalls me that we’re trashing equipment, but at the same time, that’s the efficiency of our society today. Realistically, a high-speed computer with all the latest speed and innovations is going to give you some dramatic benefits–MMX is one of them.

Steve Stautzenbach: MMX is all about speed. You’ll want to get an overdrive chip into your current Pentium to get there. You’ve looked at Windows video editing. You’ve loved all the filters and transitions, all the creativity. One of your foremost questions is, "How come it takes so long to render my productions?" So MMX addresses the speed question. Plug it in, and you’re going to see speed improvement right out of the chute. When we deliver our software that’s optimized for MMX, when we go back in and write code that takes advantage of all the new capabilities on that chip, you’ll see much greater speed improvement. You want Windows video editing that moves forward and delivers faster speed, higher quality, more creativity. MMX is about speed.

Joyce Chung: As far as Adobe products are concerned, we intend to support MMX for our digital video and our imaging products as well. As far as the actual increases in performance that you’re going to see, I can’t really say right now because we are still optimizing and looking at some of the benchmarks. Just keep an eye out. We’ll put out some releases and let you know when that happens.

Christian Jorgensen: One of the things we’ve heard a great deal of is, "I bought this wonderful card from you last year, and now you’re introducing the next card. I feel like you just outdated me." The point of it is, we’re going to keep bringing you the latest and greatest technology, whether it’s MMX, DV, or whatever.

Stephen Muratore: Please join me in thanking Joyce, Christian, Eric, Steve, Jan, and Linda for appearing here this morning.
