Editing video clips may seem simple, but people are always surprised by the amount of time it requires. The idea of automatic video editing is quite old, but it wasn’t until 2001 that a company named muvee announced the launch of the world’s first smart automatic video editing software. Muvee autoProducer was designed for anyone who wanted a basic tool for automatically editing video. Users chose the video files they wanted to include, picked music, selected a production ‘style’ and then clicked “make muvee”. Muvee’s algorithm analyzed the video for shot boundaries, faces and more; the music was analyzed for beat and tempo, and that analysis drove smart automatic edit decisions for each video.
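The beat-and-tempo idea can be illustrated with a toy sketch: given a list of beat timestamps from music analysis and a list of detected shot boundaries, snap each cut to its nearest beat so the clips change in time with the music. The beat and shot numbers here are hypothetical stand-ins for what a real analysis engine would produce.

```python
def snap_cuts_to_beats(shot_boundaries, beats):
    """Move each shot boundary to the nearest beat time."""
    return [min(beats, key=lambda b: abs(b - cut)) for cut in shot_boundaries]

beats = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]  # seconds, from tempo analysis
shot_boundaries = [0.4, 1.3, 2.7]            # seconds, from shot detection
print(snap_cuts_to_beats(shot_boundaries, beats))  # → [0.5, 1.5, 2.5]
```

Real products weigh many more signals (faces, motion, audio energy), but the principle is the same: let the music’s rhythm decide where the cuts land.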
A few years later, Pinnacle Studio included an automatic editing feature. The feature, named SmartMovie, let users simply select a series of clips and leave Pinnacle Studio to do the rest: trimming and arranging them, cutting them to music and even adding effects and titles, all automatically.
More recently, two other similar products were announced that use smart algorithms. HighlightCam was built for the iPhone and iPod Touch and now works with Android phones as well; its app lets users load in their videos, answer a few short questions, and have its Amazon cloud-driven service create the video.
Around the same time, a similarly intriguing product named Magisto was launched. Users upload as many as 16 video clips for editing, and with the click of a button Magisto tries to find the best footage in the videos and edits them into short movies ready to be shared. Magisto says its video editing tool is even capable of inferring a user’s “intent” by analyzing the sounds and images in the video.
While all of these solutions have made editing video clips easy, none of these software packages knows which clips are the most enjoyable to watch. Humans still need to watch all of the video and choose their favorite clips. A new technology on the horizon claims to make this much easier. EmoLens leverages the Emotiv EPOC headset’s ability to detect emotions, through facial expressions and subtle monitoring of the wearer’s mental state, to automatically index photos as the user browses them on Flickr. Photos that spark strong emotions are automatically tagged not only with relevant keywords but also with the emotions the EPOC headset detects. Later the user can search for photos by recalling either a feeling or a keyword. This technology is expected to be available for editing video clips in the near future.
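The tag-and-search idea behind such emotion indexing can be sketched in a few lines: each photo carries both keyword tags and a detected emotion label, and one index answers searches by either kind of tag. The photo names and emotion labels below are hypothetical; in a real system a headset like the EPOC would supply the emotion reading.

```python
def build_index(photos):
    """Map every keyword and emotion tag to the set of photos carrying it."""
    index = {}
    for name, tags in photos.items():
        for tag in tags:
            index.setdefault(tag, set()).add(name)
    return index

photos = {
    "beach.jpg":    {"vacation", "ocean", "joy"},
    "birthday.jpg": {"family", "cake", "joy"},
    "storm.jpg":    {"weather", "awe"},
}
index = build_index(photos)
print(sorted(index["joy"]))  # search by feeling → ['beach.jpg', 'birthday.jpg']
```

Because emotions and keywords live in the same index, “find the photos that made me happy” is just another tag lookup.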
All of this technology means that much of the work associated with editing video can be done by the computer with little or no human interaction. Experienced video editors will still be very “hands-on” as they practice their craft, but these tools will continue to give video editors of all skill levels improved access to the tools of the trade.
Matthew York is Videomaker’s Publisher/Editor.