CCD Technology

What in the heck is a CCD and why do I want it in my video camera anyway? Read below for a short explanation of CCD technology and just how it pertains to you.

When it comes to video cameras, or even the video industry as a whole, there are tons of buzzwords that people use to convey ideas or explain technology. One of the most popular, when it comes to selecting the right video camera, is the acronym CCD, which stands for charge-coupled device.

The charge-coupled device originally started out as a new type of computer memory circuit in the late 1960s, but it quickly found a place in the world of imaging because its silicon-based circuitry is so sensitive to light. Since that time, CCD technology has revolutionized the way we communicate and conduct business through office equipment like photocopiers and fax machines. Where CCDs have really entered the mainstream, however, is in the home and professional video communities.

Put simply, these tiny circuits act as the camera’s eye, capturing light and converting it into an electrical signal that becomes the image on tape. While that is a short, sweet and convenient explanation of how CCDs operate, to really grasp how these circuits work, and how to choose a camera with the right CCD technology for you, it helps to have a broader understanding of their form and function.

Look Into the Light

How does the CCD receive information? In a film camera, light is focused through the camera lens in varying levels of intensity. That light then hits a chemically treated piece of film, which reacts to the different intensities of light. The same basic principle is true for video cameras; however, instead of light hitting a strip of chemically treated film, it hits the charge-coupled device, which measures incoming photons on a panel of light-sensitive diodes and then converts that light information into an image. On its own, though, this simple operation can capture and transmit only a black and white image.
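The light-to-image step above can be sketched in code. This is a toy model, not how a real CCD is programmed (real CCDs shift analog charge packets off the chip row by row); all names and numbers here are invented for illustration.

```python
def capture_frame(photon_counts, full_well=1000, max_level=255):
    """Toy model: turn per-photosite photon counts into brightness levels.

    Each photosite accumulates charge in proportion to the light hitting
    it, up to a saturation point (its "full well" capacity). The charge
    is then read out and quantized to a digital brightness level.
    """
    frame = []
    for row in photon_counts:
        out_row = []
        for photons in row:
            charge = min(photons, full_well)               # photosite saturates
            level = round(charge / full_well * max_level)  # quantized readout
            out_row.append(level)
        frame.append(out_row)
    return frame

# A 2x2 patch: dark, mid-gray, bright, and over-saturated photosites.
print(capture_frame([[0, 500], [1000, 4000]]))
# -> [[0, 128], [255, 255]]
```

Note that the over-lit photosite (4000 photons) clips to the same white as the just-saturated one, which is why blown-out highlights lose all detail.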

One CCD

To process and redistribute color information, the CCD in your camera has to measure light at three separate wavelengths: red, green, and blue. In other words, it has to be able to recognize color. It does so through the use of a CCD array, a grid of light-sensitive elements (pixels) mounted together on a chip to capture image information.

From basic video surveillance cameras to high-end consumer and pro-model cameras, color imaging is processed using either one or three CCD arrays.

Single-CCD technology uses exactly what the name implies: just one CCD array to process all color information. This low-cost approach places a grid of color filters (commonly a Bayer pattern) over the face of the array, so that only red, green, or blue light ever reaches any given pixel. This simple separation allows for color images; however, it does have its drawbacks, one of which is known as aliasing.
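The filter grid works roughly like this sketch, which uses a simplified Bayer-style 2x2 tile (the pattern and function names are hypothetical, for illustration only):

```python
BAYER = [["R", "G"],
         ["G", "B"]]  # 2x2 filter tile, repeated across the sensor

def mosaic_sample(scene, pattern=BAYER):
    """Keep only the one color channel each filtered photosite measures.

    `scene` is a grid of (r, g, b) tuples describing the light at each
    point. The result records which single channel each pixel actually
    captured; the two missing channels must later be interpolated
    ("demosaiced") from neighboring pixels.
    """
    sampled = []
    for y, row in enumerate(scene):
        out = []
        for x, (r, g, b) in enumerate(row):
            f = pattern[y % 2][x % 2]
            out.append((f, {"R": r, "G": g, "B": b}[f]))
        sampled.append(out)
    return sampled

scene = [[(200, 100, 50)] * 2] * 2   # a flat brownish patch of light
print(mosaic_sample(scene))
# -> [[('R', 200), ('G', 100)], [('G', 100), ('B', 50)]]
```

Each pixel throws away two-thirds of its color information, and the interpolation step that fills it back in is what produces the edge artifacts discussed next.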

Aliasing describes what happens when color bleeds across a region of sharp contrast. For instance, say you shot a white blanket with a wide brown stripe down the center. If you were to magnify the image, you would see where the color information at the edge of the brown stripe failed to make a sharp transition; instead, individual pixels would show a rainbow effect along the edges. While it doesn’t completely degrade the overall image, aliasing is an artifact of single-CCD technology that limits the usefulness of the image. Single-CCD cameras are therefore typically used in applications like security surveillance, where pixel-perfect resolution is not a must.

As the need for high-quality video imaging grew, three-chip color cameras made their way into the market. In these cameras, a complex optical prism assembly separates the incoming light into three components: red, green, and blue. Because of the prism assembly, each of the three colors gets its own charge-coupled device. Nearly all of the light information for a given color channel is captured by its own array, and each CCD can devote all of its pixels to that one channel, unlike single-CCD technology, where a single array has to process and split all of the color information.
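In code, the three-chip payoff is that recombining the channels is trivial. A minimal sketch (function name invented for illustration):

```python
def combine_channels(red, green, blue):
    """Zip three full-resolution channel images into one RGB image.

    Unlike the single-chip mosaic, every pixel here already has all
    three color measurements, so no interpolation step is needed.
    """
    return [
        [(r, g, b) for r, g, b in zip(rrow, grow, brow)]
        for rrow, grow, brow in zip(red, green, blue)
    ]

# Each CCD behind the prism delivers a full grid for its color.
red   = [[200, 10], [0, 255]]
green = [[100, 10], [0, 255]]
blue  = [[ 50, 10], [0, 255]]
print(combine_channels(red, green, blue))
# -> [[(200, 100, 50), (10, 10, 10)], [(0, 0, 0), (255, 255, 255)]]
```

Because no channel is ever estimated from its neighbors, sharp color edges stay sharp, which is exactly the aliasing problem the single-chip design struggles with.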

A Few Words about Resolution

In the video world, image resolution is constantly scrutinized and often confusing. SD, 24-progressive, HD, HDV, DTV… there is a long list of terms and technology involved when discussing quality images. But, for the consumer market, there are two basic factors you need to know about with respect to image resolution. One is the quality of your optics, which is fodder for a completely different article. The other is… yup, you guessed it, CCD technology.

As pointed out, CCD arrays and the number of pixels in each are essential to a quality image. CCD arrays are available in varying sizes, usually from 1/6 of an inch to one full inch. One of the most popular sizes on the market today is the 1/2-inch format, which provides a resolution of 768 x 582 pixels. Since most standard definition analog televisions provide only around 480 lines of resolution, the 1/2-inch CCD array will capture and output more than enough in terms of picture quality. However, if you’re looking to capture images suitable for anything more than standard-def video, you’re entering a whole new realm, often requiring precision optics to prevent “vignetting” of your image.
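A quick back-of-the-envelope check of the figures quoted above:

```python
# Numbers from the text: a 1/2-inch CCD array vs. standard-def analog TV.
ccd_h, ccd_v = 768, 582   # CCD array resolution in pixels
tv_lines = 480            # approximate lines an SD analog TV displays

print(ccd_h * ccd_v)      # -> 446976 (roughly 0.45 megapixels)
print(ccd_v >= tv_lines)  # -> True: the CCD captures more vertical
                          #    detail than SD TV can display
```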

This brief explanation of CCD technology is by no means deep. However, it should give you enough information to help make the right decision when it comes to purchasing your next camera system.

Michael Fitzer is an Emmy award-winning commercial and documentary writer/ producer
