
Article Information

  • Title: Digital Film
  • Author: Albert J. Klee
  • Journal: PSA Journal
  • Print ISSN: 0030-8277
  • Year of publication: 2001
  • Issue: August 2001
  • Publisher: PSA Photographic Society of America

Digital Film

Albert J. Klee

"O Villain! Thou hast stolen both mine office and my name"

Although I apologize to the Bard for filching the above quotation from The Comedy of Errors, a long-time annoyance to me has been the term "digital film," used for the removable storage cards for digital cameras, such as CompactFlash and SmartMedia. For one thing, traditional film both records the image and stores it; "digital film" only stores the image. Film photographers have learned to deal with the nature of film and understand the jargon of terms such as grain, contrast, resolution, latitude, sharpness, saturation, etc. Digital is the new boy on the block, however, so I propose in this article to highlight some of the major differences between the two.

* The Basics

Traditional and digital cameras are based on similar principles: gathering light, recording light, and transforming that light into an image -- that is the very nature of photography. The basic distinction between the two is that film cameras rely on a chemical reaction between light and film, while digital cameras rely on an electrical reaction between light and a photosensitive diode.

As most photographers know, black and white film employs silver-halide crystals to record the intensity or brightness of light. When the film is exposed, the light changes the crystals chemically, causing them to bind together in clumps. The more light that hits the film, the greater the clumping. The film now contains a chemical record of the light, which then is translated into an image during the developing and printing processes.

Digital cameras, on the other hand, use a solid-state device called an image sensor. These fingernail-sized objects contain millions of photosensitive diodes called "photosites." Each photosite is one element of the whole picture that is formed; thus it is commonly called a picture element, or "pixel" for short. When the shutter is open, each photosite records the intensity of the light that falls on it by accumulating a charge; the more light, the higher the charge. The charge recorded by each photosite is then stored as a set of numbers that can be used to set the brightness and color of dots on a computer screen or printed page to reconstruct the image.
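For readers who like to see this in concrete terms, the short Python sketch below mimics that last step: a small array of accumulated charges is scaled into 8-bit brightness numbers. The array size, the charge values, and the full-well figure are illustrative assumptions, not measurements from any real sensor.

```python
import numpy as np

# A minimal sketch (not any camera's actual firmware): each photosite
# accumulates a charge proportional to the light that falls on it, and the
# camera maps those charges to numbers that set pixel brightness on a screen.

rng = np.random.default_rng(0)

# Hypothetical 4x4 array of accumulated charges (arbitrary units);
# brighter parts of the scene produce larger charges.
charges = rng.uniform(0.0, 1.0, size=(4, 4))

# Convert each charge to an 8-bit brightness value (0 = black, 255 = white).
full_well = 1.0                      # assumed maximum charge a photosite can hold
brightness = np.round(charges / full_well * 255).astype(np.uint8)

print(brightness)
```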

* Resolution

The resolving power of film is measured simply by its ability to reproduce patterns of horizontal and vertical black lines on a white background as lines and not as uniform gray areas.

In a digital image, resolution refers to the number of pixels in the image. When a relatively small sensor array is used to create an image, the individual pixels become very apparent in the resultant picture. A common example is the jagged stairway effect produced on diagonal lines (Figure 1). In film, sharpness, although allied to resolution, is not the same thing: it relates to the transition between a light tone and an adjoining darker tone (i.e., acutance). Acutance does have a direct equivalent in digital images; film "resolution," however, does not.

[ILLUSTRATION OMITTED]
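A quick way to see the stairway effect for yourself is to rasterize an ideal straight line onto a coarse grid of pixels, as in the Python sketch below. The grid size and line slope are arbitrary choices made purely for illustration.

```python
import numpy as np

# A minimal sketch of the "jagged stairway" effect: a perfectly straight,
# shallow diagonal line drawn on a coarse pixel grid turns into a staircase,
# because each pixel can only be entirely on or entirely off.

width, height = 16, 8                 # hypothetical low-resolution grid
grid = np.full((height, width), ".")

slope = 0.4                           # the ideal line y = 0.4 * x
for x in range(width):
    y = round(slope * x)              # nearest whole pixel the line passes through
    grid[height - 1 - y, x] = "#"

for row in grid:
    print(" ".join(row))
```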

* Color - Film Versus Digital

The first photographs could only record black and white images, but a major breakthrough came with James Clerk Maxwell's demonstration in 1861 that color photographs could be formed using red, blue, and green filters. A tartan ribbon was photographed three times, each time with a different one of the color filters over the lens. The three images were developed and then projected onto a screen with three different projectors, each equipped with the same color filter used to take its image. When brought into register, the three images formed a full color image.

Today's color film works in much the same way. There are three emulsion layers, each one responding to a primary color. The bottom layer reacts to red, the middle layer to green, and the top layer to blue light. Coupler dyes, each one designed to react with only one of the emulsion layers, combine with the silver-halide crystals. The resulting red, green, and blue mix to approximate the actual color of the light that first hit the film.

Although filters are also used with digital sensors, the methods vary, including the following:

(a) Three separate image sensors can be used, each with its own filter. This way, each image sensor captures the image in a single color.

(b) Three separate exposures can be made, changing the filter for each one. This way, the three colors are 'painted' onto the sensor, one at a time.

(c) Filters can be placed over individual photosites so each can capture only one color.

However, method (a) is expensive and method (b) slows down the process. Therefore, most consumer-level digital cameras use method (c).

However, when small filters are placed directly over individual photosites on the sensor, the optical resolution of the sensor is reduced by one-third. This is because each of the available photosites records only one of the three colors. On the typical sensor there are twice as many green photosites as red and blue. This is because human vision, which we want to emulate, is most sensitive to the green wavelengths, and also because the green readings are used to compute brightness.

To create a full-color image, interpolation is used, i.e., the colors of neighboring pixels are used to calculate the two colors a photosite didn't record. By combining these two interpolated colors with the color measured by the site directly, the original color of every pixel is calculated. The calculations required to perform this process use complicated mathematical algorithms.

[ILLUSTRATIONS OMITTED]
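To make the mosaic-and-interpolation idea concrete, the Python sketch below builds a toy Bayer-style filter layout (with, as noted above, twice as many green sites as red or blue), samples a tiny synthetic image through it, and then estimates the two missing colors at each photosite by averaging neighboring sites. The layout and the simple neighbor averaging are illustrative assumptions; real cameras use far more sophisticated algorithms.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample a full-color image the way a Bayer-filtered sensor would:
    each photosite keeps only one of the three colors."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    channel = np.zeros((h, w), dtype=int)    # which color each photosite records
    channel[0::2, 1::2] = 0                  # red
    channel[0::2, 0::2] = 1                  # green (twice as many green sites)
    channel[1::2, 1::2] = 1                  # green
    channel[1::2, 0::2] = 2                  # blue
    for c in range(3):
        mask = channel == c
        mosaic[mask] = rgb[..., c][mask]
    return mosaic, channel

def demosaic(mosaic, channel):
    """Estimate the two missing colors at each photosite by averaging the
    neighboring photosites that did record that color."""
    h, w = mosaic.shape
    out = np.zeros((h, w, 3))
    for y in range(h):
        for x in range(w):
            for c in range(3):
                if channel[y, x] == c:
                    out[y, x, c] = mosaic[y, x]          # measured directly
                else:
                    ys = slice(max(0, y - 1), min(h, y + 2))
                    xs = slice(max(0, x - 1), min(w, x + 2))
                    neighbors = mosaic[ys, xs][channel[ys, xs] == c]
                    out[y, x, c] = neighbors.mean()      # interpolated
    return out

# Tiny synthetic test image: a red-to-blue gradient.
h, w = 6, 6
rgb = np.zeros((h, w, 3))
rgb[..., 0] = np.linspace(1, 0, w)
rgb[..., 2] = np.linspace(0, 1, w)

mosaic, channel = bayer_mosaic(rgb)
reconstructed = demosaic(mosaic, channel)
print("original row:     ", np.round(rgb[2], 2).tolist())
print("reconstructed row:", np.round(reconstructed[2], 2).tolist())
```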

When interpolation is used, there has to be enough information in the surrounding pixels to contribute color information. This isn't always the case. Low-resolution image sensors have a problem called "color aliasing" that occurs when a spot of light in the original scene is only big enough to be read by one or two pixels. Surrounding pixels don't contain any accurate color information about the spot, so its color may show up as a dot of color disconnected from the surrounding image. Color aliasing is an "artifact" -- a term increasingly popular when discussing digital image quality that refers to any effect produced by a digital process that is not present in the original image.

Another form of color aliasing shows up as out-of-place color fringes surrounding otherwise sharply defined objects. When blown up, the resulting pattern, particularly on diagonal lines, is an unreal mosaic of colors that results from averaging between adjacent pixels. A less-than-perfect workaround for this problem is to desaturate the color in specific areas of an image where fringing is apparent. The creation of color is an important area where film and digital differ significantly.

* Noise

Next to resolution, noise is perhaps the most important sensor specification. We are all familiar with the idea of noise in sound, but electronic noise, including fixed-pattern noise, is present all the time in digital sensors as well. Therefore, just before an exposure is made, a digital camera will make what is known as a "dark reading" to ascertain the dark signal from each photosite in its sensor array. When the actual exposure is made, the camera then subtracts the dark noise signal from the exposure signal. This is one of the reasons that there is always a momentary delay between the time when you press the shutter button and when the picture is actually made. The dark noise subtraction method is never perfect, however, particularly when the sensors are warm. Thus, digital cameras are more susceptible to heat than are film cameras. (Some astronomical digital cameras actually have freezers built into them in order to keep their sensors cool.)
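The Python sketch below imitates the dark-reading procedure just described: a synthetic fixed-pattern dark signal is added to a scene, a dark frame is read out, and the two are subtracted. The noise levels and array size are invented for illustration, and the random noise present in both frames is what keeps the correction from ever being perfect.

```python
import numpy as np

# A minimal sketch of dark-frame subtraction; the numbers are made up.
rng = np.random.default_rng(1)
shape = (4, 6)

fixed_pattern = rng.normal(10, 2, shape)       # per-photosite dark signal (assumed)
scene = np.full(shape, 100.0)                  # true signal from the scene
scene[:, :3] = 40.0                            # darker left half of the frame

# Dark reading taken just before the exposure, then the exposure itself;
# each readout also picks up a little random noise.
dark_frame = fixed_pattern + rng.normal(0, 1, shape)
exposure = scene + fixed_pattern + rng.normal(0, 1, shape)

corrected = exposure - dark_frame              # removes most, but not all, of the noise
print(np.round(corrected, 1))
```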

Smaller sensor elements gather less light and thus have a higher noise-to-signal ratio. Also, more noise will appear in low-light situations, where the noise-to-signal ratio is at its highest. Finally, attempts to reduce the noise by mathematical manipulation increase the processing time needed to produce the image. This is yet another difference between film and digital.
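As a rough illustration of the first point, the sketch below assumes that the dominant random noise is photon shot noise, which follows Poisson statistics, so the signal-to-noise ratio grows roughly with the square root of the number of photons a photosite collects; a site that gathers fewer photons is therefore relatively noisier. The photon counts used are arbitrary examples.

```python
import numpy as np

# A minimal sketch: simulate photon counts for a dim (small) photosite and a
# bright (large) one and compare their signal-to-noise ratios. Poisson shot
# noise is an assumption here; real sensors have other noise sources as well.
rng = np.random.default_rng(2)

for photons in (100, 10_000):                  # few vs. many photons collected
    samples = rng.poisson(photons, size=100_000)
    signal = samples.mean()
    noise = samples.std()
    print(f"mean photons {photons:>6}: SNR ~ {signal / noise:.1f} "
          f"(sqrt(n) = {photons ** 0.5:.1f})")
```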

* Blooming

Blooming, or light spillover, is a problem caused by photons spilling from one sensor element to another, creating what can be a whole region of overfill and resulting in highlight blowout and/or weird color in these areas. The effect is especially noticeable when photographing objects where specular highlights are present, such as shiny metal. Larger sensor elements can collect and contain the photons better than the smaller ones found in most of today's consumer-level digital cameras.
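The toy simulation below shows the basic mechanism: one photosite receives far more charge than its "full well" can hold, and the excess spills into its neighbors, blowing out a whole region. The full-well value, the charge figures, and the equal left-right spill rule are illustrative assumptions, not real sensor physics.

```python
import numpy as np

# A minimal sketch of blooming along one row of photosites.
full_well = 255.0
collected = np.array([10.0, 20.0, 900.0, 30.0, 10.0])   # specular highlight at index 2

stored = collected.copy()
for i, q in enumerate(collected):
    excess = q - full_well
    if excess > 0:
        stored[i] = full_well                 # the site itself clips ("blows out")
        for j in (i - 1, i + 1):              # excess charge leaks into the neighbors
            if 0 <= j < len(stored):
                stored[j] += excess / 2

readout = np.minimum(stored, full_well)       # overfilled neighbors blow out too
print(readout)                                # [ 10. 255. 255. 255.  10.]
```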

Some sensors have anti-blooming gates built into them. These typically occupy about 30% of the pixel area, leaving a 70% fill factor (the fill factor is the percentage of the sensor element that gathers photons; see Figures 4 and 5). The reduced sensitivity means that the exposure must be almost twice as long to get the same signal level as with a sensor lacking the anti-blooming feature. In addition, the area of the sensor occupied by the anti-blooming gate leaves a significant gap between pixels, reducing the effective resolution of the sensor, and the signal-to-noise ratio is also reduced. Again, this is another significant difference between film and digital.

[ILLUSTRATIONS OMITTED]

* Post-Exposure to Storage

After film is exposed, it must be processed, after which the resultant image is stored on the film itself. This differs from a digital camera where image recording and storage are on separate media. The charge on the sensor must be transferred to an analog-to-digital (A/D) converter that counts the individual electrons that constitute the charge. (Unfortunately, noise is also introduced during the analog-to-digital conversion.) This number is then assigned a digital equivalent, expressed only in zeros and ones (e.g. 01100101), or bits, the fundamental language in which all computers function.
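The short sketch below shows what that digitizing step amounts to: a charge value is scaled to an integer level and written out as a string of bits. The 8-bit depth, the full-well figure, and the sample charge are assumptions chosen for illustration; real cameras typically digitize to 10, 12, or more bits.

```python
# A minimal sketch of analog-to-digital conversion for a single photosite.
full_well = 40_000.0                 # assumed electron capacity of a photosite
bit_depth = 8                        # assumed; real converters use more bits

charge = 15_800.0                    # example electron count from the A/D converter
level = int(round(charge / full_well * (2 ** bit_depth - 1)))

print(level)                         # 101
print(format(level, "08b"))          # 01100101 -- the zeros-and-ones form noted above
```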

These numbers are then moved into the camera's computer where a series of mathematical algorithms convert them into a single computer file, which is then moved to the camera's storage device (Figure 6). These processing and data transfers take time, but nowhere near as long as it takes to get one's film back from the processor. Thus, another difference between film and digital, but in this case, score one for digital!

[ILLUSTRATION OMITTED]

Albert J. Klee, Milford, Ohio

COPYRIGHT 2001 Photographic Society of America, Inc.
COPYRIGHT 2001 Gale Group
