What are Area Array and Linear Sensors?

Q: What are Area Array and Linear Sensors?

Hand a group of camera or scanner designers a theory and a box of components and you'll see fireworks. They will explore every possible combination to see which works best. The market determines the eventual winners in this "throw them against the wall and see what sticks" approach. At the moment, designers have two types of components to play with: area-array sensors and linear sensors.

Area-array Sensors
Most cameras use area-array sensors, with photosites arranged in a grid, because they cover the entire image area and can capture the whole image at once.

These area-array sensors can be incorporated into a camera in a variety of ways.

-- One-chip, one-shot cameras use a different color filter over each photosite to capture all three colors with a single exposure. This is the most common form of image sensor used in consumer-level digital cameras.

-- One-chip, three-shot cameras take three separate exposures: one each for red, green, and blue. A different colored filter is placed in front of the image sensor for each color. These cameras cannot photograph moving objects in color (although they can in black & white) and are usually used for studio photography.

-- Two-chip cameras capture chrominance using one sensor (usually equipped with filters for red light and blue light) and luminance with a second sensor (usually the one capturing green light). Two-chip cameras require less interpolation to render true colors.

-- Three-chip cameras, such as one from MegaVision, use three full-frame image sensors, each coated with a filter to make it red-, green-, or blue-sensitive. A beam splitter inside the camera divides the incoming image into three copies, one aimed at each sensor. This design delivers high-resolution images with excellent color rendering. However, three-chip cameras tend to be both costly and bulky.
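The difference between one-shot and three-shot capture on a single chip can be sketched in a few lines of Python. This is an illustrative model only (the function names and the simple RGGB layout are assumptions for the sketch, and real one-shot cameras also demosaic to interpolate the two missing colors at each photosite):

```python
def bayer_mosaic(rgb):
    """One-chip, one-shot: each photosite records only ONE color,
    laid out here in an RGGB Bayer-style pattern.
    `rgb` is a list of rows of (r, g, b) tuples; the result records,
    for each photosite, which channel it sampled and its value."""
    pattern = [["R", "G"], ["G", "B"]]   # 2x2 filter tile, repeated
    channel = {"R": 0, "G": 1, "B": 2}
    mosaic = []
    for y, row in enumerate(rgb):
        out_row = []
        for x, px in enumerate(row):
            color = pattern[y % 2][x % 2]
            out_row.append((color, px[channel[color]]))
        mosaic.append(out_row)
    return mosaic

def three_shot(rgb):
    """One-chip, three-shot: three full exposures, one per filter,
    so every photosite gets all three colors (no interpolation),
    but the subject must hold still between exposures."""
    return [[[px[c] for px in row] for row in rgb] for c in range(3)]
```

The one-shot mosaic trades color completeness per photosite for a single exposure; the three-shot version gets complete color everywhere at the cost of requiring a motionless subject.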

Linear Sensors
Scanners, and a few professional cameras, use image sensors with photosites arranged in either one row or three. Because these sensors don't cover the entire image area, the image must be scanned across the sensor, which builds up the picture row by row from the captured lines of pixels. Cameras with these sensors are useful only for motionless subjects and studio photography; however, they are widely used in scanners. A single-row linear sensor places a different color filter over the device for each of three separate exposures: one each to capture red, green, and blue.

-- Tri-linear sensors use three rows of photosites, each with a red, green, or blue filter. Because each color has its own dedicated row of photosites, colors are captured very accurately in a single pass.
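The row-by-row nature of linear capture, and the difference between a tri-linear single pass and a single-row three-pass scan, can be sketched as follows. The function names and the list-of-rows scene model are assumptions made for illustration, not a real scanner API:

```python
def line_scan(scene_rows, capture_row):
    """A linear sensor sees one line of the scene at a time; the
    image is built up by appending captured rows as the scene
    moves past the sensor."""
    image = []
    for row in scene_rows:
        image.append(capture_row(row))
    return image

def trilinear_row(row):
    """Tri-linear capture: three filtered photosite rows read the
    same scene line, so full color arrives in a single pass."""
    return [(r, g, b) for (r, g, b) in row]

def single_row_three_pass(scene_rows):
    """Single-row sensor with one filter per exposure: three
    complete scans (red, green, blue), then recombined. The scene
    must not move between passes."""
    passes = [
        [[px[c] for px in row] for row in scene_rows]  # one scan per filter
        for c in range(3)
    ]
    return [
        list(zip(passes[0][y], passes[1][y], passes[2][y]))
        for y in range(len(scene_rows))
    ]
```

Both approaches reconstruct the same image for a static scene; the tri-linear version simply does it in one pass instead of three, which is why it dominates in modern scanners.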

