But the truth is, we’re probably better off the way things are. To find out about space cameras, we got in touch with Noam Izenberg, a planetary scientist working on the MESSENGER probe, which is now circling Mercury taking pictures. He told us there are basically two reasons space photography is mostly in black and white. The first, as you rightly suppose, is that grayscale images are often more useful for research.

In principle, most digital cameras, including cheap Walmart models in addition to the custom-built jobs on space probes, are monochrome, or more accurately panachrome. Each of the pixel-sized receptors in a digital camera sensor is basically a light bucket; unmodified, their combined output is simply a grayscale image generated from all light in the visible spectrum and sometimes beyond. To create a color image, each pixel on a typical earthbound camera has a filter in front of it that passes red, green, or blue light, and the camera’s electronics add up the result to create the image we see, similar to a color TV. In effect, filtering dumbs down each panachrome pixel so that it registers only a fraction of the light it’s capable of seeing. Granted, the human eye works in roughly the same way. The fact remains, in an earthbound camera, some information is lost.

Space cameras are configured differently. They’re designed to measure not just all visible light but also the infrared and ultraviolet light past each end of the visible spectrum. Filtering is used primarily to make scientifically interesting details stand out. “Most common planetary camera designs have filter wheels that rotate different light filters in front of the sensor,” Izenberg says.
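To make the filter arithmetic concrete, here’s a minimal sketch in Python with NumPy. It contrasts three idealized readout schemes: a bare panachrome sensor, an earthbound per-pixel filter mosaic, and a filter-wheel camera that shoots one full-frame grayscale image per filter. The scene array, band names, and mosaic pattern are illustrative assumptions for the sketch, not MESSENGER’s actual design.

```python
import numpy as np

# Hypothetical 4x4 scene: light intensity per pixel in three bands
# (red, green, blue). Purely illustrative data.
rng = np.random.default_rng(0)
scene = rng.random((4, 4, 3))  # scene[y, x, band]

# Monochrome ("panachrome") sensor: every pixel is a light bucket
# that sums all bands, so no light is discarded.
mono = scene.sum(axis=2)

# Earthbound color sensor: a mosaic of per-pixel filters (a
# Bayer-style pattern) lets each pixel see only one band -- roughly
# a third of the light it could otherwise register.
pattern = [[1, 0],   # G R   (band indices: 0=R, 1=G, 2=B)
           [2, 1]]   # B G
bayer = np.empty((4, 4))
for y in range(4):
    for x in range(4):
        band = pattern[y % 2][x % 2]
        bayer[y, x] = scene[y, x, band]

# Filter-wheel camera: one full-resolution grayscale frame per
# filter, composited into color afterward on the ground.
frames = {band: scene[:, :, i] for i, band in enumerate("RGB")}
composite = np.stack([frames["R"], frames["G"], frames["B"]], axis=2)

print("light captured, bare sensor :", mono.sum())
print("light captured, filter mosaic:", bayer.sum())
```

Running this shows the mosaic registers only about a third of the light the bare sensor does, which is the “dumbing down” Izenberg describes, while the filter-wheel approach keeps every pixel at full resolution in each exposure at the cost of taking the frames sequentially.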