From panchromatic to hyperspectral

Earth observation in a myriad of colors

By OHB Redaktionsteam, OHB SE

Buxtehude may not be as remote as Timbuktu but it’s still in the sticks as far as most people are concerned, almost as far away as the places where pepper is grown. But where are these places? And what’s it like there? From Bremen, it is relatively easy to find out what things are like in Buxtehude given that the two places are less than one hundred kilometers apart. Timbuktu, on the other hand, is located in Africa or, to be more precise, in Mali. And pepper grows even further away, at the southern tip of India. Despite this, it’s not too difficult to find out what these places look like. Nowadays, there are probably only a few places of which there are no freely available aerial photographs. Our Earth is under constant observation. And aerial photography is just the beginning.

Eyes from space

The Earth is surrounded by satellites, many of which carry special measuring instruments for observing it. In order to understand how satellites observe the Earth, however, it is important to first understand how human sight works.

The human eye: panchromatic and multispectral

As creatures that are primarily visually oriented, humans mainly use their sense of sight to collect information about their environment. The information carrier for this purpose is the spectrum of electromagnetic radiation emitted by the sun. This ranges from hard X-rays with extremely short wavelengths (< 0.1 nanometers) to long radio waves (> 1 kilometer). However, the human eye is only designed to detect visible light, i.e. radiation with wavelengths in the range between 380 and 780 nanometers. This is where the solar spectrum reaches its greatest intensity. Some of the remaining wavelength ranges are absorbed by the Earth’s atmosphere, while others do reach the Earth’s surface but are not perceptible to the human eye. These include UV and infrared radiation, for example.

Irrespective of their perceptibility to the human eye, all types of radiation penetrating the atmosphere interact with the Earth’s surface and thus provide information about its composition. However, the human eye can only absorb a fraction of this information, since it is sensitive to just a limited part of the solar spectrum.

The function of the human eye is restricted at low light intensities. In this case, only a single type of receptor, namely the rods, is active, merely allowing the eye to perceive differences between light and dark in a range between about 400 and 600 nanometers. This means that at low light intensities it is not possible to differentiate between wavelengths and thus distinguish individual colors. The resulting images resemble those of a panchromatic camera.

What is a panchromatic camera?

A panchromatic camera works with sensors that are equally sensitive to the entire spectrum of visible light. This means that in panchromatic images the visible light reflected by objects is reproduced in different shades of gray. The gradations correspond to the human eye’s perception of brightness. Although colors cannot be reproduced, panchromatic images can achieve high spatial resolution and are therefore often used for mapping purposes.
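To make this tangible, here is a minimal sketch in Python (assuming an RGB image stored as a NumPy array) of how a color image can be collapsed into a single panchromatic-style brightness band. The Rec. 601 luma weights used here are one common approximation of the eye’s brightness perception:

```python
import numpy as np

def to_panchromatic(rgb: np.ndarray) -> np.ndarray:
    """Collapse an RGB image (H x W x 3, values 0 to 255) into a single
    brightness band, similar to the output of a panchromatic sensor.

    The Rec. 601 luma weights approximate the eye's differing
    sensitivity to red, green and blue light.
    """
    weights = np.array([0.299, 0.587, 0.114])
    return (rgb.astype(float) @ weights).astype(np.uint8)

# Toy example: one red, one green, one blue and one white pixel
rgb = np.array([[[255, 0, 0], [0, 255, 0]],
                [[0, 0, 255], [255, 255, 255]]], dtype=np.uint8)
print(to_panchromatic(rgb))  # green reads brighter than red or blue
```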

If there is sufficient illumination, however, the eye works as a multispectral sensor with three relatively wide spectral recording bands. The retina inside the eye has three different subtypes of color receptors, the cones, whose visual pigments have absorption maxima at different wavelengths but overlap in their sensitivity. In the brain, the activation pattern of the individual cone types caused by the composition of the incoming radiation is processed and translated into a color image. A violet color impression is created by the shortest perceptible wavelengths (approx. 380 to 420 nanometers) and a red color impression by the longest (approx. 600 to 780 nanometers). In between lie the wavelength ranges of all the other color impressions. It should be noted, however, that a given color impression can be created both by light of a single specific wavelength and by light of mixed wavelengths. For example, the color impression “yellow” can be created by yellow light on the one hand but also by a mixture of red and green light on the other.
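The “yellow” example is easy to reproduce in code. On an RGB display, additively mixing full red and full green produces a pixel value that is rendered as yellow, even though no light of a “yellow” wavelength is involved (a toy sketch, not tied to any particular instrument):

```python
import numpy as np

# Additive color mixing as performed by an RGB display
red   = np.array([255, 0, 0])
green = np.array([0, 255, 0])
print(red + green)  # [255 255 0], which a display renders as yellow
```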

Technical sensors for the electromagnetic spectrum

Technical multispectral instruments usually have a higher spectral resolution than the human eye, but a standard digital camera also counts as a multispectral instrument. Like the human eye, the sensors in a digital camera can differentiate between blue, green and red light and thus reproduce the visual impression of the eye. For Earth observation, however, far more complex multispectral instruments with up to about fifteen color bands are used. They typically detect different wavelength ranges of visible light and infrared radiation.

The data collected by the individual bands is first recorded separately by color-insensitive sensors and can be displayed and evaluated in the form of grayscale images. In addition, spectrally different images of the same area can be combined into color images using mathematical algorithms. How the recorded information can be displayed is in turn determined by the way human sight works.
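As an illustrative sketch of this combination step (assuming three co-registered grayscale band images held as NumPy arrays), the bands can be stacked along a third axis and stretched to the displayable range of 0 to 255:

```python
import numpy as np

def compose_rgb(band_r, band_g, band_b):
    """Stack three co-registered grayscale band images (H x W each)
    into an H x W x 3 color composite, stretching each band to 0..255."""
    def stretch(band):
        band = band.astype(float)
        lo, hi = band.min(), band.max()
        return (255 * (band - lo) / (hi - lo + 1e-9)).astype(np.uint8)
    return np.dstack([stretch(band_r), stretch(band_g), stretch(band_b)])

# Toy example with three random 4 x 4 "band" images
rng = np.random.default_rng(0)
red, green, blue = (rng.integers(0, 1024, (4, 4)) for _ in range(3))
composite = compose_rgb(red, green, blue)
print(composite.shape)  # (4, 4, 3)
```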

Where images have been collected in wavelength ranges not visible to humans, it is necessary to assign the data to a spectral range perceptible to the human eye. This entails allocating a specific color to the data in order to visualize it. In principle, it doesn’t matter which color is assigned to which wavelength, although in practice certain common display methods have become established. In most cases, man-made structures are recorded and represented in their natural color range, while images in spectral ranges invisible to the human eye often use “false colors” for the targeted analysis of soils, vegetation and water bodies.

What is a false color image?

Satellite images are often displayed in false colors. This is because satellite instruments in part cover spectral ranges that are invisible to the human eye. In addition, the human eye can differentiate only a few hundred levels of brightness within a single color but is able to distinguish more than a million different color shades. In false color representations, unnatural colors are therefore assigned to individual spectral ranges in order to render details more visible.
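One widespread assignment is the “color-infrared” composite, in which near-infrared data is displayed as red, red as green and green as blue. The sketch below uses illustrative random arrays in place of real band data; the variable names are placeholders, not any mission’s actual band labels:

```python
import numpy as np

# Illustrative reflectance arrays; in practice these would come from a
# multispectral product (the band names here are placeholders).
rng = np.random.default_rng(1)
nir, red, green = (rng.random((4, 4)) for _ in range(3))

# Classic "color-infrared" false-color assignment:
#   near-infrared -> displayed as red
#   red           -> displayed as green
#   green         -> displayed as blue
false_color = np.dstack([nir, red, green])

# Healthy vegetation reflects strongly in the near infrared and
# therefore appears bright red in such a composite.
print(false_color.shape)  # (4, 4, 3)
```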

From panchromatic to hyperspectral

The main differences between panchromatic, multispectral and hyperspectral data acquisition lie in the width and number of recording bands. Whereas panchromatic sensors work with a single wide recording band, multi- and hyperspectral instruments have a larger number of narrower recording bands to increase spectral resolution. Hyperspectral sensors can have up to several hundred closely adjacent color bands.
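The difference is easy to picture in terms of array shapes. The following sketch uses NumPy arrays as stand-ins for sensor output; the band counts are merely illustrative, with 239 chosen to match the PRISMA figures given further below:

```python
import numpy as np

rows, cols = 100, 100

# Panchromatic: one wide band -> a single 2-D brightness image
pan = np.zeros((rows, cols))

# Multispectral: a handful of bands -> a shallow 3-D data cube
multi = np.zeros((rows, cols, 13))

# Hyperspectral: hundreds of narrow, contiguous bands -> a deep cube
hyper = np.zeros((rows, cols, 239))

# Each hyperspectral pixel holds a near-continuous spectrum:
spectrum = hyper[50, 50, :]
print(pan.shape, multi.shape, hyper.shape, spectrum.shape)
```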

With such instruments, a continuous spectrum and thus a kind of “spectral fingerprint” of objects can be recorded. Thus, for example, different types of vegetation and different soil conditions can be differentiated on the basis of their characteristic absorption and reflection properties. It is even possible to distinguish individual plant species or rock compositions. In addition, the state of health of vegetation can be determined on the basis of the reflection properties in the infrared range: healthy plants produce the leaf pigment chlorophyll, which reflects six times more strongly in the infrared range than in the visible (and especially green) spectrum.
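This behavior in the infrared range is exactly what widely used vegetation indices build on. Below is a minimal sketch of one such index, the Normalized Difference Vegetation Index (NDVI), which compares reflectance in the near-infrared and red bands; it is a generic example, not specific to the missions described here:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: values near +1 indicate
    dense, healthy vegetation; bare soil and water score near or below 0."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + 1e-9)

# Toy reflectance values: a healthy leaf versus bare soil
print(ndvi(np.array([0.6]), np.array([0.1])))   # ~0.71 -> vegetated
print(ndvi(np.array([0.25]), np.array([0.2])))  # ~0.11 -> sparse/soil
```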

This data is mainly used in geographical remote sensing and environmental sciences. Since the differences in the absorption spectra of the individual materials are not perceptible to the human eye, the image is displayed in false colors.

Hyperspectral projects at OHB

PRISMA

PRISMA (PRecursore IperSpettrale della Missione Applicativa) is the technology demonstrator for a new generation of Earth observation satellites. The satellite carries on board a new type of electro-optical payload that combines a medium-resolution panchromatic camera with a hyperspectral sensor. The panchromatic camera makes it possible to capture the geometry of the landscape, while the hyperspectral sensor provides information about its chemical and physical composition. Combining the two instruments facilitates the interpretation of the data sets recorded with the hyperspectral sensor. The resulting images can then be used for environmental monitoring, resource management, crop classification and other applications. The purpose of PRISMA is to validate this principle.

PRISMA’s observation design is based on the line scan camera principle (pushbroom or along-track scanning): the satellite scans the terrain line by line at right angles to the flight direction. The hyperspectral sensor of the payload has 239 recording bands with a spectral resolution of less than 12 nanometers in a range between 400 and 2,500 nanometers (visible light to short-wave infrared). The spatial resolution is 30 meters with a swath width of 30 kilometers. The spatial resolution of the panchromatic data recorded simultaneously is 5 meters.
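These figures can be cross-checked with a little arithmetic. The sketch below takes the quoted numbers at face value; note that the average band spacing is only an approximation of spectral resolution, and the bits-per-sample value in the data estimate is a hypothetical assumption rather than a PRISMA specification:

```python
# Figures quoted above for PRISMA's hyperspectral sensor
bands = 239
lambda_min_nm, lambda_max_nm = 400, 2500
swath_m, pixel_m = 30_000, 30

# Average spectral sampling interval, consistent with "< 12 nanometers"
print((lambda_max_nm - lambda_min_nm) / bands)  # ~8.8 nm per band

# Pixels per scan line across the 30-kilometer swath
pixels_per_line = swath_m // pixel_m
print(pixels_per_line)  # 1000

# Rough raw data volume per scan line, assuming a hypothetical
# 12 bits per sample (not a PRISMA specification)
bits_per_sample = 12
megabytes = pixels_per_line * bands * bits_per_sample / 8 / 1e6
print(round(megabytes, 2))  # ~0.36 MB per line
```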

The PRISMA project is being financed entirely by the Italian Space Agency ASI. The satellite platform was developed and built by OHB Italia, with the payload supplied by Leonardo Airborne and Space Systems. On March 22, 2019, PRISMA was launched from the European spaceport in Kourou on board a Vega launcher.

EnMAP

EnMAP is another highly complex Earth observation satellite currently being assembled at OHB. The hyperspectral instrument that the satellite will carry on board consists of two imaging spectrometers with a total of 242 recording bands in a wavelength range from 420 to 2,450 nanometers. The spectral resolution is 6.5 nanometers in the visible and near-infrared range and 10 nanometers in the short-wave infrared range. This will enable the satellite to record the solar radiation reflected from the Earth’s surface, from visible light to short-wave infrared, as continuous spectra. In this way, it will be possible to generate high-resolution spectral images that allow quantitative findings about the mineralogical composition of rocks, the damage caused to plants by air pollutants, the water quality of lakes and coastal waters or the degree of soil pollution. The geosciences, agricultural sciences and environmental sciences will all benefit from this.

The satellite can be tilted by up to 30 degrees, allowing any point on the Earth’s surface to be revisited within a maximum of four days. As with PRISMA, the spatial resolution is 30 meters. EnMAP can thus also be used to document comparatively rapid spatio-temporal changes, such as erosion processes or vegetation periods.

The German Research Centre for Geosciences (GFZ) in Potsdam is responsible for the scientific management of the EnMAP project, while overall project management lies in the hands of the DLR Space Administration. OHB is engineering the satellite platform and the hyperspectral instrument. The launch of the satellite is scheduled for the beginning of 2021, and the scientific mission is to last five years.