Oct 18, 2013


The basics of spectrometry


Radiation principles
Atmospheric interference
Image analysis
Properties of remotely sensed data
Transmitters and receivers
Geometrical correction

Optical imaging sensors are among the most widely used technologies in the field of remote sensing. Almost every remote sensing application nowadays uses optical sensors that can collect data in the visible bandwidths, but also in the non-visible parts of the electromagnetic spectrum, like infrared. The visible data is of course very intuitive for humans, and it is easy to interpret this data. But spectral data that go beyond the human eye can provide very useful information in many applications, especially when multiple bands can be observed at the same time. This is the field of multispectral imaging.

Multispectral imaging is a technology originally designed for aerospace applications. It operates mostly in the optical region of the EM spectrum, traditionally defined as radiation with wavelengths between 0.4 and 15 μm. This range is made up of [T.A. Warner, M. Duane Nellis, G.M. Foody, 2009]:

VIS; the visible range, 0.4-0.7 μm.

NIR; the near infrared range, 0.7-1.1 μm.

SWIR; the short wave infrared range, 1.1-2.5 μm.

MWIR; the midwave infrared range, 2.5-7.5 μm.

LWIR; the long wave infrared range, 7.5-15 μm.

While it is possible to design sensors that span this entire range, it is more practical to limit the sensor to one or two of these regions.

A digital image is made up of a matrix of pixels. A pixel is defined as a two-dimensional picture element that is the smallest nondivisible element of a digital image [John R. Jensen, 2007]. Each pixel has an original Brightness Value or Digital Number associated with it; in essence its information. The information collected by the satellite is often stored in a value range of 8 to 12 bits. The analog-to-digital conversion is called digitization. Remote sensor data of 8 bits can have a value ranging between 0 and 255; data of 12 bits can range between 0 and 4095.
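Digitization as described above can be sketched as a simple mapping from a continuous radiance value onto an integer Digital Number. The radiance range and sample values below are illustrative assumptions, not calibration figures from any real sensor:

```python
# Sketch of digitization: quantize a continuous radiance value into a
# Digital Number (DN). Radiance bounds here are made-up example values.

def to_dn(radiance, r_min, r_max, bits):
    """Map radiance in [r_min, r_max] onto an integer DN in [0, 2**bits - 1]."""
    levels = 2 ** bits - 1
    frac = (radiance - r_min) / (r_max - r_min)
    # Clamp out-of-range radiances to the sensor's representable span.
    return round(min(max(frac, 0.0), 1.0) * levels)

# An 8-bit sensor resolves 256 levels (0-255); a 12-bit sensor 4096 (0-4095).
dn8 = to_dn(100.0, 0.0, 100.0, 8)    # full-scale radiance -> 255
dn12 = to_dn(100.0, 0.0, 100.0, 12)  # same radiance at 12 bits -> 4095
```

The same scene radiance thus maps to a much finer grid of levels at 12 bits, which is the radiometric-resolution gain discussed later in this document.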

The EM field incoming at the sensor is then resolved spatially, spectrally, radiometrically and temporally to produce a (digital) image of the radiation emanating from the observed location. These four classifications are subdivisions of a very important aspect of remotely sensed (satellite) data: resolution.

Resolution refers to an “imaging system’s capability of resolving two adjacent features or phenomena” [T.A. Warner, M. Duane Nellis, G.M. Foody, 2009]. There are, as stated above, four types of resolution: spatial, spectral, radiometric and temporal.


Spatial and spectral resolution are the most important aspects of our imager. The best spatial resolution (the closest distance between two points on the ground that can still be distinguished), as stated by the Rayleigh criterion, is determined by the size of the aperture D, the wavelength of light λ, and the height h of the sensor relative to the ground:

R = 1.22 * λ * h / D

This is a good measure for the maximum potential ground resolution. This limit is due to the diffraction of light waves. Also, the size of the detector element of the sensor can determine the ground resolution. This has to do with the angular Instantaneous Field of View (IFOV). The Field of View (FOV) is the angle for which all light waves come through the aperture or field stop. The IFOV is an angle, and within this angle all light waves will actually hit the focal plane, at which the detection elements are located. The IFOV is defined by the ratio of the field stop diameter d over the focal length f:

IFOV = d / f

Figure: definitions of the IFOV, field stop and focal plane.

The resolution size of the pixel on the ground is then given by:

X = h * IFOV = h * d / f

In most cases the term Ground Instantaneous Field of View (GIFOV) is used if we’re dealing with a large distance to the ground.

Most well-engineered optical systems are limited by the detector field stop, and not by diffraction, so diffraction effects matter most at the longest wavelength collected by the sensor. Furthermore, it is very important for the user of remote sensing imagery to be aware that a system can have a spatial resolution that differs between spectral bands, due to its design and diffraction effects [T.A. Warner, M. Duane Nellis, G.M. Foody, 2009].

With the formula for the resolution size of the pixel known, we can now make a rough sketch of the resolution we could achieve with our nanosatellite. One of the demands of this study is that the satellite must have a lifespan of at least five years. The lifetime of a satellite can be estimated using the following formula [R. Noomen, Q.P. Chu, A. Kamp, 2007]:

L = H / |Δa|

Where H is the atmospheric density scale height at orbit height, and Δa is the decrease of the semi-major axis after one orbital revolution. L is in orbital revolutions and can easily be converted to years using the orbital period.

Taking the required lifetime during a solar minimum, when the F10.7 index is about 75, we get a minimum orbital altitude of 400 km. The F10.7 index is a measure of the radio noise generated by the Sun at a wavelength of 10.7 cm.
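The lifetime estimate L ≈ H/|Δa| can be evaluated numerically once the per-revolution drag decay is known. The sketch below assumes a circular orbit, for which a standard drag approximation gives Δa per revolution as 2π(C_D·A/m)·ρ·a². All numeric inputs (density, drag coefficient, frontal area, mass, scale height) are illustrative assumptions, not values from this study:

```python
import math

# Rough orbital-lifetime sketch: L ~ H / |delta_a|, with delta_a the
# semi-major-axis decay per revolution due to atmospheric drag.
# All input numbers below are illustrative assumptions.

MU = 3.986e14   # Earth's gravitational parameter, m^3/s^2

def delta_a_per_rev(a, rho, cd, area, mass):
    """Semi-major-axis decay per revolution for a circular orbit of radius a."""
    return 2.0 * math.pi * (cd * area / mass) * rho * a * a

a = 6778e3                # ~400 km altitude orbit radius, m
rho = 5e-13               # assumed air density at 400 km near solar minimum, kg/m^3
da = delta_a_per_rev(a, rho, cd=2.2, area=0.01, mass=1.5)  # 10x10 cm face, 1.5 kg
revs = 58e3 / da          # assumed density scale height H ~ 58 km
period = 2.0 * math.pi * math.sqrt(a ** 3 / MU)            # seconds per revolution
years = revs * period / (365.25 * 86400.0)                 # roughly 5 years here
```

With these assumed numbers the estimate lands near the five-year requirement; the real answer is very sensitive to the density, which varies strongly with solar activity.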
The desired dimensions of the satellite are also known; these are 10*10*15 cm. However, we can make the lens deployable in space. So let’s say, just for this indication, that the diameter of our field stop is 1 mm and our focal length is 50 cm. Then our maximum resolution size is 800 m.
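This estimate follows directly from the pixel-size formula, and can be checked against the diffraction limit from the Rayleigh criterion. The 1 mm field stop, 50 cm focal length, 5 cm aperture and 0.7 μm wavelength below are rough example values in the spirit of the indication above, not design figures:

```python
# Ground pixel size from detector geometry, X = h * d / f, compared with the
# Rayleigh diffraction limit ~ 1.22 * lambda * h / D. Inputs are example values.

def detector_gsd(h, d, f):
    """Ground sample distance set by field-stop diameter d and focal length f."""
    return h * d / f

def rayleigh_gsd(h, wavelength, aperture):
    """Diffraction-limited ground resolution for aperture diameter D."""
    return 1.22 * wavelength * h / aperture

h = 400e3                                    # orbit altitude, m
x = detector_gsd(h, d=1e-3, f=0.5)           # 1 mm field stop, 50 cm focal length -> 800 m
r = rayleigh_gsd(h, wavelength=0.7e-6, aperture=0.05)  # red light, 5 cm aperture -> ~7 m
```

Here the detector-limited pixel (800 m) is far coarser than the diffraction limit (a few metres), illustrating the earlier point that well-engineered systems are usually field-stop limited rather than diffraction limited.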

Spectral resolution

The spectral resolution or resolving power of a spectrograph, or, more generally, of a frequency spectrum, is a measure of its power to resolve features in the electromagnetic spectrum.

Radiometric resolution

Radiometric resolution determines how finely a system can represent or distinguish differences of intensity, and is usually expressed as a number of levels or a number of bits, for example 8 bits or 256 levels, which is typical of computer image files. The higher the radiometric resolution, the better subtle differences of intensity or reflectivity can be represented, at least in theory. In practice, the effective radiometric resolution is typically limited by the noise level, rather than by the number of bits of representation.

Temporal resolution

Temporal resolution refers solely to the time it takes to refresh the image. With satellites, this correlates directly with the revisit time of the satellite’s orbit. The temporal resolution of nanosatellites can easily be improved by adding more satellites in the same orbit.
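As a feel for the timescale involved, the orbital period at the 400 km altitude discussed earlier can be computed directly; a single satellite's revisit time is tied to this period and to the drift of its ground track (the exact revisit interval depends on the orbit design, which this sketch does not model):

```python
import math

# Orbital period of a circular orbit at a given altitude above Earth.

MU = 3.986e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6378e3   # Earth's mean equatorial radius, m

def orbital_period(altitude):
    """Period in seconds of a circular orbit at the given altitude in metres."""
    a = R_EARTH + altitude
    return 2.0 * math.pi * math.sqrt(a ** 3 / MU)

minutes = orbital_period(400e3) / 60.0   # roughly 92-93 minutes at 400 km
```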

Detection of EM propagation

Nowadays, almost all modern systems use solid-state semiconductors to convert the incoming electromagnetic radiation into a small current or electrical signal. Essentially, they convert incident photons into photoelectrons, producing a measurable voltage [T.A. Warner, M. Duane Nellis, G.M. Foody, 2009]. The minimum energy required to detect a passing photon is called the band gap. Because the energy of a photon is inversely related to its wavelength, the band gap of a given material sets an upper limit on the wavelength response of a detector: if more energy is needed to release the electron for detection, a higher frequency (shorter wavelength) is also needed. The energy of a photon is given by:

E = h * c / λ

where h is Planck’s constant, c the speed of light and λ the wavelength.
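The cutoff wavelength of a detector material follows directly from this relation: a photon with wavelength longer than h·c/E_gap cannot bridge the band gap. The band-gap value for silicon below is a textbook figure, not taken from this document:

```python
# Photon energy E = h*c / lambda, and the resulting long-wavelength cutoff
# for a detector material with a given band gap.

H_PLANCK = 6.626e-34   # Planck's constant, J*s
C_LIGHT = 2.998e8      # speed of light, m/s
EV = 1.602e-19         # joules per electron-volt

def photon_energy(wavelength):
    """Energy in joules of a photon with the given wavelength in metres."""
    return H_PLANCK * C_LIGHT / wavelength

def cutoff_wavelength(band_gap_ev):
    """Longest detectable wavelength (m) for a band gap given in eV."""
    return H_PLANCK * C_LIGHT / (band_gap_ev * EV)

si_cutoff_um = cutoff_wavelength(1.12) * 1e6   # silicon (Eg ~ 1.12 eV): ~1.1 um
```

This matches the statement below that silicon is used for the visible and NIR bands up to about 1 μm.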

Sensors are made of different materials, depending on what wavelengths they need to detect. In the visible and NIR bands up to 1 μm, usually silicon (Si) is used. Silicon is a popular choice because of its abundance on Earth and thus low cost, typically high efficiency and room-temperature operation. However, room temperature is clearly not present in space; therefore the last argument does not apply in the space environment. At longer wavelengths indium gallium arsenide (InGaAs) is used, for bands starting around 0.8 μm. What can also be used at these wavelengths and beyond is indium antimonide (InSb), for 0.3-5.5 μm. This has one drawback, however: it requires cryogenic cooling to 77 K, while InGaAs can operate at 220 K, which is achievable with passive radiators in space or thermoelectric cooling. Another material that needs cooling to 77 K, but can be used for bands from 0.7 μm well into the infrared, is mercury cadmium telluride (HgCdTe). The different bands can be achieved by varying the relative concentrations of Hg and Cd.


When dealing with infrared imaging, a few principles are important to know. One of those is the blackbody: a theoretical construct that absorbs and radiates energy at the maximum possible rate per unit area at each wavelength for a given temperature [John R. Jensen, 2007]. The total emitted power of a blackbody can be described with the Stefan-Boltzmann law and is expressed as:

M = σ * T^4

where σ is the Stefan-Boltzmann constant, 5.6697*10^-8 W m^-2 K^-4, and T the temperature in Kelvin. It is important to remember that the radiance of any object is a function of its temperature, see figure 1. The dominant wavelength can be determined using Wien’s displacement law:

λ_max = k / T

where T is the temperature in Kelvin, and k is a constant equaling 2898 μmK.


Figure 1: Blackbody radiation curves for different temperatures. As the temperature of an object increases, its dominant wavelength shifts towards the shorter wavelengths of the spectrum and, of course, the intensity increases.
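Both laws can be evaluated directly, using the constants quoted in the text. Earth's surface (around 300 K) and the Sun (around 6000 K) are standard worked examples, not values from this document:

```python
# Stefan-Boltzmann law M = sigma * T^4 and Wien's displacement law
# lambda_max = k / T, with the constants quoted in the text.

SIGMA = 5.6697e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
WIEN_K = 2898.0     # Wien's displacement constant, um*K

def blackbody_exitance(t):
    """Total emitted power per unit area (W/m^2) of a blackbody at T kelvin."""
    return SIGMA * t ** 4

def dominant_wavelength_um(t):
    """Wavelength (um) at which a blackbody at T kelvin radiates most strongly."""
    return WIEN_K / t

earth = dominant_wavelength_um(300.0)    # ~9.7 um, well inside the LWIR band
sun = dominant_wavelength_um(6000.0)     # ~0.48 um, in the visible range
```

This is why thermal imaging of the Earth concentrates on the LWIR band, while reflected sunlight dominates the visible and NIR bands.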

The blackbody model is a nice model to conduct research with; however, the real world is not as pretty a picture as the model. Rather, it is composed of selectively radiating bodies, made up of all kinds of different materials, emitting a certain portion of the maximum they can emit. The emissivity (ε) is the ratio between the actual radiance emitted by a real-world selectively radiating body (M_r) and that of a blackbody (M_b) at the same thermodynamic temperature [John R. Jensen, 2007]:

ε = M_r / M_b

One can see that the maximum emissivity is 1. Some materials depart quite a lot from their blackbody radiation curves and have low emissivity, while others are only a bit less. It is important to know that even though two materials have the same thermodynamic temperature, they can still emit a different amount of infrared radiation. Therefore, different apparent temperatures can be measured by a radiometer, leading to (sometimes significant) errors. The emissivity of an object can be influenced by a lot of factors [John R. Jensen, 2007] (page 256, John R. Jensen).

So we have to take into account an object’s emissivity when we use infrared remote sensing to measure an object’s true temperature. One way to do this is to use Kirchhoff’s radiation law (page 257, John R. Jensen).
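The size of this error is easy to illustrate: a body with emissivity ε at kinetic temperature T emits ε·σ·T^4, so a radiometer that assumes a blackbody infers a radiant temperature of ε^(1/4)·T. The two example surfaces and their emissivities below are illustrative textbook-style values, not from this document:

```python
# Effect of emissivity on the apparent (radiant) temperature: a selective
# radiator emits eps * sigma * T^4, so a radiometer assuming a blackbody
# infers T_rad = eps**0.25 * T, below the true kinetic temperature.

def radiant_temperature(t_kinetic, emissivity):
    """Blackbody-equivalent temperature (K) seen by a radiometer."""
    return emissivity ** 0.25 * t_kinetic

# Two surfaces at the same 300 K kinetic temperature:
water = radiant_temperature(300.0, 0.98)   # near-blackbody: ~1.5 K low
metal = radiant_temperature(300.0, 0.10)   # low-emissivity metal: grossly low
```

Even a near-blackbody surface reads slightly cold, while a low-emissivity surface appears colder by more than a hundred kelvin, which is why emissivity correction is essential.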

Thermal infrared data collection

For satellites, thermal infrared remote sensor data may be collected by [John R. Jensen, 2007]:

Across-track thermal scanners using moving mirrors.

Pushbroom linear and area-array charge-coupled device (CCD) detectors.

In order to derive surface temperatures using infrared scanning sensors, two problems must be solved. These are:

Compensation for atmospheric distortions.

Correction for surface emissivity effects.

This can be achieved by using empirical external referencing based on in situ (on-site) data, or by using internal blackbody source referencing. With internal source referencing, an across-track scanning system first looks at a “cold” reference target within the satellite platform, and then at a “hot” reference target for each line scan. The apparent temperature of the two reference bodies is then calculated with the help of the true kinetic temperature, which is constantly monitored and stored. The radiometric resolution with this solution is usually within ±0.2 °C [John R. Jensen, 2007]. Unfortunately, with this method one does not account for atmospheric disturbances, like absorption or scattering of infrared EM waves. Also, this system needs two reference bodies, which is not a real option for a small nanosatellite.

With external in situ measurements, one simply collects data about the measured area (like water vapour percentage, true kinetic temperature, etc.) with the help of a thermometer, a handheld radiometer or perhaps even a radiosonde: a meteorological balloon. These help to improve the data collected by the satellite.

Figure: wavelength absorption of different InGaAs alloys.




References

Jay Gao, Digital Analysis of Remotely Sensed Imagery, McGraw-Hill, 2009.

T.A. Warner, M. Duane Nellis, G.M. Foody (eds.), The SAGE Handbook of Remote Sensing, 2009.

R. Noomen, Q.P. Chu, A. Kamp, Space Engineering and Technology III, Faculty of Aerospace Engineering, 2007.

John R. Jensen, Remote Sensing of the Environment: An Earth Resource Perspective, Pearson Education, Inc., 2007.