The basics of spectrometry


- Radiation principles
- Atmospheric interference
- Reflectance
- Image analysis
- Properties of remotely sensed data
- Transmitters and receivers
- Geometrical correction


Optical imaging sensors are one of the most widely used technologies in the field of remote sensing. Almost every remote sensing application nowadays uses optical sensors that can collect data in the visible bandwidths, but also in the non-visible electromagnetic spectrum, like infrared. The visible data is of course very intuitive for humans and easy to interpret. But spectral data that go beyond the human eye can provide very useful information in many applications, especially when multiple bands can be observed at the same time. This is the field of multispectral imaging.


Multispectral imaging is a technology originally designed for aerospace applications. It operates mostly in the optical region of the EM spectrum, traditionally defined as radiation with wavelengths between 0.4 and 15 μm. This range is made up of [T.A. Warner, M. Duane Nellis, G.M. Foody, 2009]:

- VIS: the visible range, 0.4-0.7 μm.
- NIR: the near infrared range, 0.7-1.1 μm.
- SWIR: the short wave infrared range, 1.1-2.5 μm.
- MWIR: the midwave infrared range, 2.5-7.5 μm.
- LWIR: the long wave infrared range, 7.5-15 μm.


While it is possible to design sensors that span this entire range, it is more practical to limit the sensor to one or two of these regions.


A digital image is made up of a matrix of pixels. A pixel is defined as a two-dimensional picture element that is the smallest nondivisible element of a digital image [John R. Jensen, 2007]. Each pixel has an original Brightness Value or Digital Number (DN) associated with it; in essence its information. The information collected by the satellite is often stored in a value range of 8 to 12 bits. The analog-to-digital conversion is called quantization. Remote sensor data of 8 bits can have a value ranging from 0 to 255; 12-bit data can range from 0 to 4095.
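
As a minimal illustration (a quick sketch, not tied to any particular sensor), the available Digital Number range follows directly from the bit depth:

```python
# Number of discrete brightness values (Digital Numbers) for a given bit depth.
def dn_range(bits: int) -> tuple[int, int]:
    """Return the (min, max) Digital Number for data quantized at `bits` bits."""
    return 0, 2**bits - 1

for bits in (8, 10, 12):
    lo, hi = dn_range(bits)
    print(f"{bits}-bit data: DN values {lo}..{hi} ({2**bits} levels)")
```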


The incoming EM field at the sensor is then resolved spatially, spectrally, radiometrically and temporally to produce a (digital) image of the radiation emanating from the observed location. These four classifications are subdivisions of a very important aspect of remotely sensed (satellite) data: resolution. Resolution refers to an "imaging system's capability of resolving two adjacent features or phenomena" [Gao, 2009]. There are, as stated above, four types of resolution: spatial, spectral, radiometric and temporal.


Spatial Resolution

Spatial and spectral resolution are the most important aspects of our imager. The best spatial resolution (the closest distance between two points on the ground that can be distinguished), d_Rayleigh, as stated by the Rayleigh criterion, is determined by the size of the aperture, D_A, the wavelength of light, λ, and the height, H, of the sensor relative to the ground. This is a good measure for the maximum potential ground resolution:

$$d_{\text{Rayleigh}} = 1.22\,\frac{\lambda H}{D_A}$$
This limit is due to the diffraction of light waves. The size of the detector element of the sensor can also determine the ground resolution. This has to do with the angular Instantaneous Field of View (IFOV), θ_IFOV. The Field of View (FOV) is the angle over which all light waves come through the aperture or field stop. The IFOV is the angle within which all light waves actually hit the focal plane, where the detection elements are located. The θ_IFOV is defined by the ratio of the field stop diameter D_F over the focal length f:

$$\theta_{\text{IFOV}} = \frac{D_F}{f}$$



Figure 1: Definitions of the IFOV, field stop and focal plane. Source: http://www.fas.org/man/dod-101/navy/docs/es310/EO_image/IMG00003.GIF



The resolution size of the pixel on the ground is then given by:

$$\text{GIFOV} = H\,\theta_{\text{IFOV}} = \frac{H\,D_F}{f}$$
In most cases the term GIFOV, or Ground Instantaneous Field of View, is used when we are dealing with a linear distance on the ground. Most well-engineered optical systems are limited by the detector field stop, and not by diffraction, so that the GIFOV is larger than d_Rayleigh at the longest wavelength collected by the sensor. Furthermore, it is very important for the user of remote sensing imagery to be aware that a system could have a spatial resolution that differs between spectral bands due to its design and diffraction effects [T.A. Warner, M. Duane Nellis, G.M. Foody, 2009].


With the formula for the resolution size of the pixel known, we can now make a rough sketch of the spatial resolution we could achieve with our nanosatellite. One of the demands of this study is that the satellite must have a lifespan of at least five years. The lifetime of a satellite can be estimated using the following formula [R. Noomen, Q.P. Chu, A. Kamp, 2007]:

$$L \approx \frac{H}{\Delta a}$$

where H is the density scale height (the altitude increase over which the atmospheric density drops by a factor e), and Δa is the decrease of the semi-major axis after one orbital revolution. L is expressed in orbital revolutions and can easily be converted.
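
To get a feel for the numbers, here is a minimal Python sketch of this estimate. The per-revolution decay is taken from the standard drag approximation Δa = 2π(C_D·A/m)·ρ·a², which is not given in the text; the density, scale height, mass and drag coefficient are assumed, purely illustrative values:

```python
import math

# Rough lifetime estimate L ~ H / Δa (in orbital revolutions), with Δa from
# the standard per-revolution drag approximation Δa = 2π (C_D A / m) ρ a².
# All numeric inputs below are assumed, illustrative values.
mu  = 3.986e14   # Earth's gravitational parameter [m^3/s^2]
a   = 6778e3     # semi-major axis for a ~400 km circular orbit [m]
H   = 58e3       # density scale height near 400 km [m] (assumed)
rho = 4e-13      # atmospheric density at 400 km, solar minimum [kg/m^3] (assumed)
C_D = 2.2        # drag coefficient (typical assumption)
A   = 0.015      # cross-sectional area of a 10x15 cm face [m^2]
m   = 2.0        # satellite mass [kg] (assumed)

da_rev = 2 * math.pi * (C_D * A / m) * rho * a**2  # semi-major axis loss per rev [m]
L_revs = H / da_rev                                # lifetime in revolutions
period = 2 * math.pi * math.sqrt(a**3 / mu)        # orbital period [s]
years  = L_revs * period / (365.25 * 24 * 3600)
print(f"Δa per rev ≈ {da_rev:.2f} m, lifetime ≈ {L_revs:.0f} revs ≈ {years:.1f} years")
```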


Taking the required altitude during a solar minimum, when the F10.7 index is about 75, we get a minimum altitude of 400 km. The F10.7 index is a measure of the radio noise generated by the Sun at a wavelength of 10.7 centimeters. The desired dimensions of the satellite are also known: 10×10×15 cm. However, we can make the lens deployable in space. So let us say, just for this indication, that the diameter of our field stop is 1 mm and our focal length is 50 cm. Then our maximum resolution size is 800 m.
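
As a sanity check on that arithmetic, a small Python sketch; the field stop, focal length and height are the values above, while the aperture diameter and wavelength are assumed only to compare against the diffraction limit:

```python
# Ground pixel size (GIFOV) and diffraction limit for the sketch above.
H   = 400e3    # orbit height [m]
D_F = 1e-3     # field stop diameter [m]
f   = 0.5      # focal length [m]
D_A = 0.05     # aperture diameter [m] (assumed)
lam = 0.7e-6   # longest VIS wavelength [m] (assumed)

gifov = H * D_F / f            # detector (field stop) limited pixel size
d_ray = 1.22 * lam * H / D_A   # diffraction-limited ground resolution
print(f"GIFOV = {gifov:.0f} m, d_Rayleigh = {d_ray:.1f} m")  # 800 m vs ~6.8 m
```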


Spectral resolution

The spectral resolution, or resolving power, of a spectrograph or, more generally, of a frequency spectrum is a measure of its power to resolve features in the electromagnetic spectrum.
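
One common way to express this, assumed here as the usual convention rather than taken from the text, is the dimensionless resolving power R = λ/Δλ:

```python
# Resolving power: the ratio of a wavelength to the smallest wavelength
# difference that can be separated at that wavelength (assumed convention).
def resolving_power(wavelength_um: float, delta_lambda_um: float) -> float:
    return wavelength_um / delta_lambda_um

# E.g. separating two spectral features 10 nm apart around 0.7 μm needs R ≈ 70.
print(f"R = {resolving_power(0.7, 0.01):.0f}")
```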


Radiometric resolution

Radiometric resolution determines how finely a system can represent or distinguish differences of intensity, and is usually expressed as a number of levels or a number of bits, for example 8 bits or 256 levels, which is typical of computer image files. The higher the radiometric resolution, the better subtle differences of intensity or reflectivity can be represented, at least in theory. In practice, the effective radiometric resolution is typically limited by the noise level, rather than by the number of bits of representation.
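
As a rough sketch of that last point, with an assumed noise level, the usable number of levels (and hence effective bits) can be estimated as follows:

```python
import math

# Effective radiometric resolution when noise, not bit depth, is the limit.
# The usable number of levels is roughly full-scale signal / rms noise.
# Both values below are assumed, illustrative numbers.
full_scale_dn = 4095   # 12-bit quantization
noise_dn = 4.0         # rms noise expressed in Digital Numbers (assumed)

effective_levels = full_scale_dn / noise_dn
effective_bits = math.log2(effective_levels)
print(f"~{effective_levels:.0f} usable levels, i.e. ~{effective_bits:.1f} effective bits")
```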


Temporal resolution

Temporal resolution refers solely to the time it takes to refresh the image. With satellites, this correlates directly with the revisit time of the satellite's orbit. The temporal resolution of nanosatellites can easily be improved by adding more satellites in the same orbit.


Detection of EM propagation

Nowadays, almost all modern systems use solid-state semiconductors to convert the incoming electromagnetic radiation into a small current or electrical signal. Essentially, they operate as photon detectors by converting incident photons into photoelectrons, producing a measurable voltage [T.A. Warner, M. Duane Nellis, G.M. Foody, 2009]. The energy required to detect a passing photon is called the band gap. Because the energy of a photon is inversely related to its wavelength, the band gap of a given material sets an upper limit on the wavelength response of a detector; i.e. if more energy is needed to release an electron for detection, a higher frequency is also needed. The energy of a photon is given below:

$$E = h\nu = \frac{hc}{\lambda}$$
Sensors are made of different materials, depending on which wavelengths they need to detect. In the visible and NIR bands up to 1 μm, silicon (Si) is usually used. Silicon is a popular choice because of its abundance on Earth and thus low cost, its typically high efficiency, and its room-temperature operation. However, room temperature is clearly not present in space; therefore the last argument does not apply in the space environment. At longer wavelengths, indium gallium arsenide (InGaAs) is used for bands spanning 0.8-2.8 μm. Indium antimonide (InSb) can also be used at these wavelengths and beyond, for 0.3-5.5 μm. This has one drawback, however: it requires cryogenic cooling to 77 K, while InGaAs can operate at 220 K, which is achievable with passive radiators in space or thermoelectric cooling. Another material that needs cooling to 77 K but can be used for bands ranging between 0.7-15 μm is mercury cadmium telluride (HgCdTe); the different bands can be achieved by varying the relative concentrations of Hg and Cd.
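
The relation between band gap and cutoff wavelength can be made concrete with the photon-energy formula above. The band-gap values below are typical textbook numbers (assumed here; they vary with temperature and composition):

```python
# Detector cutoff wavelength from the band gap: lambda_c = h*c / E_gap.
H_PLANCK = 6.626e-34   # Planck constant [J s]
C_LIGHT = 2.998e8      # speed of light [m/s]
EV = 1.602e-19         # one electron volt [J]

band_gaps_ev = {
    "Si": 1.12,               # -> ~1.1 μm, the VIS/NIR limit mentioned above
    "extended InGaAs": 0.44,  # assumed composition -> ~2.8 μm
    "InSb at 77 K": 0.23,     # -> ~5.4 μm
}

for material, e_gap in band_gaps_ev.items():
    lambda_c_um = H_PLANCK * C_LIGHT / (e_gap * EV) * 1e6
    print(f"{material}: E_gap = {e_gap} eV -> cutoff ~ {lambda_c_um:.2f} μm")
```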


Infrared

When dealing with infrared imaging, a few principles are important to know. One of these is the blackbody principle: a theoretical construct that absorbs and radiates energy at the maximum possible rate per unit area at each wavelength for a given temperature [John R. Jensen, 2007]. The total emitted power of a blackbody can be described with the Stefan-Boltzmann law and is expressed as:

$$M = \sigma T^4$$

where σ is the Stefan-Boltzmann constant, 5.6697×10⁻⁸ W m⁻² K⁻⁴, and T is in Kelvin. It is important to remember that the radiance of any object is a function of its temperature; see figure 2. The dominant wavelength can be determined using Wien's displacement law:

$$\lambda_{\max} = \frac{k}{T}$$

where T is the temperature in Kelvin, and k is a constant equal to 2898 μm·K.
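
A short sketch applying both laws; the two temperatures are illustrative choices (roughly Earth's surface and the Sun):

```python
# Total emitted power (Stefan-Boltzmann) and dominant wavelength (Wien)
# of a blackbody at a given temperature.
SIGMA = 5.6697e-8   # Stefan-Boltzmann constant [W m^-2 K^-4]
K_WIEN = 2898.0     # Wien displacement constant [μm K]

def blackbody(t_kelvin: float) -> tuple[float, float]:
    """Return (emitted power [W/m^2], dominant wavelength [μm])."""
    return SIGMA * t_kelvin**4, K_WIEN / t_kelvin

for t in (300.0, 6000.0):   # ~Earth's surface, ~the Sun (illustrative)
    m, lam = blackbody(t)
    print(f"T = {t:.0f} K: M = {m:.3g} W/m^2, lambda_max = {lam:.2f} μm")
```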



Figure 2: Blackbody radiation curves for different temperatures. As the temperature of an object increases, its dominant wavelength shifts towards the shorter wavelengths of the spectrum and, of course, the intensity increases.


The blackbody model is a nice model to conduct research with; however, the real world is not the fairy-tale picture the model paints. Rather, it is composed of selectively radiating bodies, made up of all kinds of different materials, emitting a certain portion of the maximum they can emit. Emissivity (ε) is the ratio between the actual radiance emitted by a real-world selectively radiating body (M_r) and that of a blackbody at the same thermodynamic temperature (M_b) [John R. Jensen, 2007]:

$$\varepsilon = \frac{M_r}{M_b}$$
One can see that the maximum emissivity is 1. Some materials depart quite a lot from their blackbody radiation curves and have low emissivity, while others are only a bit less. It is important to know that even though two materials have the same thermodynamic temperature, they can still emit a different wavelength of infrared. Therefore, different apparent temperatures can be measured by a radiometer, leading to (sometimes significant) errors. The emissivity of an object can be influenced by a lot of factors [John R. Jensen, 2007], including (Page 256 John R. Jensen).
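
The effect on the measured temperature follows directly from the two formulas above: equating ε·σ·T_kin⁴ to σ·T_rad⁴ gives T_rad = ε^(1/4)·T_kin. A minimal sketch with assumed emissivities:

```python
# Apparent (radiant) temperature of a selectively radiating body:
# eps * sigma * T_kin**4 = sigma * T_rad**4  =>  T_rad = eps**0.25 * T_kin.
def radiant_temperature(t_kinetic: float, emissivity: float) -> float:
    """Temperature a radiometer reports if it assumes blackbody behavior."""
    return emissivity**0.25 * t_kinetic

# Two materials at the same true temperature, different (assumed) emissivities:
for eps in (0.98, 0.85):
    print(f"eps = {eps}: apparent T ~ {radiant_temperature(300.0, eps):.1f} K")
```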


So we have to take an object's emissivity into account when we use infrared remote sensing to measure its true temperature. One way to do this is to use Kirchhoff's radiation law. (Page 257 John R. Jensen)


Thermal infrared data collection

For satellites, thermal infrared remote sensor data may be collected by [John R. Jensen, 2007]:

- Across-track thermal scanners using moving mirrors.
- Pushbroom linear and area-array charge-coupled-device (CCD) detectors.


In order to derive surface temperatures using infrared scanning sensors, two problems must be solved. These are:

- Compensation for atmospheric distortions.
- Correction for surface emissivity effects.


This can be achieved by using empirical external referencing based on in situ (in place) data, or by using internal blackbody source referencing. With internal source referencing, an across-track scanning system first looks at a "cold" reference target within the satellite platform, and then at a "hot" reference target, for each line scan. The apparent temperature of the two reference bodies is then calculated with the help of their true kinetic temperature, which is constantly monitored and stored. The radiometric resolution with this solution is usually within ±0.2 °C [John R. Jensen, 2007]. Unfortunately, this method does not account for atmospheric disturbances, like absorption or scattering of infrared EM waves. Also, this system needs two reference bodies, which is not a real option for a small nanosatellite.
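
A minimal sketch of the two-point calibration implied here, assuming a linear detector response; all DN and temperature values are illustrative assumptions:

```python
# Two-point (cold/hot reference) calibration: assuming the detector responds
# linearly, the two on-board references map raw DN values to temperature.
def calibrate(dn_cold: float, t_cold: float, dn_hot: float, t_hot: float):
    """Return a function mapping a raw DN to an apparent temperature [K]."""
    gain = (t_hot - t_cold) / (dn_hot - dn_cold)
    return lambda dn: t_cold + gain * (dn - dn_cold)

# References monitored on the platform (assumed): 260 K at DN 310, 320 K at DN 3480.
dn_to_temp = calibrate(310, 260.0, 3480, 320.0)
print(f"DN 1900 -> {dn_to_temp(1900):.1f} K")   # example scene pixel
```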

With external in situ measurements, one simply collects data about the measured area (like water vapour percentage, true kinetic temperature, etc.) with the help of a thermometer, a handheld radiometer or perhaps even a radiosonde: a meteorological balloon. These measurements help to improve the data collected by the satellite.



Figure 3: Wavelength absorption of different InGaAs alloys. Source: http://www.sensorsinc.com/GaAs.html


Sources

Literature

[1] Jay Gao, Digital Analysis of Remotely Sensed Imagery, McGraw-Hill, 2009.

[2] T.A. Warner, M. Duane Nellis, G.M. Foody (eds.), The SAGE Handbook of Remote Sensing, SAGE Publications, 2009.

[3] R. Noomen, Q.P. Chu, A. Kamp, Space Engineering and Technology III, Faculty of Aerospace Engineering, 2007.

[4] John R. Jensen, Remote Sensing of the Environment: An Earth Resource Perspective, Pearson Education, Inc., 2007.