PART II: RANDOM SIGNAL PROCESSING
Chapter 7 Random Processes
7.1 Introduction

What is a random process?

The probability model used to characterize a random
signal is called a random process (or a stochastic process, or a time series).

Examples of random
processes:

a piece of sound collected in a train station (continuous)

a section of facsimile in a fax machine (discrete)

……

Many random processes are the outputs of linear systems, such as:

the temperature in an industrial oven

the vibration of an electrical motor

…

Studying random processes can help us understand the behavior of engineering systems
and hence optimize their operations.

The road ahead:

Chapter 7 = Chapter 3 in the textbook: Introduction to random processes

Chapter 8 = Chapter 4 in the textbook: Linear systems

Chapter 10 = Chapter 6 in the textbook: Signal Detection

Chapter 11 = Chapter 7 in the textbook: Filtering
7.2 Definitions of Random Processes

A random process can be described by X(t, ζ), where t is time, ζ is an outcome of the
underlying random experiment E (ζ characterizes the process), and X represents the
waveform or the signal.

The features of a random process X(t, ζ):

X(t, ζ) = {X(t, ζ_i) | ζ_i ∈ S} = {x_1(t), x_2(t), …} is a collection (an ensemble) of functions of time

X(t_0, ζ_i) = x_i(t_0) is a numerical value: the value of the ith member function at time t_0

X(t, ζ_i) = x_i(t) is a specific member function, i.e., a deterministic function of time

X(t_0, ζ) = {X(t_0, ζ_i) | ζ_i ∈ S} = {x_1(t_0), x_2(t_0), …} is a collection of the numerical values of the member functions at t = t_0

Let us consider a simple example: a rotary machine has two states: normal and
abnormal. Under the normal condition, denoted by N, its vibration can be described by
the following function:

x_n(t) = sin t

Note that the angular speed is ω = 1. On the other hand, under the abnormal
condition, denoted by A, its vibration is characterized by:

x_a(t) = 2 sin t

Then, the measured vibration signal is a random process:

X(t, ζ) = {X(t, ζ_i) | ζ_i ∈ {N, A}} = {x_n(t), x_a(t)} = {sin t, 2 sin t}

In particular:

X(t, N) = sin t
X(t, A) = 2 sin t
X(0, ζ) = {0, 0}
X(π/2, ζ) = {1, 2}
X(0, N) = 0; X(0, A) = 0; X(π/2, N) = 1; X(π/2, A) = 2

Note that at each specific time, the random process becomes a random variable.
Hence, we can only calculate its probability. For example, if the machine is equally
likely to be in the normal and the abnormal state, then:

P(X(π/2) = 1) = 1/2

This is called the probabilistic structure of the random process.

Similar to the informal (relative-frequency) definition of probability, let n be the total
number of trials at t = t_0, and k be the number of trials that result in X(t_0, ζ) = a; then:

P(X(t_0, ζ) = a) = lim_{n→∞} k/n

We can define the joint and conditional probabilities in a similar manner.
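The relative-frequency definition can be checked numerically on the machine example above; the following sketch is illustrative (the function name, seed, and trial count are assumptions, not from the text):

```python
import math
import random

def estimate_probability(t, target, n_trials=100_000, seed=42):
    """Estimate P(X(t, zeta) = target) as k/n: repeatedly draw the machine
    state (N -> sin t, A -> 2 sin t, equally likely) and count matches."""
    rng = random.Random(seed)
    k = 0
    for _ in range(n_trials):
        state = rng.choice(["N", "A"])
        x = math.sin(t) if state == "N" else 2.0 * math.sin(t)
        if abs(x - target) < 1e-12:
            k += 1
    return k / n_trials

# k/n approaches P(X(pi/2) = 1) = 1/2 as n grows
p = estimate_probability(math.pi / 2, 1.0)
```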

The formal definition of a random process is based on probability: let S be the
sample space of a random experiment and t be a variable that takes values in the
set R^1 (the real line). A real-valued random process X(t), t ∈ R^1, is then a
measurable function on S that maps S onto R^1. It can be described by its
(nth-order) distribution function:
F_{X(t_1), X(t_2), …, X(t_n)}(x_1, x_2, …, x_n) = P[X(t_1) ≤ x_1, …, X(t_n) ≤ x_n], for all (x_1, …, x_n), (t_1, …, t_n) and n

Random processes can be classified according to the characteristics of t and of the
random variable X(t) at time t:

                        t is continuous              t is discrete
X(t) is continuous      Continuous random process    Continuous random sequence
X(t) is discrete        Discrete random process      Discrete random sequence

Random processes can also be classified based on their probabilistic structure. If the
probability distribution of a random process does not depend on the time t, then the
process is called stationary. Otherwise, it is called non-stationary. That is:

if P(X(t) ≤ x) does not depend on t, then X(t) is stationary;
else X(t) is nonstationary.
This will be further discussed in the subsequent sections.

Another important type of random process is the Markov process. Given a random
process X(t), if

P[X(t_k) | X(t_{k−1}), …, X(t_1)] = P[X(t_k) | X(t_{k−1})]

then it is a Markov process. The most important feature of a Markov process is that
its current probability depends only on the immediate past.
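The memoryless property can be observed numerically on a simple two-state chain; the transition probability 0.8 below is an arbitrary illustration, not from the text:

```python
import random

def simulate_chain(p_stay=0.8, n=200_000, seed=0):
    """Two-state (0/1) Markov chain: keep the current state with
    probability p_stay, switch otherwise."""
    rng = random.Random(seed)
    x = [0]
    for _ in range(n - 1):
        x.append(x[-1] if rng.random() < p_stay else 1 - x[-1])
    return x

x = simulate_chain()
# P(X_k = 1 | X_{k-1} = 1), estimated as a relative frequency
p1 = (sum(1 for a, b in zip(x, x[1:]) if a == 1 and b == 1)
      / sum(1 for a in x[:-1] if a == 1))
# P(X_k = 1 | X_{k-1} = 1, X_{k-2} = 0): the extra past should not matter
p2 = (sum(1 for a, b, c in zip(x, x[1:], x[2:]) if a == 0 and b == 1 and c == 1)
      / sum(1 for a, b in zip(x, x[1:]) if a == 0 and b == 1))
```

Both conditional frequencies come out close to p_stay, as the Markov property predicts.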

For the example above, the random process is a continuous random process; it is
nonstationary but Markov.
7.3 Representation of Random Processes

Random processes can be described by their probability structures. There are other,
alternative ways to describe random processes, and these descriptions may be easier to
use when solving certain problems.

The first alternative is the joint probability distribution. In general:

The first order: P(X(t_1) ≤ a_1) represents the instantaneous amplitude

The second order: P(X(t_1) ≤ a_1, X(t_2) ≤ a_2) represents the structure of the signal
(which is related to the spectrum discussed in subsequent sections)

The higher (kth) order: P(X(t_1) ≤ a_1, X(t_2) ≤ a_2, …, X(t_k) ≤ a_k) represents the
details of the signal.

The joint probability representation is easy to use. However, two different processes
may have the same joint probability distributions.

The other alternative form is the analytical form. For example:

X(t) = A sin(ωt + θ)

where A follows a Normal distribution and θ follows a uniform distribution. In
general, the analytical-form representation is unique, but it is not always possible to
find an analytical form.
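A minimal sketch of drawing member functions from such an analytical form; the standard-normal A and Uniform[0, 2π) θ are assumed parameter choices:

```python
import math
import random

def draw_member_function(rng, omega=1.0):
    """Draw one member function x_i(t) of X(t) = A sin(omega*t + theta),
    with A ~ N(0, 1) and theta ~ Uniform[0, 2*pi) (assumed parameters)."""
    A = rng.gauss(0.0, 1.0)
    theta = rng.uniform(0.0, 2.0 * math.pi)
    return lambda t: A * math.sin(omega * t + theta)

rng = random.Random(1)
members = [draw_member_function(rng) for _ in range(10_000)]
# Fixing t = t0 turns the process into a random variable; its mean is 0 here
mean_at_t0 = sum(x(0.5) for x in members) / len(members)
```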

When two or more random processes are involved, say X(t) and Y(t), the following
concepts are important:

The joint distribution:

P[X(t_1) < x_1, …, X(t_n) < x_n, Y(t′_1) < y_1, …, Y(t′_n) < y_n]

The cross-correlation function:

R_XY(t_1, t_2) = E{X*(t_1) Y(t_2)}

The cross-covariance function:

C_XY(t_1, t_2) = R_XY(t_1, t_2) − μ_X(t_1) μ_Y*(t_2)

The correlation coefficient:

r_XY(t_1, t_2) = C_XY(t_1, t_2) / √(C_XX(t_1, t_1) C_YY(t_2, t_2))

Equality: X(t) and Y(t) are equal if their respective member functions are identical
for each outcome of S.

Uncorrelated: X(t) and Y(t) are uncorrelated if C_XY(t_1, t_2) = 0, ∀ t_1, t_2

Orthogonal: X(t) and Y(t) are orthogonal if R_XY(t_1, t_2) = 0, ∀ t_1, t_2

Independent: X(t) and Y(t) are independent if

P[X(t_1) < x_1, …, X(t_n) < x_n, Y(t′_1) < y_1, …, Y(t′_n) < y_n]
= P[X(t_1) < x_1, …, X(t_n) < x_n] P[Y(t′_1) < y_1, …, Y(t′_n) < y_n], ∀ t_1, t_2, …, t′_1, t′_2, …
7.4 Some Special Random Processes
(1) Gaussian processes

The most commonly used random process is the Gaussian process. A Gaussian process
X(t), t ∈ R^1, can be described by its probability density function:

f(x) = (2π)^{−n/2} |Σ|^{−1/2} exp[−(1/2)(x − μ_x)^T Σ^{−1} (x − μ_x)]

where t_1, t_2, …, t_n ∈ R^1, x = {x_1, x_2, …, x_n}, and x_i = X(t_i). In addition, μ_x is the mean
vector and Σ is the covariance matrix.

Gaussian processes have several important features. Partition the sample vector at
index k:

x = [x_1, …, x_k | x_{k+1}, …, x_n]^T = [X_1; X_2],  μ_x = [μ_1; μ_2],

Σ = | Σ_11  Σ_12 |
    | Σ_21  Σ_22 |

where Σ_11 is k×k, Σ_12 is k×(n−k), Σ_21 is (n−k)×k, and Σ_22 is (n−k)×(n−k).
Then:
(a) Independence: if Σ_ij = 0 for all i ≠ j, then the components of X are independent.
(b) Partition at k: X_1 has a k-dimensional multivariate Gaussian distribution with
mean μ_1 and covariance Σ_11.
(c) Reduction: if A is a k×n matrix of rank k, then Y = AX has a k-variate Gaussian
distribution with

μ_y = A μ_x,  Σ_y = A Σ_x A^T

(d) Conditional probability: the conditional probability density function of X_1
given X_2 = x_2 is a k-dimensional multivariate Gaussian distribution with the mean and
covariance defined below:

E[X_1 | X_2 = x_2] = μ_1 + Σ_12 Σ_22^{−1} (x_2 − μ_2)
Σ_{X_1|X_2} = Σ_11 − Σ_12 Σ_22^{−1} Σ_21
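Properties (b) and (d) translate directly into a few lines of NumPy; the partition index and the numbers below are illustrative, not from the text:

```python
import numpy as np

def conditional_gaussian(mu, Sigma, k, x2):
    """Mean and covariance of X1 | X2 = x2 for a joint Gaussian,
    partitioned after the first k components."""
    mu1, mu2 = mu[:k], mu[k:]
    S11, S12 = Sigma[:k, :k], Sigma[:k, k:]
    S21, S22 = Sigma[k:, :k], Sigma[k:, k:]
    S22_inv = np.linalg.inv(S22)
    cond_mean = mu1 + S12 @ S22_inv @ (x2 - mu2)
    cond_cov = S11 - S12 @ S22_inv @ S21
    return cond_mean, cond_cov

# Illustrative 3-variate Gaussian, conditioning on the third component
mu = np.array([0.0, 1.0, 2.0])
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.0]])
m, C = conditional_gaussian(mu, Sigma, 2, np.array([3.0]))
```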

An example: a workstation sends out either a signal (1) or no signal (0). Because of
the noise disturbance, it is found that the signal can be described by a Gaussian
random process:

X(t) = {x_1(t), x_2(t)}

where x_1(t) ~ N(0, 1), x_2(t) ~ N(1, 1), and

Σ = | 1  1 |
    | 1  1 |

Note that X is not independent. The conditional probability of X_1 given X_2 = 2 is
Gaussian with:

μ_{X_1|X_2} = 0 + (1)(1)^{−1}(2 − 1) = 1
Σ_{X_1|X_2} = 1 − (1)(1)^{−1}(1) = 0

By the way, since the probability function does not depend on time, the signal is
stationary. Is it also Markov? (Yes)
(2) Another important type of random process is the binary random process (random binary
waveform). It is often used in data communication systems.

[Figure: illustration of a binary random process]

The properties of the binary random process:

Each pulse has a rectangular shape with a fixed duration T and a random
amplitude ±1.

Pulse amplitudes are equally likely to be +1 or −1 (can be converted to 0 and 1)

All pulse amplitudes are statistically independent

The start time of the pulse sequence is arbitrary and is equally likely between 0
and T.

The mathematical expression:

X(t) = Σ_k A_k p(t − kT − D)

where p(t) is a unit-amplitude pulse of duration T, A_k is a binary random variable
representing the kth pulse, and D is the random start time, uniformly distributed in
the interval [0, T].

An example of the binary random process, in the figure above: {…, 1, −1, 1, −1, −1, …}

The mathematical expectations:

E{X(t)} = 0;
E{X^2(t)} = 1;
R_XX(t_1, t_2) = 1 − |t_2 − t_1|/T for |t_2 − t_1| < T, and 0 otherwise.
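These expectations can be checked by simulating one long realization of X(t) = Σ_k A_k p(t − kT − D); the sampling step and pulse count below are arbitrary choices:

```python
import random

def binary_process_samples(T=1.0, dt=0.01, n_pulses=2000, seed=7):
    """One realization of the binary waveform X(t) = sum_k A_k p(t - kT - D):
    amplitudes A_k = +/-1 i.i.d., random start D ~ Uniform[0, T)."""
    rng = random.Random(seed)
    amps = [rng.choice([-1.0, 1.0]) for _ in range(n_pulses)]
    D = rng.uniform(0.0, T)
    n = int(n_pulses * T / dt)
    x = []
    for i in range(n):
        t = i * dt
        k = int((t - D) // T)  # index of the pulse active at time t
        x.append(amps[k] if 0 <= k < n_pulses else 0.0)
    return x

x = binary_process_samples()
mean = sum(x) / len(x)                      # about E{X(t)} = 0
power = sum(v * v for v in x) / len(x)      # about E{X^2(t)} = 1
# Estimate R_XX at lag tau = T/2; expected value 1 - |tau|/T = 0.5
lag = 50  # tau = 0.5 with dt = 0.01
r_half = sum(a * b for a, b in zip(x, x[lag:])) / (len(x) - lag)
```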
7.5 Stationarity
(1)
Stationarity is one of the most important concepts in random signal processing. There
are two kinds of stationarity: strict-sense stationarity (SSS) and wide-sense
stationarity (WSS).
(2)
Strict-sense stationarity definition: a random process X(t) is SSS if for all
t_1, t_2, …, t_k, t_1+τ, t_2+τ, …, t_k+τ and k = 1, 2, …:

P[X(t_1) ≤ x_1, X(t_2) ≤ x_2, …, X(t_k) ≤ x_k] = P[X(t_1+τ) ≤ x_1, X(t_2+τ) ≤ x_2, …, X(t_k+τ) ≤ x_k]
If the foregoing definition holds for all kth-order distribution functions,
k = 1, 2, …, N, but not necessarily for k > N, then the process is said to be
Nth-order stationary.

As a special case, the first-order SSS condition is:

P[X(t) ≤ x] = P[X(t+τ) ≤ x]

That is, the probability distribution is independent of the time t.

For a first-order SSS random process:

E{X(t)} = μ_x (a constant)

Note that the reverse is not true in general. That is, even if this equation holds, the
corresponding random process may not be SSS.
(3)
Wide-sense stationarity

WSS is much more widely used.

Definition: a random process X(t) is WSS if the following conditions hold:

E{X(t)} = μ_x (a constant)
E{X*(t) X(t+τ)} = R_XX(τ)

where X*(t) is the complex conjugate of X(t).

Example 1: Is the binary random process WSS? From the results above, we know that:

E{X(t)} = 0;
R_XX(t_1, t_2) = 1 − |t_2 − t_1|/T for |t_2 − t_1| < T, and 0 otherwise.

Let t_2 = t_1 + τ; we have:

R_XX(t_1, t_1+τ) = 1 − |τ|/T for |τ| < T, and 0 otherwise

which depends only on τ; therefore, it is WSS.

Multivariate random processes can be defined in a similar manner.
7.6 Autocorrelation and Power Spectrum
(1)
Autocorrelation and power spectrum are very important for signal processing. It is
interesting to note that there are different ways to introduce the power spectrum.
However, they all result in the same thing.
(2)
Autocorrelation function

The autocorrelation function of a real-valued WSS random process is defined as

R_XX(τ) = E{X(t) X(t+τ)}

The basic properties:

It is related to the energy: R_XX(0) = E{X^2(t)} = average power (note that R_XX(0) ≥ 0)

It is an even function of τ: R_XX(τ) = R_XX(−τ)

It is bounded by R_XX(0): |R_XX(τ)| ≤ R_XX(0)

If X(t) contains a periodic component, then R_XX(τ) will also contain a periodic
component

If lim_{|τ|→∞} R_XX(τ) = C, then C = μ_X^2

If R_XX(T_0) = R_XX(0) for some T_0 ≠ 0, then R_XX is periodic with period T_0

If R_XX(0) < ∞, then R_XX(τ) is finite for all τ (it is bounded by R_XX(0))
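The first three properties can be verified numerically for the random-phase sinusoid X(t) = sin(t + θ), θ ~ Uniform[0, 2π), whose autocorrelation is R_XX(τ) = (1/2) cos τ; this example process is an illustration, not from the text:

```python
import math
import random

def autocorr(tau, n=50_000, seed=3):
    """Monte Carlo estimate of R_XX(tau) = E{X(t) X(t+tau)} for
    X(t) = sin(t + theta), theta ~ Uniform[0, 2*pi).
    Analytically, R_XX(tau) = 0.5 * cos(tau)."""
    rng = random.Random(seed)
    t = 1.234  # any fixed t works: the process is WSS
    acc = 0.0
    for _ in range(n):
        theta = rng.uniform(0.0, 2.0 * math.pi)
        acc += math.sin(t + theta) * math.sin(t + tau + theta)
    return acc / n

r0 = autocorr(0.0)       # average power, about 0.5
r_pos = autocorr(0.7)    # about 0.5 * cos(0.7)
r_neg = autocorr(-0.7)   # evenness: about equal to r_pos
```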
(3) Cross-correlation function and its properties

The cross-correlation function is defined as follows:

R_XY(τ) = E{X(t) Y(t+τ)}

The cross-correlation function has the following properties:

R_XY(τ) = R_YX(−τ)

|R_XY(τ)| ≤ √(R_XX(0) R_YY(0))

|R_XY(τ)| ≤ (1/2)[R_XX(0) + R_YY(0)]

R_XY(τ) = 0 if the processes are orthogonal, and
R_XY(τ) = μ_X μ_Y if the processes are independent
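The symmetry property can be checked with the random-phase sinusoid X(t) = sin(t + θ) and a delayed copy Y(t) = X(t − d); the delay d is an arbitrary illustration:

```python
import math
import random

def cross_corr(tau, first_is_x=True, d=0.4, n=40_000, seed=5):
    """Monte Carlo estimate of R_XY(tau) = E{X(t) Y(t+tau)} (or R_YX when
    first_is_x is False) with X(t) = sin(t + theta) and Y(t) = X(t - d)."""
    rng = random.Random(seed)
    t = 0.8
    acc = 0.0
    for _ in range(n):
        theta = rng.uniform(0.0, 2.0 * math.pi)
        x = lambda s: math.sin(s + theta)
        y = lambda s: x(s - d)
        a, b = (x, y) if first_is_x else (y, x)
        acc += a(t) * b(t + tau)
    return acc / n

r_xy = cross_corr(0.25)                         # R_XY(0.25)
r_yx_neg = cross_corr(-0.25, first_is_x=False)  # R_YX(-0.25)
```

Analytically R_XY(τ) = 0.5 cos(τ − d), so both estimates approach 0.5 cos(0.15).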
7.7 Power Spectrum
(1) Definition and properties

Based on the autocorrelation function, the power spectrum (Wiener-Khinchine) can
be defined as follows:

S_XX(f) = F[R_XX(τ)] = ∫_{−∞}^{∞} R_XX(τ) exp(−j2πfτ) dτ

Note:

This definition is applicable to all WSS random processes. How about nonstationary
random processes? (It does not make sense to calculate their power spectrum.)

S_XX(f) is called the power spectral density function (psd).

Given the power spectral density function S_XX(f), the autocorrelation function can be
obtained:

R_XX(τ) = F^{−1}[S_XX(f)] = ∫_{−∞}^{∞} S_XX(f) exp(j2πfτ) df

The properties of the power spectrum:

S_XX(f) is real and nonnegative

The average power in X(t) is given by:

E{X^2(t)} = R_XX(0) = ∫_{−∞}^{∞} S_XX(f) df

If X(t) is real, R_XX(τ) will be even; hence, S_XX(f) will be even.

If X(t) has periodic components, then S_XX(f) will have impulses.
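The Wiener-Khinchine pair can be exercised on a sampled autocorrelation; here R_XX[k] = a^|k| is an assumed, AR(1)-like correlation model, and the DFT stands in for the Fourier transform:

```python
import numpy as np

a = 0.6
N = 256
lags = np.arange(-N // 2, N // 2)
R = a ** np.abs(lags)  # sampled autocorrelation, even in the lag

# Power spectral density: Fourier transform of R_XX.
# ifftshift moves lag 0 to index 0; real, even R gives a real S.
S = np.fft.fft(np.fft.ifftshift(R)).real

# Inverse relation: the average power R_XX(0) equals the integral of S
# over frequency, which for the DFT is the mean over the bins.
power = S.mean()
```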

We can define the cross-power spectral density function in a similar manner and find
its properties.
(3)
Lowpass and bandpass processes

Definition: if the psd of a random process is zero for |f| > B (the bandwidth of the
process), then the process is called a lowpass process with the bandwidth B.
Similarly, the bandpass process is defined.

The power in a given frequency band, 0 ≤ f_1 < f_2, is as follows:

P_X[f_1, f_2] = 2 ∫_{f_1}^{f_2} S_XX(f) df

Some random processes may have psd functions with nonzero but insignificant values
over a large frequency band; hence, the effective bandwidth (B_eff) is defined:

B_eff = ∫_{−∞}^{∞} S_XX(f) df / [2 max_f S_XX(f)]

This is illustrated as follows:

[Figure: a lowpass psd S_XX(f) with bandwidth B, and a bandpass psd centered at f_c
with bandwidth B; B_eff is chosen so that the rectangle of height max S_XX(f) and
width 2 B_eff has the same area as the psd.]
(4)
Power spectral density function of random sequences

We can define the power spectral density function for random sequences by means of
summation (instead of integration).
(5) Examples.