


What is HOS ?

HOS is "Higher Order Statistics" or "Higher Order Spectra." It is a field of
statistical signal processing which has become very popular in the last 15
years. It makes use of information extra to that usually used in
'traditional' signal processing measures such as the power spectrum and
autocorrelation function. This extra information can be used to get better
estimates of parameters in noisy situations, or to shed light on
nonlinearities in the signal's production mechanism.

To date virtually all practicable digital signal processing techniques have
been based on second order statistics. Consequently, the assessment of
signals is often based on an examination of the signal spectrum. The
conclusion is drawn that if a signal has a flat or near-flat spectrum then
the quality of any prediction will be poor. This line of reasoning, while
useful for linear predictive systems which only exploit the first and
second order statistics of the signal, is not true in general since it ignores
the higher order statistics of the signal. The last two decades have seen
a growing interest, within the signal processing community, in the use of
higher order statistics in a variety of applications. The reader is referred
to the tutorial papers by Mendel and Nikias.

When did HOS start out ?

The use of higher order statistics is not new; the work of Pearson around 1900
with the method of moments is the most obvious example. Around 1920
this approach fell out of favour with the development of Maximum Likelihood
(ML) by Fisher, since the ML approach provides the optimal estimate,
although in practice estimating the log-likelihood functions is often far
from trivial. Around the early 1960s a group of statisticians at the University of
California began to explore the use of these techniques again. However, the
signal processing community largely ignored these techniques, and it was not
until 1980, when Mendel and his co-workers at the University of Southern
California began to develop system identification techniques based on HOS
methods and apply them to seismic deconvolution problems, that any significant
signal processing work was carried out. At that point in time it was, however,
impractical to implement any of these techniques in real time. In addition, there
were question marks over the robustness of the numerous methods in real-world
environments, particularly with short data records. It is only recently, with
advances in DSP technology coupled with research on the robustness question,
that many HOS methods have become potentially feasible.

What are cumulants ?

No answer yet submitted.

Why Cumulants and not moments ?

Summary of answer: cumulants have properties that moments do not have.
Because of these properties, expressions involving cumulants are much simpler
and easier to manipulate than expressions involving moments.
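One such property is additivity: the cumulants of a sum of independent variables are the sums of the cumulants, whereas raw moments do not add. A minimal sketch (the function name and the example distributions are illustrative choices; the moment-to-cumulant formulas are the standard ones for the first three orders):

```python
# The first three cumulants expressed in terms of raw moments, and the
# additivity property c_n(X + Y) = c_n(X) + c_n(Y) for independent X, Y.

def cumulants_from_moments(m1, m2, m3):
    """First three cumulants from the first three raw moments."""
    c1 = m1
    c2 = m2 - m1 ** 2                      # the variance
    c3 = m3 - 3 * m1 * m2 + 2 * m1 ** 3    # the third central moment
    return c1, c2, c3

# Exponential(1): raw moments m_n = n!, so cumulants c_n = (n - 1)!
cx = cumulants_from_moments(1, 2, 6)       # (1, 1, 2)

# The sum of two independent Exponential(1) variables is Gamma(2, 1),
# with raw moments m_n = (n + 1)!
cs = cumulants_from_moments(2, 6, 24)      # (2, 2, 4)

# Cumulants add; raw moments do not (m2 of the sum is 6, not 2 + 2).
assert all(s == a + b for s, a, b in zip(cs, cx, cx))
print(cx, cs)
```

The corresponding statement for moments of a sum involves all the cross-terms of the binomial expansion, which is why cumulant expressions stay manageable at higher orders.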

Are higher order cumulants really unaffected by additive Gaussian noise ?

Consistency is preserved even in the presence of additive coloured Gaussian
noise, but the variance of cumulant estimates from finite samples is increased.
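This blindness to Gaussian noise is easy to check numerically; a sketch assuming NumPy (the distributions, sample size and tolerances are arbitrary choices for illustration):

```python
# The third-order cumulant of any Gaussian process is zero, so adding
# Gaussian noise leaves the third cumulant of a non-Gaussian signal
# asymptotically unchanged -- only the estimator variance grows.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def third_cumulant(x):
    # At zero lag the third cumulant equals the third central moment.
    xc = x - x.mean()
    return np.mean(xc ** 3)

signal = rng.exponential(1.0, n)          # skewed: true c3 = 2
noise = rng.normal(0.0, 1.0, n)           # Gaussian: true c3 = 0

print(third_cumulant(noise))              # close to 0
print(third_cumulant(signal))             # close to 2
print(third_cumulant(signal + noise))     # still close to 2
```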

What are the variance properties of cumulant estimators ?

How much bigger is the variance compared to correlations, and how does the
variance grow with cumulant order ?

What are polyspectra ?

"Polyspectra" is a term first coined by (Tukey ?) to describe the spectra of
cumulants. The 2nd order polyspectra is the familar power spectrum, which is
related to the 2nd order cumulant function (the autocorrelation function) via th
Fourier Transform.
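This Fourier-transform relation can be verified directly; a sketch assuming NumPy, using the circular autocorrelation of an arbitrary zero-mean test signal:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=256)
x -= x.mean()                      # zero mean, so correlation = 2nd cumulant

X = np.fft.fft(x)
power = np.abs(X) ** 2             # the 2nd order polyspectrum

# Circular autocorrelation: the 2nd order cumulant sequence
r = np.array([np.dot(x, np.roll(x, k)) for k in range(len(x))])

# Wiener-Khinchin: the Fourier transform of the autocorrelation
# recovers the power spectrum.
print(np.allclose(np.fft.fft(r).real, power))   # -> True
```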

Why normalise the bispectrum / trispectrum ?

The problem of normalising the polyspectra arises in nonparametric polyspectral
analysis, and it is done to make the variance of the polyspectral estimate
approximately flat across frequencies. The raw bispectral estimate has a
variance which is proportional to the triple product of the true power spectra. A
new quantity with an approximately flat variance can be formed by dividing the
squared bispectral estimate by this variance term.
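This normalisation can be sketched as follows, assuming NumPy; the segment handling, the demo frequencies and the small regularising constant are implementation choices, not part of the FAQ:

```python
import numpy as np

def normalised_bispectrum(x, nfft):
    """Squared bispectral estimate divided by the triple product of
    power spectrum estimates P(f1) P(f2) P(f1+f2), segment-averaged."""
    k = len(x) // nfft
    f = np.arange(nfft)
    idx = (f[:, None] + f[None, :]) % nfft      # frequency index f1 + f2
    B = np.zeros((nfft, nfft), dtype=complex)
    P = np.zeros(nfft)
    for i in range(k):
        seg = x[i * nfft:(i + 1) * nfft]
        X = np.fft.fft(seg - seg.mean())
        B += np.outer(X, X) * np.conj(X[idx])   # X(f1) X(f2) X*(f1+f2)
        P += np.abs(X) ** 2
    B /= k
    P /= k
    return np.abs(B) ** 2 / (P[:, None] * P[None, :] * P[idx] + 1e-12)

# Demo: three sinusoids with the third phase locked to the sum of the
# first two (quadratic coupling) -- illustrative frequency bins 6, 10, 16.
rng = np.random.default_rng(2)
nfft, t = 64, np.arange(64)
segs = []
for _ in range(32):
    p1, p2 = rng.uniform(0, 2 * np.pi, 2)
    segs.append(np.cos(2 * np.pi * 6 * t / nfft + p1)
                + np.cos(2 * np.pi * 10 * t / nfft + p2)
                + np.cos(2 * np.pi * 16 * t / nfft + p1 + p2))
nb = normalised_bispectrum(np.concatenate(segs), nfft)
print(round(nb[6, 10], 2))          # close to 1 at the coupled bifrequency
```

For a perfectly phase-coupled triple this ratio is near 1 at the coupled bifrequency, while it stays small where the bispectrum is dominated by estimation noise.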

How to normalise the bispectrum ?

There are several normalisation schemes for the bispectrum, each with merits
and disadvantages. The terminology overlaps somewhat, so it can be quite
confusing. Some references are provided here in sketchy form for now :

Bicoherence : Kim and Powers 1979, 1981 (bounded between 0 and 1).

Skewness : Hinich 1982 (not bounded, but variance should be flatter).

Bicoherency index : Nikias and Petropulu; this is the same as the
skewness function.

A recent comparison was presented in the PhD thesis of Dr Kravtchenko.

What is quadratic phase coupling ?

Different types of nonlinearity result in different types of phase coupling. If a
signal composed of two sinusoids is passed through a squarer, then the output
will contain components at the sum and difference frequencies of the two
sinusoids. Quadratic phase coupling is the term used to describe the coupling
which results from this type of nonlinearity.
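A quick numerical sketch of the squarer effect described above, assuming NumPy (the bin frequencies 20 and 45 are arbitrary choices): squaring a two-sinusoid signal produces lines at the sum and difference frequencies, plus the harmonics 2*f1 and 2*f2:

```python
import numpy as np

nfft = 256
t = np.arange(nfft)
f1, f2 = 20, 45                       # cycles per record (exact FFT bins)
x = np.sin(2 * np.pi * f1 * t / nfft) + np.sin(2 * np.pi * f2 * t / nfft)
y = x ** 2                            # quadratic (squarer) nonlinearity

spec = np.abs(np.fft.rfft(y))
peaks = {k for k in range(1, nfft // 2) if spec[k] > 1.0}
print(sorted(peaks))                  # -> [25, 40, 65, 90]
```

The peaks sit at f2 - f1 = 25, 2*f1 = 40, f1 + f2 = 65 and 2*f2 = 90, and the components at 25 and 65 carry phases locked to those of the original sinusoids, which is exactly what the bispectrum detects.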

Is the bicoherence related to the ordinary coherence function ?

In some ways, yes. Both quantities are bounded between 0 and 1. However, the
coherence is estimated from two auto-spectra and one cross spectrum, i.e. you need
both input and output measurements to compute the coherence function. The
bicoherence is an auto-quantity, i.e. it can be computed from a single signal. The
coherence function provides a quantification of deviations from linearity in the
system which lies between the input and output measurement sensors. The
bicoherence measures the proportion of the signal energy at any bifrequency that
is quadratically phase coupled.

Is it possible to obtain the power spectrum from a slice of the bispectrum ?

Can you have auto and cross bispectra like you can have auto and cross
spectra ?

The trispectrum magnitude is a 4-dimensional quantity; how can that be
visualised ?

Bill Collis has experimented with a visualisation package to deal with this;
you can see an example. It involves plotting a coloured ball at
each place in the 3D frequency space. The size and colour of the ball indicate
the trispectrum magnitude at that trifrequency.
Note : Bill apologises but his home page is currently unavailable; it should
be online again soon (1 February 1996).

In computation of the bispectrum, why is the data segmented into K
segments, each of M samples ? and why should there be overlapping (e.g.
50%) of these frames ?

The use of segment averaging for bispectral estimation is motivated by the same
reasons as for ordinary power spectral estimation: i.e. to achieve a consistent
estimate. The introduction of segment averaging reduces the estimate variance
by a factor of K. More information on this can be found in almost any spectral
analysis book. If no data window is used, then no overlap should be used.
However, if data windows (such as Hamming, Hanning) are used, then frames
can be overlapped, so that the samples at the edge of one frame (which are
downweighted by the window) will also play a part in the estimate for another
frame. Again, see any book on ordinary spectral analysis for more information.
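The 1/K variance reduction is easy to demonstrate for the ordinary periodogram, and the mechanism is the same one exploited in bispectral estimation. A sketch assuming NumPy; the segment length, K, number of trials and frequency bin are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(3)
nfft, K, trials = 64, 16, 200

def averaged_periodogram(x, nfft):
    """Mean of the per-segment periodograms (non-overlapping, no window)."""
    segs = x.reshape(-1, nfft)
    return np.mean(np.abs(np.fft.fft(segs, axis=1)) ** 2, axis=0) / nfft

# Periodogram value at one frequency bin, over many independent
# realisations of unit-variance white Gaussian noise.
single = [averaged_periodogram(rng.normal(size=nfft), nfft)[5]
          for _ in range(trials)]
avged = [averaged_periodogram(rng.normal(size=K * nfft), nfft)[5]
         for _ in range(trials)]

print(np.var(single) / np.var(avged))   # roughly K (= 16)
```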

What is iid ?

iid stands for "independent, identically distributed."


What applications might there be ?

There have been applications in many fields, including the following: seismic
signal processing, speech processing, texture analysis, underwater signal
processing, machine condition monitoring, radar, array processing, and nonlinear
wave analysis.

How can I generate a random process which has a symmetric alpha stable
distribution using a computer ?

No answer yet supplied
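The FAQ gives no answer here; one standard technique (an addition of this revision, not the FAQ's own answer) is the Chambers-Mallows-Stuck transformation of a uniform and an exponential variate. A sketch assuming NumPy:

```python
# Chambers-Mallows-Stuck method for symmetric alpha-stable samples.
import numpy as np

def sym_alpha_stable(alpha, size, rng):
    """Symmetric alpha-stable samples, 0 < alpha <= 2."""
    v = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform phase
    w = rng.exponential(1.0, size)                 # unit exponential
    if alpha == 1.0:
        return np.tan(v)                           # the Cauchy case
    return (np.sin(alpha * v) / np.cos(v) ** (1 / alpha)
            * (np.cos((1 - alpha) * v) / w) ** ((1 - alpha) / alpha))

rng = np.random.default_rng(4)
x = sym_alpha_stable(2.0, 100_000, rng)
# alpha = 2 reduces to a Gaussian with variance 2.
print(round(np.var(x), 1))                         # -> 2.0
```

For alpha < 2 the samples are heavy-tailed with infinite variance, which is the regime where HOS-style second-order reasoning breaks down and fractional lower order statistics are often used instead.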

Can you get any text books on HOS ?


A book on HOS was published in 1993; the full reference is given below:

C L Nikias and A P Petropulu, Higher Order Spectra Analysis: A Nonlinear
Signal Processing Framework, Prentice-Hall, 1993.

Are there any packages available for HOS ?

Yes, a toolbox is available.

Are there any good review articles on HOS ?


Yes. The seminal HOS paper which laid many of the foundations for later
development was by Brillinger and Rosenblatt (see reference below). More
recently J Mendel wrote an often cited tutorial paper on the applications of HOS
to linear system identification.

D Brillinger and M Rosenblatt, "Computation and Interpretation of k-th
Order Spectra", in Spectral Analysis of Time Series, ed. B Harris, Wiley,
1967.

J M Mendel, "Tutorial on Higher Order Statistics (Spectra) in Signal
Processing and System Theory: Theoretical Results and Some Applications",
Proceedings of the IEEE, V79(3), pp 278-305, 1991.

Are there any conferences devoted to HOS ?

Yes, a biennial IEEE Signal Processing Workshop has taken place since 1989.
The list below shows the venues of these workshops.

Vail, Colorado, US.

Chamrousse, France.

Lake Tahoe, California, US.

Costa Brava, Spain.

There have also been special sessions devoted to HOS in many signal processing
conferences, including ICASSP, Eusipco and Asilomar. In the UK there has
been one IEE Colloquium (May 1995) on "Higher Order Statistics - Are they of
any use ?". There are several HOS papers in the advance program.

Who works in HOS ?

There are people from all over the world working in the field. At Edinburgh
University we maintain an index of HOS researchers.

Are there any HOS resources on the WWW ?

In Edinburgh there is a HOS Home Page. This contains links to HOS
bibliographies, lists of researchers and forthcoming conferences.