Automatic Speech Recognition II



Hidden Markov Models


Neural Network

Hidden Markov Model


DTW, VQ => recognize patterns using distance measurements between templates.


HMM: a statistical method for characterizing the properties of the frames of a pattern.

Discrete-time Markov Processes


Consider a system with:

N distinct states

A set of probabilities associated with the states => the probabilities of changing from one state to another

Discrete time instants t = 1, 2, … at which the system may change state

Discrete-time Markov Processes

First-order Markov chain: the transition probability depends on just the preceding state:

$P[q_{t+1} = j \mid q_t = i, q_{t-1} = k, \ldots] = P[q_{t+1} = j \mid q_t = i]$

The set of state-transition probabilities $a_{ij}$:

$a_{ij} = P[q_{t+1} = j \mid q_t = i], \quad 1 \le i, j \le N$

with $a_{ij} \ge 0$ and $\sum_{j=1}^{N} a_{ij} = 1$.

Discrete-time Markov Processes

Ex. Consider a simple three-state Markov model of the weather.

What is the probability that the weather for the next seven consecutive days is "sun-sun-snow-snow-sun-cloudy-sun", given that the weather today is "sun" and the weather condition on each day depends only on the condition of the previous day?

O = (sun, sun, sun, snow, snow, sun, cloudy, sun)

O = (3, 3, 3, 1, 1, 3, 2, 3)

State 1: snow
State 2: cloudy
State 3: sunny

The answer is the probability of the initial state times the product of the state-transition probabilities along the sequence:

$P(O \mid \text{Model}) = \pi_3 \, a_{33} \, a_{33} \, a_{31} \, a_{11} \, a_{13} \, a_{32} \, a_{23}$

where $\pi_3 = P[q_1 = 3] = 1$ is the probability of the initial state, since today is known to be "sun".
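The numerical value depends on the transition matrix from the slide's figure, which did not survive extraction. As a minimal sketch, assuming the matrix from Rabiner's classic version of this example (an assumption, not the slide's own values):

    # Assumed transition matrix (Rabiner's tutorial weather example, not
    # the slide's own figure). Rows = from-state, columns = to-state.
    # States: 0 = snow, 1 = cloudy, 2 = sunny
    A = [
        [0.4, 0.3, 0.3],  # from snow
        [0.2, 0.6, 0.2],  # from cloudy
        [0.1, 0.1, 0.8],  # from sunny
    ]

    def markov_sequence_prob(states, A, pi):
        """P(O | Model) = pi[q1] * product of a[q_{t-1}][q_t] along the sequence."""
        p = pi[states[0]]
        for prev, cur in zip(states, states[1:]):
            p *= A[prev][cur]
        return p

    # O = (sun, sun, sun, snow, snow, sun, cloudy, sun), 0-indexed states
    O = [2, 2, 2, 0, 0, 2, 1, 2]
    pi = [0.0, 0.0, 1.0]  # today is known to be sunny, so pi_3 = 1
    print(markov_sequence_prob(O, A, pi))  # 1.536e-04 with the assumed matrix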

Discrete-time Markov Processes

Ex. Given a single fair coin, i.e., P(Heads) = P(Tails) = 0.5

What is the probability that the next 10 tosses will produce the sequence (H H T H T T H T T H)?

$P(O \mid \text{Model}) = (1/2)^{10} \approx 9.77 \times 10^{-4}$

What is the probability that 5 of the next 10 tosses will be tails?
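The second question asks about a set of sequences rather than one specific sequence, so the count of favorable outcomes is binomial:

$P(\text{5 tails in 10 tosses}) = \binom{10}{5} \left(\frac{1}{2}\right)^{10} = \frac{252}{1024} \approx 0.25$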

Coin-Toss Models


You are in a room divided by a barrier, through which you cannot see what is happening.


On the other side of the barrier is another person who is performing a coin-tossing experiment (using one or more coins).


The person will not tell you which coin he selects at any time; he will only tell
you the result of each coin flip.


How do we build an HMM to explain the observed sequence of heads and tails?

What do the states in the model correspond to?

How many states should be in the model?

Coin-Toss Models


Single coin


Two states: heads or tails


Observable Markov Model => not hidden

[Diagram: two states, heads and tails; from either state, the next toss moves to heads with probability P(H) and to tails with probability 1−P(H).]

Coin-Toss Models


Two coins (Hidden Markov Model)


Two states: coin 1, coin 2

Each state (coin) is characterized by a probability distribution of heads and tails

There are probabilities of state transition (a state-transition matrix)



[Diagram: two states, coin 1 and coin 2, with self-transition probabilities a11 and a22 and cross-transition probabilities 1−a11 and 1−a22; coin 1 emits H with probability P1 and T with probability 1−P1, coin 2 emits H with probability P2 and T with probability 1−P2.]

Coin-Toss Models


Three coins (Hidden Markov Model)


Three states: coin 1, coin 2, coin 3

Each state (coin) is characterized by a probability distribution of heads and tails

There are probabilities of state transition (a state-transition matrix)


[Diagram: three fully connected states with transition probabilities aij; state i emits H with probability Pi and T with probability 1−Pi, for i = 1, 2, 3.]

The Urn-and-Ball Model

There are N glass urns in the room.

Each urn contains a large quantity of colored balls: M distinct colors.

A genie is in the room and it chooses an initial urn. From this urn, a ball is chosen at random and its color is recorded as the observation. The ball is then replaced in the same urn.

A new urn is then selected according to the random selection process associated with the current urn.

Elements of an HMM

The number of states in the model (N): $S = \{1, 2, \ldots, N\}$

The number of distinct observation symbols per state (M): $V = \{v_1, v_2, \ldots, v_M\}$

The state-transition probability distribution $A = \{a_{ij}\}$, where $a_{ij} = P[q_{t+1} = j \mid q_t = i]$

The observation symbol probability distribution $B = \{b_j(k)\}$, in which $b_j(k) = P[o_t = v_k \mid q_t = j]$

The initial state distribution $\pi = \{\pi_i\}$, where $\pi_i = P[q_1 = i]$

Complete parameter set of the model: $\lambda = (A, B, \pi)$
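As a concrete illustration, the parameter set $\lambda = (A, B, \pi)$ of the three-coin model used in the coin-tossing example below can be written directly as arrays. A and B follow that example's table (all transitions 1/3; P(H) of 0.5, 0.75, 0.25); the uniform $\pi$ is an assumption, since the slides do not give it:

    import numpy as np

    # lambda = (A, B, pi) for the three-coin example that follows.
    # Observation symbols: 0 = H, 1 = T.
    A = np.full((3, 3), 1.0 / 3.0)      # a_ij: all transitions equal to 1/3
    B = np.array([[0.50, 0.50],         # state 1: [P(H), P(T)]
                  [0.75, 0.25],         # state 2
                  [0.25, 0.75]])        # state 3
    pi = np.full(3, 1.0 / 3.0)          # initial distribution (assumed uniform)

    # Each row of A and B, and pi itself, must sum to 1.
    assert np.allclose(A.sum(axis=1), 1.0)
    assert np.allclose(B.sum(axis=1), 1.0)
    assert np.isclose(pi.sum(), 1.0)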


HMM Generator of Observations

Given appropriate values of N, M, A, B, and $\pi$, the HMM can be used as a generator to give an observation sequence $O = (o_1 o_2 \ldots o_T)$:

1. Choose an initial state $q_1 = i$ according to the initial state distribution $\pi$.

2. Set t = 1.

3. Choose $o_t = v_k$ according to the symbol probability distribution in state i, $b_i(k)$.

4. Transit to the new state $q_{t+1} = j$ according to the state-transition probability distribution for state i, $a_{ij}$, and set i = j.

5. Set t = t + 1; return to step 3 if t < T; otherwise, terminate the procedure.
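A runnable sketch of this generator, reusing the A, B, and pi arrays defined above (H is symbol 0, T is symbol 1):

    import numpy as np

    def generate(A, B, pi, T, rng=None):
        """Sample a state sequence and an observation sequence from an HMM."""
        rng = rng or np.random.default_rng(0)
        states, obs = [], []
        i = rng.choice(len(pi), p=pi)                   # step 1: q_1 ~ pi
        for _ in range(T):
            obs.append(rng.choice(B.shape[1], p=B[i]))  # step 3: o_t ~ b_i(k)
            states.append(i)
            i = rng.choice(A.shape[1], p=A[i])          # step 4: q_{t+1} ~ a_ij
        return states, obs

    states, obs = generate(A, B, pi, T=10)
    print("q =", [s + 1 for s in states])               # states, 1-indexed as in the slides
    print("O =", "".join("HT"[o] for o in obs))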

HMM Generator of Observations

Ex. Consider an HMM representation of a coin-tossing problem. Assume a three-state model (three coins) with the probabilities below, and all state-transition probabilities equal to 1/3.

            State 1   State 2   State 3
    P(H)    0.5       0.75      0.25
    P(T)    0.5       0.25      0.75

HMM Generator of Observations

1. You observe the sequence O = (H H H H T H T T T T). What state sequence is most likely? What is the probability of the observation sequence and this most likely state sequence?

Because all state-transition probabilities are equal, the most likely state sequence is the one for which the probability of each individual observation is maximum.

Thus for each H the most likely state is 2, and for each T the most likely state is 3. The most likely state sequence is

q = (2 2 2 2 3 2 3 3 3 3)

with probability $P(O, q \mid \lambda) = (0.75)^{10}(1/3)^{10}$: each observation has probability 0.75 in its most likely state, and each of the 10 state choices (the initial state and 9 transitions) contributes a factor of 1/3.

HMM Generator of Observations

2. What is the probability that the observation sequence came entirely from state 1?

O = (H H H H T H T T T T), q = (1 1 1 1 1 1 1 1 1 1)

The probability that the first H came from state 1 = 0.5 · (1/3)

The probability that the second H came from state 1 = 0.5 · (1/3) …

The probability that the first T came from state 1 = 0.5 · (1/3) …

$P(O, q \mid \lambda) = (0.5)^{10}(1/3)^{10} = (1/6)^{10} \approx 1.65 \times 10^{-8}$

This is smaller than the answer in part 1 by a factor of $(0.75/0.5)^{10} = (1.5)^{10} \approx 57.7$.
HMM Generator of Observations

If the state-transition probabilities were:

              to 1    to 2    to 3
    from 1    0.90    0.05    0.05
    from 2    0.45    0.10    0.45
    from 3    0.45    0.45    0.10

what would be the most likely state sequence for O = (H H H H T H T T T T)?
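The slide leaves this as an exercise. A minimal Viterbi sketch can answer it, reusing the emission matrix B and the (assumed) uniform pi from the earlier sketch:

    import numpy as np

    def viterbi(A, B, pi, obs):
        """Most likely state sequence (0-indexed) and its joint probability."""
        T, N = len(obs), len(pi)
        delta = np.zeros((T, N))            # best path probability ending in each state
        psi = np.zeros((T, N), dtype=int)   # backpointers
        delta[0] = pi * B[:, obs[0]]
        for t in range(1, T):
            for j in range(N):
                scores = delta[t - 1] * A[:, j]
                psi[t, j] = int(np.argmax(scores))
                delta[t, j] = scores[psi[t, j]] * B[j, obs[t]]
        path = [int(np.argmax(delta[-1]))]
        for t in range(T - 1, 0, -1):
            path.append(int(psi[t, path[-1]]))
        return path[::-1], float(delta[-1].max())

    A2 = np.array([[0.90, 0.05, 0.05],
                   [0.45, 0.10, 0.45],
                   [0.45, 0.45, 0.10]])
    O = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]      # H = 0, T = 1
    path, p = viterbi(A2, B, pi, O)
    print([s + 1 for s in path], p)          # report states 1-indexed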

The three basic problems for HMM

Problem 1: How do we compute $P(O \mid \lambda)$?

Problem 2: How do we choose the state sequence $q = (q_1, q_2, \ldots, q_T)$ that is optimal (most likely)?

Problem 3: How do we adjust the model $\lambda$ to maximize $P(O \mid \lambda)$?


[Diagram: speech recognition sense of Problem 3 (training) — samples of each word in a W-word vocabulary are used to train one model per word, from the model for word W1 through the model for word Wn.]

The three basic problems for HMM

Problem 2 is used to study the physical meaning of model states.

[Diagram: a word model whose states correspond to the initial sound, the vowel, and the final sound.]

The three basic problems for HMM

Problem 1 is used to recognize an unknown word.

[Diagram: for an unknown word with observation sequence O, calculate $P(O \mid \lambda_1)$ through $P(O \mid \lambda_n)$, compare the scores, and output the best-scoring word as the prediction.]
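A minimal sketch of this recognition step, scoring an observation sequence against each word model with the forward algorithm; word_models is a hypothetical dict mapping each vocabulary word to its trained (A, B, pi) triple:

    import numpy as np

    def forward_prob(A, B, pi, obs):
        """P(O | lambda) computed with the forward algorithm."""
        alpha = pi * B[:, obs[0]]            # initialization
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]    # induction: alpha_j = (sum_i alpha_i a_ij) b_j(o)
        return float(alpha.sum())            # termination

    def recognize(word_models, obs):
        """Return the word whose model gives the highest P(O | lambda)."""
        return max(word_models, key=lambda w: forward_prob(*word_models[w], obs))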

Artificial Neural Network


An artificial neural network (ANN),
usually called "neural
network" (NN), is a mathematical model or computational
model that tries to simulate the structure and/or functional
aspects of biological neural networks.

Composition of NN


Input nodes: together they receive the feature vector of each sample (one component per node).

Hidden nodes: there can be more than one hidden layer.

Output nodes: produce the output for the corresponding input sample.

The connections of input nodes, hidden nodes, and output nodes are specified by weight values.

[Diagram: input nodes, hidden nodes, and output nodes linked by weighted connections.]

Feedforward operation and classification

A simple three-layer NN:

[Diagram: input units x1, x2 (layer i), hidden units y1, y2 (layer j), an output unit zk (layer k), and a bias unit; weights w_ji connect input to hidden and w_kj connect hidden to output.]

Feedforward operation and classification

Net activation: the inner product of the inputs with the weights at the hidden unit:

$net_j = \sum_{i=1}^{d} x_i w_{ji} + w_{j0}$

where i indexes input-layer units, j indexes hidden-layer units, and $w_{j0}$ is the bias weight.

Each hidden unit emits an output that is a nonlinear function of its activation, f(net), that is:

$y_j = f(net_j)$

A simple example is the sign function:

$f(net) = \operatorname{sgn}(net) = \begin{cases} 1, & net \ge 0 \\ -1, & net < 0 \end{cases}$

Feedforward operation and classification

Each output unit computes its net activation based on the hidden-unit signals as:

$net_k = \sum_{j=1}^{n_H} y_j w_{kj} + w_{k0}$

where k indexes output-layer units and $n_H$ is the number of hidden units.

The output unit computes the nonlinear function of its net:

$z_k = f(net_k)$
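A minimal sketch of the full forward pass, assuming a sigmoid f(net) rather than the sign function (the slides do not fix the choice, and backpropagation below needs a differentiable f):

    import numpy as np

    def f(net):
        """Sigmoid nonlinearity (an assumed choice of f)."""
        return 1.0 / (1.0 + np.exp(-net))

    def forward(x, W_ji, w_j0, W_kj, w_k0):
        """Three-layer feedforward pass: input x -> hidden y -> output z."""
        y = f(W_ji @ x + w_j0)   # net_j = sum_i x_i w_ji + w_j0
        z = f(W_kj @ y + w_k0)   # net_k = sum_j y_j w_kj + w_k0
        return y, z

    rng = np.random.default_rng(0)
    x = rng.normal(size=2)                            # one two-feature sample (x1, x2)
    W_ji, w_j0 = rng.normal(size=(2, 2)), np.zeros(2)
    W_kj, w_k0 = rng.normal(size=(1, 2)), np.zeros(1)
    print(forward(x, W_ji, w_j0, W_kj, w_k0))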

Back propagation

Backpropagation is one of the simplest and most general methods for supervised training of multilayer NN.

The basic approach in learning starts with an untrained network and follows these steps (a sketch of one such update follows the list):

Present a training pattern to the input layer.

Pass the signals through the net and determine the output.

Compare the output with the target values => the difference is the error.

Adjust the weights to reduce the error.
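A compact sketch of one such update, assuming a sigmoid network, squared-error loss, and plain gradient descent (none of which the slide fixes):

    import numpy as np

    def sigmoid(net):
        return 1.0 / (1.0 + np.exp(-net))

    def train_step(x, target, W_ji, W_kj, lr=0.1):
        """One backpropagation update; returns the squared error before the update."""
        # Steps 1-2: present the pattern and run the forward pass.
        y = sigmoid(W_ji @ x)
        z = sigmoid(W_kj @ y)
        # Step 3: compare output with target.
        err = z - target
        # Step 4: adjust weights to reduce the error (gradient descent).
        delta_k = err * z * (1 - z)                  # output-layer sensitivities
        delta_j = (W_kj.T @ delta_k) * y * (1 - y)   # hidden-layer sensitivities
        W_kj -= lr * np.outer(delta_k, y)
        W_ji -= lr * np.outer(delta_j, x)
        return 0.5 * float(err @ err)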

Exercise

Implement the vowel classifier by using a neural network.

Use the same speech samples that you used in the VQ exercise.

What are the important features for classifying vowels?

Separate your samples into 2 groups: training and testing.

Label the class of each training sample.

Train a multilayer perceptron on the training samples and perform testing on the testing data.