Biometric Authentication via Anatomical Characteristics of the Oculomotor Plant

Oleg V. Komogortsev, Texas State University (ok11@txstate.edu); Alex Karpov, Texas State University (ak26@txstate.edu); Cecilia Aragon, University of Washington (aragon@uw.edu)

Technical Report

Abstract


A novel biometrics approach that performs authentication based on the anatomical characteristics of the oculomotor plant (comprising the eye globe, its muscles, and the brain's control signals) is presented. The extraction of the oculomotor plant characteristics (OPC) is achieved by analyzing the recorded eye movement trajectories via a 2D linear homeomorphic mathematical representation of the oculomotor plant. The derived OPC allow authentication via various statistical methods and information fusion techniques. Authentication based on OPC yielded a Half Total Error Rate of 25% for a pool of 59 recorded subjects when the eye movement records were affected by the re-calibration biases of the eye tracking equipment. However, when the impact of re-calibration is removed, the designed methods allow the achievement of an HTER close to 15% for the same pool of subjects. The OPC biometric authentication has high counterfeit resistance potential, because it includes both behavioral and physiological human attributes that are hard to reproduce.

1. Introduction

The methods of biometric identification have evolved throughout history from basic measurements of head dimensions [1] to more advanced techniques involving fingerprints [2], iris [3], and face recognition [4]. However, the above-mentioned techniques are not completely fraud-proof, since they are based on human body characteristics that can be replicated with modern technological advances [2-5]. As a result, there is a significant need in biometrics research for methods that are highly counterfeit resistant. In this paper we present a method that has the potential to be highly counterfeit resistant because it employs non-visible anatomical structures of the human eye.

The human eye already provides a plethora of information useful for biometrics. The physical and behavioral properties of the eye are employed in biometrics based on the iris [6], face recognition [4], retina [7], periocular information [8], and recordings of the raw eye position, velocity signal, and pupil dilation [9, 10].

In terms of its anatomical structure, the eye provides a unique opportunity for identification by containing a multitude of anatomical components that together comprise the so-called Oculomotor Plant (OP). These components are the eye globe and its surrounding tissues, ligaments, six extraocular muscles each containing thin and thick filaments, tendon-like components, and various tissues and liquids [11]. The dynamic and static characteristics of the OP are represented by the eye globe's inertia, the dependency of an individual muscle's force on its length and velocity of contraction, the resistive properties of the eye globe, muscles and ligaments, the frequency characteristics of the neuronal control signal sent by the brain to the extraocular muscles, and the speed of propagation of this signal. Individual properties of the extraocular muscles vary depending on the role each muscle performs. There are two roles: the agonist muscle contracts and pulls the eye globe in the required direction, and the antagonist muscle expands and resists the pull [12].

Numerically evaluating the OP characteristics (OPC) could yield a highly counterfeit resistant biometric method, because OPC represent dynamic behavioral and physiological human attributes that only exist in a living individual. Biometric authentication via OPC promises to be highly repeatable, because any type of random stimulus ideally would produce the same OPC values.

Accurate estimation of the OPC is challenging due to the secluded nature of the corresponding anatomical components, which necessitates indirect estimation and includes noises and inaccuracies associated with the eye tracking equipment, the classification and filtering of the eye movement signal, the mathematical representation of the OP, and the actual algorithms for numerical estimation of OPC. This work addresses these challenges and presents methods that might eventually allow accurate authentication of an individual based on the anatomical characteristics of the OP using eye tracking technologies.

Closely related work. In general, the OPC biometrics approach presented here is similar to the one presented by Komogortsev and colleagues [13]; however, this work advances the state of the art in OPC biometrics via the following major contributions:


a) the ability to process two-dimensional eye movements rather than only one-dimensional ones, through the use of a linear 2D mathematical model of the human eye; b) a very thorough establishment of the baseline performance via the use of more accurate, higher-sampling-frequency eye tracking equipment and a large amount of recorded eye movements; c) the addition of eye movement data filtering methods to ensure high quality biometric data; d) the use of different probabilistic approaches for person authentication; and e) the use of information fusion methods. The methods applied in this work could achieve FAR and FRR rates of 15% within each single recording session, versus a FAR of 5.4% and an FRR of 56.6% achieved by Komogortsev and colleagues within each single recording session.

2. Biometric Authentication via Anatomical Characteristics of the Oculomotor Plant

2.1. Overview

Figure 1 presents an overview of the proposed method. The recorded eye movement signal from an individual is supplied to the Eye Movement Classification module, which classifies the eye position signal into fixations (movements that keep the eye focused on a stationary object of interest) and saccades (extremely rapid eye rotations between the points of fixation). OPC can be extracted only from a dynamic eye movement such as a saccade. Therefore, a sequence of classified saccades is sent to the second module, labeled Oculomotor Plant Mathematical Model (OPMM), which generates simulated saccade trajectories based on the default OPC values, grouped into a vector, with the purpose of matching the simulated trajectories to the recorded ones. Each individual saccade is matched independently of any other saccade. Both the recorded and simulated trajectories for each saccade are sent to the Error Function module, where the error between the trajectories is computed. The error triggers the OPC Estimation module to optimize the values inside the OPC vector and send them to the OPMM to generate more accurate trajectories, minimizing the resulting error. After several iterations, when the minimum error is obtained, a sequence of the optimized OPC vectors is supplied to the Information Fusion and Person Authentication modules to authenticate a user. The Person Authentication module accepts or rejects a user based on the recommendation of a given classifier. The Information Fusion module aggregates information related to OPC vectors and works with the Person Authentication module to authenticate a person based on multiple classification sources.

2.2. Eye Movement Classification

An automated eye movement classification algorithm plays a crucial role in aiding the establishment of the invariant representation for the subsequent estimation of the OPC values. The goal of this algorithm is to automatically and reliably identify the beginning, the end, and all trajectory points of a saccade from a very noisy and jittery eye movement signal. An additional goal of the eye movement classification algorithm is to provide additional filtering of saccades to ensure their high quality and a sufficient quantity of data for the estimation of the OPC values.

A standardized Velocity-Threshold (I-VT) algorithm [14] was selected due to its speed and robustness. A comparatively high classification threshold of 70°/s was employed to reduce the impact of trajectory noise at the beginning and the end of each saccade. Additional filtering discarded saccades with amplitudes of less than 5°, durations of less than 20 ms, and various trajectory artifacts that do not belong to normal saccades.
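As a concrete illustration of this step, the sketch below applies a velocity threshold to a one-dimensional position signal and then filters the detected saccades by the amplitude and duration rules just described. It is an illustrative reading of I-VT, not the authors' implementation; the function names, the synthetic example, and the 1000 Hz sampling assumption are ours.

# Illustrative sketch (not the authors' code): I-VT classification of a 1D eye
# position signal sampled at 1000 Hz, followed by the saccade filtering rules
# described above (>= 5 deg amplitude, >= 20 ms duration).
import numpy as np

def ivt_saccades(position_deg, fs=1000.0, vel_threshold=70.0,
                 min_amplitude=5.0, min_duration_ms=20.0):
    """Return (start, end) sample indices of saccades that pass the filters."""
    velocity = np.gradient(position_deg) * fs        # deg/s, sample-to-sample
    is_saccade = np.abs(velocity) > vel_threshold    # I-VT: velocity threshold

    saccades = []
    start = None
    for i, flag in enumerate(is_saccade):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            saccades.append((start, i))
            start = None
    if start is not None:
        saccades.append((start, len(is_saccade)))

    kept = []
    for s, e in saccades:
        amplitude = abs(position_deg[e - 1] - position_deg[s])
        duration_ms = (e - s) / fs * 1000.0
        if amplitude >= min_amplitude and duration_ms >= min_duration_ms:
            kept.append((s, e))
    return kept

# Example: a synthetic 20 deg rightward saccade embedded in a fixation signal.
t = np.arange(0, 0.5, 1.0 / 1000.0)
pos = np.where(t < 0.2, 0.0, np.where(t > 0.25, 20.0, (t - 0.2) / 0.05 * 20.0))
print(ivt_saccades(pos))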

2.3. Oculomotor Plant Mathematical Model

The OPMM has to be able to quickly simulate accurate saccade trajectories while containing the major anatomical components related to the OP. The linear homeomorphic 2D OP mathematical model developed by Komogortsev and Jayarathna [15] was selected. This OPMM, driven by twelve differential equations, is capable of simulating saccades with properties resembling those of normal humans on a 2D plane (e.g., a computer monitor) by considering the physical properties of the eye globe and four extraocular muscles: the medial, lateral, superior, and inferior recti.

The following advantages are associated with the selection of this OPMM: 1) the major anatomical components are present and can be estimated; 2) the linear representation simplifies the estimation process of the OPC while producing accurate simulation data within the spatial boundaries of a regular computer monitor; 3) the architecture of the model allows dividing it into two smaller models of the form described by Komogortsev and Khan [16]. One of the smaller models becomes responsible for the simulation of the horizontal component of movement and the other for the vertical. Such an assignment, while producing identical simulation results when compared to the full model, allows a significant reduction in the complexity of the required solution and allows simultaneous simulation of both movement components on a multi-core system.


Figure 1. Biometrics via anatomical characteristics of the Oculomotor Plant.



2.4. OPC Vector

The following subset of nine OPC was selected as a vector to represent an individual saccade for each component of movement (horizontal and vertical): length tension (the relationship between the length of an extraocular muscle and the force it is capable of exerting); series elasticity (the resistive properties of an eye muscle while the muscle is innervated by the neuronal control signal); passive viscosity of the eye globe; the force-velocity relationship (the relationship between the velocity of an extraocular muscle's extension/contraction and the force it is capable of exerting) in the agonist muscle; the force-velocity relationship in the antagonist muscle; eye globe inertia; the agonist and antagonist muscles' tension intercept, which ensures an equilibrium state during an eye fixation at the primary eye position; the agonist muscle's tension slope; and the antagonist muscle's tension slope. All tension characteristics are directly impacted by the neuronal control signal sent by the brain and therefore partially contain the neuronal control signal information.

If both the horizontal and vertical components of a saccade are considered, the resulting OPC vector contains eighteen unique OPC.
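To make the structure of this vector concrete, the sketch below groups the nine parameters into a small container. The field names are ours for illustration (the paper does not define identifiers), and the concatenation helper simply anticipates the eighteen-parameter case mentioned above.

# Illustrative container (field names are ours, not the authors') for the nine
# OPC estimated per movement component; two of these concatenated give the
# eighteen-parameter vector used when both components of a saccade are modeled.
from dataclasses import dataclass, astuple

@dataclass
class OPCVector:
    length_tension: float            # muscle length vs. force relationship
    series_elasticity: float         # resistive elasticity of the innervated muscle
    passive_viscosity: float         # passive viscosity of the eye globe
    force_velocity_agonist: float    # force-velocity relationship, agonist muscle
    force_velocity_antagonist: float # force-velocity relationship, antagonist muscle
    eye_globe_inertia: float
    tension_intercept: float         # equilibrium tension at primary eye position
    tension_slope_agonist: float
    tension_slope_antagonist: float

    def as_list(self):
        return list(astuple(self))

def combined_vector(horizontal: OPCVector, vertical: OPCVector):
    """Horizontal fusion: concatenate the two component vectors (18 values)."""
    return horizontal.as_list() + vertical.as_list()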

2.5. Error Function

The goal of the Error Function module is to provide high sensitivity to any differences between the recorded and simulated saccade trajectories. The error function was implemented as the absolute difference between the saccades that are recorded by an eye tracker and the saccades that are simulated by the OPMM:

R = Σ_{i=1}^{n} | r_i - s_i |    (1)

where n is the number of points in a trajectory, r_i is a point in the recorded trajectory, and s_i is the corresponding point in the simulated trajectory. The absolute difference approach provides an advantage over other estimations, such as RMSE, due to its higher absolute sensitivity to the differences between the trajectories.
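A minimal sketch of Eq. (1) follows, assuming the recorded and simulated trajectories are equal-length arrays of position samples in degrees; it is an illustration rather than the authors' code.

# Sketch of the absolute-difference error of Eq. (1).
import numpy as np

def trajectory_error(recorded, simulated):
    recorded = np.asarray(recorded, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return np.sum(np.abs(recorded - simulated))   # sum of point-wise |r_i - s_i|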

2.6. OPC Estimation

The goal of the OPC Estimation module is to provide a mechanism for optimizing the values in the OPC vector to ensure a minimum error between the simulated and recorded saccadic trajectories. The Nelder-Mead (NM) simplex algorithm [17] (the fminsearch implementation in MATLAB) is used in a form that allows simultaneous estimation of all OPC vector parameters. A lower boundary was imposed to prevent the reduction of each individual OPC value to less than 10% of its default value. Stability degradation of the numerical solution of the differential equations describing the OPMM served as an upper boundary indicator for each OPC parameter.
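The sketch below shows one way this estimation step could look using SciPy's Nelder-Mead implementation (an analogue of MATLAB's fminsearch). The simulate_saccade function and default_opc values are placeholders for the OPMM and its default parameters, and the 10% lower bound is enforced with a simple penalty because the plain simplex method is unconstrained; this is an illustrative sketch, not the authors' implementation.

# Sketch: Nelder-Mead minimization of the Eq. (1) error between a recorded
# saccade and the trajectory simulated by the oculomotor plant model.
# `simulate_saccade(opc, n_points)` is a placeholder for the OPMM.
import numpy as np
from scipy.optimize import minimize

def estimate_opc(recorded, simulate_saccade, default_opc, lower_fraction=0.10):
    recorded = np.asarray(recorded, dtype=float)
    default_opc = np.asarray(default_opc, dtype=float)
    lower_bound = lower_fraction * default_opc    # keep each OPC above 10% of default

    def objective(opc):
        if np.any(opc < lower_bound):             # penalty in place of a hard bound
            return 1e9
        simulated = simulate_saccade(opc, len(recorded))
        return np.sum(np.abs(recorded - simulated))

    result = minimize(objective, default_opc, method="Nelder-Mead",
                      options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6})
    return result.x, result.fun                   # optimized OPC vector and final error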

2.7. Person Authentication

The goal of the Person Authentication module is to confirm or reject a claimed identity based on the comparison of two sets of OPC vectors.

One of the biggest challenges associated with OPC biometrics is the amount of variability present in the estimated OPC. Experiments from which one might infer the variability of OPC values are almost non-existent in the OP literature. Usually, average numbers are derived from strabismus surgeries performed on a limited number of patients [18], and even from cat studies [19]. As a result, it is hard to estimate a priori the amount of variability of the values of the OP properties in a large pool of normal humans. We hypothesize that a substantial amount of variability is present in the OPC; therefore, to ensure accurate authentication, methods that allow addressing these variability concerns are required to make OPC biometrics successful.

Two classifiers fit this purpose: a) the two-sample Student's t-test [20] enhanced by voting, and b) Hotelling's T-square test [21]. Both methods are able to perform acceptance and rejection tests. In the acceptance test, two datasets belonging to the same individual are compared. In the rejection test, the datasets are taken from different people. The outcome of each test determines the authentication accuracy of the corresponding authentication approach.

2.7.1 Student's t-test with Voting

The following null hypothesis (H0) is formulated as part of the Student's t-test, given that two sets of OPC vectors, one from user i and the other from user j, are compared: "H0: There is no difference between the OPC estimation sequences from users i and j." In order to make a conclusion about the difference between two users, the statistical significance (P_level) resulting from the test is compared to the significance threshold α. If the resulting P_level is smaller than α, H0 is rejected, indicating that the OPC estimations belong to different people. Otherwise, H0 is accepted, indicating that the OPC estimations belong to the same person.

The Student's t-test approach allows performing an authentication based on just a single OPC, therefore not taking immediate advantage of the potential information included in the other OPC. In this work we enhance the Student's t-test by considering the voting methods described by Lam and Suen [22]. Such a method accepts a person assuming that for at least k OPC the H0 is accepted, and rejects a person if H0 is accepted for fewer than k OPC. The performance of the Student's t-test with voting is affected by the significance threshold and the number of votes k.
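The acceptance rule can be sketched as below: one two-sample t-test per OPC dimension, with the identity accepted when H0 is retained for at least k of the OPC. The alpha and k defaults follow the settings reported with Table I (α = 0.1, 4 votes; 8 votes under horizontal fusion); the function itself is our illustration, not the authors' code.

# Sketch of the Student's t-test with voting over OPC dimensions.
import numpy as np
from scipy.stats import ttest_ind

def accept_by_voting(enrolled, presented, alpha=0.1, k=4):
    """enrolled, presented: arrays of shape (num_saccades, num_opc)."""
    enrolled = np.asarray(enrolled, dtype=float)
    presented = np.asarray(presented, dtype=float)
    votes = 0
    for opc_index in range(enrolled.shape[1]):
        _, p_value = ttest_ind(enrolled[:, opc_index], presented[:, opc_index])
        if p_value >= alpha:      # H0 retained: this OPC looks like the same person
            votes += 1
    return votes >= k             # accept the claimed identity if enough OPC agree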

2.7.2 Hotelling's T-square Test

Hotelling's T-square test [21] is a multivariate generalization of the Student's t-test and therefore provides an opportunity to assess the similarity of the multivariate distribution of the entire OPC vector, instead of just a single


parameter, as was done in the Student's t-test. The performance of Hotelling's T-square test is affected only by the value of the significance threshold.
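For reference, a textbook two-sample Hotelling's T-square test over entire OPC vectors could be sketched as follows (pooled covariance, F reference distribution); this is standard statistics rather than anything specific to the paper, and the decision rule mirrors the t-test case: retain H0 (same person) if p ≥ α.

# Sketch of a two-sample Hotelling's T-square acceptance test over OPC vectors.
import numpy as np
from scipy.stats import f as f_dist

def hotelling_accept(enrolled, presented, alpha=0.1):
    x, y = np.asarray(enrolled, float), np.asarray(presented, float)
    n1, n2, p = x.shape[0], y.shape[0], x.shape[1]
    diff = x.mean(axis=0) - y.mean(axis=0)
    pooled = ((n1 - 1) * np.cov(x, rowvar=False) +
              (n2 - 1) * np.cov(y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(pooled, diff)
    f_stat = t2 * (n1 + n2 - p - 1) / (p * (n1 + n2 - 2))
    p_value = 1.0 - f_dist.cdf(f_stat, p, n1 + n2 - p - 1)
    return p_value >= alpha       # True: accept the claimed identity (H0 retained)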

2.8. Information Fusion

Information fusion techniques allow improvement of the overall accuracy of an authentication method by considering the information from multiple classifiers [23]. Fusion techniques are usually broken into two categories: a) fusion prior to matching and b) fusion after matching. Fusion prior to matching consolidates information from the classifiers prior to the identity match. Fusion after matching consolidates classification results after an identity decision has already been made by each classifier. Both types of fusion are employed in our work to increase the accuracy of the OPC-based authentication.

2.8.1 Fusion Prior to Matching

Our fusion approach fuses the OPC vectors estimated from the horizontal and vertical movements, taking advantage of the two-dimensional oculomotor plant model and effectively doubling the number of OPC parameters in the combined OPC vector. We call this type of fusion horizontal fusion.

2.8.2 Fusion After Matching

In the fusion after matching category we consider a decision-level fusion technique proposed by Daugman in the form of an AND/OR approach [22]. For simplicity we call this method logical fusion. The AND method accepts an individual only if all of the classifiers accept the individual, therefore providing an opportunity to reduce the combined false acceptance rate while increasing the resulting false rejection rate. The OR method accepts an individual if any one of the classifiers accepts that individual, therefore providing an opportunity to increase the combined false acceptance rate while decreasing the combined false rejection rate.
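In code the two decision-level rules reduce to simple Boolean combinations of the per-classifier accept/reject outputs; the sketch below is only an illustration of that idea.

# Sketch of decision-level (logical) fusion of two accept/reject decisions.
def fuse_and(decision_a: bool, decision_b: bool) -> bool:
    # AND rule: accept only if both classifiers accept (lowers FAR, raises FRR).
    return decision_a and decision_b

def fuse_or(decision_a: bool, decision_b: bool) -> bool:
    # OR rule: accept if either classifier accepts (raises FAR, lowers FRR).
    return decision_a or decision_b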

3. Methodology

3.1. Apparatus & Software

The data was recorded using the EyeLink II eye tracker with a sampling frequency of 1000 Hz [24]. The EyeLink II provides drift-free eye tracking with a spatial resolution of 0.01° and 0.25-0.5° of positional accuracy. The EyeLink II enables eye-to-camera distances between 60 and 150 cm and horizontal and vertical operating ranges of 55° and 45°, respectively. To ensure high accuracy of the eye movement recording, a chin rest was employed. The chin rest was positioned to assure a 70 cm distance between the display surface and the eyes of the subject.

The OPC biometrics architecture was implemented in MATLAB. All data was processed offline.

3.2. Participants

A total of 59 participants (46 males, 13 females), ages 18-45 years with an average age of 24 (SD = 6.1), volunteered for the project. The mean positional accuracy of the recordings, averaged between all screen regions, was 1.41° (SD = 1.91°).

All subjects participated in two recording sessions that presented identical eye movement invocation tasks, with approximately a 20 minute break between the sessions. Before each recording session, for each subject and eye movement invocation task, the eye tracking equipment was recalibrated to ensure high positional accuracy of the recorded data.

3.3. Stimuli & Resulting Datasets

The goal of the stimulus was to invoke a large number of vertical and horizontal saccades to allow reliable authentication. The stimulus was displayed as a jumping dot, consisting of a grey disc sized approximately 1° with a small black point in the center. The dot performed 100 jumps horizontally and 100 jumps vertically.

The amplitude of the vertical jumps was 20° for all subjects. However, the horizontal jumps had an amplitude of 20° for approximately half of the subjects (27) and 30° for the other half (32). The variation in the horizontal amplitudes allowed assessing classification performance due to stimulus changes, while the fixed vertical amplitude allowed testing the scalability of the OPC biometrics for a larger pool of individuals.

The horizontal component of movement from horizontal saccades with 20° amplitude and the vertical component of movement from vertical saccades with 20° amplitude, obtained from the first 27 subjects, comprised Dataset I. The horizontal component of movement from horizontal saccades with 30° amplitude and the vertical component of movement from vertical saccades with 20° amplitude, recorded from the remaining 32 subjects, comprised Dataset II. Dataset I+II combined the data from Datasets I and II.

The use of just the horizontal movement components from purely horizontal saccades and the vertical components from purely vertical saccades allows a substantial improvement in the quality of the data employed for authentication by disregarding orthogonal movement jitter. Additionally, such eye movement data allows a subsequent check for saccade normality by filtering via the corresponding amplitude-duration and amplitude-maximum velocity relationships (main-sequence relationships) [12] and discarding outliers.

Each dataset represents the data from the first and second recording sessions for all subjects. Each subject in a dataset is represented by the 30 "best" saccades in cases when only one movement component is considered, and 60 in cases when both horizontal and vertical movement components are considered. The best saccades are defined as the saccades that produce the smallest error between the recorded and the


simulated trajectory. Such filtering enables a further improvement in the quality of the data employed for the authentication.

3.4. Performance Evaluation Metrics

The following metrics were employed to assess authentication accuracy:

False Acceptance Rate (FAR) expresses, in general, the probability that a given individual is falsely accepted into the system. This rate was computed as the number of rejection tests that failed (marking two different subjects as the same person) divided by the total number of rejection tests performed.

False Rejection Rate (FRR) expresses, in general, the probability that a given individual is falsely rejected by the system while it should be accepted. In this work, the FRR was computed as the number of acceptance tests that failed (marking two datasets from the same person as being from different people) divided by the total number of acceptance tests performed.

Half Total Error Rate (HTER) is defined as the averaged combination of the false acceptance and false rejection rates.
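The sketch below simply restates these definitions as code; the example counts are invented for illustration only.

# Sketch of the reported rates: FAR from failed rejection tests, FRR from failed
# acceptance tests, HTER as their average.
def error_rates(failed_rejections, total_rejections,
                failed_acceptances, total_acceptances):
    far = failed_rejections / total_rejections        # impostor accepted
    frr = failed_acceptances / total_acceptances      # genuine user rejected
    hter = (far + frr) / 2.0
    return far, frr, hter

# Example: 30 of 200 rejection tests failed, 40 of 200 acceptance tests failed.
print(error_rates(30, 200, 40, 200))   # -> (0.15, 0.2, 0.175)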

4. Results

Principal component analysis (PCA) was performed on the nine OPC that comprise an OPC vector in an effort to reduce the number of parameters needed for the authentication. The results of PCA indicate that series elasticity, passive viscosity of the eye globe, the eye globe's inertia, the agonist muscle's tension slope, and the antagonist muscle's tension slope account for 77% of the total variance in the recorded data. Only the authentication results involving these six OPC for each of the movement components are presented further, in Table I.

4.1. Equipment Calibration Impact

Saccades employed from the same recording session for enrollment and authentication frequently resulted in a smaller HTER than in cases when saccades from one session were used for enrollment and saccades from another session were used for authentication. For example, for the T(hor) method on Dataset I+II, the single-session HTERs are 23-25.5%, while the combined-session HTER increased to 32%. We hypothesize that such differences might have occurred due to biases introduced by the calibration procedure, and they can be explained by the mathematics employed in the EyeLink eye tracker for calibration and the subsequent interpolation of the eye gaze coordinates.

During data collection all experimental parameters, including the subject's head and equipment placement, were carefully controlled by using a chinrest, with the goal of keeping them exactly the same between the recording sessions. However, the results of the calibration procedure, which matches the eye gaze vector with screen coordinates and is performed by the EyeLink eye tracker, were different for each recording session for the same subject, within the specified positional accuracy error. This phenomenon can possibly be explained by the calibration algorithms employed by the EyeLink eye tracker. The EyeLink system uses algorithms similar to Stampe's [25], which employ 2D regression-based gaze interpolation between the calibration points.
Table I. Performance of the OPC biometrics for various authentication methods and datasets, expressed as HTER (numbers are percentages). In the Method & Data Description column, T represents Hotelling's T-square test and S represents Student's t-test with voting; (hor) represents data from the horizontal movement component of horizontal saccades, (ver) represents data from the vertical movement component of vertical saccades, and (hor,ver) represents the case of horizontal data fusion. OR and AND represent logical fusion techniques. The Dataset row indicates the datasets described in Section 3.3; Dataset I+II combines subjects' records from Datasets I and II. The Session row represents recording session data (Section 3.2): 1 indicates that the first half of the saccades from session 1 was employed for enrollment and the other half for authentication, and 1+2 indicates that the saccades from session 1 were employed for enrollment and the saccades from session 2 for authentication, and vice versa. Results for Student's t-test with voting represent values obtained with 4 votes (8 votes in the case of horizontal fusion). The significance threshold α for Student's t-test and Hotelling's T-square test was 0.1.

                                      Dataset I            Dataset II           Dataset I+II
Row  Method & Data Description        1     2     1+2      1     2     1+2      1     2     1+2
 1   T(hor)                           29    24.5  34       25.5  35    36.5     23    25.5  32
 2   T(ver)                           35    33    36.5     30    42.5  35.5     33    36    34
 3   S(hor)                           22    29.5  38.5     31    33    37       25    29    35.5
 4   S(ver)                           22    29.5  36       44    43.5  40       38.5  37    36
 5   T(hor,ver)                       32.5  18.5  25       22.5  36    35       25    26.5  29.5
 6   S(hor,ver)                       24    30.5  31.5     30.5  33.5  33.5     22.5  26.5  32
 7   T(hor) OR T(ver)                 29.5  28.5  31       28.5  35.5  33       24    27.5  31.5
 8   S(hor) OR S(ver)                 36.5  33    33.5     33.5  42.5  35.5     50    50    33.5
 9   T(hor) OR S(hor)                 23.5  32.5  31.5     31.5  34    36.5     27    33.5  32
10   T(ver) OR S(ver)                 24.5  33    33       39    41    39       29    36.5  33.5
11   T(hor) AND S(hor)                28.5  23    33       28.5  35    33.5     24.5  25.5  30
12   T(ver) AND S(ver)                34    33    32.5     34.5  45    34       32.5  37    32
13   T(hor) AND T(ver)                32    21    35       29.5  40.5  38.5     28.5  28.5  35
14   S(hor) AND S(ver)                28.5  23    28.5     30.5  31.5  42.5     24    27    33.5
15   T(hor,ver) AND S(hor,ver)        24    24    35       22.5  37.5  35.5     25.5  27    34
16   T(hor,ver) OR S(hor,ver)         24    26    37.5     24.5  33.5  41       21.5  23.5  38


Stampe's algorithms do not guarantee head pose invariance. As a result, the interpolation is extremely sensitive to even slight differences in the estimation of the eye-gaze direction for any point (especially in the periphery) of the calibration map, and even the slightest changes in the subject's head position are translated into substantial calibration biases in the recorded data. We hypothesize that the application of 3D model-based eye-gaze estimation [26] as a part of the calibration and eye gaze computation might prove beneficial for OPC biometrics due to its higher head pose invariance. However, very high-sampling-frequency commercial eye-tracking systems that use this type of gaze estimation are not yet available.

4.2. Impact of Person Authentication Methods

As shown in rows 1-4 of Table I, Hotelling's T-square test in general produced slightly more accurate authentication results than Student's t-test with voting for all datasets under consideration. For example, on Dataset I+II, Hotelling's T-square test produced an HTER of 32% for the horizontal component, while Student's t-test with voting produced an HTER of 35.5%.

4.3. Impact of 2D Eye Movements & Horizontal Fusion

As shown in rows 7-8 of Table I, the consideration of both the vertical and horizontal components of the eye movements via horizontal and vertical fusion ensured slightly more accurate authentication. This trend was true for both Hotelling's T-square test and Student's t-test with voting. For example, on Dataset I+II the Hotelling's T-square test produced an HTER of 32% for horizontal and 34% for vertical data; however, the use of the vertical and horizontal components of movement via horizontal fusion reduced the HTER to 29.5%.

4.4. Impact of Logical Fusion

According to rows 7-14 of Table I, the application of logical fusion provided a slight increase in authentication accuracy when compared to the pure person authentication methods. For example, on Dataset I+II, Hotelling's T-square test for the horizontal component produced an HTER of 32% and Student's t-test with voting produced an HTER of 35.5%; however, logical fusion reduced the HTER to 30% (row 11).

Logical fusion did not always provide an improvement in accuracy when compared to horizontal/vertical fusion. For example, on Dataset I+II, Hotelling's T-square test with horizontal fusion yielded an HTER of 29.5%, and all logical fusion methods provided HTERs that were slightly higher. Approaches that involved both logical and horizontal/vertical fusion did not, in general, perform better than a single fusion type.

4.5. Impact of Stimuli Properties

The results from the horizontal data presented in rows 1 and 3 of Table I did not indicate large changes when different amplitudes of the step stimulus were presented. The corresponding HTERs were 4-16.5% smaller within each single session than between sessions, both for the group of subjects presented with a stimulus amplitude of 30° and for the group presented with a stimulus amplitude of 20°. Multiple-session results differed only slightly, by less than 2%.

For Dataset I+II, where different stimulus amplitudes were used for the different subject groups, the accuracy of authentication was improved by 1-4.5%. This result suggests that stimulus amplitude does impact the results of biometric authentication, although only slightly. Additional research is required for further clarification.

4.6. Scalability of the OPC Biometrics

The results from the vertical data presented in rows 2 and 4 of Table I indicate the scalability potential of the OPC biometrics, because the vertical component of movement considers saccades of the same amplitude for all subjects. When the number of subjects was increased from 27 to 59, the results of the authentication did not change significantly. The corresponding HTER remained almost the same for both Student's t-test and Hotelling's T-square test. Multiple-session results were also affected only slightly, with accuracy decreasing by 1.5-4.5% in terms of HTER for both authentication methods for the larger group of subjects.

4.7. Receiver Operating Characteristics Curve

Figure 2 presents Receiver Operating Characteristic (ROC) curves. The results include a mix of the best performing methods, with and without fusion, according to the corresponding HTER for the data from Table I.

Figure 2. Receiver Operating Characteristic curves (False Acceptance Rate vs. False Rejection Rate) for the best performing methods of biometric authentication via OPC. Biometric methods and datasets are coded as in Table I; curves are shown for T(hor), T(ver), T(hor,ver), T(hor) AND S(hor), and T(hor,ver) OR S(hor,ver) on Dataset I+II, including the overall best single-session and multiple-session configurations.

5. Limitations

Recording Equipment: The OPC biometrics exploration done in this work was conducted with very accurate eye tracking equipment with a very high sampling rate. Subjects were positioned in a chinrest to avoid potential accuracy issues. Additional research is required to understand the tradeoffs between the authentication accuracy of OPC biometrics and the equipment's sampling rate, positional accuracy, and freedom of head movement.

Calibration: The results indicate that calibration methods have a large impact on the resulting accuracy of authentication, necessitating further investigation into making the OPC biometrics components less dependent on calibration and/or employing eye tracking calibration techniques that maintain calibration consistency between the recording sessions.

Stimulus: The jumping dot stimulus employed in this


work was purposefully fixed in amplitude and exhibited a large number of jumps. Such fixed experimental parameters allowed us to establish a baseline for OPC biometric performance in an environment that is close to ideal. However, additional work is required to understand the OPC biometric performance for saccades that have randomized amplitudes, various spatial placements, and different quantities.

OPC Estimation Speed: The estimation of an optimal OPC vector containing nine parameters from a single saccade takes on average two hours on an Intel Q6600 processor, using one core and assuming the MATLAB implementation of the fminsearch function. However, the OPC biometrics architecture is highly parallelizable, with each individual saccade trajectory easily processed by a separate core. Additionally, implementation in a programming language such as C/C++ might speed up the estimation process.
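The parallelization noted above could be sketched as below: since each saccade's OPC vector is estimated independently, the per-saccade optimization can be farmed out to separate cores. The estimate_opc_for_saccade function is a placeholder for the Nelder-Mead estimation step sketched in Section 2.6; this is an illustration, not the authors' implementation.

# Sketch of per-saccade parallel OPC estimation across CPU cores.
from multiprocessing import Pool

def estimate_all(saccades, estimate_opc_for_saccade, processes=4):
    # Each saccade is processed independently, so a simple process pool suffices.
    with Pool(processes=processes) as pool:
        return pool.map(estimate_opc_for_saccade, saccades)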

The linear design of the OPMM makes it possible to seek analytical solutions to the differential equations describing the model, therefore providing an opportunity for the direct extraction of the OPC from saccade trajectories. However, the search for such a solution presents a substantial analytical challenge and will be explored in future work.

Stability of the OPC trait: The time interval between the recording sessions for each subject was approximately 20 min. Such a time difference provides extremely limited insight into the stability of the OPC biometrics over a longer time span and into the impact of such factors as stress, fatigue, aging, and illness. Additional research needs to be conducted to explore the long-term stability of the OPC trait.

6. Discussion

Highest Possible Accuracy between Multiple Sessions: It is possible to perform authentication by selecting the "best" combination of OPC for each dataset, session set, and authentication method. These best combinations, in the multiple-session categories, allow reducing the HTER by 4-9% compared to the results presented in Table I. The best combination of OPC parameters for Dataset I+II, Session 1+2, with a single authentication method provided an HTER of 29% in the case of the S(hor) method. The best overall result for Dataset I+II, Session 1+2, aided by horizontal and logical fusion, was produced by the T(hor,ver) OR S(hor,ver) method with an HTER of 25% (FAR = 25%, FRR = 25%). The ROC curve for the best performing method is displayed in Figure 2.

Highest Possible Accuracy within Single Sessions: The best combination of OPC parameters for Dataset I+II within a single session and with a single authentication method achieved an HTER of 16.5% (FAR = 16% and FRR = 17%) for the T(hor) method during Session 1. The overall best single-session performance on Dataset I+II, aided by horizontal and logical fusion, was produced by the T(hor,ver) OR S(hor,ver) method with an HTER of 15% (FAR = 15%, FRR = 15%). The ROC curve for the best performing method is displayed in Figure 2.

The large difference in performance between the cases when saccades from different sessions and from the same session (HTER = 25% vs. HTER = 15%) are employed for enrollment and verification again supports the hypothesis that eye tracker calibration methods might be responsible for such differences. Future investigation that involves eye tracking equipment with different calibration methods is required.

7. Conclusion and Future Work

This paper outlined and explored a novel biometrics approach that allows person identification via the anatomical characteristics of the Oculomotor Plant (OP). Given the limited pool of 59 volunteers, the OPC biometrics in the authentication mode achieved an HTER of 25% with the

optimal set of OPC parameters when the eye movement records were affected by the eye tracking equipment calibration biases. When the impact of the calibration procedure was removed, the resulting HTER was 15% in the best case.

Among the statistical methods employed for person authentication, Hotelling's T-square test in general provided higher accuracy. Information fusion prior to matching, combining the horizontal and vertical components, achieved slightly higher authentication accuracy than when no fusion was performed. The same was true for the logical fusion methods employed after the initial match was made by a separate person authentication method. Application of both information fusion types at the same time, in general, did not provide higher authentication accuracy than the employment of a single type of fusion method.

An increase in the number of subjects from 27 to 59 did not decrease the authentication performance. The assignment of different stimulus amplitudes to different subject groups very slightly improved authentication accuracy.

It is important to conduct more work to ensure the independence of OPC biometrics from equipment calibration biases, because this is the main factor degrading authentication performance. Additional work should be performed to allow faster estimation of the OPC values. The stability of the biometrics needs to be verified against a more diverse array of stimuli, eye tracking equipment, a larger group of subjects, and a longer time span.

8. Acknowledgements

This work was funded by grant #60NANB10D213 from the National Institute of Standards and Technology (NIST).

9. References

[1] L. D. Harmon, M. K. Khan, R. Lasch, and P. F. Ramig, "Machine identification of human faces," Pattern Recognition, vol. 13, pp. 97-110, 1981.
[2] A. Jain, L. Hong, and Y. Kulkarni, "A Multimodal Biometric System Using Fingerprint, Face, and Speech," in International Conference on AVBPA, 1999, pp. 182-187.
[3] J. G. Daugman, "High Confidence Visual Recognition of Persons by a Test of Statistical Independence," IEEE Trans. Pattern Anal. Mach. Intell., vol. 15, pp. 1148-1161, 1993.
[4] L. Wiskott, "Face recognition by elastic bunch graph matching," 1997, pp. 129-129.
[5] J. M. Williams, "Biometrics or ... biohazards?," presented at the Proceedings of the 2002 Workshop on New Security Paradigms, Virginia Beach, Virginia, 2002.
[6] K. Hollingsworth, K. W. Bowyer, and P. J. Flynn, "All Iris Code Bits are Not Created Equal," in Biometrics: Theory, Applications, and Systems (BTAS 2007), First IEEE International Conference on, 2007, pp. 1-6.
[7] A. Jain, L. Hong, and S. Pankanti, "Biometric identification," Commun. ACM, vol. 43, pp. 90-98, 2000.
[8] U. Park, A. Ross, and A. K. Jain, "Periocular biometrics in the visible spectrum: a feasibility study," presented at the Proceedings of the 3rd IEEE International Conference on Biometrics: Theory, Applications and Systems, Washington, DC, USA, 2009.
[9] P. Kasprowski and J. Ober, "Eye Movements in Biometrics," in Biometric Authentication, 2004, pp. 248-258.
[10] R. Bednarik, T. Kinnunen, A. Mihaila, and P. Fränti, "Eye-Movements as a Biometric," in Image Analysis, 2005, pp. 780-789.
[11] D. R. Wilkie, "Muscle," Studies in Biology, vol. 11, 1976.
[12] R. J. Leigh and D. S. Zee, The Neurology of Eye Movements: Oxford University Press, 2006.
[13] O. V. Komogortsev, U. K. S. Jayarathna, C. R. Aragon, and M. Mechehoul, "Biometric Identification via an Oculomotor Plant Mathematical Model," in ACM Eye Tracking Research & Applications Symposium, Austin, TX, 2010, pp. 1-4.
[14] O. V. Komogortsev, U. K. S. Jayarathna, D. H. Koh, and M. Gowda, "Qualitative and Quantitative Scoring and Evaluation of the Eye Movement Classification Algorithms," in ACM Eye Tracking Research & Applications Symposium, Austin, TX, 2010, pp. 1-4.
[15] O. V. Komogortsev and U. K. S. Jayarathna, "2D Oculomotor Plant Mathematical Model for eye movement simulation," in IEEE International Conference on BioInformatics and BioEngineering (BIBE), 2008, pp. 1-8.
[16] O. V. Komogortsev and J. Khan, "Eye Movement Prediction by Kalman Filter with Integrated Linear Horizontal Oculomotor Plant Mechanical Model," in ACM Eye Tracking Research & Applications Symposium, Savannah, GA, 2008, pp. 229-236.
[17] J. C. Lagarias, J. A. Reeds, M. H. Wright, and P. E. Wright, "Convergence Properties of the Nelder-Mead Simplex Method in Low Dimensions," SIAM J. on Optimization, vol. 9, pp. 112-147, 1998.
[18] C. C. Collins, "The human oculomotor control system," Basic Mechanisms of Ocular Motility and Their Applications, pp. 145-180, 1975.
[19] C. Collins, "Orbital mechanics," in The Control of Eye Movements, P. Bach-y-Rita and C. Collins, Eds. New York: Academic, 1971.
[20] B. J. Winer, D. R. Brown, and K. M. Michaels, Statistical Principles in Experimental Design: McGraw-Hill, 1991.
[21] H. Hotelling, "The Generalization of Student's Ratio," The Annals of Mathematical Statistics, vol. 2, pp. 360-378, 1931.
[22] L. Lam and S. Y. Suen, "Application of majority voting to pattern recognition: an analysis of its behavior and performance," IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 27, pp. 553-568, 1997.
[23] A. A. Ross, K. Nandakumar, and A. K. Jain, Handbook of Multibiometrics: Springer, 2006.
[24] EyeLink. (2010). EyeLink II. Available: http://www.sr-research.com/EL_1000.html
[25] D. Stampe, "Heuristic filtering and reliable calibration methods for video-based pupil-tracking systems," Behavior Research Methods, vol. 25, pp. 137-142, 1993.
[26] D. W. Hansen and J. Qiang, "In the Eye of the Beholder: A Survey of Models for Eyes and Gaze," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, pp. 478-500, 2010.