Automated Web-Based Behavioral Test for Early Detection of Alzheimer’s Disease

Eugene Agichtein*, Elizabeth Buffalo, Dmitry Lagun, Allan Levey, Cecelia Manzanares, JongHo Shin, Stuart Zola

Intelligent Information Access Lab
Emory University


Emory IR Lab: Research Directions

Modeling collaborative content creation for information organization and indexing.

2


Mining search behavior data to improve information finding.


Medical applications of Search, NLP, and behavior modeling.

Mild Cognitive Impairment (MCI) and Alzheimer’s Disease

Alzheimer’s disease (AD) affects more than 5 million Americans, a number expected to grow in the coming decade.

Memory impairment (amnestic MCI, or aMCI) indicates the onset of AD, which affects the hippocampus first.

The Visual Paired Comparison (VPC) task is promising for early diagnosis of both MCI and AD, before impairment is detectable by other means.

3

VPC: Familiarization Phase

4

VPC: Delay Phase

Delay

5

VPC: Test Phase
6

VPC Task: Eye Tracking Equipment

7

8

Subjects with normal visual recognition memory spend more than 66% of viewing time on the novel images.
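As a minimal illustration of how a novelty preference score could be computed from test-phase fixations, here is a sketch in Python; the fixation format, screen split, and threshold interpretation are assumptions, not the authors’ implementation.

```python
# Sketch: computing a novelty-preference (NP) score from VPC test-phase fixations.
# Assumes each fixation is (x, y, duration_ms) and that we know which half of the
# screen holds the novel image; field names and screen split are illustrative.

def novelty_preference(fixations, novel_on_left, screen_width=1024):
    """Fraction of total fixation time spent on the novel image."""
    novel_time = 0.0
    total_time = 0.0
    for x, y, duration in fixations:
        on_left = x < screen_width / 2
        if on_left == novel_on_left:
            novel_time += duration
        total_time += duration
    return novel_time / total_time if total_time > 0 else 0.0

# Example: a subject with normal recognition memory is expected to spend
# more than ~66% of viewing time on the novel image.
fixations = [(200, 300, 400), (180, 320, 350), (700, 310, 250)]  # (x, y, ms)
np_score = novelty_preference(fixations, novel_on_left=True)
print(f"NP = {np_score:.2f}, normal range: NP > 0.66")
```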

VPC: Low Performance Indicates Increased Risk for Alzheimer’s Disease

1. Detects onset earlier than ever before possible
2. Sets the stage for intervention

9


Behavioral Performance on the VPC Test is a Predictor of Cognitive Decline



10

[Zola et al., AAIC 2012]

Scores on the VPC task accurately predicted, up to three years prior to a change in clinical diagnosis, which MCI patients would progress to AD and which normal subjects would progress to MCI.

VPC: Gaze Movement Analysis

11

Lagun et al., Journal of Neuroscience Methods, 2011

Visual examination behavior in the VPC test phase.

In this representative
example, the familiar image is on the left (A), and the novel image is on the right
(B), for a normal control subject. The detected gaze positions are indicated by
blue circles, with the connecting lines indicating the ordering of the gaze
positions.

Technical Contribution: Eye Movement Analysis

12

Lagun et al., Journal of Neuroscience Methods, 2011

Significant Performance Improvements

13

| Method | Features | Accuracy | Sensitivity | Specificity | AUC |
| Baseline | NP | 0.667 | 0.6 | 0.734 | 0.667 |
| LR | NP+SO+RF+FD | 0.71 | 0.712 | 0.707 | 0.71 |
| SVM | NP+SO+RF+FD | 0.869* (+30%) | 0.967* (+61%) | 0.772* (+5%) | 0.869* (+30%) |

Lagun et al., Journal of Neuroscience Methods, 2011
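The comparison in the table could be set up roughly as in the following scikit-learn sketch; the feature matrix, labels, and the exact definitions of the NP, SO, RF, and FD features are placeholders, and only the overall setup (a single-feature baseline vs. LR and an SVM on all features) mirrors the slide.

```python
# Sketch: comparing a single-feature baseline, logistic regression, and an SVM
# on eye-movement features (NP plus additional features such as SO, RF, FD).
# X and y are placeholders; real features would come from the VPC gaze data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))       # columns: NP, SO, RF, FD (hypothetical)
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=60) > 0).astype(int)

baseline = LogisticRegression()     # "baseline": NP only
lr = LogisticRegression()           # all features
svm = SVC(kernel="rbf")

for name, model, cols in [("Baseline (NP)", baseline, [0]),
                          ("LR (NP+SO+RF+FD)", lr, [0, 1, 2, 3]),
                          ("SVM (NP+SO+RF+FD)", svm, [0, 1, 2, 3])]:
    auc = cross_val_score(model, X[:, cols], y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: AUC = {auc:.3f}")
```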

Our Big Idea: Web-based VPC task (VPW)

with E. Buffalo, D. Lagun, S. Zola

Web-based version of VPC, without an eye tracker.

Can be administered anywhere in the world, on any modern computer.

Can adapt classification algorithms to automatically interpret the viewing data collected with VPW.


14

VPC-W Architecture

15

VPC-W: basic prototype demo

Phases shown: Familiarization (identical images), Delay, Test (novel image on the left), with the ViewPort position tracked.

16

Experiment Overview

Step 1: Optimize VPC-W on (presumably) Normal Control (NC) subjects

Step 2: Analyze VPC-W subject behavior with both gaze tracking and viewport tracking simultaneously

Step 3: Validate VPC-W prediction on discriminating Impaired (MCI/AD) vs. NC subjects

17

VPC-W: Novelty Preference Preserved

| Delay (seconds) | Mean novelty preference, VPC (N=30) | Mean novelty preference, VPC-W (N=34) |
| 10 | 67% | 65% |
| 60 | 68% | 69% |

Self-reported elderly NC subjects tested with VPC-W over the internet exhibit novelty preference similar to that of VPC. A single-factor ANOVA reveals no significant difference between the VPC and VPC-W subjects.
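A minimal sketch of the single-factor ANOVA comparison, using SciPy; the per-subject novelty preference values below are made-up placeholders, not the study data.

```python
# Sketch: single-factor (one-way) ANOVA comparing novelty preference
# between VPC (in-lab eye tracking) and VPC-W (web-based) subjects.
# The np_vpc / np_vpcw arrays are illustrative, not the study data.
from scipy import stats

np_vpc = [0.67, 0.70, 0.64, 0.69, 0.66, 0.71]    # per-subject NP, VPC
np_vpcw = [0.65, 0.68, 0.66, 0.70, 0.63, 0.69]   # per-subject NP, VPC-W

f_stat, p_value = stats.f_oneway(np_vpc, np_vpcw)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
# p > 0.05 would indicate no significant difference between the two groups.
```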

18

VPC vs. VPC-W: Similar Areas of Interest

VPC ranking | VPC-W ranking

Quantifying viewing similarity: as a coarse measure, divide each image into 9 regions (3x3) and rank the regions by VPC and VPW viewing time. The Spearman rank correlation varies between 0.56 and 0.72 for different stimuli.
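A sketch of this coarse similarity measure: accumulate viewing time in a 3x3 grid of regions for each condition and correlate the resulting region times with Spearman’s rho. The fixation format and image size are assumptions.

```python
# Sketch: coarse viewing-similarity measure over a 3x3 grid of image regions.
# Fixations are (x, y, duration_ms); image size and data are illustrative.
import numpy as np
from scipy import stats

def region_times(fixations, width=800, height=600, grid=3):
    """Sum viewing time per cell of a grid x grid partition of the image."""
    times = np.zeros(grid * grid)
    for x, y, duration in fixations:
        col = min(int(x / width * grid), grid - 1)
        row = min(int(y / height * grid), grid - 1)
        times[row * grid + col] += duration
    return times

vpc_fix = [(100, 100, 300), (400, 300, 500), (700, 500, 200)]   # eye tracking
vpcw_fix = [(120, 90, 250), (390, 310, 450), (650, 520, 300)]   # viewport-based

rho, _ = stats.spearmanr(region_times(vpc_fix), region_times(vpcw_fix))
print(f"Spearman rank correlation between VPC and VPC-W region times: {rho:.2f}")
```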

VPC | VPC-W

Areas of attention: the heat map for VPW (viewport-based) is concentrated in areas similar to those for VPC (unrestricted eye tracking).



19

Actual Gaze vs. Viewport Position

20

Attention w.r.t. ViewPort

Eye-Cursor Time Lag Analysis

21

XY: minimum at -75.00 ms (199.8578)
X: minimum at -90.00 ms (161.8480)
Y: minimum at -35.00 ms (116.3665)
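One way such a lag analysis could be carried out is sketched below: shift the cursor/viewport trajectory relative to the gaze trajectory over a range of lags and keep the lag that minimizes the mean Euclidean distance. The signals, sampling rate, and sign convention are assumptions, not the exact procedure used here.

```python
# Sketch: eye-cursor (viewport) time-lag analysis.
# Shift the cursor signal by a range of lags and report the lag that minimizes
# the mean eye-cursor distance. Signals and sampling rate are illustrative.
import numpy as np

def best_lag(eye_xy, cursor_xy, max_lag_samples=20, dt_ms=5.0):
    """Return (lag_ms, distance) minimizing mean Euclidean eye-cursor distance."""
    best = (0.0, np.inf)
    for lag in range(-max_lag_samples, max_lag_samples + 1):
        if lag >= 0:
            e, c = eye_xy[lag:], cursor_xy[:len(cursor_xy) - lag]
        else:
            e, c = eye_xy[:lag], cursor_xy[-lag:]
        dist = np.mean(np.linalg.norm(e - c, axis=1))
        if dist < best[1]:
            best = (lag * dt_ms, dist)
    return best

t = np.linspace(0, 2 * np.pi, 400)
eye = np.column_stack([np.cos(t), np.sin(t)]) * 100
cursor = np.roll(eye, 15, axis=0)           # cursor trails the eye by 15 samples
lag_ms, dist = best_lag(eye, cursor)
print(f"minimum mean distance {dist:.2f} at lag {lag_ms:.1f} ms")
```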


Viewport Movement ~ Eye Movement


Normal elderly subject (NP=88%, novel image is on the left).

Impaired elderly subject (NP=49%, novel image is on the left).

22

Exploiting Viewport Movement Data

Features: Novelty Preference + fixation duration distribution
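A sketch of how a VPC-W feature vector could combine these two signals: the novelty preference score concatenated with a normalized histogram of viewport dwell ("fixation") durations. The dwell-detection rule, bin edges, and log format are assumptions.

```python
# Sketch: building a VPC-W feature vector from viewport movement data:
# novelty preference plus a fixation-duration distribution (histogram).
# Dwell detection, bin edges, and the viewport log format are illustrative.
import numpy as np

def dwell_durations(viewport_log, move_threshold=10.0):
    """Group consecutive viewport samples that barely move into dwells (ms)."""
    durations, current = [], 0.0
    for (x0, y0, t0), (x1, y1, t1) in zip(viewport_log, viewport_log[1:]):
        if np.hypot(x1 - x0, y1 - y0) < move_threshold:
            current += t1 - t0
        elif current > 0:
            durations.append(current)
            current = 0.0
    if current > 0:
        durations.append(current)
    return durations

def feature_vector(viewport_log, novelty_preference,
                   bins=(0, 100, 200, 400, 800, 1600)):
    """Concatenate NP with a normalized fixation-duration histogram."""
    hist, _ = np.histogram(dwell_durations(viewport_log), bins=bins)
    hist = hist / max(hist.sum(), 1)
    return np.concatenate([[novelty_preference], hist])

# viewport samples: (x, y, timestamp_ms)
log = [(100, 100, 0), (102, 101, 50), (103, 100, 120), (400, 300, 180), (401, 300, 400)]
print(feature_vector(log, novelty_preference=0.7))
```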

23

VPC-W Results: Detecting MCI

21 subjects (11 NC, 10 aMCI), recruited at the Emory ADRC.

Accuracy on the pilot data is comparable to the best reported values for a manually administered cognitive assessment test (MC-FAQ, with reported accuracy, specificity, and sensitivity of 0.83, 0.9, and 0.89, respectively) (Steenland et al., 2009).

| Classification method | 5-fold: Acc. | Sens. | Spec. | AUC | 10-fold: Acc. | Sens. | Spec. | AUC | Leave-1-out: Acc. | Sens. | Spec. | AUC |
| Baseline: NP >= 0.58 | 0.81 | 0.80 | 0.82 | 0.81 | 0.81 | 0.80 | 0.82 | 0.81 | 0.81 | 0.80 | 0.82 | 0.81 |
| SVM (VPC-W) | 0.81 | 0.80 | 0.83 | 0.81 | 0.85 | 0.80 | 0.9 | 0.86 | 0.86 | 0.80 | 0.91 | 0.86 |

Accuracy, Sensitivity, Specificity, and AUC (area under the ROC curve) for automatically classifying patients tested with VPC-W, using 5-fold, 10-fold, and leave-one-out cross validation.
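An evaluation along these lines could be run as in the following sketch, which reports cross-validated accuracy for an SVM under 5-fold, 10-fold, and leave-one-out schemes; the feature matrix and labels are synthetic placeholders standing in for the 21-subject pilot data.

```python
# Sketch: evaluating a VPC-W classifier with 5-fold, 10-fold, and
# leave-one-out cross validation. X and y are synthetic placeholders
# standing in for the 21-subject (NC vs. aMCI) pilot feature vectors.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score, KFold, LeaveOneOut

rng = np.random.default_rng(1)
X = rng.normal(size=(21, 6))                      # e.g. NP + duration histogram
y = (X[:, 0] + rng.normal(scale=0.7, size=21) > 0).astype(int)  # NC=0, aMCI=1

clf = SVC(kernel="rbf")
for name, cv in [("5-fold", KFold(5, shuffle=True, random_state=0)),
                 ("10-fold", KFold(10, shuffle=True, random_state=0)),
                 ("leave-1-out", LeaveOneOut())]:
    acc = cross_val_score(clf, X, y, cv=cv, scoring="accuracy").mean()
    print(f"{name}: accuracy = {acc:.2f}")
```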

24

Current Work


Analysis:


Applying deep learning and “motif” analysis for more accurate trajectory analysis


Incorporating visual saliency signals


Data collection:


Longitudinal tracking of subjects


“Test/Retest”: effects of repeated testing


Sensitivity analysis: for possible use in drug trials


Wide range of “normative” data using the MTurk worker pool



25

Future Directions and Collaboration Possibilities


Can we apply similar or the same techniques for cost-effective and accessible detection of:

Autism (previous work on differences in gaze patterns)

ADHD

Stroke/Brain trauma

Other possibilities?

What can we learn about the searcher from their natural search and browsing behavior?

Image search and examination preferences (anorexia)

Correlate behavior with biomarkers (Health 101 cohort)






26

VPC-W Summary

VPC-W, administered over the internet, elicits viewing behavior in normal elderly subjects similar to the eye tracking-based VPC task in the clinic.

Preliminary results show automatic identification of amnestic MCI subjects with accuracy comparable to the best manually administered tests.

VPC-W and the associated classification algorithms could facilitate cost-effective and widely accessible screening for memory loss with a simple log-on to a computer.

Other potential applications: online detection and monitoring of ADD, ADHD, Autism, and other neurological disorders.

This project has the potential to dramatically enhance the current practice of Alzheimer’s clinical and translational research.

27

Eye Tracking for Interpreting Search Behavior

Eye tracking gives information about searcher interests:


Eye position


Pupil diameter


Saccades and fixations

Reading | Search | Camera

28

We Will Put an Eye Tracker on Every Table!
- E. Agichtein, 2010


Problem: eye tracking equipment is expensive and not widely available.

Solution: infer the searcher’s gaze position from searcher interactions.


29

Inferring Gaze from Mouse Movements

Actual Eye-Mouse Coordination | Predicted

Coordination patterns: No Coordination (35%), Bookmarking (30%), Eye follows mouse (35%)

30

Guo & Agichtein, CHI WIP 2010

Post-click Page Examination Patterns


Two basic patterns: “Reading”
and “Scanning”


“Reading”: consuming or
verifying when (seemingly)
relevant information is found



“Scanning”: the relevant information has not yet been found; the user is still visually searching

31

Cursor Heatmaps (Reading vs. Scanning)

[Task: “verizon helpline number”]

32

Relevant (dwell time: 30s) | Not Relevant (dwell time: 30s)

Move cursor horizontally → “reading”
Passively move cursor → “scanning”
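Purely as an illustration (not the method used in this work), a crude heuristic could separate these two patterns from cursor and scroll logs: mostly horizontal cursor travel suggests “reading”, while little deliberate cursor motion with scrolling suggests “scanning”. The log format and thresholds below are assumptions.

```python
# Sketch: a crude "reading" vs. "scanning" heuristic from cursor/scroll logs.
# Reading: cursor moves mostly horizontally (following text lines).
# Scanning: little deliberate cursor motion, movement dominated by scrolling.
# Log format and thresholds are illustrative.
import numpy as np

def classify_examination(cursor_xy, scroll_px, horiz_ratio_thresh=2.0):
    """Return 'reading' or 'scanning' from cursor positions and total scroll."""
    deltas = np.diff(np.asarray(cursor_xy, dtype=float), axis=0)
    horiz = np.abs(deltas[:, 0]).sum()
    vert = np.abs(deltas[:, 1]).sum() + 1e-9
    cursor_travel = np.hypot(deltas[:, 0], deltas[:, 1]).sum()
    if horiz / vert > horiz_ratio_thresh and cursor_travel > scroll_px:
        return "reading"
    return "scanning"

reading_trace = [(50, 200), (250, 202), (450, 205), (60, 230), (260, 232)]
scanning_trace = [(300, 300), (302, 305), (303, 308)]
print(classify_examination(reading_trace, scroll_px=100))   # -> reading
print(classify_examination(scanning_trace, scroll_px=900))  # -> scanning
```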

Typical Viewing Behavior (Complex Patterns)


[Task: “number of dead pixels to replace a Mac”]

33

Relevant (dwell time: 70s) | Not Relevant (dwell time: 80s)

Actively move the cursor with pauses → “reading” dominant
Keep the cursor still and scroll → “scanning” dominant