IRALAR Presentation



Team IRALAR

Breanna Heidenburg
Michael Lenisa
Daniel Wentzel

Advisor: Dr. Malinowski


The Project
Why is it important?
The Goals
System breakdown
Image recognition
Point transformation
User Interface
The Results


What is our project?

Track a user's eye and use the information to control a computer cursor.

Enhances Human-Computer Interaction
Speed of use
Hands-free use


3-Part System

Image Processing Application
Calibration and Mapping System
GUI designed for gaze-based interaction

Systems developed concurrently and independently
Separate applications at run-time


Hardware and Image Processing Application

[System diagram: Hardware (input) and Hardware (output) connect through the Software layer, which contains the Image Processing Application (Image Processing & Cursor Control thread, UDP Server thread, Shared Process Data) and the User Interface Application (UI thread), linked by a UDP channel and an OS channel.]

Hardware (input)

Camera
QuickCam Pro for Notebooks
Visible spectrum camera

Polarizer
Tiffen 25mm polarizing filter
Removes glare from eye reflections

Lighting
Diffused LED
Slightly distracting to the user, but necessary to provide light for the camera

Hardware (output)

LitEye LE-500
High resolution (SVGA)
Color display
Translucent or opaque operation
Stationary relative to the user's eye


Software









Image Processing Application


Real-time pupil tracking system
Developed in C using the OpenCV image processing libraries
Traditional image processing and blob tracking

Capabilities:
Locate and determine the center of the pupil in the image
Low-light and high-reflection environments
All eye colors
Data logging and static test modes

Packaged into a self-contained Windows installer for easy deployment onto any computer

Recognition pipeline: query frame from camera → extract red channel → smooth image → contrast stretch → extract blobs → reject false positives → determine center of pupil blob → final recognition, with the algorithm adapting from frame to frame
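A minimal sketch of a pipeline like this, written against the legacy OpenCV C API, is shown below. The threshold value, blob-size limits, and the use of contours as a stand-in for the team's blob-tracking library and adaptation logic are illustrative assumptions, not the project's tuned values.

```c
#include <opencv/cv.h>
#include <opencv/highgui.h>
#include <stdio.h>
#include <math.h>

int main(void)
{
    CvCapture *cap = cvCaptureFromCAM(0);            /* camera source */
    CvMemStorage *storage = cvCreateMemStorage(0);
    if (!cap) return 1;

    for (;;) {
        IplImage *frame = cvQueryFrame(cap);         /* query frame from camera */
        if (!frame) break;

        CvSize sz = cvGetSize(frame);
        IplImage *red = cvCreateImage(sz, IPL_DEPTH_8U, 1);
        IplImage *bin = cvCreateImage(sz, IPL_DEPTH_8U, 1);

        cvSplit(frame, NULL, NULL, red, NULL);          /* extract red channel (frame is BGR) */
        cvSmooth(red, red, CV_GAUSSIAN, 5, 5, 0, 0);    /* smooth image */
        cvNormalize(red, red, 0, 255, CV_MINMAX, NULL); /* contrast stretch */
        cvThreshold(red, bin, 40, 255, CV_THRESH_BINARY_INV); /* dark pupil becomes white blob */

        CvSeq *contours = NULL;                          /* extract blobs (contours as stand-in) */
        cvFindContours(bin, storage, &contours, sizeof(CvContour),
                       CV_RETR_EXTERNAL, CV_CHAIN_APPROX_SIMPLE, cvPoint(0, 0));

        for (CvSeq *c = contours; c != NULL; c = c->h_next) {
            double area = fabs(cvContourArea(c, CV_WHOLE_SEQ, 0));
            if (area < 200.0 || area > 5000.0)           /* reject false positives by size */
                continue;
            CvMoments m;
            cvMoments(c, &m, 0);
            printf("pupil center: (%.1f, %.1f)\n",       /* determine center of pupil blob */
                   m.m10 / m.m00, m.m01 / m.m00);
        }

        cvReleaseImage(&red);
        cvReleaseImage(&bin);
        cvClearMemStorage(storage);
        if (cvWaitKey(10) == 27) break;                  /* Esc quits */
    }
    cvReleaseCapture(&cap);
    cvReleaseMemStorage(&storage);
    return 0;
}
```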


Summary

The Good
Dynamically adapts to changing lighting conditions and eye types
Maintains performance in low-light and specularly noisy conditions

The Bad
Still relies on Logitech camera drivers
Extreme reflections still cause problems

Examples of performance in poor conditions




[Example frames in poor conditions: Low Light, Difficult, False Positive]
Note: image brightness and contrast artificially enhanced for human visibility

Calibration and point mapping



System for mapping the location of the center of the pupil to a pixel on a computer screen

Must calibrate for each user:

Geometry
The eye is not flat, but a screen is

User Customization
All eyes are different
Everyone wears the HMD differently

User Training
The calibration system also acts as a quick tutorial

3-dimensional best-fit plane
Currently using a 4th-degree best fit

X_pix = A1 + X_eye*B1 + Y_eye*C1
Y_pix = A2 + X_eye*B2 + Y_eye*C2

The calibration sub-system determines these coefficients
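As a small illustration in C (the language of the image-processing application), the fitted coefficients are simply evaluated each frame. The function and variable names below are assumptions for illustration, not the project's code.

```c
/* Map a pupil center in camera-image coordinates to a screen pixel using the
 * first-order coefficients above: cx = {A1, B1, C1}, cy = {A2, B2, C2}. */
static void eye_to_screen(double x_eye, double y_eye,
                          const double cx[3], const double cy[3],
                          int *x_pix, int *y_pix)
{
    *x_pix = (int)(cx[0] + x_eye * cx[1] + y_eye * cx[2]);
    *y_pix = (int)(cy[0] + x_eye * cy[1] + y_eye * cy[2]);
}
```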






How do we solve the problem?

Multiple-variable linear regression (least squares)

Y = B0 + B1*x1 + … + Bk*xk

Uses matrix algebra to obtain the coefficient matrix:

B = (X^T X)^-1 X^T Y
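A sketch of determining the coefficients this way in C is shown below: it accumulates X^T X and X^T Y for the first-order mapping above (design-matrix rows [1, x_eye, y_eye]) and solves the resulting 3×3 normal equations by Gaussian elimination. Function names and the absence of error handling are illustrative assumptions.

```c
#include <math.h>

/* Solve the 3x3 system M*b = v in place with Gaussian elimination (partial pivoting). */
static void solve3(double M[3][3], double v[3], double b[3])
{
    for (int i = 0; i < 3; i++) {
        int p = i;                                   /* choose pivot row */
        for (int r = i + 1; r < 3; r++)
            if (fabs(M[r][i]) > fabs(M[p][i])) p = r;
        for (int c = 0; c < 3; c++) { double t = M[i][c]; M[i][c] = M[p][c]; M[p][c] = t; }
        double tv = v[i]; v[i] = v[p]; v[p] = tv;
        for (int r = i + 1; r < 3; r++) {            /* eliminate below the pivot */
            double f = M[r][i] / M[i][i];
            for (int c = i; c < 3; c++) M[r][c] -= f * M[i][c];
            v[r] -= f * v[i];
        }
    }
    for (int i = 2; i >= 0; i--) {                   /* back-substitution */
        b[i] = v[i];
        for (int c = i + 1; c < 3; c++) b[i] -= M[i][c] * b[c];
        b[i] /= M[i][i];
    }
}

/* Least-squares fit of pix = A + B*x_eye + C*y_eye over n calibration samples,
 * i.e. coef = (X^T X)^-1 X^T Y with design-matrix rows [1, x_eye, y_eye]. */
void fit_axis(int n, const double x_eye[], const double y_eye[],
              const double pix[], double coef[3])
{
    double XtX[3][3] = {{0}}, XtY[3] = {0};
    for (int k = 0; k < n; k++) {
        double row[3] = { 1.0, x_eye[k], y_eye[k] };
        for (int i = 0; i < 3; i++) {
            XtY[i] += row[i] * pix[k];
            for (int j = 0; j < 3; j++)
                XtX[i][j] += row[i] * row[j];
        }
    }
    solve3(XtX, XtY, coef);                          /* coef = {A, B, C} */
}
```

fit_axis would be run twice, once against the recorded X_pix targets (giving A1, B1, C1) and once against Y_pix (giving A2, B2, C2), which then feed the eye_to_screen mapping sketched earlier.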



Cursor position error: actual vs. determined position
Horizontal and vertical error in screen pixels

Error mesh
Accuracy varies with position and skill of the user
Corners of the screen are most difficult to calibrate
Focusing on a rapidly changing location requires skill



Limitations
1° accuracy of the human vision system
Eye saccades

Original error goal: 2% of screen dimension on both axes
Achieved error: 1.18% horizontal, 1.46% vertical
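For scale (assuming the SVGA resolution of the LitEye display noted earlier, 800 × 600 pixels): the 2% goal corresponds to roughly 16 pixels horizontally and 12 pixels vertically, while the achieved error corresponds to roughly 9 pixels on each axis (1.18% of 800 ≈ 9.4 px; 1.46% of 600 ≈ 8.8 px).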




How do we click?

Monitor eye movements
Identify pauses

"Dwell time": when the eye position is focused on a single area for a period of time

Currently set at 5 frames (~200 ms)
Generally, it takes about 230 ms for a hand to click a mouse.
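A minimal sketch of this dwell-time click in C is shown below. Only the 5-frame (~200 ms) dwell threshold comes from the slide; the 30-pixel fixation radius and the one-click-per-fixation latch are assumptions for illustration.

```c
#include <math.h>
#include <stdbool.h>

/* Report a click when the gaze stays inside a small radius for DWELL_FRAMES
 * consecutive frames (5 frames ~ 200 ms at the ~25 fps this implies). */
#define DWELL_FRAMES    5
#define DWELL_RADIUS_PX 30.0

bool dwell_click(int x, int y)
{
    static int  anchor_x, anchor_y;   /* where the current fixation started */
    static int  count = 0;            /* consecutive frames inside the radius */
    static bool clicked = false;      /* one click per fixation */

    double dx = x - anchor_x, dy = y - anchor_y;
    if (count == 0 || sqrt(dx * dx + dy * dy) > DWELL_RADIUS_PX) {
        /* gaze moved: start a new fixation at the current point */
        anchor_x = x;
        anchor_y = y;
        count = 1;
        clicked = false;
        return false;
    }
    count++;
    if (count >= DWELL_FRAMES && !clicked) {
        clicked = true;               /* dwell reached: issue a single click */
        return true;
    }
    return false;
}
```

Called once per processed frame with the current mapped cursor position, it returns true exactly once per fixation, after the gaze has stayed put for 5 consecutive frames.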


Improved interaction speed

Trackpad vs. Mouse vs. Gaze Tracking
53% increase over trackpad
12% increase over traditional mouse


Custom GUI Interface and Communications



Custom GUI for Gaze Tracking Applications

Why?
Gaze tracking accuracy is limited by inherent properties of the human vision system
A traditional GUI is too small and intrusive for use with a transparent HMD
Demonstrate applications of gaze tracking

Modern GUI Design
WPF using XAML layout
Windows Presentation Foundation
eXtensible Application Markup Language

XAML is similar to HTML
Uses tags and 'code-behind' in a similar style to JavaScript
GUI coded in C#


Multiple pages within the interface
Screens for functionality testing, even games
Ability to minimize the interface

Main Menu Screen

Large text and buttons
Wide spacing between options
Simple layout
Placement of features in high-accuracy areas



Why inter-process communication?
Allows the processes to communicate
Allows relay of time-sensitive information

2 communication channels:

OS channel
Uni-directional (Image Processing to User Interface)

UDP channel
Multi-directional
Separate thread in the Image Processing Application


Creates a multi-threading issue

Multi-threading
Public variable usage
Solution: critical section
Raises thread priority (thread is uninterruptible)


UDP data-request flow: receive request for data (over UDP) → raise thread priority → read variable → lower thread priority → reply to request
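A minimal sketch of this request/reply loop in C, using Winsock and a Win32 critical section together with the priority raise described above, is shown below. The port number, message format, and GazeData fields are assumptions for illustration, not taken from the project.

```c
#include <winsock2.h>
#include <windows.h>
#include <stdio.h>

typedef struct { int x_pix; int y_pix; int dwell_click; } GazeData;

static GazeData         g_gaze;       /* shared with the image-processing & cursor-control thread */
static CRITICAL_SECTION g_gaze_lock;  /* guards the shared (public) variable */

DWORD WINAPI UdpServerThread(LPVOID arg)
{
    (void)arg;
    SOCKET s = socket(AF_INET, SOCK_DGRAM, IPPROTO_UDP);
    struct sockaddr_in local = {0}, client;
    int client_len = sizeof(client);
    local.sin_family = AF_INET;
    local.sin_addr.s_addr = htonl(INADDR_ANY);
    local.sin_port = htons(5555);                           /* assumed port */
    bind(s, (struct sockaddr *)&local, sizeof(local));

    char request[64], reply[64];
    for (;;) {
        /* Receive request for data (over UDP) */
        int n = recvfrom(s, request, sizeof(request), 0,
                         (struct sockaddr *)&client, &client_len);
        if (n <= 0) continue;

        /* Raise thread priority, read the shared variable, lower priority */
        SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_TIME_CRITICAL);
        EnterCriticalSection(&g_gaze_lock);
        GazeData snapshot = g_gaze;
        LeaveCriticalSection(&g_gaze_lock);
        SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_NORMAL);

        /* Reply to the request */
        int len = sprintf(reply, "%d %d %d",
                          snapshot.x_pix, snapshot.y_pix, snapshot.dwell_click);
        sendto(s, reply, len, 0, (struct sockaddr *)&client, client_len);
    }
    return 0;
}

int main(void)
{
    WSADATA wsa;
    WSAStartup(MAKEWORD(2, 2), &wsa);
    InitializeCriticalSection(&g_gaze_lock);
    HANDLE h = CreateThread(NULL, 0, UdpServerThread, NULL, 0, NULL);
    /* ... image-processing & cursor-control thread would update g_gaze here,
     *     taking the same critical section around each write ... */
    WaitForSingleObject(h, INFINITE);
    return 0;
}
```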



Goals met and future projects

Image-processing-based eye tracking system
Correlates eye position to location on screen
Adapts to a wide variety of eye types
Ability to function as a hands-free input device
Cursor control via the eye-tracking system
Display visual output to the user
Visually controlled demo programs







Augmented reality system
Transparent heads-up display over the user's vision
Portable system for everyday use
Forward-looking camera for correlation of user real-world vision to eye position
Real-world applications
Network integration
Sound output


Windows-compatible software package
Standard Windows installer
Contains the image processing application and GUI





Installation Screen




Pupil tracking with a neural network
Implementation of a camera driver with DirectShow
Implement head tracking for gaze tracking without a head-mounted display
Augmented reality
Front-facing camera
Object/face recognition
Implement real-world applications

Any one of these is a senior project in itself!


Who helped us out?


Dr. Malinowski and the EE faculty


Mr. Mattus & Mr. Schmidt


Our test subjects