Eye-Controlled Computer Interface Using a Webcam and Infrared Sensors



ECE 445 Senior Design Project Proposal





Justin Williams

Ryan Sellers




TA: Wladimir Benalcazar








Introduction


Computers have become an integral part of life and are essential for day-to-day operations.
There have been many improvements in computing, and the power and functionality of
computers are constantly reaching new heights. Computers save our data, help us communicate
across the globe, and entertain us through video and gaming, and new capabilities are emerging
every day. However, there has not been much change in the way we interact with computers
since wireless keyboards and mice. We want to make interfacing with the computer more
intuitive and much simpler by allowing the user to move the cursor by simply looking anywhere
on the screen. A computer is a device intended for anyone to use, and so should be operable by
everyone. Disabled users who cannot operate a mouse should have the opportunity to use a
computer, and this system will give them that ability.


The idea of an “eye mouse” is not brand new; some research is being done in this area of
human-computer interaction. The majority of these devices, with a few exceptions, are still in
development, and none is the same as our proposed system. The current implementations
involve electro-oculograms or uncomfortable headsets in order to work properly. Our system
involves a single high-definition webcam, an inexpensive IR sensor, a data acquisition card, and
a computer. The user does not need to clip a sensing device onto his visor or wear bio-sensors
while using our system. We will use LabVIEW and our own C++ code to process the
information from both the sensor and the camera to move the cursor in real time. Most of the
current eye-mouse systems require calibration and a special setup procedure to function
correctly, whereas our system simply involves plugging in the camera and clipping it, along
with the IR sensor, to the monitor. We hope to develop this system as inexpensively as possible
to allow a commercially marketable product. A large component of the expense of this system
is the high-definition webcam; however, the webcam can still be used as a general-purpose
webcam because our system does not modify the camera in any way.


Objectives

Benefits:

- Allows disabled users to interface with a computer
- Provides more intuitive human-computer interaction
- Creates a cost-effective, commercially available product
- Increases interaction in gaming

Features:

- Webcam used to track eye movement
- IR sensor that determines user distance from the monitor
- Real-time analysis of eye movement
- Easy setup with no calibration or headgear
- Inexpensive


Design

Block Diagram:


Block Descriptions:

Camera:

We will use a Logitech Webcam C500 that will clip onto the top of the monitor, set at a
specific angle to track the user’s eye movement. This camera records HD video at up to
1280x1024 pixels, which is essential for recording the iris movements with high precision and
little internal noise. The camera also records at 30 frames per second, which is enough to capture
eye movement. The webcam will connect to the computer via USB, and the video will be
imported into LabVIEW for analysis.


Computer:

The computer must have a USB connection and at least a 2 GHz processor to be
compatible with the webcam. The computer will capture the video from the webcam and use a
combination of code blocks to analyze and process the information from it. The computer will
also analyze data from the NI data acquisition card connected to the IR sensor via PCI. The
three code blocks will comprise LabVIEW, C++, and OpenCV/OpenGL. LabVIEW will be the
control center of the processing while outsourcing the intensive processing to the C++ code,
optimizing the system to operate in real time. The C++ code will use existing OpenCV image-
and video-processing functions along with our own code to do the bulk of the processing. The
C++ code block will also use OpenGL libraries to display a 3D set of eyes to show that the
system is working properly.


Data Acquisition:

The NI data acquisition card will connect to the computer through PCI or PCI Express.
It will take the analog voltage input from the IR sensor and convert it to digital information for
the computer to analyze. This information will be sent to LabVIEW to be converted into a
distance measurement.


Infrared Sensor:

The Sharp GP2D12 IR sensor will be positioned near the camera, facing the user. The
sensor can measure distances from 4” to 30” with high precision and will be aimed at the
user's forehead for a fairly accurate measurement of how far the user is from the monitor.


Performance Requirements

IR Sensor:

The IR sensor requires 4.5 to 5.5 V to run properly.

Camera:

To achieve real-time performance, the image must be sampled at at least 20 FPS to keep
up with eye motion. To be accurate in its measurements of the user's eye, the camera must have
a high resolution.

DAQ:

Must have at least one analog input available.

Computer:

At least a 2 GHz processor, a USB port, and a PCI Express slot.

Our system will work in reasonable lighting conditions, aided by the webcam's built-in
RightLight technology. The system will work over a normal viewing distance of 10-40”. We
want the system to work to the point where all users can agree that the position of the cursor is
indeed where they are looking.


Verification

Testing Procedures

The success of our system will be judged mainly by the user’s acceptance that it is working
properly. The system provides qualitative rather than quantitative results, so the system will be
considered working properly if users agree that the cursor’s location is indeed where they are
looking. However, we can set up quantitative tests using a grid to record actual data values.
These quantitative results, while important, need not be very accurate.




In testing our system, we hope to focus the tests on four main areas:

1. Accuracy of the mouse cursor relative to where the user looks.

- Before implementing mouse functionality, we will test by looking at a grid on a wall
to make sure we are accurately measuring the angle of the eyes.
- We will begin with the camera looking out perpendicular to the wall and try the head
at various distances and positions.

2. Filtering out jitters caused by any motion of the head.

- We will sample at a high enough frequency that we have several samples to look at
before picking the average measured location for the motion. We will have to find a
good balance here between accuracy and response time.

3. Before incorporating the mouse, use OpenGL to show where the user is looking on the
screen.

4. Test the system with just the camera and the screen at various light levels.

- If it becomes too difficult to locate the eyes at lower light levels, we would consider
adding a light source on the camera that shines at a wavelength invisible to human
eyes but visible to the camera.

Tolerance Analysis


The tolerance of the system is going to be the most challenging aspect of this project.
There are several ways that the system could malfunction. Some areas of concern that need to be
addressed are: users wearing glasses and glare, the natural jitter of the eye even when focused on
a certain point, poor lighting conditions, and head movements.


The human visual system’s spatial frequency response can be modeled as a low-pass
filter that cannot resolve much above 20 cycles/degree. A person sitting at a distance of twice the
monitor height can only distinguish about 1120x1120 pixels. Most higher-quality widescreen
monitors are now around 1920x1080 or 1440x900. These “excess” pixels that the eye cannot
resolve allow some room for error in placing the cursor’s position.

The spec for our infrared sensor states that it works optimally from 10-80 cm, which is
appropriate because, assuming a 19-22” monitor, the desired viewing distance is 10” to around
30”. At this range (with a horizontal field of view of 72 degrees), the user should be able to
move around 10” to either side of center at a distance of 20”. At the extreme close end (10”), the
user could only move around 4” in either direction.


Our system should work for any user of any skin tone in reasonable lighting conditions.
Using the camera under little light will be aided by the light from the monitor and the
RightLight technology of the camera.






Cost and Schedule

Labor

                     Salary    Hours Worked       Total
Ryan Sellers         $45/hr    200 hours x 2.5    $22,500.00
Justin Williams      $45/hr    200 hours x 2.5    $22,500.00
                                         TOTAL:   $45,000.00


Parts

Part Name                    Cost
Logitech Webcam C500         $50.00
Sharp GP2D12 IR Sensor       $12.50
NI DAQ*                      $150.00
Additional IR Circuitry*     $10.00
TOTAL:                       $222.50

*Price listed is an approximation









Schedule:

2/15: Request parts and further research facial detection algorithms.
      Begin writing code with test video.
      Sign up for and begin writing the design review report.
      Justin - Research the best algorithms for facial recognition.
      Ryan - Write code to run on test video.

2/22: Design review.
      Ryan - Begin implementing the camera with eye recognition code.
      Justin - Set up the IR sensor and data acquisition card and test accurate
      distance measurements in LabVIEW.

3/1:  More coding to continue improving accuracy and speed.
      Justin - Have the IR sensor working properly.
      Ryan - Write a program in OpenGL to test accuracy.

3/8:  Implement the IR sensor with the camera system and test accuracy under
      controlled conditions.

3/15: Prepare for the mock-up demo. Continue fixing bugs.

3/29: Mock-up demo. Fix any bugs at the demo. Continue improving accuracy
      and speed. Work on increasing lighting and head-location tolerance.

4/5:  Test with a sampling of users. Add extra features if time permits.

4/19: Continue improving accuracy and tolerance.

4/26: Prepare for the final demo and presentation.

5/3:  Write the final paper.