A Hands Free Human-Computer Interface Using Processed Head Motion



Sarah Brown¹, Robert Schilling², Stephanie Schuckers², Edward Sazonov²


In the modern world, computer use has become essential for many everyday tasks such as electronic communications, information gathering, and recreational activities. The standard computer interface of a mouse and keyboard requires the user to have full use of his or her hands. Unfortunately, many people lack sufficient use of their hands due to injury or illness and are thus unable to use a computer with traditional hardware [1].

Some alternative interfaces have been developed using electroencephalograms (EEGs) and eye motion; however, these require a great deal of expensive hardware, require significant processing time, and give the user only limited control [2-5]. More recently, development has focused on systems that monitor head motion either electromechanically or optically. These systems can provide faster speeds and more control, but they are often very expensive and difficult or awkward to use [6]. Additionally, many of these systems utilize mouth controls to some degree, which can be hygienically troublesome.

This poster presents a simple and effective low-cost optical system for implementing mouse operations using processed head motion. The system consists of several basic, off-the-shelf components: a webcam, a headset, a wide-angle LED, and a voice-activated switch. Images from the webcam are analyzed using a combination of MATLAB and C++ software in order to determine the position of the user's head. The LED is attached to the headset as a reference point to simplify the image processing task.
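The abstract does not detail the detection algorithm; since the headset LED is typically the brightest region in the frame, a threshold-and-centroid step is one plausible approach. The original software was MATLAB and C++; the sketch below uses Python/NumPy, and the function name and threshold value are assumptions:

```python
import numpy as np

def locate_led(frame: np.ndarray, threshold: int = 240) -> tuple[float, float]:
    """Estimate the (row, col) centre of a bright LED spot in a grayscale frame.

    Assumes the LED is by far the brightest object in view, so a simple
    intensity threshold followed by a centroid suffices.
    """
    bright = frame >= threshold        # boolean mask of near-saturated pixels
    rows, cols = np.nonzero(bright)
    if rows.size == 0:
        raise ValueError("LED not found in frame")
    return rows.mean(), cols.mean()    # centroid of the bright blob

# toy 480x640 frame with a synthetic LED spot centred at (120, 300)
frame = np.zeros((480, 640), dtype=np.uint8)
frame[118:123, 298:303] = 255
r, c = locate_led(frame)
print(round(r), round(c))  # 120 300
```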
This head position data is then transformed, using a non-linear transformation, into a corresponding screen position that is used to control the mouse pointer. Clicking operations are accomplished using the voice-activated switch.

¹ BS Candidate, Electrical Engineering, Clarkson University Honors Program, Class of 2008, Poster Presentation
² Advisor, Clarkson University ECE Department

This system was tested on a small group of subjects. Each subject first completed a short calibration procedure in order to build an accurate transformation matrix. He or she was then trained on the use of the system using a simple icon selection task. Once the subject was comfortable using the system, he or she completed a series of simple day-to-day tasks including icon selection, typing using the on-screen keyboard, and web browsing. As a reference, each subject also completed the same tasks using a traditional mouse.
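The abstract does not specify the form of the non-linear transformation built during calibration; one common choice is a least-squares fit of a low-order polynomial mapping sensed head coordinates to screen coordinates over a grid of calibration targets. A minimal sketch, assuming a quadratic basis (the function names and basis are illustrative, not the authors' actual method):

```python
import numpy as np

def design(points: np.ndarray) -> np.ndarray:
    """Quadratic design matrix [1, x, y, xy, x^2, y^2] for each point."""
    x, y = points[:, 0], points[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_transform(head_pts, screen_pts) -> np.ndarray:
    """Least-squares fit of a quadratic map from head (LED) to screen coords."""
    A = design(np.asarray(head_pts, dtype=float))
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_pts, dtype=float), rcond=None)
    return coeffs                      # 6x2 coefficient matrix

def apply_transform(coeffs: np.ndarray, head_pt) -> np.ndarray:
    """Map a single head position to a screen position."""
    return (design(np.asarray([head_pt], dtype=float)) @ coeffs)[0]

# toy calibration: the user dwells on 9 known screen targets
head_pts = [(hx, hy) for hx in (100.0, 320.0, 540.0) for hy in (80.0, 240.0, 400.0)]
screen_pts = [(3 * hx - 200, 2.5 * hy) for hx, hy in head_pts]  # synthetic ground truth

coeffs = fit_transform(head_pts, screen_pts)
sx, sy = apply_transform(coeffs, (400.0, 300.0))
print(round(sx), round(sy))  # 1000 750
```

Because the synthetic ground-truth map lies in the span of the quadratic basis, the fit recovers it exactly; in practice the residuals would reflect calibration noise.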

In the icon selection task, the subjects had an average decrease in throughput of 2.63 bits/s, or 76.6%, when using the head mouse system instead of a manual mouse. The decrease in accuracy, however, was small, averaging approximately 11%. The typing and web browsing tasks showed similar results, with speeds decreasing by 68.1% and 84.3%, respectively.
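The abstract reports throughput in bits/s without defining it; in pointing studies this is usually the Fitts' law index of difficulty divided by movement time. A minimal sketch, with hypothetical trial numbers chosen only to illustrate a drop of roughly the reported magnitude:

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance: float, width: float, movement_time_s: float) -> float:
    """Pointing throughput in bits/s: index of difficulty over movement time."""
    return index_of_difficulty(distance, width) / movement_time_s

# hypothetical trial: a 40 px icon, 600 px away  ->  ID = log2(16) = 4 bits
manual = throughput(600, 40, 1.25)      # 3.2 bits/s with a manual mouse
head_mouse = throughput(600, 40, 5.2)   # ~0.77 bits/s with the head mouse
print(round(manual - head_mouse, 2))    # decrease of ~2.43 bits/s (~76%)
```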

Future work on this system includes the integration of speech recognition using Microsoft's Speech Software Development Kit. More advanced speech recognition should allow for additional clicking operations such as right-clicking, dragging and dropping, and double-clicking. Additionally, increased typing speed could be achieved using speech-based text entry.

Works Cited


[1] S. Trewin and H. Pain, "A model of keyboard configuration requirements," Behav. Inform. Technol., vol. 18, no. 1, pp. 27-35, 1999.

[2] G. A. Rinard, R. W. Matteson, R. W. Quine, and R. S. Tegtmeyer, "An infrared system for determining ocular position," ISA Trans., vol. 19, no. 4, pp. 3-6, 1980.

[3] N. Gravil, P. A. Griffiths, R. Potter, and A. Yates, "Eye control of microcomputer," Comput. Bull. Serial, vol. 3, pp. 15-16, 1985.

[4] J. R. Lacourse and F. C. Hladik, Jr., "An eye movement communication-control system for the disabled," IEEE Eng. Med. Biol., vol. 37, pp. 1215-1220, Dec. 1990.

[5] Z. A. Keirn and J. I. Aunon, "Man-machine communications through brain wave processing," IEEE Eng. Med. Biol., vol. 37, pp. 1215-1220, 1990.

[6] D. G. Evans, R. Drew, and P. Blenkhorn, "Controlling mouse pointer position using an infrared head-operated joystick," IEEE Trans. Rehab. Eng., vol. 8, no. 1, pp. 107-117, 2000.