
CHAPTER 1


INTRODUCTION


1.1 Background

The evolution of the user interface is progressing rapidly today [1]. Since the beginning of the computer era, the user interface has been one of the most important components to develop, because it is the part that interacts directly with us. For decades, the keyboard and mouse have been the primary input devices for desktop and laptop computers at home, in the office, and in public areas. However, after the reinvention of the mobile phone in 2007, started by Apple with the iPhone [2] and followed by Android devices the following year [3], we were introduced to the input method we are now familiar with: touch screen technology [4]. It requires much less user intervention and effort to communicate with the machine, yet offers a better user experience because navigation feels more natural to the user. The technology is used heavily in today's mobile devices, from smartphones and tablets to even laptop computers.

Touch input becomes troublesome, however, when both of our hands are busy, and it still requires close interaction between the user and the machine [5]. The problem exists when the user does not have, or does not want to use, what the touch screen input method requires to control the user interface: hands, arms, and fingers [6]. With a large number of handicapped people who are still able to use their eyes to communicate, it has been our concern to help them by utilizing technology. There is also a potential need from people who simply want or need an easier method to control their devices. This leads to an alternative user interface input method, which is the purpose of this thesis [7].




Based on the problem explained above, this thesis offers a solution: an eye-tracking based user interface control application that runs on the Android mobile platform. Android is preferred over the other mobile platforms (Apple iOS, BlackBerry OS, Windows Mobile, and Symbian) for two main reasons. First, at the time of writing, eye-tracking algorithms, source code, and libraries run natively and reliably on it [8]. Second, today's Android devices, starting from the Android 4.0 Ice Cream Sandwich version, generally come with a front-facing camera, which supports the face detection feature used to unlock the phone [9]. This makes Android a natural platform on which to implement the eye-tracking feature.
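The front-facing camera is what makes on-device eye detection feasible. As an illustration only, the sketch below uses Android's built-in android.media.FaceDetector class to estimate the eye region in a single camera frame; the prototype described in this thesis may instead rely on a dedicated eye-tracking library and a customized algorithm, and the class and method names here are hypothetical.

    import android.graphics.Bitmap;
    import android.graphics.PointF;
    import android.media.FaceDetector;

    public class EyeRegionEstimator {

        /** Returns the approximate midpoint between the eyes, or null if no face is found. */
        public static PointF findEyeMidpoint(Bitmap frame) {
            // FaceDetector only accepts RGB_565 bitmaps (and the width must be even).
            Bitmap rgb565 = frame.copy(Bitmap.Config.RGB_565, false);

            FaceDetector detector =
                    new FaceDetector(rgb565.getWidth(), rgb565.getHeight(), 1);
            FaceDetector.Face[] faces = new FaceDetector.Face[1];

            if (detector.findFaces(rgb565, faces) == 0 || faces[0] == null) {
                return null;
            }

            PointF midpoint = new PointF();
            faces[0].getMidPoint(midpoint);  // point halfway between the two eyes
            // faces[0].eyesDistance() gives the inter-eye distance, which could be
            // used to size per-eye search windows for iris and pupil estimation.
            return midpoint;
        }
    }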

One use of the application is as an alternative input method for controlling the Android device. Using only the eyes, there is no longer any need to touch the screen to control the device, making almost everything on the mobile device more practical, efficient, and faster to do. It also lets the user actually see what they are pointing at before executing any action on it; with a touch screen, users have difficulty seeing what they actually touch because their fingers block that part of the screen. In other words, the application reduces the requirements for controlling a device from eyes and fingers to eyes alone: instead of two organs acting as separate sensor and actuator, a single organ, the eyes, acts as both.

Another use of the application is to help people suffering from a stroke to control their devices, with the help of a partner. Stroke patients usually lose function in half of their body yet are still able to use their eyes, listen to sounds, and understand when other people communicate with them. With the eye-tracking control application, these patients can communicate with other people much more easily and quickly. The built-in Android on-screen keyboard allows the user to write sentences quickly, and the default Android interface is user-friendly and easily controlled by a pointer whose position is based on the eye-tracking coordinates. Ultimately, this input method could be used as a substitute for the touch screen method.

To sum up, this thesis focuses on developing an Android application that controls navigation in the user interface by using an eye-tracking algorithm to position the x and y coordinates of a pointer. The application is useful for people who cannot, or would rather not, use touch screen technology to control the user interface of an Android mobile device.
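As a rough illustration of how gaze measurements could drive that pointer, the hypothetical helper below linearly maps a normalized gaze position, bounded by values recorded while the user looks at the screen edges during calibration, to screen pixel coordinates. The linear mapping and all names are assumptions for illustration; they are not the customized algorithm developed in this thesis.

    /**
     * Hypothetical helper: maps a normalized eye-gaze position (0..1 in camera
     * space) to screen pixel coordinates using calibration bounds.
     */
    public class GazeToPointerMapper {

        // Gaze values observed while the user looked at the screen edges during calibration.
        private final float leftX, rightX, topY, bottomY;
        private final int screenWidth, screenHeight;

        public GazeToPointerMapper(float leftX, float rightX, float topY, float bottomY,
                                   int screenWidth, int screenHeight) {
            this.leftX = leftX;
            this.rightX = rightX;
            this.topY = topY;
            this.bottomY = bottomY;
            this.screenWidth = screenWidth;
            this.screenHeight = screenHeight;
        }

        /** Linearly interpolates the current gaze sample into screen pixels. */
        public int[] toScreen(float gazeX, float gazeY) {
            // A real implementation would also compensate for the mirroring of the
            // front-facing camera image and smooth the samples over time.
            float nx = (gazeX - leftX) / (rightX - leftX);
            float ny = (gazeY - topY) / (bottomY - topY);
            int px = clamp(Math.round(nx * screenWidth), 0, screenWidth - 1);
            int py = clamp(Math.round(ny * screenHeight), 0, screenHeight - 1);
            return new int[] { px, py };
        }

        private static int clamp(int v, int lo, int hi) {
            return Math.max(lo, Math.min(hi, v));
        }
    }

For example, with a 1280 x 720 screen and calibration bounds of 0.35 to 0.65 horizontally and 0.40 to 0.60 vertically, a gaze sample of (0.50, 0.50) would map to approximately the center of the screen.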


1.2 Scope

This research focuses on developing a working Android application prototype that determines the pointer's x and y coordinates in the Android device's user interface, allowing the user to navigate the application environment by tracking the eye-gaze position with a customized algorithm.

The prototype should provide the following functionality:



- Track the eye area on the face, each separate eye, and the estimated iris and pupil positions using the front-facing camera
- Track the eye-gaze-based screen position
- Calibrate the eye-gaze center position
- Show a "mouse" pointer as the controlling navigator
- Move the pointer based on the eye-gaze tracking position
- Imitate a touch or press event at the pointer's screen coordinates using blink detection (see the sketch after this list)
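To illustrate the last item, the sketch below shows one way a detected blink could be turned into a tap, assuming the event is dispatched to a view inside the prototype's own user interface (injecting touches into other applications requires system-level permissions). The class name and approach are assumptions for illustration, not the thesis's actual implementation.

    import android.os.SystemClock;
    import android.view.MotionEvent;
    import android.view.View;

    public final class BlinkTapDispatcher {

        /** Dispatches a DOWN/UP pair to the target view at the pointer position (x, y). */
        public static void tap(View target, float x, float y) {
            long downTime = SystemClock.uptimeMillis();

            MotionEvent down = MotionEvent.obtain(
                    downTime, downTime, MotionEvent.ACTION_DOWN, x, y, 0);
            MotionEvent up = MotionEvent.obtain(
                    downTime, SystemClock.uptimeMillis(), MotionEvent.ACTION_UP, x, y, 0);

            target.dispatchTouchEvent(down);
            target.dispatchTouchEvent(up);

            down.recycle();
            up.recycle();
        }
    }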




1.3 Aims and Benefits

The aim of this research project is to give people an alternative to the current input method through an eye-tracking based user interface control application running on today's Android mobile platform. It is also the vision and goal of this project to develop an easier input method for controlling a user interface, providing a better user experience with less human intervention yet more interaction for everyone.

The benefits of implementing the project are that communication between people becomes easier and that interaction between humans and computers becomes simpler. Input devices can also become more compact, because only a small camera lens is needed to capture input from the user. Electronic devices can therefore be more compact and less expensive, since dedicated physical input devices are no longer needed; anything can be controlled via virtual input on the screen.






1.4 Structures

Chapter 1

This chapter explains the background, scope, aims, and benefits of creating an Android eye-tracking based user interface control application.

Chapter 2

This chapter explains the theoretical foundation. It includes a description of input devices and input methods; eye tracking and gaze tracking, together with related research and trends; the Android mobile platform and devices; and other technologies used to create the application. It also describes the basics of the user interface and its components.

Chapter 3

This chapter compares current existing input devices with the eye-tracking based input device. It also includes an analysis of the problem and the details of what the solution will accomplish.

Chapter 4

This chapter explains the application design and the algorithm created, presented through UML diagrams and other diagrams used for design decisions.

Chapter 5

This chapter describes the minimum hardware and software requirements, the operational procedure, and the test plans and results, ranging from module testing to user testing.

Chapter 6

This chapter describes the author's experience and the obstacles encountered while creating the application, the development of a better, more efficient, effective, and unique algorithm for the eye-tracking feature, observations and analysis of users' preferences towards the application, and the Android limitations found throughout the development of the application.

Chapter 7

This chapter presents the conclusion, covering both strengths and weaknesses, and possible future improvements to the thesis.