Kinect Assisted Natural Gesture Approach to Robotic Operation


Kinect Assisted Natural Gesture Approach to Robotic Operation

KANGARoO

A Senior Project

Presented to
the Faculty of the Computer Engineering Department
College of Engineering
and
the Faculty of the Mathematics Department
College of Mathematics and Science
California Polytechnic State University, San Luis Obispo

In Partial Fulfillment
of the Requirements for the Degrees
Bachelor of Science, Computer Engineering
and
Bachelor of Science, Mathematics

by
Alex Gronbach
Anna Nickelson
June, 2012

Project Advisor: Franz Kurfess


Table of Contents

I       Abstract
II      Introduction
II.I      Motivation
II.II     Design Considerations
II.III    Overview
III     Design Phases
III.I     Phase One
III.II    Phase Two
IV      Texas Instruments Analog Design Contest
IV.I      SN754410: Quadruple Half-H Driver
IV.II     UA7805: 3 Pin 1.5A Fixed 5V Positive Voltage Regulator
IV.III    UCC283-5: Single Output LDO, 3.0A, Fixed (5.0V), Reverse Current Protection, Thermal Shutdown
V       Additional Components
V.I       Microsoft Kinect
V.II      Flexible Action and Articulated Skeleton Toolkit (FAAST)
V.III     Xbee Explorer Dongle
V.IV      Xbee 1mW Wire Antenna - Series 1
V.V       Xbee Shield Module for Arduino Uno
V.VI      Arduino Uno, ATMega328P
V.VII     Solarbotics GM3: 224:1 Gear Motor 90° Output
V.VIII    Solarbotics GMPW-Y Yellow Wheels
V.IX      Vex Claw Kit
V.X       MQ-3: Alcohol Sensor
V.XI      AIC250W: Airlink 250W Wireless Camera
VI      Conclusions
VII     Appendix
VII.I     Distribution of Tasks
VII.II    Lessons Learned
VII.III   Special Thanks
VII.IV    Source Code
VII.V     Trac Wiki



Table of Figures

Figure 1: Early Base Station, Kinect and X-CTU shown
Figure 2: Excerpt from Poster detailing KANGARoO's Gesture Controls
Figure 3: Early Proof of Concept Version of Robotic System, circa Phase One
Figure 4: Excerpt from Poster detailing individual components of KANGARoO





I  Abstract

KANGARoO is a lightweight and user-friendly prototype design that demonstrates how various standalone systems can be integrated to create a solution that is greater than the sum of its parts. This design showcases a unique Kinect assisted natural gesture-based approach to robotic operation. The system is portable, has a nearly unlimited range, and enables intuitive and precise control of a robotic system. The robot currently controlled by the KANGARoO system has basic movement capabilities, a fixed claw for grabbing objects, and basic sensing equipment. The KANGARoO system can be expanded to control any robotic system that utilizes a standard remote control interface.





II  Introduction

The following report outlines the six-month process involved in researching, designing, developing, prototyping and testing the KANGARoO.

II.I  Motivation

In Fall Quarter of 2011, there was much deliberation and research regarding a proper senior project. Employing the still-new technology of the Kinect sparked much interest; however, it was unclear how best to utilize this in an inventive manner. Originally the idea was to use multiple Kinects to perform kinesthetic measurements of the human body. Unfortunately, it was soon determined that the Kinect was not accurate enough to perform such tasks.

After reevaluation, the next idea was to control a video game using Kinect-based gestures — one that was not originally designed to be controlled in this fashion. The first steps toward this goal included connecting the Kinect to a computer, discovering and installing the Flexible Action and Articulated Skeleton Toolkit (FAAST) created by the University of Southern California [1], and installing the appropriate Kinect drivers.

This goal proved far too easily attainable; a control scheme for the video game was fully functional within three hours. Having completed what was expected to be an entire senior project (and having spent some time "testing" the new control scheme in Electronic Arts' Burnout: Paradise), it was back to the drawing board.

Finally, the idea was conceived to create a robot that could be controlled via the Kinect, akin to the underwater submersible robot at the beginning of the movie Titanic. A rough game plan involving Texas Instruments' Analog Integrated Circuits (ICs), the FAAST toolkit, and a microcontroller unit (MCU) was developed and set into action. The goal was simple: develop a robot that could be remotely and wirelessly controlled in a responsive and precise manner using nothing but the Kinect and the human body as a control interface.

II.II  Design Considerations

The control scheme was initially developed while toying with the controls for the video game. The car's movements in Burnout: Paradise were primarily controlled using hand gestures and leaning of the body. Hand gestures were an intuitive control interface for the car's movements and were therefore kept for the robot's control scheme. However, leaning the body to start and stop the robot proved unwieldy and was later replaced by other hand gestures.

As demonstrated by the control scheme for the game, the FAAST toolkit proved to be a reliable way of translating kinesthetic gestures into keystrokes. The challenge from there was how to wirelessly transmit those keystrokes in a similarly reliable and efficient manner to the MCU.

The last task was to develop a lightweight and user-friendly robotic unit to be utilized in the field of intelligence gathering. The ability to send a robotic scout into a potentially hazardous or unknown environment proved to be a practical application of the KANGARoO. The goal of this project was to provide proof of this concept. However, the limited time frame and budget of a senior project do not allow for a commercial grade robot to be developed. With this in mind, each component had to be carefully chosen to provide as much information as possible while minimizing cost and weight. On the other hand, many components throughout the design of this system are modular and can be redesigned or replaced. This allows for a more versatile and robust final product to be developed.

A perfect example of the modular design is the alcohol sensor on the robot. It is meant to be a placeholder for more complicated gas sensors, such as methane or carbon dioxide sensors, which are difficult to calibrate on a student budget.

[1] Suma, Evan A., Belinda Lange, Skip Rizzo, David Krum, and Mark Bolas. "Flexible Action and Articulated Skeleton Toolkit (FAAST)." University of Southern California, 30 Dec. 2010. Web. 16 Jan. 2012. <http://projects.ict.usc.edu/mxr/faast/>.

II.III

Overview

The
KANGARoO

consists of two major components.
First
is the
base station
, to which the Mi
crosoft
Kinect and an Xbee radio are connected as shown in Figure 1
below (Xbee not visible).
This component
is responsible for converting kinematic gestures to control signals, and communicating these signals via
the Xbee radio to the robot. It is also
responsible for listening to any feedback the robot has for the
user, such as a readout from the gas sensor. This information is displayed in a terminal window that the
user can monitor
; the user can

see the status of commands being sent
and received by
t
he robot
.
Figure
1
: Early Base Station, Kinect and X
-
CTU shown



The second aspect of the project is the robotic system itself. The humble plywood box contains an Arduino Uno with an attached Xbee radio, as well as two Texas Instruments Quadruple Half-H Drivers and two Texas Instruments Voltage Regulators. The robot also contains a robotic claw, alcohol sensor, and wireless security camera, as well as DC motors and wheels for movement. This robotic system receives all of its control signals from the user's gestures, converted via the Kinect and transmitted through the Xbee radios.

Figure 2: Excerpt from Poster detailing KANGARoO's Gesture Controls

III  Design Phases

The project was split into two major design phases, one to be completed by the end of each of the two quarters that this project spanned. The goal was to have phase one completed by the end of Winter Quarter 2012 and to have phase two completed by the end of Spring Quarter 2012.

III.I  Phase One

This initial version of the robot consisted solely of movement and communication architecture. This was meant to be the proof of concept stage in the development, during which all control and communication systems were functional. At this point in development, the Kinect was able to recognize a user and detect specific kinematic gestures. FAAST then converted these kinematic gestures into characters, which were continuously and wirelessly communicated via the Xbee radios to the MCU. The MCU used a lookup table to output different movements (forward, reverse, left and right turns) based on the input received. This was done utilizing the Texas Instruments Quadruple Half-H Driver to enable motor control and the Texas Instruments Voltage Regulator to govern the voltage levels going to the motors and the MCU. Figure 3 shows the bare-bones version of the robot as it was nearing completion of phase one (note the Texas Instruments Voltage Regulators had not yet been implemented at the time this photo was taken).

Figure 3: Early Proof of Concept Version of Robotic System, circa Phase One
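The character-to-movement dispatch described above can be sketched as a small lookup table. This is only an illustrative mirror of the logic (the actual Arduino sketch appears in Appendix VII.IV); the motion names here are invented for clarity, and the pin-level H-driver writes are deliberately omitted:

```cpp
#include <cassert>
#include <map>

// Hypothetical mirror of the MCU's dispatch: each character received
// from FAAST over the XBee serial link selects a motion. Names are
// illustrative only; the real sketch drives SN754410 pins directly.
enum Motion { STOP, FORWARD, REVERSE, TURN_LEFT, TURN_RIGHT, CLAW_TOGGLE, READ_GAS };

Motion lookupMotion(char c) {
    static const std::map<char, Motion> table = {
        {'w', FORWARD},     {'s', REVERSE},
        {'a', TURN_LEFT},   {'d', TURN_RIGHT},
        {'q', CLAW_TOGGLE}, {'e', CLAW_TOGGLE},
        {'f', READ_GAS},
    };
    auto it = table.find(c);
    return it == table.end() ? STOP : it->second;  // any other input halts the robot
}
```

Treating unrecognized characters as "stop" mirrors the sketch's default case, which is a sensible fail-safe for a remotely operated vehicle.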

III.II  Phase Two

Once phase one was complete, the focus of the project shifted to expanding the system to enable practical and useful applications for this technology. While many use cases are applicable to this design, the project's focus was on exploring hazardous or otherwise unknown environments that may or may not be safe for humans to enter. This was facilitated by the addition of various components to the robotic system, including a front-facing wireless camera, a gas sensor, and a front-facing claw. By the end of this design phase, the robot was a fully functional prototype capable of being controlled wirelessly and remotely with great speed and precision.

Figure 4: Excerpt from Poster detailing individual components of KANGARoO

IV  Texas Instruments Analog Design Contest

Entering the Texas Instruments (TI) Analog Design Contest had a very positive impact on the completion of this project. TI's Analog Design Contest inspired the design of the robot to be expanded further than originally envisioned, enabling a more robust feature set and increased functionality. Along with the knowledgeable support of the TI community, TI's Analog ICs greatly improved the development cycle of the KANGARoO.

IV.I  SN754410: Quadruple Half-H Driver

A dependable and stable driver was an integral part of the operation of the robot. Without this component, enabling separate power sources (and voltage levels) for both the MCU and the DC motors would have been a much more difficult task. The ability to isolate the VCC levels for the motor power supply and the logic levels of the MCU proved invaluable for the execution of this project.

IV.II  UA7805: 3 Pin 1.5A Fixed 5V Positive Voltage Regulator

A resilient voltage regulator was essential to the functionality of the robotic system. This part ensured that no matter what voltage was fed into the IC, the output would remain a fixed 5V. This dependability ensured that voltage was never a concern in the design process. Paired with this chip's internal current-limiting and thermal-shutdown features, the UA7805 was an invaluable piece of insurance for the operation of the robotic system.
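One practical caveat with a linear regulator like this is worth noting: the voltage it drops is dissipated as heat, roughly (Vin - Vout) x Iload. A minimal sketch of that arithmetic follows, using assumed example figures (a 9 V supply and a 0.5 A load, not measured values from this project):

```cpp
#include <cassert>

// Linear-regulator dissipation: the voltage dropped across the part
// times the load current is lost as heat. The 9 V / 0.5 A figures in
// the test are assumed examples, not KANGARoO measurements.
double regulatorPowerW(double vinV, double voutV, double iloadA) {
    return (vinV - voutV) * iloadA;
}
```

At those example figures the regulator would shed 2 W, which is why input voltage and heat-sinking deserve a quick check whenever the motor supply is chosen.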

IV.III  UCC283-5: Single Output LDO, 3.0A, Fixed (5.0V), Reverse Current Protection, Thermal Shutdown

The UCC283-5 was an imperative component in the overall circuit design for our project. After integrating all of the parts in our system, it became clear that the camera was quite current-hungry. Using the UA7805 (which is limited to 1.5A) would not allow the camera to even turn on, never mind function. The UCC283-5 saved the day! By outputting a fixed 5V like the UA7805, it was still able to power the camera; however, with its higher current limit of 3A instead of 1.5A, the UCC283-5 solved the power management issues that were encountered when the camera was integrated into the KANGARoO.

V  Additional Components

In addition to the brilliant analog integrated circuits that Texas Instruments could provide for our project, the decision was made to use the following complementary components:

V.I  Microsoft Kinect

By utilizing the Kinect, the need for a physical controller is removed from this project's specifications. Instead, the user simply moves their body to control the robot. The Kinect is able to detect the user's skeleton and the positions of their joints in real time. When paired with the FAAST toolkit described below, kinematic gestures were converted into keystrokes that acted as control signals for the robot.

V.II  Flexible Action and Articulated Skeleton Toolkit (FAAST)

As stated on the project's website:

    FAAST is middleware to facilitate integration of full-body control with games and VR applications using either OpenNI or the Microsoft Kinect for Windows skeleton tracking software. The toolkit can emulate keyboard input triggered by body posture and specific gestures. This allows the user to add custom body-based control mechanisms to existing off-the-shelf games that do not provide official support for depth sensors.
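FAAST of this era expressed such gesture-to-keystroke mappings as simple text rules pairing a body posture and a threshold with an emulated key event. The lines below are purely illustrative of that flavor of configuration — the gesture names, thresholds, and exact rule syntax varied between FAAST releases, and these are an assumption rather than the project's actual bindings:

```
lean_forwards 15 key_hold w
lean_backwards 15 key_hold s
left_arm_out 10 key_hold a
right_arm_out 10 key_hold d
```

Each rule fires while the posture exceeds its threshold, so holding a lean holds the key down, which maps naturally onto the continuous drive commands the robot expects.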

V.III  Xbee Explorer Dongle

A breakout board is used to facilitate serial communication between the Arduino Uno and a portable computer. This device creates a virtual COM port when connected to a PC, which can be opened via X-CTU to communicate directly from a terminal to the Xbee radio.
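Opening that virtual COM port from a terminal only requires serial settings that match the Arduino sketch in Appendix VII.IV. A plausible configuration is sketched below — the port name is machine-specific, and the settings other than the baud rate are assumed common defaults rather than values documented by the project:

```
Port:      COM3   (machine-specific; appears when the dongle is plugged in)
Baud:      9600   (matches Serial.begin(9600) in the Arduino sketch)
Data bits: 8
Parity:    None
Stop bits: 1
```

With these settings, typing w/a/s/d/q/e/f in the terminal drives the robot directly, which is handy for testing the radio link without the Kinect in the loop.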

V.IV  Xbee 1mW Wire Antenna - Series 1

Two of these radios were used in the project: one attached via the Explorer Dongle (described above) to a PC and the other connected via the Shield Module (described below) to the Arduino Uno. The two radios communicate wirelessly via serial to transmit data between the robot and the PC.

V.V  Xbee Shield Module for Arduino Uno

A breakout board is used to integrate the Xbee radio with the Arduino Uno. This device fits snugly on top of the Arduino Uno, which preserves the GPIO pins that it does not use. It also allows the Xbee radio to fit securely on top of the shield and be flawlessly integrated with the Arduino Uno. This device made communication between the Xbee and the Arduino Uno a non-issue.

V.VI  Arduino Uno, ATMega328P

Featuring 14 GPIO pins and 6 Analog Input pins, the Arduino Uno was a strong choice of MCU for our project. When paired with the Xbee radios, wireless serial communication between the PC and the robot was a breeze. As well, the simplicity of the Arduino programming environment contributed to a relatively painless coding process.

V.VII  Solarbotics GM3: 224:1 Gear Motor 90° Output

This was a cheap, low-power, and effective motor. The 90° output allowed for a simple and painless mounting system thanks to two through-holes in the plastic chassis of the motor.

V.VIII  Solarbotics GMPW-Y Yellow Wheels

These wheels pair perfectly with the above motors; likewise, they were inexpensive and reliable. They got the job done and didn't break the bank in the process.



V.IX  Vex Claw Kit

Using a claw kit meant that we could spend less time designing a mechanical claw and more time integrating and polishing other aspects of the robotic system. The claw came mostly pre-assembled. Interfacing with it was no more complicated than interfacing with one of the robot's gear motors, as the 2-wire motor that the claw kit comes with acts identically to a DC motor.

V.X  MQ-3: Alcohol Sensor

This sensor detects the presence or absence of controlled substances with a hydroxyl functional group (-OH) bound to a carbon atom. The idea behind using this sensor was not to detect the presence or absence of alcohol; it was more a placeholder for any other type of flammable, hazardous, or otherwise dangerous gas sensor. This particular gas sensor was chosen because calibration is easy: rubbing alcohol is in abundant supply, whereas methane, carbon dioxide, propane, etc. are not as easily obtainable. However, the setup and calibration processes would be similar enough that, given access to those materials for calibration, such sensors could be integrated into the existing robotic system without difficulty.
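For reference, the raw value the MCU reports for this sensor (the 'f' command in Appendix VII.IV) is a 10-bit analogRead() count. Converting that count back to a voltage is a one-liner, assuming the Uno's default 5 V analog reference:

```cpp
#include <cassert>

// Convert a 10-bit Arduino ADC reading (0..1023) to volts, assuming
// the Uno's default 5 V analog reference. Interpreting that voltage
// as a gas concentration would require the per-sensor calibration
// discussed above.
double adcToVolts(int raw) {
    return raw * 5.0 / 1023.0;
}
```

A reading of 0 corresponds to 0 V and a full-scale 1023 to 5 V; the calibration step maps points in between to concentrations for whichever gas the sensor targets.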

V.XI  AIC250W: Airlink 250W Wireless Camera

When set up over a wireless local area network, this camera is able to transmit a video feed wirelessly back to a computer. The viewer can watch the feed in any browser by viewing the camera's IP address on the local area network. This enables the operator to "see" from the base station what the camera is looking at, regardless of the separation that exists between the base station and the robot.

VI  Conclusions

The beauty of the KANGARoO system stems from a blending of functionality, precision, reliability, and adaptability. Given the modular nature of the KANGARoO's design, the flawless integration of the internal components is truly remarkable. Each piece was purchased, calibrated, and assembled individually, yet all work together as a dependable product.

The central idea behind this design was to create a lightweight and user-friendly product with various real-world applications. These include, but are not limited to, military situations, exploratory submersibles, and search and rescue operations. The overall focus of this implementation was intelligence gathering, regardless of the environment. A modular design complements this goal; each piece can be modified, adapted, or replaced to accommodate the requirements of any scenario.

A perfect example of the modular design is the onboard gas sensor; the current model employs an alcohol sensor for proof of concept and ease of testing. However, this sensor could easily be replaced by another to detect oxygen levels, explosive materials, hazardous chemicals, or harmful gases. From a military standpoint, this could provide vital information to soldiers exploring unknown environments about whether it is safe to enter.

Beyond the sensor, the wheels and attached motors can be upgraded to caterpillar treads or larger wheels, allowing for exploration into rough terrain. The claw can be retrofitted or upgraded to allow additional degrees of freedom. This could be used for collecting samples, removing hazardous waste, or even bomb defusing. In these delicate situations, an intimate movement-based control system provides a very simple and natural user interface.

Finally, the camera allows for remote navigation and information gathering pertaining to the robot's immediate environment. In theory, the only control requirement is that both the user and robot have network access, as all data can be transmitted wirelessly via satellite. Therefore, the control system and robotic unit could be located in different countries; due to a tight budget, we were unable to test this theory.

VII  Appendix

VII.I  Distribution of Tasks

Throughout the duration of the project, the vast majority of time spent working was done in a group with both members of the team present. Roles were loosely defined; group members worked on accomplishing tasks together as a cohesive unit. That being said, each member of the group did gravitate towards accomplishing certain tasks in order to achieve the most efficient use of time.

Anna was primarily responsible for tasks including, but not limited to, quality control, optimization, and research. She led the initial research into the project when the topic was under consideration. It was also her initial idea to create a Kinect-controlled robot. Additionally, she took it upon herself to determine the range of motion that the Kinect can interpolate, including the optimal range for the Kinect, fields of vision, and other mathematical metrics.

Alex was the lead technical expert on the project. He analyzed much of the research that Anna presented, applying his understanding of Computer Engineering to determine what could be accomplished in the six-month time span of a senior project. He was also responsible for determining how to create a Kinect-controlled robot, breaking down Anna's idea for the project into reasonably sized milestones. Additionally, Alex investigated the Arduino Uno, Xbee radios, FAAST, and other technologies to determine what could be applied to the project and how it could be best utilized.

Anna took charge of quality control and optimization throughout the project, organized the materials and task list, and streamlined all paperwork associated with the project. She placed the majority of the orders for materials that were required for the project and kept track of expenses incurred. Having taken CSC 101 and 102, as well as having experience with Matlab and SAS, Anna was a valuable resource when developing the codebase for the project.

Alex, on the other hand, spent time reading through datasheets for parts being considered for the project to determine not only if they were a good fit, but also how to integrate them with the existing components. He also wrote the majority of the code, not because Anna was incapable of doing so (far from it), but rather because he is more practiced with software development and could develop code faster. During the writing process, Anna was intent on double checking each line of code as it was created to ensure that mistakes were caught and corrected early in the development cycle.

Anna was also primarily responsible for the majority of the physical hardware realization of the robot. She designed and built the robot's frame and mounted the motors, wheels, claw, camera stand, and many of the internal components. Given a circuit diagram designed by Alex, she was able to wire (and re-wire many times while troubleshooting) the internal circuitry of the robot.

Alex was responsible for the majority of interfacing components, especially those associated with the camera. He spent many hours attempting to establish communication between his laptop and the AIC250W wireless camera via a wireless router, to no avail. After much hard work, this task was finally accomplished, and now the wireless camera can relay a live video feed to the browser of his laptop while the robot is in operation.

In summary, the entire project was truly a group effort. Neither person undertook a larger burden of the work; each person was individually motivated to accomplish the tasks at which they excelled. The outcome and success of this project would not have been realized without both of these dedicated individuals working in collaboration.

VII.II  Lessons Learned

The following list is a compilation of the highlights of the lessons that we learned while working on this project. We hoped to include the lessons that could be applicable to future projects, regardless of how similar they are to ours.

- Ground everything in the circuit to avoid floating logic levels. When in doubt, ground it out!
- Removing the back of the breadboard causes the rails to drop out of the breadboard at sporadic times, making troubleshooting hardware components a nightmare.
- Double check all voltages to components. Too much voltage fries things. Like our Xbee radio...
- You get what you pay for. We should have invested in nicer wires, breadboards, camera, etc. to save ourselves many headaches that were caused by cheap/unreliable components.

VII.III  Special Thanks

We would like to thank the following resources for their various contributions to our project. The following list is in no particular order, and without any one of these entities, our project would not have been the same.

- Dr. Franz Kurfess, for his guidance and knowledge
- Google, for its infinite wisdom and resources
- Caffeine, for providing emotional, moral, and physical support
- Allison Bachman Okihiro, for her design of our poster
- Parents, grandparents, and other family members who supported us both emotionally and financially through the process
- Cline Red Label Zinfandel, for keeping us sane and level-headed
- The TI E2E™ Community and Arduino Forum, for their technical knowledge and troubleshooting expertise
- Each other, for not giving up, giving in, or screaming in anger
- Texas Instruments, for providing high-quality parts at zero cost
- Steven "Loki" Hodges, for helping us come up with the KANGARoO acronym
- Dr. Zoe Wood, for her help in the initial stages of the original project
- The USC Institute for Creative Technologies, for providing FAAST, without which our project would not have been possible
- Brian Fein and Aaron Goswami (Anna's roommates), for allowing us to take over the living room and dining room as a semi-permanent workspace
- Max Paolozzi, for providing us hardware and knowledge for this project
- Sparkfun, for replacing the XBee we destroyed free of charge
- PolyPrints, for printing our poster at the last minute
- Lucky Larry Blog, for providing a comprehensive design for the basic circuit we used in the project

VII.IV  Source Code

Below is the code developed for this project using the Arduino coding environment. It relies heavily on the built-in Arduino serial libraries.

int motor1Pin1 = 8;   // pin 2 on SN754410
int motor1Pin2 = 12;  // pin 7 on SN754410
int motor2Pin1 = 11;  // pin 10 on SN754410
int motor2Pin2 = 10;  // pin 15 on SN754410
int motor3Pin1 = 2;   // claw pin 1
int motor3Pin2 = 7;   // claw pin 2

int gasSensorPin = A0;
int sensorValue = 42;
boolean clawOpen = false;

void setup()
{
  // set the motor pins as outputs:
  pinMode(motor1Pin1, OUTPUT);
  pinMode(motor1Pin2, OUTPUT);
  pinMode(motor2Pin1, OUTPUT);
  pinMode(motor2Pin2, OUTPUT);
  pinMode(motor3Pin1, OUTPUT);
  pinMode(motor3Pin2, OUTPUT);

  // Create Serial object
  Serial.begin(9600);
}

void loop()
{
  // Have the Arduino wait to receive input
  while (Serial.available() == 0);

  // Read the serial data
  char data = Serial.read();

  switch (data)
  {
    case 'w':
      Serial.println("Moving Forward");
      digitalWrite(motor1Pin1, LOW);
      digitalWrite(motor1Pin2, HIGH);
      digitalWrite(motor2Pin1, LOW);
      digitalWrite(motor2Pin2, HIGH);
      break;

    case 's':
      Serial.println("Moving Backward");
      digitalWrite(motor1Pin1, HIGH);
      digitalWrite(motor1Pin2, LOW);
      digitalWrite(motor2Pin1, HIGH);
      digitalWrite(motor2Pin2, LOW);
      break;

    case 'd':
      Serial.println("Turning Right");
      digitalWrite(motor1Pin1, LOW);
      digitalWrite(motor1Pin2, HIGH);
      digitalWrite(motor2Pin1, HIGH);
      digitalWrite(motor2Pin2, LOW);
      break;

    case 'a':
      Serial.println("Turning Left");
      digitalWrite(motor1Pin1, HIGH);
      digitalWrite(motor1Pin2, LOW);
      digitalWrite(motor2Pin1, LOW);
      digitalWrite(motor2Pin2, HIGH);
      break;

    case 'q':
    case 'e':
      if (clawOpen == true)
      {
        Serial.println("Closing Claw");
        digitalWrite(motor3Pin1, LOW);
        digitalWrite(motor3Pin2, HIGH);
        clawOpen = false;
        delay(1000);
        digitalWrite(motor3Pin1, LOW);
        digitalWrite(motor3Pin2, LOW);
      }
      else
      {
        Serial.println("Expanding Claw");
        digitalWrite(motor3Pin1, HIGH);
        digitalWrite(motor3Pin2, LOW);
        clawOpen = true;
        delay(1000);
        digitalWrite(motor3Pin1, LOW);
        digitalWrite(motor3Pin2, LOW);
      }
      break;

    case 'f':
      Serial.println("Reading Alcohol Sensor...");
      sensorValue = analogRead(gasSensorPin);
      Serial.print("Value: ");
      Serial.println(sensorValue);
      break;

    default:
      Serial.println("Turning Off");
      digitalWrite(motor2Pin1, LOW);
      digitalWrite(motor2Pin2, LOW);
      digitalWrite(motor1Pin1, LOW);
      digitalWrite(motor1Pin2, LOW);
      digitalWrite(motor3Pin1, LOW);
      digitalWrite(motor3Pin2, LOW);
  }

  // Echo the input
  //Serial.print(data);
}





VII.V Trac Wiki

Below is a text-only version of the milestones logged on the Trac Wiki throughout the project. The full Wiki is available here: https://wiki.csc.calpoly.edu/KinectExplore/roadmap?show=all

Milestone: Control Video Game using Kinect Gestures

Completed 5 months ago (01/16/12)

By utilizing FAAST and the Xbox Kinect, we will be able to control a video game that has basic keyboard controls (i.e. w / a / s / d) with gestures. These include, but are not limited to, leaning, arm extension, leg movement, body rotation, and stepping.


Milestone: Finish Milestones

Completed 4 months ago (01/28/12)

Finish inputting milestones for Wiki.


Milestone: Purchase MCU for Robot

Completed 4 months ago (01/29/12)

Make a decision on the following components:


UART / Serial

Bluetooth / RF

USB

Purchased:


Xbee Shield Module: http://www.amazon.com/gp/product/B006TQ30TW

Arduino Uno: http://www.amazon.com/gp/product/B006GX8IAY

(2) XBee 1mW Wire Antenna - Series 1: http://www.sparkfun.com/products/8665

XBee Explorer Dongle: http://www.sparkfun.com/products/9819

Milestone: Research Wireless Communication

Completed 4 months ago (02/06/12)

Research wireless communication capabilities and
development between Arduino Uno and
laptop.


Milestone: Purchase Wheels and Motors

Completed 4 months ago (02/08/12)

Four wheels, two DC motors. Two wheels will be attached to one motor each, the other two will
not have power.


Order #1J76140:


4 x #186 Solarbotics GMPW-Y Yellow Wheel = 14.00

2 x #181 Solarbotics GM3 224:1 Gear Motor 90 deg. Output = 11.90

Purchased 2/8/2012 2:27PM


Milestone: Enable Communication Between Laptop & Arduino

Completed 2 months ago (03/21/12)

Make sure wireless communication is set up and stable. Set up real-time communication capabilities.


Partially completed as of 2/6/2012



Enabled communication from Arduino to Laptop, need to enable Laptop to Arduino
communication


Update as of 2/27/2012

Communication from Laptop to Arduino still does not work. We have reached out to the Arduino community to see if there is supplementary code or ideas of how to fix this problem.


Update as of 3/11/2012

We believe the shield we are currently using has a manufacturing defect that does not allow the
XBee antennae to sit flush. We have ordered a new shield, which should come by the end of the
week.


Update as of 3/21/2012

The jumper was on the USB setting instead of the XBee setting. When we received the boards, they were set up in this manner, so we did not think to change this setting.


Update as of 5/22/2012

While re-wiring the circuit, we accidentally hooked up 9V instead of 5V to the XBee, and we now have only 1 functional XBee. We will obtain a replacement ASAP.


Milestone: Design Body & Arms

Completed 2 months ago (03/27/12)

This includes, but is not limited to motors, servos, "hand", materials, power supply, webcam
mount.


Used a box design for the body with a hinged lid allowing for storage of internal components inside the box, and an overall clean look for the robot. Also drilled two holes in the lid; one is to allow for wires to exit the box and the other is for the camera mount. The mount came with the camera when purchased.


The motors chosen were simple plastic geared DC motors because they were low power and
inexpensive. No servos were used in this project.


The claw was a kit obtained from Vex robots, which included a DC motor for the claw's operation.


The power supply consists of battery packs and a fixed 5V output linear voltage regulator for the
Arduino Uno, Camera and claw. The motors operate on a separate circuit powered via a 9V
battery.


Milestone: Enable Wheel & Motor Usage

Completed 2 months ago (03/28/12)

Hook up motors and wheels. Test and troubleshoot maneuverability.


Update: 3/9/2012

We called companies that sample ICs. Mouser Electronics has agreed to send us three motor
controllers, completely free of charge.


Update: 3/13/2012

Kinect Assisted Natural Gesture Approach to Robotic Operation

P a g e

17


The motor controllers arrived in the mail today. We will hopefully be able to get these hooked up
by next week at the latest.


Update: 3/22/2012

Wired up motor controller and motors to bread board. Wired to Arduino Uno. Motors are not
turning.


Update: 3/28/2012

Fixed issue with motors; a floating ground existed in the circuit. Once the Arduino Uno and the
motor's power source shared a common ground, the problem was resolved.


Update: 5/15/2012

Discovered that Arduino Uno was becoming unusually hot when
motors were operating. Unsure
as to the source of the problem.


Update: 5/23/2012

After re-reading the datasheet for the motor controllers, noticed that there are separate VCCs for motor power and logic level on the motor controllers. We had hooked up the motor power (9V) to both VCC1 and VCC2, making the logic level 9V, even though the Arduino Uno's logic level is 5V. This was causing current to sink into the Arduino Uno. Re-wiring the circuit so that VCC1 and VCC2 are separate voltages seems to have alleviated the problem.


Milestone: Buy all materials for body

Completed 2 months ago (03/30/12)

Used 1/4" plywood, 1/2" nails, gorilla glue, and 2 hinges.


Milestone: Complete Build of Basic Model

Completed 2 months ago (03/30/12)

Build the basic model of the robot. This includes, but is not limited to the following:


wheels

motors

camera base

bread board

batteries for Arduino

9v battery for motors

Arduino Uno


Milestone: Create Movement-Only Look-up Table

Completed 2 months ago (04/01/12)

Create and implement a movement-only look-up table. These movements will include forward / back / left / right.


Update 4/1/2012

The current table is as follows:


User Gesture          Robot Movement
Left hand out         Turn left
Left hand forward     Move backward
Right hand out        Turn right
Right hand forward    Move forward


Picture: Alex hard at work coding the look-up table and corresponding gestures. (Kinect in picture)


Milestone: Research Sensors

Completed 7 weeks ago (04/13/12)

Possible sensors found on pololu:


Flammable gas & smoke sensor

Alcohol gas sensor

Methane gas

LPG / Isobutane / Propane gas sensor

Carbon monoxide gas sensor

Carbon monoxide and flammable gas sensor

Other possible sensors:


Temperature

Oxygen

Distance

We have ordered two gas sensors: alcohol and propane. We
chose these because they are
relatively easy to test.


Update: 5/1/2012 Due to space constraints, we decided to only use the alcohol sensor.


Milestone: Arm Design

Completed 7 weeks ago (04/14/12)

We found an arm that is capable of grabbing and manipulating various objects. According to the manufacturer, it is strong enough to hold a 20-oz soda can. Link below.



Robotic Hand


Milestone: Order Arm

Completed 4 weeks ago (05/01/12)

We have ordered the following hand component. Once we have the claw, we will determine the best way to mount it to the robot.



Robotic Claw


Milestone: Enable Webcam Capabilities

Completed 4 weeks ago (05/03/12)

Utilize two 4-AA battery packs running at 7-12V DC, which will be run through a voltage regulator. The regulator will output 5V, which the camera will use, so it will be run on a total of 8 AAs. The camera being used is the AIC 250W.


Update as of 4/28:

We are still working on getting the camera working through a router. We are having issues with the software used to view the feed.


Update as of 5/7:



Enabled webcam on Thursday. Used the following websites as a guide:


Setup Site 1


Setup Site 2


Camera mount shown below:


Milestone: Enable Sensors

Completed 4 weeks ago (05/03/12)

Enable the alcohol sensor.


Update as of 4/28/2012: We wired up and were able to read in from the alcohol sensor. We tested it using 71% rubbing alcohol and were able to determine the absence or presence of the alcohol.


Update as of 5/2/2012: We determined the best resistor is 100KΩ. We discovered the sensor is very difficult to calibrate. Thus, we plan to use it to provide boolean output. The user will get an alert when the sensor detects alcohol.


Milestone: Enable Arm

Completed 3 weeks ago (05/13/12)

Hook up and mount arm. Test code and integrate final code into main robotic control code.


Update 5/12/2012: We chose to mount the arm on the bottom of the robot so it would not get in
the way of the camera. The arm is now mounted and fully functional.


Milestone: Test & Finalize

2 weeks late (05/14/12)

Milestone: Last Minute Checklist

2 weeks late (05/18/12)

Final preparations (in no particular order):

* Finalize wiki

* Finalize total costs

* Poster board

* Temperature sensor

* ADC