AUTOBOTS

Block Diagram

Editor: Ben Wagner


H.J. Mroz

Scott Owen

Teya Reinbold

Ben Wagner


10/17/10

Abstract:


This article contains block diagrams describing the Grove City College Autobots autonomous robotics project. It presents the block diagram of the overall system as well as block diagrams for more specific pieces of the project.

Introduction:


This paper begins with the overarching system block diagram and ends with the smaller sensor block diagrams. The microcontroller is the brain of the system and sits at the center of the block diagram at every level. It accepts the inputs from all of our sensors and contains all of the intelligence to decide what the correct adjustments to the outputs are. The microcontroller runs all of the programming and allows the different parts of the system to interact to produce a working product.
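
As a rough illustration of that sense-decide-act role, a minimal control loop might look like the sketch below. This is only a sketch: the sensor fields, function names, and threshold values are hypothetical placeholders, not the team's actual firmware.

```c
/* Minimal sketch of the microcontroller's sense-decide-act loop.
 * All names and values here are hypothetical placeholders. */
#include <stdbool.h>
#include <stdio.h>

typedef struct {
    int  camera_offset;   /* horizontal offset of the target seen by the camera */
    bool gizmo_magnetic;  /* magnetic-property sensor reading                    */
} SensorData;

typedef struct {
    int  left_speed;      /* left chassis motor command  */
    int  right_speed;     /* right chassis motor command */
    bool claw_open;       /* claw adjustment command     */
} Outputs;

/* Stubbed sensor read so the sketch runs on a host machine. */
static SensorData read_sensors(void) {
    SensorData s = { .camera_offset = 12, .gizmo_magnetic = true };
    return s;
}

/* The "intelligence": turn sensor inputs into output adjustments. */
static Outputs decide(SensorData in) {
    Outputs out = { .left_speed = 50, .right_speed = 50, .claw_open = false };
    if (in.camera_offset > 5)  out.right_speed += 10;  /* steer toward the target */
    if (in.camera_offset < -5) out.left_speed  += 10;
    if (in.gizmo_magnetic)     out.claw_open = true;   /* prepare to pick it up   */
    return out;
}

static void apply_outputs(Outputs out) {
    /* The real system would drive the motors; here we just print the commands. */
    printf("L=%d R=%d claw=%s\n", out.left_speed, out.right_speed,
           out.claw_open ? "open" : "closed");
}

int main(void) {
    for (int i = 0; i < 3; i++) {   /* the real loop would run continuously */
        apply_outputs(decide(read_sensors()));
    }
    return 0;
}
```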


















Figure 1: Overall System Block Diagram (blocks: Sensors, Microcontroller, Chassis/Arm)

Figure 1 shows the three main components of the design. This is the broad view of the project. The three main parts shown are essential to a good design: a working physical system (the chassis and arm), a microcontroller that can interact properly with all of the motors and sensors, and the correct sensors giving us all of the necessary information. Getting all of these to work together will result in a successful product.

Getting the electrical and mechanical systems to work together correctly is the main problem for this type of embedded system. Teya Reinbold will be in charge of making sure that the microcontroller interacts properly with the arm and claw system for picking up important objects on the playing field. Ben Wagner will program the interaction between the microcontroller and the motors that drive our chassis.


















Figure 2: Specific interactions in the system (blocks: Robot Steering, Arm Control, Magnetic Sensor, Microcontroller, Camera, Docking Port, Motor Encoders)



Figure 2 shows how the microcontroller interacts with the physical systems as well as with the electrical sensor system. The physical systems are the steering system, which drives the chassis, and the arm, which collects the parts on the floor of the competition field. The microcontroller sends the signals these systems need to make the correct movements, acting as the brain of the system. The sensors, such as the camera and the magnetic sensor, interact directly with the microcontroller.

Scott Owen will be responsible for the integration and implementation of the camera with the microcontroller. He will write the programming for object recognition and send the appropriate data to the microcontroller. The microcontroller sends signals out to these sensors, such as an acknowledge bit, and receives information about the robot's surroundings, such as object positioning from the camera and whether a gizmo is magnetic or not.
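
One plausible shape for that exchange is sketched below. The helper names (set_ack_line, data_ready, read_sensor_word) and the polling scheme are assumptions made for illustration; the real camera and magnetic-sensor interfaces may look quite different.

```c
/* Sketch of a simple acknowledge-and-read exchange with a sensor.
 * The hardware helpers are hypothetical stubs so the sketch runs on a host. */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

static bool     ready_flag = true;                               /* stub state     */
static void     set_ack_line(bool level) { (void)level; }        /* stub GPIO write */
static bool     data_ready(void)         { return ready_flag; }  /* stub GPIO read  */
static uint16_t read_sensor_word(void)   { ready_flag = false; return 0x012C; }

static uint16_t request_reading(void)
{
    set_ack_line(true);               /* raise the acknowledge bit            */
    while (!data_ready()) { }         /* wait for the sensor to present data  */
    uint16_t value = read_sensor_word();
    set_ack_line(false);              /* drop the acknowledge bit again       */
    return value;
}

int main(void)
{
    printf("sensor reading: %u\n", (unsigned)request_reading());
    return 0;
}
```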

H.J. Mroz will be in charge of making sure that a working magnetic-property sensor interacts with the microcontroller and that the data collected can be used to make critical decisions about how the claw reacts to this information. The microcontroller also interacts with the docking station; this section will also be the responsibility of H.J. Mroz. H.J. will make sure that the dock makes good contact with the microcontroller input ports. When the robot makes contact with the docking station, the microcontroller accepts input which lets us know which type of product is defective and which type is acceptable.
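
A minimal sketch of that docking-port read is shown below; the contact-detection helper, the one-bit encoding, and the product labels are assumptions made only for illustration.

```c
/* Sketch of reading the docking-port input to learn which product type is
 * defective. The helpers and the bit encoding are hypothetical. */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

typedef enum { NOT_DOCKED, GIZMOS_DEFECTIVE, GADGETS_DEFECTIVE } DockReport;

static bool    dock_contact_made(void) { return true; }   /* stub for the sketch */
static uint8_t read_dock_bits(void)    { return 0x01; }   /* stub for the sketch */

static DockReport poll_docking_port(void)
{
    if (!dock_contact_made())
        return NOT_DOCKED;                     /* no information until contact */
    /* Assumed encoding: bit 0 set means the gadgets are the defective type. */
    return (read_dock_bits() & 0x01) ? GADGETS_DEFECTIVE : GIZMOS_DEFECTIVE;
}

int main(void)
{
    printf("dock report: %d\n", (int)poll_docking_port());
    return 0;
}
```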
Motor encoders allow the system to keep track of how far the motors have rotated and to compare this data in order to keep the robot driving straight.
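
A common way to use that comparison is a simple proportional correction on the difference in encoder counts, sketched below; the gain, base speed, and encoder helper names are illustrative assumptions rather than the team's implementation.

```c
/* Sketch of keeping the robot straight by comparing encoder counts. */
#include <stdio.h>

#define BASE_SPEED 60   /* nominal motor command, arbitrary units      */
#define KP         2    /* proportional gain on the encoder difference */

static long read_left_ticks(void)  { return 1005; }   /* stub encoder values */
static long read_right_ticks(void) { return 1000; }

int main(void)
{
    long error = read_left_ticks() - read_right_ticks(); /* >0: left wheel is ahead  */
    int  left  = BASE_SPEED - (int)(KP * error);         /* slow the faster side     */
    int  right = BASE_SPEED + (int)(KP * error);         /* speed up the slower side */
    printf("left=%d right=%d (error=%ld)\n", left, right, error);
    return 0;
}
```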

















Figure 3: Steering and Arm Control interaction system (blocks: Microcontroller, Camera, Docking Port, Magnetic Sensor, Left Chassis Motor, Right Chassis Motor, Claw Adjustment Motor, Arm Height Adjustment Motor, Motor Encoders)

A more detailed representation of the robot steering and arm control can be seen in Figure 3. The robot steering is split into the left and right chassis motors. Each motor makes its respective wheel go faster or slower depending on what movement is indicated by the microcontroller. The arm movement is also determined by the microcontroller. The height-controlling motor raises and lowers the arm so that it is at the right level to gather objects, and the claw adjustment motor controls whether the claw is open or closed. This allows the robot to collect gadgets and gizmos according to the information gathered by the sensors.
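
The sketch below shows one way those high-level commands could be mapped onto the four motors in Figure 3; the command structure and value ranges are assumptions for illustration only.

```c
/* Sketch of mapping a high-level drive/arm command onto the four motors in
 * Figure 3. The MotorCommands structure and value ranges are hypothetical. */
#include <stdbool.h>
#include <stdio.h>

typedef struct {
    int  forward;     /* -100..100: overall forward speed              */
    int  turn;        /* -100..100: negative = left, positive = right  */
    int  arm_height;  /* command for the arm height adjustment motor   */
    bool claw_closed; /* claw adjustment motor target position         */
} MotorCommands;

static void drive(MotorCommands c)
{
    int left  = c.forward + c.turn;   /* differential steering: opposite      */
    int right = c.forward - c.turn;   /* offsets on each wheel turn the robot */
    printf("left=%d right=%d arm=%d claw=%s\n",
           left, right, c.arm_height, c.claw_closed ? "closed" : "open");
}

int main(void)
{
    MotorCommands c = { .forward = 50, .turn = 20, .arm_height = 10, .claw_closed = false };
    drive(c);   /* veer right while raising the arm with the claw open */
    return 0;
}
```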

In testing this system, we will individually piece together each of the sections so that they work by themselves. First, each leading member will get his part working alone, and then we will slowly integrate them. Once this is done, we can begin to program the whole system together. Once we have the camera working, we will begin to make driving decisions based on its input, and once we have the arm working, the camera will also control its movement. After this step, the magnetic sensor and docking station will be implemented to make our system complete and able to perform all of the functions of the game.

We are also planning on using wood to build our chassis. Wood is an easy material to manipulate, which will allow us to have a system that is easy to put together. The wood also needs to be light enough so that we meet our goal of staying under 24 pounds.

In order to have enough power for the system, the maximum current draw at any one time had to be calculated, assuming all four motors run at once. The two large motors that drive the system draw a maximum of 3.34 A each, the one medium motor used for arm height draws a maximum of 2.39 A, and the one small motor draws 0.8 A. Adding 2 A to cover any extra pull from the motors, the microcontroller, and the sensors gives a total maximum current rating of 11.87 A for the system. Because of this, a 7.2 V, 3000 mAh battery will be used. Dividing the 3 Ah capacity by the 11.87 A maximum draw gives the fraction of an hour for which the battery can produce the necessary power; the result still allows the system to run for over 15 minutes, which is 5 times longer than the competition time of 3 minutes.
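
The arithmetic can be summarized in a few lines of code, using only the figures quoted above:

```c
/* Check of the current-budget and run-time arithmetic quoted above. */
#include <stdio.h>

int main(void)
{
    double total_amps = 2 * 3.34   /* two large drive motors, 3.34 A each */
                      + 2.39       /* medium arm-height motor             */
                      + 0.8        /* small motor                         */
                      + 2.0;       /* margin for controller and sensors   */
    double capacity_ah = 3.0;      /* 3000 mAh battery                    */
    double runtime_min = capacity_ah / total_amps * 60.0;

    printf("total draw: %.2f A\n", total_amps);      /* 11.87 A           */
    printf("runtime: %.1f minutes\n", runtime_min);  /* about 15 minutes  */
    return 0;
}
```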

One important change from the original design and requirements is the addition of the camera. Originally, the plan was to use the Wii camera or some other IR camera, combined with ping sensors, to locate the gizmos and gadgets. However, after further consideration, those sensors would have a very hard time doing what was necessary to locate these items. Ping sensors would be hard to use, as the objects are round and would not necessarily reflect a ping perfectly. The decision was made that shape recognition would be easiest, and this could be done using a camera and image-processing software. Furthermore, this would allow us to implement color recognition, which in turn would enable the robot to autonomously sort the gizmos instead of the users having to sort them.

The camera system simplifies our design because instead of using two or more sensors, we only need to use one, which simplifies our implementation process.
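
As a rough illustration of the color-recognition idea, the toy classifier below sorts an RGB sample by its dominant channel. The real image-processing software would be far more involved, and the color names and threshold here are assumptions, not the actual competition colors.

```c
/* Toy sketch of color recognition for sorting: classify an RGB sample by its
 * dominant channel. The classes and the threshold are illustrative only. */
#include <stdio.h>

typedef enum { RED_GADGET, GREEN_GADGET, BLUE_GADGET, UNKNOWN_COLOR } ColorClass;

static ColorClass classify(unsigned char r, unsigned char g, unsigned char b)
{
    if (r > g + 30 && r > b + 30) return RED_GADGET;
    if (g > r + 30 && g > b + 30) return GREEN_GADGET;
    if (b > r + 30 && b > g + 30) return BLUE_GADGET;
    return UNKNOWN_COLOR;   /* no channel clearly dominates */
}

int main(void)
{
    printf("sample (200, 40, 30) -> class %d\n", (int)classify(200, 40, 30));
    return 0;
}
```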



Conclusion:


The block diagrams illustrate the different components of the project and how they interact with one another. Completing all of these sensors and their interactions will allow us to fulfill our goals of sorting both the gizmos, organized by magnetic properties, and the gadgets, organized by color. Each block is essential to completing the previously stated objectives, and the successful implementation of all blocks will result in a successful project.