SAN DIEGO STATE UNIVERSITY



12/22/2010


NEUTRONNIX

Presents

TrakTor



Louis Dudley

Vanessa Capetillo

Jason Farnsworth

Israel Cintora

Eric Bailey

Tuan Le

Emmanuel Garsd



SPONSORED BY SAN DIEGO STATE UNIVERSITY DEPARTMENT OF ELECTRICAL AND COMPUTER ENGINEERING




Contents

Abstract
Introduction
System Design
    Navigation
        Sensor Network
        Sensor Network Printed Circuit Boards
        Sensor Master
        Sensor Master Printed Circuit Board
        25kHz Ultrasonic Sensors
        Zigbee RF Module
        42kHz Ultrasonic Sensors
        Compass
        Appearance Based Navigation
    Locomotion
        Chassis
        Locomotion Software
        Speed Control
        Locomotion Printed Circuit Board
Conclusion and Recommendations
Appendix
    Schematics
    Printed Circuit Boards








Abstract

Given the task of designing an autonomous robot to guide its way through a predetermined course, Team Neutronnix presents TrakTor. TrakTor is the only robot in San Diego State University's robo-navigation challenge to incorporate image recognition into its design. This image recognition, along with an array of sensors, guides TrakTor through each successive beacon while avoiding obstacles along the way. Using this advanced technology, TrakTor was able to win the competition and pave the way to the future of autonomous navigation.






Introduction

The challenge presented to Team Neutronnix was to build an autonomous robot that navigates a predetermined course in the quickest time possible. The course consisted of five beacons, strategically placed to require an intelligently crafted robot with a variety of sensors. Each beacon emitted a 25kHz ultrasonic signal and carried a Zigbee radio-frequency transmitter to aid TrakTor in navigation.

Team Neutronnix split itself into two teams: Navigation and Locomotion. The Navigation team was responsible for determining the direction the robot should move, while the Locomotion team was responsible for making the robot move in that direction. The central processing unit for TrakTor is a netbook computer. The netbook brain connects to a sensor master from the Navigation team and a controller from the Locomotion team. The Navigation team utilized 25kHz ultrasonic sensors and a Zigbee RF receiver module for beacon detection, 42kHz ultrasonic sensors for obstacle avoidance, a compass for error correction, and image processing through FABMAP for overall robot guidance. The Locomotion team utilized Pololu motor controllers and wheel encoders with an Atmel AVR microcontroller to move the robot to the correct location. Team Neutronnix selected these components because we believed this setup would provide the greatest probability of completing the course. TrakTor is a 20"x20"x20" image-processing autonomous robot, complete with two power sources and an emergency stop switch, designed with the integrity and intelligence to reach each beacon within a $750.00 budget.

System Design

Navigation

The goal of the navigation system is to guide TrakTor through the course by avoiding obstacles, locating waypoints, and sending direction information to the locomotion system.




Sensor Network

In order to isolate and maximize the performance of all of our sensors, we decided to build a sensor network. The sensor network allowed each sensor to feed up-to-the-microsecond information while maintaining just a single point of contact with the netbook brain. To do this, we used a single microcontroller, called the sensor master, to act as a hub for the sensors. Each sensor group had its own satellite microcontroller that polled its sensors for data. There was one satellite microcontroller for the two 42kHz ultrasonic sensors, one for the compass, one for the two 25kHz ultrasonic sensors, and one to connect to the RF module. Each microcontroller was a PIC16F1827. This microcontroller was chosen because it had multiple MSSP modules (used for SPI and I2C), adequate speed, an internal oscillator, and very low cost.

Sensor Network Printed Circuit Boards

In order to integrate our sensor network efficiently and inexpensively, we decided to employ an array of custom-made printed circuit boards (PCBs). Each sensor in the network had its own printed circuit board (4 total), which contained its dedicated PIC microcontroller.

The printed circuit boards for each of the sensor networks employed a relatively simple design for ease of use and customizability. Each comprised a 2-pin power-in header adjacent to a bypass capacitor to reduce noise, a 6-pin header to allow programming and in-circuit debugging of the PIC microcontrollers via a PICkit programmer, and a 4-pin header used to communicate via the SPI protocol with our sensor master. The PCBs for the 25kHz and 42kHz ultrasonic sensor networks each had two 3-pin headers, one for each connected sensor, while the Zigbee RF sensor network had only one 3-pin header. On these 3-pin headers, one pin carried data from the related sensor, one carried power, and the last was ground. The PCB for the compass used a 4-pin header, as the compass required the I2C protocol for communication and needed an extra pin for a clock signal. Each header pin was connected via a trace to a pin on the PIC microcontroller.




Sensor Master:

The heart of the sensor network was the sensor master. The sensor master used the MSSP module to create an SPI bus that all of the satellite microcontrollers could connect to. SPI was chosen because it has low overhead and high speed while needing very few connections; the data-in, data-out, and clock lines are shared among the slaves. A satellite microcontroller is selected by lowering its slave-select pin. The master lowers this pin and then sends a clock signal. Each clock tick allows the master to receive another bit from the slave.

The sensor master was connected to the netbook brain using an FTDI chip. This chip converts USB signals into RS-232 signals, then converts the logic levels so the microcontroller can understand them. In order to stay synchronized with the netbook brain, the sensor master sat and waited to receive the ASCII character 'a'. Once this character was received, the master would poll each satellite over SPI to receive its sensor data. Once all the data had been received, the master sent it to the brain so it could be integrated into the navigation program.
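To illustrate this handshake, a minimal C sketch of the master's main loop follows. The helper functions (uart_getc, uart_putc, spi_select, spi_transfer) and the per-satellite reply lengths are assumptions standing in for the PIC16F1827's EUSART and MSSP register accesses, not the actual firmware.

    #include <stdint.h>

    #define NUM_SATELLITES 4   /* 25kHz pair, 42kHz pair, compass, RF module */

    /* Hypothetical hardware-access helpers; on the PIC16F1827 these
       would wrap the EUSART and MSSP registers. */
    uint8_t uart_getc(void);              /* blocking read from the FTDI/netbook link */
    void    uart_putc(uint8_t b);         /* blocking write to the FTDI/netbook link */
    void    spi_select(uint8_t slave);    /* pull that satellite's slave-select low */
    void    spi_deselect(uint8_t slave);  /* release slave-select */
    uint8_t spi_transfer(uint8_t out);    /* clock one byte out, one byte in */

    /* Bytes each satellite reports (assumed; compass needs two for 16 bits). */
    static const uint8_t reply_len[NUM_SATELLITES] = {1, 2, 1, 2};

    void sensor_master_loop(void)
    {
        for (;;) {
            /* Stay synchronized with the netbook: wait for the ASCII 'a'. */
            while (uart_getc() != 'a')
                ;

            /* Poll each satellite over SPI and forward its bytes to the brain. */
            for (uint8_t s = 0; s < NUM_SATELLITES; s++) {
                spi_select(s);
                for (uint8_t i = 0; i < reply_len[s]; i++)
                    uart_putc(spi_transfer(0x00)); /* dummy byte clocks data in */
                spi_deselect(s);
            }
        }
    }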

Sensor Master Printed Circuit Board:

The main sensor master printed circuit board was designed to connect to each of the outlying sensor network PCBs and also, through the FTDI chip, to the netbook that served as the brain of our project. The sensor master PCB consisted of an array of seven 4-pin SPI headers and seven 2-pin power headers. These were used to communicate with and power the other PCBs employed. Also included were a 6-pin header used to program and debug the PIC microcontroller with the PICkit, and a 2-pin header carrying TX/RX (transmit and receive) signals to an FTDI chip for communication with the laptop. All of the header pins were connected to the PIC microcontroller using traces on the PCBs. Although we did not need, or end up using, all of the SPI and power headers, we included them so we would not face any setbacks if we decided to add more sensors.


25kHz Ultrasonic Sensors:

Detecting the exact location of each beacon is paramount to the success of TrakTor. To accomplish this, two 25kHz ultrasonic sensor circuits were designed to detect the beacon and return the most favorable direction to the sensor master. To receive a usable signal beyond 3 feet, it is necessary to build an amplification circuit that can read the signal input from a greater distance. A large gain on the operational amplifier is the best way to read the signal from the greatest distance. We decided on a gain of 1000 V/V with a capacitively coupled input signal from a band-pass filter to suppress stray frequencies. A variable gain is also possible because a potentiometer was included in the design, as can be seen on the schematic in the Appendix. A comparator receives the amplified output and decides if the analog signal is strong enough to trigger a "hit" for each sensor: if the signal is above a variable voltage threshold, the comparator outputs a digital 5-volt signal; if it is below the threshold, it outputs 0 volts.

This circuit can be seen as an analog-to-digital signal amplifier, which is what our requirements demanded. After testing the circuit numerous times, we settled on a threshold voltage of 2.55 volts. This value provided the most consistent and reliable output at distances up to 25 feet. The comparator outputs 5 volts because we ran our supply voltages at 5 volts for the 25kHz detection circuits. When the amplified analog signal reaching the comparator is higher than the threshold voltage, the comparator outputs a 5-volt digital signal into our PIC16F1827 microcontroller. The stages of the beacon detection circuit are shown below (the input signal was recorded from 4 inches away; the other signals were recorded from 3 feet away):

Oscilloscope captures (figures omitted): input signal (1.05V amplitude); amplified signal (4.80V amplitude); digital signal (5V amplitude); zoomed-out amplified signal (4.80V amplitude).

Debugging the circuits was necessary because no two operational amplifiers or 25kHz receivers are exactly the same. To ensure that each sensor behaved identically, we monitored each circuit's output from the operational amplifier on the oscilloscope. We placed the beacon 15 feet away and adjusted the gain potentiometer on each circuit until the circuits matched. The debugging process was a success.

Upon receiving the output from the comparator, the PIC16 records the "hits" and computes useful information to return to the sensor master. Interrupt-on-Change (IOC), a feature of the PIC16F1827, is what we used to record the inputs from the left and right sensors. Each sensor is routed to its own pin on the PORTB register, which supports the IOC feature. The IOC event triggers when a rising edge occurs on the input signal, i.e. a change from 0 to 5 volts. Once the interrupt is triggered, the TMR0 value is reset and begins counting until the other sensor triggers the IOC flag. Each pin has its own associated flag (IOCBF0 and IOCBF2), which tells us when a pin changed and which pin triggered first. Once the other pin triggers, the timer stops and the value is stored. A prescaler was set for TMR0 so that it overflows after 1 ms. If the other pin does not trigger within that time, we know only one sensor was hit and we are far off to that side. From the time difference and which sensor was hit first, the program decides on an appropriate angle (hard left, left, straight, right, hard right) and returns the associated direction to the sensor master.
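A minimal sketch of this time-difference decision is shown below. The direction encoding and the tick threshold are illustrative assumptions; the real values depend on the TMR0 prescaler and sensor spacing.

    #include <stdint.h>

    /* Direction codes returned to the sensor master (encoding assumed). */
    enum direction { HARD_LEFT, LEFT, STRAIGHT, RIGHT, HARD_RIGHT };

    /* Decide a steering direction from which 25kHz sensor fired first
       and the TMR0 ticks counted before the other sensor fired.  If the
       timer overflowed (1 ms elapsed), only one sensor heard the beacon
       and we are far off to that side. */
    enum direction decide_direction(uint8_t left_first, uint8_t ticks,
                                    uint8_t overflowed)
    {
        if (overflowed)                     /* only one sensor was hit */
            return left_first ? HARD_LEFT : HARD_RIGHT;

        if (ticks < 10)                     /* near-simultaneous hits (assumed cutoff) */
            return STRAIGHT;

        return left_first ? LEFT : RIGHT;   /* moderate offset */
    }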

Zigbee RF Module:

The RF module, as provided by John Kennedy, was essential for knowing when TrakTor had arrived at a beacon. Each ultrasonic beacon emitted a different radio frequency, which was received by the RF module. The module would distinguish which beacon it was receiving and also, through PWM, how far TrakTor was from the beacon. Each data packet would be stored on the module, and the strongest beacon and signal would be output. We decided to output only the strongest beacon and signal to the sensor master because TrakTor uses the RF module solely to know when it is under a beacon. When TrakTor travels under the beacon, the module outputs a value of approximately 0xFC to 0xFF. When TrakTor receives multiple readings greater than 0xFC, it knows it is under the current beacon and transitions to its next state.
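The arrival test reduces to a small debounce on the signal-strength byte, sketched below; the number of consecutive strong readings required is an assumption.

    #include <stdint.h>

    #define UNDER_BEACON_LEVEL 0xFC  /* module reads ~0xFC-0xFF under a beacon */
    #define READS_REQUIRED     3     /* consecutive strong reads (assumed count) */

    /* Feed each strength byte from the RF module; returns 1 once enough
       consecutive readings exceed the threshold, signaling arrival. */
    int beacon_arrived(uint8_t strength)
    {
        static uint8_t strong_reads = 0;

        if (strength >= UNDER_BEACON_LEVEL)
            strong_reads++;
        else
            strong_reads = 0;     /* a weak read breaks the streak */

        return strong_reads >= READS_REQUIRED;
    }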

42kHz Ultrasonic Sensors:

We used two 42kHz ultrasonic sensors on the robot. One was placed on the front of the robot and the other on the front right side. The idea was for these sensors to allow us to do obstacle avoidance as well as some wall tracking. In practice, though, we ended up using them to help the robot go straight in areas where the compass was inaccurate. The front sensor had a very wide beam, which made it difficult to work with, as it picked up on a lot of small items. The side sensor, however, had a very narrow beam and was extremely accurate. The sensors were connected to their own microcontroller, which connected to the master using SPI. This allowed us to make sure the data was always current, as the microcontroller was otherwise very lightly loaded. This was fine, however, because the chip was so low cost, and the accuracy made it worth it. The sensors were read by the microcontroller using pulse-width modulation. We connected the output of each sensor to two inputs on the microcontroller, for a total of four pins. This allowed one pin to monitor the high pulse while the other monitored the low pulse, and this was repeated for both sensors. Once the pulse went high, timer zero was reset; once the pin went low again, timer zero was recorded. Using a 1:128 prescaler gave us approximately inch units. The chip would constantly monitor the sensors, stopping only to send the data to the master when the chip-select interrupt occurred. This interrupt was triggered by a low signal on slave select.
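In outline, the range measurement is a timer capture across the sensor's output pulse. The sketch below mirrors that flow in portable C; read_pin, timer0_reset, and timer0_read are assumed stand-ins for the PIC's GPIO and TMR0 accesses, and the single-pin polling simplifies the two-pin scheme described above.

    #include <stdint.h>

    /* Hypothetical hardware helpers standing in for PIC GPIO/TMR0 access. */
    uint8_t read_pin(uint8_t pin);   /* 1 while the sensor's pulse is high */
    void    timer0_reset(void);
    uint8_t timer0_read(void);

    /* Measure one ranging pulse from a 42kHz sensor.  With a 1:128 TMR0
       prescaler the returned tick count is roughly one inch per tick, as
       described above (exact scaling depends on the oscillator). */
    uint8_t measure_range_inches(uint8_t pulse_pin)
    {
        while (!read_pin(pulse_pin))   /* wait for the pulse to go high */
            ;
        timer0_reset();                /* start timing the high period */

        while (read_pin(pulse_pin))    /* wait for the pulse to end */
            ;
        return timer0_read();          /* elapsed ticks ~ distance in inches */
    }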

Compass

Another sensor used was the compass module. We went with a very inexpensive module with no tilt compensation, as we didn't need a lot of accuracy and only needed relative readings. Our vision for how we would use the compass changed greatly over the course of the project. The original intention was to use it to mark metadata and for turning. This proved unnecessary, but the compass found great use in keeping our robot on a straight line. Before the robot moved straight, it would take a compass reading. If at any time while going straight it veered off path by more than a degree, the robot would compensate with a slight inverse veer.

It was very important that the microcontroller had two MSSP modules, as the compass module communicated using I2C. We set the module to continuous mode so the data would always be ready when needed. The alternative was a polling system, but that would take too much time between a request and its receipt. The only downside to continuous mode is that it uses more power, but power was not an issue in our design, so the tradeoff was well worth it. The microcontroller would constantly read the compass heading over I2C. Once the chip's slave-select pin went low, the data would be sent back to the sensor master. Two SPI transfers were needed, as the compass heading was a 16-bit number. This number represented the heading in degrees with 0.1-degree resolution.
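The straight-line correction described above amounts to comparing the live heading against the heading captured when the move began. A sketch follows, using the 0.1-degree units the compass delivers; the steering interface is an assumption.

    #include <stdint.h>

    /* Heading values in tenths of a degree (0-3599), as delivered by
       the compass over two SPI transfers. */
    typedef int16_t heading_t;

    /* Signed shortest-path difference between two headings, in tenths
       of a degree, handling wraparound at 360 degrees. */
    static int16_t heading_error(heading_t target, heading_t current)
    {
        int16_t diff = (int16_t)((target - current) % 3600);
        if (diff > 1800)  diff -= 3600;
        if (diff < -1800) diff += 3600;
        return diff;
    }

    /* Assumed steering hook: positive bias veers one way, negative the other. */
    void set_steering_bias(int16_t bias);

    /* Called each loop while driving straight: veer slightly against any
       drift larger than one degree (10 tenths), as described above. */
    void hold_heading(heading_t target, heading_t current)
    {
        int16_t err = heading_error(target, current);
        if (err > 10 || err < -10)
            set_steering_bias(-err / 10);  /* gentle inverse veer */
        else
            set_steering_bias(0);
    }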

Appearance Based Navigation

Our main method of navigation was FABMAP (Fast Appearance-Based Mapping). FABMAP is an appearance-based navigation system developed by Mark Cummins at Oxford University. It uses visual vocabularies created with the OpenCV libraries to describe a particular image and returns a probability when comparing that description to another image's descriptor. Using FABMAP, we were able to take pictures of the entire course in what we called mapping mode and create vocabularies for each picture. Then, using these vocabularies, we could create a vocabulary for an image captured while running the course and use FABMAP to calculate the probability that the image matched the location of one of the mapping-mode images.

Although finding the similarity of places based on appearance is not a trivial task, FABMAP did a very effective job of finding matches. It returned very low probabilities when images were clearly from different locations and acceptably low probabilities when images looked similar but were indistinct. For images from the same location, FABMAP returned very high probabilities. FABMAP was very robust: it was able to filter out people and return matches even when objects were missing from one image to the next.

We implemented FABMAP using a netbook with an Atom processor and a high-quality PlayStation camera. The netbook was the main controller for the robot and did all of the sensor processing and FABMAP image processing. Because we needed to map the course before we could run it, we had two modes of operation: mapping mode and running mode.

In mapping mode we manually drove the robot around the predetermined course. The robot either went forward or backward, or turned in place to the left or right. Pictures were taken only when a change in the current motion was needed to navigate the course, or when we wanted to make sure the robot was in the right position on the course. The mapping procedure consisted of taking a picture and then moving to the next position where we wanted to do image comparison. The duration and direction of the move were recorded and written as metadata, to be used in running mode, for the image taken. This process was repeated until the entire course was mapped.

In running mode the robot began by taking a picture and submitting it to FABMAP. This picture was compared to the map of images generated in mapping mode, and a probability that it matched one of those images was returned. If an acceptable match above a 90 percent probability threshold was found, the robot then performed some simple error correction in pose. The error correction consisted of the robot adjusting its forward and backward position and its rotation. After each small adjustment the robot would take another picture to see if the probability of a match went up. If the probability increased, the positioning was improving and the process continued. If the probability went down, that particular error-correcting step was undone and aborted. Once the optimal positioning was found, the metadata for the matched image was read. The metadata, comprising the direction and duration of the robot's move after the image was taken in mapping mode, was then used by the robot to move itself in running mode. After this, another image was taken and the whole process began again. In essence, the robot was performing dead reckoning between checkpoints, at which it used FABMAP to find where it was on the course and to correct error. The error correction zeroed out the errors associated with dead reckoning, such as odometry error from the wheel encoders, which otherwise integrates over time and makes navigation impossible.

There was also the possibility that FABMAP failed to recognize the location it was in, most likely because the robot was not in the right position. In this case the error correction described previously was attempted first. However, if error correction failed to find a match, the robot then started sweep mode. In sweep mode the robot would take a picture, try to localize itself using FABMAP, and then rotate a few degrees. This process was very effective when the error from dead reckoning had left our robot at a bad angle. If 360 degrees were swept and the robot still had not found its location, the robot went into what we dubbed cockroach mode. In this mode the other sensors took priority until the robot could locate itself.
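Put together, running mode is a small state machine around FABMAP. The sketch below captures the control flow described above; fabmap_best_match, the adjustment and motion helpers, and the sweep step size are assumptions standing in for the actual implementation.

    /* Running-mode control flow: localize with FABMAP, refine pose, replay
       the mapped move, and fall back to sweep then "cockroach" mode.
       All helper functions are assumed stand-ins, not the real interfaces. */

    #define MATCH_THRESHOLD 0.90    /* 90 percent probability threshold */
    #define SWEEP_STEP_DEG  10      /* degrees per sweep rotation (assumed) */

    double fabmap_best_match(int *image_id);  /* capture + best match probability */
    void   try_small_adjustment(void);        /* nudge pose forward/back/rotate */
    void   undo_last_adjustment(void);
    void   replay_mapped_move(int image_id);  /* use the image's metadata */
    void   rotate_degrees(int deg);
    void   run_on_other_sensors(void);        /* "cockroach mode" */

    void running_mode_step(void)
    {
        int id;
        double p = fabmap_best_match(&id);

        if (p >= MATCH_THRESHOLD) {
            /* Refine pose while each adjustment raises the probability. */
            for (;;) {
                try_small_adjustment();
                double p2 = fabmap_best_match(&id);
                if (p2 <= p) { undo_last_adjustment(); break; }
                p = p2;
            }
            replay_mapped_move(id);   /* dead-reckon to the next checkpoint */
            return;
        }

        /* Sweep mode: rotate in steps, trying to localize at each stop. */
        for (int swept = 0; swept < 360; swept += SWEEP_STEP_DEG) {
            if (fabmap_best_match(&id) >= MATCH_THRESHOLD)
                return;               /* localized; the next step refines pose */
            rotate_degrees(SWEEP_STEP_DEG);
        }

        run_on_other_sensors();       /* cockroach mode until we can localize */
    }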

Locomotion

The objective of the locomotion system is to take directions from the Navigation team and move TrakTor in the desired direction at the desired speed, adjusting for error along the way.

Chassis

The only requirement regulating the chassis of our robot was that it fit within a 20"x20"x20" cube; other design constraints were left up to us. When we initially determined what size we would like to implement, the decision was made to attempt a much smaller robot than the maximum allowed size. The previous robo-navigation competitors and our competition all used the 7.5-inch rubber wheels and maximum robot dimensions in their designs. We figured that a smaller, and subsequently lighter, robot could give us some advantages in obstacle avoidance and speed. However, after much consideration, the disadvantages of attempting a smaller design were too great for us to overcome. With a smaller design, we would have been required to purchase smaller motors and wheels from our restricted budget, rather than have them provided for us, and we figured the larger wheels could give us some much-needed benefits over rough terrain.

When completing the actual design of the chassis, we decided to keep our dimensions relatively square (18"x16.5"), with the two larger wheels near the front and a caster wheel centered in the back. Going with a longer design would have put more weight on the caster wheel and more stress on the motors when turning. We decided to have the spaces for the front wheels cut 2" from the front of the body piece. This allowed us to have a front bumper that would protect our wheels in case we got off course and ran into something. Instead of going with a pure square, however, we decided to employ angles on all of the corners, including our front bumper, so the robot would be more likely to glide off something in an impact scenario rather than get hung up. The top and bottom of our chassis body were made from 1/10" aluminum sheets. Although the aluminum did provide some rigidity, we had to make use of a horizontal aluminum square tube to add strength against flexing, which became a major issue when we were testing without the top panel attached. To mount the gears and wheels, we went with aluminum rails with slits already cut into them. The slits allowed for very easy mounting and also helped when we had to adjust the gear position while tensioning the belts. When mounting the wheels and axles to the aluminum rails, we used very basic aluminum brackets and positioned the wheels below the bottom chassis panel. This positioning increased our ground clearance, so we would not have to worry about being snagged on the ground, and also ensured our ultrasonic sensors would be positioned correctly when mounted. When mounting the gears to the aluminum rails, we also used basic aluminum brackets and mounted the gears above the bottom aluminum chassis panel. This positioning made mounting the motors and motor controllers much easier, and also kept our motors away from the ground, where they could be easily damaged. To attach the top aluminum chassis panel to the bottom, we employed four steel bolts as standoffs and self-locking nuts to securely fasten the body together. This provided an inexpensive and secure solution to the body assembly.

Locomotion Software:


In order for the Locomotion team to accomplish its task, we decided to use the ATmega328 microcontroller. We decided to use two of them because noise from the encoders was an issue. Also, the code for one motor is the same as the code for the other motor, just reversed. The ATmega328 has six PWM channels and two external interrupts. We used those interrupts to read the pulses from the encoder. With the encoder pulses we were able to determine our current speed.
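As a sketch of the encoder capture (register names follow the ATmega328 datasheet; the pin assignment to INT0 is an assumption), the external interrupt simply counts ticks, and speed falls out as the change in count per control period:

    #include <avr/io.h>
    #include <avr/interrupt.h>

    /* Encoder tick count, incremented in the external-interrupt ISR. */
    static volatile unsigned long count = 0;

    /* INT0 fires on each rising edge from the encoder channel. */
    ISR(INT0_vect)
    {
        count++;
    }

    void encoder_init(void)
    {
        EICRA |= (1 << ISC01) | (1 << ISC00);  /* INT0 on rising edge */
        EIMSK |= (1 << INT0);                  /* enable INT0 */
        sei();                                 /* global interrupts on */
    }

    /* Called once per fixed control period: speed in ticks/period is the
       difference between the current and previous counts. */
    unsigned long read_speed(void)
    {
        static unsigned long prev_count = 0;
        unsigned long now;

        cli();                 /* atomic read of the 32-bit counter */
        now = count;
        sei();

        unsigned long speed = now - prev_count;
        prev_count = now;
        return speed;
    }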

Speed Control:


In order to make the motor go at our desired speed, we decided to implement a PID control system. The PID works as follows: we read the encoder with an interrupt and save the value in an unsigned long called count. Our actual speed is the value of count minus the previous count. This actual speed and the desired speed are sent into the main PID function. The PID function then sends back the correct PWM value so that our actual speed will equal our desired speed. Inside the PID function we take the difference between our desired speed and our current speed, and based on this error we update the PWM until the motor reaches zero error. With zero error, our desired speed is equal to our actual speed.
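A minimal sketch of such a PID update follows; the gains and output clamp are placeholders, not the tuned values used on the robot.

    #include <stdint.h>

    /* Placeholder gains; the real values come from tuning on the robot. */
    #define KP 0.8f
    #define KI 0.2f
    #define KD 0.1f

    /* One PID step: given desired and measured speed (encoder ticks per
       control period), return a PWM duty in 0..255.  The integral and
       previous error persist across calls. */
    uint8_t pid_update(long desired, long actual)
    {
        static float integral = 0.0f;
        static long  prev_error = 0;

        long error = desired - actual;
        integral += (float)error;

        float output = KP * (float)error
                     + KI * integral
                     + KD * (float)(error - prev_error);
        prev_error = error;

        /* Clamp to the 8-bit PWM range of the ATmega328 timers. */
        if (output < 0.0f)   output = 0.0f;
        if (output > 255.0f) output = 255.0f;
        return (uint8_t)output;
    }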






Our function also allowed the Navigation team to send a stop signal at any time, upon which we stop the motor immediately.

Every revolution of our motor counts as 464 encoder ticks. Each revolution is two feet, giving us a speed of 0.46 miles per hour on this course. Handling the turns was an easy task for the Locomotion team: we received a different command from the Navigation team, which told us to turn at a very slow speed until the Navigation team sent a stop signal. Our team used two AVR microcontrollers, but only one of them talked to the Navigation team, via serial communication. This way the Navigation team only had to worry about sending one signal to the Locomotion team. The Locomotion team was in charge of receiving that signal and transmitting it to the other AVR microcontroller. The two AVR microcontrollers talked to each other over I2C: the Navigation team sent data to the I2C master, and the master sent the data on to the I2C slave. Noise was a big issue for the Locomotion team, especially when it came to reading values from the encoder. We corrected this problem by putting a pull-up resistor and a capacitor on the encoder.
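For scale, working from the figures above: 0.46 miles per hour is about 0.67 feet per second, which at two feet per revolution is roughly one third of a revolution per second, so the 464-tick encoder generates only about 156 interrupts per second, a light load for the ATmega328.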

Locomotion Printed Circuit Board

The board is powered from the 26V battery, with the voltage stepped down to 5V by a Pololu D24V6ALV step-down voltage regulator. The H-bridge's 8 pins are connected to 8 headers on the PCB; only 3 of those pins are used, namely PWM, DIR, and ground. The microcontroller chip, an ATmega328, is powered at 5V from the voltage regulator; it receives commands from the Navigation side and drives the wheels.

An LED on each PCB is connected between the regulator's 5V output and ground; this tells us that the PCBs are being powered. Another LED is connected to pin 19 (digital pin 13) of the ATmega328; this LED blinks if the microcontroller is working. We tied the RESET pin "up" (to power) through a 10k ohm resistor in order to prevent the chip from resetting itself during normal operation; the RESET pin reboots the chip when pulled down to ground. A 16MHz external crystal is connected between pins 9 and 10, with two 22pF capacitors running from those pins to ground to eliminate noise. ATmega328 pin 4 is connected to the gear encoder, along with a 33.3uF capacitor and a 1k ohm resistor to reduce noise from the encoder. ATmega328 pins 2 and 3 (RX and TX) are used to communicate with the Navigation side via serial communication. ATmega328 pins 27 and 28 (analog pins 4 and 5) are used for I2C communication between the two locomotion PCBs.

Conclusion and Recommendations

The SDSU Robo-Navigation challenge showcased the importance of team unity and time management. TrakTor did not reach all five beacons on competition day, but we deemed the project a success based on the knowledge we gained and the importance of our interpretation of an autonomous robot, and we feel our goal is still attainable. Our effort never dwindled during the process, but our time sadly ran out as the end of the project neared. We are proud to demonstrate the results of our project because it was the first image-processing robot to attempt the course, but many outside variables prevented a complete success.


The first thing we would have done differently to complete the project successfully is to increase our course testing and minimize our theoretical testing. TrakTor performed very well in a group of smaller tests, but when it came time to run the real course, there were still too many little things that needed to be adjusted. Several of these problems arose from inconsistencies between our theoretical testing and the actual runs on the day of the competition. Most notably, vehicles and civilians that were not an issue during our initial testing, because they were not there, became large issues on the course. Lighting also became an important issue, as different times of the day introduce new shapes and edges that TrakTor uses to map, and at night our robot was effectively blind. We would recommend that students in successive semesters minimize theoretical testing and begin actual testing as soon as possible.

Another major issue, and something we would have changed, was the time it took to integrate the pieces of TrakTor together. We made too many assumptions that if components worked well alone, then they would also work in conjunction with each other. Each sensor on our robot performed perfectly in isolation, but when everything was integrated, many problems arose, including serial timing problems, corrupted information, and environmental variables. Had we anticipated that integration would be such an integral part of the project and consume so much time, we could have better planned how we distributed our time. For future students, we recommend assuming the worst when it comes to integration and giving yourselves a large enough window to make sure everything works together.

This leads to our final issue and the most important thing we would have changed, which is time management. Our plan did not fail us; the clock did. We believe our design goals could have been met if we had organized our time better. We underestimated the amount of work involved in going from a breadboard-based design to a final design using printed circuit boards. We should have completed the final design and testing much earlier in the project, which would have given us a few extra weeks to make the final changes. Part of the issue was treating deadlines softly and believing we still had enough time to complete the project. We would recommend reserving at least two full weeks for integration and debugging alone, and meeting set deadlines, leaving plenty of time to test the robot in real-world scenarios.






Appendix

Schematics


Printed Circuit Boards