

Dec 1, 2013

Doug Carlin

CS 451

Extra on Parallel Computing (SSD)



In attending John Oleszkiewicz's presentation on Data Parallel Programming Using the
Beowulf Supercomputing Clusters, I learned that Beowulf uses a multiple-instruction,
multiple-data (MIMD) pipeline. I learned that MPI stands for Message Passing Interface,
and that it is similar to a library in a run-time environment (it uses send and receive
operations). Data can be split into sections and run on the quad processor via a network
(not necessarily a bus, since Ethernet serves as the bus).
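The scatter/send/receive pattern described above can be sketched in plain Python. This is not real MPI code: standard-library threads and queues stand in for the cluster nodes and the Ethernet channels, and all of the names (`worker`, `scatter_compute_gather`, the sum-of-squares task) are illustrative choices, not anything from the presentation.

```python
import threading
import queue

def worker(rank, inbox, outbox):
    """Each 'node' receives its section of the data, works on it,
    and sends a partial result back (mimicking MPI receive/send)."""
    section = inbox.get()           # blocking receive
    partial = sum(x * x for x in section)
    outbox.put((rank, partial))     # send result to the coordinator

def scatter_compute_gather(data, num_workers=4):
    """Split the data into sections and hand one to each worker,
    as described for the quad processor."""
    outbox = queue.Queue()
    inboxes = [queue.Queue() for _ in range(num_workers)]
    threads = [threading.Thread(target=worker, args=(r, inboxes[r], outbox))
               for r in range(num_workers)]
    for t in threads:
        t.start()
    # Scatter: section r goes to worker r.
    chunk = (len(data) + num_workers - 1) // num_workers
    for r in range(num_workers):
        inboxes[r].put(data[r * chunk:(r + 1) * chunk])
    for t in threads:
        t.join()
    # Gather: collect the partial results in rank order and combine them.
    parts = dict(outbox.get() for _ in range(num_workers))
    return sum(parts[r] for r in range(num_workers))
```

In real MPI the queues would be replaced by the library's send and receive calls, and the workers would be separate processes on separate machines, but the structure of the program is the same.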


As far as what parallel computing is, it is defined as multiple simultaneous applications,
or the execution of a codified sequence of operations on related data.
(http://newton.gsfc.nasa.gov/aCe/aCe_dir/dictionary/ParaComp.html)


Parallel programming is writing a program such that it allows users to execute a
sequential program as a parallel one. The system the code was compiled with performs
all the low-level bookkeeping and provides for all of the communication needed between
processors.
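The point that the system handles the bookkeeping can be illustrated with a standard-library sketch, where a thread pool plays the role the runtime plays on the cluster (a loose analogy, not the actual Beowulf toolchain; `expensive` is a made-up stand-in for one step of the sequential program):

```python
from concurrent.futures import ThreadPoolExecutor

def expensive(x):
    # Stand-in for one iteration of the sequential program.
    return x ** 2 + 1

def run_sequential(values):
    """The original sequential loop."""
    return [expensive(v) for v in values]

def run_parallel(values, workers=4):
    """The same loop run in parallel: the executor schedules the calls,
    moves arguments and results between workers, and keeps the results
    in order -- the low-level bookkeeping the programmer never sees."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(expensive, values))
```

The programmer writes essentially the same loop either way; only the machinery underneath changes.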


The technique seems to work best with workstations (networked shared-memory
computers running Linux, but it could also work with Mac OS X and Windows NT).


It has theoretical mathematical applications such as recursion, summation, and
higher-order functions, or anywhere it doesn't make economic sense to use a
supercomputer but the performance of a supercomputer is expected.
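For the summation case mentioned, the divide-and-split idea can be sketched directly: split the list into sections, sum each section in parallel, then reduce the partial sums. Again this is a plain-Python illustration under stated assumptions (threads, not cluster nodes; the function name `parallel_sum` is my own):

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(values, workers=4):
    """Sum a list by splitting it into sections, summing each section
    in parallel, then reducing the partial sums."""
    if not values:
        return 0
    chunk = (len(values) + workers - 1) // workers
    sections = [values[i:i + chunk] for i in range(0, len(values), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Higher-order style: map the sum function over the sections.
        partials = pool.map(sum, sections)
    return sum(partials)
```

The same split/compute/reduce shape applies to the recursive and higher-order cases: only the per-section function changes.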