Extra on Parallel Computing (SSD)
In attending John Oleszkiewicz's presentation on Data Parallel Programming Using the Beowulf Supercomputing Clusters, I learned that Beowulf uses a multiple instruction, multiple data (MIMD) pipeline. I learned that MPI stands for Message Passing Interface and that it is similar to a library in a run-time environment (it uses send and receive). Data can be split into sections and run on the quad processor via a network (not necessarily a bus, because Ethernet is used as the bus).
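The matched send/receive pattern can be sketched without MPI itself. The following is my own Python illustration, not code from the presentation: multiprocessing pipes have send() and recv() calls that play roughly the same role as MPI's send and receive, and the data is split into sections just as described above.

```python
from multiprocessing import Process, Pipe

def worker(conn):
    chunk = conn.recv()      # matched receive: wait for a section of the data
    conn.send(sum(chunk))    # matched send: return the partial result
    conn.close()

def scatter_sum(data, pieces=2):
    # Split the data into equal sections (assumes len(data) divides evenly),
    # hand each section to a worker over a pipe, then combine the partial
    # sums -- the basic data-parallel send/receive pattern.
    size = len(data) // pieces
    chunks = [data[i * size:(i + 1) * size] for i in range(pieces)]
    conns, procs = [], []
    for chunk in chunks:
        parent, child = Pipe()
        p = Process(target=worker, args=(child,))
        p.start()
        parent.send(chunk)
        conns.append(parent)
        procs.append(p)
    total = sum(conn.recv() for conn in conns)
    for p in procs:
        p.join()
    return total

if __name__ == "__main__":
    print(scatter_sum(list(range(8))))  # 0+1+...+7 = 28
```

In real MPI the pipes would be replaced by MPI_Send and MPI_Recv calls carried over the Ethernet "bus" between nodes, but the shape of the program is the same.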
As for what parallel computing is, it is defined as multiple simultaneous applications, or the execution of a codified sequence of operations on related data. Parallel programming is writing a program in such a way that it allows users to execute a sequential program as a parallel one.
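One way to picture "executing a sequential program as a parallel one" is to take the same loop body and farm it out to worker processes. This is my own Python sketch of that idea, not code shown in the presentation:

```python
from multiprocessing import Pool

def square(x):
    return x * x

def sequential(data):
    # The ordinary sequential version: one loop, one process.
    return [square(x) for x in data]

def parallel(data, workers=2):
    # The same computation written as a parallel one: the loop body is
    # distributed across worker processes, and the results come back
    # in the original order.
    with Pool(workers) as pool:
        return pool.map(square, data)

if __name__ == "__main__":
    data = list(range(6))
    assert sequential(data) == parallel(data)  # same answer either way
```

The point is that the program's meaning does not change; only the schedule of the work does.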
The system the code was compiled with handles the low-level bookkeeping and provides for all of the communication needed between the nodes. The technique seems to work best with a networked shared-memory workstation running Linux, but it could also work with Mac OS X and Windows NT.
It has theoretical mathematical applications, such as with recursion, summation, and higher-order functions, or anywhere it doesn't make economic sense to use a supercomputer but the performance of a supercomputer is expected.
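Summation is a good example of how recursion and parallelism fit together: each worker can sum its own section recursively, and the partial sums are then combined. The sketch below is my own illustration of that idea (the function names are mine, not from the talk):

```python
from concurrent.futures import ProcessPoolExecutor

def rsum(data):
    # Ordinary recursive summation: split the list in half and recurse.
    if not data:
        return 0
    if len(data) == 1:
        return data[0]
    mid = len(data) // 2
    return rsum(data[:mid]) + rsum(data[mid:])

def parallel_rsum(data, workers=2):
    # Split the list into one section per worker, sum each section
    # recursively in its own process, then combine the partial sums.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=workers) as ex:
        return sum(ex.map(rsum, chunks))

if __name__ == "__main__":
    print(parallel_rsum(list(range(10))))  # 0+1+...+9 = 45
```

On a cluster of ordinary networked workstations this same divide-and-combine shape is what lets a program approach supercomputer-style throughput without supercomputer hardware.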