The Future of Computer Technology


Nov 2, 2013


The Future of Computer Technology and its implications for the computer industry

Over the past four decades, progress in computer technology has been driven by Moore's Law: what started life as an observation has turned into a self-fulfilling prophecy which businesses rely on for planning. Moore's Law, however, depends on the ability to shrink transistor dimensions and to increase the number of transistors that can be manufactured economically on an integrated circuit; with that come collateral gains in performance, power efficiency and, last but not least, cost. The semiconductor industry is confident in its ability to continue to shrink transistors for another decade or so, but it is unknown what will happen as we get closer to the atomic limits. We cannot assume that smaller circuits will be faster or more power efficient. This is already being noticed as design costs increase rapidly, which in turn is impacting the economics of design in ways that will affect the entire computing industry.
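The doubling behind Moore's Law compounds quickly. A minimal sketch of that compounding (the 1971 Intel 4004 baseline of roughly 2,300 transistors and the two-year doubling period are illustrative assumptions, not figures from this article):

```python
# Illustrative only: Moore's "law" modelled as a doubling of transistor
# count roughly every two years, from an assumed 1971 baseline.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project transistor count for a given year under ideal doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

print(f"{transistors(2013):,.0f}")  # ~4.8 billion at this doubling rate
```

Twenty-one doublings over 42 years turn a few thousand transistors into billions, which is why even a modest slowdown in shrinking has large economic consequences.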

The modern era of computing started on June 21, 1948, when the Manchester Baby first executed a program stored in its cathode ray tube memory. Over the last 62 years we have seen huge developments in computers and in the technology used to build them. A major development has been in the energy efficiency of the machines: the Manchester Baby used 5 joules per instruction; in 1985, the first ARM processor executed 6 million instructions per second while using 0.1 W, about 15 nanojoules per instruction; and the ARM968 now uses 100 picojoules per instruction. Comparing the Manchester Baby with the ARM968, energy efficiency has improved by a factor of roughly 5 × 10^10. It is this massive improvement which has made the consumer electronics market what it is today.
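That improvement factor can be checked with a quick back-of-the-envelope calculation using only the figures quoted above:

```python
# Back-of-the-envelope check of the energy-per-instruction figures.
baby_energy = 5.0                    # J per instruction, Manchester Baby (1948)
arm1_power = 0.1                     # W, first ARM processor (1985)
arm1_mips = 6e6                      # instructions per second
arm1_energy = arm1_power / arm1_mips # ~1.7e-8 J, i.e. ~17 nJ per instruction
arm968_energy = 100e-12              # J per instruction (100 pJ), ARM968

improvement = baby_energy / arm968_energy
print(f"ARM1: {arm1_energy * 1e9:.1f} nJ/instruction")
print(f"Baby -> ARM968 improvement: {improvement:.1e}x")  # 5.0e+10x
```

Dividing power by instruction rate gives energy per instruction, and 5 J against 100 pJ yields the factor of 5 × 10^10 quoted above.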

After the invention of the first recognisably modern, electrically powered computers, limited speed and memory capacity forced programmers to write hand-tuned assembly language programs. It was soon discovered that programming in assembly language required a great deal of intellectual effort and was error prone.
In the 1950s the first three modern programming languages were designed: FORTRAN, invented by John Backus; LISP, invented by John McCarthy; and COBOL, created by the Short Range Committee. ALGOL 60 was also created in the late 50s, and it featured two key language innovations: nested block structure and lexical scoping.
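Those two ALGOL 60 innovations survive in almost every modern language. A minimal sketch in Python (chosen for brevity, not ALGOL itself): an inner function nests inside an enclosing block, and names are resolved through the scope where the code was written, not where it is called.

```python
# Nested block structure and lexical scoping, ALGOL 60's two key ideas,
# illustrated with a Python closure.
def make_counter(start):
    count = start          # lives in the enclosing (lexical) scope

    def increment():       # nested block, defined inside make_counter
        nonlocal count     # binds to the enclosing scope's 'count'
        count += 1
        return count

    return increment

counter = make_counter(10)
print(counter())  # 11
print(counter())  # 12
```

Each call to `make_counter` creates an independent scope, so two counters never interfere, which is exactly the modularity that block structure and lexical scoping were invented to provide.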

The main period of development in programming languages, however, came during the late 1960s and 1970s, as most of the major language paradigms now in use were invented in this period, in languages such as Simula, C, Smalltalk, Prolog and ML. Each of these languages spawned an entire family of descendants, and most modern languages count at least one of them in their ancestry.

The 1980s and 1990s were mostly spent elaborating upon the ideas invented in the previous decade. C++ combined object-oriented and systems programming in 1980. The 90s saw the recombination and maturation of older ideas, with productivity as the main goal. As the internet took off, many new scripting languages appeared; many consider these more productive, but often because of design choices that make small programs simpler while making large programs more difficult to write and maintain.