


Erik Sveiby Oct 1994, updated 31 Dec 1998. All rights reserved.


Information according to Cybernetics

Information according to Shannon

The Contradiction between Shannon and Wiener

Meaning in a Cybernetic Sense

Meaning in Shannon's Sense

Information or Knowledge or Life?

Information via Massmedia

Information Complexity and Overload

Information has No Value and No Meaning

Information in Etymology

The word information is derived from the Latin informare, which means "give form
to". The etymology thus connotes an imposition of structure upon some
indeterminate mass. Allén & Selander (1985) have analysed how the word is used in
the Swedish language and find that this is probably the most widely used meaning
of the word. Most people tend to think of information as disjointed little
bundles of "facts". In the Oxford definition of the word it is connected both to
knowledge and communication:

Knowledge communicated concerning some particular fact, subject or event; that of
which one is apprised or told; intelligence, news.

The way the word information is used can refer both to "facts" in themselves and
to the transmission of the facts.

Information according to Cybernetics

The double notions of information as both facts and communication are also
inherent in one of the foundations of information theory: cybernetics, introduced
by Norbert Wiener (1948). The cybernetic theory was derived from the new findings
in the 1930s and 1940s regarding the role of bioelectric signals in biological
systems, including the human being. The full title was Cybernetics or Control and
Communication in the Animal and the Machine. Cybernetics was thus attached to
biology from the beginning.

Wiener introduces the concepts of amount of information, feedback and background
noise as essential characteristics of how the human brain functions.

From Wiener (1948) p. 18:

The notion of the amount of information attaches itself very naturally to a
classical notion in statistical mechanics: that of entropy. Just as the amount of
information in a system is a measure of its degree of organisation, so the
entropy of a system is a measure of its degree of disorganisation.

Wiener coins the label of a whole new science:

We have decided to call the entire field of control and communication theory,
whether in the machine or in the animal, by the name Cybernetics, which we form
from the Greek word for steersman.

And declares his philosophical heritage:

If I were to choose a patron for cybernetics... I should have to choose Leibnitz.

What is information and how is it measured? Wiener defines it as follows:

One of the simplest, most unitary forms of information is the recording of choice
between two equally probable simple alternatives, one or the other of which is
bound to happen, a choice, for example, between heads and tails in the tossing of
a coin. We shall call a single choice of this sort a decision. If we then ask for
the amount of information in the perfectly precise measurement of a quantity
known to lie between A and B, which may with uniform a priori probability lie
anywhere in this range, we shall see that if we put A = 0 and B = 1, and
represent the quantity in the binary scale (0 or 1), then the number of choices
made and the consequent amount of information is infinite.
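Wiener's coin-toss arithmetic can be sketched numerically. The fragment below is
an illustration only (the function name and the discrete log2 convention are my
assumptions; Wiener states the continuous case as an integral): one choice among
n equally probable alternatives carries log2(n) bits, so ever finer binary
measurement piles up decision after decision without bound.

```python
import math

def bits(n_alternatives: int) -> float:
    """Amount of information, in bits, of one choice among
    equally probable alternatives: log2(n)."""
    return math.log2(n_alternatives)

# A coin toss (Wiener's "decision") carries exactly one bit.
print(bits(2))  # 1.0

# Specifying a number in [0, 1] to k binary digits means making
# k successive decisions; the information grows without bound
# as the precision increases.
for k in (1, 8, 32):
    print(k, bits(2 ** k))  # k binary choices -> k bits
```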

Wiener describes the amount of information mathematically as an integral, i.e. an
area of probability measurements (p.76). Wiener says the formula means:

The quantity that we here define as amount of information is the negative of the
quantity usually defined as entropy in similar situations. (My bold)

Wiener's view of information is thus that it contains a structure that has a meaning.

It will be seen that the processes which lose information are, as we should
expect, closely analogous to the processes which gain entropy.

Information is from its conception attached to issues of decisions, communication
and control, by Wiener. System theorists build further on this concept and see
information as something that is used by a mechanism or organism, a system which
is seen as a "black box", for steering the system towards a predefined goal. The
goal is compared with the actual performance and signals are sent back to the
sender if the performance deviates from the norm. This concept of negative
feedback has proven to be a powerful tool in most control mechanisms, relays etc.
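The negative feedback loop described here can be sketched in a few lines of
Python. This is a toy illustration only (the names and the simple proportional
correction rule are my assumptions, not a mechanism from the text): the system
compares its goal with actual performance and feeds the deviation back as a
corrective signal.

```python
def regulate(goal: float, actual: float, gain: float = 0.5, steps: int = 20) -> float:
    """Drive `actual` toward `goal` by repeatedly feeding back
    a correction proportional to the deviation (negative feedback)."""
    for _ in range(steps):
        error = goal - actual      # deviation from the norm
        actual += gain * error     # corrective signal fed back
    return actual

# A thermostat-like loop: the state converges toward the goal.
print(regulate(goal=20.0, actual=5.0))
```

Each pass shrinks the error by the gain factor, which is why such loops are
robust control mechanisms.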

Information according to Shannon

The other scientist connected with information theory is Claude Shannon. He was a
contemporary of Wiener and as an AT&T mathematician he was primarily interested
in the limitations of a channel in transferring signals and the cost of
information transfer via a telephone line. He developed a mathematical theory for
such communication in The Mathematical Theory of Communication (Shannon & Weaver
1959). Shannon defines information as a purely quantitative measure of
communicative exchanges.

Weaver (in Shannon & Weaver 1959) links Shannon's mathematical theory to the
second law of thermodynamics and states that it is the entropy of the underlying
stochastic process in the information source that determines the rate of
information generation (p.103):

The quantity which uniquely meets the natural requirements that one sets up for
"information" turns out to be exactly that which is known in thermodynamics as
entropy.

Shannon defines the amount of information as the negative of a probability-weighted
sum of the logarithms of the probabilities. The minus sign in this formula means
the opposite of Wiener's minus sign. It is there because the amount of
information according to Shannon is equal to entropy.

For an information theorist following Shannon it does not matter whether we are
communicating a fact, a judgement or just nonsense. Everything we transmit over a
telephone line is "information". The message "I feel fine" is information, but
"ff eeI efni" is an equal amount of information.
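A minimal sketch of this point: measured on symbol frequencies (a simplifying
assumption here, since Shannon's measure strictly applies to the source, not to a
single message; the function name is mine), any rearrangement of the same symbols
carries exactly the same amount of information, however meaningless it reads.

```python
import math
from collections import Counter

def entropy_bits(message: str) -> float:
    """Shannon entropy of a message's symbol frequencies:
    H = -sum(p * log2(p)) over the symbols that occur."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A scramble of the same symbols has the same frequencies, hence the
# same entropy (up to floating-point rounding), meaning or no meaning.
original = "I feel fine"
scrambled = "".join(sorted(original))
print(entropy_bits(original))
print(entropy_bits(scrambled))
```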

Shannon is said to have been unhappy with the word "information" in his theory.
He was advised to use the word "entropy" instead, but entropy was a concept too
difficult to communicate so he kept the word. Since his theory concerns only the
transmission of signals, Langefors (1968) suggested that a better term for
Shannon's information theory would therefore perhaps be "signal transmission
theory".

But Shannon's "information" is not even a signal:

If one is confronted with a very elementary situation where he has to choose one
of two alternative messages, then it is arbitrarily said that the information,
associated with this situation, is unity. Note that it is misleading (although
often convenient) to say that one or the other message conveys unit information.
The concept of information applies not to the individual messages (as the concept
of meaning would), but rather to the situation as a whole, the unit information
indicating that in this situation one has a freedom of choice, in selecting a
message, which it is convenient to regard as a standard or unit amount.

The Contradiction between Shannon and Wiener

Weaver, explaining Shannon's theory in the same book:

Information is a measure of one's freedom of choice in selecting a message. The
greater this freedom of choice, the greater the information, the greater is the
uncertainty that the message actually selected is some particular one. Greater
freedom of choice, greater uncertainty, greater information go hand in hand.

There is thus one large, and confusing, difference between Shannon and Wiener.
Whereas Wiener sees information as negative entropy, i.e. a "structured piece of
the world", Shannon's information is the same as (positive) entropy. This makes
Shannon's "information" the opposite of Wiener's "information".

How can something be interpreted as both positive entropy and negative entropy at
the same time? The confusion is unfortunately fuelled by other authors. The
systems theorist James G. Miller writes in Living Systems (p.13): It was noted by
Wiener and by Shannon that the statistical measure for the negative of entropy is
the same as that for information.

Miller also quotes (p.43) Shannon's formula but omits Shannon's minus sign. Since
entropy is defined by Shannon as a negative amount, a positive amount should be
the same as negative entropy, i.e. structure. It seems that Miller misinterprets
Shannon.
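The sign conventions at issue can be made concrete in a short sketch (the
function name is my assumption; the formula is the standard H = -sum(p log2 p)):
for the same distribution, Shannon's "information" is the positive entropy, while
Wiener's "amount of information" is its negative.

```python
import math

def shannon_entropy(probs) -> float:
    """Shannon's amount of information: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: Shannon's information (= entropy) is +1 bit,
# while Wiener's "amount of information" is its negative.
H = shannon_entropy([0.5, 0.5])
print("Shannon:", H)   # 1.0
print("Wiener:", -H)   # -1.0
```

Dropping the minus sign, as Miller does, silently flips one convention into the
other.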

Meaning and the Observer

Meaning in a Cybernetic Sense

There are many opinions about what meaning is. I try to approach an understanding
by distinguishing between living systems or natural objects and manmade systems.

James G. Miller defines in his work Living Systems (p.39) goals for living
systems like this:

By the information input of its charter or genetic input, or by changes in
behaviour brought about by rewards and punishments from its suprasystem, a system
develops a preferential hierarchy of values that gives rise to decision rules
which determine its preference for one internal steady-state value rather than
another. This is its purpose. A system may also have an external goal. It is not
difficult to distinguish purposes from goals. I use the terms: an amoeba has the
purpose of maintaining adequate energy levels, and therefore it has the goal of
ingesting (= swallowing) a bacterium.

As I interpret the cybernetic view, the signals in a system thus contain
"information" which has some meaning for the purpose of the particular system.
Someone or a system outside the system may define a goal, but the meaning of the
information that is sent/received within the system does not necessarily have a
meaning outside the system. The information in the feedback loop has a meaning
only in relation to the purpose. The goal of the subsystem is determined by a
system on a higher level. The brain itself can be seen as such a suprasystem as
long as it performs functions like temperature regulation.

The signals controlling the muscle thus have no meaning outside the system of the
muscle, although the goal of the muscle is determined by a suprasystem like the
brain. The only thing that the suprasystem "cares about" is whether the muscle
fulfils its purpose or not.

Suppose I interfere with some apparatus with the intention to interpret the
signals from one of my own muscles while it is fulfilling its purpose according
to the goal of my suprasystem, the brain. I am able to impose several meanings on
the signals caught by the apparatus, but those meanings are outside both the
system and the suprasystem. I would assume that it is the same with a nonliving,
manmade system like the computer. The signals controlling the computer programs
have no meaning outside the computer, even if they originally are programmed by a
human being.

"Meaning" in the cybernetic concept relates to a system of systems only. If the
human being interferes with a purpose outside the system, he or she imposes an
interpreted level of meaning outside the system.

Wiener's "information" presumes an observer with a meaning of his/her own outside
the system who determines the goal of the system. The observer may be another
machine, but in the end (or perhaps beginning) there must be a human being
somewhere with an intention or purpose. The observer's meaning is thus
interrelated with the system's meaning. The signals of the system therefore have
a relation to a human meaning, even if it can be very distant.

Miller argues that a living system should in principle be the same. The
difference is the observer, however. Who is the final observer with a purpose or
a goal? What is the goal of the amoeba in the world or the muscle in the human
body, or what is the purpose of man? One might, as systems theory does, see no
purpose other than the living system maintaining itself (Miller p.39). It seems a
very meaningless world to me, but I can't answer that question! It seems to me
that we then enter the realms of philosophy or theology.

Wiener's concept of information relates both to manmade systems and to living
subsystems like the liver or even the brain as a tissue of neurons. These systems
use signals in a way that cybernetic theory seems to explain.

But there is a difference between the brain tissue itself and how this tissue is
used in reflection and interpretation. They reflect two different, unrelated
levels of meaning. Even if one assumes that the only purpose of mankind is to
maintain life, it seems that a human being may from time to time interfere in a
way that rocks the foundation of any system. Vamos (1990) argues in a convincing
way that closed manmade systems are


Meaning in Shannon's Sense

One of the conclusions from Shannon's theory is that entropy contains more
information than structure. It is a strange idea that goes against common sense.
Let us first try to understand Shannon's concept by seeing it in connection with
human/human communication.

Shannon presumes something/someone outside the transmission chain with a message
which corresponds to "information". However, it is not information that is
transmitted, but signals. There is a sender and a receiver of these signals. The
sender's meaning must be interpreted by the receiver outside the transmission
itself. To do this, sender and receiver must have something in common, at least a
language, otherwise they will not understand each other.

If someone receives a meaningless expression via the telephone line, there thus
exists a very large number of possible interpretations. The information exists as
a potential, which of course is very large in a meaningless expression. If the
expression on the other hand is crystal clear and the sender and the receiver
share exactly the same understanding, then there is very little information
transmitted, only the signals themselves.

The signals thus exist on another level than the information, and the two have
nothing to do with each other unless the code of meaning is shared from the
beginning.

Let us assume a natural system or object like a stone. The stone is meaningless
in itself. Even a stone can thus be seen as "containing" an infinite number of
potential meanings. It is a very large amount of "information" in Shannon's
sense. The stone may be measured, weighed, observed etc. by humans down to the
atomic level. The number of interpretations from such observations is equally
infinite.

We will also see that the "sender" of the signals is not the stone but the human
being's apparatus. It would be an inconceivable task to transmit over a telephone
line the entire possible amounts of data which make up the movements of the atoms
which in their turn make up the object we call "stone". The object stone exists
as a source of potential information, which is not there until some human being
interprets it, i.e. gives it meaning.

The word "stone" on the other hand consists of only five symbols that take up
very little channel capacity. It is a very reduced human interpretation of the
object stone as compared to the richness of the atomic movements. The notion
"stone" is therefore on another level than the atomic movements that make up the
object stone. The meaning of the word "stone" (as well as the meaning of any
signals or data from any measurements of the object stone) are both human
constructions, which have nothing to do with the object stone itself.

All meaning is interpreted outside the transmission of signals. "Information"
according to Shannon must therefore not be confused with meaning. Shannon's
information relates not so much to what you do say as to what you could say (or
do not say). The problems of interpreting signals into a "message" are left
outside Shannon's definition. Not so with Wiener. He assumes some meaning at
least at the system level.

In order to get around the problem of meaning, a common method is to contrast the
word information with the word data. See for instance Schoderbek & al. (1975/85,
p.152). Data are according to them seen as:

unstructured, uninformed facts so copiously given out by the computer. Data can
be generated indefinitely; they can be stored, retrieved, updated and again
filed. They are a marketable commodity... each year the cost for data acquisition
grows on the erroneous assumption that data are information.

The usage of the word information is by these authors restricted to facts with
meaning or evaluated data. Information is connected to the circumstances of the
receiver or user, whereas data exist independently of the user. Data are seen as
unevaluated pieces or materials, whereas information refers to data evaluated for
a particular problem.

It is tempting to see Shannon's signals as "data" and the meaning of the signals
as "information", but this would be wrong. Shannon's information can not be
transmitted, as the system theorists assume.

This distinction is problematic also because meaningful information for one user
in a specific situation might be devoid of meaning for another user in another
situation. What can be defined as information in one context becomes data in
another. The same set of symbols might therefore be toggling between "data" and
"information" depending on the circumstances.

A definition of this kind does not bring any further understanding. Other
successors of Shannon have suggested mathematical theories which add "meaning" to
his theory. One idea, suggested by Brillouin 1956 (referred to in Jumarie 1990),
is to regard the amount of information as a function of the ratio of the number
of possible answers before and after a communication has taken place. Information
would then be the difference between the two.
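Brillouin's idea, as referred to here, can be sketched as a ratio of answer
counts. This is a toy reading of the idea (the function name and the choice of
log base 2 are my assumptions): the information gained is the logarithm of how
much the set of possible answers shrinks.

```python
import math

def brillouin_information(answers_before: int, answers_after: int) -> float:
    """Information gained by a communication, measured from the
    ratio of possible answers before and after it (in bits)."""
    return math.log2(answers_before / answers_after)

# A message that narrows 16 possible answers down to 2
# conveys log2(16/2) = 3 bits.
print(brillouin_information(16, 2))  # 3.0

# A message that rules nothing out conveys nothing.
print(brillouin_information(8, 8))  # 0.0
```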

It might be this that Gregory Bateson referred to when he made his famous
statement:

Information is a difference that makes a difference.

Brillouin also came up with a paradox: Suppose that a lengthy piece of
information is sent as a text. The last item of the text is a bit which tells the
receiver that all the text before the bit is untrue. Has any information been
transferred then? Brillouin suggests an addition to Shannon's theory which would
take care of this: "negative" information.

Another concept along the same lines is "relative" information, introduced by
Jumarie (1990). He tries to define a mathematical theory which incorporates
"subjective transinformation", or a meaning that is relative to the receiver.

The suggested mathematical additions to Shannon's theory have found little
practical usage, however. Is it perhaps because they try to relate two categories
which are not possible to combine?

Information or Knowledge or Life?

If information is seen as having "meaning", is it not the same as knowledge? By
building on the notion that structure contains more information than chaos, it is
often suggested that by "engineering" information, or by "adding value", or by
selecting, interpreting and updating information, it can be transformed into
knowledge.

Such non-mathematical hierarchies are suggested by several authors. One example
is Barabba & Zaltman (1990), who discuss the use of market research information
and how one is to know whether the information gathered consists of "facts" or
not. They propose a hierarchy that I have come across elsewhere: Data (numbers,
words) lowest in the hierarchy, Information (statements), Intelligence (rules),
Knowledge (combination of the levels below) and Wisdom (combined knowledge bases)
highest in the hierarchy.

The link between information and knowledge can be found also in the quote above
from the Oxford Dictionary. This link has been made even closer in some popular
books, especially the best-sellers Megatrends (1982) by the market researcher
John Naisbitt and The Third Wave (1980) by the journalist Alvin Toffler.

They build on, among others, Masuda (1980) and interpret the change in the US
economy as a transition from a society based on smokestack industry to a society
based on information. Naisbitt charted the "megatrend" as follows: we now
mass-produce information the way we used to mass-produce cars.

In the information society, we have systematised the production of knowledge and
amplified our brain power. To use an industrial metaphor, we now mass-produce
knowledge and this knowledge is the driving force of our economy.

Notice how "information" subtly becomes synonymous with "knowledge", as if there
were no distinction between the two.

A similar analogy is often made in common debate:

The amount of knowledge is doubled every 7th year. (Measured as the volume of
scientific articles.)

Here the quantity of symbols contained in articles is equated with "knowledge".
Is that a meaningful statement?

I could add others to the list. Computer manufacturers sometimes claim that their
machines are "knowledge processors". The label "Information Society" is no longer
fresh, so Peter Drucker (1993) calls the present times the "Knowledge Society".

Why do computer scientists join forces with computer manufacturers and popular
authors and make these kinds of statements? Is it because they share a common
interest? In Foucault's words: Baptizers are not innocent...

The latest technological developments have further challenged the interpretation
of what information is. We tend to regard information as fixed in a text or a set
of numbers. If I look at it tomorrow it will be the same text. However, digitised
information in networks like the Internet has no "final cut". As in the oral
tradition, it is copied and added to in a continuous process. Information
unconstrained by packaging in the form of books or journals becomes a continuous
process, more like the continuous adaptation of stories in the oral tradition
before literacy, which were changing with every retelling or resinging.

Below is a quote from the magazine Wired, March 1994:

Information is an activity. Information is a life form. Information is a
relationship. Information is a verb, not a noun; it is something that happens in
the field of interaction between minds or objects or other pieces of information.
Information is an action which occupies time rather than a state of being which
occupies physical space.

For the information enthusiasts, information becomes equal to "life". And it
comes equipped with intention:

Information wants to be free.

Information via Massmedia

Information theory is restricted to one sender/receiver relationship via one
channel. Therefore, none of the theories cover a communication situation where
messages are conveyed from one sender to many receivers via a broadcasting
massmedium.

Today we consume information in such enormous quantities that no one, born before
the technical media revolution, could possibly imagine it. Wiener's words...

...to live effectively is to live with adequate information...

...do not fit a world filled by the flickering of TV screens, fragments of texts,
snatches of music, "authored" by copywriters, journalists, electronic devices,
commentators etc.

We live in societies that are rapidly approaching a stage where 50% or more of
the citizens are writing and speaking words and processing texts, numbers and
pictures which are possible to reproduce. The "fact" in one textbook or
encyclopaedia or CD-ROM may be contradicted by another fact in a later edition.
It does not matter how well the information has been structured or how
potentially valuable the knowledge is; as soon as it leaves the presses, the
loudspeaker or the screen it adds to, or drowns in, chaos.

In our massmedia-rich societies, information is, from the receivers' point of
view, more like chaos than facts. The receivers have to make a choice not between
amounts of information but between information channels in an information-rich
chaos. The only possible "feedback" is "zapping" between the channels.

Let us consider the communication between a journalist (= the sender of
information) and a receiver (= reader/viewer of information).

The world from a journalist's point of view can be regarded as a chaos of
physical objects, people, empirical data, facts, other people's knowledge,
theories, etc. The writer focuses on a particular piece of the world and uses
his/her tacit knowing as a tool when writing a text. The text in the article is
the writer's attempt to give meaning to a piece of the chaos.

It is important to realise that the words of the text do not "contain" the tacit
knowing of the writer, only an inaccurate articulation of it. The text becomes a
blend of clues from the senses, the data, and the concepts, rules and values of
the journalistic profession. The blend is new tacit knowing created in the mind
of the writer. The writer then tries to articulate this tacit knowing into a
text. The structured text in the article will contain less knowledge than the
writer knows and less information than the writer acquired.

The reader will therefore read the words, but since he/she can not read the
writer's mind, the reader's tacit knowledge will blend with the writer's
articulated knowledge and form "new" tacit knowing. The reader's new process of
knowing can never be the same as the writer's, but it might be similar. How close
their knowing is depends on whether they share the same tradition, culture,
knowledge, profession, business etc. This difference in semantic meaning has
nothing to do with the technical communication, the noise level etc.; the
difference occurs because of the inherent fuzziness of our language. (Fuzzy does
not mean uncertain but different possible interpretations of the same concept or
categorisation.)


The reader must reconstruct the meaning in a tacit process. The writer and the
reader are not in direct contact, however, and therefore much of the meaning gets
lost. The text in an article or book is an attempt to communicate knowledge, but
the value lies not in the text or program itself but in what is not there, in the
work the writer did when he/she tried to "make sense" of the chaos. The reader's
reconstruction is energy-consuming and takes time. Therefore the reader must make
a choice whether to read the text or not. The reader does not know beforehand
whether it is worth spending time on. The choice will therefore have to be based
on something other than the text itself, like: rumour, the name of the author,
the medium, the context (at home, on vacation, in the office etc.). (This is
incidentally a feature also shared by services.)

The situation in society today thus more resembles Shannon's notion of
information than Wiener's.

It is possible to make the analogy with the stone again. The massmedia "contain"
an infinite amount of potential information, but the information is not
communicated between a sender and a receiver in a relationship of mutual
understanding. It is broadcast from a sender, and there it stops.

The receiver is more like the observer of the stone I mentioned above. There are
an infinite number of senders and channels and an infinite number of possible
ways to select and combine the signals. The signals have to be found, and
meaning must be interpreted by the receiver.

Although information theory's concepts cover only the technical level of
communication, they were developed for human/human communication. Especially
cybernetic theory claims its closeness to the human brain.

There are, however, several problems connected with the notion of information
when it is used in a theory for human-to-human communication. Human/human
communication is a question of interpretation and context.

First, humans do not communicate with electric signals. Most of human-to-human
communication in daily life is tacit (Polanyi 1967).

Second, "signals" between people are manifold. They can be anything from speech
to silence, from a hand wave to an unconscious twitch of the eye or a stern,
expressionless face.

Third, the same signal may be interpreted differently by different individuals.
Fourth, human communication involves a very complex interpretation by the
"receiver". One school of thought, constructivism (see von Glasersfeld 1988),
even regards communication as a construction in the mind of the individual.

Fifth, people often "enact" their environment (Weick 1979). They impose their own
constructions on others and then receive back clues that they have been
constructing themselves. Communication may thus even be seen as going in the
opposite direction, from receiver to sender.

There are at least three arguments that speak in favour of Shannon's original
notion of information as having no connection with meaning at all.

The meaning of a text or a table does not exist independently of the receiver as
a fixed state. "Meaning" must somehow be constructed by the receiver. Every
meaning is therefore unique for the human being interpreting it. The meaning can
not be forecast by someone else. This is implied in Shannon's mathematical
definition of information as a probability.

There seems to be confusion among successors to Shannon as regards the
theoretical consequences of his theory. If I interpret his theory according to
his own texts, information is equal to entropy, i.e. chaos. Chaos contains no
meaning.

My own experience from the financial information markets is that they, from a
receiver's point of view, are more comparable to chaos than to structure.

Information in the cybernetic sense then becomes a special case restricted to
laboratory experiments with fixed settings and restricted boundaries, or to
manmade systems on the system's level.

Information Complexity and Overload

Since the early days of information theory, scientists have been studying a
phenomenon they call information overload. System theorists say that information
overload exists when a system receives more information input than it can
handle. Changes in several aspects of information inputs may create overloads:
pure quantity, or changes in meaning or intensity. Miller (1978 p.121ff) defines
overload as when channel capacity is insufficient to handle the information
input.

Miller goes through a large number of laboratory experiments with animals, cells,
neurons and human beings made by psychologists.

The conclusion from Miller's experiments is that when the information input rate
goes up, the output rate increases to a maximum and thereafter decreases, showing
signs of overload. In his laboratory experiments "information" is equal to simple
signals, and the responses to these signals are also simple signals. The question
of meaning or interpretation is not included in these experiments.

Such simple experimental settings have been widely criticised, and other
psychologists have tried to handle the problem by introducing a theory of
information complexity (Schroder & al. 1967, referred to in Hedberg 1981).
Schroder & al. found that individuals and groups respond quite differently to
identical stimuli in complex settings. They conclude that individuals differ in
their abilities to handle integrative complexity in information. However,
Streufert (1972) finds that this integrative complexity differs according to
situation. The same person responds differently in "simple" environments
compared to "complex" environments. She therefore suggests (1973) a new concept:
information relevance.

This research, as is most psychological research, is based on the axiom that
information or signals are meaningful in themselves.

It seems as if these problems with the information concept have a common root in
the confusion about the concept of information. Has the creation of concepts like
complexity and overload made research overlook one of the basic features of
information according to Shannon, that of entropy?

Is it not possible that a better analogy is that signals from an environment may
be regarded as analogous to the massmedia situation and the stone as described
above?

Trying to distinguish lower levels of information, like "data" which are said to
have no meaning, or higher levels of "complex" information in laboratory
settings, makes no sense if experimenters are using Shannon's information theory
as a basis for such experiments.

If information is equal to entropy and devoid of meaning, the problems of
information overload, complexity and relevance etc. are natural features and can
not be overcome in the real world.

The experiments above are then restricted to a special case: that of a closed
system.

Information has No Value and No Meaning

I have pointed at some arguments which speak in favour of regarding information
as a potentiality, i.e. the way Shannon does. The implication of this is that
there is more information in chaos and complexity than in structure, although
this notion seems to go against a lot of the senses we call common. My own
experience from the financial massmedia is, however, that the more information we
produce, the more chaotic the world turns.

The conclusion from following Shannon is that information has no value in itself.
The value of information comes out mainly in connection with human action or as
an indirect relation.

A lot of what scientists have been working with in experimental psychology and
the information sciences is based on the notion that information has a meaning or
value independent of the user. If one follows Shannon's notion, they have been
studying special cases like the closed system or the level of reduced information
only.

A still unsolved issue is why (at least some) systems theorists seem to base
their analysis on an interpretation of Shannon that goes against his theory.


References

Allén & Selander (1985): Information om information. Studentlitteratur.

Barabba V. & Zaltman G. (1990): Hearing the Voice of the Market. Harvard Business
School Press.

Drucker Peter (1993): Post Capitalist Society. Butterworth-Heinemann.

von Glasersfeld E. (1988): The Construction of Knowledge, Contributions to
Conceptual Semantics. Intersystems Publications, Salinas, California.

Jumarie G. (1990): Relative Information. Springer Verlag.

Masuda Yoneiji (1980): Informationssamhället. Liber.

Miller James G. (1978): Living Systems. McGraw-Hill.

Naisbitt John (1982): Megatrends. Warner Books, New York.

Polanyi Michael (1967): The Tacit Dimension. Routledge & Kegan Paul.

Schoderbek, Schoderbek & Kefalas (1985): Management Systems. Business
Publications.

Shannon & Weaver (1959): The Mathematical Theory of Communication. Univ. of
Illinois Press.

Sotto R (1993): The Virtual Organisation. Research Paper. Dept. of Business
Admin., Stockholm University.

Streufert S. (1972): Success and Response Rate in Complex Decision Making. Jrl of
Experimental Social Psychology 8:389.

Streufert S. (1973): Effects of information relevance on decision making in
complex environments. Memory & Cognition 1973: 1:3.

Sveiby K-E (1994): Towards a Knowledge Perspective in Organization. PhD thesis.

Toffler Alvin (1980): Tredje vågen. Esselte Info.

Vamos Tibor (1990): Computer Epistemology. World Scientific.

Weick Karl (1979): The Social Psychology of Organising. McGraw-Hill.

Wiener Norbert (1948): Cybernetics. MIT Technology Press.