Stefan M Wasilewski: Upgrade Report: April 2010

pantgrievousΤεχνίτη Νοημοσύνη και Ρομποτική

30 Νοε 2013 (πριν από 3 χρόνια και 6 μήνες)

104 εμφανίσεις

Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
1

of
76

Student: 361429

Stefan M Wasilewski:
Upgrade Report:
April 2010







































Supervisors:



Angela Espinosa & Jennifer Willby

Student



361429
Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
2

of
76

Student: 361429

Abstract

In 1990 Michael Rothschild wrote “Bionomics”: Economy as Ecosystem (Rothschild
1990), within which he described "BIONOMICS holds that economic development, and
the social change flowing from it, is not shaped by a society's genes, but by its
accumulated technical knowledge". Rothschild recognises that
the theories of
Darwin's
(Darwin 1859)
, Adam Smith's (Smith 1776), Thomas Malthus
(Malthus 1798)
, and David
Ricardo
(Ricardo 1817)

have co
-
evolved along with Lamarck

(Gould 2002)
. Yet even
when he analyses economic evolution "by its very

nature, an evolving economy cannot
be planned, so the entire rationale of centralised economic decision
-
making collapses"
he is thinking of Marx
and Communism,
but
he
is still in a two
-
dimensional plane upon
which "the selfish genes"
(Dawkins 1976)

compete. Rothschild never takes us beyond
competing genes and a tantalising glimpse of spontaneous order that would become
c
omplexity
t
heory.

Yet had
he

look wider and into the works of Ashby
(Ashby 1956)
,
McCulloch
(McCulloch 1988)
, Weiner
(Wiener 1954)

and Beer
(Beer 1981)

he may have
seen a world closure to Lovelock’s
(Lovelock 2000)

and defined the ‘economic
-
ecosystem’ differently.
Their view was of a highly connected and interdependent set of
systems, whether natural and/or social.


By late 2007
,

as the beginning of the ‘Credit Crunch’ was u
nfolding
,

Black, Scholes and
Merton
’s

(Black and Scholes

1973; Merton 1973)

option pricing approach had been
implemented into
two trading and
one investment platform
,
RiskMetrics
™/CreditMetrics™
1

and Long Term Capital Management (“LTCM”), of which
LTCM had to be saved in 1998 by the US Government and the former withdrew the
embedded Normal Distribution reference.
The former had been the bedrock of trading
shares and credit whilst
the latter had implement huge leveraged investments in a
vehicle outside of the regular governance regimes. Both

(Morgan 1997; Lowenstein
2000)

believed they were able to manage risk according to models based upon the
abundance of new
economic
data

and yet upon reflection new capital
(Settlements.
2009)

rules talk in
the same terms of connectedness as Ashby, Beer Weiner and Vester.


Lord Kelvin once said: “Anything that exists, exists in some quantity and can therefore
be measured”
(Beer 1967; Adams 1995)
.
Whilst the physical and biological sciences
have moved beyond Newton and Descartes into
quantum electrodynamics

(Feynman
1985)
, economics and business is still grounded in the second law of thermodynamics
and the Keynesian cloud
(Keynes 1936)
. Efforts have been made t
o model economics

using Chaos Theory (Peters 1996) and modern thermodynamics (Prigogine 1997, 2003)
but investment practice still follows a model sketched in 1900
(Bachelier 1900)

and
built upon by Prof. Eugene Fama
(Fama 1976)

called ‘the Efficient Market Hypothesis’,
where information is perfect, rational investment expectations are the norm and follow
a normal distribution
2

(Moivre 1756)

of activity.

Risk measurement
theories
using

economic waves by R. N. Elliott
(Elliott 1938)

and socio
-
e
conomic

by
N. Kondratiev
(Schumpeter 1939)

were
followed by
Modigliani & Miller’s work
(Stiglitz 1969)

the
latter who’s paper some believe ushered in modern capital markets
(Culp 2001)

have all
been criticised
(Mandelbrot an
d Hudson 2008; Sorkin 2009)

as their application led to
risk strategies like LTCM.


Current experience

seems to indicate that the economy is a complex set of closely
coupled agents whose governance model is fragmented. The recent increase in
economic ‘
boo
m
-
bust

cycles’

seems c
orrelated to rapid increases in
regulatory activity
(Goodhart 1988; Cooper 2008; Wiggin, Incontrera et al. 2008)

whose

original

aim
was

to



1

Are the creation of and trademarked by J.P. Morgan, now JPMorganChase

2

Also called Gaussian or Weiner process
,

or Brownian motion

Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
3

of
76

Student: 361429

protect inv
estors but whose implementation

created
fe
ed
-
forward and feedback
processes
with unforeseen consequences
.


Mandelbrot’s

(Mandelbrot and Hudson 2008)

criticism of Elliott Waves is its subjective
nature by chartists and even though the "Schumpeter
-
Freeman
-
Pe
rez" paradigm
(Schumpeter 1939)

has its followers for Kondratiev his critics point out that
too little
explanation supported the components of his theory much based around the accuracy of
the data and the exact development of technology. This is very important when
considering the transliteration of technology with
in

finance and how it affects ri
sk and
pricing

as the follows of a theme assume that the underlying rationale is robust
.


With regard to the latter t
wo

major events are worthy of comparison: the Lloyds of
London
R
einsurance
S
piral (1988) and the global credit crunch 2007.
They
have a lot

in
common from a regulatory and pricing perspective. Each have been analysed but
any

predictive power and regulatory responses
proved

ineffectual
, too little too late, the
markets had moved on and adjusted. T
he first begat the second

by providing the capi
tal
markets with new financial structures and improved data manipulation

with which to
create new products
.

In between there were

t
hree

intermediary
economic bubbles

in
1992/3
,

1998
/9

and 2004

the
middle

seeing

the failure of
LTCM

(Lowenstein 2000)

of
which a more stringent regulatory response here may have
averted
both
the
2004 and

2007 crisis all together
(Bookstab
er 2007)
.

A few reviews captured the ‘agent
-
type’
activity
(Arthur 1999)

but none extended to embrace a clos
e coupling between the two
paradigms
(Kuhn 1974)

of insurance and banking as much of the data was opaque to the
underlying operational
nature

of the products being created
(Cooper 2008)
.


Adam Smith envisioned a self
-
regulating economy
(Heilbroner 1999)

but in today’s
economy investors are faced with at least 30
-
years of changing regulatory accounting,
opaqu
e reporting, inflationary markets and for the larger deals a leveraging process
biased to the lead principal.
It is also the longest period of relatively peaceful financial
activity where data was gathered but there’s a catch.
Homogeneous data is the sourc
e of
good pricing whether equity or debt
(Briys and Varenne 2001)

and the precursor to
such is access to consistent, clear

and well correlated
information
; though the data may
have been gather it was neither homogeneous nor transparent
.



Whilst risk and i
nvestment ma
nagement has taken advantage of
processing
technology
the actual data is still based upon published accounts and cursory interviews with
management. This is a Newtonian view that takes no account of an agent’s coupling to
the b
road

economy and
one Complexity Economics
(Arthur 1995; Beinhocker 2006)

tries to take into account. The
consequences of ignoring the close coupling o
f economies
and impact of regulatory change have

been an increase

in
economic bubbles

both in size
and frequency.



More importantly both Beer
(
Casti 1975
)

and Vester
(Vester 2007)

reflect upon the
i
mportance of any Systems predictability horizon which imputes a time limitation in
financial risk analysis
; for instance securitisation may lock in profit

now

but the ‘tail
risk’ is left with the buyer

that may extend over this horzon

and in a changing eco
nomy
incorrect pricing beyond predictability boundaries exposes the underlying business to
loss. This is compounded is leveraged finance is used the consequence being the transfer
of equity risk to debt investors and the creation of seemingly new asset cla
ss with
high
returns thereby diverting investors from establishing better
data
-
models for risk
diversification
(Ashby 1956)

and a new supply of information on the systems that
feed

them
(Beer 1972; Shannon and Weaver 1998)
.


Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
4

of
76

Student: 361429

Current governance regimes base capital adequacy on
R
isk Weighted asset models
relat
ing

to systems that are dynamic, conditional and have ‘fat
-
tails’

which Malcolm
Gladwell
(Gladwell 2008)

points out should not be dismi
ssed. In its recent consultative
document the Bank of International Settlements
(Settlements. 2009)

offer a new
standard of capital adequacy that is still based around risk weightings and not the
systems that support them

but does suggest a scenario based approach for capital
default
.


If current governance structures and in
vestment processes are failing to manage an
increasing frequency of boom/bust cycles how can

O
rganisational
C
ybernetics

(Wiener
1950
; Beer 1985
)
,
C
omplexity
E
conomics

(Durlauf 199
7)
,

and modern
N
etwork
T
heory

(Strogatz 2003; Watts 2003)

assist investors in making better informed decisions and
the market more self
-
regulated?


The purpose of this research is to
explore how

Complexity Economics
,

Organisational
Cybernetics


and the Viable System Mod
el
(Beer 1972)

can
be used to
develop
innovative analytical tools to
b
etter enable investor decisions thereby returning the
transparency of risk information at a primar
y level that Smith
(Heilbroner 1999)

required for a market t
o be self
-
regulating.


The hypothesis is that non
-
homogeneous data sets
and opaque management data cause
poor investment decisions
and

that the ‘fat
-
tails’ embedded within
many Value at Risk
models
(Wilmott, Howison et al. 1995; Wilmott 2000)

should

be
reconsidered as vital
outliers that
c
ould be interpreted using scenario modelling

t
hrough tools based upon
organisatio
nal cybernetics and complexity
economic
theory
.

It also posits that
structured finance theory coupled with excess leveraging creates feedback loops that
cause asset bubbles if the products that support it remain ungoverned
.


Keywords: Chaos, Complexity Eco
nomics, Emergence, Network Th
eory,
CyberFilter Organisational Cybernetics, and Longitudinal

Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
5

of
76

Student: 361429

Table of Contents

Research Aims and Objectives

................................
................................
................................
....

6

Literature Review

................................
................................
................................
............................

9

Current State of

Knowledge

................................
................................
................................
....

9

Economics: A General Background
................................
................................
...............

11

Capital Markets

................................
................................
................................
....................

13

Insurance

................................
................................
................................
................................

13

Equity/Debt Capital Markets

................................
................................
..........................

15

Systems
................................
................................
................................
................................
....

16

Stafford Beer: The Viable System Model & Team Syntegrity

.............................

17

Cyber Filter

................................
................................
................................
............................

18

Tying it all together

................................
................................
................................
.................

18

Complexity, Complex
ity Economics & Chaos Theory

................................
.................

18

Second Order Cybernetics

................................
................................
................................
....

21

Summary

................................
................................
................................
................................
.....

21

Conceptual Framework & Methodology

................................
................................
..............

23

Conceptual Framework

................................
................................
................................
.........

24

Conceptual Methodology
................................
................................
................................
.......

28

Summary

................................
................................
................................
................................
.....

31

Research Strategy

................................
................................
................................
.........................

33

“Do I know what I know and how do I find out”?

................................
........................

33


I’m satisfied that I know, but is it real to me”?

................................
............................

37

A Path to a Methodology?

................................
................................
................................
.....

38

The Research Method

................................
................................
................................
.............

39

Ethics, Intellectual Property and the Observer’s Role

................................
...............

41

Fieldwork Design

................................
................................
................................
..........................

48

Structure

................................
................................
................................
................................
......

48

Appendices

................................
................................
................................
................................
......

60

Examples of Market Failures

................................
................................
...............................

60

Glossary of Terms

................................
................................
................................
....................

61

Ecosystems

................................
................................
................................
............................

61

Economics

................................
................................
................................
..............................

61

Time

................................
................................
................................
................................
..........

62

Recursion

................................
................................
................................
................................

62

System & Organisation

................................
................................
................................
......

62

Supplemental In
formation

................................
................................
................................
...

64

The Viable System Model

................................
................................
................................
......

65

The Triple Index or CyberFilter

................................
................................
..........................

68

Team Syntegrity


TSI

................................
................................
................................
............

69

Vester Sensitivity Model

................................
................................
................................
........

70

R. Buckminster Fuller’s: Tensegrity Model

................................
................................
....

72

Algorithmic vs Linear or Deterministic Modelling

................................
......................

73

An Axiomatic Language for Financial Products
................................
............................

74

So you want to be a Researcher?

................................
................................
........................

75

Complexity: Papers by Stafford Beer, John Casti & Steven Durlauf

......................

76


Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
6

of
76

Student: 361429


Research Aims and Objectives

Whilst the last 60 years has seen significant advances in c
omputing power

to
analyse portfolios of investments,

and structured finance

has created new
products to align
discrete
risk with investor strategies,

the fundamental
methodologies used to

discriminate good from bad investments have not
substantively change
d. However all
current
investment and credit
portfolios rely
on the underlying
economic
performance
of its components
and the impact that
regulators have on the capital requirements required to protect against
catastrophic failure

whether directly for
fina
ncial institutions
3

or indirectly
because
of
the latter’s cost of borrowing
.


The Wall Street Words: An A


Z
4

Guide to Investment Terms for

Today’s
Investor


defines “investment objective” as


“the financial goal or goals of an investor. An investor
may wish to
maximise current income, maximise capital gains, or set a middle course
of current income with some appreciation of capital. Defining investment
objectives helps to determine the investments and individuals should see
that.”


The aim of this re
search is to
better enable the decision
-
making abilities o
f
investors and regulators that are currently being hampered by an inability to
differentiate the performance of disparate organisational structures and capital
risk exposures
. The former being impo
rtant to discern which ‘management horse
to back’ and the latter the yields expected over tim
e.


An

initial hypothesis
could have

two essential elements:


(i)

Each change to the organisation,
whether exogenous or endogenous,
changes the organisation a
nd its context

(ii)

If the economy is an emergent function
of society, and societies comprise
strong/weakly
-
coupled agents that
can
be viewed as
systems, then like any
system decisions and changes occur
within a reference frame for that system
and therefore dif
ferent systems have
different time horizons beyond which
any forecast only has meaning on a
qualified basis
5
.


So what do we mean by a ‘
system
’? Stafford Beer defines a ‘system’
6

as: consisting
of a group of elements dynamically related in time according t
o some coherent
pattern. He goes on to add that this ‘system’ must have a ‘purpose’ and in the



3

www.g20.org/Documents/FM__CBG_Declaration_
-
_Final.pdf

4

<a href="http://f
inancial
-
dictionary.thefreedictionary.com/Investment+objective">Investment objective</a>

5

The latter is mentioned by Prof. Vester, ‘Art of Connected Thinking’ page 91

6

See Appendices for a fuller definition

Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
7

of
76

Student: 361429

course of his investigations introduces viability and awareness essentially as
follows.


A system can be any organisation of functions and/or agents that has an
input
and output but to be viable the organisation must not only do this consistently
but also be able to adapt to a wide variety of both internal and external
challenges. To become
viable

in Stafford Beer terms it must also be
aware
of how
to manage this
variety internally. This is also a basic qualifying principle for
‘autopoiesis’ (Maturana & Varela 1980)

a property Beer
subsumed into the VSM
.


This is the basic principle of the Viable System Model and the usual form that
such systems take is to have fiv
e essential functions; production, audit/control,
regulation, planning, and vision. Adding different
production units

that follow the
same form is a simple matter of wrapping the whole with one vision and
management layer but an audit and regulatory functi
on for each production unit.


We must be careful here to differentiate between process and function. Simply
put you can have lots of different
processes

(indeed need them) but the
function

is the
purpose

(Beer 1994) to which we employ the processes. There
fore
functions can be
systems

and systems can therefore be systems of
systems
: eco
-
systems.


Here we have a process itself:



<input

processes

output>

adaptable (plan & audit)?


“going in the
right direction?”



It is a little more complicated but not much
. When you start to get systems
comprising “multiple production units” that become agents for a larger system
this is where we start to use the Stafford Beer’s Viable System Model and Prof.
Frederic Vester’s Sensitivity Model to check whether the boundarie
s and
management systems are ‘viable’ or just agents with temporary functions.


An important point here is to note that we can have viable systems, with multiple
production units and layers of management, but when we look outside that
organisation

and ask
if it is an agent of a bigger system we must use the same
analysis techniques to check whether that larger
-
system is
viable
?


So we can have
viable systems

amongst non
-
viable but in order for this new
aggregation of agents to become viable they must meet t
he same rules of
viability. Otherwise we have a complex adaptive system that can create
emergent systems


Here we are at the point of interrogating
eco
-
systems

and because we have
created a process and set boundaries we can start to use algorithmic models
within Petri
-
Nets, Iterative Maps and complex adaptive systems programs. There
is a warning though: the analysis must take into account the intrinsic and
complex set of nested feedback processes across multiple layers and systems.

Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
8

of
76

Student: 361429


The objective is to buil
d a more robust framework within which investors can
build sustainable investment strategies and regulators optimise the capital
requirements within the financial industries.

Such a framework will tr
eat the
Economy as an ecosystem
(Rothschild 1990)
and
build upon the

hypothesis

where:




Local

agents and recursion layers can become aware of their topography
7

and
begin to

fulfil
the requirements for
Autopoeisis
8



Communication

and decision making abilities are optimised within these s
tructures, and



Taking

time as a function of motion

and motion as a function of
decisions

that cause change
,

then

there is a unique reference frame
within
each
system and/or at each level of recursion


This
invokes
consideration

of Second Order Cybernetic
s that
relates

not only
cognition of the system and the management of variety
,

but also the memory
with which operations depend.


If this is true
then
how investors measure risk within t
his framework will be
different compare
d

to current
models

solely based upon Values at Risk and an
unlimited time horizon for events
.


Current models use data that is untimely and loses qualitative information in the
processing: For instance, modern financial models are based o
n the longest
running period of stable, growing economies. However the predictability of any
system has a limited horizon
(Vester 2007)

so long run models used in
securitisation that don’t take this into account would have a major flaw in pricing
and any leveraging would exacerbate the problem. Added to this positive and
negative feedback loop
s have parameters that maintain a stable system but
when inverted drive it into chaos.


The question is
:

How
can we create innovative discriminatory frameworks based
on complexity economic
s

and viability theory
that will allow more informed
choices to be m
ade by redefining data according to a
decision based reference
frame?


The research will explore the possibilities offered by contemporary complexity
theory on management (OC), complexity economics (CE) and network theory
NT). In particular it will draw up
on insights from existing performance
measuring systems, e.g. by adapting the Triple Index/CyberFilter (originally
suggested by Stafford Beer (Beer 972) and complex systems analysis (modern
game and network theory
(W. Brian Arthur 1997)
).





7

Topography

-

the arrangement of the natural a
nd artificial physical features of an area
;
Topology

-

the way in which
constituent parts are interrelated or arranged

8

Autopoiesis

literally means "auto (self)
-
creation" (from the
Greek
:
auto



αυτό

for self
-

and
poiesis



ποίησις

for creation or
production)

Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
9

of
76

Student: 361429


Literat
ure
Review

Current State of Knowledge

Ontologically
views have changed over the years, from
a

Universe a
s a collection
o
f objects with clear boundaries
;

that can be empirically measured
;

and
modelled with precision
,

to one where boundaries leak and those objects are a
collection of deeply nested, coupled ‘systems’ whose ex
istence is probabilistic
, i
n

other words from a distinctly Newton/Descartes reductionist Universe to
Complex and Quantum Dynamical.



Epistemological
approach
es

ha
ve

therefore equally changed from a taught
reductionist
approach to a bifurcated one that
on the one hand
measures what
one

can

but

then leaps to conclusions based upon these and an internal model of
the Universe
of

simple rules
,

common to all recursive levels that emerge,
combine, are sustaine
d or die. It is the life cycle, the constant interplay between
energy, context and tension.


Today we have the Internet with experiences recorded by
b
illions of
observations but
by

the mid
-
50’s

World War II had
finished

but rationing
pers
isted;

the Cold War was about to begin
;

and
Einstein’s Relativity Theory was
generally known but
not so
Quantum Mechanics
. Advances in number theory,
cryptology and physics were slow to surface but a changing world order
had
prompted Warren McCulloch to as
semble a disparate group of
thinkers/disciplines with the assistance of the Macy Foundation: The Macy
Conferences. From here
were advances in Decision Theory, the cognitive
sciences, cybernetics, system and communication theory.


Their participants were to

have significant influence on old and new thinkers.
Two of
note
was

R Buckminster Fuller for his architectural theory

Tensegrity


and Stafford Beer
whose

Viable System Model i
n management cybernetics
form
ed

a
c
oherent

order

that would change operational approaches and even
run countries
.


Sharp changes in local and global social order has always been the tipping
point
(Gladwell 2000)

of significant change that itself is preceded

by a silent
development in technology and usually followed by developments in regulation
and taxation. Biologists developed the technology to better manage health and
food followed by a burgeoning society in turn requiring better technology and
shelter an
d so to the industrial revolution. The cataclysmic events post 1900 in
the social world order increased the frequency of change and technology.
Harnessing the atom need
s

to be placed in the context of a social regime that
promoted fairness but allowed regu
lation to manage the whole. Cryptography
had spawned ‘The Laws of the Game’ (Von Neumann 1953; Eigen 1981), laid
down the rules of communication (Shannon, 1948) and spurred Weiner, Ashby,
Bertalanffy and others to look for the codes in biology, physiology,

mathematics
and sociology.


Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
10

of
76

Student: 361429

At the same time Ralph Elliott
(Elliott 1938)

and Nikolai Kondratief
(Schumpeter
1939)

proposed sociological and economic theories that
technology was a
catalyst within others that created short, medium and long run wave.


Of all the arts and sciences Economics failed to advance at the same rate
suffering boom and

bust with increasing frequency without truly analysing the
reasons. Though MacKay wrote eloquently about the South Sea, Tulip and
Louisiana Bubbles, Marx about the politics, and Schumpeter about the 14
th

to
17
th

Century (Smith, Ricardo, Malthus, Mill) wri
ters
,

they all wrote from the

perspective

of a Newtonian ‘Natural
-
Law’ basis
.

Keynes,
Markowitz

and
Freidman developed
Adam Smith
(Smith 2005)

ideas and
Vilfredo
Pareto

(Mandelbrot and Hudson 2004)

supplied the
20
th

Century market economy
with
economic and mathematical advances, however they did so

using an agent based

process
where the observer is independent rather than inclusive
; this was only
to arrive wit
h the development of Mandelbrot

fractal analysis
(Mandelbrot and
Hudson 2004)
, Prigogine
thermodynamics
(Prigogine 1996)

and lately
Beinhoc
ker
’s

(Be
inhocker 2006)

review

of W
.

Brian Arthur’s ‘Complexity
Economics’

(Arthur 1997)
.


Within the ‘Economy’ insurance had changed little until the 1970’s and even then
its
direct
products didn’t change but the manner in which risk was managed
within

insurers did. These innovations followed the same mathematic
al
discoveries of Graunt (1662)
, Pascal, Pareto, De Fermat and Gauss in their ability
to ‘predict’ likely loss
and were
further enabled by the advent of personal and
business computer.


Whilst t
he Swiss Re, Munich Re,
and
Hanover Re publish
weather,
liability and
other
studies
,

insurance

risk pricing

still uses actuarial approaches founded by
motor and life offices
(Beard, Pentikäinen et al. 1969)

and have

relied on
the
innovation
s in

reinsurance
to leverage profit. All these
processes
require
homogeneous data,
a time
variable and the insurance concept
of an uncoupled
probable cause
to make comparative sense for the underwriter
.

When

the
y were

transfe
r
r
ed

to the capital markets
they

lost

the

insurance element
, became
closely coupled to economic strategy and data
was not generally available.


It was the biologists and mathematicians that would
start to
change things in
analysis of the economy
: the random walk being analogous to Brownian
Movement
.

Mandelbrot’s fractal analysis, P
rigogine’s complexity and Arthur’s
use of
c
ellular
a
utomata to model market behaviour are some
applications in
economics. However

many others have developed the science of system and
complexity theory such as:
Feyman

(Feynman, Leighton et al. 1995)
, Gould

(Gould 2002)
, Holland

(Holland 1998)
, Wolfram

(Wolfram 2002)
, Bollobas

(Bollobás and Riordan 2006)
, Waldrop

(Waldrop 1992)
, Nicoli

(Nicolis and
Prigogine 198
9)
, Weinberg
(Weinberg 1975)
,
Watts

&
Strogatz

(Watts 2003)
,
(Barabási 2003)

and Stewart

(Stewart and Golubitsky 1992; Stewart 2001)
.



Stuart Kauffman

(Kauffman 1993; Kauffman 2000; Kauffman 2008)

and James
Gleick

(Gleick 1987)

are an unlikely read for many economists
, but few know of
Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
11

of
76

Student: 361429

Checkland

(Checkland and Scholes 1990)
, Beer

(Beer 1972)
,
Maturana

and
Varela
(Maturana and Varela 198
0)
.

O
rganisational systems people like Senge

(Senge 1990)
, and game theory
business
strategists like Dixit
(Dixit and Skeath
1999)

may be known in risk management circles
but
the
ir philosophy hasn’t
penetrated the mainstream of management or shop floor for a simple reason;
their language is too complex

for a busy broker
.


Governance has been with society from t
he Stone Age but few financial managers
think if governance outside of their local boundaries and the politics of an
electoral year. Governance and management are
co
-
dependent and highly
correlated

as Paul Stokes and Ralf
-
Eckhard Turke note (Stokes 2009; T
urke
2009) in their latest reflections using the Viable System Model (Beer 1972) as a
base.


In the last thirty years we have had at least four major economic ‘crashes’
prompting new
regulation
s. However much of this

was in place but the
ir

application was

politically un
-
palatable and
new financial products vetted ex
-
post complaints. T
here needs to be
a regulatory
paradigm shift (Kuhn 1962
;
Vester 2007
)
,

one

that
recognis
es

boundaries, identity and the need to have
the

regulator ‘in the market’.

T
his

is possible at two levels, the first that brings
transparency to operational risks and the second that regulates how capital
is
managed
for closely coupled economic factors:
both using
common sense
terminology.

Economics: A General Background

Whil
st history has defined a common set of investment
products
(Goodhart
1988; Galitz 1995; Bernstein 1996; Galbraith 1998; Benninga and Czaczkes
2000; Fabozzi 2002)

to reflect: different levels/appetites for risk; ret
urns and
time frames; investors have not availed themselves of the tools developed to
analyse operational risk created over the last 70 years by operational risk
(Beer
1966; Beer 1972; Weick 1995; Arthur Andersen &
Co. 1998; Jackson 2003;
Schwaninger 2009)

and network theory researchers
(Strogatz 1994; Watts
1999; Barabási 2003; Ostrom 2009)
.
Whilst the ‘Normal Distribution’ is deemed
simpler to expla
in in terms of risk prob
ability; i.e.
‘the chances of getting a loss
are X’, there is a perceived complexity in algorithmic processes.
However agent
-
based models are now being built for the personal computer that use common
business terminology and yield more accurate market resu
lts (Arthur 1997).


Apart from Nature the common

source of risk for all human sociological ventures
(Mackay, Rosenwald et al. 1852; Ackoff 1974; Lempert, Popper et al. 2003)

(White 2002)

is how they organise an
d dynamically manage their business
functions and their processes.


Whilst philosophers across time (Egypt, Harris
2009; Jichuang Hu 1984;
M
iura
Baien

2009
;

Galbraith 2009; Keynes1997) have
pondered

the be
nefits of prudent
governance the dominant economic logic of our current age derives from
a band
of thinkers

starting with Adam Smith and John
Laugher
(Smith 1798; Smith
2009; Murphy 1997) through Keynes, Galbraith to Stiglitz, Prigogine,
Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
12

of
76

Student: 361429

Mandelbrot, Fukuya
ma and Beinhocker. Commentators on the reasons why
past
economic
failures
have
occur
red

(Mackay 1852; Bookstaber
2007; Taleb 2007;
Cooper 2009) are numerous but curiously there are few or no commentators
from an Asian/Far East culture that rank alongside th
ese authors. This is
particularly interesting as there is a significant cultural difference in how trade is
conducted which finds its roots in language; most Chinese derived cultures
emphasise the connections between things and
relationships rather than
th
e
objects themselves.
W
hen

considering the structure of a system (Beer 1994;
Fuller 1986)

it changes the emphasis from functions/processes
only
to include
the communication protocols as well (Shannon 1949; Beer 1994)
.


Modigliani and Mille
r
{
Stiglitz, 1969;
Miller, 1988
}

(“M&M”)
are credited by some
(Culp 2003) as being the founders of modern corporate finance detailing
para
meters in 1958 that would reaffirm the concepts of the Efficient Market
Hypothesis and Capital Asset Pricing Model. A more cautious approach
and
at a
discrete level of investment had been laid down by Benjamin Graham
(
Graham,
1949
; Graham
,
1985) but both G
raham and M&M required access to quantitative
and qualitative parameters in order to assess the value of an investment, the
recording of which seldom are included in financial data today.


M
odel based management

(
Schwaninger 2009)

was to gr
o
w
from McKinsey

through to Beer
.

However the development of technology and how it evolves
(Arthur 2009) was not apparent within the financial world whose focus was to
maximise profits and if technology enabled this then it would be invested and
utilised.
A
dvancement
s

in
‘expert system shell’
computer languages
became

commercially available
and
led me to create business models that standardised
the mundane processes allow
ing

the

focus
to shift to the

more complex
economics of insurance and the capital market
s
.



Anyone with an interest in biology and physics would find from re
ading

Stafford
Beer (Beer 1994), Michael Rothschild (Rothschild 1990) and Ilya Prigogine
(Nicolis 1989, Prigogine 1996
)

that
,

like eco
-
systems
,

business was a closely
linked set of functions or actors whose communications linked the who
le.
Therefore I needed to understand the language of the capital markets and the
structures that society built in the form of its business organisations.


Economics
is poor
ly served at the moment having followed the physical sciences
along an ever more dis
creet risk analysis path rather than understanding why
common patterns emerge
.

Developments in understanding

biological systems
(
Stiglitz 1969
; Noble 2006; Kauffman 2008)

could be extended to
economics,

as
societies
are emergent artefacts of
closely coupled
social structures
.


Modern

accounting method
s

trace
ba
ck to Fra Luca Pacioli (1447


1517)
and

little has changed in the
basic
process of
budgeting
,

just the manner in which
data is collected.
The ac
cumulation of data over time is still the pre
-
dominant
method of assessing the inherent risks within an organisation and one the
insurance (Beard 1969) and capital markets (Marko
w
itz 1953) mainly depend
Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
13

of
76

Student: 361429

upon today. So

how
should
business manage the ‘black
box’
9

that is the
operations of the organisation and communicat
e

across

its boundaries

to let
investors better understand the risks involved?

Capital Markets

We are all bounded by one large ecosystem
(Lovelock 1979
; Johnson 2001;
Mandelbrot and Hudson 2004)

within which the co
-
dependencies frame internal
boundaries and seemingly follow a common set of mathematical laws, generally
"Power Laws"
(Lovelock 1979; Kauffman 2008)

and/or
"Gaussian" distributions.
The tendency to assume models represent the world is rife in business (Taleb
2007; Bookstaber 2008) and as the pressure of time weighs upon investors
prudence is sacrificed to reward when the complexity of a business can
se
emingly be overcome by mathematical models promising guaranteed returns.


The

capital markets can be split into two components whose co
-
dependency is
still little understood by the majority of its practitioners. These components are
the insurance and equit
y/debt capital markets; the former
generally
dealing with
uncorrelated indemnities for losses derived from natural events whilst the latter
creates, manages and trades what we think of as business
capital
and monetary
affairs.


Until the late 1960’s the wo
rlds stock markets
were confined by their
technology, investment
s

transacted within a timeframe that allowed reasonable
due
-
diligence to occur and
,

whilst advantage could still be taken of unwary
,

the
market was essentially self
-
regulating; the governor tr
ading in the same markets
and aware of abuse.


As technology increased and capital transfer boundaries collapsed making a
global economy possible the speed of transaction and the complexity of products
has denied time for the natural feedback loops necess
ary for self
-
regulation to
operate. Coupled with the need for yield to counter inflationary prices few
investors had the time to research individual businesses and trust was placed
into seemingly independent agencies to do the job, attach a ‘rating’ for a
business
that allowed
treasurers to ‘invest by manual’: X million in this rating, Y million in
that and so on. This productising of investment has removed
essential analysis
processes and thereby data over the last 30
-
years.

Insurance

Whilst insurance is t
he process by which a first party may protect themselves
against an event whose statistical frequency is manageable over time, (Beard
1960) reinsurance is the process by which insurance entities protect themselves
against identifiable risks (Grieg 1976; Ca
rter 1977;
Gerathewohl 1980
)

that may
unbalance the profitability of their portfolio.

The use of Reinsurance (Grieg 1976;
Carter 1977; Gerathewohl 1980) to balance a portfolio of insurance (Carter
1976) predates portfolio theory (Markowitz 1953) as a proce
ss but not as an
exact pricing methodology.




9

Beer uses ‘black box’
(Beer 1994) but attribution has gone to J C Maxwell in 1860’s with modern application via
Wilhelm Cauer in 1941 for Network transfer processes

Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
14

of
76

Student: 361429


The maxim of insurance ‘the premiums of

the many pay for the claims of the few’
summarised the approach to pricing risk and gaining profit
(Arthur 2005;
Bertuglia and Vaio 2005)
. What i
t

assumes is a well
-
behaved culture and
h
omogeneous data collected in a timely manner. This presents a feedback loop
from which pricing is derived (Swiss Re Sigma 1/2009) that is time
-
dependent
for which the governors of the business must be responsive and the regulators of
the industry aware of
mistakes when errors occur. Risk is at the heart of it all,
(
Adams 1994
), how to identify, price and manage it is the role of the
underwriter, whilst the broker is the intermediary that ideally selects the
optimum price for the client.



The

catastrophes that have befallen the capital markets from 2007 are not new
to the insurance world (Lloyds of London 1992) with insurers failing more often
in the USA and coverage becoming scarce or withdrawn.

This has led to volatility
in
capital and has a
direct impact on the economy as businesses became
de
pendent upon insurance to trade,

particularly prevalent in the liability classes
(Swiss Re Sigma 2/2009) that indemnify against human error rather than
natural events. It is here more than any other area
of insurance that change in
the method of assessing risk is required.


Pricing

in the insurance markets was quite elementary
as

early 1980’s. Motor
(Beard 1969) and life businesses gathered data according to long standing legal
requirements but the remaind
er of the non
-
life insurance market outside of the
United States (‘USA’) did not retain data in a consistent form or at all. Within the
USA statutory reporting requirements were more considered with its Policy &
Claims Service data but regional differences

by State along with an inconsistent
format has devalued the data.


As insurance company performance became volatile and insolvencies increased
(mainly in the USA) the demand for capital created synthetic instruments that
held properties similar to insuran
ce but were capital market based. These were
us
ual constructed using a trigger mechanism or index (Swiss Re SIGMA 1/2009)
again using time series data and discounting according to a modified Gaussian or
Weibul curve as these were the most commonly understo
od by the more
sophisticated investor markets being Hedge Funds.


The

most commonly identified cause of insurer failure was ‘operational risk’ but
no attempt has been made to define what ‘operational risk’ even though
Solvency II, part of the Basle Accord
II
10
,

attempted to define it for banks without
success.


Notwithstanding

insurer failure, a general maxim used when assess primary
insurance risk is ‘the best risks are those best managed’ but the identification of
‘best managed’ was mainly based on experie
nce of the underwriter whose
general characteristics were one of close cooperation with the client and
rating



10

http://www.bis
.org/stability.htm

Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
15

of
76

Student: 361429

agencies like Standard and Poors, Moody’s, A M Best who specialised in
reviewing corporate performance and rating their performance

Equity/Debt Ca
pital Markets

The

Equity and Debt capital markets have grown in complexity and size as
inflation took hold post the Oil
-
Crisis of 1972/3 and the liberalisation of cross
border capital shortly thereafter. The main reason for the latter was the
development o
f financial instruments that affected the same result for the client
capital strategy but overcoming local country laws against capital transfer. It
being recognised by central banks that the world had changed, national and
economic boundaries were no long
er the same.


In

response to the volatility in interest and currency rates treasurers sought and
were supplied insurance
-
like products that would ‘hedge’ their bets. Originally
created to make sure
a firm

managed its contract pricing and hence its balance
sheets (Hull 1989) these processes went on to beget new forms of gambling in
themselves (Lowestein 2000).


Data

was at the crux of the deal and patterns that could be discerned then
translated into models made ‘Hedge Funds’ extremely wealthy (Bookstaber
20
07). Although many of the processes were quite old and had found disfavour
(Stein 1990) in the activities of Michael Milken
(Drexel, Burnham, Lambert: TIME
26
th

Feb1990)
they were still in play throughout the 1990’s, and up to 2007,
indeed the processes th
at brought down Long Term Capital Management
(‘LTCM’) and the CDO/CDS markets (Credit Default) relied upon the same
approach Milken used in his seminal paper on Junk Bonds; i.e. that there was a
homogeneous set of data that followed a binomial curve and th
at an arbitrage
could be exacted because catastrophe’s (in this case default of bonds) seldom
happened.


Structured finance (
Stiglitz 1969;
Ward
1993; Galitz 1995; Cohen 2000;
Benninga 2000; Pike 2005) is the practice of identifying elements of risk type
w
ithin a capital instrument that appeal to certain investors and ‘carving them
out’ into separate instruments for resale. What is left is called ‘basis risk’ or
‘equity risk’ usually attracting a large premium but as a smaller percentage part
of the whole.
As the markets gained more confidence and data showed certain
similarities the advent of the derivative (Wilmott 1995 & 1998) markets
increased liquidity and spawned Hedge Funds whose strategies were focused on
one particular angle but who seldom ‘kept to
their knitting’ as LTCM and one of
George Soros’s ‘Soros Fund Management 1998’ showed in their demise.


The main suppliers of capital at one time were Pension Funds whose prime
desire was to manage the payment of their clients as they matured to retirement
.
A particular risk that has been exposed of late is the ‘longevity’ of the global post
-
war population.
Of its own accord this would not have been a particular problem
had governments and corporations left their reserving practises and taxation
procedures
alone. In the UK and USA as structured finance open up cross border
take
-
overs
,
green
-
mailing on the stock exchanges became more profitable and
Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
16

of
76

Student: 361429

the desire
to
tax the wealth of the funds proved too great for governments
that
needed to keep up public spendin
g that
the pension funds fought to maintain a
healthy asset/liability balance.


As

the economy worsened the returns from equities faltered and failed first
leaving holes but then forcing treasurers to move to what they considered ‘less
risky’ investments;
bonds. The perfect storm was thus set to occur as the
property market ballooned and the ‘leverage crunch’ took its toll on the credit
market because returns were faltering and ‘bullet payments’ on leveraged
investment funds could not be repaid (Bookstaber
2008).


To maintain yield management had progressively forced down costs until most
of middle management had been removed and replaced by younger less costly
staff. This had several impacts in the short term it achieved their goal but
without the memory re
quired to manage historic variety mistakes increased.


The

role of the rating agencies within the capital markets is a pivotal one because
their purview of industry statistics from management accounts
sh
ould have
given them unparallel data. However their p
rimary source of data was published
accounts, themselves historic by the time a ‘rating’ was needed’ and incomplete
because of the lack of true operational data. Add to this the profit incentive to
take a cursory glance at accounts and pronounce risk metri
c that became a bible
for lazy treasurers and you have the beginnings of a disaster.


Stand
ard & Poors, Moody’s and Fitch each have their approaches to risk and
accounting data but all change their method of collecting assessing data
regularly. This alone
makes the

transition

rating matrices that were to find
themselves at the heart of RiskMetrics


and CreditMetrics
™ (J. P. Morgan Chase)
faulty and the credibility of these metrics doubtful.


The effect of transition matrices, the heart of CreditMetrics™, wa
s to imply that
credit could be securitised with full transfer of risk. However tail risk was left
with the primary investors and volatility in future production resulted in lower
returns. The latter would be exacerbated as investors moved away from the
Pr
ivate Equity market to a new, and seemingly, higher yielding leveraged finance
market: in effect equity risk was being taken by investors at debt prices.

This
move away from dynamically managed risk stored up asset bubbles in hidden
data as Hedge Funds did
n’t operate under the same disclosure rules.

Systems

Economics

is an emergent function of human societies (Smith 1798; Murphy
1997) whose individual components have their own internal models of the
world (Marko
w
itz 195
2
)
,

know how they fit together and hav
e strategies to
navigate it. The expression "Corporate Eco
-
System" accurately describes how
these individual, and groups of individuals, combine to create the network of
organisations that we call the
e
conomy.


Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
17

of
76

Student: 361429

An ecosystem is a set of closely coupled org
anisms that form a sustainable whole,
(Mingers 1995; Arthur 2005)

bounded but capable of absorbing new inputs. Its
fitness is measured by how effectively the variety of its inputs is managed while
still maintaining the whole.


Like organic ecosystems businesses evolve,
develop, mutate or die according to
this ability t
o adapt
(Mingers 1995; White 2002; Arthur 2005)
. However there
are, like the genetic code, certain telltale indications
(Be
er 1966; Kauffman 1993;
Vester 2007)

of how businesses will behave by the manner in which it is
constructed. Therefore peering into existing business operations can: illuminate
the fitness of the business; how vision is co
-
developed then communicated;
ide
ntify the proper functions and their processes; and how control parameters
are shared across the whole.


The

tools to do this can generally be categorised as 'model
-
bias
11
'
, network
topology
(Jackson 1991; Watts 199
9)
, communication bandwidth
(White 2002)
,
coupling (strong/week)
(Shannon and Weaver 1998)
, resonance or information
transfer and lastly,
how diversity
(Ashby 1960; Mingers 2006)

is managed

including t
he optimisation of variety to a strategy
.

Stafford Beer: The Viable System Model & Team Syntegrity

The ‘Brain of the Firm’ (Beer 1972) introduced the Viable System Model’ as well
as the ‘Cyberfilter’ which was used as the basis to

explain how the business
Organisation was structured and the primary performance metric. In “Beyond
Dispute” (Beer 1994) Beer explained how Team Syntegrity’s protocols
, his
method of gaining and sharing knowledge,

were founded, developed and
integrated to

provide a language to interrogate existing, and provide for new,
forms of governance: In business terms “a means of addressing and
strengthening the links between the internal and immediate management
functions, and the external and future management fun
ctions of the enterprise”
(Beyond Dispute 1996; Truss, p333) so that a balance is maintained between
investment for the future and the maintenance of day
-
to
-
day business.

“The
Heart of the Enterprise”
(Beer 1985)

expand
ed

the understanding of the VSM
a
nd introducing
Autopoiesis
: Maturana & Varela’s work on self
-
aware renewal.

With

“Decision and Control”
(Beer 1979)

put the whole subject matter in the
general context of control
and self organising systems, and “Diagnosing the
System”
(Beer 1966)

pro
vided a workbook that an educated management
-
cybernetician could use to investigate
a
ny

system. To complement the general
understanding of Stafford’s work t
wo books,
“How Many Grapes went into the
Wine”
(Beer 1985)

and “The Viable System Model”
(Harnden and Leonard 1994)
,

provide

good examples and relevant applications
.

On a meta
-
level “Designing
Freedom”
(Espejo and Harnden 1989)

and “Platform for Change”
(Beer 1975;
Beer 1994)

shows Beer’s application of his language to technological and
p
olitical governance.




11

Model
-
bias is defined here as the type and consistent application of management and functional models

Upgrade Report:
April
20
10


Stefan M Wasilewski: Hull Business School

Page
18

of
76

Student: 361429

Cyber Filter

Stafford Beer described the CyberFilter in ‘
Brain of the Firm”

(Beer 1994; Beer
1994)

and in the project described above

it was
used to provide

a metric to price
risk
. U
sing terms
that any treasurer could understand we created metric
s from
measures of productivity

defined as “actuality”, “capability” and “potentiality”
according to the definitions laid down by Stafford Beer: Productivity being the
ratio of actuality to capability,
latency ratio of capability and potentiality, and
performance being two measures, the ratio of actuality and potentiality, and also
the product of latency and productivity.


Tying it all together

In the last three years the financial world has shown itself to be a closely coupled
set of financial networks
(Beer 1994; Lowenstein and L
ong
-
term Capital
Management (Firm) 2000; Cooper 2008)

that have used common business
models with a common flaw: they could not generally differentiate between good
or bad management and the attendant operational risks that subsequently
destroy value
(Bookstaber 2007)
. If they had then they would not be at the
mercy

of poor credit
-
worthy businesses nor invested leveraged equity
accordingly.


In April 1975 Stafford Beer and

John Casti wrote a research memoranda for the
IIASA titled 'Investment against Disaster In Large Organisations'
(Bookstaber
2007)
.

Within they discuss the then recently published 'Catastrophe Theory' by
Rene Thom
(Casti 1975)

and how it may be applied to optimise in the
management of risks within large organisa
tions. It identifies homoeostatic loops
(Zeeman 1977; Mingers 1995)

and many of the criteria that would become
central to identifying why boom
-
bust
(Beer 1972)

occurs and integral to the
devel
opment of the Stafford Beers' 'viable system model'
(Cooper 2008)
.




An essential part of this paper is

that management is a dynamic process and
ultimately predictable only over short periods and that a

business should
optimise the return on capital not maximise earnings

with non
-
cash flow assets
.


R
ecent developments in network theory can embody much of th
e mathematics
empl
oyed by Stafford Beer in the viable system model by adapting certain
elements of an agents programming and providing dedicated
(
Beer 1972
; Fuller
and Applewhite 1986; Watts 1999; Carringt
on, Scott et al. 2005; Ostrom 2009)

‘cellular automata’
(Wolfram 1994; Arthur 2005; Ostrom 2009; Schweitzer,
Fagiolo et al. 2009; Vespignani 2009)

parameters

with the task of governing the
boundary conditions
.

Thi
s allows an algorithmic model to be created
using
feedback and feed
-
forward parameters along with
both
the self
-
similar
functionality

of the VSM and the creation of other systems that may be emergent
but not viable in the same sense as the VSM.

Complexity
,

Complexity Economics

& Chaos Theory

Networks of 'Black Boxes' (Maxwell 1860; Beer 1994; Arthur 2003) or Cellular Automata (CAs) showed that individual rules are not a clear indication of collective behaviour on a given landscape (Holland 1986, 1998; Johnson 2001; Wolfram 2002; Kauffman 2008). It is important in this research to identify those rules necessary to incorporate recursiveness and variety handling for agent modelling to expand and potentially replicate economic-ecosystems.


The following table illustrates some of the differences between Complexity Economics and Traditional Economics:

Figure 1: Beinhocker and W Brian Arthur [table omitted]


The following uses a mixture of Arthur's and Beinhocker's descriptions of complex systems behaviour12 (a toy sketch of some of these features follows the list):


1. Dispersed Interaction: What happens in the economy is determined by the interaction of many dispersed, possibly heterogeneous, agents acting in parallel. The action of any given agent depends upon the anticipated actions of a limited number of other agents and on the aggregate state these agents co-create.

2. No Global Controller: No global entity controls interactions. Instead, controls are provided by mechanisms of competition and coordination between agents. Economic actions are mediated by legal institutions, assigned roles, and shifting associations. Nor is there a universal competitor, a single agent that can exploit all opportunities in the economy.

3. Cross-cutting Hierarchical Organization: The economy has many levels of organization and interaction. Units at any given level (behaviors, actions, strategies, products) typically serve as "building blocks" for constructing units at the next higher level. The overall organization is more than hierarchical, with many sorts of tangling interactions (associations, channels of communication) across levels.

4. Continual Adaptation: Behaviors, actions, strategies, and products are revised continually as the individual agents accumulate experience; the system constantly adapts.






5. Perpetual Novelty Niches: These are continually created by new markets, new technologies, new behaviors, new institutions. The very act of filling a niche may provide new niches. The result is ongoing, perpetual novelty.

6. Out-of-Equilibrium Dynamics: Because new niches, new potentials, new possibilities are continually created, the economy operates far from any optimum or global equilibrium. Improvements are always possible and indeed occur regularly.

12 This was posted to a wiki page on Complexity Economics.
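
As a toy sketch only (the agents, the pricing rule and all names are illustrative assumptions, not a model proposed in this research), three of the features above, dispersed interaction, no global controller and continual adaptation, can be seen in a handful of heterogeneous agents revising a strategy from local experience:

    # Toy sketch: dispersed, heterogeneous agents adapting in parallel
    # with no global controller. All names and rules are illustrative.

    import random

    random.seed(2)

    class Agent:
        def __init__(self):
            self.price = random.uniform(0.5, 1.5)   # heterogeneous start

        def adapt(self, observed_price):
            # Continual adaptation: move a little toward what won locally.
            self.price += 0.1 * (observed_price - self.price)

    agents = [Agent() for _ in range(50)]
    for _ in range(200):
        # Dispersed interaction: two randomly met agents trade; the
        # lower-priced agent wins and the other imitates it slightly.
        a, b = random.sample(agents, 2)
        winner, loser = (a, b) if a.price < b.price else (b, a)
        loser.adapt(winner.price)

    prices = sorted(agent.price for agent in agents)
    print(round(prices[0], 3), round(prices[-1], 3))  # the spread narrows:
                                                      # order with no centre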


Most of the approaches used in agent modelling and complexity use feedback and feed-forward parameters but do not extend to a nested approach as defined by our 'economic-ecosystem': they are generally path-dependent and/or have defined termination parameters (Arthur 1978). Few recognise the type of work introduced by Stafford Beer, including the concept of recursion and Ashby's law of requisite variety.
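
Ashby's law, that only variety can absorb variety, admits a very small hedged sketch (the counting below is an illustration of the inequality, not a reproduction of Ashby's formalism):

    # Hedged sketch of Ashby's Law of Requisite Variety: the variety of
    # outcomes a regulator can hold cannot fall below the variety of the
    # disturbances divided by the variety of the regulator's responses.

    import math

    def min_outcome_variety(disturbance_variety: int, regulator_variety: int) -> int:
        """Lower bound on distinguishable outcomes under regulation."""
        return math.ceil(disturbance_variety / regulator_variety)

    # A market presenting 1000 distinguishable disturbances to a control
    # system with only 10 distinguishable responses leaves at least 100
    # distinguishable outcome states unabsorbed:
    print(min_outcome_variety(1000, 10))   # -> 100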



Relevant to the research topic are several contributors to Complexity Economics, such as W Brian Arthur (Arthur 1999), Steven Durlauf (Arthur 1978), Eric Beinhocker (Beinhocker 2008) and Ilya Prigogine (Prigogine 1997), for their work in economic agent modelling and complexity as it relates to economics. Whilst some of their proposals involve recursion mathematics, it is the lack of explicit discussion of the various levels of decision-making, and of the couplings between the different levels of governance, as Beer discusses in Decision and Control, that limits them.


Much work has been done in applying chaos theory to economics (Peters 1995), as well as complexity economics. The difference between partial or complete path dependency, where a course is taken regardless of feedback indicators to the contrary, and the actions of agents following rules under complexity conditions highlights the need to recognise the presence of all elements and their boundaries in the discussion.


Chaos Theory (Gleick 1995) became popular at a time when enough data could be collated to analyse the changing economic background following the 1987 and 1992 crashes. Models of stock market trends showed certain similarities to actual outcomes, and their possible use as models was important to structured finance departments, for obvious reasons; but they were not sustainable as markets changed, accounting rules were flouted (Enron/WorldCom) and the data became corrupted.


Similarly, complexity theory has many useful elements, is not fully path-dependent and can sustain models whose initial conditions do not predicate given outcomes (Kauffman 1993; Vespignani 2009). It also has the benefit of describing emergent forms, but few of the models reviewed show the type of closure (Maturana & Varela) needed to create a viable system.



Second Order Cybernetics

The development of cybernetics must include its derivative, Second Order Cybernetics, which brings in the observer (Dupuy 2009). Anthropologists Gregory Bateson and Margaret Mead were part of the Macy conferences, from which came a diagram outlining how the observer relates to an underlying system. Maturana and Varela took this further, and the important aspect for economics is that the suppliers of this element of feedback are the human components and management of a business, as the interchange of data can only be turned into information by those with the cognitive ability to do so.

Summary

The problem seems to be that we are motivated by success, the unbridled extreme of which is greed, and that we ignore everything but catastrophes (MacKay 1974; Cooper 2009). Technology is seen as a solution to discrete problems but seemingly not as an enabler of the whole (Arthur 2009), such that between 1986 and 2007 the investment focus shifted to: a 3-month returns horizon; the practice of lowering costs and increasing production; and the use of structured finance to lower capital cost as the way to solve genuine business problems. Enron and WorldCom became leveraged finance houses with off-balance-sheet risks hidden from investors, whilst GM (GMAC) and GE Capital (GE Finance & Employers Insurance) had larger lending balance sheets than their primary business products.


Although the UK Treasury had been advised of the potential problem, with advisors publishing the material effects of bad trading methods (Goodhart 1988), the regulators persisted in the approach, commonly assumed in the market but unstated, that the State would support the banks (Cooper 2009). So whether insurance, lender or trader, financial institutions were level in the eyes of the State. The less scrupulous managers took this as a sign to trade beyond their abilities, with little corrective management from a hobbled Bank of England and an overburdened Financial Services Authority.



There are two principal gaps:

1. Investors perceive organisations as 'black boxes'13 when they should be looking inside and judging the performance of management in response to market activity.

2. The lack of resolution between levels of governance within the economy hinders the understanding of different systemic reference frames and of time as a variable in the pricing of risk, e.g. the bad timing of regulation when the market may already have adjusted. This can be overcome by building tools from the above and from complexity economics models.




13 Attributed to James Clerk Maxwell by Prof. Ranulph Glanville, but clearly alluded to by Wilhelm Cauer in 1941, "process of network synthesis from the transfer functions of black boxes", although coined later by others.



In order for an index to be properly effective it needs to be consistent and transparent. As mentioned above, rating agencies like Standard & Poor's, Moody's and Fitch were at the heart of global structured finance products for decades. However, their indices were fatally flawed, as they relied upon out-of-date accounting data and anecdotal management opinions, and did not include real-time management performance indices like those provided by the VSM and the CyberFilter. This disadvantaged investors' assessment of risk and returns, compounded by the fact that the agencies were effectively the only source of long-term data.


By providing a framework that can account for non-VSM-like organisations, and then analysing them as a whole across each level of recursion, we could start to create tools for investors that can discriminate between organisations and their performance over various timeframes in order to improve their investment decisions.



Conceptual Framework & Methodology

A view gaining ground over recent years is that the Economy is much like an ecosystem14 and that the methods used to investigate biological systems can be applied to the organisation and dependencies that create the Economy. The definition of 'Economics' may be augmented to reflect a 'living relationship' between 'agent(s)' that may themselves be 'ecosystems' in a recursive process. So, like a forest within whose boundary trees, insects and animals are coupled within and to the whole, and communicate and act out a dance of cooperation or competition, so too can businesses within the Economy. Each complex of components, say within an animal, that achieves a cooperative arrangement can be considered an ecosystem within the boundary of the animal (Fuller 1971; Lovelock 1987; Noble 2006; Dupuy 2009).


A definition of recursion, as outlined by Angela Espinosa, is Beer's Law of Recursive Viable Systems: "in a recursive organisational structure, any viable system contains, and is contained in, a viable system" (Beer 1979, p. 118). This applies to 'recursive viable systems', but at some level of organisation there are agents that are viable and some that may not be, yet whose function, if constantly available, can provide a 'viable system' at that level. This varies from Beer's rule above and is part of the research hypothesis herein.
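
The recursive containment in Beer's law, and the variation proposed here, can be sketched as a data structure (a hedged sketch: the class, its names and the Boolean 'viable' flag standing in for a full VSM diagnosis are all illustrative assumptions):

    # Hedged sketch of recursive containment: any system may contain
    # subsystems; the research variation allows a level to rest on a
    # component that is not itself viable, provided its function is
    # constantly available.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class System:
        name: str
        viable: bool                    # stand-in for a full VSM diagnosis
        function_available: bool = True
        subsystems: List["System"] = field(default_factory=list)

        def supports_level_above(self) -> bool:
            # Viable in its own right, or (the hypothesis here) supplying
            # a constantly available function to the containing system.
            return self.viable or self.function_available

        def level_is_viable(self) -> bool:
            # The whole level stands only if every component supports it.
            return self.viable and all(s.supports_level_above()
                                       for s in self.subsystems)

    firm = System("firm", viable=True, subsystems=[
        System("production", viable=True),
        System("payroll bureau", viable=False, function_available=True),
    ])
    print(firm.level_is_viable())   # True: the bureau is not viable, but
                                    # its function is constantly available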


In order to remain viable, system boundary conditions and operating parameters must be common in each reference frame. Beyond this, organisations may emerge temporarily but are doomed to wither unless the conditions for autopoiesis occur: i.e. they become closed and aware of their ability to sustain themselves. The treatment of time is vital, as temporal conditions for each organisation are different for each reference frame and predictability has a time horizon (Vester 2007).


Therefore layers of economic cooperation can emerge across sector, national and/or transnational boundaries, but in order for the collective of agents to form a coherent pattern/system, and therefore a layer of recursion, the conditions for autopoiesis15 must be met and maintained both for it and for those beneath it. Layers of recursion need not consist of agent(s)/components whose identity is the same across all layers, but their functional identity, according to Beer's VSM, must be.

A layer of recursion consists of a dynamic relationship between one or more agents that may be economic ecosystems and whose constraints will impute temporal and other contextual conditions. For instance, a 'widget' manufacturer may achieve independence locally, then nationally, then become transnational, only to find it loses one of its local markets to a local competitor without losing its higher viability, and vice versa. Equally, either a local or transnational 'sector/agent' can be critically constrained if another attracts capital resources at a higher rate.

14 See Ecosystem, Economics & Recursion in Glossary of Terms
15 See previous definition of autopoiesis


The ability to predict activity breaks down at a certain time horizon for each system (Vester 2007) and is further complicated if systemic formation is unclear or emergent (Vester 2007). Modelling activity would be better served using a Complex Adaptive System approach, which is again part of the research hypothesis.
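
The breakdown of predictability at a time horizon admits a standard hedged illustration (the logistic map below is a textbook example of sensitive dependence, not Vester's own model): two trajectories that start almost identically agree for a while and then part company.

    # Hedged sketch of a prediction horizon: in the chaotic logistic map,
    # two states that begin 1e-9 apart track each other for a time and
    # then diverge, after which forecasting one from the other fails.

    def logistic(x, r=4.0):
        return r * x * (1.0 - x)

    a, b = 0.3, 0.3 + 1e-9
    for t in range(60):
        if abs(a - b) > 0.1:
            print(f"trajectories part company after about {t} steps")
            break
        a, b = logistic(a), logistic(b)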



The researcher's experience, and therefore view, conditions the approach herein and follows Weick (Kauffman 1993; Weick 1995; Stokes 2009) in many respects: 'little islands of relative stability and relative certainty in a complex and uncertain environment', where this environment is in 'constant flux' and the islands are businesses grown from concept to manufacture through an organising principle (Weick 1995; Weick 2001; Stokes 2009) into different structures (Beer 1972; Beer 1985; Morgan 2006) that can be cooperative or competitive, but all of which are dynamic and must be measured in some form (Fukuyama 1995; Ormerod 1999; Gharajedaghi 2006; Bookstaber 2007; Greene 2008; Ostrom 2009).


Daniel Trietsch (Trietsch 2009) and others (Jackson 2003) reviewed the prevalent business models such as TQM, SSM and Six-Sigma, and compared them against the Viable System Model (Beer 1994), identifying their strengths and weaknesses; see Appendices.

Conceptual Framework

Like Chaos Theory (Gleick 1990), the initial conditions of philosophical enquiry can determine the outcome: if of Western origin then it predominantly follows from Newton, Darwin and the Bible, but if of Far Eastern origin then it is far older, with common roots in Zoroastrian beliefs through Indian Sanskrit writing to Zen Buddhism. However, few in the West knew that Newton was a secret alchemist (Gleick 1987) whose process-driven researches amplified his observations with a mind to write them down. Indeed, many aspects of the Vedas can be found in the origins of Quantum Physics, and Cabalistic teachings show nested systemic processes at work. It is not the spiritual beliefs that matter but the manner in which each portrays the organisation of the Universe: the West became discrete/corpuscular whilst the East focused on the whole/connectedness.


The definition of 'Economics' herein provides an essential framework, but like all frameworks there are sub-frames contributing significant support through constant communication, and it is a philosophical mixture of both East and West. In this respect the works of Stafford Beer (the Viable System Model and CyberFilter (Dobbs 1991)) provide a language with which to identify 'agents' whose functional components are organised to form a viable entity, regardless of product or market, but which must similarly be in constant communication. Viable in this context is: "a business that meets its obligations, over the life of its products, regardless of the majority of perturbations that impact it". This draws upon the concept that, to achieve a stable existence, all organisations conform to a similar functional framework where resources are transformed and the result passed into a market place in which it must survive. The process is 'a dynamic within a dynamic' and in order to be optimised the internal control system must be capable of adapting