Why Information Should Influence Productivity


Draft 1.5


8/28/02

© 2002 Marshall Van Alstyne and Nathaniel Bulkley

Draft version. Comments are welcome. Please do not cite or circulate without permission.

Why Information Should Influence Productivity


Marshall Van Alstyne


University of Michigan

550 East University Ave.

304 West Hall

Ann Arbor, MI 48109

mvanalst@umich.edu

(734) 647-8028

Nathaniel Bulkley

University of Michigan

550 East University Ave.

304 West Hall

Ann Arbor, MI 48109

natb@umich.edu

(734) 741-9330



This is a draft not intended for circulation. Please do not cite or distribute.




Abstract


This survey articulates two models of information and relates them to the economic definition of total factor productivity. We then recount hypotheses from the literature that seek to explain the influence of information on productivity. Hypotheses are developed from economic, computational and humanistic perspectives. Although underlying assumptions differ, we argue that their contributions are ultimately complementary. By highlighting interrelationships, we seek to provide a foundation for empirical research relating information management practices to organizational productivity.



“When studying the universe and the structure of matter we can follow the same practice that we follow when studying complex adaptive systems: concentrate on the information. What are the regularities and where do accidents and the arbitrary enter in?”

-- Murray Gell-Mann, The Quark and the Jaguar, p. 24


Since the mid-1990s, studies have generally supported the hypothesis that investments in information technology positively influence productivity (Brynjolfsson and Hitt 1996; Brynjolfsson and Hitt 2000; Jorgenson 2001; Lehr and Lichtenberg 1999; Oliner and Sichel 2000). At the level of the firm, productivity per information technology dollar varies widely, while gains correlate with the adoption of complementary clusters of technology, strategy and organizational practice. Findings in the literature correspond with managers’ conventional wisdom -- it is not the presence of the technology itself that influences productivity, but how it is used. Our central question follows from this insight: specifically, how should information management practices influence productivity and what theories would explain these effects?


This survey differs from earlier work on the relationship between computerization and productivity by focusing on how information influences productivity as distinct from the technology per se. The central question is approached from two distinct theoretical perspectives: the economics of uncertainty and computational complexity theory. Both theories are highly abstracted from social context and emphasize the thorough and rigorous development of results related to information and efficiency. But they ask different questions, use different tools and, most importantly, conceptualize information in different ways. We apply them here intending to inform organizational theory.


Roughly stated and somewhat exaggerated, the Neoclassical economic view of information and productivity is that if you could reduce uncertainty about the state of the world to zero, then solutions to productivity puzzles would be obvious. Risk, precision, and mechanisms for achieving efficient outcomes despite incomplete information are central concepts in the economics of uncertainty. A common feature is that they can be modeled within Bayesian frameworks of decision theory and principal agency that consider only how information addresses the probabilistic “truth” of a proposition.


In contrast, computational complexity theory asks first whether a problem can be solved, given an algorithm, encoding or heuristic. If so, theoretical interest then centers on determining an upper bound for the time it will take specific search procedures to locate a satisfactory solution. Sampling rates and standardization of data, coordination, sequencing and failure rates of processes, and processor efficiency are central concepts in computational complexity theory. A common feature is an emphasis on the relative efficiency of algorithmic processes in searching large, but well-defined, problem spaces.


The value of information is also interpreted differently in each model. The economics of uncertainty defines information value as the difference between informed and uninformed choice (Hirshleifer 1973). This works well if the context is a decision and the information is truth expressed as news (e.g. stock prices, medical diagnoses, inventory levels and accounting schedules). It has less to say, however, if the context is uncertain and the information is procedure expressed as instructions, as in computer software, gene sequences, blueprints, musical scores, process patents and production know-how. In that case, the intuition that follows from computational complexity theory is to treat the information as an instrument of cause and effect and define value as the difference in option value in the portfolios of a knowledgeable and an ignorant agent (Van Alstyne 1999).


The intuitive notion that procedural information contributes to productivity is widely recognized. Hayek (1945) notes that “civilization advances by extending the number of important operations which we can perform without thinking about them”; evolutionary economics and organizational theory emphasize the contribution of “routines” (Cyert and March 1963; March and Simon 1958; Nelson and Winter 1982); corporate strategy speaks of “capabilities” or “competencies” (Barney 1991; Kogut and Zander 1992; Prahalad and Hamel 1990; Wernerfelt 1984). When procedural information is recognized within economics broadly defined, it is most frequently modeled as accumulating stocks of knowledge capital. Examples include the endogenous growth theory literature of macroeconomics (Adams 1990; Aghion and Howitt 1998; Rivera-Batiz and Romer 1991; Romer 1986; Romer 1990) and the industrial organization literature (Griliches 1986; Pakes 1986). Our contribution is to adopt a modeling approach from computational complexity theory as a way of expressing an underlying similarity in these representations.


The first section of this survey seeks to develop a common theoretical understanding by posing and addressing four questions: (1) What is productivity? (2) What defines the Bayesian and computational perspectives? (3) What is the relationship between these perspectives and the economic definition of total factor productivity? (4) How do these perspectives relate to theories derived from observations of human behavior in organizations?


Hypotheses are developed in the second section of this survey. Hypotheses are grouped into three perspectives. An economic perspective emphasizes the implications of incomplete and imperfect data on decision making, as well as strategies and mechanisms that promote efficient collective outcomes in the face of self-interested behavior. A computational perspective characterizes the selection, recombination and coordination of processes within organizations as a dynamic search to increase the ratio of the value of output over input. A humanistic perspective considers ways in which similarity of purpose and the network structure of communication may facilitate information sharing and the creation of new value. The objective in bringing these three perspectives together around hypotheses is to achieve a consilience in understanding (Wilson 1998) and reconcile conflicting explanations of the same underlying phenomena.


The scope of this survey is limited to consideration of how information management practices might theoretically influence productivity. It does not address issues in productivity measurement or strategic uses of information in either an economic or political sense.[1]


What is Productivity?


In this paper, productivity refers to the definition of total factor productivity in economics: the difference between the rate of growth of real product and the rate of growth of real factor input (Jorgenson 1995).[2] The definition is consistent with an increase in the ratio of the total value of output divided by the total value of input. An interpretation is suggested by Abramovitz (1962): “the effect of ‘costless’ advances in applied technology, managerial efficiency, and industrial organization (cost -- the employment of scarce resources with alternative uses -- is, after all, the touchstone of an ‘input’)…”
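As a reminder of how this definition is typically operationalized (standard growth-accounting notation, not a formula taken from this paper; it assumes the constant-returns and competitive-share conditions described in footnote 2), total factor productivity growth is the residual

\[
\frac{\dot{A}}{A} \;=\; \frac{\dot{Y}}{Y} \;-\; \sum_{i} s_i \,\frac{\dot{X}_i}{X_i},
\qquad s_i = \frac{p_i X_i}{\sum_j p_j X_j},
\]

where Y is real output, X_i is the real input of factor i and s_i is factor i's share in the total value of input; growth in output not accounted for by share-weighted growth in inputs is attributed to total factor productivity.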


Following the economic theory of production, firms are assumed to possess a method for transforming inputs into outputs, which is expressed as a mathematical relationship in a production function. Different combinations of inputs can be used to produce any specific level of output, but the production function is assumed to adhere to certain mathematical assumptions. Inputs and outputs are valued at market prices and investments in fixed factors of production are apportioned as shares of input across time. Productivity increase is defined as an outward shift of the production function. A productivity increase is differentiated from substitution of factors due to changes in the relative prices of inputs, which is identified with movements along the production function. Since strong interaction effects between factors are an empirical regularity in dealing with the effects of information, total factor productivity is generally considered to be the most meaningful economic measure for addressing our central question.



Economists also distinguish between the concepts of productivity, profit and increases in consumer welfare (for an empirical study, see Hitt and Brynjolfsson 1996). Depending on who enjoys the resulting surplus, a productivity gain may lead to an increase in firm profit, an increase in consumer welfare or a combination of both. In a perfectly competitive market, all of the surplus from a productivity gain goes to the consumer; consumer welfare increases, but profit does not. For a productivity gain to generate profit, barriers to entry (Bain 1956; Porter 1980) or application (e.g. Barney 1991) must exist that prevent another firm from appropriating the source of the productivity gain. In the case of information, firms may be able to appropriate a productivity gain to earn profits as the result of legal protection (e.g. patents) or knowledge of a process that others are unable to reverse-engineer or otherwise imitate.



[1] References for measurement issues include Jorgenson (1995) and Griliches (1998). A thoughtful review of theoretical difficulties in firm level productivity estimation can be found in Nelson (1981). Strategic uses of information are covered in literatures on game theory and industrial organization (Fudenberg and Tirole 1991; Tirole 1988). We are not aware of examples of empirical research that have successfully addressed the question of how information politics influences total factor productivity and refrain from speculation here.

[2] The rates of growth of real product and real factor input are defined, in turn, as weighted averages of the rates of growth of individual products and factors. These weights are relative shares of each product in the value of total output and of each factor in the value of total input. If a production function has constant returns to scale and if all marginal rates of substitution are equal to the corresponding price ratios, a change in total factor productivity is expressed as a shift in the production function. Changes in real product and real factor input not accompanied by a change in total factor productivity may be identified with movements along a production function.





At the macro level of an economy, productivity may be roughly interpreted as a proxy for the standard of living. More precisely, productivity growth increases the potential for welfare, but is not an independent standard because it may be accompanied by positive or negative changes in the physical, economic and political environments, as well as the relationship between work and leisure (Griliches 1998, p. 368). At the micro level of the firm, productivity metrics are used to benchmark performance against standards of market value (Sudit 1995).

Two Views of Information: Homo Economicus and Homo Computicus


In adopting a Bayesian framework, Neoclassical economics defines information as a reduction in uncertainty regarding the state of nature. Using the rational choice framework of decision theory, homo economicus begins with a priori estimates over a complete description of possible states. A signal contains information if it improves his a priori estimate. For example, in choosing when to plant crops, a farmer’s strategy may be conditional on the weather. Information in a weather report has value if it leads to a more informed decision.
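To make this concrete, the following sketch computes the Bayesian value of a (hypothetically perfect) weather forecast for the planting decision; the states, priors and payoffs are illustrative assumptions, not figures from the text.

# Value of information for a farmer choosing when to plant.
# All probabilities and payoffs below are made up for illustration.

priors = {"rain": 0.6, "drought": 0.4}

payoffs = {
    "plant_early": {"rain": 100, "drought": 20},
    "plant_late":  {"rain": 60,  "drought": 50},
}

def expected_payoff(action, beliefs):
    return sum(beliefs[s] * payoffs[action][s] for s in beliefs)

# Uninformed choice: best expected payoff under the prior alone.
uninformed_action = max(payoffs, key=lambda a: expected_payoff(a, priors))
value_uninformed = expected_payoff(uninformed_action, priors)

# Perfectly informed choice: learn the state, then pick the best action in it.
value_informed = sum(priors[s] * max(payoffs[a][s] for a in payoffs) for s in priors)

# Hirshleifer-style value of the signal: informed minus uninformed expected payoff.
print(uninformed_action, value_uninformed, value_informed, value_informed - value_uninformed)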


On the other hand, computational complexity theory considers information in declarative and procedural forms. Declarative information provides an internal representation of the external environment within a state space. Procedural information specifies rules for moving between states. State descriptions are finite, but typically orders of magnitude larger than those found in economics -- for example, consider all of the possible permutations of values that could be stored in the memory of a modern PC. As a result, it is not uncertainty, but rather the complexity or computational costs of searching an enormous state space that concerns homo computicus.


The two distinct orientations towards information lead to different kinds of questions, different uses of tools and different notions of efficiency. The most common form of efficiency in economics is Pareto efficiency, an allocation such that no person can be made better off without making someone else worse off. Calculus and fixed-point theorems are used to give definitive results, such as the identification of equilibria. Efficiency concepts in the Neoclassical literature include those associated with incentive alignment, learning-by-doing and the application of mechanisms that mitigate the effects of information asymmetries. The common element is a focus on static efficiency concepts that are consistent with the perspective of equilibrium models.


On the other hand, efficiency in computational complexity theory reflects the problem solving orientation of engineering: establish the best algorithm given the constraints. Algorithms are used to navigate a problem space, as in the traveling salesman problem. More generally, computer science offers theoretical results dealing with how to disassemble tasks, allocate communication links, recover from error and improve the speed of execution. In contrast to economic perspectives on information, computational perspectives are more dynamic.


Simon’s parable of the two watchmakers, Tempus and Hora, can be used to illustrate how these two conceptualizations of information can affect theoretical interpretations of the relationship between information and productivity (Simon 1996). Each watchmaker builds watches of 1,000 parts each. Hora divides the task into subassemblies based on powers of 10, while Tempus does not. Interruptions by customers cause the watchmakers to lose any partially completed work -- five expected steps in the case of a subassembly, but 500 expected steps otherwise. At the end of the day, Hora has built more than three orders of magnitude more watches than Tempus. Drawing on computational theory, Simon explains Hora’s superior productivity as the result of a superior strategy for managing complexity. A theorist who only acknowledges a Bayesian interpretation of information, however, is unlikely to draw the same conclusion. Hora’s elevated productivity seems inexplicable when posed purely in terms of Bayesian news.
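A back-of-the-envelope calculation shows where the gap comes from. The 1,000 parts and subassemblies of ten are Simon’s; the per-step interruption probability and the accounting below are our own illustrative assumptions.

# Expected assembly effort for Simon's watchmakers. Each elementary step is
# interrupted with probability p, and an interruption loses all work on the
# current (sub)assembly. The value of p is an assumption for illustration.

p = 0.01  # assumed probability of interruption per step

def expected_steps(n, p):
    # Expected steps to complete an assembly that needs n uninterrupted steps.
    return ((1 - p) ** -n - 1) / p

# Tempus: one monolithic assembly of 1,000 parts.
tempus = expected_steps(1000, p)

# Hora: 100 subassemblies of 10 parts, 10 assemblies of 10 subassemblies,
# and 1 final assembly of 10 -- 111 short assemblies in total.
hora = 111 * expected_steps(10, p)

print(f"Tempus: about {tempus:,.0f} expected steps per watch")
print(f"Hora:   about {hora:,.0f} expected steps per watch")
print(f"Ratio:  roughly {tempus / hora:,.0f}x")

With these assumed numbers the ratio comes out near 2,000, i.e., a bit over three orders of magnitude, consistent with Simon’s account.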


One interpretation of Simon’s story is that more complete information about problem structure can lead to the identification of more efficient procedures. Revisiting our canonical farmer, more efficient farming procedures have historically increased agricultural productivity. By incorporating conceptualizations of information from computational theory, it becomes possible to model these effects in a way that corresponds to the creation of more efficient or effective paths through an individual or organizational knowledge base.


Interpretation of Information Value Frameworks within the Context of Productivity Estimation

<Insert Fig. 1 here>


According to the economic theory of production, a firm transforms inputs into outputs using the most efficient technology available. This is represented mathematically by a production function. An efficient frontier equates combinations of inputs that produce the same level of output.


In a Bayesian framework, noise and uncertainty about the state of nature lead to poorer decisions and hedging. Production lies inside the efficient frontier. News reduces uncertainty, which moves the firm towards the frontier of its state-dependent production function. The efficient frontier is reached only if present and future states of the world are known with certainty.


In both economics and computer science, processes transform inputs into outputs. The economic representation of a process can be thought of as coarse grain: a black box transforms inputs into outputs. The computational representation is fine grain: a flow diagram represents intermediate stages or sub-processes in the overall process of transforming an input into an output. Conceptualizing the knowledge base as a road map, acquiring procedural information can be thought of in terms of identifying shorter routes between two cities or the ability to reach new cities entirely. When this new path creates more real value output from an equivalent amount of real value input, it corresponds to the Neoclassical definition of a productivity increase: an outward shift of the efficient frontier.


Furthermore, the knowledge base can be conceptualized as a portfolio of options -- reflecting the idea that the value of an instrument is derived not so much from a direct flow of benefits as from a claim on benefits. In economic terms, informational options provide indirect utility. The value of informational options can be quantified by applying techniques from finance for valuing real options (Dixit and Pindyck 1994; 1995). Modeling information as instructions valued as options, increases in value may arise from: (1) stochastic variation in the value of the computed result, (2) decreasing the strike cost, (3) increasing the risk neutral interest rate (because the option functions as a loan), (4) increasing environmental uncertainty (because the option can be used in better outcomes and ignored in worse ones) and (5) increasing the valid time horizon.
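A hedged sketch of what such a valuation might look like: the code below simply prices a standard call option and varies its parameters in ways that loosely parallel the five drivers above. Black-Scholes is used only as a convenient stand-in, and every number is an illustrative assumption rather than the specific valuation proposed by Van Alstyne (1999) or Dixit and Pindyck (1994).

from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def info_option_value(S, K, r, sigma, T):
    # S: value of the computed result, K: strike (execution) cost,
    # r: risk-neutral interest rate, sigma: environmental uncertainty,
    # T: valid time horizon. All parameters are assumptions for illustration.
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

base = dict(S=100.0, K=90.0, r=0.05, sigma=0.30, T=1.0)
print("base:", round(info_option_value(**base), 2))

# Each change below raises the option value, loosely mirroring drivers (1)-(5).
for change in (dict(S=110.0), dict(K=80.0), dict(r=0.08), dict(sigma=0.45), dict(T=2.0)):
    print(change, "->", round(info_option_value(**{**base, **change}), 2))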


How do the theoretical perspectives relate to a human perspective?


The Bayesian and computational frameworks emphasize different aspects of the way people use information. The Bayesian framework is roughly analogous to the strategy of statistical inference applied to decision making, while the instrumental framework is roughly analogous to searching through combinations of subroutines to build a path through a problem space. Combined into a single general model for valuing both data and process, these two perspectives offer theorists a precise way of modeling relationships between information and productivity.


We do not believe or argue, however, that this generalized model necessarily captures all the ways in which human information processing abilities influence productivity. Focusing attention on gaps in the analogies offered above leads us to consider specific areas in which a humanistic perspective complements and enriches our abstract theoretical framework.


In empirical studies of human decision-making, psychologists have frequently found evidence of behavior that appears to contradict normative predictions. Persistent biases identified in empirical studies include: over-focusing on the first information received (anchoring), favoring the status quo, protecting earlier choices (sunk costs), seeing what you want to see (confirming evidence), framing issues, overconfidence, neglecting relevant information (overlooking the base rate), over-focusing on dramatic events and seeing patterns in randomness (Hammond, Keeney et al. 1999; Tversky and Kahneman 1974; Tversky and Kahneman 1981).


When economists respond to such criticisms they often invoke the concept of “bounded rationality” (e.g. Simon 1955) or acknowledge that the plausibility of “as if” arguments hinges on conditions that favor learning about problem structure. Such conditions include: recurrence, simple context, slight environmental variation, accurate feedback, rewards and small deliberation costs at each repetition (Conlisk 1996). Heiner (1983; 1988) gives a precise meaning to “bounded rationality” by modeling a gap between the competency of the decision maker and the complexity of the decision problem. As decisions become more complex, the probability of decision errors increases. This leads to scenarios in which less communication and adherence to rule-based behavior that restricts action to reliable procedures may improve organizational performance. By implicitly assuming perfect knowledge of problem structure, Neoclassical theory side-steps questions of how rational actors might use information to generate actions that are likely to improve performance within a potentially complex or dynamic task environment, in favor of explaining how features of the economic environment shape rational behavior (Newell and Simon 1972: 53-54).


Homo computicus’ development in cognitive science provides a complementary interpretation of discrepancies between the normative predictions of Neoclassical theory and empirical studies of human behavior: when confronted with complexity, humans rely on heuristics, which are seen as both the source of persistent biases and the key to problem solving (Anderson 1995). Empirical work reveals specific regularities regarding human cognitive capacity constraints (e.g. Miller 1956). More general results apply to cognitive phenomena, such as expertise (e.g. de Groot 1965). For example, the superior performance of chess experts may follow from the development of physiological structures that can be interpreted “as if” they were specialized encodings and algorithms. Evidence suggestive of specialized encodings follows from experiments that show chess experts exhibit superior memory for board configurations that logically follow from actual play, while they exhibit roughly average memory for random configurations (Anderson: 293). Evidence suggestive of algorithmic-like processes, termed heuristics, follows from observations of the way players of progressively higher rank select patterns of moves from an increasingly well-developed base of strategic knowledge.


Cognitive psychologists frequently invoke analogies between human and computational processes. However, pure computational frameworks have two important limitations for modeling the influence of information on productivity. First, computational frameworks are mute on questions of human motivation. Around this point, the self-interest assumption of economics and the much broader set of motivational explanations invoked by humanistic theorists, such as searches for meaning and affiliations with those who exhibit similarities of background or purpose, can be considered complementary. Second, human cognitive processing differs from computational processing in a number of ways that affect information sharing and the creation of value, such as a greater willingness to share based on homophily or like types (Blau 1977; Riolo, Cohen et al. 2001). Around this point, research from a humanistic perspective can enrich both economic and computational understandings.


Differences between computational and human information processing can be illustrated by a series of examples. Computational frameworks do not recognize tacit knowledge, by definition inexpressible in words (Polanyi 1966). Humans can process and act on equivocal information, a process described as sensemaking, while computers do not (Weick 1979; Weick 1995). Humans can solve with relative ease a wide variety of practical problems that artificial intelligence researchers have not (Russell and Norvig 1995). Humans readily recognize sociological constructs such as gender, power, status, role, identity and group affiliation, while explicit representation of these constructs remains difficult within either a Bayesian or computational framework. A humanistic perspective thus supplements hypotheses engendered by the other two.


DEVELOPMENT OF HYPOTHESES


Different perspectives on information lead to different lines of analysis around our central question. In adopting a Bayesian view of information, economists place particular emphasis on the role of information as data that informs choice. Neoclassical theorists have also tended to focus more on the role of information within the context of exchange than production. Nevertheless, economic theory makes major contributions towards understanding the relationship between information and productivity by offering mathematical conceptualizations of factors such as risk, as well as insights into the design of theoretical mechanisms that enable self-interested individuals to achieve efficient joint outcomes.


In contrast to economists, computer scientists have devoted far more attention to developing precise theoretical notions of informational factors that influence the efficiency and effectiveness of processes, many of which have economic implications within the context of production. Cognitive psychologists, organizational theorists and strategy researchers can be counted among the groups of theorists that have adopted and refined computational concepts for applications in organizational settings. In part, the popularity of computational theory as metaphor may reflect the intuitive appeal of theorizing about computational systems as extensions of the individual or organizational mind. Application of computational theory to organizations often focuses on tradeoffs arising within the context of socio-technical systems.


In contrast to the other two perspectives, theorists in the humanistic tradition tend to draw their concepts from empirical observation as opposed to abstract principles. Disciplines such as social psychology and network sociology contribute intellectually distinct concepts regarding the role of information in the negotiation of meaning between individuals and the formation of interpersonal communication networks. In this survey, the humanistic perspective is used primarily to develop hypotheses about the role of information sharing and the creation of new value. While inclusion of a humanistic perspective complicates the project of forming an integrated theory of the influence of information on productivity, the authors believe it is better to address important questions raised from this perspective than to ignore them. This seems particularly true given the formative state of theory relating information to productivity.


Economic Perspectives on Organizational Productivity


A fundamental question economists ask about data is: what makes it more or less valuable? While a monotonically increasing relationship between quantity and value is typically assumed for physical goods in the theory of production, this relationship does not generally hold for information in a Bayesian sense, which considers the value of information in terms of the specification of a decision problem.[3] There is, however, one sense in which information is always considered more valuable within a Bayesian framework: more precise (less noisy) information sets are always at least as valuable as less precise (more noisy) ones.[4] By definition, an information structure that represents another information structure at a finer level of granularity is more precise.[5] More precise information reduces risk, which can be stated formally as,


Hypothesis 1: Information that reduces risk increases productivity when it leads risk averse decision makers to choose actions that are closer to optimal risk neutral levels.[6]



A risk-averse decision maker is one that would be willing to pay a premium for insurance against an arbitrary risk (Pratt 1964). Risk aversion is assumed to be the normal case for investors (who seek insurance through diversification) and for employees in principal-agent theory (for a survey, see Eisenhardt 1989). While firms and principals are traditionally assumed to be risk neutral, economists have increasingly recognized situations in which they may be risk averse as well, conditions that often relate to informational imperfections in capital markets, such as under-investment (Arrow 1962; Stiglitz 2000).
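One standard way to make “closer to optimal risk neutral levels” precise is the Arrow-Pratt approximation (Pratt 1964) of the premium a decision maker would pay to shed a small risk:

\[
\pi \;\approx\; \tfrac{1}{2}\,\sigma^{2} A(w), \qquad A(w) = -\,\frac{u''(w)}{u'(w)},
\]

where \sigma^{2} is the variance of the risk, w is wealth and A(w) is the coefficient of absolute risk aversion. Information that shrinks \sigma^{2} shrinks the premium \pi, moving a risk-averse chooser’s actions toward the risk-neutral benchmark invoked in Hypothesis 1.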


In addition to risk reduction through the pooling of risk in insurance and contingent commodities markets, the economics of uncertainty focuses on the design of mechanisms that address risks arising from information asymmetry in market exchanges, such as adverse selection (pre-contractual information asymmetry) or moral hazard (post-contractual information asymmetry). The underlying idea is that efficiency gains can be realized through informational mechanisms that prevent failed or poor transactions or that avoid unnecessarily wasting resources in the course of establishing mutually beneficial transactions. These include signaling mechanisms established by the party with the private information (e.g. the seller of a used car offering a guarantee) and screening mechanisms established by the party attempting to ascertain the truth of private information (e.g. a potential employer requesting educational credentials from job applicants) (Akerlof 1970; Spence 1973). Economic literature further distinguishes between one-shot and repeated contracts. In the latter, factors such as reputation and the expectation of future gains from trade may mitigate the risk of opportunistic behavior (Tirole 1988).



[3] This can be formally demonstrated by showing that it is possible to construct two utility functions in which the ordered value of an information set is reversed (Marschak and Radner 1972).

[4] Blackwell (1953) demonstrates a formal equivalence relation between increased precision and reduced noise.

[5] Precision may also be thought of in computational terms. In computational complexity theory, the Kolmogorov (algorithmic) complexity of a string x with respect to a universal computer U is the minimum length over all programs that print x and halt. If one thinks of the shortest description of a sequence that one person can give to another that would lead unambiguously to computation of the sequence in a finite amount of time, then the number of bits in that communication is an upper bound on the Kolmogorov complexity (Cover and Thomas 1991, p. 147).

[6] The economics of uncertainty defines reduced risk in terms of the formally equivalent concepts of second order stochastic dominance and a mean preserving spread, refinements to the general notion of reduced variance that are made to ensure a risk averse decision maker will always prefer a distribution that is formally defined as less risky (Mas-Colell, Whinston et al. 1995, p. 199; Rothschild and Stiglitz 1970). Given two distributions X and Y with the same mean, the following three statements are equivalent to the formal economic definition, while reduced variance is not: Y is equal to X plus noise, every risk averter prefers X to Y, and Y has more weight in the tails than X (Rothschild and Stiglitz 1970).




While economists have focused primarily on the role of information in reducing risk, more precise information can also lead to the identification of more efficient allocations of resources. For example, because hourly data permits more accurate adjustment to environmental conditions than weekly or monthly data, firms might invest in finer grained information gathering. This may be stated formally as,


Hypothesis 2: More precise information leads to Pareto superior decisions by reducing waste.


More precise information may increase productivity when it leads to more accurate matching of supply and demand or reduces organizational slack and delay costs (Cyert and March 1963; Feltham 1968; Galbraith 1973). While humanistic observation suggests that slack or delay can at times promote productive outcomes by providing capital for investments that lead to innovation or by hiding political conflict, the theoretical justification for slack or delay in economics is typically limited to reduced risk.


Economists typically assume perfectly specified decision problems and costless information gathering and processing, from which it follows that more precise data is always better. However, complex decision problems often involve unforeseen interdependencies or complementarities in the analysis of data, which can result in local suboptimization. Considered more generally, tradeoffs often exist between gathering more precise data around the perceived decision problem or sampling more broadly in hopes of developing deeper understanding. The economic implications can be stated formally as,



Hypothesis 3: Broader intra-organizational data access improves productivity. Balkanized information reduces productivity by missing economies of scope and scale. Sharing information reduces balkanization.


Informational economies of scale arise because fixed costs of information production are generally high relative to the marginal costs, such as distribution. Most informational fixed costs are also sunk, in forms such as learning and the establishment of trust relations, so they cannot be recovered if production is halted (Arrow 1974; Shapiro and Varian 1999). Economies of scope arise when the joint cost of producing two different outputs is less than the cost of producing them separately. Informational economies of scope arise from indivisibilities in the application of a specialized knowledge base (Teece 1980). For example, costs of developing products and services may be reduced and new opportunities recognized when data is shared across product design, manufacturing and sales. A similar argument underlies potential efficiency gains in large diversified companies. Balkanization can be measured empirically using similarity and distance metrics from information theory and graph theory measures of social networks (Van Alstyne and Brynjolfsson 1996a, b; Wasserman and Faust 1994).
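As a toy illustration of the kind of measurement this implies (a made-up five-person network and a simple within-group tie share, not the specific metrics of the cited studies), balkanization can be proxied by how many communication ties stay inside a department rather than crossing departments:

# Proxy balkanization by the share of communication ties within departments.
# The people, departments and ties below are illustrative assumptions.

group = {"ann": "sales", "bob": "sales", "cat": "design",
         "dan": "design", "eve": "manufacturing"}

# Undirected communication ties, e.g. as observed in message logs (assumed data).
ties = [("ann", "bob"), ("cat", "dan"), ("ann", "cat"),
        ("bob", "dan"), ("dan", "eve")]

internal = sum(1 for a, b in ties if group[a] == group[b])
external = len(ties) - internal

# A value near 1 indicates heavy balkanization (few cross-group ties);
# a value near 0 indicates broad intra-organizational data access.
balkanization = internal / len(ties)
print(f"internal={internal}, external={external}, balkanization={balkanization:.2f}")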




While notions of information balkanization are rarely if ever considered in Neoclassical theory, this line of theoretical development could plausibly address commonly noted aspects of the way managers allocate their time (e.g. Mintzberg 1990). Preferences for rich media (Daft and Lengel 1984; 1986; Daft and Weick 1984), meetings and impromptu interactions are difficult to explain in Bayesian terms. However, the realization of informational economies of scope and scale in the face of balkanization suggests an economic interpretation, given the frequency with which organizational decisions must be made around ill-defined problems.


Humanistic observation further suggests that interpretation of equivocal data from novel or multiple perspectives is often an important element in problem solving. Examples range from accounts of major scientific innovations (e.g. Roberts 1989) to principles for communication systems that support knowledge work (e.g. Boland and Tenkasi 1995). It is worth noting that the role of information in problem solving has received little attention from economists. In one notable exception, Hong and Page (2001) model the contribution of a diversity of perspectives to superior problem solving in the absence of incentive and communication problems.


Economists’ interest in the role of information frequently extends beyond a focus on the individual decision maker to an interest in how the structure of exchange relationships affects the ability of self-interested individuals to jointly achieve efficient outcomes. The favored style of argument begins with the identification of fundamental principles that serve as common concepts, from which mathematical models are used to explore the properties of relationships between agents. For example, the principle that co-location of a decision right with the most complete information promotes efficiency underlies economic arguments for conditions under which competitive markets promote efficient outcomes. Transferring this principle to an organizational setting and supporting it with complementary insights from computational theory leads to the following hypothesis regarding the balance between centralization and decentralization of data:


Hypothesis 3: Centralized decision making promotes decision consistency and avoids rediscovering the same information twice. Decentralized decision making promotes data gathering and adaptation. Productivity increases to the extent that distributing control of information optimally balances complementarity and indispensability.


Centralization of data is typically favored for decisions in which global implications predominate, since it promotes coordination and consistency. Examples include decisions involving organization-wide processes (e.g. accounting, finance and legal services), integral aspects of design processes (Ulrich 1995) and crisis situations in which rapid coordination is essential (Bolton and Farrell 1990). Economically, centralization limits the costs of redundant systems, in terms of construction, maintenance and search. Technically, centralization is favored in terms of data integrity and enforcing a uniform standard (Van Alstyne, Brynjolfsson et al. 1995).




On the other hand, decentralization favors data gathering and adaptation. In terms of data gathering, expertise in a specific area is inseparable from the ability to know what data to pay attention to, an idea that is typically invoked in economic literature using the concept of asset specificity from transaction cost economics (Chodhury and Sampler 1997; Williamson 1975; 1985).


Examples of issues related to organizational tradeoffs between centralization and decentralization that have been pursued in the economic literature include: the influence of information technology on the make-or-buy decision (Gurbaxani and Whang 1991; Malone, Yates et al. 1987) and the location of organizational decision rights (Nault 1998); horizontal coordination across multiple markets (Anand and Mendelson 1997) and economic justification for hierarchy (Radner 1992). Organizational theorists have also framed the tradeoff in computational terms by equating decentralization of decision rights with parallel processing and centralization with sequential processing (Cohen 1981).


The argument for decentralization favoring adaptation follows from a second economic principle: ownership of an asset influences incentives for future investment. In the theory of incomplete contracts, ownership is interpreted as designation of residual rights of control, which refer specifically to decision rights that for reasons of cost or generalized uncertainty are not pre-specified in a contract (Grossman and Hart 1986; Hart and Moore 1990). The efficiency argument that allocates residual rights of control to the party with the greatest incentive to invest follows from the assumption of self-interest considered in light of the hold-up problem. In particular, non-ownership creates a disincentive for future investment by reducing the expectation of subsequent gains, since these must be negotiated with the owner.


Applying the theory of incomplete contracts to information systems offers the following results: (1) information systems that are independent of other parts of the organization should remain decentralized, (2) systems with complementary information should be combined under centralized control whenever possible, (3) more indispensable agents should exercise greater control, (4) no distribution of control will induce first best levels of investment in situations involving both complementary information sources and more than one indispensable agent, and (5) providing independent copies may mitigate this problem at the risk of reintroducing fragmentation (Brynjolfsson 1994; Van Alstyne, Brynjolfsson et al. 1995).


Despite the non-existence of a first best distribution of control, organizations may still realize significant benefits from sharing data across groups. In such cases, control over an information source should be granted to the most indispensable department. In the case of conflict over design principles, a reasonable heuristic is to consider the investment motivations of the group that contributes both the greatest marginal and the greatest total value.


While allocations of information access and decision rights suggest elements of a policy framework that might inform organizational practice, they do not specifically address the question of how organizations might motivate self-interested employees to proactively share valuable information. A third economic principle -- well-aligned incentives can motivate behavior that encourages efficient outcomes -- suggests that when organizational information sharing is desired, absolute incentives may have a specific advantage over relative incentives. Stated formally as a hypothesis,


Hypothesis 4: Absolute incentives encourage information sharing, which promotes group productivity; relative incentives discourage information sharing, but promote individual productivity. The optimal incentive policy in terms of productivity becomes increasingly absolute with increasing task interdependence.


The intuition follows from an example of classroom grading policies: under an absolute incentive scheme, every student who answers 90 percent of the questions correctly on an assignment gets an “A” (regardless of the number of students); under a relative scheme, the top 10 percent of the students get an “A” (regardless of the actual score). Assuming self-interested behavior, the former policy promotes sharing, while the latter discourages it. An axiomatic model of this phenomenon is developed in Van Alstyne and Brynjolfsson (1995), while the motivation follows from Orlikowski’s (1992) case study of groupware use in a competitive up-or-out consulting firm. Orlikowski found junior consultants refused to share information for fear of losing strategic advantage, while senior consultants, who were rewarded based on the absolute performance of the firm, shared information extensively. The optimal incentive policy is hypothesized to depend on the degree of task interdependence, which correlates with increased information sharing.
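A minimal payoff sketch of the grading intuition (all scores, costs and thresholds are hypothetical) shows why sharing survives an absolute threshold but not a rank-order tournament:

# Two colleagues; sharing raises the recipient's score but costs the sharer a little.
# Every number below is an assumption for illustration.

my_score, peer_score = 90, 85
sharing_cost, sharing_boost = 2, 10   # sharing costs me 2 points, raises my peer by 10

def absolute_reward(score, threshold=88):
    return 1 if score >= threshold else 0   # everyone above the bar is rewarded

def relative_reward(score, other):
    return 1 if score > other else 0        # only the top performer is rewarded

for share in (False, True):
    mine = my_score - (sharing_cost if share else 0)
    peer = peer_score + (sharing_boost if share else 0)
    print(f"share={share}: absolute={absolute_reward(mine)}, "
          f"relative={relative_reward(mine, peer)}")

Under the absolute rule the sharer’s reward is unchanged by helping a peer; under the relative rule the same act costs the sharer the reward.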


The question of optimal incentives to promote information sharing seems ripe for theoretical contributions to supplement a growing applied literature on knowledge management (Davenport and Prusak 1998; Sveiby 1997). However, literature from all three perspectives suggests theoretical challenges. Economic theorists focusing on the private provision of public goods (Bergstrom, Blume et al. 1986) and team production (Alchian and Demsetz 1972; Marschak and Radner 1972) have highlighted problems inherent in the design of group incentive policies, such as freeriding and identification issues in the presence of complementary inputs (see Barua, Lee et al. 1995 for examples along these lines). Computational theorists suggest that even within the context of some well-defined problems, such as checkers or chess, it is often unclear how to assign credit for earlier actions that set the stage for later gains (Samuel 1963). Humanistic observers commenting on the harsh realities of information politics suggest that “no technology has yet been invented to convince unwilling managers to share information or even to use it” (Davenport, Eccles et al. 1992: 56).


These literatures also suggest a number of compensating factors. Studies of machine learning suggest that increasing the frequency and accuracy of feedback, which includes aligning incentives to intermediate goals, can improve performance (Samuel 1963). Case studies of information sharing suggest that the effectiveness of monetary incentives may be enhanced when they are combined with efforts to develop feelings of a common purpose, such as peer evaluations and the encouragement of norms (Hansen, Nohria et al. 1999; Starbuck 1992). In case study research on consulting firms, Hansen et al. (1999) observed a difference between incentives in strategy firms, which seek unique solutions and favor incentives tied to peer ratings of helpfulness to promote knowledge of “who knows what,” and mass-market firms, which seek to minimize the costs of replicating best practices and favor incentives tied to contributions of data to the organizational knowledge base to promote knowledge of “how to.” Other studies conducted in technical environments have observed sharing practices motivated by norms that support fast and accurate feedback in resolving specific problems (Hargadon and Sutton 1997; Orr 1996).


Computational Perspectives: Organizations as Information Processing Systems


In contrast to the economic emphasis on data that informs choice, computational theory emphasizes efficiency in matching algorithms, heuristics and encodings to problems that are formulated as interpretations of a complex environment. All computational frameworks assume standardization at the level of encodings, without which algorithmic manipulation of symbols is infeasible. At the same time, organizational environments are most realistically portrayed as open systems (Thompson 1967), in which efficiency and effectiveness depend in part on the flexibility of human responses to novel or poorly defined problems. Applied to standards, we state the tradeoff formally as,


Hypothesis 5a: Informational standards foster interoperability and sharing, which increase productivity. In contrast, standards limit adaptation and flexibility, which reduces productivity. In terms of output, optimal information standardization increases with decision stability.


Standards may increase productivity by reducing costs of monitoring, deliberation or search, promoting economies of scale or scope in information processing and fostering network externalities. In human-to-human information exchange, recognition of common knowledge or language can also assume the role of a standard, an interpretation that will be further developed in Hypothesis 9.


At the same time, standards are constraints, which promote efficiency by limiting degrees of freedom at the interface between processes. Short run costs of working around a standard involve the cost of recognizing, formulating and handling an exception or the hidden costs of ignorance (e.g. Balakrishnan, Kalakota et al. 1995). In the long run, deference to a standard can limit adaptation and flexibility.


In the past 20 years, economists have developed a considerable literature on standards, which includes exploration of their efficiency and welfare properties.[7] In the presence of perfect information and perfect foresight, organizational standard setting is trivial -- affected parties simply select the equilibrium that offers the highest payoff (Farrell and Saloner 1985). However, when information or foresight is imperfect and network externalities are present, typical conditions in real organizational settings, expectations equilibria predominate (Katz and Shapiro 1985).


[7] For good introductory treatments of strategic issues involved in standards setting from an economic perspective, see Shapiro and Varian (1999) and Shy (2001). For strategic issues from a computational perspective, modeled using landscape theory, see Axelrod et al. (1997).




Once adopted, standards give rise to patterns of complementary investment. Economic implications include: path dependency, increasing returns, switching costs and network externalities (Arthur 1989; David 1985; Katz and Shapiro 1985; Liebowitz and Margolis 1990; Liebowitz and Margolis 1994; Shapiro and Varian 1999; Shy 2001). Following organizational adoption of a standard, economists often assume productivity will increase over time, although at a decreasing rate, which corresponds with empirical regularities seen in learning curves or learning-by-doing (Arrow 1962; Epple, Argote et al. 1996). In the presence of learning-by-doing effects, a central microeconomic question for productivity analysis involves timing the switch to a potentially more efficient standard (Jovanovic and Nyarko 1996).
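The empirical regularity invoked here is usually summarized by a power-law learning curve; as a reminder (the standard textbook form, not a result specific to the studies cited above),

\[
c_n = c_1\, n^{-b},
\]

where c_n is the unit cost (or time) of the n-th unit produced under a given standard and b > 0 is the learning elasticity, so each doubling of cumulative output multiplies unit cost by 2^{-b}. Productivity under the incumbent standard therefore improves at a decreasing rate, which is what makes the timing of a switch to a better standard a nontrivial question.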


While economists have focused on coordination issues inherent in the selection of standards, organizational theorists typically focus on the coordination of processes. Contingency and coordination theorists consider the selection of specific coordination strategies across firms by identifying and analyzing tradeoffs that arise in managing the handoffs or interdependencies between activities (Galbraith 1973; Lawrence and Lorsch 1967; Malone and Crowston 1994; Thompson 1967); while knowledge and resource-based theorists argue that the difficulty of replicating tacit aspects of coordination generates sustainable advantages in efficiency (Barney 1991; Conner and Prahalad 1996; Kogut and Zander 1992; Kogut and Zander 1996). Stated formally as a hypothesis,

Hypothesis 5b: Coordinating information improves the efficiency of existing processes by reducing the number of bad handoffs and improving resource utilization rates.


Many of the questions posed by pioneering works in contingency theory have formed the basis for subsequent works in coordination theory. A broad interdisciplinary framework covers both mechanisms, typically developed from computational and economic perspectives, and their application in specific media and tools, which involves consideration of issues involving the interaction of humans and information technologies (Malone and Crowston 1994; Malone, Crowston et al. 1999; Olson, Malone et al. 2001).


The concept of modularity underlying Simon’s watchmaker parable suggests a specific example of how insights from computational theory might be used to inform organizational design. Dividing a process into sub-processes requires an investment in learning about problem structure. However, once acquired, procedural information can be re-used. In addition, the functional independence of modular processes creates opportunities to leverage this investment through recombination. Stated formally as a hypothesis,


Hypothesis 5c: Modularity increases the number of independent processes, while standardizing interfaces between processes. Modular designs can increase productivity by spreading the risk of process failure or enabling new combinations of processes that extend the efficient frontier.




Computational notions of modularity are exemplified by the nesting of procedural building blocks that underlie the growth of complex systems (Holland 1995; 1998). Within organizational contexts, modularity is most frequently associated with design processes. By dividing tasks into independent modules, organizations partition the search space of potential designs. In theory, this reduces the costs of experimentation and speeds development by allowing design processes to operate in parallel (Baldwin and Clark 2000; Fine 1998).
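The scale of this partitioning effect is easy to state as a stylized count (our own illustration, assuming interfaces are held fixed so modules can be improved independently): a design with k modules of m binary parameters each has

\[
2^{km} \ \text{integral candidates} \quad\text{versus}\quad k \cdot 2^{m} \ \text{candidates when modules are searched separately},
\]

so experimentation effort grows roughly linearly rather than exponentially in the number of modules, and improved modules can be recombined without re-searching the whole design.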


Current rates of change in processes, products and organizational structures vastly exceed those observed by early contingency theorists. By definition, increasing rates of environmental change increase uncertainty, shorten windows of opportunity and place a premium on flexibility and adaptation. Stated formally as a hypothesis,


Hypothesis 6a: The optimal rate of information gathering and flow increases with the rate of environmental change. Conditional on an ability to adapt, organizations that match their information gathering to environmental change rates will be more productive.



Rates of environmental change, or clockspeeds, are generally thought of in terms of cyclical frequencies. Product cycles are the most familiar, but cyclical patterns of transition from integral to modular arrangements may also be observed in organizational processes and structures (Fine 1998). While industrial clockspeeds are generally increasing, particularly dramatic examples can be found in industries collectively referred to as MICE (multimedia, information, communications and electronics).


Studies of information management practices in high clockspeed industries suggest information intensive strategies may be favored in dynamic environments. Examples include value chain analyses, which may be applied along the full chain of supply, distribution and alliance networks to search for bottlenecks and high value processes, and concurrent engineering, which may be applied to analyze products, processes and supply chains along a spectrum from integrality to modularity in dimensions of space and time (Fine 1998).


While computational theory suggests strategies that may facilitate adaptation through the creation and recombination of processes, questions surrounding the extent to which organizations are actually able to implement changes in core processes are reflected in longstanding debates between proponents of organizational selection and strategic choice (Aldrich 1999). Population ecologists Hannan and Freeman (1989) suggest capacities to recognize environmental changes typically outrun internal abilities to implement change in rapidly changing environments. They argue that factors that promote efficiency in stable environments, such as reliability and accountability, may become liabilities in rapidly changing environments because of resistance to change. Large organizations are hypothesized to have the resources to pursue diversification strategies, while small organizations must rely on their agility in spotting and capitalizing on new opportunities. Given a delay, some organizations are likely to adapt, while others will be selected out of the population, typically through bankruptcy. Olley and Pakes (1996) have empirically tested both adaptation and selection hypotheses within the telecommunications industry and found more evidence in support of selection.


Researchers studying business process re-engineering have compiled a large volume of evidence on the difficulties of making large scale changes in routines (Bashein, Markus et al. 1994; Biazzo 1998; Davenport 1992; Hammer and Champy 1993; Motwani, Kumar et al. 1998). Failure rates have been estimated to be as high as 75 percent (Bashein, Markus et al. 1994). On the other hand, risks of persisting in organizational practices that may be highly efficient in a technical sense, but are no longer economically viable, are reflected in the notion of “competency traps” (Levitt and March 1988). Organizations may become particularly vulnerable when competitors successfully innovate at the level of process architecture (Christensen 1997; Henderson and Clark 1990). If radical changes are deemed necessary, awareness of complementarities and interference effects between existing and target practices may potentially increase the probability of success (Brynjolfsson, Renshaw et al. 1997).


In computational frameworks, adaptation is typically accomplished through feedback. Insights from queuing theory suggest a specific tradeoff organizations may face in the allocation of communication channels: redundant links minimize the risks of not being able to access key information sources in times of urgent need, while a diversity of links maximizes information that leads to the development of new options. Stated formally as a hypothesis,


Hypothesis 6b: The need for redundant links increases with the likelihood of agent incapacitation. Latent links are needed for occasions when novel domain specific experience becomes essential. Redundant links conflict with the desire to use these links for new information.


While sensitivity to context is clearly important in applying this principle, empirical observations of organizational communication networks suggest some potential interpretations. In the context of innovation, sparse networks rich in weak ties may provide the latent links needed to spot new opportunities, while strong ties characterized by repeated interactions may be needed to transfer complex knowledge (Hansen 1999). On the other hand, organizations with high turnover are likely to favor redundancy. A potentially interesting case arises in the context of high reliability organizations, such as nuclear power plants, naval aircraft carriers and air traffic control systems. Although redundancy might be hypothesized to enhance safety, case study research suggests high reliability may paradoxically be associated with an abundance of latent communication links (Weick et al., forthcoming). A potential explanation is that communicating with many different sources may help human operators develop a greater repertoire of skills, as well as an enhanced ability to detect weak signals that contradict expectations. In other words, the system property of requisite variety, as opposed to redundancy, at the level of the individual rather than the group, appears particularly important in this context.


The application of computational theory to organizations often focuses on the search for synergies between computational power and human interpretative abilities. Zuboff (1988) makes a particularly useful distinction between "informating," which refers to computational processes that can potentially expand human abilities to interpret developments within complex, dynamic environments, and "automating," which refers to processes that algorithmically process large quantities of data. We conclude the section on hypotheses from a computational perspective by considering the potential productivity implications of two broad classes of technologies that may correspond with "informating": simulation modeling and data mining.


In simulation and modeling, tacit conceptualizations of a design object or problem are made explicit (e.g. Sterman 2000). Simulations may increase the potential for intra-organizational information sharing by acting as a boundary object between distinct communities of practice (Brown and Duguid 1998). Simulations are also likely to create favorable conditions for learning about problem structure by lowering the costs of learning and promoting feedback (Conlisk 1996). Better decisions may result from a better sense of the complex interrelationships between factors, or from a sense of the distribution from which outcomes are drawn as opposed to a particular draw sampled from experience (Cohen and Axelrod 2000; March, Sproull et al. 1991). Stated formally as a hypothesis,


Hypothesis 7a: Simulation and modeling help decision makers more accurately identify leverage points within dynamic systems and reduce the cost of exploring alternative courses of action. They boost productivity by reducing wasted resources and creating new options.



Simulation modeling may be particularly effective for formulating decisions that involve dynamics. Two examples from the supply chain literature are: (1) the "beer game" or bullwhip effect, in which the volatility of demand and inventories tends to be amplified as one looks farther upstream in the supply chain, away from the end user, and (2) clockspeed amplification, which tends to occur as one looks downstream toward the final customer (Fine 1998; Sterman 2000). Both effects correspond with empirical regularities that can potentially be managed to increase productivity, yet managing them without an understanding of the underlying dynamics is likely to be problematic at best. A minimal simulation of the first effect is sketched below.
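
The first effect can be illustrated with a short simulation. The ordering rule, forecast parameters and demand process below are assumptions chosen for illustration; the sketch is not a reproduction of the beer game itself (Sterman 2000).

```python
import random
import statistics

# Minimal bullwhip illustration (assumed parameters, not the beer game):
# each echelon forecasts the demand it observes with exponential smoothing
# and follows an order-up-to policy sized for an assumed replenishment lead time.

def simulate_echelon(incoming_demand, lead_time=2, alpha=0.3):
    """Return the stream of orders this echelon places on its upstream supplier."""
    forecast = incoming_demand[0]
    prev_level = forecast * (lead_time + 1)
    orders = []
    for d in incoming_demand:
        forecast = alpha * d + (1 - alpha) * forecast   # update the demand forecast
        level = forecast * (lead_time + 1)              # new order-up-to level
        order = max(0.0, d + (level - prev_level))      # cover demand and adjust the level
        prev_level = level
        orders.append(order)
    return orders

random.seed(1)
consumer_demand = [max(0.0, random.gauss(10, 2)) for _ in range(500)]

stream = consumer_demand
print(f"consumer demand: std = {statistics.stdev(stream):.2f}")
for name in ("retailer", "wholesaler", "distributor", "factory"):
    stream = simulate_echelon(stream)
    print(f"{name:11s} orders: std = {statistics.stdev(stream):.2f}")
```

Running the sketch shows the standard deviation of orders growing at each upstream echelon even though consumer demand is stable, which is the amplification the bullwhip literature describes.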



A second potential form of informating is data mining, in which statistical techniques are used to uncover patterned relationships in business data. Stated formally as a hypothesis,

Hypothesis 7b: Data mining boosts productivity by improving the accuracy of information available to decision makers and by surfacing connections between previously disparate pieces of information that can lead to the creation of new options.


Data mining processes are generally defined in terms of classification (e.g. good vs. bad loan prospects), association (e.g. milk is purchased with cereal), sequencing (e.g. someone who buys cleats will eventually buy Ben-Gay sports rub) and hypothesis testing (e.g. professionals will be less sensitive to couponing). These processes draw on statistical techniques that include traditional methods of Bayesian inference, as well as techniques developed in artificial intelligence research, such as expert systems and neural nets; a small classification sketch follows.
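
As a concrete, if toy, illustration of the classification task, the sketch below implements a naive Bayes classifier over a fabricated "good vs. bad loan prospect" data set. The features, numbers and decision rule are assumptions made for the example; they are not drawn from any study cited here.

```python
import math

# Toy naive Bayes classifier for "good" vs. "bad" loan prospects.
# Features (assumed): income in $k and debt-to-income ratio.
TRAINING = {
    "good": [(62, 0.20), (75, 0.15), (58, 0.30), (80, 0.25), (70, 0.18)],
    "bad":  [(35, 0.55), (42, 0.60), (50, 0.45), (30, 0.70), (45, 0.50)],
}

def gaussian(x, mean, var):
    """Gaussian density used as the per-feature likelihood."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def fit(training):
    """Estimate a class prior and per-feature mean/variance for each class."""
    total = sum(len(rows) for rows in training.values())
    model = {}
    for label, rows in training.items():
        stats = []
        for column in zip(*rows):
            mean = sum(column) / len(column)
            var = sum((v - mean) ** 2 for v in column) / (len(column) - 1)
            stats.append((mean, var))
        model[label] = (len(rows) / total, stats)
    return model

def classify(model, features):
    """Return the class with the highest log posterior for the feature vector."""
    scores = {}
    for label, (prior, stats) in model.items():
        score = math.log(prior)
        for value, (mean, var) in zip(features, stats):
            score += math.log(gaussian(value, mean, var))
        scores[label] = score
    return max(scores, key=scores.get)

model = fit(TRAINING)
print(classify(model, (68, 0.22)))   # prints "good"
print(classify(model, (38, 0.65)))   # prints "bad"
```

Expert systems and neural nets pursue the same classification goal with different representations; the Bayesian sketch is simply the shortest self-contained example.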




Data mining may be used to analyze both the internal and external business environment (Brachman, Khabaza et al. 1996; Fayyad and Uthurusamy 1996; Glymour, Madigan et al. 1996). Internally, data mining may be used to surface information about performance and optimization opportunities, as well as to identify leverage points for improving processes and troubleshooting problems. Externally, it may be used to analyze critical suppliers, competitors and customers.


The quantities of data collected and warehoused continue to increase. Yet business data is inherently noisy and subject to decay. With imprecise data, there is no guarantee that sophisticated statistical techniques will converge on meaningful answers. Whether, and under what conditions, data mining actually increases productivity is therefore a potentially interesting empirical question.


Humanistic Perspectives: Information Sharing and the Creation of New Value


While economists and computational theorists develop theory at high levels of abstraction, humanistic theorists tend to stick closer to empirical particulars. Results from disciplines such as social psychology and network sociology tend to be context-dependent and rarely if ever invoke the economic concept of productivity. However, humanistic theory contributes important observations regarding the creation of new procedural options and their selection and diffusion through human patterns of interaction. In this way, humanistic perspectives often complement rational and computational models, offering both depth and insight.


Considered from a humanistic perspective, the notion that productivity increases through the accumulation of more efficient processes highlights the importance of information sharing. Stated formally,

Hypothesis 8: The sharing of know-how can create new procedures or options that extend the efficient frontier for those who were unfamiliar with them. This sharing increases productivity.


Common forms of procedural information sharing include: grafting from new organizational members, joint ventures or consultants (Huber 1991); informal know-how trading in which non-proprietary information is routinely exchanged based on norms of reciprocity (Von Hippel 1988); sharing through the networks of informal, professional or industry associations (Crane 1969; Saxenian 1994); and diffusion when the information is offered at little or no cost as a complement that enhances the sale of a product (Griliches 1958).


While technological information sharing relies on the standardization of encodings, information sharing between humans requires partial knowledge overlap. If partners lack common understanding, neither will be able to connect information regarding a new opportunity to anything they know how to perform. Likewise, a partner that knows everything another partner knows gains nothing through sharing. This can be stated formally as,




Hypothesis 9: Efficiency gains accrue to the sharing of complementary information.
Optimal sharing occurs between partners with partial information overlap.


Studies of technology transfer suggest that when two parties are motivated to share information, the difficulty is typically related to the complexity of the information (Hansen 1999). Two relevant dimensions of complexity are the level of codification (Winter 1987; Zander and Kogut 1995) and the extent to which the information to be transferred is an element of a set of interdependent components (Teece 1986; Winter 1987). For example, a standalone software package is easy to transfer along both dimensions. In contrast, tacit knowledge is by definition uncodified and, at the level of internal encoding, likely to be highly interdependent. Both properties make it difficult to transfer. Research in cognitive psychology further suggests that the greater the overlap between the components acquired in learning one skill and those required for the performance of a new task, the greater the anticipated transfer of learning (Singley and Anderson 1989), while observational research on human communication suggests a least-cognitive-effort principle in which interpretations of feedback play a key role in economizing on the search for common ground (Clark and Brennan 1991).


Conditions that facilitate the sharing of complementary information give rise to notions of the advantages of organizations and networks in adaptive learning. Computationally influenced perspectives explore convergence on shared understandings through the adaptation between individual and organizational codes (Cyert and March 1963; March 1992). Strategy researchers emphasize the transfer of tacit knowledge and the utilization of information within complex patterns of coordination and cooperation (Conner and Prahalad 1996; Grant 1996; Kogut and Zander 1992; Kogut and Zander 1996; Nonaka 1994; Spender 1996; Tsoukas 1996). Structuration theory provides a framework for modeling the dynamics of two-way interaction between individual and institutional representations, including those that surround new technologies and innovations (Barley 1986; 1990; Giddens 1984; Orlikowski and Yates 1992; 1994). Similarly, researchers adopting a communities of practice perspective suggest that an understanding of innovations, and of how they might be applied, emerges within the context of complex, collaborative social relationships (Brown and Duguid 1991; 1998; 2000; Lave and Wenger 1991; Wenger 1998).


Combining adaptive learning models with economic intuition suggests that the creation of new economic value requires information from both the demand side and the supply side. With respect to information flows, we formally state the following hypothesis,

Hypothesis 10: Internal information gathering boosts productivity. External information gathering boosts productivity. Internal and external information gathering are complementary.


Cohen and Levinthal (1990) describe "absorptive capacity" as "the ability of a firm to recognize the value of new, external information, assimilate it, and apply it to commercial ends." They hypothesize that absorptive capacity can be explained through a psychological learning model that exhibits positive feedback from both internal and external information gathering. The implication is that there is no way for a firm to know a priori whether a level of R&D investment is optimal. However, as the rate of technological advance increases, the locus of competition shifts increasingly to the dynamic ability to recognize new sources of value and new techniques from the external environment and encode them within the firm's internal procedural knowledge base.


In a survey of the organizational learning and strategy literatures, Bierly and Hamalainen (1995) focus on complementarities between internal and external information gathering. They argue that theoretical development is hindered by a lack of attention to the specific characteristics of multiple internal levels (e.g. individual, intrafunctional, interfunctional and multilevel) and external domains (e.g. customers, suppliers, industry networks, institutional environment) across which information flows.


The ability to seek out sources of new value through the integration of internal and external information depends crucially on human capabilities for information processing. Computational notions of "information overload" are precisely defined in the channel coding theorem of information theory (Cover and Thomas 1991; Shannon 1948). As long as data elements are confined to membership in a pre-specified finite set, optimal codes and rates of transmission can be computed for both error-free and noisy channels. While highly successful as an engineering principle, this technical definition of optimal information flow says nothing about the factors that might influence human performance. Recognizing the elusive nature of precise definitions of optimal information, we state the following qualitative hypothesis,


Hypothesis 11: Optimal information gathering balances the costs of overload against the costs of ignorance.
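
To make the contrast vivid, the engineering notion of optimal flow referenced above can be computed exactly in simple cases. For a binary symmetric channel with crossover probability p, capacity is C = 1 - H(p), where H is the binary entropy function (Cover and Thomas 1991). The short sketch below simply evaluates this formula; nothing comparable exists for the behavioral costs of overload and ignorance in Hypothesis 11.

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) in bits, with H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity, in bits per channel use, of a binary symmetric channel: C = 1 - H(p)."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.05, 0.11, 0.5):
    print(f"crossover p = {p:.2f}: capacity = {bsc_capacity(p):.3f} bits per use")
```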



Empirical research in psychology has led to a theoretical relationship between levels of arousal, which are typically influenced by information, and task performance.

The Yerkes-Dodson Law considers two effects: an inverted "U"-shaped relationship between arousal and the efficiency of performance, and an inverse relationship between the optimal level of arousal for performance and task difficulty (Broadhurst 1959; Weick 1984). In the overload condition, coping mechanisms become more primitive in at least three ways: (a) behavior reverts to more dominant, first-learned responses, (b) the most recently learned patterns of responding are the first to disappear, and (c) novel stimuli are treated as if they were similar to older stimuli (Staw, Sandelands et al. 1981). In other words, information overload may decrease productivity because the behavioral responses most finely attuned to present conditions are also likely to be the first to shut down in overly stressful situations. Chronic overload also decreases productivity through fatigue.


So far, our consideration of the human perspective has focused on the characteristics of human information sharing and on information flows in which adaptive learning may lead to the creation of new options. One way of turning these insights into testable hypotheses is to investigate dynamic relationships between social structure, information flows and productivity.


Sociologists refer to the "social capital" embedded in the links that constitute interpersonal networks (Coleman 1988; Putnam 1995). Social network theorists have paid particular attention to relationships between social structure and economic opportunity (Burt 1992; Burt 2000; Granovetter 1973; Granovetter 1985). Granovetter emphasizes the importance of weak ties, while Burt focuses on the importance of bridging structural holes, defined as gaps between communities with non-redundant information. Burt considers the return on investment from social capital as flowing from three sources: access, referrals and timing.


With respect to social networks, bigger is hypothesized to be better, since information about new opportunities is time-dependent and flows through existing contacts. However, consideration of opportunity costs in the face of bounded rationality leads to Burt's suggestion that players optimize social networks by focusing on maintaining primary contacts within non-redundant communities, so as to maximize access to information from secondary sources. Stated formally as a hypothesis,


Hypothesis 12: Network efficiency balances network size and diversity of contacts. Network effectiveness distinguishes primary from secondary contacts and focuses resources on preserving primary contacts.
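
Burt's intuition about redundancy can be quantified with standard structural-hole measures. The sketch below assumes the networkx library is available and compares a "broker" who bridges two otherwise disconnected cliques with an "insider" embedded in a single dense clique; the toy network itself is our own illustration.

```python
import networkx as nx

# Toy network (illustrative): "broker" bridges two cliques that are otherwise
# disconnected, while "insider" sits inside one dense clique.
G = nx.Graph()
clique_a = ["a1", "a2", "a3", "insider"]
clique_b = ["b1", "b2", "b3"]
G.add_edges_from((u, v) for i, u in enumerate(clique_a) for v in clique_a[i + 1:])
G.add_edges_from((u, v) for i, u in enumerate(clique_b) for v in clique_b[i + 1:])
G.add_edges_from([("broker", "a1"), ("broker", "b1")])  # broker spans the structural hole

effective = nx.effective_size(G)
constraint = nx.constraint(G)
for node in ("broker", "insider"):
    print(f"{node}: effective size = {effective[node]:.2f}, constraint = {constraint[node]:.2f}")
```

On this construction the broker shows a larger effective size and lower constraint than the insider, mirroring Burt's argument that non-redundant contacts are the source of informational advantage.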


Burt's theory of structural holes is a theory of competitive advantage that follows from social capital. Although the emphasis is strategic, parties that occupy structural holes might theoretically increase productivity in two ways. The first explanation could be conceptualized in economic terms as a form of informational arbitrage, in which profits or social status are realized through personal relationships, but the end result is a more efficient allocation of resources. For example, Baker (1984) documents the extent to which the social network topologies of floor traders dampen the volatility of options prices within a national securities market. The second explanation can be conceptualized as realizing economies of scope and scale in the face of balkanization. In this case, information about an opportunity is transmitted across a structural hole, while the actual creation of new value follows from subsequent information flows. Examples include: Hargadon and Sutton's (1997) ethnographic analysis of brainstorming practices that facilitated the brokering of technological expertise within a design firm; Powell et al.'s (1996) analysis of relationships between interorganizational collaboration patterns, profitability and growth in biotechnology; and Saxenian's (1994) ethnographic account of regional differences between Route 128 and Silicon Valley.


Optimal network structure that balances the efficiency of information sharing across strong ties with the identification of opportunities across structural holes remains an open research question (1999). A network topology that theoretically combines the desired properties is the "small world" topology (Watts 1999; Watts and Strogatz 1998). The small world is defined by two measures: characteristic path length (the smallest number of links it takes to connect one node to another, averaged over all pairs of nodes in the network) and the clustering coefficient (the fraction of neighboring nodes that are also connected to one another). Stated formally,


Hypothesis 13: The small world pattern of high local clustering and short characteristic path lengths promotes productivity more than either hierarchical or fully connected networks.


The explanation for this effect is that it takes only a few shortcuts between cliques to turn a large world into a small world. Using formal models, Watts and Strogatz shifted gradually from a regular network to a random network by increasing the probability of making random connections from 0 to 1. They found that the characteristic path length drops quickly, whereas the clustering coefficient drops slowly. At intermediate probabilities this produces a small-world network in which clustering is high and the characteristic path length is short.
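
The pattern is easy to reproduce, assuming the networkx library is available; the network size, neighborhood size and rewiring probabilities below are arbitrary illustrative choices rather than values taken from Watts and Strogatz's paper.

```python
import networkx as nx

# Recompute characteristic path length and clustering as the rewiring
# probability p of a ring lattice increases (illustrative parameters).
n, k = 1000, 10
for p in (0.0, 0.01, 0.1, 1.0):
    G = nx.connected_watts_strogatz_graph(n, k, p, seed=42)
    path_length = nx.average_shortest_path_length(G)
    clustering = nx.average_clustering(G)
    print(f"p = {p:<4}: path length = {path_length:6.2f}, clustering = {clustering:.3f}")
```

The characteristic path length collapses at small values of p while clustering declines far more slowly, which is the small-world regime described above.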


Importantly, however, we emphasize small worlds characterized by clustering and short paths, as distinct from the random rewiring that is only one way of shortening distance. Shortcuts can be intentional. Kleinberg (2000) points out that although random graphs may have shortcuts, individual agents will typically have insufficient information to exploit them. Since the average person (node) is not directly associated with the key people (clique-linkers), it is impossible to determine whether you live in a small world or a large world from local information alone. However, the small world hypothesis can be tested through the collection of network data.


Conclusion


Empirical evidence supporting the hypothesis that investments in information technology positively influence productivity has renewed longstanding debate among economists over the sources of productivity growth. While acknowledging the contributions of microeconomic theory, we argue that the complexity of the relationship between information and productivity necessitates approaches that transcend traditional disciplinary boundaries.

Our argument begins by linking theoretical notions for valuing information as da
ta and
process to the economic definition of total factor productivity. Formally recognizing the
economic value of information as process opens the door for the integration of theory from
multiple traditions. In developing hypotheses, we suggest that whi
le relationships between
information and productivity are clearly complex, they are unlikely to be random.


Empirical verification of these hypotheses will undoubtedly involve considerable ingenuity in translating the predictions of theory into metrics that capture patterns of information use and human interaction. The promise, however, lies in the potential not only to reflect on patterns of organization as they exist, but to generate new lines of research that actively inform business practice in light of the opportunities offered by continued advances in information technologies.











FIGURE 1
[Figure: two panels plotting Labor against Capital — "Moving to the Efficient Frontier (Bayesian)" and "Shifting the Efficient Frontier (Instrumental)."]






References

Abramovitz, M. (1962). "Economic Growth in the United States." American Economic Review 52(4): 762-782.
Adams, J. D. (1990). "Fundamental Stocks of Knowledge and Productivity Growth." Journal of Political Economy 98(4): 673-702.
Aghion, P. and P. Howitt (1998). Endogenous Growth Theory. Cambridge, Mass., MIT Press.
Akerlof, G. A. (1970). "The Market for 'Lemons': Quality Uncertainty and the Market Mechanism." Quarterly Journal of Economics 84(3): 488-500.
Alchian, A. A. and H. Demsetz (1972). "Production, Information Costs and Economic Organization." American Economic Review 62(5): 777-795.
Aldrich, H. (1999). Organizations Evolving. London, Sage Publications.
Allison, G. (1971). Essence of Decision: Explaining the Cuban Missile Crisis. Boston, Little, Brown.
Anand, K. and H. Mendelson (1997). "Information and Organization for Horizontal Multimarket Coordination." Management Science 43(12): 1609-1626.
Anderson, J. R. (1995). Cognitive Psychology and its Implications. New York, W. H. Freeman.
Arrow, K. J. (1962). "The Economic Implications of Learning by Doing." The Review of Economic Studies 29(3): 155-173.
Arrow, K. J. (1962). Economic Welfare and the Allocation of Resources for Invention. The Rate and Direction of Inventive Activity. R. R. Nelson. Princeton, N.J., Princeton University Press: 609-625.
Arrow, K. J. (1974). The Limits of Organization. New York, W. W. Norton & Company.
Arthur, W. B. (1989). "Competing Technologies, Increasing Returns and Lock-in by Historical Events." The Economic Journal 99(394): 116-131.
Axelrod, R., W. Mitchell, et al. (1997). Setting Standards: Coalition Formation in Standard-Setting Alliances. The Complexity of Cooperation. R. Axelrod. Princeton, Princeton University Press: 96-123.
Bain, J. (1956). Barriers to New Competition. Cambridge, Mass., Harvard University Press.
Baker, W. E. (1984). "The Social Structure of a National Securities Market." American Journal of Sociology 89(4): 775-811.
Balakrishnan, A., R. Kalakota, et al. (1995). "Document-Centered Information Systems to Support Reactive Problem-Solving in Manufacturing." International Journal of Production Economics 38: 31-58.
Baldwin, C. Y. and K. B. Clark (2000). Design Rules: The Power of Modularity. Cambridge, Mass., MIT Press.
Barley, S. R. (1986). "Technology as an Occasion for Structuring: Evidence from Observations of CT Scanners and the Social Order of Radiology Departments." Administrative Science Quarterly 31(1): 78-108.
Barley, S. R. (1990). "The Alignment of Technology and Structure through Roles and Networks." Administrative Science Quarterly 35(1): 61-103.
Barney, J. (1991). "Firm Resources and Sustained Competitive Advantage." Journal of Management 17(1): 99-120.
Barua, A., C. H. S. Lee, et al. (1995). "Incentives and Computing Systems for Team-based Organizations." Organization Science 6(4): 487-504.
Bashein, B. J., M. L. Markus, et al. (1994). "Business Process Reengineering: Preconditions for Success and How to Prevent Failures." Information Systems Management (Spring).
Bergstrom, T., L. Blume, et al. (1986). "On the Private Provision of Public Goods." Journal of Public Economics 29(1): 25-49.
Biazzo, S. (1998). "A Critical Examination of the Business Process Re-engineering Phenomenon." International Journal of Operations and Production Management 18(9/10): 1000-1016.
Bierly, P. E. and T. Hamalainen (1995). "Organizational Learning and Strategy." Scandinavian Journal of Management 11(3): 209-224.
Blackwell, D. (1953). "Equivalent Comparison of Experiments." Annals of Mathematical Statistics 24(2): 265-272.
Blau, P. M. (1977). Heterogeneity and Inequality. New York, Free Press.
Boland, R. J. J. and R. Tenkasi (1995). "Perspective Making and Perspective Taking in Communities of Knowing." Organization Science 6(4): 350-371.
Bolton, P. and J. Farrell (1990). "Decentralization, Duplication and Delay." Journal of Political Economy 98(4): 803-826.
Brachman, R. J., T. Khabaza, et al. (1996). "Mining Business Databases." Communications of the ACM 39(11): 42-48.
Bresnahan, T. F. and M. Trajtenberg (1995). "General Purpose Technologies: 'Engines of Growth'?" Journal of Econometrics 65(1): 83-108.
Broadhurst, P. L. (1959). "The Interaction of Task Difficulty and Motivation: The Yerkes-Dodson Law Revisited." Acta Psychologica 16: 321-338.
Brown, J. S. and P. Duguid (1991). Organizational Learning and Communities of Practice. Organizational Learning. M. D. Cohen and L. Sproull. Thousand Oaks, Sage: 58-82.
Brown, J. S. and P. Duguid (1998). "Organizing Knowledge." California Management Review 40(3): 90-110.
Brown, J. S. and P. Duguid (2000). The Social Life of Information. Boston, Harvard Business School Press.
Brynjolfsson, E. (1994). "An Incomplete Contracts Theory of Information, Technology and Organization." Management Science 40(12).
Brynjolfsson, E. and L. Hitt (1996). "Paradox Lost? Firm-level Evidence on the Returns to Information Systems Spending." Management Science 42(4): 541-558.
Brynjolfsson, E. and L. Hitt (2000). "Beyond Computation: Information Technology, Organizational Transformation and Business Performance." Journal of Economic Perspectives 14(4): 23-48.
Brynjolfsson, E., A. A. Renshaw, et al. (1997). "The Matrix of Change: A Tool for Business Process Reengineering." Sloan Management Review (Winter): 37-54.
Burt, R. S. (1992). Structural Holes: The Social Structure of Competition. Networks and Organizations. N. Nohria and R. G. Eccles. Cambridge, Mass., Harvard University Press: 57-91.
Burt, R. S. (2000). The Network Structure of Social Capital. Research in Organizational Behavior. R. I. Sutton and B. M. Staw. Greenwich, Conn., JAI Press. 22.
Choudhury, V. and J. L. Sampler (1997). "Information Specificity and Environmental Scanning: An Economic Perspective." MIS Quarterly 21(1): 25-53.
Christensen, C. (1997). The Innovator's Dilemma. Boston, Harvard Business School Press.
Clark, H. H. and S. E. Brennan (1991). Grounding in Communication. Socially Shared Cognition. L. B. Resnick, J. M. Levine and S. Teasley, American Psychological Association.
Coase, R. H. (1937). "The Nature of the Firm." Economica 4 n.s.(16): 386-405.
Cohen, M. D. (1981). "The Power of Parallel Thinking." Journal of Economic Behavior and Organization 2: 285-306.
Cohen, M. D. (1986). Artificial Intelligence and the Dynamic Performance of Organizational Designs. Ambiguity and Command: Organizational Perspectives in Military Decision Making. J. G. March and R. Weissinger-Baylon, Pitman Publishing.
Cohen, M. D. (1994). Individual Learning and Organizational Routine. Organizational Learning. M. D. Cohen and L. Sproull. Thousand Oaks, Sage Publications: 188-194.
Cohen, M. D. and R. Axelrod (2000). Harnessing Complexity: Organizational Implications of a Scientific Frontier, The Free Press.
Cohen, M. D. and P. Bacdayan (1994). Organizational Routines Are Stored as Procedural Memory. Organizational Learning. M. D. Cohen and L. Sproull. Thousand Oaks, Sage Publications: 403-429.
Cohen, W. M. and D. A. Levinthal (1990). "Absorptive Capacity: A New Perspective on Learning and Innovation." Administrative Science Quarterly 35: 128-152.
Coleman, J. S. (1988). "Social Capital in the Creation of Human Capital." American Journal of Sociology 94: S95-120.
Conlisk, J. (1996). "Why Bounded Rationality." Journal of Economic Literature 34(2): 669-700.
Conner, K. R. and C. K. Prahalad (1996). "A Resource-Based Theory of the Firm: Knowledge vs. Opportunism." Organization Science 7(5): 477-501.
Cover, T. M. and J. A. Thomas (1991). Elements of Information Theory. New York, John Wiley & Sons, Inc.
Crane, D. (1969). "Social Structure in a Group of Scientists: A Test of the 'Invisible College' Hypothesis." American Sociological Review 34(3): 335-352.
Cyert, R. M. and J. G. March (1963). A Behavioral Theory of the Firm. Malden, Mass., Blackwell Publishers.
Daft, R. L. and R. H. Lengel (1984). Information Richness: A New Approach to Managerial Behavior and Organizational Design. Research in Organizational Behavior, JAI Press. 6: 191-233.
Daft, R. L. and R. H. Lengel (1986). "Organizational Information Requirements, Media Richness and Structural Design." Management Science 32(5): 554-571.
Daft, R. L. and K. E. Weick (1984). "Toward a Model of Organizations as Interpretation Systems." Academy of Management Review 9(2): 284-295.
Davenport, T. H. (1992). Process Innovation: Reengineering Work Through Information Technology, Harvard Business School Press.
Davenport, T. H., R. G. Eccles, et al. (1992). "Information Politics." Sloan Management Review: 53-65.
Davenport, T. H. and L. Prusak (1998). Working Knowledge. Boston, Harvard Business School Press.
David, P. A. (1985). "Clio and the Economics of QWERTY." American Economic Review 75(2): 332-337.
David, P. A. (1990). "The Dynamo and the Computer: A Historical Perspective on the Modern Productivity Paradox." American Economic Review Papers and Proceedings 80(2): 355-361.
de Groot, A. D. (1965). Thought and Choice in Chess. The Hague, Mouton.
Dixit, A. K. and R. S. Pindyck (1994). Investment Under Uncertainty. Princeton, Princeton University Press.
Dixit, A. K. and R. S. Pindyck (1995). "The Options Approach to Capital Investment." Harvard Business Review: 105-115.
Eisenhardt, K. M. (1989). "Agency Theory: An Assessment and Review." Academy of Management Review 14(1): 57-74.
Eisenhardt, K. M. and B. N. Tabrizi (1995). "Accelerating Adaptive Processes: Product Innovation in the Global Computer Industry." Administrative Science Quarterly 40(1): 84-110.
Epple, D., L. Argote, et al. (1996). Organizational Learning Curves: A Method for Investigating Intra-Plant Transfer of Knowledge Acquired Through Learning by Doing. Organizational Learning. M. D. Cohen and L. Sproull. Thousand Oaks, Calif., Sage Publications: 83-100.
Farrell, J. and G. Saloner (1985). "Standardization, Compatibility and Innovation." RAND Journal of Economics 16(1): 70-83.
Fayyad, U. and R. Uthurusamy (1996). "Data Mining and Knowledge Discovery in Databases." Communications of the ACM 39(11): 24-26.
Feltham, G. (1968). "The Value of Information." Accounting Review 43(4): 684-696.
Fine, C. H. (1998). Clockspeed. Reading, Mass., Perseus Books.
Fudenberg, D. and J. Tirole (1991). Game Theory. Cambridge, Mass., MIT Press.
Galbraith, J. R. (1973). Designing Complex Organizations. Reading, Mass., Addison-Wesley.
Galbraith, J. R. (1974). "Organizational Design: An Information Processing View." Interfaces 4(3): 28-36.
Gell-Mann, M. (1994). The Quark and the Jaguar. New York, W.H. Freeman.
Gersick, C. J. G. and J. R. Hackman (1990). "Habitual Routines in Task-Performing Groups." Organizational Behavior and Human Decision Processes 47: 65-97.
Giddens, A. (1984). The Constitution of Society: Outline of the Theory of Structuration. Berkeley, CA, University of California Press.
Glymour, C., D. Madigan, et al. (1996). "Statistical Inference and Data Mining." Communications of the ACM 39(11): 35-41.
Granovetter, M. (1973). "The Strength of Weak Ties." American Journal of Sociology 78(6): 1360-1380.
Granovetter, M. (1985). "Economic Action and Social Structure: The Problem of Embeddedness." American Journal of Sociology 91(3): 481-510.
Grant, R. M. (1996). "Toward a Knowledge-Based Theory of the Firm." Strategic Management Journal 17(Winter Special Issue): 109-122.
Griliches, Z. (1958). "Research Costs and Social Return: Hybrid Corn and Related Innovations." Journal of Political Economy 66(5): 419-431.
Griliches, Z. (1986). "Productivity, R&D and Basic Research at the Firm Level in the 1970s." American Economic Review 76(1): 141-154.
Griliches, Z. (1998). R&D and Productivity: The Econometric Evidence. Chicago, University of Chicago Press.
Grossman, S. J. and O. D. Hart (1986). "The Costs and Benefits of Ownership: A Theory of Vertical and Lateral Integration." Journal of Political Economy 94(4): 691-719.
Gurbaxani, V. and S. Whang (1991). "The Impact of Information Systems on Organizations and Markets." Communications of the ACM 34(1): 59-73.
Hammer, M. and J. Champy (1993). Reengineering the Corporation, Harper Business.
Hammond, J. S., R. L. Keeney, et al. (1999). Smart Choices. Boston, Mass., Harvard Business School Press.
Hannan, M. T. and J. Freeman (1989). Organizational Ecology. Cambridge, Mass., Harvard University Press.
Hansen, M. T. (1999). "The Search-Transfer Problem: The Role of Weak Ties in Sharing Knowledge Across Organizational Subunits." Administrative Science Quarterly 44(1): 82-111.
Hansen, M. T., N. Nohria, et al. (1999). "What's Your Strategy for Managing Knowledge?" Harvard Business Review: 106-116.
Hargadon, A. and R. I. Sutton (1997). "Technology Brokering and Innovation in a Product Development Firm." Administrative Science Quarterly 42(4): 716-749.
Hart, O. D. and J. Moore (1990). "Property Rights and the Nature of the Firm." Journal of Political Economy 98(6): 1119-1158.
Hayek, F. (1945). "The Use of Knowledge in Society." American Economic Review 35(4): 519-530.
Hebb, D. O. (1949). The Organization of Behavior. New York, Wiley.
Hebb, D. O. (1980). Ch 6: A Physiological Theory. Essay on Mind. D. O. Hebb. Hillsdale, NJ, Lawrence Erlbaum: 79-97.
Heiner, R. A. (1983). "The Origin of Predictable Behavior." American Economic Review 73(4): 560-595.
Heiner, R. A. (1988). "Imperfect Decisions and Organizations." Journal of Economic Behavior and Organization 9(1): 25-44.
Helpman, E. (1998). General Purpose Technologies and Economic Growth. Cambridge, Mass., MIT Press.
Henderson, R. M. and K. B. Clark (1990). "Architectural Innovation: The Reconfiguration of Existing Product Technologies and the Failure of Established Firms." Administrative Science Quarterly 35(1): 9-30.
Hirshleifer, J. (1973). "Where Are We in the Theory of Information." American Economic Review 63(2): 31-39.
Hitt, L. and E. Brynjolfsson (1996). "Productivity, Business Profitability, and Consumer Surplus: Three Different Measures of Information Technology Value." MIS Quarterly 20(2): 121-142.
Holland, J. H. (1995). Hidden Order. Reading, Mass., Addison-Wesley.
Holland, J. H. (1998). Emergence. Reading, Mass., Addison-Wesley.
Hong, L. and S. E. Page (2001). "Problem Solving by Heterogeneous Agents." Journal of Economic Theory 97: 123-163.
Huber, G. (1991). "Organizational Learning: The Contributing Processes and the Literatures." Organization Science 2(1): 88-115.
Jorgenson, D. (1995). Productivity. Cambridge, Mass., MIT Press.
Jorgenson, D. (2001). "Information Technology and the U.S. Economy." American Economic Review 91(1): 1-32.
Jovanovic, B. and Y. Nyarko (1996). "Learning by Doing and the Choice of Technology." Econometrica 64(6): 1299-1310.
Katz, M. L. and C. Shapiro (1985). "Network Externalities, Competition and Compatibility." American Economic Review 75(3): 424-440.
Kogut, B. and U. Zander (1992). "Knowledge of the Firm, Combinative Capabilities, and the Replication of Technology." Organization Science 3(3): 383-397.
Kogut, B. and U. Zander (1996). "What Firms Do? Coordination, Identity and Learning." Organization Science 7(5): 502-518.
Landauer, T. K. (1995). The Trouble with Computers: Usefulness, Usability and Productivity. Cambridge, Mass., MIT Press.
Lave, J. and E. Wenger (1991). Situated Learning: Legitimate Peripheral Participation in Communities of Practice. Cambridge, Cambridge University Press.
Lawrence, P. R. and J. W. Lorsch (1967). "Differentiation and Integration in Complex Organizations." Administrative Science Quarterly 12(1): 1-47.
Lehr, B. and F. Lichtenberg (1999). "Information Technology and its Impact on Productivity: Firm-Level Evidence from Government and Private Data Sources, 1977-1993." Canadian Journal of Economics 32(2): 335-362.
Levitt, B. and J. G. March (1988). "Organizational Learning." Annual Review of Sociology 14: 319-340.
Liebowitz, S. J. and S. E. Margolis (1990). "The Fable of the Keys." Journal of Law and Economics 33(1): 1-26.
Liebowitz, S. J. and S. E. Margolis (1994). "Network Externality: An Uncommon Tragedy." Journal of Economic Perspectives 8(2): 133-150.
Malone, T. W. and K. Crowston (1994). "The Interdisciplinary Study of Coordination." ACM Computing Surveys 26(1): 87-119.
Malone, T. W., K. Crowston, et al. (1999). "Tools for Inventing Organizations: Toward a Handbook of Organizational Processes." Management Science 45(3): 425-443.
Malone, T. W., J. Yates, et al. (1987). "Electronic Markets and Electronic Hierarchies." Communications of the ACM 30(6): 484-497.
March, J. G. (1992). "Exploration and Exploitation in Organizational Learning." Organization Science 2(1): 71-87.
March, J. G. and H. A. Simon (1958). Organizations. New York, John Wiley.
March, J. G., L. Sproull, et al. (1991). "Learning from Samples of One or Fewer." Organization Science 2(1).
Marschak, J. and R. Radner (1972). Economic Theory of Teams. New Haven, Conn., Yale University Press.
Mas-Colell, A., M. D. Whinston, et al. (1995). Microeconomic Theory. Oxford, England, Oxford University Press.
Milgrom, P. and J. Roberts (1990). "The Economics of Modern Manufacturing." American Economic Review 80(3): 511-528.
Milgrom, P. and J. Roberts (1992). Economics, Organization and Management, Prentice-Hall.
Miller, G. A. (1956). "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information." Psychological Review 63(2): 81-97.
Mintzberg, H. (1990). "The Manager's Job: Folklore and Fact." Harvard Business Review 68(2): 163-176.
Motwani, J., A. Kumar, et al. (1998). "Business Process Reengineering: A Theoretical Framework and an Integrated Model." International Journal of Operations and Production Management 18(9/10): 964-977.
Nault, B. R. (1998). "Information Technology and Organization Design: Locating Decisions and Information." Management Science 44(10): 1321-1335.
Nelson, R. R. (1981). "Research on Productivity Growth and Productivity Differences: Dead Ends and New Departures." Journal of Economic Literature 19(3): 1029-1064.
Nelson, R. R. and S. G. Winter (1982). An Evolutionary Theory of Economic Change. Cambridge, Mass., Harvard University Press.
Newell, A. and H. A. Simon (1972). Human Problem Solving. Englewood Cliffs, N.J., Prentice-Hall.
Nonaka, I. (1994). "A Dynamic Theory of Organizational Knowledge Creation." Organization Science 5(1): 14-37.
Norman, D. A. (1988). The Design of Everyday Things. New York, Doubleday.
Oliner, S. D. and D. E. Sichel (2000). The Resurgence of Growth in the Late 1990s: Is Information Technology the Story? Washington, DC, Federal Reserve Board.
Olley, G. S. and A. Pakes (1996). "The Dynamics of Productivity in the Telecommunications Equipment Industry." Econometrica 64(6): 1263-1297.
Olson, G. M., T. W. Malone, et al. (2001). Coordination Theory and Collaboration Technology. Mahwah, New Jersey, Lawrence Erlbaum Associates.
Orlikowski, W. J. (1992). Learning from Notes: Organizational Issues in Groupware Implementation. Proceedings of CSCW 92.
Orlikowski, W. J. and J. Yates (1992). "Genres of Organizational Communication: A Structurational Approach to Studying Communications and Media." Academy of Management Review 17(2): 299-326.
Orlikowski, W. J. and J. Yates (1994). "Genre Repertoire: The Structuring of Communicative Practices in Organizations." Administrative Science Quarterly 39: 541-574.
Orr, J. (1996). Talking about Machines: An Ethnography of a Modern Job. Ithaca, New York, IRL Press.
Pakes, A. (1986). "Patents as Options: Some Estimates of the Value of Holding European Patent Stocks." Econometrica 54(4): 755-784.
Polanyi, M. (1966). The Tacit Dimension. New York, Anchor Day Books.
Porter, M. (1980). Competitive Strategy. New York, Free Press.
Posner, M. I. (1986). Empirical Studies of Prototypes. Noun Classes and Categorization. C. Craig. Amsterdam, John Benjamins Publishing Co. 7: 53-61.
Powell, W. W., K. Koput, et al. (1996). "Interorganizational Collaboration and the Locus of Innovation: Networks of Learning in Biotechnology." Administrative Science Quarterly 41(1): 116-145.
Prahalad, C. K. and G. Hamel (1990). "The Core Competence of the Corporation." Harvard Business Review 68(3): 79-91.
Pratt, J. W. (1964). "Risk Aversion in the Small and in the Large." Econometrica 32(1/2): 122-136.
Putnam, R. (1995). "Bowling Alone." Journal of Democracy 6(1): 65-78.
Radner, R. (1992). "Hierarchy: The Economics of Managing." Journal of Economic Literature 30(3): 1382-1415.
Riolo, R. L., M. D. Cohen, et al. (2001). "Evolution of Cooperation Without Reciprocity." Nature 414(22 Nov.): 441-443.
Rivera-Batiz, L. and P. M. Romer (1991). "Economic Integration and Endogenous Growth." Quarterly Journal of Economics 106(2): 531-555.
Roberts, R. M. (1989). Serendipity: Accidental Discoveries in Science. New York, John Wiley and Sons.
Romer, P. M. (1986). "Increasing Returns and Long-Run Growth." Journal of Political Economy 94(5): 1002-1037.
Romer, P. M. (1990). "Endogenous Technological Change." Journal of Political Economy 98(5): S71-S102.
Rothschild, M. and J. E. Stiglitz (1970). "Increasing Risk: I. A Definition." Journal of Economic Theory 2(3): 225-243.
Russell, S. and P. Norvig (1995). Artificial Intelligence: A Modern Approach. Saddle River, NJ, Prentice Hall.
Samuel, A. L. (1963). Some Studies in Machine Learning Using the Game of Checkers. Computers and Thought. E. A. Feigenbaum and J. Feldman. Berkeley, Calif., McGraw-Hill: 71-105.
Saxenian, A. (1994). Regional Advantage: Culture and Competition in Silicon Valley and Route 128. Cambridge, Mass., Harvard University Press.
Shannon, C. E. (1948). "A Mathematical Theory of Communication." The Bell System Technical Journal 27: 379-423, 623-656.
Shapiro, C. and H. R. Varian (1999). Information Rules. Boston, Harvard Business School Press.
Shy, O. (2001). The Economics of Network Industries. Cambridge, England, Cambridge University Press.
Simon, H. A. (1955). "A Behavioral Model of Rational Choice." Quarterly Journal of Economics 69(1): 99-118.
Simon, H. A. (1996). The Sciences of the Artificial. Cambridge, Mass., MIT Press.
Spence, M. (1973). "Job Market Signaling." Quarterly Journal of Economics 87(3): 355-374.
Spender, J. C. (1996). "Making Knowledge the Basis of a Dynamic Theory of the Firm." Strategic Management Journal 17(Winter Special Issue): 46-62.
Starbuck, W. (1992). "Learning by Knowledge-Intensive Firms." Journal of Management Studies 29(6): 713-739.
Staw, B. M., L. E. Sandelands, et al. (1981). "Threat-rigidity Effects in Organizational Behavior: A Multi-level Analysis." Administrative Science Quarterly 26: 501-524.
Sterman, J. D. (2000). Business Dynamics: Systems Thinking and Modeling for a Complex World, Irwin McGraw-Hill.
Stiglitz, J. (2000). "The Contributions of the Economics of Information to Twentieth Century Economics." Quarterly Journal of Economics 115(4): 1441-1478.
Stinchcombe, A. L. (1990). Information and Organizations. Berkeley, Calif., University of California Press.
Sudit, E. (1995). "Productivity Measurement in Industrial Operations." European Journal of Operational Research: 435-453.
Sveiby, K. E. (1997). The New Organizational Wealth. San Francisco, Berrett-Koehler.
Teece, D. J. (1980). "Economies of Scope and the Scope of the Enterprise." Journal of Economic Behavior and Organization 1: 223-247.
Teece, D. J. (1986). "Profiting from Technological Innovation: Implications for Integration, Collaboration, Licensing and Public Policy." Research Policy 15: 285-305.
Thompson, J. D. (1967). Organizations in Action: Social Science Bases of Administrative Theory. New York, McGraw-Hill.
Tirole, J. (1988). The Theory of Industrial Organization. Cambridge, Mass., MIT Press.
Tsoukas, H. (1996). "The Firm as a Distributed Knowledge System: A Constructionist Approach." Strategic Management Journal 17(Winter Special Issue): 1-25.
Tversky, A. and D. Kahneman (1974). "Judgment Under Uncertainty: Heuristics and Biases." Science 185(27 September): 1124-1131.
Tversky, A. and D. Kahneman (1981). "The Framing of Decisions and the Psychology of Choice." Science 211(30 January): 453-458.
Ulrich, K. (1995). "The Role of Product Architecture in the Manufacturing Firm." Research Policy 24: 419-440.
Van Alstyne, M. (1999). A Proposal for Valuing Information and Instrumental Goods. International Conference on Information Systems.
Van Alstyne, M. and E. Brynjolfsson (1995). Communication Networks and the Rise of the Information Elites.
Van Alstyne, M. and E. Brynjolfsson (1996). Electronic Communities: Global Village or Cyberbalkans? International Conference on Information Systems.
Van Alstyne, M. and E. Brynjolfsson (1996). "Internet: Could the Internet Balkanize Science?" Science 274(5292): 1479-1480.
Van Alstyne, M., E. Brynjolfsson, et al. (1995). "Why Not One Big Database? Principles for Data Ownership." Decision Support Systems 15(4): 267-284.
Van Zandt, T. and R. Radner (2001). "Real-time Decentralized Information Processing and Returns to Scale." Economic Theory 17: 545-575.
Von Hippel, E. (1988). Sources of Innovation. Oxford, Oxford University Press.
Wasserman, S. and K. Faust (1994). Social Network Analysis: Methods and Applications. Cambridge, Cambridge University Press.
Watts, D. J. (1999). Small Worlds: The Dynamics of Networks Between Order and Randomness. Princeton, Princeton University Press.
Watts, D. J. and S. H. Strogatz (1998). "Collective Dynamics of 'Small-World' Networks." Nature 393: 440-442.
Weick, K. E. (1979). The Social Psychology of Organizing. New York, McGraw Hill.
Weick, K. E. (1984). "Small Wins: Redefining the Scope of Social Problems." American Psychologist 39(1): 40-49.
Weick, K. E. (1995). Sensemaking in Organizations. Thousand Oaks, Calif., Sage Publications.
Weick, K. E., K. M. Sutcliffe, et al. (Forthcoming). Organizing for High Reliability: The Process of Collective Mindfulness. Research in Organizational Behavior. B. M. Staw and J. E. Dutton. Greenwich, Conn., JAI.
Wenger, E. (1998). Communities of Practice: Learning, Meaning, Identity. Cambridge, England, Cambridge University Press.
Wernerfelt, B. (1984). "A Resource-based View of the Firm." Strategic Management Journal 5: 171-180.
Williamson, O. E. (1975). Markets and Hierarchies: Analysis and Antitrust Implications. New York, Free Press.
Williamson, O. E. (1985). The Economic Institutions of Capitalism. New York, Free Press.
Wilson, E. O. (1998). Consilience. New York, Vintage Books.
Winter, S. G. (1987). Knowledge and Competence as Strategic Assets. The Competitive Challenge. D. J. Teece. Cambridge, Mass., Ballinger: 159-184.
Zander, U. and B. Kogut (1995). "Knowledge and the Speed of Transfer and Imitation of Organizational Capabilities: An Empirical Test." Organization Science 6: 76-92.
Zuboff, S. (1988). In the Age of the Smart Machine, Basic Books.