Beck, Risk, Complex Systems, Technological Societies

Class Notes

June 1, 2007

Yurika’s first Question


Is this true: threats came from natural disasters, and these kinds of threats have been continually reduced since the beginning of industrialization?


Is there a relationship between the size of a natural
disaster and society’s ability to manage it?


In other words, maybe we have an infrastructure of a certain scale that has been adapted to handle emergencies of a certain size (but not other sizes)


In general, can we say that threats have been reduced as society has become more advanced?

What to Think About


We have some social theory about what society is and how
it is/can be organized


Is: fact (the way it is)


Empirically verifiable


Subjective -- no facts are necessarily true


Can be: possibility (doesn’t necessarily have to be)


Should: way it ought to be (comes with a value attached)


In methodology we worry about “is”s becoming
“oughts/should”s;


How can we distinguish between them?


Benefits of society


Control over risks; minimizes risk


Aggregate knowledge (collect it)


Build on the insights/mistakes/lessons of
people who came before us

About Society and Choice


Yurika wrote: For example, people cannot escape from natural disasters, but people have a choice about whether or not to use new technology with its risks.


Is this true (i.e., do we have a choice about the technological systems we exist with/in)?


There must be some minimum criteria for this to be
true:


Money (to invest in/acquire new technology)


Ex: Iran may be developing nuclear power because it can’t afford
to fix its ageing oil technology


Assume that we have the money -- do we have a choice (of technological use)?

Distinguishing between: Natural
disasters and new technologies



Katrina case: part of the problem was that the levees were not correctly designed, or that the water pumps were not sufficient to stop the flood waters


Message: Sometimes there are going to be
natural disasters that humans are not yet
capable of controlling.

Is Beck only concerned with
Natural Disasters?


Famine


Epidemics


BSE


Nuclear accidents


Biotechnology



These are related to planning and human design as well

From the Case Reading

There are 2 main categories of human factors related to
error:


- Design stage

- Operating stage

Conclusion: we could not avoid the accidents completely.

Does that mean that we should accept them as inevitable?

The “Shoganai factor” (“it can’t be helped”) about living in this modern society


Idea of “acceptable risk”

Acceptable Risks


Why are they acceptable?


Economic model: says that we can roughly predict the odds of dangers/risks


We calculate costs, benefits, and probabilities before making decisions (a rough sketch of such a calculation appears after this list)


Because we choose to live in society, we are also choosing (without resisting or thinking about it) to accept a certain level of risk


Because the benefits of being inside the society
outweigh the costs


Preferable to a life outside society
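
The economic model described above is essentially an expected-value calculation. A minimal sketch in Python, with purely hypothetical numbers and function names (an illustration only, not anything from Beck or the case reading):

    def expected_net_benefit(benefit, harm_cost, p_harm):
        # Expected value of adopting a technology:
        # the benefit minus the probability-weighted cost of harm.
        return benefit - p_harm * harm_cost

    # Hypothetical example: a technology yields 100 units of benefit per year,
    # an accident would cost 5000 units, and the estimated yearly accident
    # probability is 1%. Under this model the risk counts as "acceptable"
    # whenever the expected net benefit is positive.
    if expected_net_benefit(benefit=100, harm_cost=5000, p_harm=0.01) > 0:
        print("Risk is 'acceptable' under the economic model")
    else:
        print("Risk is not 'acceptable' under the economic model")

On these made-up numbers the expected net benefit is 100 - 0.01 * 5000 = 50, so the model would call the risk acceptable; the point of the notes is that a calculation of roughly this kind quietly underwrites the idea of “acceptable risk.”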

Technological Risk versus Cultural Risk


There are dangers to the Self (individual)
who has disembedded from one context and
become embedded in another


Here we are mixing in an insight/concept from
Giddens


Dangers to the collective upon re-embedding by that individual


Example: students who return to Myanmar or
China from Japan or the USA

Reflexive Modernity

Beck, Giddens, and Lash (1994):

Risk society is coextensive with reflexive modernity

Think about how, through our reflection and our mental work, we are able to try to confront or offset the risks posed by a fully developed industrial society.

These risks cannot simply be assimilated into the
system; they require conscious attention


Just Distribution of Risks


Beck, Giddens, and Lash argue: rather than the “just distribution of goods and services,” the concern is now with the “just distribution of risks”


What is a “distribution of risk”?


This relates to the “shoganai” idea


We have decided to choose/accept risk


Once we have decided this, we become vulnerable to
risk


Risk gets distributed (placed in certain places, situations, contexts, times); however, not all risk is distributed equally


Example: nuclear power


Is risk really distributed “fairly”?

Example of Nuclear Power in
Miyagi


Why is there a nuclear power plant in Miyagi?


Acceptable because it is located in a place where there are fewer people


And even in Miyagi they selected a place with fewer
people.


Matsushima is . . . a tourist spot because it is one of the three national treasures


Meaning that many people come


Also on the water


Meaning there is a good chance of widespread transmission
of contamination (if it occurs)

How do we decide these
distributions?


Government/policy


Why is this of concern in a democracy?


“Public decision”


Meaning that government leaders will make decisions on behalf of “the people”


Decisions are based on scientists’ (experts’)
opinions



What is the role of “the public”?


The opening of science?

The Public and Expert Systems


Why are systems “expert”?


Because they depend on (are built on) people who possess specialized knowledge that few people have


A good/bad thing? A part of modern society. Can’t
have a modern society without it?


Trade, global exchange, also technical

The Public and Expert Systems


It is presupposed that research will
fundamentally take account of the public’s
questions


However, this is an open question


Mechanically, who has control over the instruments of government (control over these technical processes)?


Also, who possesses the knowledge to match the experts (thereby controlling them)?

Relationship between Citizens
and Government

“Blind citizens” can win back the autonomy of their
own judgment by making the threats publicly
visible and arousing attention in detail.

- Example?: BSE

- Although that was “voting with your pocketbook”

- Example?: Narita

- Citizen protests have not had any effect in stopping the building or operation of the airport

- Thus, citizens may not always have the kind of effect that is assumed by authors who point to their oversight power

Accidents: Some Examples

What kinds of accidents could be caused by human errors?


Design


Things don’t work as they were supposed to


Example: US car industry: “Unsafe at Any Speed” (general consumer safety and oversight)


O-ring on the Space Shuttle


Examples where you use a thing as it was intended to be
used, but it fails


Operation


Things that don’t work when you use them.


You use it as it was intended to be used, but someone or something fails during its use; it is used improperly


Example: in a hospital, the electricity stops, a respirator stops, and a person dies

About Complexity


Some technologies require systems of
experts and control that we are not capable
of managing


Society may not have the ability (the level of development) to handle such systems


An example that we will consider in the
coming weeks: terrorism