Science, Technology and Ethics


UNIT 10: Science, Technology and Ethics

Ethics: A system of moral principles or values.

Principle: A basic truth, law, or assumption.

Value: A principle, standard, or quality considered worthwhile.


Focus of Ethical Dilemmas in S&T

Ethical issues related to science and technology usually focus on the following:

Medicine
Military
Economics


Ethics in Medicine

Biotechnology
  Cloning
  Gene Therapy
  Altering species for enhanced production, e.g., growth hormone

Medicine
  Transplants
  Life-Extending Machines
  Fertility


Ethics and the Military

Weapons of Mass Destruction
Nuclear Testing
Human subjects in military experiments, e.g., chemical warfare, LSD, nuclear testing


Ethics and Economics

Pollution for Profit
Whistle Blowers
Monopolies
Advertising
Buying Influence


Ethical Decision Making in S&T

Generally, ethical issues can be usefully clarified if the following considerations are applied to ethical decision making and judgment:

1. The facts of the matter

2. Affected patients and their interests (all affected parties)

3. Key concepts, criteria, and principles (What is life? What does it mean to kill?)

4. Ethical theories and arguments


Ethical Theories and Arguments

The theories and arguments for ethical behavior in science and technology have developed over many centuries. The sources and foundations of our ethical and moral behavior are usually traced to religion, family, schools, employers, moral leaders, and even ancient philosophers. Generally speaking, our ethical decisions can be associated with one of three models of ethical decision making:

Teleology (Consequence Ethics)
Deontology (Duty Ethics)
Personal Ethics

Teleology (Consequence Ethics)

Teleology (Consequence Ethics): the determination of rightness or wrongness based on consequences.

Utilitarianism: the view that an action or policy is right if and only if it is likely to produce at least as great a surplus of good over evil consequences as any available alternative.

Hedonic Utilitarians: 19th century, Jeremy Bentham; pleasure is the only good and pain the only bad.

Ideal Utilitarians: friendship and beauty are good, and their opposites, alienation and ugliness, are bad.


Deontology (Duty Ethics)

Deontology (Duty Ethics): certain actions are inherently or intrinsically right or wrong; that is, right or wrong regardless of consequences. For example, telling lies or breaking a promise is intrinsically wrong, regardless of the consequences.


Six Classifications of S&T Ethical Conflicts

1. Violation of Established World Orders (Natural or Social Order of Things)

2. Violations of Supposedly Exceptionless Moral Principles

3. Distribution of Science or Technology Related Benefits

4. Infliction of Harm or Exposure to Significant Risks of Harm without Prior Consent

5. Science or Technology Precipitated Value Conflicts

6. Science or Technology-Engendered "Positive Rights"




Violation of Established World Orders (Natural or Social Order of Things)

Some ethical conflicts arise from the fact that scientific or technological breakthroughs make possible actions that some believe violate some "established natural or social order."

Biomedicine
Genetic Engineering
In vitro fertilization
Animal Science: transgenic animals (Beefalo, etc.), Bovine Growth Hormone


Ethical Responses to Violations of Established World Orders (Natural or Social Order of Things)

Teleologists: concern for the safety of humans who consume from an "unnatural alliance."

Deontologists: the "natural order of things" is intrinsically good. Technology is seen as artificial; therefore, its use to change the natural order is bad.

Those in favor of this technology might counter by saying that God created the natural order, humans are part of the natural order, and therefore intervention by humans is natural.

Some opposition to intervention in the "natural" or "social" order is based on "sacredness."

Examples: the Hasidic community of Brooklyn, NY, where birth control is forbidden on the basis of the Torah; the Wahhabi Muslim sect, for whom TV violates the sacred order related to the Koran.


Violations of Supposedly Exceptionless Moral Principles

Ethical issues related to the use, failure to use, or withdrawal of particular scientific or technological procedures that are seen by some as violating one or another important moral principle that is believed to be exceptionless.

Examples:

1. Any course of action sure to result in the destruction of innocent civilian lives in time of war is ethically impermissible (Iraq).

2. Life must always be preserved (Kevorkian).

3. A human being must never be treated merely as a means to an end (harvesting of fetal tissue).


Distribution of Science or Technology Related Benefits

Benefits of developments in science and technology are sometimes allocated in ways that do not seem equitable to one or another social group, particularly so with respect to medical benefits, whether diagnostic tests, surgical procedures, or therapeutic drugs, devices, or services.

The ethical issue often centers on who should receive the benefits and who will not. These are often life-and-death decisions.

Example: transplant criteria are often based on middle-class values.

1. Motivated to save life
2. Understands the benefits
3. Capable of adhering to a strict diet
4. Shows up for post-transplant appointments
5. Post-treatment quality of life
6. Contribution of the treatment candidate to the community


Ethical Responses to Distribution of Science or Technology Related Benefits

Deontologists: medical care is a basic human right; therefore, it is morally unthinkable to deny a person treatment simply because of socio-economic status.

Teleologists: may find the concept of an "absolute right" potentially dangerous; that is, guaranteeing an expensive, exotic treatment to everyone who needs it may preclude many more individuals from getting less expensive, more beneficial, non-life-or-death treatments. Individuals do not have the moral right to draw, without limit, on public or insurance-company funds to have their lives extended, regardless of the quality of the sustained life.


Infliction of Harm or Exposure to Significant Risks of Harm without Prior Consent

Developments in science and technology, while undertaken to benefit one group, may inflict harm or impose significant risk of harm on another group without the latter's prior consent.

Examples: research on animals; production of cross-border and multi-generational pollution; the maintenance of carcinogen-containing workplaces; and the operation of hair-trigger military defense systems.


Responses to Infliction of Harm or Exposure to Significant Risks of Harm without Prior Consent

Teleologists: on cost-benefit grounds, activities that promise future benefits for humans but inflict suffering on animals are ethically permissible and perhaps obligatory. Rationale: since animals cannot consent to anything, they are different in a morally relevant respect from humans.

Deontologists: research using animals is morally wrong because animals are not capable of consenting.

Science or Technology Precipitated Value Conflicts

Scientific or technological advances allow something new to be done that precipitates a value conflict.

Example: preservation of human life versus death with dignity. The critical point is that this conflict would not exist without the technology.

Example: genetic tests showing a predisposition for certain diseases. Should disclosure be made? What actions are appropriate?


Science or Technology-Engendered "Positive Rights"

Irrevocable Entitlements: the right to "life" and "liberty." Privacy is a part of liberty. Technology is often perceived as a threat to rights of privacy.

Other Issues: Ethics in Science and Technology

Public Harm of Aggregation: the accumulation of small transgressions by humans results in an aggregation that has significant consequences.

Example: the aggregate pollution of 400 million automobiles.


Practitioner Problems:

These fall within the province of "professional ethics"; we often rely, perhaps unfairly, on the scientist or technologist to make the "ethical decision."


Other Issues: Ethics in Science and Technology (cont.)

Problems of Execution: Edward Wenk's three kinds of ethical issues faced by engineers:

1. Distributive Justice: Should a project be given approval if a nontrivial degree of risk to health and safety could exist without the consent of those within the impact area?

Examples: a hydroelectric dam in an unstable area. The dilemma often relates to what constitutes "acceptable risk."

1932, U.S. Public Health Service, 432 Black males
1950s, CIA "mind control" experiments
1949-1969, biological warfare, 239 tests

2. Whistle Blowing: cheap, unreliable designs; testing shortcuts; misrepresented results; faulty manufacturing; botched installations, etc.

Example: Morton Thiokol O-rings, 1986 Challenger

3. Consideration of Long-term Effects:

Example: half-life of nuclear waste; designed obsolescence



The Challenge of Contemporary Science and Technology to Traditional Ethical Theory

Developments in contemporary science and technology require revisions in traditional ethical thinking and decision making.

McGinn Proposal

Qualified Neo-Consequentialism: assessments must have the following neo-consequentialist qualities:

1. Focused on harm and well-being: directed at identifying and weighing the importance of consequences likely to influence the harm or well-being of affected patients.

2. Refined: designed to be sensitive to subtle effects.

3. Comprehensive: designed to attend to all harm- and well-being-related effects (social and cultural, as well as economic and physical) on all participants.

4. Discriminating: designed to enable scientific and technological options to be examined on a case-by-case basis.

5. Prudent: embodying a cautious attitude toward safety.