
Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.

Principles for Quality Research and Quality Evidence

Ted Kreifels, Ph.D.

Portraits of a Scholar, from the 17th Century…
Artist: Ferdinand Bol
Artist: Rembrandt Harmenszoon van Rijn
Artist: Domenico Feti

…to today
Artist: JKoshi’s photostream, flickr

Overview

Importance of good research
Traits of quality research
Standards and methods used to assess quality research and quality evidence
BAD research practices
Common causes of bias in data
Methodological “potholes”
How to trust information
Errors in research

Introduction


Every businessman, scientist, engineer, technician, clinician, and manufacturer investigates, develops, or reveals useful knowledge (research)

We each play important roles:
Scientists, engineers, and analysts (create information)
Librarians (manage information)
Decision-makers (apply information)
Jurists (judge information)
Journalists (disseminate information)
Other examples?


Our Motivation

We (typically) have a sincere desire and an interest in determining what is TRUE based on the information and evidence we have available


Motivation (continued)

Good research empowers us to reach our own conclusions

Bad (distorted) research
Starts with a conclusion
Presents only facts, usually taken out of context, that support the author’s initial conclusion

Bad research should not be confused with propaganda
Propaganda is information that is intended to persuade and is sometimes misrepresented as objective research

Bad research should not be confused with “bull****”
Bull**** is a deliberate, manipulative misrepresentation and steers one away from the truth

Bad research causes real harm and deserves strong censure


Research versus Evidence

Quality Research and Quality Evidence are related, but separate topics

Quality Research pertains to the scientific process

Quality Evidence is the sum collection of research data, and pertains to the judgment regarding the strength and confidence one has in the findings emanating from the scientific process



Research produces Evidence

Quality research is a precursor to quality evidence

The following factors influence the type and quality of evidence produced:
Design
Questions
Methods
Coherence and consistency of findings



Quality Matters!

If scientific research lacks credibility, it’s difficult to make confident, concrete assertions or predictions

Confidence comes from the robustness of the research and the analysis done to synthesize results


Traits of Quality Research

Quality research exhibits these traits:

Credibility: objectivity, internal validity
Transferability: external validity
Dependability: construct validity, reliability
Confirmability: objectivity, honest/thorough reporting

Traits of Quality Research (continued)

Objectivity: Have I introduced any bias in the manner I collect or think about my data?

Internal Validity: Can changes in the outcome be attributed to alternative explanations that were not explored in the study?

External Validity: Do findings apply to participants/specimens whose place, times, and circumstances differ from those of other study participants/specimens?

Construct Validity: Does the research adequately measure key concepts?

Reliability: Have we collected the data in a consistent manner?

Honest and Thorough Reporting: “…the truth, the whole truth, and nothing but the truth…?”

Standards Used to Assess Quality of Scientifically-Based Research

Pose a significant, important, well-defined question that can be investigated empirically and that contributes to the knowledge base

Offer a description of the context and existing information about an issue

Apply methods that best address the question of interest

Test questions that are linked to relevant theory and consider various perspectives


Standards Used to Assess Quality of Scientifically-Based Research (continued)

Ensure an independent, balanced, and objective approach to the research, with clear inferential reasoning supported by complete coverage of the relevant literature

Use appropriate and reliable conceptualization and measurement of variables

Provide sufficient description of the samples and any comparison groups

Ensure the study design, methods, and procedures are transparent and provide the necessary information to reproduce or replicate the study


Standards Used to Assess Quality of Scientifically-Based Research (continued)

Present evidence, with data and analysis, in a format that others can reproduce or replicate

Use adequate references, including original sources, alternative perspectives, and criticism

Adhere to quality standards for reporting (i.e., clear, cogent, complete)

Submit research to a peer-review process


Standards Used to Assess Quality of Scientifically-Based Research (continued)

Evaluate alternative explanations for any findings; discuss critical assumptions, contrary findings, and alternative interpretations

Assess the possible impact of systematic bias

Use caution in reaching conclusions and implications

The more one aligns to these standards, the higher the quality

Following only a few of these principles is insufficient to assert quality

Publishing

Publishing is an important benchmark, but the quality of research should not be judged solely by whether (or not) it is published in leading journals

Using bibliometric analysis (citation by other authors) as a measure of quality is also faulty

Not all research that is published in journals or cited by others is accurate, reliable, valid, free of bias, or non-fraudulent

Bibliometric analysis is primarily a measure of quantity and can be artificially influenced by journals with high acceptance rates


Assessing Quality Research

In industry, one of the most respected means of assessing quality is to establish consensus among subject matter experts and through systematic review

The same is true in academia

Strategies for reaching consensus in academia include position statements, conferences, the peer-review process, and systematic review

What other differences (or similarities) exist between industry and academia?



Assessing Quality Research (continued)

Another form of reaching consensus is using standardized reporting techniques
Report essential information regarding samples, statistics, randomization, and analysis
Publish detailed technical standards in relevant professional societies

What other techniques help us reach consensus?

Assessing Quality Research (continued)

Sandia National Laboratories exhibits traits of basic research, advanced development, industrial, and manufacturing environments

We use a “layered defense,” or layered strategy for defect prevention

Bottom-Up meets Top-Down in the middle (via reviews, gates, etc.)

Triple-A Teamwork: Assurance, Acceptance, Assessment

We do our best work when we work together to establish consensus during each step to achieve quality


Bad Research Practices

Defining issues in ideological terms
i.e., using exaggerated or extreme perspectives to characterize a debate

Ignoring/suppressing alternative perspectives or contrary evidence

Insulting/ridiculing others with differing views
Totally unacceptable…reflects poorly on oneself and one’s organization


Bad Research Practices (continued)

Designing research questions to reach particular conclusions

Using faulty logic to reach conclusions

Using biased data and analysis methods

Ignoring limitations of analysis and exaggerating implications of results


Bad Research Practices (continued)

Using unqualified researchers not familiar with specialized issues

Not presenting details of key data and analysis for review by others

Citing special interest groups or popular media rather than peer-reviewed professional and academic organizations


Bad Research Practices (continued)

And, the MOST COMMON mistake:
Assuming association (events that occur together) proves causation (one event causes another)

Have I missed anything?
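To make the association-versus-causation trap concrete, here is a minimal Python sketch using invented data: two variables that never influence each other still correlate strongly because both depend on a hidden confounder (here, temperature).

```python
import random

random.seed(42)
temperature = [random.uniform(0, 35) for _ in range(500)]           # hidden confounder
ice_cream   = [2.0 * t + random.gauss(0, 5) for t in temperature]   # driven by temperature
drownings   = [0.3 * t + random.gauss(0, 2) for t in temperature]   # also driven by temperature

def correlation(x, y):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Strong association, yet neither variable causes the other:
print(f"corr(ice_cream, drownings) = {correlation(ice_cream, drownings):.2f}")
```

The correlation comes out near 0.9, yet banning ice cream would not prevent a single drowning: the association is real, the causation is not.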

Example of a Methodological Pothole
Reference Units

Conclusion A: As measured per capita, various safety efforts have FAILED

Conclusion B: Conditions require more people to drive further, yet vehicle handling and safety have improved, so people feel safer while increasing risk (driving faster, leaving less distance between cars, etc.); as measured per vehicle-mile, various safety strategies (e.g., better roads, vehicles, laws) have PASSED

There is no single right or wrong reference unit; different reference units reflect different perspectives and may affect analytical results

OBSERVATIONS

Traffic fatality trends over four decades:
When measured per capita, they show little decline
When measured per vehicle-mile, fatality rates declined significantly
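Since both trends come from the same underlying counts, a quick computation makes the point concrete. Below is a minimal Python sketch with entirely hypothetical figures (the fatality counts, populations, and vehicle-miles are invented for illustration) showing how the choice of denominator changes the apparent size of the improvement.

```python
# Hypothetical data: fatalities, population (millions), vehicle-miles (billions).
periods = {
    "1970s": (52_000, 205, 1_100),
    "2000s": (42_000, 300, 3_000),
}

for period, (deaths, pop_millions, vmt_billions) in periods.items():
    per_capita = deaths / (pop_millions * 1e6) * 1e5   # per 100,000 people
    per_mile = deaths / (vmt_billions * 1e9) * 1e8     # per 100 million vehicle-miles
    print(f"{period}: {per_capita:.1f} per 100k people, "
          f"{per_mile:.2f} per 100M vehicle-miles")

# With these invented numbers the per-capita rate falls by roughly half,
# while the per-vehicle-mile rate falls by roughly 70%: the same counts
# support different conclusions depending on the denominator chosen.
```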


EXERCISE
Same Reference Units, Different Perspectives

What are your CONCLUSIONS?
What further QUESTIONS would you ask?

Different quality researchers reflect different perspectives, knowledge, and experience

OBSERVATIONS

Alcohol-impaired driving fatalities have decreased over 5 years throughout the United States

Alcohol-impaired driving fatalities have sharply increased in Kansas

Both are measured per vehicle-mile

8 (of 60*) Methodological Potholes

Range restriction effect
Problem: Failure to vary independent variables over a sufficient range, so effects look small.
Remedy: Decide what range of a variable or what effect size is of interest. Run a pilot study.

Ceiling effect
Problem: The task is so easy that the experimental manipulation shows little/no effect.
Remedy: Make the task more difficult. Run a pilot study.

Floor effect
Problem: The task is so difficult that the experimental manipulation shows little/no effect.
Remedy: Make the task easier. Run a pilot study.

Sampling bias
Problem: Any confound that causes the sample to be unrepresentative of the pertinent population.
Remedy: Use random sampling. If sub-groups are identifiable, use a stratified random sample. Avoid “convenience” or haphazard sampling.

History effect
Problem: Any change between a pretest measure and a posttest measure not attributable to the experimental factors.
Remedy: Isolate subjects from external information. Use post-experiment debriefing to identify possible confounds.

Reactivity problem
Problem: The act of measuring something changes the measurement itself.
Remedy: Use clandestine measurement methods.

Order effect
Problem: In a repeated-measures design, the effect that the order of introducing treatments has on the dependent variable.
Remedy: Randomize or counterbalance treatment order. Use a between-subjects design.

Hypocrisy
Problem: Holding others to a higher methodological standard than oneself.
Remedy: Hold yourself to higher standards than others. Apply self-criticism. Follow your own advice.

* Sixty Methodological Potholes, David Huron, Ohio State University, 2000
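The “sampling bias” remedy lends itself to a short illustration. The following Python sketch, using a hypothetical population split into identifiable sub-groups (the strata names and sizes are invented), shows proportional stratified random sampling in place of convenience sampling.

```python
import random

random.seed(7)

# Hypothetical population, unevenly split across identifiable sub-groups.
population = (
    [("urban", i) for i in range(7_000)]
    + [("suburban", i) for i in range(2_500)]
    + [("rural", i) for i in range(500)]
)

def stratified_sample(pop, key, n):
    """Draw n units, allocating to each stratum in proportion to its size."""
    strata = {}
    for unit in pop:
        strata.setdefault(key(unit), []).append(unit)
    sample = []
    for members in strata.values():
        share = round(n * len(members) / len(pop))  # proportional allocation
        sample.extend(random.sample(members, share))
    return sample

sample = stratified_sample(population, key=lambda u: u[0], n=100)
# Each stratum appears in proportion to its population share (70/25/5),
# unlike a convenience sample drawn from whoever is easiest to reach.
print({s: sum(1 for g, _ in sample if g == s) for s in ("urban", "suburban", "rural")})
```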

CARS
How to Trust Information, Especially from Media and the Internet

Credible
Description: Trustworthy source, author’s credentials, evidence of quality control, known or respected authority, organizational support.
Goal: A known, respected authority; a source of trusted evidence.

Accurate
Description: Up to date, factual, detailed, exact, comprehensive; audience and purpose reflect intentions of completeness.
Goal: A source that is correct today (not yesterday); a source that gives the whole truth.

Reasonable
Description: Fair, balanced, objective, reasoned; no conflict of interest; absence of fallacies or slanted tone.
Goal: A source that engages the subject thoughtfully and reasonably, concerned with the truth.

Supported
Description: Listed sources, contact information, available corroboration, claims supported, documentation supplied.
Goal: Provides compelling evidence for the claims made; a source you can triangulate (i.e., find at least two other sources that support it).

* Evaluating Internet Research Sources, Robert Harris, November 2010

CARS Checklist

Credible
Trustworthy source
Quality evidence
Quality control
Known, respected authority
Credentials
Organizational support

Accurate
Current
Factual
Detailed
Exact
Comprehensive
Whole truth

Reasonable
Fair and balanced
Objective
Reasoned and thoughtful
No conflict of interest
No fallacies or slanted tone
Seeks the truth

Supported
Listed sources
Contact information
Corroboration available
Claims supported w/evidence
Documentation supplied
Triangulated sources

EXERCISE
Flour Power

Research and Evidence Challenge: Are a liquid cup and a dry cup the same measure?

I used the internet to research this question and draw a conclusion

What percentage of internet sources answered Yes/No?

The “Ounce”
Background Information

Unit of MASS (or weight)
Abbreviated oz, from the Latin “uncia”
Original Roman measure = 1/12 pound
Troy ounce (still used for precious metals) = Apothecary ounce = 1/12 lb
Several definitions and standards for an “ounce”: Maria Theresa, Spanish, metric
The United States uses the avoirdupois ounce = 1/16 pound

Unit of VOLUME
Abbreviated fl oz, fl. oz., or oz. fl.

Other: fabric weight
Expresses the areal density of a textile fabric in North America, Asia, and the UK
Weight of a given amount of fabric: a square yard, or a yard of a given width


Mass (variant: equivalent in grams)
Avoirdupois: 28.3495231
Troy: 31.1034768
Apothecary: 31.1034768
Maria Theresa: 28.0668
Spanish: 28.75
Dutch metric: 100
Chinese metric: 50

Volume (variant: equivalent in ml)
US: 30
Imperial: 28
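Because the variants above differ by up to a factor of three, a small lookup table makes the ambiguity easy to demonstrate. The following Python sketch simply encodes the gram and millilitre equivalents listed on this slide; the dictionary and function names are mine.

```python
# Gram equivalents of the mass-ounce variants from the table above.
MASS_OUNCE_GRAMS = {
    "avoirdupois": 28.3495231,    # US customary, 1/16 lb
    "troy": 31.1034768,           # still used for precious metals, 1/12 lb
    "apothecary": 31.1034768,
    "maria_theresa": 28.0668,
    "spanish": 28.75,
    "dutch_metric": 100.0,
    "chinese_metric": 50.0,
}
# Millilitre equivalents of the fluid-ounce variants (slide's rounded values).
FLUID_OUNCE_ML = {"us": 30.0, "imperial": 28.0}

def to_grams(ounces: float, variant: str = "avoirdupois") -> float:
    """Convert a mass given in ounces of the named variant to grams."""
    return ounces * MASS_OUNCE_GRAMS[variant]

# The same nominal "16 oz" differs by about 44 g between common variants:
print(to_grams(16, "avoirdupois"))  # ~453.6 g
print(to_grams(16, "troy"))         # ~497.7 g
```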

On Propaganda
Collected from several sources, including dictionaries, Wikipedia, and *

Propaganda is information, ideas (or even rumors), and a form of communication intended to persuade and influence

Propaganda often presents facts selectively to encourage a particular synthesis and an emotional, rather than rational, response

“Propaganda is a deliberate and systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist.” *

Originally and etymologically, the word “propaganda” is neutral
Positive, benign, innocuous examples: public health recommendations, buying war bonds, reporting crimes to the police, getting out the vote
Negative example: Nazi propaganda (used to justify the Holocaust), etc.

Be wary! Propaganda is sometimes misrepresented as objective research!

* Garth Jowett and Victoria O’Donnell, Propaganda and Persuasion, 4th ed., Sage Publications, p. 7

On Bull****
(a real book) by philosopher Harry G. Frankfurt (Princeton University Press, 2005)

Bull**** is a manipulative misrepresentation

Bull**** is WORSE THAN A LIE (more dangerous) because it denies the value of truth

In contrast, lying is concerned with the truth in a perverse fashion: “A liar wants to lead us away from the truth.”
Truth tellers (researchers) and liars play opposite sides of the Game
Bull****ters take pride in ignoring the rules of the Game altogether

People sometimes try to justify their bull**** by citing relativism, a philosophy that suggests that objective truth does not exist
“There are no facts, only interpretations” (Nietzsche)

Any issue can and should be viewed from multiple perspectives…but anyone who denies the value of truth and objective analysis is really bull****ting!


Special Acknowledgement

The following section regarding Errors in Research and the workshop case studies were taken from
On Being a Scientist: Responsible Conduct in Research, 2nd Edition
produced by:
- The National Academy of Sciences (NAS)
- The National Academy of Engineering (NAE)
- The Institute of Medicine (IOM)
Printed by the National Academy Press, Washington, D.C., 1995

Errors in Research
1st Category

The “Honest Error”
Usually caught internally through informal and formal peer-review processes
Dealt with internally through evaluations and appointments

Errors in Research
2nd Category

Ethical transgressions:
Gross negligence
Misallocation of credit
Cover-ups of misconduct
Reprisals against whistleblowers
Malicious allegations
Violations of due process
Sexual and other forms of harassment
Misuse of funds
Tampering with experiments, instrumentation, or results
Violations of government research regulations

May be caught internally or externally in any number of ways
Dealt with by administrative, legal, and professional penalties

Misconduct
3rd Category and the Most Grave Error in Research

Deception:
Making up data (fabrication)
Changing or misreporting data or results (falsification)
Using the ideas or words of another person without giving appropriate credit (plagiarism)

Deception strikes “at the heart” of the values of good research

Deception may cause extreme consequences:
Undermines progress and personal and institutional credibility
Loss of time in related research
Squanders public funds
Threatens future funding and support
Threatens public safety

Deception is dealt with using severe, career-ending penalties

The Selection of Data

Deborah, a third-year graduate student, and Kathleen, a postdoc, have made a series of measurements on a new experimental semiconductor material using an expensive neutron source at a national laboratory. When they get back to their own lab and examine the data, they get the following data points. A newly proposed theory predicts results indicated by the curve.

[Figure: Response versus Beam Intensity, with the theoretical curve; the two solid squares near the abscissa are the suspect runs]

During the measurements at the national lab, Deborah and Kathleen observed that there were power fluctuations they could not control or predict. Furthermore, they discussed their work with another group doing similar experiments, and they knew that the other group had gotten results confirming the theoretical prediction and was writing a manuscript describing their results.

In writing up their own results for publication, Kathleen suggests dropping the two anomalous data points near the abscissa (the solid squares) from the published graph and from a statistical analysis. She proposes that the existence of the data points be mentioned in the paper as possibly due to power fluctuations and being outside the expected standard deviation calculated from the remaining data points. “These two runs,” she argues to Deborah, “were obviously wrong.”

1. How should the data from the two suspected runs be handled?
2. Should the data be included in tests of statistical significance, and why?
3. What other sources of information, in addition to their faculty advisor, can Deborah and Kathleen use to help decide?

The Selection of Data
Prologue

Deborah and Kathleen’s principal obligation, in writing up their results for publication, is to describe what they have done and give the basis for their actions. They must therefore examine how they can meet this obligation within the context of the experiment they have done.

Questions that need to be answered include:

1. If the authors state in the paper that data have been rejected because of problems with the power supply, should the data points still be included in the published chart?
2. Should statistical analyses be done that both include and exclude the questionable data?
3. If conventions within their discipline allow for the use of statistical devices to eliminate outlying data points, how explicit do Deborah and Kathleen need to be in the published paper about the procedures they have followed?
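One concrete answer to question 2 above is to report the analysis both ways. Here is a minimal Python sketch, with invented measurement values standing in for Deborah and Kathleen’s runs, that summarizes the data with and without the two suspect points.

```python
from statistics import mean, stdev

# Invented values: six consistent runs plus the two anomalous ones.
runs = [9.8, 10.1, 9.9, 10.3, 10.0, 10.2, 3.1, 2.7]

def summarize(data, label):
    """Print simple descriptive statistics for one version of the dataset."""
    print(f"{label}: n={len(data)}, mean={mean(data):.2f}, sd={stdev(data):.2f}")

summarize(runs, "all runs")
summarize(runs[:-2], "suspect runs excluded")
# Publishing both summaries, together with the stated reason for doubt
# (the uncontrolled power fluctuations), lets readers judge the exclusion
# for themselves rather than taking "obviously wrong" on faith.
```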

A Conflict of Interest

John, a third-year graduate student, is participating in a department-wide seminar where students, postdocs, and faculty members discuss work in progress. An assistant professor prefaces her comments by saying that the work she is about to discuss is sponsored by both a federal grant and a biotechnology firm for which she consults.

In the course of the talk, John realizes that he has been working on a technique that could make a major contribution to the work being discussed. But his faculty advisor consults for a different, and competing, biotechnology firm.

1. How should John participate in this seminar?
2. What, if anything, should he say to his advisor, and when?
3. What implications does this case raise for the traditional openness and sharing of data, materials, and findings that have characterized modern science?

A Conflict of Interest
Prologue

Science thrives in an atmosphere of open communication. When communication is limited, progress is limited for everyone. John therefore needs to weigh the advantages of keeping quiet (if, in fact, there are any) against the damage that accrues to science if he keeps his suggestions to himself. He might also ask himself how keeping quiet might affect his own life in science.

Questions:

1. Does John want to appear to his advisor and his peers as someone who is less than forthcoming with his ideas?
2. Will he enjoy science as much if he purposefully limits communication with others?

Summary

Why is good research important?
What are the traits of quality research?
Can you provide a few examples of standards and methods used to assess quality research and quality evidence?
What are examples of bad research?
What are a few common causes of bias in data and methodological errors?
How does one trust information from the internet?
What are the three categories of errors in research?

Bibliography

Evaluating Research Quality: Guidelines for Scholarship
Todd Litman, Victoria Transport Policy Institute, November 28, 2010

What are the Standards for Quality Research?
Editor’s Focus, Technical Brief Number 9, National Center for the Dissemination of Disability Research (NCDDR), 2005

Sixty Methodological Potholes
David Huron, Ohio State University, 2000

Evaluating Internet Research Sources
Robert Harris, Virtual Salt, November 22, 2010

On Being a Scientist: Responsible Conduct in Research, 2nd Ed.
Bruce Alberts, President, The National Academy of Sciences (NAS), 1995
Robert White, President, National Academy of Engineering (NAE), 1995
Kenneth Shine, President, Institute of Medicine (IOM), 1995