Proposed NRC Assessment of Doctoral Programs


NRC Assessment of Doctoral Programs

Charlotte Kuh

(ckuh@nas.edu)

Study Goals


Help universities improve their doctoral programs
through benchmarking.


Expand the talent pool through accessible and
relevant information about doctoral programs.


Benefit the nation’s research capacity by
improving the quality of doctoral students.

Background


NRC conducted assessments in 1982, 1993


The “gold standard” of ranking studies


In 2000, formed a committee, chaired by Jeremiah
Ostriker, to study the methodology of assessment


What can be done with modern technology and
improved university data systems?


How can multiple dimensions of doctoral
programs be presented more accurately?


Findings

(November 2003)


An assessment was worth doing


More emphasis and broader coverage needed for
the quantitative measures: a benchmarking study


Present qualitative data more accurately: “rankings
should be presented as ranges of ratings”


Study should be made more useful to students


Analytic uses of data should be stressed


Ongoing updates of quantitative variables should continue after the study is completed.

Committee


Jeremiah Ostriker, Princeton,
chair
(astrophysics)


Virginia Hinshaw, UC-Davis, vice-chair (bioscience)


Elton Aberle, Wisconsin-Madison (agriculture)


Norman Bradburn, Chicago
(statistics)


John Brauman, Stanford
(chemistry)


Jonathan Cole, Columbia
(social sciences)


Eric Kaler, Delaware
(engineering)


Earl Lewis, Emory (history)






Joan Lorden, UNC-Charlotte (bioscience)


Carol Lynch, Colorado
(bioscience)


Robert Nerem, Georgia Tech
(bioengineering)


Suzanne Ortega, Washington
(sociology)


Robert Spinrad, Xerox PARC
(computer science)


Catharine Stimpson, NYU (humanities)


Richard Wheeler, Illinois-Urbana (English)

Panel on Data Collection


Norman Bradburn,
Chicago,
chair


Richard Attiyeh, UC-San Diego


Scott Bass, UMd-Baltimore County


Julie Carpenter-Hubin, Ohio State


Janet L. Greger,
Connecticut


Dianne Horgan, Arizona



Marsha Kelman, Texas


Karen Klomparens,
Michigan State


Bernard Lentz,
Pennsylvania


Harvey Waterman,
Rutgers


Ami Zusman, UC System


Agricultural Fields are Included for the
First Time

Fields and Sub-fields (1)


Agricultural Economics


Animal Sciences


Aquaculture and Fisheries


Domestic Animal Sciences


Wildlife Science


Entomology


Food Science and Engineering


Food Engineering and Processing (sub-fields are not data collection units)


Food Microbiology


Food Chemistry


Food Biotechnology

Agricultural Fields and Sub-fields (2)


Nutrition



Animal and comparative nutrition



Human and Clinical Nutrition



International and Community Nutrition



Molecular, Genetic, and Biochemical Nutrition



Nutritional Epidemiology


Plant Sciences



Agronomy and Crop Sciences



Forestry and Forest Sciences



Horticulture



Plant Pathology



Plant Breeding and Genetics

Emerging Fields:


Biotechnology


Systems Biology


Next steps



The process has been widely consultative. Work began in fall 2005.


July 2006 - May 2007: Fielding questionnaires, follow-up, quality review and validation. Competition for research papers.


December 2007: Database and NRC analytic essay released.


December 2007 - March 2008: Data analyses performed by commissioned researchers


April 2008 - August 2008: Report review and publication


September 2008: Report and website release. Release conference

A New Approach to Assessment of Doctoral
Programs


A unique resource for information about doctoral programs
that will be easily accessible


Comparative data about:


Doctoral education outcomes


Time-to-degree, completion rates


Doctoral education practices


Funding, review of progress, student workload, student
services


Student characteristics


Linkage to research


Citations and publications


Research funding


Research resources







No pure reputational ratings


Why not? Rater knowledge



Fields have become both more
interdisciplinary and more specialized


Why not? The US News effect: users consumed rankings without understanding what was behind them.


What to substitute? Weighted quantitative
measures. Possibly along different dimensions.


How will it work?


Collect data from institutions, doctoral programs, faculty,
and students


Uniform definitions will yield comparable data in a
number of dimensions


Examples of data


Students: demographic characteristics, completion
rates, time to degree


Faculty: interdisciplinary involvement, postdoc
experience, citations and publications


Programs: Funding policies, enrollments, faculty size
and characteristics, research funding of faculty, whether
they track outcomes

Program Measures and a Student Questionnaire


Questions to programs


Faculty names and characteristics


Numbers of students


Student characteristics and financing


Attrition and time to degree


Whether they collect and disseminate outcomes
data

Examples of Indicators


Publications per faculty member


Citations per faculty member


Grant support and distribution


Library resources (separating out electronic media)


Interdisciplinary Centers


Faculty/student ratios
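
As a small illustration, the indicators above are simple ratios derived from collected program counts. The sketch below assumes hypothetical field names and made-up numbers; it is not the NRC questionnaire's actual schema.

from dataclasses import dataclass

@dataclass
class ProgramData:
    faculty: int        # core doctoral faculty
    students: int       # enrolled doctoral students
    publications: int   # faculty publications in the reference period
    citations: int      # citations to those publications

def indicators(p: ProgramData) -> dict:
    # Derive the slide's example indicators from raw program counts.
    return {
        "publications_per_faculty": p.publications / p.faculty,
        "citations_per_faculty": p.citations / p.faculty,
        "faculty_student_ratio": p.faculty / p.students,
    }

# Example: a program with 25 core faculty and 80 students.
print(indicators(ProgramData(faculty=25, students=80, publications=120, citations=900)))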

Some Problems Encountered


What is a faculty member?


3 kinds: Core, Associated, New


Primarily faculty involved in dissertation
research


Faculty can be involved with more than one
doctoral program


Multidisciplinarity can create problems, because faculty must be allocated among programs

Rating Exercise: Implicit


A sample of faculty will be asked to rate a sample
of programs.


Raters will be provided with the names of program faculty and some program data


Ratings will be regressed on other program data


Coefficients will be used with data from each
program to obtain a range of ratings
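
A minimal sketch of this implicit exercise, assuming synthetic data, invented measure names, and a simple bootstrap to turn predictions into ranges; the slide does not specify the NRC's actual model, so treat this only as an illustration of the regress-then-predict idea.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic measures for the sample of rated programs (e.g. publications per
# faculty member, citations per faculty member, fraction of funded students).
X = rng.normal(size=(40, 3))                       # 40 rated programs, 3 measures
true_coef = np.array([0.8, 0.5, 0.3])
ratings = X @ true_coef + rng.normal(scale=0.4, size=40)   # sampled faculty ratings

# Step 1: regress the sampled ratings on the program data (ordinary least squares).
X1 = np.column_stack([np.ones(len(X)), X])         # add an intercept column
coef, *_ = np.linalg.lstsq(X1, ratings, rcond=None)

# Step 2: apply the coefficients to every program in the field; resampling the
# rated programs lets each predicted rating be reported as a range, not a point.
all_programs = rng.normal(size=(200, 3))           # measures for all programs
A1 = np.column_stack([np.ones(len(all_programs)), all_programs])
point = A1 @ coef                                  # point estimate per program

boot_preds = []
for _ in range(500):
    idx = rng.integers(0, len(X), len(X))          # bootstrap resample of rated programs
    b, *_ = np.linalg.lstsq(X1[idx], ratings[idx], rcond=None)
    boot_preds.append(A1 @ b)
boot_preds = np.array(boot_preds)

low, high = np.percentile(boot_preds, [5, 95], axis=0)     # range of ratings per program
print(point[:3], low[:3], high[:3])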

Rating Exercise: Explicit


Faculty will be asked how important program, educational, and faculty characteristics are to program quality.


Weights on variables will be calculated from their answers.


Weights can be applied to program data to produce a range of ratings (sketched below)


Rankings can be along different dimensions


Examples: research productivity, education
effectiveness, interdisciplinarity, resources


Users may access and interpret the data in ways that
depend on their needs.


Database will be updateable
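
For the explicit exercise, a comparable sketch: importance answers become normalized weights, the weights are applied to standardized program data, and resampling respondents expresses each program's score as a range. The measure names, response scale, and weighting rule below are illustrative assumptions, not the NRC's actual instrument, and swapping in different weight sets would yield rankings along different dimensions (research productivity, educational effectiveness, and so on).

import numpy as np

rng = np.random.default_rng(1)

measures = ["pubs_per_faculty", "cites_per_faculty", "completion_rate", "grant_support"]

# Faculty importance answers on a 1 (unimportant) to 5 (essential) scale.
importance = rng.integers(1, 6, size=(100, len(measures)))

def weights_from(responses):
    # Turn importance responses into weights that sum to 1.
    w = responses.mean(axis=0)
    return w / w.sum()

# Standardized (z-scored) program data for every program in the field.
program_data = rng.normal(size=(200, len(measures)))

# Resampling respondents yields a distribution of weights, so each program's
# weighted score comes out as a range of ratings rather than a single number.
scores = []
for _ in range(500):
    idx = rng.integers(0, len(importance), len(importance))
    scores.append(program_data @ weights_from(importance[idx]))
scores = np.array(scores)

low, high = np.percentile(scores, [5, 95], axis=0)
print(dict(zip(measures, weights_from(importance).round(3))))   # one full-sample weight set
print(low[:3], high[:3])                                        # ratings reported as ranges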



Project Product


A database containing data for each program
arrayed by field and university.


Software to permit comparison among user-selected programs


In 2008: papers reporting on analyses conducted with the data

Uses by Universities


High level administrators


Understanding variation across programs


Ability to analyze multiple dimensions of
doctoral program quality


Enabling comparison with programs in peer
institutions


Program administrators, Department chairs


An opportunity to identify areas of
specialization


Encourages competition to improve educational
practice



Uses by prospective students


Students can identify what’s important to them
and create their own rankings


Analytic essay will assist students in using the data


Updating will keep the data current


Better matching of student preferences and
program characteristics may lower attrition rates.

Project Website

http://www7.nationalacademies.org/resdoc/index.html