DRAFT VERSION NOT FOR CITATION


Tentative Title: The Assessment of 21st-Century Skills in Community College Career and Technician Education Programs


Running Head: 21st Century Skills


Louise Yarnall, SRI International


Jane Ostrander, De Anza College








Introduction


Community colleges have an extensive history of training technicians (Cohen & Brawer, 2002), but new 21st-century technical fields such as biotechnology, forensics, and environmental science call for new models of technician preparation. Traditionally, career and technical education (CTE) has emphasized hands-on training in basic procedures (Lynch, 2000). The new fields of the 21st century blend hands-on skill with two other advanced elements: the application of advanced concepts from mathematics and science, and coordinated teamwork with professionals from multiple disciplines. To keep pace with these changes, community college educators are transforming CTE programs so students can engage in 21st-century professional activities that require interdisciplinary problem solving, teamwork, and strong communication skills.



This chapter describes how using evidence-centered assessment reflection (EC-AR) with technician educators can help align the learning goals of technician courses with the demands of the new high-technology workforce. This approach to assessment grew out of foundational work in evidence-centered design (ECD) (Messick, 1994; Mislevy, 2007; Mislevy & Riconscente, 2006). Evidence-centered design is an alternative to the measurement models of classical test theory and item response theory. Unlike these models, which examine a test’s technical features, such as item difficulty and the latent traits measured (e.g., verbal skill), after test design, ECD outlines the assessment argument first. The ECD “front-loaded” approach allows assessment designers to capture more complex student performances in a wider range of assessment tasks.


Key Points

The EC-AR process helps instructors clarify the full range of skills involved in competent workplace performance, which includes not only technical skills but also “social-technical” and “social” skills:

- “Social-technical” skills are meta-cognitive in nature and focus on ways to link client needs to technical concepts through problem framing, solution design, and solution implementation.

- “Social” skills capture the communicative activities that help students leverage distributed networks of knowledge (Pea, 1993).

- “Social-technical” and “social” learning goals should supplement, rather than supplant, the traditional CTE classroom focus on hands-on, technical skills. Our research suggests they act as the cognitive equivalent of “bread starter”: they raise the standards and outcomes of any CTE course.








While EC-AR’s theoretical roots, tools, procedures, and summary findings are described in another chapter (cite here), the present chapter will describe details that may be useful to the practitioner. First, we will list the social-technical and social skills that the EC-AR process identified as most important to teach across multiple CTE fields. Then, we will describe the features of the kinds of EC-AR assessments that provide evidence that students are learning these new skills. Drawing from preliminary studies of content and construct validity, the chapter also will describe how inclusion of these new learning goals can support deeper technical learning. Finally, the chapter will list common challenges that CTE instructors confront when integrating both the instruction and assessment of these new skills into their courses.



Practical Applications

Practitioners may use the information in this chapter in various ways. They may want to use the list of social-technical and social skills as a set of standards or guidelines for their own CTE programs. They may want to implement their own version of the EC-AR process in a professional development program, or base an online set of “faculty tips” on this document that faculty can use as needed. They may want to use the list of assessment features to formulate templates that CTE instructors in their institutions may use to create assessments. They may want to use the information from the validity studies in professional development programs to deepen faculty members’ appreciation of the complexity and difficulty of learning to apply technical knowledge to real world problems. They may want to review the challenges section to guide recruitment to and implementation of professional development programs, and to provide faculty with some tips on how to engage students.

Each subsection includes some specific ideas for application in practice.

Background



The EC-AR process unfolded within the context of an instructional materials development project called Scenario-Based Learning (SBL) in Technical Education. This project involved instructors collaborating with industry experts to design curriculum activity modules that engage students in real world problems (http://elc.fhda.edu/). These goal-based scenarios (Schank, 1997) challenge students to collaborate with their team members to solve authentic workplace problems, use the vocabulary of their target profession, and communicate their solutions to stakeholders.

Each workplace problem was presented through a series of emails from a fictional supervisor (the instructor). The online materials contained tips and links to resources that students consulted as needed to solve problems. All problems required students to share their problem-solving approach in various forms of communication, both informally within a team and more formally before the class or an industry representative.









The SBL project was funded by the National Science Foundation’s Advanced Technological Education (ATE) program. This federal program supports technician education for high-growth, high-demand fields in science, technology, engineering, and mathematics (STEM). In the initial phases of this work, the need for a focus on assessment emerged as we experienced challenges similar to those encountered by other innovators who incorporated problem-based learning into their programs (Alverno College, 1977; Woods, 2000; Yarnall, Toyama, Gong, Ayers, & Ostrander, 2007). Both community college instructors and students expressed discomfort with the less structured quality of SBL instruction. Community college technical students liked learning from team members, but demanded clearly defined learning goals, better evidence that they were learning, and instructor feedback. Instructors wanted to know how to manage teams and track individual student progress.


To address these concerns, the second phase of our work focused on assessment. To engage instructors in defining the new learning goals associated with SBL, we used the EC-AR process. This process involved:

- An initial interview that engaged instructors in domain analysis, a process that characterizes the range of knowledge, skills, and dispositions required in a technical field.

- A rough documentation phase called domain modeling, which describes the features of assessments that provide evidence that students have acquired the requisite knowledge, skills, and dispositions.

- A detailed documentation phase in which the conceptual assessment framework (CAF) defines the specific types of prompts, stimuli, performances, and scoring rubrics for assessments measuring the key learning outcomes.

- The final processes of assessment implementation and delivery system development, during which validity testing is performed (e.g., scoring reliability, construct validity, content validity, and instructional sensitivity).

In total, we co-developed 72 final assessment tasks and documented 44 in-class formative assessments for SBL with community college instructors in the domains of computer programming (Python, Ajax), engineering, environmental studies, bioinformatics, and network security.


Theoretical Considerations

Central to the EC-AR process was an understanding that technician educators were trying to teach and assess more than the recall of technical facts and procedures. Instead, they were trying to teach and assess complex behaviors and skills associated with competent application of technical knowledge to real world problems. A conceptual model of the range of skills and their interrelationships that were to be taught and assessed appears in Exhibit 1:








Exhibit 1. Model for Expanding Technical Education Learning Outcomes using the EC-AR Process

SOCIAL-TECHNICAL
- Translating client needs into technical specifications
- Researching technical information to meet client needs
- Justifying or defending technical approach to client

SOCIAL
- Reaching consensus on work team
- Polling work team to determine ideas

TECHNICAL
- Using tools, languages, and principles of domain
- Generating a product that meets specific technical criteria
- Interpreting problem using principles of domain



This model illustrates the progression we saw occurring as we engaged technician educators in the EC-AR process. Most instructors saw their primary responsibility as teaching and testing technical knowledge. In the EC-AR process, the assessment team elicited instructors’ knowledge of the professional activity context, which revealed essential social processes and social-technical processes and products.


A “figure-ground” relationship existed among these different forms of knowledge. Instructors saw the technical knowledge as the figure in the picture, and the social and social-technical knowledge as the background. Part of the goal of the EC-AR process is to put these background components into closer focus so they are not neglected in either teaching or testing.

List of New Skills for CTE Education


In this section, we will review the kinds of skills that technician educators in multiple domains identified as important to performing proficiently in the workplace. The technician educators identified these skills by participating in different steps in the EC-AR process. Each step and its associated findings are presented below.


EC-AR Step 1: During the EC-AR process domain analysis and domain modeling phases, the instructors first described to the researcher the complex behaviors that are the mark of professional competence. These behaviors are listed below. Each complex behavior became a distinct design pattern, which is a document that lists the different attributes of an effective assessment of the professional behavior (see Table 1 for the list).

Table 1. Design Pattern Foci: Professional Behaviors Important in Technician Education

Problem Solving Skills (with technical disciplines of application):
1. Research and analysis: Engineering, Environmental Studies, Bioinformatics, Network Security
2. Framing a problem: Programming, Engineering, Network Security
3. Generating a product or solution: Programming, Engineering, Network Security
4. Using tools*: Bioinformatics
5. Making inferences*: Bioinformatics

Professional Knowledge and Skills (with technical disciplines of application):
1. Collaborating to solve a problem: Programming, Engineering, Network Security
2. Presentation and communication: Programming, Engineering, Environmental Studies, Bioinformatics

* These were developed later and potentially are applicable to other technical fields


The list shows that instructors across multiple technical disciplines shared many common ideas about the core professional behaviors involved in applying technical knowledge in the workplace. Some of the skills they identified resembled phases of critical thinking or problem solving (Facione, 1990; Jonassen, 2000), including preparatory activities, such as research and analysis and framing a problem; direct production activities, such as generating a product and using tools; and interpretive activities, such as making inferences. They also emphasized social activities such as teamwork and presentation. These critical thinking and social process skills are commonly associated with liberal arts and general education courses, but as can be seen, these technical instructors valued such skills as essential too. In the design pattern, the assessment reflection activity of defining the professional behaviors set the stage for instructors to move beyond a narrow focus on technical skills toward broader categories of problem solving and professional skills.


The EC-AR Process: Step-by-Step


EC-AR Step 2: In the remaining steps of the EC-AR process, technician instructors thought about different aspects of the professional behavior, recording those details in the design pattern. Since these professional behaviors are complex, one of the first tasks in the EC-AR process involves identifying the elements that are most central to a quality performance. The EC-AR process engaged technician instructors in identifying the focal knowledge, skills, and abilities (KSAs) and additional KSAs. Focal KSAs comprise the specific principles and procedures that were the desired learning outcomes of the instructors. Additional KSAs represented the prerequisite knowledge and skills that the technician educators expected students to know before learning the focal KSAs.


In the EC-AR process, the activity of generating focal KSAs and additional KSAs became the place where instructors defined not only the relevant technical knowledge but also, more importantly, the specific social skills that students needed to apply that technical knowledge effectively.


To show how this worked, Table 2 compares the focal and additional KSAs for the same problem-solving behavior, framing a problem, in two different domains: programming and engineering.


Table 2. Focal and Additional KSAs for Framing a Problem in Different Domains

Ajax programming

Focal KSAs:
- Skill of identifying and asking appropriate questions to specify user requirements
- Skill of engaging in software design brainstorming by generating examples of possible user interactions with the Web site
- Skill of documenting system requirements using a simplified use case format
- Skill of addressing user needs in specifying system requirements

Additional KSAs:
- Knowledge of possible options for Web site functions
- Knowledge of user interface design
- Skills of communication and writing
- Ability to conduct simple web searches
- Ability to use graphic tools to create mockup of Web site’s user interface design

Engineering

Focal KSAs:
- Skill of identifying and asking appropriate questions to specify the core engineering problem and requirements
- Skill of generating possible solutions using engineering principles
- Skill of addressing engineering problem and requirements in specifying possible solutions

Additional KSAs:
- Knowledge of algebra, trigonometry and geometry
- Knowledge of physics
- Skills of presentation, communication and writing








These examples of KSAs show how competence in a technical field involves an interplay of social skills and technical knowledge. As expected, the fields of engineering and programming differ in fundamental technical principles and tools, but in both cases, technicians need to interview clients to understand the constraints and expectations that inform the technical solution. The additional KSAs illustrate the importance of specific technical knowledge and tools for supporting clear communication and shared reasoning in a variety of social workplace contexts. These tools and knowledge help technicians interact effectively with other technicians and make presentations to non-technical audiences.



In practice, this step is challenging for instructors. Some will want to claim they are teaching everything under the sun; some will want to claim they are teaching only an attitude or vague sense of confidence that defies measurement. The goal of this activity is to focus attention on defining the fundamental skills of applying technical knowledge to real world problems. The hardest part will be disentangling the social elements that are critical to this work and envisioning activities in which students can hone those skills and show evidence that they are learning them.


EC-AR Step 3: When judging a high-quality performance, people often say, “I know it when I see it.” Capturing that intuitive sense of what it looks like to do something well is another key step in the EC-AR process. We asked instructors to describe what it looked like when a student could do the KSAs they wanted them to learn. Their descriptions were recorded under the design pattern attribute “observable behaviors.” To show how this works, examples of how two different instructors described what it looks like when students know how to “frame a problem” in two different domains are presented in Table 3 below.


Table 3. Comparison of Observable Behaviors for Framing a Problem in Different Domains

Observable behaviors of the problem-solving design pattern “framing a problem”:

Ajax programming. The student will:
- Define the user’s primary needs for the Web site
- Define the user’s range of possible uses for the Web site
- Determine Web functionalities that meet each of the user’s needs/uses

Engineering. The student will:
- Focus on a problem by using the principles of engineering
- Define a comprehensive range of possible engineering problems and requirements in a given case
- Comprehensively and thoughtfully rationalize and justify problem framing decisions for a technical audience


As can be seen from Table 3, there are common elements of observable behaviors in the two domains: (1) identifying user/client needs, (2) defining the user needs/problem in applicable technical terms, and (3) making a recommendation for a technical approach.


As in the KSAs, we see here some mention of the technical elements unique to each domain. In Ajax programming, the students translate user needs into “Web functionalities”; in engineering, students translate user needs into “engineering principles.”


We also see, at least in these specific SBL tasks, that the programming instructor underscores the importance of “alignment with user needs” as the primary index for judging the quality of students’ professional performance, while the engineering instructor sees the “comprehensiveness” of the possible applications of engineering principles as the primary performance quality indicator. These slight variations, one toward the social dimension of the client and one toward the technical dimension of the engineering principles, demonstrate how the EC-AR process flexibly incorporates individualized instructor priorities.


In practice, this step of the reflection process should not be too hard. It is sometimes easier to describe what it looks like when someone is applying technical knowledge well to a real world problem than to imagine what it takes to learn that skill. Technical instructors frequently have many examples of observations from their own workplaces or from years observing students in their classrooms.


EC-AR Step 4: After imagining what a skillful performance looks like, the next step is to imagine what “work products” students can produce to show evidence of learning. Table 4 presents the types of work products that instructors used across the domains. The work products are organized by whether they were used primarily in class for formative purposes or on the final test for summative purposes.

Table 4. Work Products by Percentage in Scenario-Based Learning

In-Class Item Types (percentage of total assessments*):

Tool Performance (36%): Technical step-by-step procedural performance that demonstrates proficiency with tool, language, or system

Check-in (29%): Brief questions or “on demand” performances requiring students to demonstrate important skills and knowledge

Tracking Progress (20%): Progress checklist to mark student performance against a pre-established schedule or set of benchmarks

Presentation/Inference (17%): Presentation of project, summary of findings, or formal argument based on data

Final Item Types (percentage of total assessments*):

Short Answer Conceptual (40%): Short-answer written or oral description of the principles or concepts relevant to a problem

Short Answer Professional (27%): Short-answer written or oral description of the strategies or tactics relevant to a social or presentation organization problem

Mapping (5%): Concept map to illustrate understanding of an overall process or the full range of principles relevant to a problem

Computation (5%): Performance of mathematical or statistical procedure in the course of solving a problem

Procedural with or without Tools (25%*): Demonstration of steps in a technical procedure; may involve actual tool or representation of tool

* Some overlap on items



As the table illustrates, most of the work products involve familiar assessment modes, such as short answers, and familiar topics, such as concepts and principles. Yet some of the assessment modes are unfamiliar (e.g., mapping), and so are some of the topics (professional skills). Most of the work products require rubric scoring. They monitor student learning in highly technical areas (tool use procedures, computations, concepts), more social areas (professional skills), and combined social-technical areas that involve higher-order cognitive skills (presentation/inference, mapping, tracking progress).


In practice, this step works best by asking instructors to imagine different ways that students can show what they know. It helps to have instructors work first alone, imagining how any of the different work products above would provide evidence that students have learned the key skills of applying technical knowledge to a real world problem. Have them pick three different assessment types from the list above, perhaps one formative and two summative, and then have them envision what those three different assessments of the same skill would look like. Then have them discuss their ideas with another instructor. This exercise often reveals to instructors fresh ideas for tracking student progress.


EC-AR Step 5: Even the youngest elementary children can recognize the genre of a test question. Likewise, teachers develop a repertoire of different forms of test questions. In the design pattern, these preferred test question genres are recorded under the attribute “characteristic task features.” This is a list of the prompt and stimulus elements that every assessment task must include to elicit the desired student performance. In addition, there are hard and easy test questions. In the design pattern, the elements that can be varied to make test questions harder or easier are documented under the attribute “variable task features.”


In the EC-AR process, we gathered or created prompt and stimulus elements. For the prompts, the instructors and the assessment team collaborated, often iterating two or more times to arrive at an approach that elicited the desired KSAs. In these discussions, we engaged in detailed consideration of the prompts with the instructors’ input, sometimes weighing each word to ensure it elicited higher-level skills rather than simple recall. Sometimes we brainstormed how to focus student attention on social negotiation skills instead of technical matters.


To generate ideas for test item scenarios (stimuli), the assessment team asked instructors to generate some technical knowledge representations. These included a log file, in which network security students would find anomalies and a hacker attack, and a snippet of computer programming code, which students would analyze and complete. In some cases, the assessment team generated relevant knowledge representations, such as the results of a bioinformatics database search or a newspaper article about, or transcript of, a planning commission meeting on an environmental issue.


Ultimately, we settled upon a starting set of stimuli and prompts that elicited the professional behaviors and their component KSAs. An overview of these appears in Table 5.


Table 5. Characteristic Features of Assessment Tasks for Core Technician Education Professional Behaviors

Research and analysis
- Technical Context Stimulus: Brackets a phenomenon for investigation
- Cognitive Operation Requirements: Cues the use of procedures for searching, testing, or inquiring into the phenomenon of interest; cues analysis and interpretation of the results of research or testing

Framing a problem
- Technical Context Stimulus: Contains constraints that students must recognize
- Social or Cognitive Operation Requirements: Requires demonstration of social process for identifying client requirements; cues summarization of problem goals

Generating a product
- Technical Context Stimulus: Engages students in using real world tools, procedures, and knowledge representations
- Cognitive Operation Requirements: Requires students to make recommendations or summarizations of their intended (or enacted) design or solution process

Using tools
- Technical Context Stimulus: Engages students in using real world tools, procedures, and knowledge representations
- Cognitive Operation Requirements: Contains different opportunities to demonstrate proficiency with different procedural elements of the tool; scaffolds conceptual and interpretive knowledge, as it is not the focus of this assessment

Making inferences
- Technical Context Stimulus: Presents student with multiple knowledge representations relevant to real world practice
- Cognitive Operation Requirements: Cues interpretation, weighing evidence toward reaching a conclusion

Teamwork
- Technical Context Stimulus: Engages or presents student with a conflict or disagreement in a team
- Social Operation Requirements: Cues summary of the relevant social negotiation processes to reach a solution

Presentation and communication
- Technical Context Stimulus: Engages student in, or presents student with, a challenge around information organization for a specific technical or non-technical audience
- Cognitive Operation Requirements: Cues student to summarize an approach for organizing information to meet the communication needs of that audience


These characteristic task features represent the essential ingredients of technician educators’ assessments. As can be seen from this table, each assessment links some knowledge representation stimulus from the technical context to a specific prompt to the student to perform a cognitive or social operation.








In addition to describing the characteristic task features, the design patterns also included specific descriptions of how to increase or decrease the difficulty of assessments in the “variable task features” attribute. A brief list indicates the common forms these variable task features took:




- Vary the number of features expected in a computer program (Python, framing a problem)
- Vary the number of GUI library or data storage systems for consideration (Python, generating a product)
- Programming performed by individuals or teams (Python, generating a product)
- Few or many user-defined Web site functions expected of the program (Ajax, framing a problem)
- Vary the Web development tools or functions needed by the fictional user/client (Ajax, generating a product)
- Programming from a worked example or “from scratch” (Ajax, generating a product)
- Use familiar or unfamiliar structural collapse scenario (Engineering, research and analysis)
- Vary the number of people on the team (Engineering, collaborating to solve a problem)
- Vary the degree of feedback provided to the students during design testing (Engineering, generating a product)
- Presentations may be individual or team assignments (Engineering, presentation)

As can be seen from this list, variable task features increase or decrease problem complexity by adding or subtracting tasks, adjusting the emphasis on technical considerations, providing more or less feedback, or permitting team collaboration or individual performance.



In practice, this was the step that seemed to require the most coaching. Instructors generally were too busy to sit down and pore over the details of how to word a single prompt or to debate the merits or limitations of a particular stimulus. This step really goes to the core of crafting a good assessment, and not everyone wants to engage in this craft. Nonetheless, in our work, we found this was the one step that was often quite rich and helpful to instructors; it just required a little kick starting to get moving. To get this step moving, think of what it takes to engage students in the most esoteric or technical part of a class you have taught. In our experience, it helped to show a concrete example of how a poorly worded prompt malfunctioned in pilot testing, whereas another precisely worded prompt worked perfectly. Using contrasting examples of poor prompts and good prompts gets instructors’ attention, particularly if it’s their own prompt that’s not working well. At that point, most community college instructors are usually highly motivated to fix the prompt and the process begins to flow.
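For readers who want a consolidated view of the documentation produced so far, the sketch below (in Python, one of the project’s own instructional domains) models the design pattern attributes described in Steps 1 through 5 as a single record. This is purely illustrative; the project’s actual design patterns were prose documents, and the example values are abridged from Tables 2 and 3.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DesignPattern:
    """Illustrative stand-in for an EC-AR design pattern document."""
    professional_behavior: str               # Step 1: e.g., "Framing a problem"
    focal_ksas: List[str]                    # Step 2: desired learning outcomes
    additional_ksas: List[str]               # Step 2: prerequisite knowledge/skills
    observable_behaviors: List[str]          # Step 3: what proficiency looks like
    work_products: List[str]                 # Step 4: evidence students produce
    characteristic_task_features: List[str]  # Step 5: required prompt/stimulus elements
    variable_task_features: List[str]        # Step 5: dials that adjust difficulty

# Example values abridged from Tables 2 and 3 (Ajax programming):
framing_a_problem = DesignPattern(
    professional_behavior="Framing a problem",
    focal_ksas=["Identify and ask questions to specify user requirements"],
    additional_ksas=["Knowledge of user interface design"],
    observable_behaviors=["Define the user's primary needs for the Web site"],
    work_products=["Short Answer Professional"],
    characteristic_task_features=["Stimulus contains constraints students must recognize"],
    variable_task_features=["Few or many user-defined Web site functions expected"],
)
```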


EC-AR Step 6: In technical education, the details matter. For this reason, when we put together our assessment templates, we did not simply set forth rubrics for scoring; we incorporated details from successful student responses to support scoring for each in-class and final test item. This kind of transparency is particularly important for complex tasks of applying technical knowledge to real world problems. In pilot testing, we found subjective differences in how different scorers rated student work. We found that providing detailed examples of proficient student responses fostered consistency in scoring.


We engaged in two pilot-testing phases. In the first phase, we worked with a starter rubric based on our initial expectations. Based on the results from that pilot study, we used the student responses to anchor the rubrics for the second pilot test. To anchor the revised rubrics, first we reviewed all student responses for all given items. These responses were grouped as high, medium, and low performances. To revise the rubrics, we used student answers from each performance level to adjust the content and skill elements we included in each corresponding level of the rubric. In some cases, even the highest performing students failed to achieve the initial goals set forth by the instructors. We discussed those findings with instructors, who made the final judgment about how to define the high performance expectations.


We established a three-level rubric that runs from “above proficient” to “proficient” to “below proficient.” In general, more elaborated and technically correct responses for both problem solving and professional tasks attained an “above proficient” score. Responses that were basically correct but lacked some elaboration ranked as “proficient.” Incorrect responses ranked as “below proficient.” An example of a scoring rubric can be seen in the assessment template in Appendix A.
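A minimal sketch of the scoring logic just described may help clarify the three levels. The boolean flags are a simplification we introduce here for illustration; actual scoring compared responses against anchored examples of student work, as in the Appendix A template.

```python
def score_response(is_correct: bool, is_elaborated: bool) -> str:
    """Schematic decision rule for the three-level rubric described above.

    Real scoring compared responses against anchored examples of student
    work at each level, not boolean flags.
    """
    if not is_correct:
        return "below proficient"  # incorrect responses
    if is_elaborated:
        return "above proficient"  # elaborated and technically correct
    return "proficient"            # basically correct but lacking elaboration
```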


In practice, we found that instructors frequently resist looking at what other colleagues have suggested as rubrics. They just like to use their own. The problem with this approach is that it lacks consistency and transparency, and that closes the door to continuous improvement of instruction. Many times, we felt that this was “sacred ground” that few instructors felt comfortable opening up to outsiders. Nonetheless, seeing actual student work and the ways other instructors have scored that work is tremendously revealing. For our research team, this was the moment when we could really see what students were being asked to do, and what they were not being asked to do. We strongly support broad sharing of scored student work among faculty members teaching the same content, and we strongly support sharing past examples of student work and rubrics with students before they do their own assignments. We found that some community college CTE instructors expressed strong views that they were helping their students by not assessing their work, particularly the “non-technical parts.” We disagree. In our view, the best thing any instructor can do for a student is to clarify the performance and learning goals in any classroom task. It takes the guesswork and games out of the social contract between an instructor and student. Sharing examples of past student work, presenting scoring rubrics before students do assignments, and even engaging students in constructing and applying scoring rubrics are all essential and powerful tools for helping students internalize common standards for improving their own performance.


Selected Validity Study Findings

After the technician educators participated in the EC-AR process, both instructors and assessment designers created test items. This section of the chapter will describe the content and construct validity of these new final test items. It provides evidence of the value of teaching and testing social-technical skills in technician education classes.

Content Validity


Content validity analysis seeks to answer the question: Does what we’re teaching and what we’re testing reflect valued learning outcomes in the technical field? It answers this question by collecting content experts’ judgments of the assessments on different dimensions and their qualitative comments about each of the assessment items. We engaged at least one educator and one industry professional to review each SBL task and its associated assessments.


The experts reviewed our assessments according to their relevance to the workplace and technician education, their alignment with the professional behaviors and KSAs set forth in the design patterns, and the accuracy and usability of the scoring rubrics. We also asked only the education experts to review the cognitive complexity of the assessments and their utility with students of diverse abilities and backgrounds. Finally, we asked all experts to review samples of student work to determine if they provided evidence of the target professional behaviors and KSAs.


In total, 22 expert reviewers (10 industry reviewers, 12 education reviewers) provided feedback on 9 tasks and 50 assessments (18 in-class assessments, 32 final test assessments). We had them review the posttest items only, not both pretest and posttest, to minimize the time required to complete the review. Both pretest and posttest prompts were structurally similar; they differed only in the specific contextual details of the stimulus. The expert feedback was solicited through an online rating system that presented experts with links to: (1) online scenario-based tasks, (2) downloadable representative samples of the assessment templates, (3) two samples of student in-class work (one high performer, one low performer), and (4) links to two online surveys. One survey posed questions to the experts about the scenario-based tasks and the other, about the assessments. The experts were paid with $25 gift cards for their services. The primary findings of interest for the purposes of this chapter focus on the educator experts’ judgments of the cognitive complexity of the SBL assessments.


Experts generally endorsed the quality of the tasks, assessments, and the design patterns’ professional behaviors and KSAs as relevant to real world work and technician education. Both sets of experts (i.e., industry and educator) also had numerous suggestions for improvement. Overall, the industry reviewers wanted greater complexity in both the SBL tasks and the assessments, to better reflect the difficulty of the workplace. The educational reviewers called for clearer specifications of the learning goals for each task and more succinct presentations of problems in both the scenario-based tasks and the assessments.


We also asked education experts additional questions to judge to what degree the final test items were sufficiently cognitively complex to give students the opportunity to develop higher-level thinking skills and content knowledge. These ratings may be broadly interpreted as feedback on how difficult educators perceived the test items to be. Overall, educators saw most of the problem solving and professional skills items as adequately challenging for students, and they gave higher ratings to those items that measured social-technical and social skills rather than the purely technical.


The educators gave high ratings of cognitive complexity to all problem-solving items in environmental studies and bioinformatics, and to all but one of the engineering items, with scores ranging from greater than 2.5 to 3.0 on the three-point scale. These problem-solving items addressed the professional behaviors of research and analysis, framing a problem, making inferences, and using tools.


The educators gave slightly lower ratings (2.2 to 2.5) on cognitive complexity to the more traditional assessment items that focused on “generating a product” in programming and engineering. These tasks were the most consistent with what technician educators typically used in their final examinations. They required students to produce some operational programming code or to complete some fundamental computations.


The educators also gave high to moderately high ratings on the cognitive complexity of the professional skills items that were more focused on the social skills of communication and teamwork. There were fewer final test items that tested professional skills, as some instructors declined to test these skills in a summative fashion. The experts gave high ratings (2.5 to 3.0) to engineering professional skills items focused on teamwork and presentation, and gave ratings of 2.5 to environmental studies professional skills items focused on presentation.


The content validity results indicate that the external experts, both from education and industry, gave high ratings to the assessments that required application of technical knowledge to real world problems. They also rated the items measuring social-technical and social skills as more cognitively complex than those measuring basic technical knowledge.



In practice, community college instructional leaders may find these results useful for conveying to technician educators the value of teaching “social” and “social-technical” skills, not only technical ones. These particular validity study results indicate that conveying purely technical knowledge represents only a starting point for CTE. According to these results, both technician educators and industry professionals see the task of applying technical knowledge to real world problems as much more cognitively complex than simply showing proficiency in executing a technical procedure.

Construct Validity


In a construct validity study, assessment designers try to find out if item stimuli and prompts are eliciting the target skills and knowledge. Think-aloud interviews provide one method for checking to see if the items are working as intended.

The construct validity analysis was conducted with 26 final test items, engaging an average of four students per item, for a total of 104 think-alouds. This methodology is used to reveal reasoning processes that frequently are unavailable for analysis (Clark, Feldon, van Merrienboer, Yates, & Early, 2008). For each task, we sought to engage two high-performing students and two low-performing students, as nominated by their instructors based on classroom performance. In a think-aloud, a researcher briefly trains each student to verbalize thoughts while completing a task (such as tying a shoe or solving a simple math problem). Then, the researcher records the student solving problems on the test and closes by asking the student a brief set of retrospective questions about task difficulty and quality. The think-aloud sessions are transcribed and then analyzed.


For the purposes of this pilot study, we had a minimum of two raters review each transcript using a coding system to categorize students’ statements in the think-aloud as either providing, or failing to provide, evidence of the desired knowledge and skills. In the code development phase of the work, researchers and domain experts reviewed transcripts and created codes based on the KSAs, rubrics, and possible responses in the assessment templates. In the code application phase, researchers coded the transcripts and then categorized examples of student statements that confirmed and/or disconfirmed the use of the target knowledge or reasoning. In cases where the KSAs were well defined in the design patterns, raters could engage in a traditional inter-rater agreement approach, where the goal is to reach 0.75 (75%) agreement. In cases where the original KSAs were less clearly delineated, raters jointly reviewed the transcripts and reached consensus on how to categorize statements, based on alignment with emerging features of student answers that reflected key understandings or skills.
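For readers unfamiliar with these statistics, the short sketch below computes simple percent agreement, along with Cohen’s kappa as a chance-corrected alternative. The transcript codes are hypothetical; they stand in for judgments that a statement confirmed (C) or disconfirmed (D) the target knowledge or reasoning.

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Proportion of transcript statements the two raters coded identically."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Agreement corrected for chance, based on each rater's code frequencies."""
    n = len(rater_a)
    p_obs = percent_agreement(rater_a, rater_b)
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_chance = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_obs - p_chance) / (1 - p_chance)

# Hypothetical codes for eight statements from one transcript:
a = ["C", "C", "D", "C", "D", "C", "C", "D"]
b = ["C", "C", "D", "D", "D", "C", "C", "D"]
print(percent_agreement(a, b))          # 0.875, above the 0.75 threshold
print(round(cohens_kappa(a, b), 2))     # 0.75 after correcting for chance
```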








This analysis clarified the focal KSAs of the SBL tasks and informed revision of the assessments. In this summary, we will report on the descriptive findings of the construct validity analysis, documenting the nature of the skills measured. Overall, the construct validity study indicated that the prompts functioned fairly well for higher-performing students, who applied social-technical skills in a more coordinated and systematic way than lower-performing students, who often used a “trial and error” approach to technical problem solving (Hmelo, 1998) and who did not smoothly coordinate the social and technical elements.


In practice, community college instructional leaders may use this information to frame the overall EC-AR approach for improving technician education. This framing is easiest to explain if we step back and consider the full scope of the EC-AR process. First, the EC-AR process reveals the instructors’ ideas about how to apply technical knowledge to real world problems. Then it defines the technical, social, and social-technical learning goals that lead to improved skill in solving real world problems. Then it develops assessments for tracking student development in meeting those learning goals. When this system is in place, instructors are likely to see variations in student performance along both the technical and social dimensions similar to those reflected in the construct validity results below.


Specifically, they will see that some students have challenges coordinating the social and technical dimensions. In particular, the lower-performing students may have two types of problems: (1) understanding basic technical procedures and tools; and (2) understanding how to use information from the social context (e.g., client needs, budget, client problems, team member expertise) to guide the application of technical knowledge. The instructional challenge becomes not only to teach technical principles and procedures, but also to teach students how to attend to the different dimensions of client needs that can constrain the solution approach to a technical problem (e.g., budget, time frame), and to teach students how to identify gaps in their own knowledge that need to be addressed by posing questions and seeking out the expertise of team members.



Problem Solving Skills

Framing a problem: Python Programming, Ajax Programming, Network Security

Framing a problem emerged as a social-technical skill that requires technicians to gather information about client needs and then translate those needs into technical specifications. For example, in programming, a client interview leads to specifying use cases and user requirements, which in turn help the programmer to code or select Web widgets to implement desired functions. In a similar vein, a client in a network security interview describes budgetary constraints and a list of mechanical symptoms that require the technician to diagnose the problem and recommend a course of repair or improved network protection.








The think-aloud analysis showed that items designed to measure the skill of framing a problem required the students to construct a partial mental representation of the client’s needs, and then to identify the gaps in their own knowledge that inform further questions to the client.


The assessment items that stimulated this line of social-technical reasoning typically involved a scenario-based stimulus drawn from the real world, and then a prompt designed to focus students on applying the specific skill of asking questions of a client. Here is an example from network security:


[Real World Stimulus:] “A health care agency that sends patient services information to over 100 insurance companies recently had its data corrupted by a hacker attack. This SQL injection attack allowed the hacker to create a database administrator account and corrupt the data, leading to inaccurate billing and statements. Your system’s technical specifications include a cluster of HP blade servers running in a virtualized environment on VMware Infrastructure 3, with a combination of Windows and Linux machines. This consolidation took them from 48 physical servers down to 8 physical hosts. The web servers all run Apache and are designed for high availability. The company does not run an Intrusion Prevention system, but has recently purchased log analysis software to help with security management and forensic investigations. The IT department has a budget for materials and supplies of $10,000 a year and already plans hardware purchases of $20,000.

a) [Prompts:] List 3 additional questions you would need to have answered to understand the level of risk to the enterprise.

b) Describe, given your remaining IT annual budget, the scope of security solution the enterprise can currently afford.”


Evidence of student proficiency in the skill was reflected both in the number and technical relevance of the questions posed by the student. In network security, for example, one high-performing student generated several relevant questions to determine if the intrusion was still ongoing, whether the data had already been restored, and what the value of the lost data was. The student also noted the small size of the available budget, a key constraint on the choice of technical solution. By contrast, lower-performing students tended to ask questions that either focused on information already given in the initial stimulus or that provided little, if any, relevant information for determining the technical choices that would need to be made.


Generating a product: Python Programming, Ajax Programming, Network Security

While some of the items designed to measure the students’ proficiency in “generating a product” were more narrowly technical in nature, some involved the interplay of the social and technical dimensions. One such social-technical item type that was used in multiple domains asked students to link an initial client request to the specification of a technical solution.


In network security, for example, “generating a solution” was operationalized as “the skill of generating a solution that applies just the amount of resources needed, within the constraints of the configuration and budget, and selecting and assembling a configuration of components to adequately address a security problem.”


[Stimulus:] “Management at the medical agency described above has decided to provide additional funding of up to $15,000 to the IT budget to ensure the long-term security of the system. [Prompt:] Given the higher budget, answer the following question: Briefly describe how you would implement the solution with the selected components.”

A list of possible components followed:

a) Which of the following network security components and measures would you recommend?

o Switch to a Linux architecture
o Run MySQL as a database
o Use Perl for the scripting language
o Run an Apache web server
o Purchase a second database and server
o Mirror data to another machine
o Purchase an Intrusion Prevention System
o Outsource your data management and server management to a third party
o Switch to Oracle instead of SQL Server
o Fund a part-time security staff position and hire someone trained in use of open-source network security scanning tools and IT audit tools to assess and monitor the security of the system.


In this and other similar questions, higher-performing students were able to discern technical functions or requirements that met user requirements relatively quickly, whereas lower-performing students failed to make these links. For the network security item, the high-performing students showed evidence of checking off technical components that “belonged together,” such as data mirroring and identifying Linux, and the students also selected components that were within the client’s specified budget. One of these students provided a detailed implementation description. In this performance, we see a smooth coordination of social and technical components. By contrast, one low-performing student instead guessed at the various components that “belonged together,” and did not systematically defend the security budget or provide an implementation description.
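As a toy illustration of the social-technical link this item type targets, the sketch below filters bundles of candidate components against the client’s stated budget, the kind of constraint checking the high-performing students did implicitly. The component prices are invented for illustration; the actual item asked for a prose recommendation, not code.

```python
from itertools import combinations

# Hypothetical prices for a few of the item's candidate components;
# the assessment itself did not assign dollar costs.
COMPONENT_COSTS = {
    "Purchase an Intrusion Prevention System": 8_000,
    "Mirror data to another machine": 4_000,
    "Purchase a second database and server": 9_000,
    "Fund a part-time security staff position": 12_000,
}

BUDGET = 15_000  # the additional funding named in the stimulus

def affordable_bundles(costs, budget):
    """Yield every combination of components whose total cost fits the budget."""
    for r in range(1, len(costs) + 1):
        for bundle in combinations(costs, r):
            total = sum(costs[c] for c in bundle)
            if total <= budget:
                yield bundle, total

for bundle, total in affordable_bundles(COMPONENT_COSTS, BUDGET):
    print(f"${total:,}: " + "; ".join(bundle))
```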








Research and Analysis: Bioinformatics


The assessment items designed to measure the skill of “research and analysis” required students to apply “strategic knowledge of how to plan a research strategy.” This included applying knowledge of the relevant databases and biological concepts to guide research steps and search queries. It also included “knowledge of appropriate bioinformatics representations that aid in analyzing viral characteristics and evolutionary patterns.” The stimulus and prompt were:


[Stimulus:] “The human enzyme thiopurine methyltransferase (TPMT) catalyzes the methylation of thiopurine compounds. Thiopurine compounds have been used successfully in the treatment of various conditions like leukemia or autoimmune disorders. Unfortunately, thiopurine compounds have also been found to have adverse effects for up to 25% of patients and extremely adverse effects for a small percentage of patients.

You have been asked to locate information about the human TPMT gene and the protein encoded by this gene, and evaluate whether it would be reasonable to test for specific forms of the TPMT gene when prescribing drugs for conditions or diseases like leukemia or autoimmune disorders.

[Prompt:] Describe a step-by-step plan for using the National Center for Biotechnology Information (NCBI) databases to determine whether or not knowing a patient’s TPMT genotype would be useful to doctors when they are considering treatments involving thiopurine compounds. Cite and justify specific search methods, databases, and ways of organizing data that you would use.”


One higher-performing student mentioned the search queries of TPMT or thiopurine compounds and described a plan to look at interactions between drugs and conditions in the OMIM browser and to research in PubMed. The other three students mainly repeated the given information in the prompt, saying they would search for TPMT, but they did not outline an overall strategy or line of inquiry to the same extent.


The real test in this item was to see how far the students would move from the initial pointers (NCBI, TPMT) embedded in the item stimulus. Three out of the four students simply repeated the given information. Only one moved beyond it. These think-aloud responses indicate that, at least for the higher-performing student, the item elicited the higher-order, meta-cognitive skills of planning a research strategy using particular databases (OMIM, PubMed) and identifying the correct search queries, but that most students need more support and guidance to generate a plan for a database search.
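To make the high-performing student’s plan concrete, here is a minimal sketch of such a search strategy using Biopython’s Entrez interface to the NCBI databases named in the item (OMIM, PubMed). This is our illustration, not the study’s materials; students in the study worked through the NCBI web interface, and the exact query terms and email address below are assumptions.

```python
from Bio import Entrez  # Biopython's wrapper around the NCBI E-utilities

Entrez.email = "student@example.edu"  # NCBI requires a contact address (placeholder)

# Step 1: find OMIM entries linking TPMT to thiopurine drug response.
handle = Entrez.esearch(db="omim", term="TPMT AND thiopurine")
omim_ids = Entrez.read(handle)["IdList"]
handle.close()

# Step 2: search PubMed for evidence on genotype-guided treatment decisions.
handle = Entrez.esearch(db="pubmed",
                        term="TPMT genotype AND thiopurine AND adverse effects",
                        retmax=10)
pubmed_ids = Entrez.read(handle)["IdList"]
handle.close()

print("OMIM records:", omim_ids)
print("PubMed records:", pubmed_ids)
```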







Professional Skills


Collaborating to Solve a Problem: Network Security

The target professional behavior was operationalized as “the skills of identifying and discussing alternative solutions, building consensus with the team, tapping and sharing team expertise, and distributing and managing team work and work products in a timely way.” We used the following stimulus and prompt:

[Stimulus:] “The health care agency has noted an increase in the use of electronic medical records, and so the company has decided to make some upgrades in its services and functions. You have been asked to work with several technical and medical billing specialists to prioritize what changes will be made in the data sent to the insurance companies. While you are most concerned with security, other team members are most concerned with speed of the data feed, as well as the accuracy of the data feed. You have one week as a team to set your priorities, though your first two meetings have not yet yielded any agreement. [Prompt:] Describe 3-5 steps you would take as the team leader to arrive at a list of priorities agreed upon by the team.”


Only one higher-performing student demonstrated these skills, by asking team members to list advantages and disadvantages of each priority and creating a priority list that was “achievable.” The other three students instead focused on the technical details only. These results show that students had problems switching out of the technical mode into the more purely social mode.


The findings from the construct validity tests show, overall, that the assessments developed elicited coordinated responses interweaving social and technical elements of solving a real world problem. The results also show that the ability to coordinate the social and technical was not evident in the students consistently, which suggests that students need more opportunity to hone this skill.


As mentioned earlier, these results indicate the kinds of instructional challenges that technician educators will encounter when they shift from a narrow technical focus in instruction to one that teaches both social and social-technical skills for applying technical knowledge to real world problems. As can be seen, instructors may find they will need to support students in a range of skills. The example about bioinformatics research and analysis skills indicates that instructors may need to help students persist in tasks beyond what they may typically expect from school assignments. The examples indicate that students may need help knowing when to switch between a technical focus and a social focus, and how to use social input to guide the technical solution to a problem.








Challenges to the EC-AR Process

Engaging technician educators in this process was not easy. In this section, we will briefly summarize the range of challenges that emerged and describe how we addressed them.


Although the KSAs would seem to be the easiest to define, they were actually the hardest for instructors. For example, one instructor claimed her scenario-based task was intended to teach students a long laundry list of content knowledge, skills, and dispositions, and she repeatedly resisted efforts to narrow down the list to a more realistic and measurable set of learning goals. At the other end of the spectrum, one instructor claimed the scenario-based task was primarily a confidence-building exercise that had little to do with content knowledge. She too resisted efforts to measure student learning. In between were several instructors who could list the content knowledge and social skills of importance, but found it difficult to determine how to assess them in a coordinated way.


The two instructors who insisted that they were teaching either "everything" or "nothing" were essentially resisting the process of assessment design. Upon further discussion, it emerged that both were interested not so much in measuring the impact of workforce training as in maintaining student engagement in school. They saw assessment as an element that threatened student engagement. In our view, while poor assessment design can threaten student engagement, quality assessment should have the opposite effect. These concerns about student engagement are legitimate, but addressing them required more specialized time and focus than our study permitted. In the future, instructors with these concerns might be encouraged to find ways to engage students in developing the learning goals and rubrics by which they are judged. Establishing student buy-in can make the assessment process more engaging.


Most instructors welcomed EC-AR as an opportunity to "unpack" what they were teaching and to understand better what they needed to do to help their students perform competently in the workplace. Still, it is challenging to extract and organize all this information. This is because technicians master their fields not just by gaining discrete technical knowledge, but by playing progressively more central roles in the core social, cognitive, and technical aspects of the work (Lave & Wenger, 1991). Such a background does little to prepare technician educators for the analytic processes of curriculum and instructional design, nor does it prepare them to develop what Shulman (1987) has called pedagogical content knowledge: not just the technical knowledge, but the knowledge of what students find difficult to learn, why, and how to intervene to support their learning. Defining KSAs is hard precisely because it represents the first step toward breaking down the different components of technician educators' deeply integrated knowledge base.


To improve collaboration with these instructors, our team developed a KSA discussion process over time, taking cues from instructors about what supports they needed. Past cognitive research indicates that experts find it difficult to describe all the steps in any procedure, partly because their expertise permits them to "chunk" discrete steps into automatic "super procedures." To avoid the instructors' "blind spots," we added some additional questions to the EC-AR interview to elicit from the instructors more details about what students were learning how to do. For example, we asked them to describe each step of a technical process in detail so we could probe deeper into areas where they took "shortcuts," and sometimes we acted as naïve students who did not understand how the instructors got from the proverbial point A to point B.


Another challenge emerged when some of the instructors said they preferred to avoid assessing teamwork and communication skills. A frequent comment was: "I don't feel qualified to grade something soft like communication skills. I'm a technical instructor." In our view, technician educators need to embrace social and social-technical assessments of all forms. It helps to use rubrics that clarify the elements of a good social performance in consistent ways. While the formative approach to teamwork and presentation works well as an informal instructional technique, it also can become a signal to students that the social aspects of the technical work are not central to the course, but "extra." In a high-stakes education context, where students are making decisions about where to invest their effort, too often what doesn't get assessed doesn't get learned.


Another difficulty surfaced in examining student work: too often, students provided relatively brief responses. It appeared that the assessments elicited the desired skills, but not in a sufficiently elaborated manner. We might blame these sparse responses on the pressures of the test-taking context, except that a similar lack of elaboration was also observed in some of the students' in-class assignments. It may be that students need explicit prompting, in both in-class and final test assessments, to provide more detailed descriptions of their reasoning. We also recommend more exploration of alternatives to writing, such as spoken and annotated hands-on performances.


In our work one-on-one and in professional development groups, we have found the EC-AR process to be one that gets faculty talking in thoughtful ways about how they teach. In our future work, we plan to engage faculty in action research applying the ideas of EC-AR in their own practice. We also plan to build an online tool that supports a do-it-yourself version of EC-AR.


The Promise and Challenge of EC-AR in Technician Education

The EC-AR process draws deeply on the instructors' technical knowledge and experience to inform new learning goals. Using EC-AR, technician educators successfully characterized the important learning goals for 21st-century workforce programs as comprising a range of skills: technical, social, and social-technical. In reviewing the results of our work, we can see the outlines of Vygotsky's theory of activity (Wertsch, 1985). In this theory, cognition is not in the head, but distributed across the tools, language, social systems, and knowledge representations of the everyday world. The EC-AR process helped elicit, document, and assess what technician educators see as the primary forms of evidence that their students can coordinate the technical tools, discourses, social norms, and representational systems of each technical field.


The validity studies confirmed the value of the EC-AR process as a basis for developing aligned and sensitive measurements of student learning in a complex technician education setting.


- The content validity study demonstrated that domain experts from education and industry endorsed the quality and difficulty of the items overall. Industry experts recommended greater difficulty and complexity, while education experts sought clearer definitions of learning outcomes and strong usability of the scoring rubrics.


- The construct validity study confirmed that the characteristic task prompts and stimuli effectively elicited the meta-cognitive skills of planning and constraint monitoring. However, the study also indicated that additional direct guidance or prompting is required to support students in imagining the dynamic processes of social and real-world problem-solving contexts. Only the highest-performing students spontaneously engaged in such imaginative reasoning without additional prompting.

The EC-AR process shows that technician instructors see the importance of going beyond teaching and testing purely technical skills to incorporate the social and social-technical learning goals needed to keep pace with change in the technician workforce. Our research also indicates that the EC-AR process can serve as a first step toward broadening two-year technical educators' assessment repertoire, and as a fundamental part of transforming technician education to meet the demands of the 21st-century workplace.




Acknowledgements

The authors acknowledge the invaluable contributions of the following researchers in all phases of this project: SRI colleagues Geneva Haertel, Ruchi Bhanot, Bowyee Gong, Ann House, Ken Rafanan, Thomas Gaffney, Britte Cheng, Tina Stanford, Jasmine Lopez, Nathan Dwyer, Yesica Lopez, and Luisana Sahagun. The authors also acknowledge the cooperation of the community college instructors, four-year college instructors, and subject matter experts who assisted in this work: Elaine Haight, Sandra Porter, Richard Swart, Ralph Lantz, Tessa Durham Brooks, James "Jake" Jacob, Kristin Sullivan, and Lianne Wong. In addition, the authors thank the many instructors who participated in our workshops, both online and at conferences. Finally, the authors extend appreciation to the industry and education experts who assisted in this work: Jas Sandhu, Claudia De Silva, Susan Hiestand, Debra F. McLaughlin, Joe Leizerowicz, Jill Mesonas, Warren Hioki, Mike Ogle, Jorge Rodriguez, Parthav Jailwala, Bruce Van Dyke, Elizabeth Murray, Konstantinos Belonis, Robert G. Pergolizzi, Mulchand S. Rathod, and Erin Sanders-Lorenz. This material is based on work supported by the National Science Foundation under DUE grant number 0603297. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of NSF.

References

Alverno College (1977). Faculty handbook on learning and assessment. Milwaukee, WI: Alverno College.

Clark, R. E., Feldon, D., van Merrienboer, J. J. G., Yates, K., & Early, S. (2008). Cognitive task analysis. In J. M. Spector, M. D. Merrill, J. J. G. van Merrienboer, & M. P. Driscoll (Eds.), Handbook of research on educational communications and technology (3rd ed.). Mahwah, NJ: Lawrence Erlbaum Associates.

Cohen, A. M., & Brawer, F. B. (2002). The American community college (4th ed.). San Francisco, CA: Jossey-Bass.

Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Millbrae, CA: California Academic Press.

Hmelo, C. E. (1998). Problem-based learning: Effects on the early acquisition of cognitive skill in medicine. Journal of the Learning Sciences, 7(2), 173-208.

Jonassen, D. (2000). Toward a design theory of problem solving. Educational Technology Research and Development, 48(4), 63-85.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge: Cambridge University Press.

Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher, 23(2), 13-23.

Mislevy, R. (2007). Validity by design. Educational Researcher, 36(8), 463-469.

Mislevy, R., & Riconscente, M. M. (2006). Evidence-centered assessment design: Layers, structures, and terminology. In S. Downing & T. Haladyna (Eds.), Handbook of test development (pp. 61-90). Mahwah, NJ: Erlbaum.

Pea, R. D. (1993). Practices of distributed intelligence and designs for education. In G. Salomon (Ed.), Distributed cognitions (pp. 47-87). New York: Cambridge University Press.

Schank, R. (1997). Virtual learning: A revolutionary approach to building a highly skilled workforce. New York: McGraw-Hill.







Shulman, L. S. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57(1), 1-22.

Wertsch, J. V. (1985). Vygotsky and the social formation of mind. Cambridge, MA: Harvard University Press.

Woods, D. R. (2000). Helping your students gain the most from PBL. Paper presented at the 2nd Asia-Pacific Conference on PBL, Singapore.

Yarnall, L., Toyama, Y., Gong, B., Ayers, C., & Ostrander, J. (2007). Adapting scenario-based curriculum materials to community college technical courses. Community College Journal of Research and Practice, 31(7), 583-601.









Appendix A.

Ajax Programming Task 1: Bicycle Club
Final Test Items


Question #1: TD 2

A small group of outdoors enthusiasts, the Everest Hiking and Rock Climbing Club, wants you to help design an interactive Web site for its members. The users want to have access to a variety of functions, including the ability to follow local and international hiking and rock climbing expeditions, find local hiking routes, look up wilderness first aid information, and find links to other local outdoor clubs.

Generate 3 questions for the Everest Hiking and Rock Climbing Club client so you can understand how the Web site will be used and what the end user needs are. Your answer will be scored for how well you anticipate the end user's needs, and how well you articulate your questions to a non-technical person, keeping in mind that the client is not an engineer.

Possible Responses

- Who are the different users of this system and what do you want them to accomplish with it?
- What kinds of inputs do you want users to enter into the system?
- What criteria will users want to search on?
- What outputs do you want to be visible to the users?
- How sophisticated are your users?
- Is there an existing website that is something like what you want?
- How are these users getting this information now?

Scoring Key

Above proficient (Score: 2) = formed at least 3 questions whose answers would identify and articulate what the end users or client want in the product, showed attention to primary technical concerns as they affect the end user, and articulated the questions clearly.

Proficient (Score: 1) = formed at least 1 good question whose answer would address an end user or client need, showed some attention to primary technical concerns as they affect the end user or client, and articulated the questions somewhat clearly.

Below proficient (Score: 0) = end user or client needs are not included in or addressed by the questions, attention only to purely technical concerns, and/or the questions are unclear.

Knowledge and Skills Assessed: Skill of identifying and asking appropriate questions to specify user requirements. This skill is part of "Framing a problem and identifying design constraints."

Suggested Form and Procedure: Paper-and-pencil test administered by the instructor.