Chapter 7: Analyzing the Learners

Ferdon, EDTECH503 (SU09)




Analysis of Learners

Understanding the target audience is essential for instruction to be used and usable. Learner-centered environments require careful attention to learner skills, knowledge, beliefs, and attitudes.

Diagnostic teaching - a thorough understanding of the learner's conceptual/cultural knowledge.

Some aspects are easily quantifiable; others are harder to define. Human needs determine content, delivery, and how receptive learners will be. Captive audience or willing volunteer?

Determine learner motivation (intrinsic/extrinsic); this helps make instruction more appealing.


Popular Approaches to Analyzing Learners

Mager - A working document; no need to format or organize it. Describe both differences and similarities. Analyze: age, gender, physical characteristics, education, hobbies, interests, reading ability, organizational membership, prerequisite skills, reason for taking the course, attitude about the course, biases and beliefs, need-gratifiers, and terms and topics to avoid.



Heinich, Molenda, Russell, and Smaldino - 1) General characteristics - help make informed decisions about content and delivery; 2) Entry competencies - prior knowledge necessary for success; and 3) Learning styles - affect the attractiveness and effectiveness of instruction.



Dick, Carey and Carey - Entry behaviors, prior knowledge, attitudes toward content/delivery, academic motivation, educational and ability levels, learning preferences, attitude toward the group providing instruction, and group characteristics. These have a tremendous impact on instruction.


Smith and Ragan - Learner characteristics: 1) stable similarities - same/unchanging; 2) stable differences - different/unchanging (physical and personality traits); 3) changing similarities - human development, processes the same for all people; 4) changing differences - change with time, e.g., knowledge, values, skills, beliefs, motivation.



Morrison, Ross and Kemp - Personal and social characteristics: age/maturity level, motivation and attitude, expectations and aspirations, work experience, talents, dexterity, and working conditions (noise, weather, etc.).


Learner Analysis Procedure

Universal design of education - Accommodate all learners; embed supports in instruction; address diverse needs.

Typically two types of documents: 1) A chart of learner characteristics data (challenged, average, gifted) - a handy reference, but clinical. 2) A fictitious profile of a typical learner - the information is less accessible but offers a more human perspective. Choose one, or do both and then compare.

Evaluation of the Success of a Learner Analysis

Summative - Compare the learner analysis with what learners say about the instruction.

Formative - Member check: compare with someone who designed instruction for a similar population and/or check with a member of the target audience.


Synthesis: A thorough understanding of learner characteristics and needs allows the designer to create instruction that will be effective, efficient, and appealing. When it comes right down to it, there is no one "best" way to conduct a learner analysis and, in some regards, it's just your best guess. The instructional designer must determine potential problems with various approaches to learner analysis and select the one that will work best in that particular situation.

Chapter 8: Developing Instructional Goals and Objectives


Instructional Goals and Objectives

Goal - Broad or abstract; not specific or observable; the intent of instruction. Can be the topic for subordinate instructional objectives.

Instructional objective - Specific; how instruction will affect the learner. Focuses on what the learner will do (instructional systems design). Type depends on the goal (performance, conceptual).


Popular Approaches to Setting Goals and Objectives

Mager - The most common approach; performance objectives: action, conditions, criterion.

Dick, Carey and Carey - Determine goals/objectives either by SME or by a performance technology approach (data gathered during needs analysis). Conduct a subordinate skills analysis to determine performance objectives.



Heinich et al. - "ABCD" approach: Audience (describe learners), Behavior (expectations after instruction), Conditions (setting and circumstances), Degree (acceptable standard). Objectives fall into four domains: cognitive, affective, and psychomotor (more traditional), and interpersonal (people-centered).


Morrison, Ross and Kemp - Terminal objective: the major objective, the overall outcome. Enabling objectives - observable behaviors that indicate achievement of the terminal objective.


Goal Setting/Translation of Goals into Objectives

Goal setting is a critically important step. It is fairly easy with no past practice. Goals for an ID project are often influenced by tradition, politics, and the decision maker's perspective.

Functional analysis system technique (FAST) - One way of working backwards from existing interventions to arrive at an overarching goal. Generate increasingly abstract verb-noun pairs (brush teeth, maintain hygiene, maintain health); this helps determine why various activities are important.


Clear objectives, which are observable or measurable, make it easier for the design team.

The wording of the levels (know, comprehend, apply, analyze, synthesize, evaluate) in Bloom's taxonomy is a good place to start when developing instructional objectives.
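Starting from Bloom's levels in practice means picking an observable verb at the intended level. A small sketch; the verb lists are common examples, not an official inventory.

```python
# Illustrative verb bank keyed to Bloom's taxonomy levels.
bloom_verbs = {
    "knowledge":     ["define", "list", "recall"],
    "comprehension": ["explain", "summarize", "classify"],
    "application":   ["apply", "demonstrate", "solve"],
    "analysis":      ["analyze", "compare", "differentiate"],
    "synthesis":     ["design", "construct", "compose"],
    "evaluation":    ["evaluate", "justify", "critique"],
}

def suggest_verbs(level):
    """Return candidate observable verbs for an objective at a given level."""
    return bloom_verbs[level]

print(suggest_verbs("application"))  # ['apply', 'demonstrate', 'solve']
```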



Gagne's hierarchy of intellectual skills also works well as a foundation for writing objectives. Five types of learning outcomes are divided into three categories: declarative knowledge (verbal information), procedural knowledge (motor skills, intellectual skills, cognitive strategies), and affective knowledge (attitudes).



Evaluation of the Success of Instructional Goal Setting and Objective Specification

Take time to carefully consider how well learner and task analysis has driven the creation of goals and objectives. Continue to compare goals and objectives against each other to make sure they support one another, are realistic, and are clearly articulated.


Synthesis: Learner and task analysis should drive the creation of goals and objectives. Well-stated goals (the outcome of instruction) and objectives (the outcome of an instructional activity) provide focus and drive instruction. Goals are always necessary, but objectives may or may not be, depending on how learners will be evaluated. The levels defined by Bloom and Gagne provide a common language for writing measurable, observable objectives.

Chapter 9: Organizing Instruction

Scope and Sequence of Instruction

Scope and sequence - Scope = the amount of information; places restrictions on the topic. Sequence = the order of presentation; isolates knowledge and relates it to the whole.




K-12: Curriculum - the entire scope of what will be learned, measured in years. Units - large sets of activities, measured in weeks/months. Lesson plans - specific day-to-day activities.

University: Program of study - courses leading to a degree. Syllabus - scope and sequence for a single course. "Sit down" class or distance learning.

Business: competencies or certificates.


Events of Instruction/Continuum of Learning Experience

Posner - Macro: education levels (i.e., elementary, secondary). Micro: relationships among the parts of a lesson; events of instruction - activities within the lesson (introduction, body, conclusion, assessment).

Robert Gagne - Nine events of instruction: gain attention, inform of the objective, recall prior learning, present the stimulus, provide guidance, elicit learner performance, provide feedback, assess performance, and enhance retention and transfer. Create at least one instructional activity for each event.


Smith and Ragan - Two aspects: 1) supplantive (supplied by the instructor) and 2) generative (generated by the student). The instructor presents activities/information; the learner must actively engage.



Continuum - One end: concrete, real-world activities; the other end: contrived, abstract activities.

Dale's cone of experience categorizes the learning continuum. Base - direct/real-world experiences; middle - video, pictures, and photos; top - visual/verbal symbols.



Bruner - Learning experiences are 1) Enactive - things we know but don't have words/images for; synthesize/apply; 2) Iconic - explanation via symbols or representations (middle of Dale's cone); 3) Symbolic - sounds/signs not directly associated with the event (top of Dale's cone).



Instructional Delivery

Classroom teaching - Traditional; groups students by experience or ability; roots in the writings of English educator Joseph Lancaster (1778-1838). Teacher-directed, with teacher feedback.

Programmed instruction - Students work independently; feedback comes from the instructional media. Instruction is the focus, not the delivery method (technology, print, etc.).

Distance education - Synchronous or asynchronous; may be programmed instruction. Often uses a learning management system (LMS), a computer-based home for the class/learners.

Immediate feedback - 1) Student and instructor: revise based on observation/communication; and 2) Student only: programmed instruction; less personalized; suits large groups of learners.


Instruction Activities in Noneducational Situations/Effective Instruction

Provide supports and job aids (e.g., instructions on a fax machine), not learning or training.

Effective instruction will include: 1) Content for extra support and challenge, 2) Learning styles/types, 3) Appropriate activities for instructional events, 4) Job aids - focus on concepts, and 5) A delivery method to suit the individual's or organization's needs.


Synthesis: Needs, task, and learner analysis determine goals and objectives, which are the basis for determining the scope and sequence of learning activities. The choice of delivery method is an important decision and affects the instructional events (variety, concrete to abstract) as well as job aids. Consideration of content, learning style, and the needs of the individual and organization are key.

Ch. 10: Creating Learning Environments, Producing Instructional Activities


Development of Instruction

Recommend specific activities, prescriptions, based on needs, task, and learner analysis as well as goals and objectives. This is the only part of the process that causes learning to occur.


Learning Environment

Learning environment - where instruction occurs. Centered around the learner, knowledge, assessment (feedback/revision), and community (learning from each other); these centers are not mutually exclusive.

Hannafin, Land and Oliver - Two learning environments: 1) Directed: specific objectives, structured activities; develops the same/similar knowledge, skills, and attitudes; passive. 2) Open-ended: problem-based; no specific objectives; divergent thinking; constructivist.



Cruickshank et al. - Direct teaching: instructor at the center; a directed learning environment. Indirect teaching: instructor on the periphery, supporting and guiding; common in open-ended learning environments.

PBL - Open-ended; learners refine understanding in a meaningful way. Hannafin et al. say to include enabling contexts (project perspective), resources, tools (engage, manipulate), and scaffolding.



Simulations - Evolving case studies with a long history; play out a situation without risk (astronauts).

Instructional games are a form of simulation. Practice/refine skills and knowledge, identify gaps, review/summarize, introduce new concepts. They activate the learner, generate greater interest, and create a relaxed atmosphere.


Research and Support for Instructional Practices

Joyce - Meta-analysis of effective instruction; four models/systems of instruction: personal, information processing, behavioral, social.

Ellis and Fouts - Looked at research on three levels: I - basic, pure research/lab. II - comparative, determine the effectiveness of instruction in a school setting. III - evaluative, large-scale implementation. Twelve educational innovations, with mixed research support, but clear winners.


Activities Based on Proven Effective Strategies

Marzano, Pickering, and Pollock - Classroom Instruction That Works: a meta-analysis yielded nine effective, research-based learning strategies.

1) Similarities/differences - identify, classify, graphic organizers, scattergrams, analogy.
2) Summarizing & note taking - 10 & 2, pair share, compare/revise, reflective writing.
3) Recognize/reward effort.
4) Homework & practice - clear purpose, feedback.
5) Non-linguistic representations - information is stored both linguistically and in imagery: illustrations, animations, graphic organizers, sound, kinesthetic.
6) Cooperative learning - Johnson and Johnson's five elements; well-structured, clear goals.
7) Set objectives, provide feedback.
8) Generating/testing hypotheses - explain thinking.
9) Questions, cues, and advance organizers - retrieve knowledge, analyze.

Advance organizers: skim content, tell a story, agenda/outline, graphic organizer, writing prompt.



Just-in-time instruction - direct instruction based on immediate needs; adaptive; the instructor acts as a facilitator, providing instruction as needed.


Synthesis: This is the heart of the instructional designer's job. Effective instruction should be research-based and include a variety of learning environments and activities to meet the needs of the target audience. The designer must think beyond his or her own experiences and preferred learning style to meet the needs of a diverse body of learners.

Chapter 11: Evaluating Learning Achievement

Evaluation, Assessment, & Measurement/Purpose of Evaluation/Goal of Learner Evaluation

Three major types of evaluations: learner, formative, summative.

Assessment - procedures/techniques for gathering data. Measurement - the data collected. Instruments - devices used to collect data. Evaluation - the process for determining success.

Evaluations provide data that are analyzed to determine the overall success of the instructional design and any changes to be made. Learner evaluation - based directly on goals and objectives; measures the impact of instruction on achievement.


Development of Learner Evaluations

Validity - An evaluation is valid if it helps determine whether learners met the objectives. Face validity - experts judge the design of the evaluation. Content validity - SMEs look at test items to judge their appropriateness.

Reliability - An evaluation is reliable if it provides similar results whenever it is administered.

Criterion-referenced - the learner is judged on competency related to specific criteria; provides useful information. Norm-referenced - learners are compared with one another; this information does not improve instruction.


Determining whether the desired change is in knowledge, skill, or attitude (the action/verb in the objective) drives test development. Millman and Green and Hopkins suggest questions to guide test development.



Objective tests - True-false (facts; minimal use because of guessing), multiple-choice (considered most useful), matching (appropriate for homogeneous lists), short-answer (no guessing).

Constructed response tests - Essay (taps higher levels of cognition, but an unfair evaluation for ELLs and those with poor writing skills).



Performance assessment - evaluates skill change. Direct testing (evaluate the outcome), performance ratings (actions, not the product; checklist, rating scale, rubric), observations and anecdotal records (time consuming, subjectivity issues), portfolios (products/progress).



Change in attitude is the most difficult to evaluate; highly subjective. Observation/anecdotal records (same problems as above), surveys/questionnaires (more feasible; open-ended, fixed response, rating scales), self-reporting inventories (yes/no statements), interviews (widely used; wide range of topics; can clarify/expand). Problems: social desirability, self-deception, semantics.


Implementation of Learner Evaluations

Pre-assessment - enables adjustment of instruction for greater effectiveness. During instruction - useful if instruction spans multiple sessions - quick write, random call. Summative - overall learner success.


Determination of Success of Learner Evaluations/Instructional Designer's Role

Successful evaluations provide enough data to make decisions about possible remediation or revision of instructional materials. The instructional designer is responsible for the plan of action and its execution: 1) analyze goals/objectives and determine outcomes, 2) determine the type of evaluations needed, 3) choose assessment techniques, and 4) assist with data analysis and the steps to take as a result.


Synthesis: Formative and summative evaluation of the design's success is necessary to ensure that instruction meets both learner and client needs. Valid and reliable learner evaluation is necessary to gauge the effectiveness of instructional events. The choice of assessment technique must be based upon the expected learner outcome. Use of evaluation data to improve learning is key.

Ch. 12: Determining Success of the Instructional Design Product and Process


Formative Evaluation

Formative evaluation - used throughout the ID process to gather data for feedback. It improves instruction and shows the client the design's progress. Three approaches to formative evaluation:

Morrison, Ross and Kemp - based on Gooler: 1) Planning - purpose, audience, issues (questions to be answered), resources, evidence. 2) Conducting - data-gathering techniques and analysis. 3) Reporting - includes conclusions/recommendations.



Dick, Carey and Carey - Phase 1/one-on-one: address omissions, learner performance and reactions. Phase 2/small group: apply changes, representative sample. Phase 3/field trial: apply changes, larger audience, administered by the instructor. An SME addresses accuracy.

Rapid prototyping (prototypes evaluated by experts/users) and usability testing (observe the product in action, note problems inherent in the design) are also forms of formative evaluation.


Summative Evaluation

Results in a formal report that gauges the success of the instructional intervention. Goal - determine whether the instruction resulted in the desired change and whether the client's goals were met.

Program evaluators - experts who conduct summative evaluations; a complicated process.



Kirkpatrick (most cited): 1) Reactions - attitude survey; 2) Learning - pre-/post-tests; 3) Transfer - difficult to assess; can occur at any point; 4) Results - measurable: increased sales, fewer accidents, etc. Begin with reactions and continue as time and money permit; best to use all four.


Smith and Ragan - 1) Determine the goals of the evaluation, 2) Identify indicators of success, 3) Choose the orientation of the evaluation - objective (quantitative, reliable) or subjective (descriptive, can be biased), 4) Design the evaluation (how, when, and under what conditions data are collected), 5) Design evaluation measures (learning transfer, outcomes, attitudes, cost), 6) Collect data, 7) Analyze, 8) Report results.



Morrison, Ross and Kemp - Three major areas: program effectiveness (the major issue), efficiency, and costs. Seven steps similar to Smith and Ragan's: 1) Specify objectives, 2) Design the evaluation for each objective, 3) Develop data collection instruments/procedures, 4) Carry out the evaluation, 5) Analyze results, 6) Interpret results, 7) Disseminate results/conclusions.



Dick, Carey and Carey - Very different; two phases: expert judgment (potential to meet needs) and field trial (document effectiveness with the target group).


Group Processing: Evaluating the Instructional Design Team

Team member roles - SME, programmer, graphic artist, content editor, Web developer, media developer. The instructional designer may also be the project manager. Group processing - formative evaluation of group members' reactions to their work and interactions.

Johnson, Johnson and Holubec - Group processing helps 1) improve the approach to the task, 2) increase accountability, 3) members learn from each other, and 4) eliminate problems. It should include: feedback, reflection, goal improvement, and celebration.


Synthesis: Regardless of the scope of the project, formative evaluation should be built into the process. Instruction will be most effective when improvements are made throughout the design process. Summative evaluation is more complex and requires skill and experience generally beyond the scope of instructional designers (there is also a conflict of interest). Effective client communication ensures that the project and evaluation will meet instructional goals and client needs.