
Student modeling for performance assessment using Bayesian networks on web portfolios

Chen-Chung Liu, Gwo-Dong Chen, Chin-Yeh Wang, Ching-Fang Lu
Institute of Computer Science and Information Engineering,
National Central University, Chung-Li, 32054, Taiwan
Email: {chen, christia, Chinyea}@db.csie.ncu.edu.tw
Tel: 886-3-4227151 ext 4504
Fax: 886-3-4222681

Abstract

Web-based curriculum development allows students to develop their learning portfolios and interact with peers on a web learning system. Such a system contains web portfolios that record in detail students' learning activities, peer interaction, and knowledge progress. However, without a model of activity performance on web learning systems, teachers cannot easily diagnose students' learning processes or regulate effective strategies according to student activity performance. This investigation proposes a novel methodology that employs Bayesian network software to assist teachers in efficiently deriving and utilizing the student model of activity performance from web portfolios in an online manner. With this model of learning activity, teachers can assess and diagnose performance on web learning systems. The model of activity performance also allows teachers to manage various activity performances in web learning systems so that desired strategies can be applied to promote learning effectiveness.


1. Introduction

Portfolios are used not only to assess students' learning performance but also to promote learning and the effectiveness of instruction. Borko et al. [5] utilized portfolios to encourage learners to reflect on their learning experiences. Duffy et al. [15] also used portfolios to encourage critical thinking and decision making. Conventionally, teachers monitor students' learning processes by paper records as well as by closely monitoring students' behaviors in discussion, homework, and projects. A web-based curriculum environment allows students to develop their learning portfolios and interact with peers on a web learning system. Such web learning systems can record learning activities in web logs. The learning activity records of student portfolios are essential for teachers to assess learning performance and improve instruction [29][6][10]. However, teachers cannot reliably assess and predict student learning progress without modeling the learning activity performances on web learning systems, which accounts for why teachers cannot easily diagnose students' learning processes and regulate effective strategies to encourage effective learning according to student activity performance.

Web-based courses are numerous and allow students to perform various learning activities in a virtual classroom [20], such as reading, messaging, conferencing, accessing documents, and participating in interactive activities. Regardless of the functions that a web learning system provides, teachers require the model of activity performance to regulate effective strategies and diagnose students' learning processes. The student model of activity performance describes how various activity performances are related, such as discussion performance and the ability to grasp a learning concept. For instance, teachers must estimate participation in discussion from other observable activities to fulfill peer support [2] in the web learning system. Teachers must also know how various activity performances are related in order to detect problematic learning processes from observable activity performances. However, teachers in a web environment cannot interact with students face to face. Under this circumstance, teachers must closely monitor students' learning activities and utilize observed phenomena to modify strategies and intervene in students' learning processes.

Teachers normally assess and diagnose students' learning activities based on a student model developed from their experience. The student model includes all aspects of the student's behavior and knowledge that impact performance and learning [35]. For instance, Burton et al. [7] constructed a student model of learning procedural skills to diagnose students' misperceptions of those skills. Fletcher et al. [18] also developed a student model in computer-based instruction to understand students' learning status. Similarly, teachers can understand the model of activity performance on web learning systems by analyzing web portfolios, because these portfolios record in detail students' learning activities, peer interaction, and knowledge progress.

Related studies [7][12][13][33] have explored how students think and behave in order to diagnose and maximize the learning effect in intelligent tutoring systems. Teachers can know how individual students learn concepts or skills via the developed student models. The student models of intelligent tutoring systems focus on the states of knowledge and skill of individual students. However, a web learning system allows many students to learn and interact with peers on the Internet. Teachers may enforce group learning strategies such as collaborative learning [22] and peer support [2] to promote the learning effect. In doing so, a teacher can intervene in the student learning process by closely observing student behavior in peer interaction. Therefore, in addition to the domain knowledge of the learning concepts addressed in individual student models, teachers must know from the student model how students behave and perform in portfolio development processes and how they interact with peers on web learning systems, in order to interact effectively with their students on the Web.

Teachers construct a student model by closely monitoring students' learning processes. For instance, Mitrovic et al. [25] proposed a methodology to develop a student model of problem solving by engaging students in explicit dialogues about problem-solving decisions. However, teachers cannot develop the students' activity model via interactive observation in a web learning system. Although the web learning system can log many behavioral records in web portfolios, inducing the relationship structure among the various activity records and how these records affect each other is extremely difficult, because the activity records contain a large amount of complex data of various types that interweave with each other. Consequently, teachers cannot easily derive the model of learning activity on web learning systems that impacts learning performance.


1.1 Student modeling from web portfolios

Deriving the student model of activity performance on web learning systems for diagnosis and strategy regulation involves collecting portfolios, discovering the causality structure among activities, and inferring students' performance from the observable activities. When using web-based learning systems, teachers have difficulty obtaining and using the student model for strategy regulation and diagnosis purposes. These difficulties are summarized in the following three problems.

The first problem teachers encounter when attempting to obtain and use the student model in web learning systems stems from the fact that they normally assess and diagnose learning processes by paper records and interactive observation. Teachers cannot obtain records of the learning process in which students develop their learning products, nor of how students interact with peers, despite the fact that portfolios must contain learning products as well as the processes used to develop those products [6].

Recent web software such as Microsoft Office allows students to develop their own portfolios and publish them on the Web. Although capable of obtaining students' learning products such as homework and project reports, teachers must continuously monitor learning activities to answer questions such as "How many times do students answer others' questions?", "When do students normally submit their homework?", and "How often do students read discussion opinions?". Therefore, teachers cannot easily collect activity history and utilize these behavioral records to model students' learning performance in web learning systems. This is referred to as the portfolio collection problem.

The second problem is the lack of support for analyzing the causality relationships among various learning activities and performances. Teachers can predict student learning performance and adapt proper strategies at an early stage by analyzing available learning factors [36]. To derive a student model that accounts for learning performance, teachers must answer questions such as "What behavioral factors affect student learning outcomes with respect to various types of activities in a web learning system?", "What is the relationship among various learning concepts? Does any concept strongly affect the learning outcome of another concept?", and "How do various types of learning activities affect each other? Does a student who often answers others' questions tend to submit homework on time?". Our previous research [9] analyzed student behaviors in web learning systems with decision tree software. However, inducing the student model involves determining the causality structure among the available records and performance in web portfolios, and decision tree analysis cannot effectively induce a student model without a clearly known causality structure. Therefore, a teacher requires convenient support for inducing the causality relationships and impact factors that explain how various student learning behaviors and performances are related. This is referred to as the causality discovery

The third problem teachers have is answering probability questions such as "If a student often answers others' questions, what is the probability of that student achieving a good learning outcome for a particular learning concept?" and "If students achieve a poor learning outcome for a particular concept, how will they behave in the discussion activity?". To answer these questions, the teacher must compare the learning records of web portfolios for current students with similar web portfolios of previous students. Teachers can also apply the student model developed from previous web portfolios to accurately predict the behavior and outcomes of current students. However, as the available records in web portfolios have a sophisticated causality structure, teachers have difficulty inferring student performance because estimating a student's behavior or performance involves complicated probability computation. This is referred to as the performance inference problem.

1.2 Purpose of study

This investigation presents a novel methodology to assist teachers in deriving and using a student model in a web-based learning system. The methodology applies data mining [17] and machine learning [24] techniques to web portfolios to solve teachers' problems in assessing and diagnosing students' learning activities in the web learning system. Recent web servers such as Microsoft Internet Information Server provide a log mechanism that records student learning behavior in web learning systems. The web learning system does not need to be specially designed to record learning behaviors, since the web log is a standard feature of most commercial web servers. Therefore, the web log can solve the portfolio collection problem because teachers can easily collect activity history from it.

Bayesian network software [8][14][30], a data mining and machine learning technique, is employed herein to facilitate teachers not only in reasoning about the causality relationships among various activity performances in web portfolios, but also in representing those relationships as probabilities for estimating students' performance. Such probabilistic reasoning can analyze student performance because it provides a mathematically sound means of handling uncertain situations in assessment [32]. Many Bayesian network facilities, such as Microsoft Bayesian Network [26], Netica [27], Noetic [28], and Bayesian Knowledge Discoverer [3], are commercially available to help teachers solve the causality discovery and performance inference problems.

With this methodology, teachers can go online and efficiently use web portfolios to derive and apply the model of students' learning behavior in web learning systems. Teachers can assess and predict student learning performances with this model, which allows them to diagnose learning processes and closely monitor strategies to enhance learning.

The rest of this paper is organized as follows. Section 2 presents our web-based learning environment and the method employed herein to derive a student model from web portfolios. Section 3 describes how Bayesian network software can help teachers analyze causality among students' behavior and performance in the learning environment. Section 4 illustrates the architecture and procedure necessary to obtain and use the student model for strategy regulation in a web-based learning system. Conclusions are finally drawn in Section 5.



2. An illustrative web portfolio system using a Bayesian network

This work presents a web-based portfolio system to elucidate the teacher's role in inducing and using a student model and the method used to assess students' learning performance. The web portfolio system was designed to assist fifty-four computer science students in learning an undergraduate course entitled Introduction to Computer Science at National Central University (Chungli, Taiwan). After instruction in a conventional classroom, students must develop portfolios. The web portfolio system allows students to record different types of reports, including learning notes, experiences, homework, self-reflections, and their discussion processes, in different frames of the web learning system. Table 1 summarizes the frames and the portfolio development behaviors that students can perform in the web portfolio system. The teacher utilizes the web portfolios to assess student learning performance and learning status.

No | Frame | Content description
1 | Write daily reports | Student daily findings such as classroom events enable teachers to comprehend daily learning progression and status.
2 | Read daily reports |
3 | Write notes | Learning notes from textbooks and lectures.
4 | Read notes |
5 | Write experience reports | Learning experience or supplemental materials to share with other students.
6 | Read experience reports |
7 | Propose questions | Learning questions to ask the teacher or other students.
8 | Read questions |
9 | Write homework | Details the processes employed to solve assigned problems, including the problem analysis, semi-final programs, final programs, program analysis, and the program demonstration.
10 | Read homework |
11 | Write reading self-reflections | Self-evaluation of reading performance for each learning concept.
12 | Read reading self-reflections |
13 | Write homework self-reflections | Self-evaluation of submitted homework for each learning concept.
14 | Read homework self-reflections |
15 | Write personal learning agendas | Self-defined schedule for developing portfolios.
16 | Read personal learning plans |
17 | Read teachers' journal | Teachers' report about the lecture, including participation in the lecture, homework assignments, suggestions, and lecture progression.
18 | Read development guidance | System-provided guidance for developing portfolios according to teachers' criteria and student portfolio development status.
- | Discussion: Post | Propose a new issue or question.
- | Discussion: Reply | Reply to an existing issue or question.

Table 1. Frames and functions of the web portfolio system

For the learning program, teachers identified ten essential concepts in the C programming language: input/output, data types, arithmetic operations, switch control, iteration, if-then-else alternations, functions, message passing, arrays, and recursion. Students must develop portfolios for each concept and discuss them with other students on the web learning system. Meanwhile, teachers utilize the web portfolios to diagnose students' learning activities and fulfill the peer support [2] strategy. The peer support strategy entails recommending experts to students who propose learning questions; the expert must be willing to answer other students' questions. Analysis of students' portfolios can help a teacher diagnose student activity performance and enforce the peer support strategy. The analysis involves the following portfolios:



- Portfolio development activities, including the behavior associated with submitting homework, notes, experiences, and self-reflections.
- Discussion activity and learning behavior in the web portfolio system, including the frequency of using the learning system, reading behavior, and posting and replying in the discussion activity.
- Portfolio content assessments, including teachers' assessment of students' learning outcomes for the ten learning concepts, obtained by examining the products developed by students and quizzing them to verify their self-reflections.

To efficiently diagnose students' learning activities and fulfill peer support, teachers must know how portfolio development behavior, discussion behaviors, and learning outcomes are related across the various learning concepts. For instance, teachers must diagnose students' learning processes and recommend proper students to answer other students' questions by asking the following questions:

- Which students are willing to answer others' questions and excel in a certain concept? What are the behavioral characteristics of such students in observable learning activities such as developing homework, experience reports, and self-reflections?
- If students overestimate their learning outcomes during self-reflection, can they effectively grasp learning concepts?
- What are the behavioral symptoms when students fail to grasp a specific learning concept?
- How are the ten learning concepts related? That is, does any concept heavily influence the learning outcome of other concepts?

To answer the above questions, teachers require technical support for managing student learning processes and analyzing student learning performance. First, teachers must collect students' behavioral portfolios to accurately represent learning processes in the learning system. Although the web portfolio system can record students' learning products such as homework and experience reports, teachers must continuously monitor students' portfolio development activities to learn how and when students develop these products as well as how they interact with peers. Furthermore, deriving a student model of activity performances that answers the above questions involves many students' portfolios, portfolio development processes, portfolio assessment results, and discussion performance histories. Hence, the teacher requires analytical facilities to induce the relationships among discussion behavior, portfolio development processes, and learning performance, as well as to infer performance in the discussion activity and the various learning concepts.


Figure 1. Student modeling of learning performance with Bayesian network software. (The diagram shows students producing portfolios and web logs, i.e. portfolio development processes, discussion behavioral records, and assessment results; Bayesian network reasoning software deriving a belief network of student performance and behaviors; and the teacher combining this belief network with his or her own knowledge into experienced rules used for assessment, estimation, and diagnosis.)

Figure 1 illustrates how teachers analyze and use the student model derived from web portfolios with Bayesian network software. Students initially develop learning portfolios and discuss them with peers in the web learning system. Meanwhile, the web learning system automatically records students' portfolio development processes and discussion behavior in the web log. After student portfolios, learning process and discussion records, and assessment results of students' portfolios are accumulated in a database, Bayesian network software enables teachers to induce the student model of activity performance in portfolios as a Bayesian belief network. The belief network clarifies the relationships among various activity performances as probabilistic rules. Hence, teachers can go online and utilize these experienced rules to diagnose students' learning processes and regulate strategies.


3. Reasons for using Bayesian network technologies

When assessing students' learning activities and enforcing specific pedagogical strategies, teachers require assessment knowledge to understand how students grasp learning concepts and to assist students in learning through developing portfolios. For instance, VanLehn and Martin [33] utilized students' learning actions to form a student model of physics knowledge, through which teachers can know how individual students learn various physics concepts. However, teachers of a web-based learning system require more than the knowledge states of individual students. Teachers must know from the student model how students behave and perform in portfolio development processes and how they interact with peers on web learning systems, in order to interact effectively with their students on the Web.

Using Bayesian network software on web portfolios allows teachers to attain and use the student model of learning performance in web learning systems, as described in the following subsections.

3.1 Web log for portfolio collection

The teacher must collect records of students' learning behavior over varying time periods to evaluate their performance precisely. For instance, the teacher may want to identify when students submit their homework and self-reflections, as well as how many times students reply to others' questions. Whereas teachers generally record students' learning behaviors on paper in conventional classrooms, advances in web servers have enabled the student portfolio development process to be automatically logged in a database without any special custom design of the web learning system. Each original web log record contains the following information: the IP address, the user login id, the date and time, the method (GET or POST), the file name requested, the result of the request (success, failure, or error), and the size of the file transferred. Table 2 displays a snapshot of the web logs.

IP | Student ID | Time | Method | File requested | Result | Size
140.115.50.201 | NKChen | 11/Dec/1999/11:02:41 | GET | /cgi/vc/Login.exe | 200 | 236
140.115.50.201 | NKChen | 11/Dec/1999/11:03:05 | GET | /cgi/vc/HomeWork.exe | 200 | 556
140.115.50.201 | NKChen | 11/Dec/1999/11:03:05 | GET | /cgi/vc/chap.jpg | 200 | 349
140.115.50.91 | Jcc | 11/Dec/1999/11:03:27 | GET | /cgi/vc/Reflection.exe | 200 | 339
140.115.50.91 | Jcc | 11/Dec/1999/11:05:12 | GET | /cgi/vc/Homework.exe | 200 | 556
140.115.50.91 | Jcc | 11/Dec/1999/11:05:12 | GET | /cgi/vc/chap.jpg | 200 | 349
140.115.50.91 | Jcc | 11/Dec/1999/11:06:54 | GET | /cgi/vc/Note.exe | 200 | 428

Table 2. A snapshot of the web logs

The web logs are maintained to manage the web server effectively. They record the files that students have read as well as the application programs requested, so the teacher can collect student portfolio development behavior by examining the files requested. For instance, the last entry of the web logs in Table 2 confirms that student Jcc wrote learning notes at 11/Dec/1999/11:06:54, since students can write learning notes only by requesting the application program file Note.exe. The web portfolio system must therefore be designed so that students perform each learning behavior in a distinct location if the log is to be pedagogically meaningful. Restated, the portfolio development behaviors that the teacher must assess, e.g. writing self-reflections, checking teachers' teaching journals, or submitting homework, must be implemented as distinguishable application program files so that student behavior can be correctly recorded in the web log.
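To make this mapping concrete, the following sketch (our own illustration, not part of the original system) parses one record in the format of Table 2 and maps the requested application file to a portfolio development behavior. The file-to-behavior dictionary contains only the files shown in Table 2; the function and variable names are hypothetical.

```python
import re
from datetime import datetime

# Illustrative mapping from requested application files (Table 2) to the
# portfolio development behaviors of Table 1; other file names are assumptions.
FILE_TO_BEHAVIOR = {
    "Login.exe": "login",
    "HomeWork.exe": "write homework",
    "Reflection.exe": "write self-reflection",
    "Note.exe": "write notes",
}

LOG_PATTERN = re.compile(
    r"(?P<ip>\S+)\s+(?P<student>\S+)\s+(?P<time>\S+)\s+(?P<method>GET|POST)\s+"
    r"(?P<file>\S+)\s+(?P<result>\d+)\s+(?P<size>\d+)"
)

def parse_log_line(line):
    """Turn one web log record into a (student, behavior, timestamp) tuple,
    or None when the requested file is not a portfolio behavior (e.g. images)."""
    match = LOG_PATTERN.match(line)
    if not match:
        return None
    file_name = match.group("file").rsplit("/", 1)[-1]
    behavior = FILE_TO_BEHAVIOR.get(file_name)
    if behavior is None:          # e.g. chap.jpg is only an embedded image
        return None
    timestamp = datetime.strptime(match.group("time"), "%d/%b/%Y/%H:%M:%S")
    return match.group("student"), behavior, timestamp

record = parse_log_line(
    "140.115.50.91 Jcc 11/Dec/1999/11:06:54 GET /cgi/vc/Note.exe 200 428")
print(record)  # ('Jcc', 'write notes', datetime.datetime(1999, 12, 11, 11, 6, 54))
```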


Table 3. The frequency of each type of portfolio development behavior from October to January (one student, eighteen behavior types; monthly totals of logged behaviors: Oct. 50, Nov. 273, Dec. 70, Jan. 133; overall 526)



Figure 2. The sequence of portfolio development behavior in November

The portfolio frames allow web servers to record student portfolio development behaviors according to behavior type. The recorded web log allows the teacher to examine students' learning performance for diverse behavior types during distinct time periods. Table 3 summarizes one student's performance for the eighteen learning behaviors in distinct months. The teacher can also display the behavior sequence of a student's portfolio development process to inspect the student's learning performance in a specific month, as illustrated in Fig. 2.
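Building on the parsing sketch above, a small aggregation step can tabulate how often each behavior occurs per month, which is essentially how a summary such as Table 3 can be produced. The implementation below is an illustrative sketch rather than the system's actual code.

```python
from collections import Counter

def behavior_frequencies(records):
    """Count how often each (month, behavior) pair occurs, mirroring Table 3.
    `records` is an iterable of (student, behavior, timestamp) tuples such as
    those produced by the parse_log_line sketch above."""
    counts = Counter()
    for _student, behavior, timestamp in records:
        counts[(timestamp.strftime("%b"), behavior)] += 1
    return counts

# e.g. counts[("Nov", "write notes")] -> number of note-writing actions in November
```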

The web portfolio system can thus automatically record learning behaviors in the web logs via clearly defined portfolio frames. Although teachers can collect student portfolio development performance from the web logs, they must still tediously analyze the relationships among various performances, such as discussion performance, homework performance, and learning achievement in the learned concepts. Therefore, Bayesian network software is used herein to assist teachers in analyzing the causality and the impact factors among various learning performances.

3.2 Bayesian network analysis of activity performance

This section describes the difficulty a teacher has in analyzing student portfolios. Teachers can use Bayesian network software to reason about how various activity performances are related. The reasoning process of a Bayesian network is based mainly on conditional probability theory. Bayesian network software discovers the significant belief network, i.e. the causality relationships among various learning performances, which explains the conditions leading to a substantial performance phenomenon together with occurrence probabilities.
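Read concretely, each arc of such a belief network stores a conditional probability table, and diagnosis inverts it with Bayes' rule. Using the LA and Reply nodes defined below (the notation here is ours, not the paper's), estimating achievement from an observed reply behavior amounts to:

$$P(LA = a \mid Reply = y) = \frac{P(Reply = y \mid LA = a)\, P(LA = a)}{\sum_{a'} P(Reply = y \mid LA = a')\, P(LA = a')}$$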

Recently developed Bayesian network software such as Bayesian Knowledge Discoverer [3], GeNIE [19], Microsoft Bayesian Network [26], and Netica [27] assists decision makers in identifying knowledge and the corresponding occurrence probabilities from historical records. Many business decision support systems such as Kalamation [23] and Noetic Systems [28] have applied Bayesian network technology to tasks such as health care, credit card authorization checks, and fault diagnosis, since Bayesian networks can cope with the uncertainties in these applications. VanLehn and Martin [33] also applied a Bayesian network to discover the causality among 230 physics rules to diagnose students' learning status.



Figure 3. Belief network of student learning performances

Bayesian Knowledge Discoverer (BKD) is used herein to discover the causality relationships among the performance of the portfolio development activity, the discussion activity, and learning achievement in the various learning concepts. BKD analyzes the students' portfolios and produces a belief network that explains the causality relationships among the various activity performances, as illustrated in Fig. 3. The Bayesian network analysis includes the following activity performances:



- Reply: the performance of answering others' questions, classified as Y (has answered others' questions) or N (has never answered others' questions).
- Post: the performance of proposing new issues or questions, classified as Y (has posted issues or questions) or N (has never posted issues or questions).
- Read frequency: the frequency with which students read articles in the discussion activity, classified into classes H (high), M (middle), and L (low).
- LA of previous concept: the learning achievement in the previously learned concept, classified into classes A, B, C, D, and E.
- LA change of previous concept: the change of learning achievement in the previously learned concept, classified into classes prog (progressive), same (the same), and regr (regressive).
- LA: the learning achievement in the current learning concept, classified into classes A, B, C, D, and E.
- Self-reflection: the accuracy of students' self-evaluation of learning achievement for the learning concept, classified into classes OVER (overestimate), FIT (fit estimate), UNDER (underestimate), and N (absent).
- Experience report: the time required for students to submit their experience reports, classified into classes F (on time), L (delay), and N (absent).
- Notes: the time required for students to submit their notes, classified into classes F (on time), L (delay), and N (absent).
- Homework: the time required for students to submit their homework, classified into classes F (early), M (on time), and L (delay).
- Homework self-reflection: the time required for students to submit their homework self-reflections, classified into classes F (early), M (on time), and L (delay).
- Reading self-reflection: the time required for students to submit their reading self-reflections, classified into classes F (early), M (on time), and L (delay).
- Login: the frequency with which students log in to the learning system to develop portfolios, classified into classes H (high), M (middle), and L (low).

The belief network illustrates how the various activity performances in portfolios relate to one another in terms of cause and effect. For instance, the learning achievement (LA) in the previously learned concept (A, B, C, D, or E) and the learning achievement change in the previously learned concept (progressive, regressive, or kept the same) dominate the learning achievement (LA) (A, B, C, D, or E) in the current learning concept. In addition, the learning achievement in the current concept also dominates the student's reply behavior (ever or never) in the discussion activity, the student's self-reflection accuracy (overestimate, underestimate, fit estimate, or absent), and the submission time of experience reports (on time, delayed, or absent). The bar chart beside each node represents the probability of student performance without any prior condition. For instance, without any available observable phenomenon, a student has a probability of 0.24730 of attaining learning achievement (LA) A, a probability of 0.19235 of attaining learning achievement B, and so on. A student has a probability of 0.87348 of not replying to others' questions or issues and of 0.12652 of replying to others' questions or issues.
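As an illustration of what the discovered model contains, the sketch below (our own encoding, not BKD output) lists the state spaces of several Fig. 3 nodes from the variable list above, the parent links the paragraph describes for the LA node, and the prior reported for Reply. Links and numbers not explicitly mentioned in the text are omitted.

```python
# Partial encoding of the Fig. 3 belief network described in the text.
STATES = {
    "LA of previous concept": ["A", "B", "C", "D", "E"],
    "LA change of previous concept": ["prog", "same", "regr"],
    "LA": ["A", "B", "C", "D", "E"],
    "Reply": ["Y", "N"],
    "Self-reflection": ["OVER", "FIT", "UNDER", "N"],
    "Experience report": ["F", "L", "N"],
}

PARENTS = {
    "LA": ["LA of previous concept", "LA change of previous concept"],
    "Reply": ["LA"],
    "Self-reflection": ["LA"],
    "Experience report": ["LA"],
}

# Prior (no evidence) for the Reply node, as read off the bar chart of Fig. 3.
PRIOR_REPLY = {"Y": 0.12652, "N": 0.87348}
```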


Figure 4. Belief network of learning concepts

BKD is also employed to understand how performances in the identified learning concepts are related. Figure 4 depicts the belief network that explains how students' performances in the various concepts are related. For instance, four concepts, i.e. arithmetic operations, if-then-else alternations, functions, and switch control, dominate students' learning achievement in the concept of message passing. Therefore, teachers may remind students to review these concepts before learning message passing.
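A teacher or the learning system could read such prerequisite advice directly off the concept network. The sketch below is a hypothetical helper that returns the parent concepts of a given concept, using only the parents of message passing that the text states explicitly; the remaining edges would be read off the discovered network.

```python
# Parent concepts per Fig. 4; only the edges stated in the text are included.
CONCEPT_PARENTS = {
    "message passing": ["arithmetic operations", "if-then-else alternations",
                        "functions", "switch control"],
}

def review_suggestions(concept):
    """Return the concepts whose achievement dominates the given concept."""
    return CONCEPT_PARENTS.get(concept, [])

print(review_suggestions("message passing"))
```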

LA of previous concept | LA change of previous concept | P(LA=A) | P(LA=B) | P(LA=C) | P(LA=D) | P(LA=E)
A | - | 0.04 | 0.368 | 0.368 | 0.04 | 0.184
A | = | 0.84 | 0.04 | 0.04 | 0.04 | 0.04
A | + | 0.2 | 0.2 | 0.2 | 0.2 | 0.2
B | - | 0.04 | 0.229 | 0.458 | 0.044 | 0.229
B | = | 0.84 | 0.04 | 0.04 | 0.04 | 0.04
B | + | 0.583 | 0.297 | 0.04 | 0.04 | 0.04
C | - | 0.04 | 0.04 | 0.528 | 0.352 | 0.04
C | = | 0.04 | 0.22 | 0.657 | 0.043 | 0.04
C | + | 0.44 | 0.44 | 0.04 | 0.04 | 0.04
D | - | 0.2 | 0.2 | 0.2 | 0.2 | 0.2
D | = | 0.04 | 0.04 | 0.44 | 0.44 | 0.04
D | + | 0.04 | 0.219 | 0.656 | 0.045 | 0.04
E | - | 0.2 | 0.2 | 0.2 | 0.2 | 0.2
E | = | 0.04 | 0.04 | 0.04 | 0.84 | 0.04
E | + | 0.04 | 0.44 | 0.04 | 0.44 | 0.04
(-: regressive, =: kept the same, +: progressive)

Table 4. Impact factors on learning achievement by conditional probability

LA | A | B | C | D | E
Reply = Y | 0.717 | 0.333 | 0.167 | 0.104 | 0.1
Reply = N | 0.283 | 0.667 | 0.833 | 0.896 | 0.9

Table 5. Impact factors on reply performance by conditional probability

Bayesian network software discovers not only the causality relationships but also the impact factors among various learning performances. The impact factors are represented in conditional probability notation. Restated, given an observed activity performance, a student has a certain probability of achieving a specific activity performance. For instance, a student has a probability of 0.37 of attaining learning achievement B in a learning concept, but only a probability of 0.04 of attaining learning achievement A, if the student attained learning achievement A in the previous concept and has been regressive. Table 4 lists the impact factors of the learning achievement of a previously learned concept on the learning achievement of the current concept. Table 5 summarizes the impact factors on reply performance by conditional probability in the web learning system. A student who attains learning achievement (LA) A has a probability of 0.717 of replying to others' questions or issues, whereas a student who attains learning achievement (LA) B has only a probability of 0.333 of replying. Teachers or the learning system can thus utilize this rule to recommend proper students to answer another student's questions. The Appendix lists the detailed impact factors of the activity performances in Fig. 3 and Fig. 4.
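For illustration, Table 5 can be stored as a small conditional probability table keyed by the LA class. The dictionary layout below is ours; the numbers are copied from the table.

```python
# The impact factors of Table 5, written as a conditional probability table
# P(Reply | LA).
P_REPLY_GIVEN_LA = {
    "A": {"Y": 0.717, "N": 0.283},
    "B": {"Y": 0.333, "N": 0.667},
    "C": {"Y": 0.167, "N": 0.833},
    "D": {"Y": 0.104, "N": 0.896},
    "E": {"Y": 0.1,   "N": 0.9},
}

# e.g. a student with learning achievement A has probability 0.717 of replying
print(P_REPLY_GIVEN_LA["A"]["Y"])
```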

Using the discovered causality relationships and impact factors, Bayesian network software also facilitates teachers in estimating students' learning performance from observable conditions. For instance, Figure 5 estimates students' learning achievement (LA) from the learning achievement in a previously learned concept: a student has a probability of 0.5 of attaining learning achievement A or B if the student attained learning achievement B and was progressive in the previously learned concept. Figure 6 estimates learning achievement from performance in the discussion activity: a student has a probability of 0.65087 of attaining learning achievement A and a probability of 0.34913 of attaining learning achievement C if the student has replied to others' questions. Therefore, Bayesian network software allows teachers to accurately predict students' activity performance and, hence, to diagnose learning processes or fulfill strategies from observable phenomena.
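The estimation in Fig. 6 corresponds to inverting the Table 5 probabilities with Bayes' rule. The sketch below illustrates that step; because the paper reports only two of the five LA prior values (A: 0.24730 and B: 0.19235), the remaining prior entries here are illustrative placeholders, not the paper's figures.

```python
# Sketch of the performance-inference step behind Fig. 6: invert P(Reply=Y | LA)
# from Table 5 with Bayes' rule to estimate LA after observing a reply.
P_REPLY_Y_GIVEN_LA = {"A": 0.717, "B": 0.333, "C": 0.167, "D": 0.104, "E": 0.1}
PRIOR_LA = {"A": 0.2473, "B": 0.19235, "C": 0.22, "D": 0.18, "E": 0.17035}  # C-E assumed

def posterior_la_given_reply(likelihood=P_REPLY_Y_GIVEN_LA, prior=PRIOR_LA):
    """Return P(LA = a | Reply = Y) for each achievement class a."""
    joint = {a: likelihood[a] * prior[a] for a in prior}
    evidence = sum(joint.values())
    return {a: round(joint[a] / evidence, 3) for a in joint}

print(posterior_la_given_reply())
# Most of the posterior mass falls on LA = A, in line with the Fig. 6 estimate.
```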



Figure 5. Estimation of students' learning achievement (LA) from learning achievement in a previously learned concept with a discovered belief network

Figure 6. Estimation of students' learning achievement (LA) from the performance of discussion activity with a discovered belief network

Requirement | Information technology | Reason
Portfolio collection | Web log | Attaining behavioral portfolios
Discovery of causality | Bayesian network software | Inducing the belief network and impact factors among activity performances
Estimation of performance | Bayesian network software | Estimating activity performance with the discovered belief network

Table 6. The rationale of utilizing Bayesian network technology

In sum, the web log and Bayesian network technology satisfy a teacher's requirement of effectively diagnosing student learning processes and managing teaching strategies, because they overcome the difficulties in obtaining and utilizing the student model of learning performance in the web learning system online. First, the web log mechanism of web servers automatically records students' learning activities to help teachers obtain behavioral portfolios. Second, Bayesian network software facilitates teachers in inducing the belief network and impact factors among various learning performances. Third, teachers can utilize the belief network obtained from historical data to estimate various activity performances. Table 6 summarizes the rationale for using Bayesian network technology for assessment and diagnosis purposes.



4. System architecture and implementation

This section describes the detailed architecture for analyzing portfolios to assess student performance and to build student models related to the employment of pedagogical strategies in the web learning system. The analysis involves two primary tasks. The first task constructs a database of learning performance in the various concepts addressed by the web learning system. The database records student performance at different levels of ability, and can assist teachers in enforcing pedagogical strategies according to a student's ability; for instance, teachers may utilize the performance database to achieve heterogeneous grouping [34] in group learning according to students' learning performances. The other task of analyzing student portfolios entails inducing the student model of activity performances from the accumulated portfolios. Teachers may utilize the student model to predict or estimate activity performances from observable learning behaviors.

4.1 Construction of the performance database

Bloom et al. [4] classified the educational objectives of assessment into the knowledge, comprehension, application, analysis, synthesis, and evaluation levels. Teachers wish to comprehend students' learning performance at the application, analysis, and evaluation levels when enforcing the peer support strategy in the web learning system, since teachers would prefer to recommend only students with application, analysis, and evaluation abilities to help others analyze and provide a solution to a problem. Therefore, the teachers and the learning system require a performance database that records each student's ability in each learning concept at the application, analysis, and evaluation assessment levels. Constructing a performance database entails observing students' states by analyzing the quality of portfolios, inferring application, analysis, and evaluation abilities from the observed learning states, and updating student performance in the database. Figure 7 schematically depicts the architecture for constructing a performance database.

Figure 7. Architecture for constructing a performance database. (The diagram links the portfolio database to a state check list of the Known, Used, and Analyzed states, which Bayesian inference maps to the application, analysis, and evaluation abilities stored in the performance database.)

Quality analysis of the portfolio database entails extracting the concepts in students' portfolios and identifying the learning state that students have achieved in each portfolio. The teacher can directly examine the content of portfolios or utilize information retrieval [37] facilities such as IBM Text Miner to extract the concepts of portfolios. For instance, if the keyword "call by reference" frequently occurs in a student's experience report, the experience may be related to the concept of message passing. For a portfolio of a certain concept, teachers assess its quality to assert the learning state that the student has reached after developing the portfolio. Teachers assert the following states from portfolios:



- Known: The student knows the definition and basic constructs of a learning concept.
- Used: The student has correctly utilized the learned constructs to solve problems.
- Analyzed: The student has correctly analyzed the effectiveness and performance of a solution to a problem.

Teachers assert students' states by examining their achievement in the various types of portfolio products. If students have correctly utilized learned concepts to solve the assigned homework, they have reached the Used state. If students analyze the solution to a problem in the program analysis, they have reached the Analyzed state. To assert whether a student knows the definition and basic constructs of a concept, the web learning system prompts an online quiz when students use the web learning system. Teachers can also examine the students' experience reports, notes, homework, and discussion to assert that students know the content of a concept.

Observing learning states is an attempt to assess application, analysis, and evaluation abilities, because the learning states affect performance at these levels. Analysis of learning states and abilities indicates that students have a higher probability of possessing application ability when they have reached the Known and Used states, and a higher probability of possessing analysis and evaluation abilities when they have reached the Known and Analyzed states. Therefore, BKD is applied to analyze and infer students' application, analysis, and evaluation abilities from the observed states.


Figure 8 illustrates the Bayesian network for inferring students' abilities from observed states. Table 7 lists the probabilities of inferring the application ability from the Known and Used states in our experiment. For instance, if students know the definition of a concept but have not utilized the concept to solve an assigned problem, they have a probability of 0.6 of possessing application ability. Teachers can utilize these probabilities to estimate students' application ability. Each time a student develops a new portfolio product in the learning system, teachers evaluate the product to detect the state change of the student. Bayesian network software then computes the scores of the application, analysis, and evaluation abilities for the student, and the database system updates the ability change for this student in the performance database.

Figure 8. Bayesian network for inferring students' abilities from observed states. (Nodes: the Known, Used, and Analyzed states and the application, analysis, and evaluation abilities.)

Known | Yes | Yes | No | No
Used | Yes | No | Yes | No
Application = Yes | 0.9 | 0.6 | 0.3 | 0.1
Application = No | 0.1 | 0.4 | 0.7 | 0.9

Table 7. Inferring application ability from the Known and Used states
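In an implementation, Table 7 acts as a lookup table: given the observed Known and Used states, the system reads off the probability of application ability and stores it in the performance database. The sketch below is illustrative; the dictionary layout and the update function are not the paper's code.

```python
# Table 7 as a conditional probability lookup: P(Application = Yes | Known, Used).
P_APPLICATION = {
    (True, True): 0.9,
    (True, False): 0.6,
    (False, True): 0.3,
    (False, False): 0.1,
}

def update_application_score(performance_db, student, concept, known, used):
    """Record the estimated application ability for one student and concept."""
    performance_db[(student, concept, "application")] = P_APPLICATION[(known, used)]

db = {}
update_application_score(db, "Jcc", "message passing", known=True, used=False)
print(db)  # {('Jcc', 'message passing', 'application'): 0.6}
```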



4.2 Induction of the Bayesian belief network

Figure 9. Architecture for inducing the student model of activity performance. (The web log, portfolio products, and discussion activity are organized into the portfolio database; continuous data are qualified into discrete meanings; BKD learns the belief network structure, which an expert or teacher refines into the causality relationships and the final BBN of activity performance.)

Figure 9 describes the architecture for inducing the student model of activity performance using Bayesian network software. Inducing the student model initially involves organizing the developed portfolios, i.e. portfolio products, portfolio development activity in the web log, and discussion activity history, into the database system. Five types of learning information are recorded in the portfolio database: (1) students' information such as name, gender, and age; (2) discussion activities, i.e. post, reply, and read behaviors; (3) portfolio products such as the content of homework and self-reflection reports; (4) the concepts to which the students' portfolios are related; and (5) assessment results of students' performance in each learning concept, derived from the portfolio products. Figure 10 illustrates the relationships among these types of learning information using an Entity-Relationship diagram [11], which is commonly used to describe information and data structure. The diagram depicts the records (rectangles) that should be stored and the relationships (lozenges) among these records. Notably, the portfolio development processes are represented and recorded as the relationship between students and portfolio products; the relationship indicates the time of development of each portfolio product.
indicates the time of development of each portfolio product.

Figure 10. Entity-Relationship diagram of the portfolio database. (Entities: Student, Discussion activity, Portfolio product, Assessment result, and Concept; relationships: Perform, Develop, Assessed-as, and Related-to.)

The second step of inducing the student model of activity performance involves qualifying the discrete meaning of continuous data, since most data mining and machine learning methodologies function best on discrete data sets. Although Bayesian networks can in principle handle continuous data, most operational Bayesian network tools analyze only discrete data. For instance, the portfolio database records the frequency of reading behavior in the discussion activity as a number, which such tools cannot handle directly. Therefore, this continuous data must be transformed into meanings that teachers recognize, such as a high, middle, or low frequency. Available guidelines for qualifying the discrete meaning of continuous data include expert recognition, equal division, S-plus [31], the OneR discretizer [16], and the entropy maximization heuristic [21]. In the experiment, teachers qualified the discrete meaning of continuous data according to the teachers' recognition and the equal division guideline. For instance, student achievement in learning concepts is categorized into five equal groups (A, B, C, D, E), while the time of homework submission is divided into three groups (early, on time, and delayed) based on the teachers' recognition.
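The equal division guideline can be sketched as follows: rank the continuous values and assign them to equally sized groups labelled A to E. The implementation is ours; the paper only names the guideline.

```python
def equal_division_labels(scores, labels=("A", "B", "C", "D", "E")):
    """Assign each score to one of len(labels) equally sized groups, best scores first."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    group_size = -(-len(scores) // len(labels))      # ceiling division
    assigned = [None] * len(scores)
    for rank, index in enumerate(ranked):
        assigned[index] = labels[min(rank // group_size, len(labels) - 1)]
    return assigned

print(equal_division_labels([92, 75, 88, 60, 45, 70, 83, 95, 55, 66]))
# -> ['A', 'C', 'B', 'D', 'E', 'C', 'B', 'A', 'E', 'D']
```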

After student portfolios are recorded in the database, BKD is applied to discover the causality relationships and conditional probabilities among the various activity performances. The causality relationships can either be depicted by teachers or discovered by BKD from the records in the portfolios. Since the web portfolios contain many records, the architecture applies BKD to discover the initial causality relationships, which teachers can then refine based on their experience. However, analyzing the causality relationships from records is an extremely complex task: BKD requires teachers to provide a partial order among the learning records, denoting which records directly or indirectly dominate other records in the portfolios. Association rule [1] data mining software such as IBM Intelligent Miner is therefore applied to assist teachers in identifying the partial order among records. Association rule mining primarily analyzes how two records are related, without considering complicated conditional probabilities; the mining tool can efficiently discover knowledge of the form A → B, which denotes that if A occurs then B will also occur. Teachers may thus utilize association rule software to identify the partial order among records, after which BKD discovers the causality relationships and the conditional probabilities from the portfolio database.
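The role of association rules here is only to suggest which record should dominate which in the partial order. A minimal sketch of the support and confidence computation behind a rule A → B is shown below; the transaction encoding and any thresholds are illustrative assumptions.

```python
def rule_strength(transactions, a, b):
    """Return (support, confidence) of the rule a -> b over a list of item sets."""
    with_a = [t for t in transactions if a in t]
    with_both = [t for t in with_a if b in t]
    support = len(with_both) / len(transactions)
    confidence = len(with_both) / len(with_a) if with_a else 0.0
    return support, confidence

transactions = [
    {"replies to questions", "submits homework on time"},
    {"replies to questions", "submits homework on time"},
    {"submits homework on time"},
    {"replies to questions"},
]
print(rule_strength(transactions, "replies to questions", "submits homework on time"))
# (0.5, 0.666...) -> a candidate edge direction for the partial order given to BKD
```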



5. Conclusion

As students learn and develop learning portfolios on the World Wide Web, web portfolios completely record student learning processes, learning products, and interaction history. Although such web portfolios are collected in an electronic format, teachers lack computational support for deriving and utilizing the student model of learning performance needed to exploit them. This investigation has described how teachers can effectively assess activity performance and fulfill pedagogical strategies by using portfolios. A web log methodology is proposed herein to accumulate students' behavioral portfolios. This investigation also illustrates the experience of using Bayesian network technology to help teachers derive and utilize a student model of activity performance. Bayesian network technology can help teachers discover from web portfolios a belief network of the factors that impact student performance, and allows teachers to utilize the belief network to diagnose or intervene in students' learning processes at an early stage. Electronic web portfolios thus enable teachers to assess activity performance and analyze pedagogical strategies in an online and efficient manner. Teachers of a web-based learning system can also extend the use of web portfolios and Bayesian network technology to other pedagogical purposes with relative ease.

Acknowledgment

The authors would like to thank the National Science Council of the Republic of China for financially supporting this research under Contract No. NSC 89-2520-S-008-019- and Ministry of Education Contract No. 89-H-FA07-1-4.

References

[1] R. Agrawal and R. Srikant, Fast Algorithms for Mining Association Rules in Large Databases, Proceedings of the 20th International Conference on Very Large Data Bases, pp. 487-499, 1994.
[2] J. Baker and G. Dillon, Peer Support on The Web, Innovations in Education and Training International, 36:1, pp. 65-79, 1999.
[3] BKD (Bayesian Knowledge Discoverer), developed by the Knowledge Media Institute of The Open University, http://kmi.open.ac.uk/projects/bkd.
[4] B. S. Bloom, M. D. Englehart, E. J. Furst, W. H. Hill, and D. R. Krathwohl, Taxonomy of educational objectives: The classification of educational goals. Handbook 1: Cognitive domain, New York: McKay, 1956.
[5] H. Borko, P. Michalee, M. Timmons, and J. Siddle, Student Teaching Portfolios: A Tool for Promoting Reflective Practice, Journal of Teacher Education, 48:5, pp. 345-356, 1997.
[6] C. B. Burch, Inside The Portfolio Experience: The Student Perspective, English Education, 32:1, pp. 34-49, 1999.
[7] R. B. Burton, Diagnosing Bugs in A Simple Procedural Skill, in D. Sleeman & J. S. Brown (Eds.), Intelligent Tutoring Systems, London: Academic Press, 1982.
[8] E. Charniak, Bayesian Networks without Tears, AI Magazine, 12:4, pp. 50-63, 1991.
[9] G. D. Chen, C. C. Liu, K. L. Ou, and B. J. Liu, Discovering Decision Knowledge from Web Log Portfolio for Managing Classroom Processes by Applying Decision Tree and Data Cube Technology, Journal of Educational Computing Research, 23:3, pp. 305-332, 2000.
[10] G. D. Chen, C. C. Liu, K. L. Ou, and M. S. Lin, Web learning portfolios: a tool for supporting performance awareness, Innovations in Education and Training International, 38:1, 2000, in press.
[11] P. Chen, The Entity-Relationship Model - Toward a Unified View of Data, ACM Transactions on Database Systems, pp. 9-36, 1976.
[12] M. T. H. Chi, R. Glaser and E. Rees, Expertise in problem solving, in R. J. Sternberg (Ed.), Advances in The Psychology of Human Intelligence, Vol. 1, 1982.
[13] C. Conati, A. Gertner, K. VanLehn, and M. Druzdzel, On-Line Student Modeling for Coached Problem Solving Using Bayesian Networks, in Proceedings of the 6th International Conference on User Modeling, pp. 231-242, 1997.
[14] G. F. Cooper, Current Research Directions in The Development of Expert Systems Based on Belief Networks, Applied Stochastic Models and Data Analysis, 5:1, pp. 39-52, 1989.
[15] M. L. Duffy, J. Jones, and S. W. Thomas, Using Portfolios to Foster Independent Thinking, Intervention in School and Clinic, 35:1, pp. 34-37, 1999.
[16] U. M. Fayyad and K. B. Irani, Multiinterval Discretization of Continuous Valued Attributes for Classification Learning, in Proceedings of the 13th International Joint Conference on Artificial Intelligence, pp. 1022-1027, 1993.
[17] U. M. Fayyad, G. Piatetsky-Shapiro, P. Smyth, and R. Uthurusamy, From Data Mining to Knowledge Discovery: An Overview, in Advances in Knowledge Discovery and Data Mining, pp. 1-34, MIT Press, 1996.
[18] J. D. Fletcher, Modeling of Learner in Computer-Based Instruction, Journal of Computer-Based Instruction, vol. 1, pp. 118-126, 1975.
[19] GeNIE, developed by the Decision Systems Laboratory, University of Pittsburgh, http://www2.sis.pitt.edu/~genie/.
[20] S. R. Hiltz, The virtual classroom: learning without limits via computer network, Alex Publishing Corporation, Norwood, NJ, 1994.
[21] R. C. Holte, Very Simple Classification Rules Perform Well on Most Commonly Used Datasets, Machine Learning, Vol. 11, pp. 63-91, 1993.
[22] D. Johnson and R. Johnson, Learning Together And Alone (3rd ed.), Englewood Cliffs, NJ: Prentice Hall, 1991.
[23] Bayesian product development, developed by Kalamation International, http://kalamation.com/.
[24] T. M. Mitchell, Decision tree learning, in Machine Learning, McGraw-Hill, 1997.
[25] A. Mitrovic, S. Djordjevic-Kajan, and L. Stoimenov, INSTRUCT: Modeling Students by Asking Questions, User Modeling and User-Adapted Interaction, 6:4, pp. 273-302, 1996.
[26] MSBN (Microsoft Belief Network Tools), developed by Microsoft, http://www.research.microsoft.com/research/dtg/msbn/.
[27] Netica, developed by Norsys Software Corp, http://www.norsys.com/.
[28] Noetic Systems, developed by Noetic Systems Incorporated, http://noeticsystems.com/.
[29] F. L. Paulson, P. R. Paulson and C. A. Meyer, What Makes a Portfolio a Portfolio?, Educational Leadership, 48:5, pp. 60-63, 1991.
[30] J. Pearl, Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference, Morgan Kaufmann Publishers, San Mateo, 1988.
[31] P. Spector, An Introduction to S and S-plus, Duxbury Press, 1994.
[32] K. VanLehn and J. Martin, Student Assessment using Bayesian Nets, International Journal of Human-Computer Studies, vol. 42, pp. 575-591, 1995.
[33] K. VanLehn and J. Martin, Evaluation of an assessment system based on a Bayesian student model, International Journal of AI and Education, vol. 8, pp. 179-221, 1997.
[34] N. M. Webb, Group Composition, Group Interaction and Achievement in Cooperative Small Groups, Journal of Educational Psychology, vol. 74, pp. 475-484, 1982.
[35] E. Wenger, J. S. Brown, and J. Greeno, Artificial Intelligence and Tutoring Systems, Morgan Kaufmann Publishers, 1987.
[36] K. H. Wilhelm, Use of an expert system to predict language learning success, System: An International Journal of Educational Technology and Applied Linguistics, 25:3, pp. 317-334, 1997.
[37] R. Baeza-Yates and B. Ribeiro-Neto, Modern Information Retrieval (ACM Press Series), Addison-Wesley, 1999.

Appendix

The conditional probabilities of the belief network of student learning performances (Fig. 3) are listed as follows.

LA of previous concept (prior): A 0.182 | B 0.16 | C 0.295 | D 0.182 | E 0.181

LA change of previous concept (prior): + 0.295 | = 0.386 | - 0.318
(-: regressive, =: kept the same, +: progressive)

P(LA | LA of previous concept, LA change of previous concept):
LA of previous concept | LA change | A | B | C | D | E
A | - | 0.04 | 0.368 | 0.368 | 0.04 | 0.184
A | = | 0.84 | 0.04 | 0.04 | 0.04 | 0.04
A | + | 0.2 | 0.2 | 0.2 | 0.2 | 0.2
B | - | 0.04 | 0.229 | 0.458 | 0.044 | 0.229
B | = | 0.84 | 0.04 | 0.04 | 0.04 | 0.04
B | + | 0.583 | 0.297 | 0.04 | 0.04 | 0.04
C | - | 0.04 | 0.04 | 0.528 | 0.352 | 0.04
C | = | 0.04 | 0.22 | 0.657 | 0.043 | 0.04
C | + | 0.44 | 0.44 | 0.04 | 0.04 | 0.04
D | - | 0.2 | 0.2 | 0.2 | 0.2 | 0.2
D | = | 0.04 | 0.04 | 0.44 | 0.44 | 0.04
D | + | 0.04 | 0.219 | 0.656 | 0.045 | 0.04
E | - | 0.2 | 0.2 | 0.2 | 0.2 | 0.2
E | = | 0.04 | 0.04 | 0.04 | 0.84 | 0.04
E | + | 0.04 | 0.44 | 0.04 | 0.44 | 0.04

P(Experience report | LA of previous concept):
Experience report | A | B | C | D | E
On time | 0.27 | 0.333 | 0.4 | 0.396 | 0.25
Delay | 0.07 | 0.446 | 0.466 | 0.208 | 0.125
Absent | 0.66 | 0.221 | 0.134 | 0.396 | 0.625

P(Reply | LA):
Reply | A | B | C | D | E
Y | 0.717 | 0.333 | 0.167 | 0.104 | 0.1
N | 0.283 | 0.667 | 0.833 | 0.896 | 0.9

P(Self-reflection | LA of previous concept):
Self-reflection | A | B | C | D | E
Over estimate | 0.269 | 0.224 | 0.254 | 0.57 | 0.25
Fit estimate | 0.407 | 0.554 | 0.253 | 0.19 | 0.25
Under estimate | 0.274 | 0.111 | 0.443 | 0.054 | 0.25
Absent | 0.05 | 0.111 | 0.05 | 0.186 | 0.25

Post (prior): Y 0.205 | N 0.795

P(Notes | Experience report):
Notes | On time | Delay | Absent
On time | 0.399 | 0.076 | 0.067
Delay | 0.335 | 0.846 | 0.068
Absent | 0.266 | 0.078 | 0.865

P(Homework | Reply):
Homework | Y | N
Early | 0.522 | 0.228
On time | 0.411 | 0.658
Delay | 0.067 | 0.114

P(Read frequency | Post, Reply):
Post | Y | Y | N | N
Reply | Y | N | Y | N
High | 0.75 | 0.079 | 0.704 | 0.096
Middle | 0.183 | 0.691 | 0.229 | 0.452
Low | 0.067 | 0.23 | 0.067 | 0.452

P(Login | Post):
Login | Y | N
High | 0.417 | 0.2
Middle | 0.516 | 0.428
Low | 0.067 | 0.372

P(Homework self-reflection | Homework):
Homework self-reflection | Early | On time | Delay
Early | 0.383 | 0.296 | 0.067
On time | 0.385 | 0.556 | 0.067
Delay | 0.232 | 0.148 | 0.866

P(Reading self-reflection | Homework self-reflection):
Reading self-reflection | Early | On time | Delay
Early | 0.718 | 0.2 | 0.181
On time | 0.215 | 0.65 | 0.548
Delay | 0.067 | 0.15 | 0.271


The conditional probabilities of the belief network of students' performances in the various concepts (Fig. 4) are listed as follows.

Input/Output (prior): A 0.167 | B 0.204 | C 0.259 | D 0.204 | E 0.167

P(Data Types | Input/Output):
Data Types | A | B | C | D | E
A | 0.587 | 0.084 | 0.132 | 0.040 | 0.040
B | 0.293 | 0.669 | 0.040 | 0.040 | 0.040
C | 0.040 | 0.167 | 0.657 | 0.167 | 0.040
D | 0.040 | 0.040 | 0.132 | 0.669 | 0.098
E | 0.040 | 0.040 | 0.040 | 0.084 | 0.782

P(Arithmetic Operations | Data Types):
Arithmetic Operations | A | B | C | D | E
A | 0.427 | 0.418 | 0.040 | 0.040 | 0.040
B | 0.213 | 0.418 | 0.263 | 0.040 | 0.040
C | 0.107 | 0.085 | 0.525 | 0.320 | 0.040
D | 0.213 | 0.040 | 0.132 | 0.560 | 0.040
E | 0.040 | 0.040 | 0.040 | 0.040 | 0.840

P(Switch Control | Input/Output):
Switch Control | A | B | C | D | E
A | 0.614 | 0.167 | 0.069 | 0.040 | 0.040
B | 0.102 | 0.167 | 0.343 | 0.262 | 0.040
C | 0.204 | 0.585 | 0.205 | 0.175 | 0.040
D | 0.040 | 0.040 | 0.343 | 0.436 | 0.098
E | 0.040 | 0.040 | 0.040 | 0.087 | 0.782

P(Iterations | Input/Output):
Iterations | A | B | C | D | E
A | 0.587 | 0.251 | 0.040 | 0.040 | 0.040
B | 0.293 | 0.501 | 0.132 | 0.040 | 0.040
C | 0.040 | 0.167 | 0.525 | 0.335 | 0.040
D | 0.040 | 0.040 | 0.263 | 0.418 | 0.195
E | 0.040 | 0.040 | 0.040 | 0.167 | 0.685

P(If-then-else Alternations | Arithmetic Operations):
If-then-else Alternations | A | B | C | D | E
A | 0.6846 | 0.1674 | 0.04 | 0.04 | 0.04
B | 0.1954 | 0.5014 | 0.1674 | 0.04 | 0.04
C | 0.04 | 0.2512 | 0.5851 | 0.1674 | 0.04
D | 0.04 | 0.04 | 0.1674 | 0.5851 | 0.1954
E | 0.04 | 0.04 | 0.04 | 0.1674 | 0.6846

P(Array | Message Passing):
Array | A | B | C | D | E
A | 0.404 | 0.263 | 0.144 | 0.044 | 0.044
B | 0.303 | 0.42 | 0.073 | 0.182 | 0.044
C | 0.205 | 0.18 | 0.496 | 0.231 | 0.044
D | 0.044 | 0.044 | 0.073 | 0.093 | 0.044
E | 0.044 | 0.093 | 0.214 | 0.45 | 0.824

P(Recursions | Message Passing):
Recursions | A | B | C | D | E
A | 0.587 | 0.25116 | 0.04 | 0.04 | 0.04
B | 0.293 | 0.5014 | 0.1316 | 0.04 | 0.04
C | 0.04 | 0.16744 | 0.5253 | 0.3349 | 0.04
D | 0.04 | 0.04 | 0.2631 | 0.4177 | 0.1954
E | 0.04 | 0.04 | 0.04 | 0.1674 | 0.6846