This project has been funded with support from the European Commission. This publication reflects the views only of the authors, and the Commission cannot be held responsible for any use which may be made of the information contained therein.


LLP KA3-ICT Project 2011-13

511485-LLP-1-2010-1-NO-KA3-KA3MP


Done-IT



Development of open system services for smartphones that facilitate new evaluation methods and enhance the use of immediate feedback on evaluation results obtained in tests as a creative learning tool.



WP 8: Developing an evaluation system for 4 operating system platforms for smartphones

D 8.2: Technical Implementation Report



Author and editor: R.P. Pein

Co-authors: T. M. Thorseth, M. Uran, S. Keseling, B. Voaidas, R. Støckert, B. Zorc, M. Jovanović, A. Koveš

Version: Final

Date: 31.03.2013

Start month: 1

End month: 27

Package leader: Institut za Varilstvo (IzV), Ljubljana, Slovenia

Language of the report: English











This document may not be copied, reproduced, or modified in whole or in part for any purpose without written permission from the Done-IT consortium. In addition to such written permission to copy, reproduce, or modify this document in whole or in part, an acknowledgement of the authors of the document and all applicable portions of the copyright notice must be clearly referenced. All rights reserved.

This document may change without notice, but consortium members should be informed, and the version number, stage, and date should be given.


Project consortium

Sør-Trøndelag University College, Faculty of Technology, Trondheim, Norway

Centrum for Flexible Learning, Söderhamn, Sweden

Petru Maior University of Targu-Mures, Tirgu Mures, Romania

Magyar Hegesztéstechnikai és Anyagvizsgálati Egyesülés (MHtE), Budapest, Hungary

Institut za Varilstvo (IzV), Ljubljana, Slovenia

HiST Contract Research, Trondheim, Norway






Table of Contents

Done-IT
Introduction
Basic data model
Data formats used in communication
Offline operation - Mobile PeLe Service Unit
Typical scenario
Preparation
Assessment time
Post-assessment time
Post class processing
Testing and evaluation
Design
Use cases
Register to system
Login as teacher
Create new assessment
Edit assessment
(UCT4) Start assessment
(UCT5) Monitor assessment
(UCT6) Close assessment
UCT7 Post assessment activity
(UCS1) Register to the system
(UCS2) Login to system
(UCS3) Start assessment
(UCS4) Respond to assessment
(UCS5) Retrieve assessment feedback
UCS6 Participate in Post Assessment activity
Messages - communication workflow
General Requests
Before Lecture
Assessment Phase
Interactive (SRS) Phase
After Lecture
Scoring Model
Item Level
Section Level
Assessment Level
Normalized Scores
PeLe Service Documentation
Design
RESTful Interface
Authentication
Access Rights
HTTP Status Codes
HTTP Headers
REST Resources
REST Resource Details
Implementation
Package rs.sol.Hist.TestDefinition
Package rs.sol.Hist.Testing
Package rs.sol.Hist.UserManagement
Teacher client documentation
Design
Use cases
UI and user experience
Code architecture - packages
The context
Communication infrastructure
Implementation
The context
hist.net - Communication with the service
hist.model - Data model
The main application DoneiTC.mxml
hist.doneit.datamodels
hist.doneit.events
SessionControlEvent
hist.doneit.gui
hist.doneit.gui.about
hist.doneit.gui.admin
hist.doneit.gui.create
hist.doneit.gui.create.simulate
hist.doneit.gui.end
hist.doneit.gui.general
hist.doneit.gui.itemRenderers
hist.doneit.gui.itemRenderes.adg
hist.doneit.gui.menu
hist.doneit.gui.monitor
hist.doneit.gui.postasess
hist.doneit.gui.start
Student app documentation
Use cases
UI and user experience
Code architecture
Project file structure
Implementation
Data processing
Mobile PeLe Service Unit Documentation
Why
What
How







Index of Tables

Table 1: Definitions for identifiers and names
Table 2: User roles
Table 3: Access rights matrix
Table 4: HTTP status codes
Table 5: MIME types
Table 6: Rights for ASI resources based on the role
Table 7: Custom metadata fields for ASI resources
Table 8: Custom metadata fields
Table 9: QTI-XML resources
Table 10: Service folders
Table 11: REST resource details
Table 12: Assessment parameters
Table 13: Completion state values
Table 14: Resource state values
Table 15: Assessment type values
Table 16: Section type values
Table 17: User account properties
Table 18: Attributes of the ASIEntity class
Table 19: Assoc. ends of the ASIEntity class
Table 20: Attributes of the ArbitraryData class
Table 21: Attributes of the DefaultSection class
Table 22: Assoc. ends of the DefaultSection class
Table 23: Attributes of the PossibleResponse class
Table 24: Assoc. ends of the PossibleResponse class
Table 25: Attributes of the Question class
Table 26: Assoc. ends of the Question class
Table 27: Attributes of the Test class
Table 28: Assoc. ends of the Test class
Table 29: Enumeration literals of the AsessmentType enumeration
Table 30: Enumeration literals of the ContentType enumeration
Table 31: Enumeration literals of the ResponseMultiplicity enumeration
Table 32: Enumeration literals of the ResponseType enumeration
Table 33: Attributes of the DefaultSectionCompletionState class
Table 34: Assoc. ends of the DefaultSectionCompletionState class
Table 35: Attributes of the DSLifeCycleState class
Table 36: Assoc. ends of the DSLifeCycleState class
Table 37: Attributes of the QuestionState class
Table 38: Assoc. ends of the QuestionState class
Table 39: Attributes of the SRSSection class
Table 40: Assoc. ends of the SRSSection class
Table 41: Attributes of the StudentResponse class
Table 42: Attributes of the StudentTestRecord class
Table 43: Assoc. ends of the StudentTestRecord class
Table 44: Attributes of the Testing class
Table 45: Assoc. ends of the Testing class
Table 46: Attributes of the TestingListener class
Table 47: Assoc. ends of the TestingListener class
Table 48: Enumeration literals of the ASILifeCycle enumeration
Table 49: Enumeration literals of the CompletionState enumeration
Table 50: Attributes of the User class
Table 51: Assoc. ends of the User class
Table 52: Enumeration literals of the Role enumeration
Table 53: Attributes and pins of the CmdCreateUser command
Table 54: Association ends and pins of the CmdCreateUser command
Table 55: Attributes and pins of the CmdCreateUsersBatch command
Table 56: Association ends and pins of the CmdDeleteUser command
Table 57: Attributes and pins of the CmdEditProfile command
Table 58: Association ends and pins of the CmdEditProfile command
Table 59: Association ends and pins of the CmdFetchCurrentUser command
Table 60: Classes in the hist.doneit.gui.create package
Table 61: Classes in the hist.doneit.gui.create.simulate package





Figure Index

Figure 1: Post assessment dilemma
Figure 2: PeLe System overview
Figure 3: System overview when using Mobile PeLe Service Unit
Figure 4: Main use cases of the system
Figure 5: Main communication workflow between system components
Figure 6: ASI Basic
Figure 7: ASI Definition
Figure 8: Testing
Figure 9: Listener
Figure 10: User Management
Figure 11: The current context minimizes the need for complicated dependencies between the components
Figure 12: Communication infrastructure
Figure 13: Context and communication related classes
Figure 14: Request flows
Figure 15: Server information
Figure 16: Login and zero configuration related classes
Figure 17: Resource definition data model
Figure 18: Results data model
Figure 19: Resource state related classes
Figure 20: About box
Figure 21: adminCanvas class and related classes
Figure 22: adminCanvas user interface (design view from IDE), showing the adminCanvas with the export function and the datagrid displayed
Figure 23: CreateAssessmentWizard classes and their relations
Figure 24: The editAssessmentComponent as seen in design mode in the development environment
Figure 25: The EditAssessmentComponent connected to the model objects (yellow), the EditQuestionItemRenderer and AlternativesItemRenderer, and how they are wrapped inside Air components (blue)
Figure 26: The QuestionItemRenderer displayed in design mode in the developer environment
Figure 27: The ViewAlternativeComponent and how a hist.model.Alternative is rendered
Figure 28: How the ControlComponent, MonkeycageClass, monkeyResponseClass and the HistogramForMonkeycage are related; communication and control is done through events up to the main application
Figure 29: The simulation control dialog
Figure 30: The end interface in "sectionClosed" state, where results from the assessment are displayed
Figure 31: Class diagram of the ResultsOverview and how it uses hist.doneit.gui.monitor.SummaryViewComponent to display results from the model
Figure 32: The SessionInformationPanel, used to display more detailed information about sessions from a list of sessions
Figure 33: The menu with the menu elements and the selector triangle to amplify the current position in the "user flow" of the program
Figure 34: The MonitorComponent, the two related views, SummaryViewComponent and MatrixViewComponent, the related model (yellow) and related item renderers
Figure 35: The MonitorComponent when it displays the default MatrixViewComponent
Figure 36: The MatrixViewComponent that is inside MonitorComponent
Figure 37: The SummaryViewComponent with two sections. The default section is the main assessment; the second chance section is not really used in the application as it is today. To the right, a histogram displays the selected question in more detail
Figure 38: The ItemSummaryResponseItemRenderer2 displaying results as percentage bars
Figure 39: The component in NoDocument state
Figure 40: The component in DocumentNotReady state
Figure 41: The assessment in DocumentReady state; the Start button is made available
Figure 42: The component in DocumentUploaded state. Normally the user is redirected to monitoring when the assessment has been started; the user has to deliberately enter the start state to get this state
Figure 43: Relation between the views
Figure 44: System overview - Use of MPSU with the internal AP enabled
Figure 45: MPSU architecture overview
Figure 46: Use of MPSU with one AP/Wireless Router added
Figure 47: Use of MPSU with multiple AP/Wireless Routers added for a combined wireless network capacity






Introduction

The PeLe (Peer Learning Assessment Services) system, developed within the frame of the Done-IT project, aims to provide teachers and students with a set of services that allow information exchange and information aggregation in order to support evaluation activities. These activities promote easy and fast verification- and elaboration-based learning processes immediately after a test has been completed. The main targets are higher education and VET, considering contexts both inside and outside the classroom.

The PeLe system allows, on the one hand, the students to use their own smartphones, tablets or portable computers to provide responses for assessments and post-assessment activities and, on the other hand, the teacher to monitor the activity and give verifying or elaborative feedback to individual students or groups of students immediately after a test or activity.

This is a key factor in helping students improve their skills through active, collaboratively supported learning. Students learn, while they still remember the questions in the test, why the correct answer is correct and why the other ones are incorrect. Thus, mobile technology provides new evaluation and testing approaches within education and training.

The traditional focus when introducing electronic assessment is to save time on correcting answers and to give immediate feedback on learning. This project turns that perspective around and focuses on the learning aspect. Suppose an assessment has been performed: the students have been working on a problem for some time and have delivered their results to you digitally, and you, as a teacher, can see the results immediately. What do you do next?

Figure 1: Post assessment dilemma




If the feedback to you as a teacher is good, you can, depending on the results, proceed in many different ways. With the Done-IT assessment solution our target is to provide the teacher with instant feedback on the status of the assessment: which questions did the students solve correctly, and which questions caused more problems? The teacher can then do the following for each question:

a) Continue as usual and only give the results.

b) Give the students verifying feedback and explain what has been misunderstood.

c) Give the students a hint of what might be the problem for this question, but not the actual solution.

d) Give the students the results ("this is what you voted") and allow the students the possibility to discuss the problem.

e) Allow the teacher to pick one question, as set up in the assessment, and send it out prepared for an SRS session on the question.

f) Allow the teacher to collect several of the questions after a pedagogical treatment, and allow the students to resubmit their answers to parts of the assessment.

In cases c) and d), should the students be allowed to take part of the test again, or should the students, after a group/peer discussion, be able to renegotiate their responses? There are several possibilities that we can offer the teacher.

The main advantage of revisiting the test subject just after the test is that the problem is fresh in mind, since the student has just been working with it. The student might have spent time on parts of the assessment without finding the right answer. Depending upon the nature of the subject being taught, a single hint from a peer student or a teacher might be enough to solve the problem.

The system has been developed using an iterative prototyping process. The main system stakeholders, teachers and students, have been involved in the design process. The very context of use dictates that a special emphasis must be placed on the human-computer interaction aspects: the educational activity must be supported in a transparent and usable way. The system has been evaluated through a number of usability tests that included both expert-based evaluation and user testing.




The system is based on a client-server architecture, and the communication between the components runs over the HTTP protocol. The main components of the system (Figure 2) are a server application, a student client application and a teacher control application. The responsibilities of each component are defined as follows:



- The server component includes the PeLe web service, the back-end database (which in our case is collocated with the web service, but by no means restricted to be so) and portal pages. The server also hosts the students' web app. The web service manages the central data model and keeps the entire system in a consistent state at all times. It provides an interface to remotely modify and read the productive resources following the "Representational State Transfer" (REST) principles. The web service is the main communication node, and its data represents the "truth" if synchronization issues occur.



- The teacher software is responsible for modelling the workflow for a given use case. This is achieved by modifying the data inside the web service according to the required workflow. The teacher role is granted write access to various parameters which allow for shaping a custom workflow.



- The student software usually reacts to the actions of the teacher. Again, the user interface has to implement the workflow for the respective use case. A central aspect of the student client is its ability to be remotely controlled. According to the state of the data maintained in the web service, the student client may entirely block input or provide restricted read/write access to a set of resources.

Apart from the main basic components, the system includes a few other components that aim at improving the user experience and facilitating system maintenance:

Figure 2: PeLe System overview






- The authentication service allows integration with other authentication methods and also supports the scenario where one user accesses several servers.

- The autoupdate service allows the teacher client to update over the internet. When a new build is available, it is published at the autoupdate location. The teacher clients check that location periodically and notify the user to update.

- The zero configuration service allows the teacher client to automatically retrieve the information required to connect to a certain server.

The PeLe tool is a web service in combination with use-case-specific clients. During lectures, the system is controlled by the teacher, and the student clients react according to the situation, thus generating an interactive environment with immediate feedback between teacher and students.

The ability to react to the teacher's actions poses certain challenges in the implementation, as the HTTP protocol does not specify a reliable back-channel for browsers yet. Possible approaches to address this issue are polling, long polling or WebSockets, each of which undeniably has specific drawbacks.
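The polling approach can be illustrated with a small sketch: a client periodically asks the service for the current assessment state and reacts when it changes. The endpoint path and the plain-text state response here are assumptions for illustration, not the actual Done-IT API.

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Minimal polling sketch: every few seconds the client asks the service for
    // the current assessment state and reacts to changes.
    public class StatePoller {
        private final ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();
        private final String stateUrl; // e.g. "http://server/pele/assessments/ABC12/state" (hypothetical)
        private volatile String lastState = "";

        public StatePoller(String stateUrl) { this.stateUrl = stateUrl; }

        public void start() {
            scheduler.scheduleAtFixedRate(this::pollOnce, 0, 3, TimeUnit.SECONDS);
        }

        private void pollOnce() {
            try {
                HttpURLConnection con = (HttpURLConnection) new URL(stateUrl).openConnection();
                con.setRequestMethod("GET");
                try (InputStream in = con.getInputStream()) {
                    String state = new String(in.readAllBytes(), StandardCharsets.UTF_8).trim();
                    if (!state.equals(lastState)) {
                        lastState = state;
                        onStateChanged(state); // e.g. unlock or block the input forms
                    }
                }
            } catch (IOException e) {
                // Network errors are expected in a classroom; simply retry on the next tick.
            }
        }

        private void onStateChanged(String state) {
            System.out.println("Assessment state is now: " + state);
        }
    }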

In this interactive environment, both the teacher's and the student's tools are designed to be streamlined. The amount of available options is kept small, and the important functionality for the targeted workflow can be reached with a minimum amount of interactions. Quick interaction with the service is necessary because of the time pressure created in the interactive environment. Neither teacher nor students should get distracted or overwhelmed by special functionality. Any delay in usage of the tool directly causes the other users to wait for the intended change.

Outside the lecture, other clients (e.g. a dynamic web page) are used to work on the resources that have been collected inside the classroom. These tools can offer a rather complex user interface, as the time pressure of a live system does not apply here. Typically, the teacher may create new or modify existing content. Further, both teachers and students have read access to the collected data. Naturally, the teacher is allowed to see all data related to his/her lectures. Students usually only have access to unlocked data, further filtered for personal content. The data of other students remains entirely invisible.

Basic data model

The system data model includes data structures mainly for assessment definition, assessment results and users. In the data model referring to assessments we consider three levels: Assessment, Section, Item (ASI). An assessment is considered to be a single working unit that is created by an instructor and used for a specified task, e.g. one single lesson during a course. Each assessment can have one or more sections, which in their turn can contain items. An assessment item corresponds to a single question in a test.

The ASI resources have a set of metadata properties assigned to them. The most important metadata properties refer to the resource state and to the current resource inside the "container resources" (i.e. the currentSection for the assessment and the currentItem for the section).
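As a rough picture of these three levels, the sketch below models the ASI hierarchy and its container metadata in Java; the class and field names are illustrative assumptions (the real classes are documented later, in the packages rs.sol.Hist.TestDefinition and rs.sol.Hist.Testing).

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative ASI hierarchy: an assessment contains sections, a section contains
    // items. Each level carries a state plus a pointer to the "current" child resource.
    class Item {
        String questionText;
        String state;                           // e.g. "closed", "open" (assumed values)
    }

    class Section {
        String state;
        Item currentItem;                       // metadata: the item currently in focus
        final List<Item> items = new ArrayList<>();
    }

    class Assessment {
        String sessionCode;                     // short code students use to join
        String state;
        Section currentSection;                 // metadata: the section currently in focus
        final List<Section> sections = new ArrayList<>();
    }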

Access to the resources is accomplished through REST commands. The operations available for each resource at a given moment in time depend on the resource state and on the role of the user requesting the operation. For example, the teacher has the possibility to create new instances on each individual level and can also assign various states to every single resource. These states control the accessibility, i.e. they allow or disallow certain levels of interaction (reading/writing) with a resource for the students.
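The combination of role and state can be pictured as a simple check like the one below; the role names and the "open" state are assumptions for illustration (the complete rules are given in the access rights matrix later in this report).

    // Illustrative access check: whether a user may write to a resource depends on
    // both the user's role and the current resource state.
    enum Role { TEACHER, STUDENT }

    final class AccessRules {
        static boolean mayWrite(Role role, String resourceState) {
            if (role == Role.TEACHER) {
                return true;                    // teachers shape the workflow
            }
            // Students may only write while the resource is open for responses.
            return "open".equals(resourceState);
        }
    }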

Data formats used in communication

The messages exchanged between the system components use a number of formats. The system supports one standard format, the "IMS Question & Test Interoperability Specification" (QTI), and also proprietary formats in XML and CSV. Unfortunately, version 2 of the QTI specification has never been finished, except for a draft for the items. Therefore the older, yet complete, version 1.2 has been selected as the main data model. From the QTI format only the relevant elements have been included in the implementation, and several additions have been applied, mainly in the form of specific metadata. The QTI format proved to be inefficient for scenarios where the clients need frequent updates, so the proprietary formats have been introduced. The proprietary XML supports two levels of detail: complete results and summarized results. To further improve the response time and reduce the network traffic demands we have also added a CSV format. The CSV format considerably reduces the amount of data that needs to be transferred for updating a specific client (e.g. the teacher client).
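To make the size difference concrete, the sketch below parses a hypothetical summarized-results CSV payload of the kind a client might poll for; the column layout is an assumption for illustration, not the actual Done-IT format.

    import java.util.Arrays;

    // Parses a hypothetical summarized-results payload: one line per item with
    // counts of correct/incorrect/unanswered responses. Far fewer bytes than the
    // equivalent QTI or full-results XML document.
    public class SummaryCsvParser {
        public static void main(String[] args) {
            String payload = "item;correct;incorrect;unanswered\n"
                           + "q1;18;5;2\n"
                           + "q2;7;15;3\n";
            Arrays.stream(payload.split("\n"))
                  .skip(1)                      // skip the header row
                  .map(line -> line.split(";"))
                  .forEach(f -> System.out.printf("%s: %s correct, %s incorrect, %s unanswered%n",
                                                  f[0], f[1], f[2], f[3]));
        }
    }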

Offline operation - Mobile PeLe Service Unit

In certain scenarios, accessing the internet might not be possible, and thus the PeLe service is not available if it runs on a remote server. For this reason we have developed a standalone offline solution that does not rely on an Internet connection: the Mobile PeLe Service Unit (MPSU). The MPSU includes everything that is needed to run the PeLe service without any external dependency. The server components are installed onto a portable unit that provides both the service and the wireless network necessary for communication with the student devices.





Typical scenario

The typical scenario refers to the usage of the PeLe system within a classroom and tries to capture the most important elements of the typical user tasks and activities supported by the system.

Preparation

The instructor, using the teacher client, prepares a short test in advance and uploads it to the service. She can include some text directly in the items, but more often than not it is more convenient to provide the body of the questions on a sheet of paper, or even to project them during the lecture. The most important part of the assessment creation is to define how many alternatives each question has, which ones form the correct answer and how many choices the student is allowed to make.

The teacher can also change the scoring rules and possibly simulate the assessment in order to see the probability of obtaining a certain score by answering the test randomly.

After the assessment is prepared, the teacher can choose to immediately upload it to the server or just save it on disk and upload it later. When the assessment is uploaded to the server, it gets assigned a randomly generated session code (typically 5 random characters).
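Such a code can be generated in a few lines; the following is a generic sketch, not the project's actual generator.

    import java.security.SecureRandom;

    // Generates a short random session code, e.g. "K3Q7Z". Ambiguous characters
    // such as 0/O and 1/I are left out of the alphabet to ease manual entry.
    public class SessionCode {
        private static final String ALPHABET = "ABCDEFGHJKLMNPQRSTUVWXYZ23456789";
        private static final SecureRandom RND = new SecureRandom();

        public static String next(int length) {
            StringBuilder sb = new StringBuilder(length);
            for (int i = 0; i < length; i++) {
                sb.append(ALPHABET.charAt(RND.nextInt(ALPHABET.length())));
            }
            return sb.toString();
        }

        public static void main(String[] args) {
            System.out.println(next(5));
        }
    }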

Figure 3: System overview when using Mobile PeLe Service Unit




Assessment time

The teacher distributes the test questions and provides the students with the session code, which they can use to connect to that particular assessment by entering it into their client. This allows for easily joining any ad-hoc session without picking an assessment from a list on the service.

Students use their devices and post the answers they deem correct to the server. When they are done with the assessment they can submit it (a digital action similar to handing the test paper to the teacher in the physical world).

On her application, the teacher can monitor the students that joined the session using their client. After all students are ready, the test can be started, and the teacher gets live data at predefined intervals.

The teacher can choose to monitor using a summary view (which gives aggregated information on how many students replied or not, correctly or not, or marked a question) or to see a more detailed view that shows information for each student in a "monitoring matrix".

Access to all these details allows the teacher to prepare, already during the test, the discussion that will follow in the post-assessment activities. The teacher can spot items that might have been misunderstood or well understood, as well as items that require discussion and further elaboration.

When the time allocated for the assessment passes, the teacher can stop the test by pushing a button. After this, the students are prevented from changing their replies, regardless of whether they submitted or not.

Post-assessment time

Now the teacher can select the items one by one and discuss the results as necessary. During the discussion she might decide that a certain item should be answered again after the discussion, or she may want to give the students a second chance after offering some hints. The discussion might generate an ad-hoc dilemma or question that can easily be sent to the students to respond to. The results can be shown using a projector or, even better, a smartboard. The teacher can interact with the results charts and highlight alternatives as necessary for the discussion.

If the teacher considers it necessary, she can release the results immediately after finishing the discussions, or she can wait and release them later, after the class.




Post class processing

The teacher can consult a very detailed view that includes, for each student, the answers given to every question in the main assessment and in the post-assessment activities. The teacher can export that data for further processing in her spreadsheet software of choice directly from the teacher application. Alternatively, she can log in to the portal pages and access additional formats for the results.

Implementation details

The main server components are implemented in Java, deployed as a Tomcat web application, and use a MySQL back-end database. The teacher client interface is implemented in Flex as an Adobe Air desktop application. The student client is implemented as an HTML5 web app.
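As a rough server-side illustration, a read-only endpoint in a Tomcat web application could look like the servlet below; the URL pattern and the response body are assumptions for illustration, not the project's actual code.

    import java.io.IOException;
    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical read-only endpoint returning service information, in the spirit
    // of the "GET root" request described in the communication workflow section.
    @WebServlet("/pele/info")
    public class ServiceInfoServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            resp.setContentType("text/xml;charset=UTF-8");
            resp.getWriter().print("<service><name>PeLe</name><version>1.0</version></service>");
        }
    }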

Testing and evaluation

The system components have been tested and benchmarked both individually and as a whole. The first level of testing included functional and unit testing. The second level consisted of evaluations using inspection methods as well as small-scale user tests in controlled environments. The server-side performance has been benchmarked through a number of simulations. Finally, some larger-scale user tests have been conducted.






Design


Use cases

The use cases described here mainly focus on the functionality regarding a standard assessment sequence.

Some steps in the use case descriptions are in parentheses. These are steps that we do not need to take into account in the early stage of the project, but that might become an issue later in the project.



Figure 4: Main use cases of the system




Register to system

Users register themselves to the system. The default role assigned is student. Elevated rights are assigned by an administrator.

Login as teacher

Scope: Controller interface.

Arena: Outside classroom, Inside classroom

Preconditions: User has been registered on the system and has a username and password.

Main Success Scenario

1. Controller interface provides a login interface.
2. Teacher chooses an assessment server (from a list of available servers).
3. Teacher provides username.
4. Teacher provides password.
5. Server confirms that it is open for business.


Create new assessment

Scope: Controller interface

Arena: Outside classroom.

Preconditions: Controller has registered on the system and has a username, password and defined role. The teacher has the assessment written on paper with multiple-choice alternatives given.

Main Success Scenario

1. Teacher chooses "create new assessment".
2. The system starts the "create new assessment wizard".
   a) Teacher chooses the type of assessment: assessment (, assignment or self-assessment).
   b) (Teacher sets the date and time of start.)




   c) (Teacher sets the date and closure time.)
   d) Teacher sets the number of questions according to the written assessment.
   e) Teacher sets the default number of alternatives for each question.
   f) Teacher defines the score rules.*
   g) Teacher sets the default evaluation model.
3. System generates an access code (which the teacher can write on the written assessment).
4. The system now fills in and presents an edit assessment page, where all data related to the assessment can be modified and changed.
5. Teacher takes question 1 and modifies the number of alternatives, if it is not the default one.
6. Teacher sets the correct alternative(s) on question number 1.
7. Teacher modifies the default evaluation model on question number 1, if it is not the default one.

Teacher goes through steps 5-7 until all questions are set according to the written assessment.


Edit assessment

Scope: Controller interface

Arena: Outside classroom.

Preconditions: Controller has registered on the system and has a username, password and defined role. (Teacher has entered the edit assessment interface.)

Main Success Scenario

1. System presents an interface where the entire assessment setup is shown in an overview.
2. The teacher now
   a) deletes/adds a question,
   b) adds/deletes alternatives in a question,
   c) sets/changes the correct alternative of the question,
   d) sets/changes the score for each correct answer.
3. System deletes one question and presents an updated version of the assessment.





(UCT4) Start assessment

Scope: Controller interface

Arena: In classroom.

Preconditions: Controller has registered on the system and has a username, password and defined role. The assessment is a written paper with multiple-choice alternatives written on the paper, ready to hand out to the students. The assessment is submitted to the system.

Main Success Scenario:

1. Teacher logs in to the system.
2. Teacher selects assessment.
3. System goes into "In classroom" mode.
4. System shows the schedule of upcoming events and flags the closest event for selection.
5. Teacher selects the event.
6. The system shows the assessment code for students to use in order to participate in the assessment.
7. System gives the monitor interface, showing the students that are logged in.
8. The teacher presses Play when the assessment is handed out and conditions are right.
9. System confirms to the teacher and students that the assessment is open.

Purpose: See that all students are ready and logged in, ready to go. Allow the students to submit responses.


(UCT5) Monitor assessment

Scope: Controller interface

Arena: In classroom.

Preconditions: Controller has registered on the system and has a username, password and defined role. The controller has started the assessment (UCT4) and the students are working with the assessment.

Main Success Scenario

1. System provides a status window showing how many students are present and who they are.




2. System provides the status of the responses given by all students.
3. System shows the status of the submitted assessments.

Alternative Scenario

Purpose: Get information about how many students have finished the work and how many have submitted. When they all have submitted, this might change the time plan of the event in the classroom. Based on the responses, the teacher may have to change the following activities.


(UCT6) Close assessment

Scope: Controller interface

Arena: In classroom.

Preconditions: Controller has registered on the system and has a username, password and defined role. The assessment is a written paper with multiple-choice alternatives written on the paper.

Main Success Scenario

1. Teacher presses Stop.
2. System responds with a small status report and asks the teacher to confirm.
3. Teacher confirms closure, depending on the status report.
4. System gives the teacher a choice of student feedback:
   a. No feedback.
   b. Number of correct answers out of the total.
   c. Number of correct answers out of the total, and which questions were answered correctly and which wrongly.
   d. Number of correct answers out of the total, which questions were answered correctly and which wrongly, and the solution for each question.
5. Teacher selects the desired alternative for feedback.

Alternative Scenario

3.a Teacher decides to leave the assessment open until the status is OK.

3.b Teacher opens the monitor window and looks at the detailed status.




Purpose: Close the assessment for the students and decide upon the feedback level. Flexibility gives the teacher a choice of how he/she would like to continue with the post-assessment work, or just leave it.

UCT7 Post assessment activity

Scope: Controller interface

Arena: In classroom

Preconditions: An assessment procedure up to UCT6 has been performed and the teacher wants to summarize.

Main Success Scenario

1. System gives an assessment report.
2. Teacher considers the responses in an assessment report tool.
3. Teacher identifies questions that caused problems.
4. Teacher makes his/her pedagogical considerations and choice of treatment.
5. Teacher uses the SRS system to send a specific question back to the students.
6. System creates an SRS setup that is shown on the students' interface.
7. Students give their responses.
8. System gives the results back to the students.

Alternative Scenario

3.a Teacher leaves the assessment. (No post-assessment activity.)

Purpose: Now the teacher can take the results and create a new arena for learning. Students have the problems fresh in mind. From considering the responses: are there questions where the students made many mistakes (70% have the wrong answer), or are there signals of an obvious problem with a question? Here we want a tool that can help the teacher identify questions where the students have problems.


(UCS1) Register to the system

Scope: Student interface




Arena: Any

Preconditions: None

Main Success Scenario

1. (Student downloads the application to use for wireless assessment.)
2. Student starts the application / enters the student interface.
3. Student selects "register".
4. Student provides information:
   a. First name
   b. Family name
   c. Email
   d. Username
   e. Password
5. System checks the username against existing ones and makes a small password check. If the username is taken or the password security level is low, return to step 4.
6. System confirms that the student has registered.

Purpose: Create a username and password, with real name and email address, for identification.


(UCS2) Login to system

Scope: Student interface

Arena: In classroom / Any

Preconditions: Student has registered (UCS1) and has a username and password.

1. Student starts the interface (app / web interface).
2. System shows a login page asking for username and password.
3. Student provides username and password.
4. System confirms login.




Purpose: General login procedure to identify the user on the system, using data from UCS1.

(UCS3) Start assessment

Scope: Student interface

Arena: In classroom

Preconditions: Student has registered on the system (UCS1), has a username and password, and has logged in (UCS2). The student has a device to respond with.

Main Success Scenario (collected submission)

1. System asks for an assessment code.
2. Student provides the assessment code given by the teacher in the room.
3. System confirms login and readiness for the assessment: "Please wait".
4. Student waits until the assessment is opened.

Purpose: Identify the student on the system as participating in the assessment. This identification should later be used to inform the teacher who is ready and present before the assessment is opened.

(UCS4) Respond to assessment

Scope: Student interface

Arena: In classroom

Preconditions: Student has gone through UCS1 to UCS3. The assessment is a written paper handed out to the student.

Main Success Scenario (collected submission)

1. System displays the ready-to-retrieve-response interface.
2. Student solves the questions on the paper and decides on the alternatives for each.
3. Student registers the answer to question 1 in the system.
4. System retrieves the response and stores the result.

Steps 2 and 3 are repeated for every question until all answers have been given.




5. Student submits the assessment.

Success Scenario (sequential submission)

1. Student solves question 1 and finds the right alternative.

2. Student responds to question 1.

3. System retrieves the response and stores the result.

Student repeats 1-3 until all questions are answered.

4. Student submits the answers.

Comment: Both scenarios are possible and will vary from student to student. A combination of both is also possible, where after a while the student sends the questions that are already solved, continues with the difficult ones, and sends those answers as they are solved.

Alternative Scenario:

2a. The system has been waiting a long time (5 minutes); the responding device goes to sleep and is offline.

3a. Student loses the network connection, or the battery of the response device is dead.

(UCS5) Retrieve assessment feedback

Scope: Student interface

Arena: In classroom

Preconditions: Student has gone through UCS2 - UCS4 and submitted.

Main Success Scenario:

1. System displays the results of the assessment:

a. How many correct, how many wrong.

b. a + which questions were wrong and which were correct.

c. a + b + what the correct answer was.

Purpose: Give the student feedback on the assessment.




(UCS6) Participate in post-assessment activity

Scope: Student interface

Arena: In classroom

Preconditions: Student has gone through UCS1 to UCS5. The assessment is a written paper handed out to the student. The teacher wants to perform a post-assessment activity with SRS.

Main Success Scenario

1. Student follows the teacher's instructions.

2. Student votes when instructed, using the SRS system.

3. System shows the alternatives.

4. Student votes.

Repeat 2-4 until the teacher is finished.

Purpose: Give the student a second chance and let him/her learn something from the assessment.


Messages


Communication workflow

This section describes the communication flow between the clients and the service.

1. (T) Upload assessment

2. (T) Open assessment

3. (S) Login

4. (T) Start

5. (S) Respond to assessment

6. (T) Monitor

7. (S) Submit

8. (T) Proceed to SRS

9. (T) Run SRS

10. (S) Respond to SRS

11. (T) Finalize lecture and release results




12. (S) View personal results

Figure 5: Main communication workflow between system components

General Requests

GET root

- read information about the service




GET whoami

- read own user account information

Before Lecture

(T) POST assessment

- Upload prepared assessment with sections

- returns unique identifier and code

(T) GET assessment setup

- Download the complete assessment setup after selecting an existing session

On selecting an existing assessment, the teacher client will move on to download the entire assessment setup.
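
The following sketch shows how a teacher client might issue these two calls using Python and the "requests" library. The endpoint paths, the payload file and the "Location" header are assumptions for illustration; a real client must discover the actual URLs via the root resource.

# Hypothetical teacher client, "Before Lecture" phase (paths are assumed).
import requests

BASE = "http://example.org/pele/rest"        # service root, see above
AUTH = ("teacher01", "secret")               # HTTP Basic credentials

# (T) POST assessment: upload a prepared assessment with sections
xml = open("assessment.xml", "rb").read()    # assumed QTI-based setup file
resp = requests.post(BASE + "/assessment", data=xml,
                     headers={"Content-Type": "application/xml"}, auth=AUTH)
resp.raise_for_status()
assessment_url = resp.headers.get("Location")  # assumed: identifier returned here

# (T) GET assessment setup: download the complete setup of an existing session
setup = requests.get(assessment_url, auth=AUTH).text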

Assessment Phase

(T) GET assessmentlist

- read list of currently prepared assessments

- pick single assessment from the list

(S) GET state (polling)

- Polling assessment state by code

- optionally: push functionality through long polling or WebSockets

(T) PUT open "default"

- Open assessment

- Open section "default"

- Set currentSection="default"

- Other sections should still be unavailable to students




(T) GET responses (polling)

- Polling assessment section responses by identifier

- optionally: push functionality through long polling or WebSockets

(T) GET state_Teacher (polling)

- Polling assessment state by name

The teacher client is polling for the state as well. Currently this is only used to display and retrieve the current section, but the polling can also be used in multi-client scenarios for a teacher.

(S) GET "default"

- Read contents of the currentSection

- sectionType is "assessment"

- Display assessment view

(S) PUT response

- Submit a response to a single item

- Retrieve header "X-(PeLe-)Accepted-Answer" as confirmation

(S) GET myresponses (sync)

- Read all responses given by the logged-in user

- Used to sync the client with the server

- Overview for the student before submission

(S) PUT submit "default"

- Set the "default" section to submitted state
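
Taken together, the student side of this phase can be sketched as the following Python client. All paths, parameter names and the state representation are assumptions; only the request sequence and the "X-(PeLe-)Accepted-Answer" confirmation header (spelled here as "X-PeLe-Accepted-Answer") come from this report.

# Hypothetical student client, assessment phase (paths/parameters are assumed).
import time
import requests

BASE = "http://example.org/pele/rest"
AUTH = ("student01", "secret")

# (S) GET state (polling): wait until the "default" section has been opened
while True:
    state = requests.get(BASE + "/state", params={"code": "ab12"}, auth=AUTH).text
    if "default" in state:            # assumed textual state representation
        break
    time.sleep(2)                     # plain polling; long polling or
                                      # WebSockets would remove this busy wait

# (S) GET "default": read the contents of the current section
section = requests.get(BASE + "/default", auth=AUTH).text

# (S) PUT response: submit a response to a single item
r = requests.put(BASE + "/response", data={"item": "q1", "answer": "B"}, auth=AUTH)
assert "X-PeLe-Accepted-Answer" in r.headers   # confirmation header

# (S) PUT submit "default": set the section to submitted state
requests.put(BASE + "/submit/default", auth=AUTH)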




Interactive (SRS) Phase

(T) PUT open "srs"

- Set section "default" to "finished"

- Open section "srs"

- Set currentSection="srs"

- Other sections should still be unavailable to students

(T) POST SRS item

- Upload new item or create copy with link to existing one

- Switch current item to the new one

(S) GET SRS item

- Read the contents of the current item

- Display input form

(S) PUT response

- Send response to the item (form parameters)

- Retrieve header "X-(PeLe-)Accepted-Answer" as confirmation

(T) PUT item close

- Close the current item (e.g. by timer)

(S) GET state (polling)

- sectionType is "srs"

- Display SRS view

(T) GET item responses

- Read the results for the current item
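
The teacher side of the SRS phase can be sketched analogously. Again, the endpoint paths and the item payload are assumptions; only the call order follows the list above.

# Hypothetical teacher client, interactive (SRS) phase (paths are assumed).
import requests

BASE = "http://example.org/pele/rest"
AUTH = ("teacher01", "secret")

# (T) PUT open "srs": finish "default" and make "srs" the current section
requests.put(BASE + "/open/srs", auth=AUTH)

# (T) POST SRS item: upload a new item (or a copy linked to an existing one)
item_xml = b"<assessmentItem>...</assessmentItem>"   # placeholder payload
requests.post(BASE + "/srs/item", data=item_xml,
              headers={"Content-Type": "application/xml"}, auth=AUTH)

# (T) PUT item close: close the current item, e.g. triggered by a timer
requests.put(BASE + "/srs/item/close", auth=AUTH)

# (T) GET item responses: read the results for the current item
results = requests.get(BASE + "/srs/item/responses", auth=AUTH).text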




After Lecture

(T) PUT complete assessment

- Set assessment to "complete"

- Results visible on student devices


Scoring Model

Item Level

The responses (single, multiple) sum up the score of each selected alternative and have a constrained min and max value.

- score map: M = {(a_i, s_i), ...}

- selected alternatives: S = {a_i, ...}

- upper boundary: u

- lower boundary: l

- score for no choice: nc

If there is a selection then:

itemscore = max(l, min(u, ∑{ s_i | a_i ∈ S ∧ (a_i, s_i) ∈ M }))

else:

itemscore = max(l, min(u, nc))

Individual scoring states

For a single candidate, the itemscore could be used to determine one out of 4 different states:

- not answered

- itemscore = l -> incorrect






- itemscore = u -> correct

- else -> partly correct
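
As a sanity check, the item-level rule and the four scoring states can be transcribed into a few lines of Python. The function and variable names are illustrative; the formulas are taken directly from above.

# Minimal sketch of the item-level scoring rule and the derived state.
def itemscore(score_map, selected, u, l, nc):
    """score_map: {alternative: score}; selected: set of chosen alternatives."""
    if selected:
        total = sum(s for a, s in score_map.items() if a in selected)
    else:
        total = nc                             # score for no choice
    return max(l, min(u, total))               # clamp into [l, u]

def scoring_state(score, answered, u, l):
    if not answered:
        return "not answered"
    if score == l:
        return "incorrect"
    if score == u:
        return "correct"
    return "partly correct"

# Example: two alternatives worth 1 point each, one distractor at -1
M = {"A": 1, "B": 1, "C": -1}
s = itemscore(M, {"A", "C"}, u=2, l=0, nc=0)      # 1 - 1 = 0, clamped to 0
print(scoring_state(s, answered=True, u=2, l=0))  # -> "incorrect"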

Section Level

sectionscore = ∑[ itemscore_i ]

Assessment Level



- scoreweight: w (to be implemented as metadata field)

Typically:

- w_default = 1

- w_srs = 0

assessmentscore = ∑[ sectionscore_i * w_i ]

Normalized Scores

In addition to the absolute scores, a normalized score is required. The basic rules are:

- 0 points = 0.0 (0%)

- max score = 1.0 (100%)

- Negative scores use the same scaling factor as the positive range and are mapped to negative values

For the normalization, one point p is equivalent to the fraction:

max * p = 1.0 -> p = 1/max
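
A short Python sketch of the aggregation and normalization rules above, using illustrative weights and an illustrative maximum score:

# Section/assessment aggregation and normalization as defined above.
def sectionscore(itemscores):
    return sum(itemscores)

def assessmentscore(sectionscores, weights):
    # assessmentscore = sum(sectionscore_i * w_i); typically w_default=1, w_srs=0
    return sum(sc * w for sc, w in zip(sectionscores, weights))

def normalized(score, max_score):
    # one point is worth p = 1/max; negative scores use the same scaling factor
    return score / max_score

total = assessmentscore([12.0, 5.0], [1, 0])   # the SRS section is weighted out
print(normalized(total, max_score=20.0))       # -> 0.6, i.e. 60%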














PeLe Service Documentation


Design

The "Assessment Service"

is a RESTful web service. It manages the resources within the Done
-
IT and keeps
them synchronized over the network. Rather than providing a sophisticated user interface, it only offers some basic
control elements for maintenance and testing purposes. Vari
ous UI components can be attached to the service,
offering use
-
case specific work flows, such as the Teacher Control Interface and the Student Response Interface.

RESTful Interface

The service provides resources that are accessible through a URI. The communication protocol is HTTP and the preferred data format is XML. The most important resources follow the IMS Question & Test Interoperability Specification with some project-specific additions. Some resources have no immediate connection to the IMS specification, e.g. the live monitoring information.

The root URL of the service is given by the protocol, the server name (+ port) and the service name. For practical reasons, this is the location of the portal page, and the dynamic REST services are located one level deeper, inside the folder "/rest", e.g.:

http://example.org/pele

http://example.org/pele/rest

The Root Resource represents the entry point for the whole service. It contains a set of links to further resources within the system. Each client should use this resource to determine information such as the version number and the URLs of the required resources.

Identifiers and Names

In the QTI specifications, each resource has an attribute called "identifier" or "ident". This attribute always contains the "Name". It is up to the server-side implementation whether this is also equivalent to the "Global Id". The exact definitions of "Global Id", "Name" and "Title" are as follows:

Table 1: Definitions for identifiers and names



Global Id (defined by: service)

The Global Id is a server-controlled identifier that is globally unique (within the entire system). In the PeLe service, it is equivalent to the "soloist id" which is managed by the Soloist framework. It can be used to directly access any resource/object in the data model, mainly each Assessment, Section or Item in the Library. If no name is specified when creating a new resource, the global id is automatically also used as name.

Name (defined by: service/client)

The name is locally unique (within a single assessment) and can be specified by the creating instance. It lies within the responsibility of the creator to use unique names. Defining a name is not possible on the assessment level (there it is always a copy of the global id); the name can only be set for sections and items. Global uniqueness is not required, as names are always used within the scope of the parent resource, where the top level - the assessment - always has a unique name from the service. It is merely a convenient way for the client to keep track of the created data on the server without having to check an assigned id each time.

Title (defined by: client)

The contents of the title are not meant to be parsed by a machine. They are purely human readable and in the full responsibility of the client/person setting it.

Routing

Each access to the service is routed according to the URL used in the request. Some resources (e.g. the root resource) provide read-only access via GET and do not require authentication.

Authentication

By default, the PeLe system uses BASIC HTTP authentication, which can be made reasonably secure through an HTTPS connection. In addition, it supports the external Feide authentication scheme. Independent of the initial authentication scheme, the system uses a stateless "login token" mechanism. Instead of storing username and password on the client side, the only information required for accessing a resource is an encrypted token. This token contains information about the currently acting user as well as a time stamp of its creation. The login token makes the system transparent to any external authentication scheme, as the user has to authenticate only once for a given period of time. Other schemes, such as LDAP or OpenID, may be added in a similar fashion as Feide.




HTTP Authentication

The HTTP Basic Authentication scheme requires user name and password inside a request header. It is possible for a client to send the credentials on every single request, but this is discouraged for security reasons. Instead, the login token provided within a successful request should be used.

External authentication scheme (Feide)

Instead of maintaining the username and password locally in the service, a third-party authentication service (Feide) can be used. This service authenticates the user and also specifies the role (student or teacher). Under normal circumstances, the PeLe service itself should not need to store any local account information. Whenever someone logs in through the third-party service, the PeLe service receives all required user information and automatically stores a local copy of that user for future reference. This process is similar to a manual registration at the service, but it does not require knowing any password.

Login Token

After a valid authentication, the PeLe service generates an encrypted token that is sent to the user and is not stored locally on the service side. The user should then add the token to each subsequent request instead of using the previous authentication mechanism again. A token that can be decrypted by the server to a valid string confirms the authentication. If a user wants to log out from a specific device, the token can be deleted locally on the client, making it necessary to authenticate again for any subsequent request. For Feide, the client should also trigger the SSO logout mechanism.

In order to keep the service stateless, the token must contain all the essential information to determine the authorization of a request. The authentication string is built as follows in CSV format:


<serviceId>,<userId>,<role>,<generatedTime>|<hash>

e.g. hist,student01,student,2012-07-12T16:24:44|564657643



serviceId: Arbitrary string from the configuration file, used for identifying the particular servic
e



userId: User login name



role: Main user role



generatedTime: Creation time of the token



hash: Hashed csv string to validate the token integrity

This string is generated and symmetrically encrypted by the PeLe service, using a secret key known only to the service itself. The encrypted string is then added as "X-PeLe-Token" header. The client should store this token



locally for the duration of the user session and send it inside the "X-PeLe-Token" header on each subsequent request.