1

Understanding and Evaluating the User Experience with Information Spaces


Andrew Dillon

HCI Lab

Indiana University

adillon@indiana.edu


2

Why does user experience matter?


“The improvements in performance gained through usable interface design are 3 or 4 times larger than those gained through designing better search algorithms”

Sue Dumais, Microsoft, invited presentation to IU’s Computer Science Horizon Day, March 2000.

3

Why do we need to test users?


Bailey (1993) asked 81 designers to assess 4 interfaces for users like themselves:

Interface   Rating   Performance
A           4        1
B           3        2
C           1        3
D           2        4

NB: 95% of designers selected an interface other than the one they performed best on.


4

So what to test?

Interaction Basics

User

Task

Tool

Context

5

Basic user tendencies:


Users don’t estimate their own performance well


Users change over time


Are impatient


See things in their own way


Seek to minimize cognitive effort

6

Traditional approach: usability engineering


Usability defined:


Semantically


Featurally


Operationally

7

So what is usability?


Semantic definitions


‘user-friendliness’?

‘ease-of-use’?

‘ease-of-learning’?

‘transparency’



These definitions tend to be circular and provide little value to design practice


However, the term captures something that
people recognize as important



8

Usability as a collection of features


Interface is usable if:



Links, search engine, nav bar, back button?



Graphical user interfaces (GUI)



Based on style guide recommendations?



Meets Nielsen’s or Shneiderman’s principles of design?

9

Attribution Fallacy


The attribution fallacy suggests usability is a
quality of an interface that is determined by
the presence or absence of specific
interface features.



This attribution leads to an over-reliance on guidelines and prescriptive rules for design

10

Experience requires more than
features


Users’ experience is contextually determined by their needs, their tasks, their history and their location.


Understanding this, and knowing how to evaluate experience, is the primary purpose of this talk


11

Operational definition

Usability (of an application) refers to the effectiveness, efficiency, and satisfaction with which specified users can achieve specified goals in particular environments

ISO Ergonomics requirements, ISO 9241 part 11: Guidance on usability
specification and measures.


Useful but overlooked, and still not the full story….
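As a minimal sketch of how this triad might be recorded per user and per task (Python; the class and field names are hypothetical, not part of the ISO standard):

```python
from dataclasses import dataclass

@dataclass
class UsabilityResult:
    """One user's result on one task, following the ISO 9241-11 triad."""
    effectiveness: float  # degree of goal achievement, e.g. 0.0-1.0
    efficiency: float     # resources expended, e.g. seconds to completion
    satisfaction: float   # affective rating, e.g. 1-5 questionnaire score

# A user who fully achieved the goal in 42 seconds and rated the
# experience 4 out of 5:
result = UsabilityResult(effectiveness=1.0, efficiency=42.0, satisfaction=4.0)
print(result)
```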

12

Effectiveness

The extent to which users can achieve
their task goals.

Effectiveness measures the degree of
accuracy and/or completion

e.g., if the desired task goal is to locate information on a web site, then:

Effectiveness = success of the user in locating the correct data

13

Effectiveness can be a scale or an absolute value


If the outcome is ALL or NOTHING then effectiveness is an absolute value

- User either locates the info or does not...

If the outcome can be graded (the user can be partially right) then effectiveness should be measured as a scale

- As a %, or a score from 1 (poor) to 5 (complete)

The scale should be determined by the evaluator in conjunction with developers and users
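A minimal sketch of the two scoring approaches (Python; the function names and the 3-of-4 example are hypothetical):

```python
def effectiveness_absolute(located: bool) -> int:
    # All-or-nothing outcome: the user either locates the info or does not.
    return 1 if located else 0

def effectiveness_scaled(found: int, required: int) -> float:
    # Graded outcome: proportion of the required items correctly located.
    if required <= 0:
        raise ValueError("required must be positive")
    return found / required

# A user who located 3 of the 4 required pieces of information
# scores 0.75, i.e. 75% effective:
print(effectiveness_scaled(3, 4))
```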

14

Quality?


Some tasks do not have a definitive
correct answer:


creative production (writing, design)


information retrieval


data analysis


management


Making a purchase…..


Effectiveness alone misses
something...

15

Efficiency


Measures the resources used to perform a task

i.e., time, effort, cost

In the case of web site use, efficiency might equal the time taken to complete a task, or the navigation path followed, etc.


16

Efficiency of using a redesigned web site


Time taken to complete task


Compared across tasks, across users or
against a benchmark score


Number of steps taken


Number of deviations from ideal path


Such variables are frequently highly positively correlated, but they needn’t be.

17

Efficiency in path analysis

Ideal path: 3 steps

[Figure: site map diagram (Home page, Classes, Papers, Office hours, etc.) with the 3-step ideal path to the target page highlighted]
18

Efficiency in path analysis

[Figure: the same site map, with the user’s actual 7-step path overlaid on the 3-step ideal path]

Actual to ideal user navigation: 7:3 steps
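A sketch of how the 7:3 comparison above might be computed from a logged navigation path (Python; the page names and logging format are hypothetical):

```python
# Hypothetical logged navigation against the 3-step ideal path.
IDEAL = ["Home page", "Classes", "Intro to HCI"]
actual = ["Home page", "Papers", "Home page", "Classes",
          "HCI", "Classes", "Intro to HCI"]

steps_taken = len(actual)                            # 7 steps
extra_steps = steps_taken - len(IDEAL)               # 4 extra steps
off_path = [p for p in actual if p not in IDEAL]     # pages visited off the ideal path

print(f"actual:ideal = {steps_taken}:{len(IDEAL)}")  # actual:ideal = 7:3
print(f"deviations: {off_path}")                     # ['Papers', 'HCI']
```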

19

But is it efficiency that users want?


The push to efficiency is symptomatic of an engineering-oriented approach


Who determines efficiency?


Are path deviations always inefficient?


Is time equally weighted by user, designer
or owner?


Suggests a need for negotiation
beyond typical usability tests

20

Satisfaction


Measures the affective reaction (likes,
dislikes, attitudinal response) of users
to the application


Assumed to be influenced by, but not the same as, effectiveness or efficiency, e.g.:


2 applications with equal effectiveness and efficiency may not be equally satisfying to use


or: what users like might not be what they need!
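A sketch of one common way to score such affective reactions, assuming a simple 1-5 Likert questionnaire (Python; the items shown are hypothetical, not a standardized instrument):

```python
# Hypothetical post-session questionnaire: 1-5 Likert items, some
# worded negatively and therefore reverse-scored.
items = [
    {"text": "The site was pleasant to use", "score": 4, "negative": False},
    {"text": "I felt frustrated using it",   "score": 2, "negative": True},
    {"text": "I would return to this site",  "score": 5, "negative": False},
]

def mean_satisfaction(items, scale_max=5):
    # Reverse-score negative items so a higher value always means a
    # more positive reaction, then average across all items.
    adjusted = [(scale_max + 1 - i["score"]) if i["negative"] else i["score"]
                for i in items]
    return sum(adjusted) / len(adjusted)

print(round(mean_satisfaction(items), 2))  # 4.33 on the 1-5 scale
```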

21

Basis for satisfaction?


Positively influenced by effectiveness
and efficiency


Also


Personal experience with other technologies?


Working style?


Manner of introduction?


Personality of user?


Aesthetics of product?

22

Satisfaction is important


Good usability studies recognize this



But satisfaction is not enough….


People often like what they don’t use
well


What about empowerment, challenge
etc?

23

Beyond usability: P-O-A


User experience can be thought of at
three levels:


Process


Outcome


Affect


Full evaluation needs to cover these
bases


24

Experiencing IT at 3 levels:

Process: what the user does

Outcome: what the user attains

Affect: how the user feels

Together, these three levels make up the user experience.
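A minimal sketch of a session record covering all three levels (Python; the field names and example values are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class SessionRecord:
    """One user session captured at the three P-O-A levels."""
    process: list = field(default_factory=list)  # what the user does (moves, clicks)
    outcome: str = ""                            # what the user attains (closure)
    affect: dict = field(default_factory=dict)   # how the user feels (ratings)

session = SessionRecord(
    process=["home", "search", "results", "back", "results", "product"],
    outcome="purchase made",
    affect={"confident": 4, "frustrated": 2, "willing_to_return": 5},
)
print(session.outcome, session.affect["willing_to_return"])
```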
25

Process: what the user does


Navigation paths taken


Use of back button or links


Use of menus, help, etc.


Focus of attention


The emphasis is on tracking the user’s moves
and attention through the information space



26

Outcome: what the user attains


What constitutes the end of the
interaction?


Purchase made?


Details submitted?


Information located?


The emphasis is on observing what it means
for a user to feel accomplishment or closure


27

Affect: how the user feels


Beyond satisfaction, we need to know
if user feels:


Empowered?


Annoyed, frustrated?


Enriched?


Unsure or wary?


Confident?


Willing to come back?


The emphasis is on identifying what the interaction
means for the user


28

User experience = behavior + result + emotion

29

Interesting ‘new’ measures of UE


Aesthetics

Perceived usability

Cognitive effort

Perception of information shapes

Acceptance level

Self-efficacy

UE proposes a range of measures not normally associated with
usability testing

30

Aesthetics and user performance: Dillon and Black (2000)


Took 7 interface designs with known
user performance data


Asked 15 similar users to rate
“aesthetics” and “likely usability” of
each alternative design


Compared ratings with performance
data

31

Rankings of 7 interfaces

Interface   Performance   Preference   Rating of Aesthetics   Perceived Usability
A           2             4            1                      3
B           7             5            1                      1
C           6             6            4                      2
D           1             1            3                      4
E           4             2            6                      5
F           3             3            5                      3
G           5             7            7                      7

R=.85 (aesthetics)   R=.83 (perceived usability)

Correlation between aesthetics and performance = 0
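A sketch of how such rank correlations can be checked (Python, assuming scipy is available; the lists transcribe the table’s columns):

```python
from scipy.stats import spearmanr

# Ranks for interfaces A-G, transcribed from the table above.
performance = [2, 7, 6, 1, 4, 3, 5]
aesthetics  = [1, 1, 4, 3, 6, 5, 7]
perceived   = [3, 1, 2, 4, 5, 3, 7]

rho, p = spearmanr(aesthetics, performance)
print(rho)  # near zero: beauty ratings do not track actual performance

rho, p = spearmanr(aesthetics, perceived)
print(rho)  # clearly positive: beautiful interfaces are *judged* more usable
```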

32

Follow up study:


30 users


Rated the aesthetics, likely usability
and then used 4 web search interfaces


Rated aesthetics and usability again


No correlation with performance!


33

So what?


Users respond to interface beauty


Users do not predict their own
performance (process and outcome)
accurately


Designers cannot usefully predict user
response through introspection, theory
or asking their colleagues!


34

Time matters...Error Scores for
Regular Users of Software

[Figure: error scores (0-120) for “Friendly” vs “Unfriendly” software across 10 trial days]

So design stops being important?

35

NO…it remains important….

[Figure: error scores over 12 trials for Friendly vs Unfriendly interfaces, with a stress trial and a transfer trial marked]
36

So what?


User experience is dynamic: most evaluations miss this


User data is the best indicator of interaction
quality….REPEAT THIS TO SELF DAILY!!!!!


To be valid and reliable, the user data must
reflect all aspects of the user experience:


P-O-A


The targets are moving….user experience is
growing daily in web environments



37

Genres in information space


Users have expectations of information
spaces


Documents have genres


E-business is “shopping”

A website is a website is a website….


Expectations activate mental models
which drive what users see and
interpret

38

What does a home page look like?
Dillon and Gushrowski (2000)


We analyzed a sample of 100 home
pages for features


Then tested 8 candidate pages
manipulating the most common or
uncommon features of existing pages


New users were asked to rate the
pages they thought were ‘best’


Significant positive correlation resulted


39

Element                     Total (of 57)   Total by %   % found on initial sampling
Title                       55              96           71
E-mail address (M*)         49              86           82
Update date                 48              84           39
Table of contents (L**)     42              74           11
Create date                 41              72           20
External links (M)          39              68           72
Welcome message (M)         38              67           51
1-4 graphics (M)            34              60           52
Photographs                 32              56           42
Brief bio (M)               32              56           49
Text-only option (L)        26              46           2
5-9 graphics                22              39           31
Site map                    14              25           4
Guestbook (L)               11              19           16
Lists                       9               16           33
Animation                   8               14           37
Tables                      7               12           37
Frames (L)                  7               12           11
Sound                       7               12           5
Image map                   5               9            4
Counter                     2               4            39
Advertisements              0               0            33
10 or more graphics (L)     0               0            17
Back to top button          0               0            1
Thumbnails of images        0               0            1

* / ** denotes inclusion in the most-used (M) / least-used (L) element exercise
40

Page   M     1st   2nd   3rd   4th   5th   6th   7th   8th
1      2.1   30    7     13    2     2     1     2     -
2      3.6   7     14    7     7     10    10    2     -
3      3.5   7     10    11    13    9     7     -     -
4      4.0   3     9     11    15    8     8     -     3
5      5.0   4     8     4     7     6     13    5     10
6      4.5   5     8     8     6     8     9     9     4
7      6.6   1     -     3     1     8     5     21    18
8      6.7   -     1     -     6     6     4     18    22

Correlation between features and user ranking: r=0.95, d.f.=6, p<.01

(Page # reflects features, 1 = most common; columns 1st-8th give the number of users assigning each rank; M = mean user ranking)

41

Implications



Expectations for digital information
spaces are forming quickly


Violation of expectancy impacts initial
user ratings


Full report online at:


http://memex.lib.indiana.edu/adillon/genre.html


42

Maximizing your evaluations:


Measure the 3 aspects of UE


Process, Outcome and Affect


Design user tests that capture multiple
sources of data


Protocols, screen capture, attitude, speed, free-form answers


Don’t rely on gurus or guidelines!


A little data goes a long way!

43

Example web site protocol

1.32: “What do I choose here? .... looks like there is no direct link .... and I don’t like the colors here, too bright... er....”
(SELECTS TEACHING)

1.39: “‘teaching and courses’ sounds right”
(SCREEN CHANGES)

1.41: “oh this is all about missions and stuff... hang on....”
(HITS BACK BUTTON)

1.48: “well..... that looks the best of these, you know.”

Analyst annotations: (User guesses) (Negative comments) (Navigation strategy)
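A sketch of how such protocol lines might be tagged for later counting, using simple keyword patterns (Python; the codes and patterns are illustrative, not the lab’s actual coding scheme):

```python
import re

# Keyword-based tagger: each pattern marks an event category that can
# be counted across the session.
CODES = {
    "user_guess": re.compile(r"looks like|sounds right|that looks", re.I),
    "negative_comment": re.compile(r"don't like|too bright", re.I),
    "navigation_move": re.compile(r"SELECTS|BACK BUTTON|SCREEN CHANGES"),
}

protocol = [
    ("1.32", "What do I choose here? ... I don't like the colors here, "
             "too bright (SELECTS TEACHING)"),
    ("1.39", "'teaching and courses' sounds right (SCREEN CHANGES)"),
    ("1.41", "oh this is all about missions and stuff... (HITS BACK BUTTON)"),
]

for time, utterance in protocol:
    tags = [code for code, pat in CODES.items() if pat.search(utterance)]
    print(time, tags)
```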

44

Biggest user complaints in our lab


Poor content


Slow loading


Poor aesthetics


Unclear menu options


Menus with example sub-items are much preferred and lead to more efficient use


Too much clicking and “forced” navigation


No site map


Poor search facilities