C.3.1 Necessity of Supporting Situation Awareness for Preventing Over-trust in Automation


Makoto Itoh
Department of Risk Engineering, University of Tsukuba, Tsukuba, Japan
(Tel: +81-29-853-5502, E-mail: itoh@risk.tsukuba.ac.jp)


Keywords: Automation, Trust, Situation Awareness, Human-Machine Interaction.


Abstract

Preventing over-trust in automation is an important issue in human-machine systems. It is
necessary to understand how operators become overly reliant on automation. Most previous
studies related to over-trust have focused on 'complacency' (e.g., see [2, 3]). However,
several aviation accidents suggest that human operators rely on an automated system
inappropriately when they misunderstand the limits of the automation's capability. Such
over-reliance may occur even when an operator is highly motivated or vigilant.

In this study, we investigate how a human operator comes to expect that an automated system
can perform a task successfully even beyond the limit of the automation. We have developed a
model of trust in automation with which we can discuss how an operator's trust in automation
develops into over-trust [1]. On the basis of this model of trust, we conducted a cognitive
experiment using a microworld of an automated mixed-juice processing system to examine
whether the range of a user's expectation exceeds the limit of the automation's capability.
Subjects were informed of the limit of the automation's capability before performing the
experimental tasks. In each trial, a subject received information on the difficulty of the
working condition for the automation and then had to decide whether or not to use the
automation. The automation could perform the task successfully only if the condition was
within the limit of its capability.
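
To make the protocol concrete, the structure of a single trial can be sketched as follows.
This is a minimal illustrative sketch in Python, not the actual microworld implementation;
the names, the normalized difficulty scale, and the capability limit of 0.7 are all our own
assumptions for exposition.

    import random

    CAPABILITY_LIMIT = 0.7  # assumed normalized limit of the automation's capability

    def run_trial(decide_use_automation):
        """Simulate one trial of the (hypothetical) microworld protocol.

        decide_use_automation: callable that takes the announced working-condition
        difficulty and returns True (use the automation) or False (control manually).
        """
        difficulty = random.random()  # difficulty of the working condition, shown to the subject
        if decide_use_automation(difficulty):
            # The automation succeeds only when the condition is within its capability limit.
            return {"mode": "automation", "difficulty": difficulty,
                    "success": difficulty <= CAPABILITY_LIMIT}
        # Manual intervention; the operator's own performance is not modeled here.
        return {"mode": "manual", "difficulty": difficulty, "success": None}

    # Example: a subject who uses the automation up to a self-estimated limit.
    outcome = run_trial(lambda difficulty: difficulty <= 0.7)
    print(outcome)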

The results of this experiment showed that the trust of three out of 33 subjects came to
exceed the limit of the automation through the experience of an 'unintended use of
automation,' in which a subject intends to intervene and control manually because the
working condition seems to be beyond the actual limit, but mistakenly pushes the 'use
automation' button. The automation worked successfully in those situations because the
working condition was actually within the limit of its capability. Nine subjects experienced
such an unintended use of automation and its success; three of them subsequently became
completely reliant on the automation. In other words, these three subjects used the
automation even when the working condition was beyond its actual limit. Based on this
observation, we can claim that over-trust in automation is not merely due to personal
characteristics or a lack of vigilance. Supporting correct situation awareness is necessary
in order to prevent over-trust in automation.
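
For illustration only, one simple way to express how a successful but unintended use could
shift a subject's expectation is sketched below. This update rule and its gain parameter are
our own assumptions for exposition; it is not the trust model of [1].

    def update_estimated_limit(estimated_limit, difficulty, success, gain=0.5):
        """Hypothetical rule: a success at a difficulty at or above the currently
        believed limit pushes the believed limit upward (the gain is assumed)."""
        if success and difficulty >= estimated_limit:
            return estimated_limit + gain * (1.0 - estimated_limit)
        return estimated_limit

    # A success right at the believed limit can raise the estimate above the
    # automation's actual capability limit, which corresponds to over-trust.
    believed_limit = 0.7
    believed_limit = update_estimated_limit(believed_limit, difficulty=0.7, success=True)
    print(believed_limit)  # 0.85 > 0.7: the expectation now exceeds the actual limit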


References

[1] Itoh, M., and Tanaka, K. (2000). Mathematical Modeling of Trust in Automation: Trust, Distrust, and Mistrust. Proc. IEA2000/HFES2000, 1, 9-12.

[2] Moray, N. (2003). Monitoring, Complacency, Scepticism and Eutactic Behavior. Int. J. Industrial Ergonomics, 31, 175-178.

[3] Parasuraman, R., Molloy, R., Singh, I. L. (1993). Performance Consequences of Automation-Induced "Complacency." Int. J. Aviation Psychology, 3, 1-23.