
The crucial role of haptics in perceiving presence

Miriam Reiner


1. Introduction: presence and the sense of haptics



Any attempt to study what determines the sense of presence is epistemologically based on understanding the interaction between human beings and the physical environment, especially what kind of sensory information is perceived by the human and how responses are generated. Thus the central question in designing an immersive environment is: what would one need to sense in order to be present? Most will agree that a competent tennis player, watching the approaching ball and preparing for the next move, feels that she is present in the tennis court. The player is emotionally, cognitively and physically immersed in the present environment, performing at her best: swinging the arm while moving the racket in a complex path designed to meet the ball at a particular time and point in space, at a particular angle, that exactly fits the ball's trajectory and imparts the 'correct' intended trajectory towards a particular point in the opponent's court, for a potentially winning hit. The force exerted on the racket is not too strong or too weak; the velocity of the hand-racket system and the timing are precisely right, as if carefully calculated...


To further unfold what such interaction may require, let us assume that the player is replaced by an 'intelligent' robotic system. The system may have to estimate the distance of the ball, its velocity and position, then calculate the timing and the needed acceleration of the racket, the required rotation angle of the arm-racket system, and the angular momentum of the system, so that the resulting impact fits the desired trajectory.
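To make the scale of that computation concrete, here is a minimal sketch of the model-based planning such a robot would need, assuming a drag-free ballistic ball, a fixed hitting plane, and an idealized heavy-racket collision model; all function names and parameter values are illustrative, not taken from any cited system:

```python
import numpy as np

G = np.array([0.0, 0.0, -9.81])  # gravity, m/s^2

def ball_position(p0, v0, t):
    """Drag-free ballistic position of the ball at time t (p0 in m, v0 in m/s)."""
    return p0 + v0 * t + 0.5 * G * t ** 2

def plan_intercept(p0, v0, hit_plane_y=11.0, t_max=2.0, dt=0.001):
    """Scan forward in time for the moment the ball crosses the racket's
    hitting plane (y = hit_plane_y); returns (time, contact point) or None."""
    for t in np.arange(0.0, t_max, dt):
        p = ball_position(p0, v0, t)
        if p[1] >= hit_plane_y:        # ball has reached the hitting plane
            return t, p
    return None

def racket_speed_for_return(v_in_n, v_out_n, e=0.85):
    """Racket speed along the contact normal needed to turn the incoming
    normal ball speed v_in_n into the desired outgoing speed v_out_n, using
    a heavy-racket restitution model: v_out = -e*v_in + (1 + e)*v_racket."""
    return (v_out_n + e * v_in_n) / (1.0 + e)
```

Even this toy version needs a state estimator, a forward model and a collision model; the full problem adds drag, spin, and whole-arm dynamics. This is the "vast amount of complex physics modeling" the next paragraph refers to.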

According to the traditional centralized-processing view, the intelligent robot will have to use a vast amount of complex physics modeling. The human player, obviously, does not. (Nor does a chimpanzee swinging between two trees...) Then how do humans (or a chimpanzee...) make decisions about such actions? It seems that, through a complex mechanism, knowledge embodied in bodily manipulation is triggered, used and controlled through ongoing sensory interpretation. Bodily acts such as those in athletics, in driving, or in surgery are efficient, fast and optimal. The mechanism of bodily optimization is not due to the traditional mathematical optimization processes. A different process is going on: the body organizes itself. It is bodily knowledge, non-propositional in any sense of the word, yet efficient and learnable. It is a form of problem solving, often but not always in an automated manner. The quality of problem solving heavily depends on the sensory input. Omitting haptics impoverishes the sensory information and hence damages the bodily tactics: it is almost impossible to control hitting a ball without feeling the collision. Such embodied tactics are dynamic: developed over time, based on sensory information, and continuously refined. The refinement process is a kind of embodied learning. Thus this kind of input is crucial for any application that targets training, developing expertise, or collaborative performance of tasks.




The sense of the position in space of the hand/racket, the feel of the weight of the racket and the tension of the grip, and the feel of touching the handle are all included under an umbrella term known as haptics. Examples from everyday interactions with the environment are infinite. We test pasta for an al dente state by sensing its resistance while biting, a physician checks the glands of a sick child by touching and palpating, a surgeon palpates the operated organs to test for abnormalities in size...


Haptic research deals with the study of sensing and manipulating objects through touch. It is an interdisciplinary domain and draws on a wide range of fields such as engineering, cognitive science, human physiology, neuroscience and psychology.

Generally, the goals of research in haptics are to:

- understand human haptics more deeply,
- develop haptic machines that enable users to feel immersed, present in a particular mediated world,
- enhance the interaction between humans and machines.


The field of haptics is a combination of research and development in engineering and human haptics. The engineering perspective deals with developing devices that allow users to "touch and feel" objects that are present in a virtual world. Such devices allow the user to feel surface textures, shapes, temperature, weight and firmness. This requires developing engineering techniques such as collision detection and display, force feedback, hand and arm tracking, and haptic rendering.


Human haptics deals with the general principles that humans use to explore, represent, and interact with objects. It includes issues such as basic psychophysical research, perception of touch, resolution, cognitive operations, haptic-based problem-solving strategies, and the perceptual and neurological processes behind haptic operations.


Haptic research is still a young domain. Most research and development of devices occurred within the last 15 years of research activity. A deeper understanding of human haptics was developed (see section 2), and simple haptic devices were developed and integrated in popular commercial applications such as games. The commercial community made such devices widely available. More sophisticated devices were also developed and became available for research (see section 3 on haptic devices). Several applications were developed, such as surgical, flight and educational applications, and research around these applications emphasizes the central role of haptics in performance, learning and collaborative problem solving over the web. A theoretical framework for the design of haptic interfaces based on human cognition has only just started to emerge, which makes this an exciting time for haptics researchers.


But haptics is not only an engineering problem, or a low-end, manual, 'non-intelligent' act. It has a cognitive-conceptual counterpart as well, in that haptic sensations improve our reasoning capabilities: we memorize haptic patterns (Schacter 1996); for instance, we remember and can recognize the feel of squeezing a sponge, of riding a bike or catching a ball, or of a flat tire while driving, often absent-mindedly. There is no need to invest high-level efforts in calculating the movements in playing basketball, no matter how complex the situations are. Strategies for solving such problems are triggered by the haptic sensory input. Haptic memorized patterns are considered to have a meaning similar to linguistic patterns (Lakoff 1987, Johnson 1987) and are the bodily roots of universal metaphors, independent of a particular language. Kilpatrick (1976) further showed the relation between reasoning and haptics, using kinesthetic feedback as an aid to understanding 2D and 3D force fields: kinesthetic feedback improved user perception and manipulation in a simple 3D virtual world, even more so than did three-dimensional stereo viewing. Similarly, Brooks et al. (1990) found that understanding of the binding energy of a drug molecule was much clearer through force display than through visual display in a simple 6D docking task. Haptic display was found to improve both perception and understanding of world models, as well as problem solving (Brooks et al. 1990).


Haptic feedback is central to the performance of collaborative manual tasks, such as when two distant users must cooperate to perform a joint task. Basdogan et al. (2000) found that haptic feedback significantly improves the quality of task performance and contributes to the 'sense of togetherness' in social virtual environments. They further showed that combined visual and haptic display elicits better performance than visual feedback only.


Haptics engineering and human haptics draw on each other. Understanding of human haptics is improved through haptic technology: virtual environments and interfaces. And interfaces are designed according to the properties of human haptics. The next section provides a theoretical framework of human haptics, followed by a description of machine haptics, emphasizing the symbiotic interrelations between the two domains.




2. Human haptics


Sensory information from muscles, joints and skin is essential for regulating movement (Kandel, Schwartz and Jessell 2000, p. 711). When the tennis player hits the ball, sensory information is transferred to the brain, such as the sense of the forces applied on the hand during the collision, and the position and posture of the arm. The combination of the sense of forces and position is considered kinesthetic information (similar to the term proprioceptive sometimes used in the literature). The feel of touching the racket, or of touching a ball to feel its texture, is tactile information.



The kinesthetic input is an outward-oriented sense (Gibson 1979) and provides input about our body's position in space. The kinesthetic sense interacts in real time with the vestibular sense (Kandel, Schwartz and Jessell 2000), an inward-oriented sense (Gibson 1979) responsible for bodily balance, which, though related to the perception of bodily position, is not considered part of haptics.


In addition to the sensory subsystem, the haptic system includes two more components: a motor subsystem and a haptic-cognitive subsystem. The motor subsystem regulates movements; the haptic-cognitive subsystem relates actions with sensations, interpretations and intentions.


To summarize, the haptic system includes three main subsystems, each consisting of several components:

i. Sensory subsystem, responsible for the input of sensory signals from the environment. This includes two types of sensations:
   - Kinesthetic sense, which conveys information about forces exerted, and the posture and motion of the hand and various limbs.
   - Tactile sense, which conveys information about changes in the spatial force distribution on the skin, and about temperature. The tactile sense provides information about surface properties of objects through touch.

ii. Motor subsystem, responsible for motion and object manipulation.

iii. Cognitive-haptic subsystem, which relates the following three functions:
   - Haptic sensations
   - Perception
   - Goal-oriented interaction with the environment


The first two were summarized above. The third, the cognitive-haptic subsystem, looks at how sensory patterns are interpreted in the brain, and how human actions are built in response to the sensory information, perceptions and intentions of the performer (Reiner 2000). Thus, in the process of sensory interaction with the environment, an interpretative process occurs: haptic sensory patterns are interpreted. For instance, a surgeon palpating a lump feels a particular pattern of force distribution, which helps him decide the structure (e.g. granular), shape (e.g. circular) and substance of the lump (fat, liquid or other).



Traditional haptic literature looks mainly at the kinesthetic and tactile components, with a lesser focus on the cognitive component. Yet recent research (Biggs and Srinivasan 2001; Brooks, Ouh-Young, Batter & Kilpatrick, 1990; Reiner 1999) shows the centrality of the cognitive component in relating haptic input to intentional acts. It is responsible for assigning meaning to the sensed patterns, and for executing intentional acts. Assigning meaning to patterns requires that haptic patterns be stored in memory and recognized in new situations. Indeed, recent research on the neurology of haptics shows that haptic patterns are memorized and recognized even if the haptic experience itself is not remembered (Schacter 1996, Glynn 1999, Gazzaniga 1998). This result changes the thinking about the design of VEs for training. VEs are efficient only if they provide the learner with an opportunity to experience and memorize the 'correct' haptic patterns. Otherwise, some VEs may just convey wrong haptic patterns and hence teach patterns that will eventually have to be unlearnt.


Haptic sensory input and motor control are not independent. For instance, Kelso et al. (2001) showed that haptic information stabilizes and destabilizes coordination dynamics, a kind of control of motor acts. The information flow in haptic acts is bi-directional: as we touch and manipulate objects, we simultaneously change their state and receive information about them. Thus haptics is not only a channel for receiving information but also a channel for expressiveness through actions (Salisbury 1999). Controlling acts through haptics is a cyclic, continuous process. Sensory information is interpreted to generate changes in the movements and force applied, which change the sensory patterns felt. The new sensory patterns are used to further change the movements according to the user's intentions. This process acts as an ongoing corrective process that converges to optimize the fit between actions, sensory information and the user's intentions. This fit is sometimes based on anchored responses: motion and sensory patterns are, for instance, related through the well-known 'anchoring effect' (Byblow et al. 1994), which arises because specific aspects of the movement trajectory become coupled to specific sensory stimuli (Fink et al. 2000). Looking at the tennis player may make this more explicit. While hitting the ball, the tennis player felt sensations of the distribution of forces along the hand and the position of her hand in space (kinesthetic), and the texture of the racket held in her hand (tactile). These are part of the sensory component of the haptic system. She interpreted the haptic information (cognitive system) and was able to manipulate the racket to the correct position (motor control) according to her intentions (cognitive). Her actions generated a new situation, so new sensory input was received, followed by new interpretations, intentions, actions and so on. This process is cyclic, active, intentional and explorative. The haptic cycle of active exploration is described in Figure 1:













[Figure 1. Active exploration as a cyclic process: haptic sensations (sensory system) → interpretations (cognitive system) → intentions (cognitive system) → actions (motor system) → new haptic sensations.]
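Read as a control loop, the cycle in Figure 1 can be sketched in a few lines of code. This is only an illustration of the loop's structure, with invented function names and a trivial proportional correction, not a model from the haptics literature:

```python
def active_exploration_step(sense, interpret, intend, act, state):
    """One pass around the cycle of Figure 1.

    sense:     sensory system   - returns haptic patterns (forces, positions)
    interpret: cognitive system - turns raw patterns into a percept
    intend:    cognitive system - compares the percept with the current goal
    act:       motor system     - executes a corrective movement
    """
    pattern = sense(state)        # haptic sensations
    percept = interpret(pattern)  # interpretation
    error = intend(percept)       # intention: mismatch with the goal
    return act(error, state)      # action changes the world -> new sensations

# Illustrative use: converging on an intended grip force of 2 N.
state = {"grip_force": 0.0}
sense = lambda s: s["grip_force"]
interpret = lambda f: f                  # trivial percept
intend = lambda f: 2.0 - f               # error relative to the intended 2 N

def act(err, s):
    s["grip_force"] += 0.5 * err         # proportional correction
    return s

for _ in range(20):
    state = active_exploration_step(sense, interpret, intend, act, state)
# state["grip_force"] has now converged close to the intended 2 N
```

The point of the sketch is the convergence described above: each action changes the sensory pattern, and the loop corrects until actions, sensations and intentions fit.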


In various situations motor acts originate from sensory input without the cognitive system being involved. Such is the case of motor acts at the level of reflexes. Reflexes are an automated response through the autonomous muscular system, unrelated to intentions or interpretations. Thus goal-directed, coordinated movements in humans emerge from a variety of constraints that range from 'high-level' cognitive strategies to 'low-level' neuromuscular-skeletal factors, within all ranges of automaticity of response.



Dichotomizing these sources of constraint, as well as each of the components of the active exploration cycle, has proven fruitful: it helped to define and identify the different factors that serve to stabilize coordination under conditions which may otherwise become unstable (see for instance Turvey 1992, Kelso et al. 2001). The literature tends to study these factors in isolation rather than modeling and understanding their mutual interplay: the emerging synergism between

cognitive, physiological and external constraints (Kelso, Fink, DeLaplain & Carson, 2001). The cycle of active exploration provides a conceptual frame of thought that integrates a wide range of constraints into action-sensory-cognitive processes.


2.1 Kinesthetic information


Research on kinesthetic sensations looks at perceptual sensory thresholds, the resolution of position in space, and the resolution of net forces applied. Position can be measured by describing limb position. The search for kinesthetic capabilities is therefore related to fine discriminations of limb positions.


Perception of limb position and motion is widely studied (e.g. Jones and Hunter, 1992; Clark and Horch, 1986). The threshold for detecting joint rotation is a fraction of a degree over about one second. Brooks shows that the bandwidth of kinesthetic sensing is 20-30 Hz (Brooks 1990). More than one joint may move in a particular act, yet human sensitivity to proximal joints is higher than to more distal joints. The just noticeable difference (JND) defines the sensitivity threshold. The JND is about 2.5 deg for finger joints, 2 deg for the wrist and elbow, and about 0.8 deg for the shoulder (Tan, Srinivasan, Eberman and Cheng, 1994).


Assessment of length by the finger-span method (Durlach et al., 1989; Tan, Pang & Durlach, 1992) shows that the JND is about 1 mm for a reference length of 10 mm. The JND increases to 2-4 mm for a reference length of 80 mm. Thus the JND is not constant, nor is the change in JND linear relative to the change in reference length, hence violating Weber's law.
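Spelled out (a worked restatement of the figures above, with the 2-4 mm range kept as reported): Weber's law requires the ratio of the JND to the reference magnitude to stay constant, ΔL/L = k, whereas here

```latex
\frac{\Delta L}{L}\Big|_{L=10\,\mathrm{mm}} = \frac{1\,\mathrm{mm}}{10\,\mathrm{mm}} = 10\%,
\qquad
\frac{\Delta L}{L}\Big|_{L=80\,\mathrm{mm}} = \frac{2\;\mathrm{to}\;4\,\mathrm{mm}}{80\,\mathrm{mm}} \approx 2.5\%\;\mathrm{to}\;5\%.
```

The Weber fraction falls as the reference length grows, instead of remaining the constant k that the law predicts.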



In kinesthetic space, Fasse, Kay and Hogan (1990) found several kinds of anisotropies in the perception of length, distance, orientation and the apparent curvature of straight lines (see also Loomis and Lederman 1986). A known anisotropy concerning orientation is the oblique effect (Appelle, 1972): vertical and horizontal orientations are perceived more accurately than oblique orientations. The oblique effect has been demonstrated in both the visual and the haptic system. For a review of this effect in the visual field see Appelle 1972; Essock 1980; Howard 1982.


The oblique effect has been studied in haptics only in an exploration-reproduction task. Blindfolded subjects were asked to explore a rod with one hand and to reproduce its orientation with the same and with the other hand (Appelle & Countryman, 1986; Gentaz & Hatwell, 1999; Lechelt, Eliuk, & Tanne 1976; Lechelt & Verenka, 1980). Results were consistent with the visual oblique effect. This suggests that haptic sensory input related to orientation, length and distance is coded according to a vertical and horizontal frame of reference. However, the effect disappeared when the whole body was tilted, which suggests that the gravitational sense (or egocentric frame of reference) is not the main cause of the effect (Luyat et al., 2001).





2.1.1 Control of limb motion


Controlling limb motion is often studied by tracking performance of coordinated movements with systems of masses and springs (Brooks, 1990; Poulton, 1974; Jones & Hunter, 1992; Sheridan, 1992; Turvey 1992). Thresholds are about 8% (Jones and Hunter, 1992). The bandwidth for limb motion (Brooks 1990) is a function of the mode of operation: 1-2 Hz for unexpected signals; 2-5 Hz for periodic signals; up to 5 Hz for learned or internally generated trajectories; and about 10 Hz for reflex action.


2.1.2 Sensitivity to net contact forces

When lifting a glass of water, two types of contact forces are felt: the weight is felt through the muscles and tendons, in addition to the forces exerted on the skin of the fingers. Biggs and Srinivasan (2001) suggest that, overall, contact force is probably the single most important variable determining both the neural signals in the sensory system and the control of motor action (p. 8). The JND for contact force is 5-15% of the reference force value, depending on the variation in force magnitude, the muscle system and the kind of task (Jones, 1989; Pang, Tan, & Durlach 1991; Tan et al., 1992). In discriminating weight, a higher JND of about 10% was found (Clark & Horch, 1986; Jones, 1986).


Interaction between different types of sensory information leads to changes in perception:
cold objects feel heavier than warm objects of equal masses (Sherrick & Cholewiak,
1986).

Controlling motor acts such as lifting a glass of water depends on the forces felt. In an experiment on grasping and lifting with two fingers (Johansson & Westling, 1984), individuals were capable of coordinating the grasping forces and the forces needed for lifting. However, if the force differs from the expected one, the motion is not immediately adjusted (these are termed trajectory movements, so called since the movement cannot be corrected). Think for instance of lifting a big glass that seems full of water but is lighter than expected: the hand will move in an accelerated motion due to the excess force applied. Such movements are called trajectile motion (Calvin 1996). When tactile information was blocked with an anesthetic, so that the skin contact forces were not felt, the ability deteriorated. This suggests that ongoing tactile input is part of the cycle of motor control (see Figure 1): loss of the tactile feel blocked information about slipping, which would otherwise call for an increase in the normal force applied.


Vibrations are another kind of information related to haptics, especially important for blind and/or deaf people. The human intensity threshold for a single probe is 28 dB for frequencies of 0.4 to 3 Hz (for an extensive review see Kaczmarek and Bach-y-Rita 1993). The threshold decreases at a rate of 5 dB up to 30 Hz, and at a rate of 12 dB in the range of 30 to 250 Hz. Beyond this frequency the threshold increases again (Biggs and Srinivasan 2001).


2.1.3 Perceiving roughness

Surface roughness is one of the most prominent perceptual attributes of surface texture. Earlier studies used abrasive surfaces as well as more precisely shaped rigid surfaces consisting of controlled two-dimensional bar gratings. For engraved linear metal gratings with a rectangular waveform, the groove width between the ridges exerted by far the strongest effect on perceived texture (Minsky and Lederman 1996). Thus the perception of surface roughness relies on the tactile sense alone, and depends on groove width, contact force and temperature, but not on scanning velocity (Loomis and Lederman 1986). Humans can detect a 2 µm high single dot on smooth glass, and a 0.075 µm high grating on a plate. Even though the feel of roughness is uniformly reported, the tactile sense of the magnitude of roughness is subjective (Verrillo et al. 1999).


However, when a rigid probe was used to scan a rough surface, sensitivity decreased, and increasing velocity generated a smoother feel of the surface (Lederman, Klatzky, Hamilton & Ramsay 1999).



Perceiving roughness from a distance (teletaction) has been designed to fit human psychophysical characteristics. Moy et al. (2000) developed a model for a teletaction system based on predicted subsurface strain. They found that a 10% amplitude resolution is sufficient for a teletaction system with a 2 mm elastic layer and 2 mm tactor spacing.


2.1.4 Perceiving shape


Perception of shape normally entails active exploration over time through touch: a cyclic, intertwined process of action and sensory input. When sliding a finger along the circular edge of a glass, two types of information are perceived. The first concerns the change in the position of the finger in space: the shape of the object is perceived by mentally 'tracing' the geometry of the finger's position (Schacter 1996). The second type of information relates to the force felt on the contact skin. This information is essential for knowing that the finger is indeed touching the top of the glass, rather than touching nothing at all, but is not essential for the perception of the shape.


Generally, most researchers assume that perception of shape relies on the first kind of information, determined by the geometry of the object alone, independent of force cues. According to this view, force cues are used only for a dichotomous test of the state of the finger: touching or not touching the object. Recent research, however, shows that force cues have a role beyond merely testing whether the finger is touching the glass. For example, when sliding a finger across a surface with a rigid bump on it, the finger moves over the bump while being opposed by a force whose direction and magnitude are related to the slope of the bump: the steeper the bump, the stronger the resistance. Though the geometrical and force cues of the bump are correlated, the common general assumption is that in these categories of cases, just as in the previous one, shape perception relies on object geometry alone. Robles-De-La-Torre & Hayward (2001) show that, regardless of surface geometry, subjects identified and located shape features on the basis of force cues or their correlates. In an elegant experiment they used paradoxical stimuli: for example, combining the force cues of a bump with the geometry of a hole, they found that subjects perceived a bump. Conversely, when combining the force cues of a hole with the geometry of a bump, subjects tended to perceive a hole. The researchers do not report whether the shape of the bump/hole (e.g. round, square) was also perceived through force cues. Thus force cues may overcome geometrical cues in particular categories of sensory information. Similar results were found by Reiner (2000): the presence of a bump was perceived based on force cues, yet the shape of the bump was perceived through a combination of geometrical and force cues. An interesting case is when the forces felt are mediated through a trackball tactile interface in a virtual environment (Reiner 1999). A virtual force field was generated on a screen. Through active exploration of the motion of a particle on the screen, subjects were asked to draw the lines along which the forces act, and the lines along which the force felt constant. The pattern was that of the field of force of a negative charge. Results show that subjects were capable of building a very accurate description of such a force field. Similarly to Robles-De-La-Torre and Hayward (2001), the results show that the combination of central attraction force and accelerated motion was interpreted as a 'hill', while the decelerated motion of the particle away from the center was interpreted as a hole. Perceiving shape is based on translating haptic information into visual information (Reiner 1999). For instance, a surgeon palpating the abdominal area prior to the removal of a gall bladder will touch the various organs and report their sizes to the OR team, virtually describing a map of the abdominal area. Yet there is no research on how this translation takes place.
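The logic of such force-cue stimuli can be sketched as follows: a physically flat plate is given the lateral force field of a bump, so the geometric cue says "flat" while the force cue says "bump" (inverting the sign yields the force cue of a hole instead). The Gaussian profile, gain and names below are illustrative, not the stimuli used in the cited experiments:

```python
import math

def bump_height(x, a=0.005, sigma=0.01):
    """Virtual Gaussian bump h(x): 5 mm high, ~10 mm wide (illustrative)."""
    return a * math.exp(-x ** 2 / (2 * sigma ** 2))

def bump_slope(x, a=0.005, sigma=0.01):
    """dh/dx of the Gaussian profile."""
    return -x / sigma ** 2 * bump_height(x, a, sigma)

def lateral_force(x, f_normal=1.0):
    """Force cue of a bump displayed on a *flat* surface: the steeper the
    virtual slope, the stronger the opposing lateral force, F ~ -F_n * dh/dx.
    Flipping the sign renders the force cue of a hole on the same geometry."""
    return -f_normal * bump_slope(x)
```

Approaching the virtual bump the force opposes the finger (uphill resistance); past its crest the force assists it, and subjects report a bump even though the plate never leaves the plane.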


For laparoscopic interface research it is important to find out whether shape can be perceived through a probe; there is a debate among researchers over whether touching with a probe indeed allows perceiving the shape of an organ. Burton et al. (1990) studied perception of shape through dynamic touch with a probe. They found a high success rate in identification tasks. The types of errors involve the characteristic moment-of-inertia profile of each shape and the ratio of the object's resistances to rotation around orthogonal axes; these were found to be strong predictors of performance in the shape recognition experiments.


2.1.5 Perception of edges

Perception of edges is crucial for shape recognition. Perceiving the shape of an object through active exploration means identifying cues that separate an object from the environment. This requires haptic detection of the boundaries of an object. Identification of the boundaries of an object is easily done through vision: based on spatial cues, texture, colors etc., boundaries are often obvious. Not so with haptics.



Information about boundaries is commonly considered to be related to the slowly adapting type I afferents (SAIs), which are highly sensitive to edges (Vierck, 1979; Phillips and Johnson, 1981a; Johansson et al., 1982). Apparently, the reason for this sensitivity is that SAIs respond to strain energy density (Phillips and Johnson, 1981b; Srinivasan and Dandekar, 1996), which makes them sensitive to high rates of change in applied forces, such as rectilinear corners (Blake et al., 1997b) and small-diameter curves that generate circular stimuli (Mountcastle et al., 1966). It is not obvious that long straight edges are sensed with the same mechanism as curved edges. A number of studies of tactile drawings (Kennedy et al., 1991; Millar, 1991; Lakatos and Marks, 1998) suggest that humans are able to perceive changes in curvature in the plane of the skin, but they provide no information about the nature or resolution of this capacity. Spatial resolution on the fingerpad may be related to sensations of curvature: for a one-point stimulus the resolution is about 0.15 mm; for the two-point limen the JND is about 1 mm (Biggs and Srinivasan 2001).


3. Machine Haptics


Machine haptics refers to creating sensory haptic experiences through artificial means. It includes the design of interfaces that generate haptic sensations in virtual environments, robots that have a sense of haptics, and the transmission of haptics in tele-operation systems. Haptic interfaces are manipulators used to provide force or tactile feedback to humans interacting with virtual or remote environments. In interacting with a virtual or distant object, two kinds of action take place. In the first, the user conveys the intended motions by physically manipulating the interface. In the second, the interface conveys tactual sensory information to the user by stimulating the appropriate tactile and kinesthetic sensory systems. These two acts constitute a cycle of active haptic exploration of a mediated environment, and follow the active exploration cycle of non-mediated environments described in section 2.


A haptic machine is a two-way system: it measures the change over time of the position and force exerted by the hand on the interface, and the interface continuously responds by exerting forces on, and changing the position of, the human hand. The source of the first kind of information is data collected through motion sensors. The source of the second is computer-mediated. The display of the haptic attributes of the surface and material properties of virtual objects is termed haptic rendering, by analogy with visual rendering.

Generally, haptic devices can be divided into ground-fixed devices and body-attached devices. Examples of ground-fixed devices are force feedback joysticks. Such joysticks can trace a single point and respond by exerting the appropriate force on the user. The force display is considered 'appropriate' if it is synchronized with the visual event on the screen and fits the human haptic system.

Some bodily devices are attached to the hand/arm, or to the body (exoskeleton interfaces). Devices attached to the hand track the hand position and provide force feedback to the fingers accordingly. The interaction capabilities of ground-based devices are considered better: their resolution is higher, and their force display bandwidth is larger than that of bodily devices.


3.1 Ground-based devices


The most frequently found ground-based devices are joysticks, mice and steering wheels, all integrated mainly in gaming devices and widely available commercially. Different types, qualities and prices can now be found on the joystick market. These differ in resolution, force feedback bandwidth and frequency. Some can be re-programmed, such as the Immersion Impulse Engine 3000 force feedback joystick. Most have two actuated degrees of freedom (dof). A good review can be found on the haptics community web page and on the Immersion website, www.immersion.com. More advanced joysticks with 3-6 dof are also available at a higher cost. Mice with force-reflecting capabilities were developed a while ago (Akamatsu and Sato, 1992). More recent versions create the sensation of sliding on ice or riding on a rocky road, and are also available commercially, both for games and for educational programs, especially for blind students. Such devices enable blind students to 'see' with their hands objects never seen before, making the imagined explicit and a subject for direct learning. The Sandpaper project was one of the first to offer seeing through feeling: it was based on a joystick that enables sensing the surface texture and bulk properties of objects (Minsky et al., 1990).



Force feedback interfaces are widely used as probes in laparoscopic surgical simulators. The leading interface is the Immersion Laparoscopic Impulse Engine, with three degrees of freedom; a pair of coordinated impulse engines is currently under development. Intuitive Surgical's system, a new version of the SRI telesurgery system (Hill & Jensen 1998), has multiple hand masters with 4 to 7 dof for each handle. More sophisticated joysticks that allow dynamic programming have been developed at Northwestern (Colgate, Peshkin, & Wannasuphoprasit 1996).


However, many of these devices are always in contact with the fingers or hands of the operators. Since the finger skin is 'busy' touching the interface, discrimination between skin contact and non-contact with a virtual object is hard or impossible. A new haptic device developed by Yoshikawa and Nagura (2001) provides both touch and force sensations to the user. The device separates two states: not-in-contact and in-contact with a virtual object. In the first case the device tracks the operator's finger; in the second, the device touches and displays force to the finger.


Project GROPE integrated force feedback interfaces in a simulation of molecule docking. Researchers can feel, through a force feedback probe, the field forces created by the rotating/vibrating charged particles of a molecule. The researchers reported that not only did research problems become more intuitively solvable, but the learning and mental models generated through the feel of forces were enhanced even more than through visual display (Brooks, Ouh-Young, Batter & Kilpatrick, 1990).



A similar environment was developed by Mizushima et al. (2001). The central function of their system is enabling the user to "touch" and feel the electrostatic potential field of a protein or a drug molecule. Using a charged globular probe, the user can scan the surface of a protein. The electrostatic force between the protein and the probe is calculated in real time and immediately fed back into the force feedback device to change the force felt. The user can easily search for positions where the probe is strongly attracted to the force field, thereby identifying possible sites where the tested chemical groups can bind to the target protein.
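The kind of real-time computation such systems perform can be sketched with Coulomb's law over the molecule's point partial charges. The function below is an illustration under that assumption, not the cited system's code; in a real system it would be evaluated once per servo tick and sent to the force feedback device:

```python
import numpy as np

K = 8.988e9  # Coulomb constant, N*m^2/C^2

def probe_force(probe_pos, probe_charge, atom_positions, atom_charges):
    """Electrostatic force on a charged probe from a molecule's partial
    charges: F = k * q_probe * sum_i q_i * (r - r_i) / |r - r_i|^3.

    probe_pos:      (3,) probe position in meters
    atom_positions: (N, 3) atom positions
    atom_charges:   (N,) partial charges in coulombs
    """
    r = probe_pos - atom_positions                 # (N, 3) displacements
    d = np.linalg.norm(r, axis=1, keepdims=True)   # (N, 1) distances
    return K * probe_charge * np.sum(atom_charges[:, None] * r / d ** 3,
                                     axis=0)       # (3,) net force on probe
```

Regions where this force pulls the probe inward are exactly the strongly attracting binding-site candidates described above.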


Another exciting project conveys the feel of forces detected by a Scanning Tunneling Microscope, providing the user with the ability to navigate in a nano-world.


Probably the most successful and widely used ground-attached force feedback device is the Personal Haptic Interface Mechanism, the PHANToM (Massie & Salisbury, 1994). The PHANToM, a desktop device, is used by inserting the index finger into a thimble. It measures the user's finger position and exerts a precisely controlled force on the fingertip, generating a sense of interacting with virtual objects. The PHANToM generated a whole wave of new research questions and methodologies, conferences and collaborative haptics interest groups.


3.2 Body-grounded devices


The obvious advantage of body-grounded devices is that they are attached to the arm/hand/body and move with the body, and thus have an active space similar, if not identical, to that of the body. In this sense body-grounded devices allow more comfortable, natural, unrestricted motion. Better devices would be small and would fit the size of the body part they are attached to, leaving the motion space unconstrained. Some are heavy and huge, depriving the device of its natural advantages. Well-designed bodily devices are essential for applications that inherently require free motion in a VE.


These devices can be categorized according to the part of the body they are attached to: hand, arm, torso, foot, etc. Since haptic feel is mainly concerned with the hand, gloves are among the most widely developed devices. The principle of a force feedback glove is simple. It consists of opposing the movement of the hand in the same way that an object squeezed between the fingers resists the movement of the hand. The glove has to be designed to re-generate the intensity and direction of the feedback forces applied by the object on the human hand, in response to the forces applied by the hand on the object. It does so by generating forces through a complex structure, such as lever-and-spring mechanisms attached to the back of the hand. Most provide 5-23 dof with a spatial motion resolution of about 1 degree (Biggs and Srinivasan 2001). Several gloves have been developed and are now available: the iReality 5th glove; the CyberGrasp, a recent development of the CyberGlove; the VTI CyberGlove; and the Fakespace Pinch glove, which detects interaction between the fingers (mainly contact).

A 'reverse' glove, which exerts forces on the palm rather than on the back of the hand, was developed by Burdea (1996). Several body suits are being developed that exert forces on different parts of the body to generate sensations of collisions and bodily touch.


A combination of the advantages of both exoskeleton and ground-based devices was developed by NOSC (at the University of Utah) and Sarcos, Inc. The device provides a wide range of force feedback at a high bandwidth. The dynamic space is similar to that of the arm, even though the arm is attached to the ground at two points, forearm and upper arm, making movements hard to perform (reported in Biggs and Srinivasan 2001).


3.3 Tactile devices


Conveying information about the texture and fine details of an object is necessary for interacting with VEs. Yet tactile devices have turned out to be technically more difficult to develop than kinesthetic devices. Some convey the sense of vibration: the CyberTouch conveys vibrations to the back of the hand and fingers; the Aurora Interactor conveys vibrations to the body through a seat cushion or through a torso-wearable belt (Engineering Acoustics Inc.).



A different tactile device is a trackball that conveys forces to the hand, developed by Haakma and Engel (Engel, Goossens and Haakma, 1994). It is designed for navigation in multimedia environments (Keyson 1994) and is still in a developmental stage. The trackball is a 5 cm ball that rests on a ring supported by four wheels. Two of the wheels are attached to two orthogonal DC motors, which are designed to generate a torque on the trackball that is felt by the hand navigating the virtual reality.



A mouse targeting mainly the visually impaired uses 2-4 mm vertical pin arrays, located on top of a mouse or a pad, that stimulate the skin of the fingers by moving up and down at different frequencies, amplitudes and accelerations. This vertical array of pins generates the sense of vibrations and tactile stimulations that convey the feel of the texture of an object or of planar haptic drawings. Such devices are also used for conveying Braille information for the blind, or information about the screen, by attaching different levels of roughness to different task spaces (see for instance the VirTouch mouse at virtouch.com).


Howe et al. (1995) and Moy et al. (2000) developed a matrix-based sensory device (in the shape of a balloon array) that conveys the feel of palpation with a probe. Other devices include actuators based on compressed air (Moy, Wagner & Fearing 2000) and piezoelectric elements that transmit vibrations (Chanter & Summers 2000). Moy et al. designed a teletaction system based on human tactile perception: they examined several perceptual capabilities of the human tactile system needed for teletaction and, based on predicted subsurface strain, developed a model of a human teletaction system. Their results show that a 10% amplitude resolution is sufficient for a teletaction system with a 2 mm elastic layer and 2 mm tactor spacing.



3.4 Haptic rendering


Integration of haptics in virtual environments made it necessary to find techniques for generating the feel of touching and manipulating virtual objects. Just as visual rendering is concerned with strategies for displaying the visual properties of a virtual object, haptic rendering is concerned with strategies for displaying properties that are felt through the haptic sense, such as texture, friction, compliance and flexibility. Ho et al. (Ho, Basdogan & Srinivasan 1999) propose an efficient haptic rendering method for displaying the feel of 3-D polyhedral objects in virtual environments. They use an algorithm called "neighborhood watch", which uses precomputed connectivity information to detect collisions between the end effector (the study reports a remote effector) and the displayed virtual object. Combined with their improved search techniques in structured databases, computational time is reduced independently of the number of polygons that represent the object. They propose strategies for displaying both surface properties (such as friction and texture) superimposed onto convex or concave surfaces, and the dynamics of rigid and deformable objects.
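The cited paper's data structures are not reproduced here, but the force computation at the heart of haptic rendering can be illustrated with the widely used penalty method: once a collision is detected, display a spring force proportional to penetration depth, with friction and a texture ripple superimposed on it. The stiffness, friction and texture parameters below are illustrative:

```python
import numpy as np

def _unit(v, eps=1e-9):
    n = np.linalg.norm(v)
    return v / n if n > eps else np.zeros_like(v)

def render_contact_force(p, surface_point, normal, k=800.0, mu=0.3,
                         texture_amp=0.2, texture_freq=500.0):
    """Penalty-based haptic rendering for one contact (a generic sketch).

    p             -- haptic interface point (effector position), shape (3,)
    surface_point -- closest point on the virtual surface
    normal        -- outward unit surface normal
    k             -- contact stiffness in N/m (illustrative value)
    """
    depth = float(np.dot(surface_point - p, normal))   # penetration depth
    if depth <= 0.0:
        return np.zeros(3)                   # no collision: free motion
    f_normal = k * depth * normal            # spring pushes the point outward
    # Surface properties superimposed on the contact force:
    lateral = (p - surface_point) - np.dot(p - surface_point, normal) * normal
    f_friction = -mu * k * depth * _unit(lateral)      # resists lateral slip
    ripple = texture_amp * np.sin(texture_freq * np.linalg.norm(lateral))
    return f_normal * (1.0 + ripple) + f_friction      # texture modulates force
```

The expensive part, which the neighborhood-watch approach accelerates, is finding surface_point and normal quickly enough to keep the servo loop at its kilohertz rate.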


Beyond the specific technology of the haptic device, the coupling between visual and haptic rendering is crucial. Yokokohji et al. (Yokokohji, Hollis & Kanade 1999) coined the expression WYSIWYF (what you see is what you feel) for the spatial and temporal consistency of haptic and visual input. They propose a method that can realize correct visual/haptic registration. By using a visual tracking device and combining an encountered-type haptic device with a motion-command-type haptic rendering algorithm, they manage to deal with extreme cases of both free motion and rigid constraints.



An interesting idea is to automatically re-construct a remote environment, both visually and haptically, by collecting sensory information at a distance. This requires a system that automatically collects, analyzes and generates the visual and haptic properties of the remote environment. Dupont et al. (1999) explore issues related to such automated identification of environment haptic properties. As a case study, a simple block-stacking task, performed with a teleoperated two-fingered planar hand, is considered. An algorithm automatically segments the data collected during the task, given only a general description of the sequence of task events. Using the segmented data, the algorithm then successfully estimates the weight, width, height, and coefficient of friction of the two blocks handled during the task. These data are used to calibrate a virtual model incorporating visual and haptic feedback.
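Two of those estimates can be illustrated with elementary mechanics. This is a toy sketch under simple quasi-static assumptions, not the authors' algorithm, and it presumes the segmented data supplies load and grip forces during a static hold and at slip events:

```python
def estimate_weight(load_forces_static):
    """During a static hold, the vertical load force balances the weight,
    so the block's weight (N) is the mean held load force; mass = weight/9.81."""
    return sum(load_forces_static) / len(load_forces_static)

def estimate_friction_coefficient(load_at_slip, grip_at_slip):
    """At the moment of slip, tangential force ~= mu * normal force,
    so mu ~= load / grip, taken from the segmented slip events."""
    return load_at_slip / grip_at_slip
```

The real system fits such parameters across the whole segmented task history, but the principle is the same: each haptic property corresponds to a simple mechanical relation that the recorded forces over-determine.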


4. Future research

It seems that the accumulation of knowledge on haptics is beyond the 'basics', at a point where enough is known to allow interesting new questions and innovations. Several domains require more research. On the engineering side, most techniques are still in their infancy, providing many opportunities for exploration and innovation. Questions about techniques for force control and haptic rendering; about capturing, sampling and analyzing haptic data; about the design of transparent haptic interfaces and bodily interfaces with natural motion spaces; about programming tools and algorithms for appropriate feedback; about sensors for capturing the motion and force exerted by the hand; about algorithms for synchronized visual and haptic mediated response; and about technology for communicating haptic sensations are all interesting and require extensive research.


Haptics, being sensory-based and non-propositional in nature, is by definition personal. Yet it is extremely important to be able to communicate the expert's interpretation of, and knowledge about, varying haptic patterns. Performance and problem solving are also improved through haptic collaboration. Yet such technologies have only started to emerge. A preliminary system is being developed and applied to a two-directional virtual 'handshake' over the Web using the Immersion force feedback manipulators (see for instance the SUMMIT website at the Stanford University medical school). Thus technology for collaborative haptics, an essential next step, is just emerging.


New application areas for haptics are emerging: surgical simulation, not just for training but also to extend and empower the surgeon's capabilities through robotic arms that allow higher fidelity and lower tremor than the natural capabilities of the hand. Microsurgery is scaled so that procedures within the micro range can be performed as if they were in the mm range, extending natural human performance skills to new ranges.



Haptics also provides the visually impaired with tools to 'see' with their hands. Natural touch does not always convey information about content: for instance, when a blind person touches the screen of a computer, no information about the content of the screen is available to that person. Haptic interfaces can translate the visual information into haptic information, making accessible both symbolic information (verbal, pictorial, mathematical, graphical) and contextual information (such as when one moves to a new window).

Haptics also provides new environments for the arts. Virtual environments for sculpting, and collaborative design in architecture or the arts, are another promise for the future.

Still another domain is neurological research on haptic processing. Some studies look at where haptic data are processed (see for instance Servos et al. 2001), using functional magnetic resonance imaging (fMRI) to investigate the neural substrates involved in haptics; they identified a common region located within relatively posterior portions of the postcentral gyrus (PCG).



Processing of haptic data in the brain, and the interaction between haptic processing and visual, olfactory and auditory processing, are of major importance. How sensory information is processed for precisely timed motor acts is still a mystery. The amazingly fast processing in motor problem solving is highly developed in humans and animals but not well understood. The capability to 'know' and to develop mental models of the world based on haptics is especially puzzling. It seems that haptic patterns are organized in schemata, used not just for extremely efficient bodily actions, but also for higher cognitive acts such as metaphorical reasoning, non-verbal communication, language, and problem solving.


Another interesting domain is artificial hands. The human hand is an extremely sophisticated device, still not fully understood. The wealth of known data on the human hand's sensory capacities is not matched by an equivalent database on motor performance. Robot hands are evaluated relative to human hands, so a review of the sensory and motor capacities (tactile, thermal, and proprioceptive) of the human hand is essential; Jones (1997) provides such a review as a reference frame for the evaluation of prosthetic and dextrous robot hands. Most prosthetic hands in use at present are simple grasping devices, and imparting a "natural" sense of touch remains a challenge (for an extensive review on this topic see Jones 1997). Several dextrous robot hands exist as research tools, and even though some of these systems can outperform their human counterparts in the motor domain, they are still very limited as sensory processing systems. Some recent suggestions look at gloves made of 'smart materials', capable of self-programming, that can exert pressure on the hand. This new domain of natural haptics is still unexplored and potentially of great contribution to both artificial hands and the design of smart interfaces.











References


Akamatsu, M., & Sato, S. (1992). A mouse-type interface device with force display. In Proceedings of the Second International Conference on Artificial Reality and Tele-existence (ICAT 92), July 1992, Tokyo, pp. 178-182.

Appelle, S. (1972). Perception and discrimination as a function of stimulus orientation: The "oblique effect" in man and animals. Psychological Bulletin, 78, 266-278.

Appelle, S. (1991). Haptic perception of form: activity and stimulus attributes. In M. A. Heller & W. Schiff (Eds.), The psychology of touch (pp. 169-188). Hillsdale: Erlbaum.

Appelle, S., & Countryman, M. (1986). Eliminating the haptic oblique effect: Influence of scanning incongruity and prior knowledge of the standards. Perception, 15, 365-369.

Basdogan, C., Ho, C. H., Srinivasan, M. A., & Slater, M. (2000). An experimental study on the role of touch in shared virtual environments. ACM Transactions on Computer-Human Interaction, 7(4), 443-460.

Biggs, J., & Srinivasan, M. A. (2001). Haptic interfaces. In K. M. Stanney (Ed.), Handbook of Virtual Environment Technology. Lawrence Erlbaum Associates.

Blake, D. T., Hsiao, S. S., & Johnson, K. O. (1997a). Neural coding mechanisms in tactile pattern recognition: the relative contributions of slowly and rapidly adapting mechanoreceptors to perceived roughness. Journal of Neuroscience, 17, 7480-7489.

Brooks, F. P., Ouh-Young, M., Batter, J. J., & Kilpatrick, P. J. (1990). Project GROPE: Haptic displays for scientific visualization. ACM Computer Graphics, 24(4), 177-185.

Brooks, T. L. (1990). Telerobotic response requirements. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (pp. 113-120). Los Angeles: IEEE.

Burdea, G. C. (1996). Force and Touch Feedback for Virtual Reality. New York: John Wiley & Sons.

Burton, G., Turvey, M. T., & Solomon, H. Y. (1990). Can shape be perceived by dynamic touch? Perception & Psychophysics, 48(5), 477-487.

Byblow, W. D., Carson, R. G., & Goodman, D. (1994). Expressions of asymmetries and anchoring in bimanual coordination. Human Movement Science, 13, 3-28.

Calvin, W. (1996). How Brains Think: Evolving Intelligence, Then and Now. Basic Books.

Chanter, C., & Summers, I. (2000). The Exeter fingertip stimulator array for virtual touch. newton.ex.ac.uk/medphys/pages/array1.html (accessed Oct 2001).

Clark, F. J., & Horch, K. W. (1986). Kinesthesia. In K. R. Boff, L. Kaufman, & J. P. Thomas (Eds.), Handbook of Perception and Human Performance (Vol. 1, pp. 1-62). New York: John Wiley & Sons.

Colgate, J. E., Peshkin, M. A., & Wannasuphoprasit, W. (1996). Nonholonomic haptic display. In Proceedings of the 1996 IEEE International Conference on Robotics and Automation (pp. 539-544). Minneapolis: IEEE.

Dupont, P. E., Schulteis, M. T., Millman, P. A., & Howe, R. D. (1999). Automatic identification of environment haptic properties. Presence: Teleoperators & Virtual Environments, 8(4), 394-411.

Durlach, N. I., Delhorne, L. A., Wong, A., Ko, W. Y., Rabinowitz, W. M., & Hollerbach, J. (1989). Manual identification and discrimination of length by the finger-span method. Perception & Psychophysics, 46(1), 29-38.

Engel, F. L., Goossens, P., & Haakma, R. (1994). Improved efficiency through I- and E-feedback: a trackball with contextual force feedback. Manuscript 1013, Institute for Perception Research, Eindhoven University of Technology, The Netherlands.

Essock, E. A. (1980). The oblique effect of stimulus identification considered with respect to two classes of oblique effects. Perception, 9, 37-46.

Fasse, E. D., Kay, B. A., & Hogan, N. (1990). Human haptic illusions in virtual object manipulation. In Proceedings of the 14th Annual Conference of the IEEE Engineering in Medicine and Biology Society (pp. 1917-1918). Paris: IEEE.

Fink, P. W., Kelso, J. A. S., Foo, P., & Jirsa, V. K. (2000). Local and global stabilization of coordination by sensory information. Experimental Brain Research, 134, 9-20.

Gazzaniga, M. S. (1998). The Mind's Past. Berkeley and Los Angeles: University of California Press.

Gentaz, E., & Hatwell, Y. (1999). Role of memorisation conditions in the haptic processing of orientation and the "oblique effect". British Journal of Psychology, 90, 373-388.

Gibson, J. J. (1962). Observations on active touch. Psychological Review, 69, 477-490.

Gibson, J. J. (1979). The Ecological Approach to Visual Perception. London: Houghton Mifflin.

Glynn, I. (1999). An Anatomy of Thought. Oxford University Press.

Hill, J. W., & Jensen, J. F. (1998). Telepresence technology in medicine: principles and applications. In Proceedings of the 1998 IEEE International Conference on Robotics and Automation (pp. 539-544). USA: IEEE.

Ho, C. H., Basdogan, C., & Srinivasan, M. A. (1999). Efficient point-based rendering techniques for haptic display of virtual objects. Presence: Teleoperators & Virtual Environments, 8(5), 477-491.

Howard, I. P. (1982). Human Visual Orientation. New York: Wiley.

Howe, R. D., Peine, W. J., Kontarinis, D. A., & Son, J. S. (1995). Remote palpation technology for surgical applications. IEEE Engineering in Medicine and Biology Magazine, 14(3), 318-323.

Johansson, R. S., Landstrom, U., & Lundstrom, R. (1982). Sensitivity to edges of mechanoreceptive afferent units innervating the glabrous skin of the human hand. Brain Research, 244, 27-32.

Johansson, R. S., & Westling, G. (1984). Roles of glabrous skin receptors and sensorimotor memory in automatic control of precision grip when lifting rougher or more slippery objects. Experimental Brain Research, 56, 550-564.

Johnson, M. (1987). The Body in the Mind: The Bodily Basis of Meaning, Imagination and Reason. Chicago: The University of Chicago Press.

Jones, L. (1997). Dextrous hands: human, prosthetic, and robotic. Presence: Teleoperators and Virtual Environments, 6(1), 29-56.

Jones, L. A., & Hunter, I. W. (1992). Human operator perception of mechanical variables and their effects on tracking performance. In Proceedings of the 1992 Advances in Robotics, ASME Winter Annual Meeting, DSC-Vol. 42 (pp. 49-53). Anaheim, CA: ASME.

Kaczmarek, K. A., & Bach-y-Rita, P. (1993). Tactile displays. In W. Barfield & T. Furness III (Eds.), Virtual Environments and Advanced Interface Design. Oxford University Press.

Kandel, E., Schwartz, J., & Jessell, T. (2000). Principles of Neural Science (4th ed.). McGraw-Hill.

Kelso, J. A. S., Fink, P. W., DeLaplain, C. R., & Carson, R. G. (2001). Haptic information stabilizes and destabilizes coordination dynamics. Proceedings of the Royal Society of London B: Biological Sciences, 268(1472), 1207-1213.

Kennedy, J. M., Gabias, P., & Nichollas, A. (1991). Tactile pictures. In M. A. Heller & W. Schiff (Eds.), The psychology of touch (pp. 263-299). Hillsdale: Erlbaum.

Keyson, D. V. (1994). Tactile Directional Cues in User Interface Navigation. Manuscript no. 978/III, IPO, Institute for Perception Research, P.O. Box 513, 5600 MB Eindhoven, Holland.

Kilpatrick, P. J. (1976). The Use of Kinesthetic Supplement in an Interactive System. Ph.D. dissertation, Computer Science Department, University of North Carolina at Chapel Hill.

Lakoff, G. (1987). Women, Fire and Dangerous Things: What Categories Reveal about the Mind. Chicago: University of Chicago Press.

Lechelt, E. C., Eliuk, J., & Tanne, G. (1976). Perceptual orientation asymmetries: A comparison of visual and haptic space. Perception & Psychophysics, 20, 582-589.

Lechelt, E., & Verenka, A. (1980). Spatial anisotropy in intramodal and cross-modal judgements of stimulus orientations: The stability of the oblique effect. Perception, 9, 581-589.

Lederman, S. J., Klatzky, R. L., Hamilton, C. L., & Ramsay, G. I. (1999). Perceiving surface roughness via a rigid probe: Effects of exploration speed and mode of touch. Haptics-e, 1(1), October 7. http://www.haptics-e.org

Loomis, J. M., & Lederman, S. J. (1986). Tactual perception. In K. R. Boff, L. Kaufman, & J. P. Thomas (Eds.), Handbook of Perception and Human Performance (Vol. 1, pp. 12-1 to 12-57). New York: John Wiley & Sons.

Luyat, M., Gentaz, E., & Corte, T. R. (2001). Frame of reference and haptic perception of orientation: Body and head tilt effects on the oblique effect. Perception & Psychophysics, 63(3), 541-554.

Massie, T. H., & Salisbury, K. (1994). The PHANToM haptic interface: A device for probing virtual objects. In Proceedings of the ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Chicago, IL, November 1994.

Merleau-Ponty, M. (1964). The primacy of perception and other essays on phenomenological psychology. In J. M. Edie (Ed.), C. Dallery (Trans.), The Philosophy of Art, History and Politics. Evanston, IL: Northwestern University Press.

Merleau-Ponty, M. (1968). The Visible and the Invisible. Evanston: Northwestern University Press.

Millar, S. (1991). A reversed lag in the recognition and production of tactual drawings: Theoretical implications for haptic coding. In M. A. Heller & W. Schiff (Eds.), The psychology of touch (pp. 301-325). Hillsdale: Erlbaum.

Minsky, M., Ouh-Young, M., Steele, O., Brooks, F. P., & Behensky, M. (1990). Feeling and seeing: Issues in force display. ACM Computer Graphics, Special Issue on the 1990 Symposium on 3D Graphics, 24(2), 235-243.

Minsky, M., & Lederman, S. J. (1996). Simulated haptic textures: Roughness. In Proceedings of the ASME International Mechanical Engineering Congress: Dynamic Systems and Control Division, Vol. 2 (Haptic Interfaces for Virtual Environments and Teleoperator Systems), DSC-Vol. 58, pp. 421-426.

Mizushima, H., Nagata, H., Tanaka, E., Hatsuta, M., & Tanaka, H. (2001). Virtual reality system using force feedback device for molecular modeling. MEDINFO 2001.

Mountcastle, V. B., Talbot, W. H., & Kornhuber, H. H. (1966). The neural transformation of mechanical stimuli delivered to the monkey's hand. In A. V. S. De Reuck & J. Knight (Eds.), Touch, Heat and Pain (Ciba Foundation) (pp. 325-351). London: Churchill.

Moy, G., Singh, U., Tan, E., & Fearing, R. S. (2000). Human psychophysics for teletaction system design. Haptics-e, 1(3).

Phillips, J. R., & Johnson, K. O. (1981a). Tactile spatial resolution. II. Neural representation of bars, edges, and gratings in monkey primary afferents. Journal of Neurophysiology, 46, 1192-1203.

Phillips, J. R., & Johnson, K. O. (1981b). Tactile spatial resolution. III. A continuum mechanics model of skin predicting mechanoreceptor responses to bars, edges, and gratings. Journal of Neurophysiology, 46, 1204-1225.

Poulton, E. C. (1974). Tracking Skill and Manual Control. New York: Academic Press.

Reiner, M. (1999). Conceptual construction of fields with a tactile interface. Interactive Learning Environments, 6, 1-25.

Reiner, M. (2000). The validity and consistency of force feedback interfaces in telesurgery. Journal of Computer Aided Surgery, 2, 6.

Robles-De-La-Torre, G., & Hayward, V. (2001). Force can overcome object geometry in the perception of shape through active touch. Nature, 412(6845), 389-391.

Salisbury, J. K. (1995). Haptics: The technology of touch. http://www.sensable.com/haptics/haptwhpp.html

Salisbury, J. K. (1999). Making graphics physically tangible. Communications of the ACM, 42(8), 75-81.

Schacter, D. L. (1996). Searching for Memory. Basic Books.

Servos, P., Lederman, S., Wilson, D., & Gati, J. (2001). fMRI-derived cortical maps for haptic shape, texture, and hardness. Cognitive Brain Research, 12(2), 307-313.

Sheridan, T. B. (1992). Telerobotics, Automation and Supervisory Control. Cambridge, MA: MIT Press.

Sherrick, C. E., & Cholewiak, R. W. (1986). Cutaneous sensitivity. In K. R. Boff, L. Kaufman, & J. P. Thomas (Eds.), Handbook of Perception and Human Performance (Vol. 1, pp. 12-1 to 12-57). New York: John Wiley & Sons.

Srinivasan, M., & Dandekar, K. (1996). Investigation of the mechanics of tactile sense using two-dimensional models of the primate fingertip. Journal of Biomechanical Engineering, 118, 48-55.

Tan, H. Z., Pang, X. D., & Durlach, N. I. (1992). Manual resolution of length, force and compliance. In Proceedings of the 1992 ASME Winter Annual Meeting, DSC-Vol. 42 (pp. 13-18). Anaheim, CA: ASME.

Tan, H. Z., Srinivasan, M., Eberman, B., & Cheng, B. (1994). Human factors for the design of force-reflecting haptic interfaces. American Society of Mechanical Engineers, DSC-Vol. 55-1.

Turvey, M. T. (1992). Ecological foundations of cognition: Invariants of perception and action. In H. L. Pick, P. Van Den Broek, & D. C. Knill (Eds.), Cognition: Conceptual and Methodological Issues. Washington, DC: American Psychological Association.

Verrillo, R. T., Bolanowski, S. J., & McGlone, F. P. (1999). Subjective magnitude of tactile roughness. Somatosensory & Motor Research, 16(4).

Vierck, C. J. (1979). Comparison of punctate, edge and surface stimulation of peripheral, slowly adapting, cutaneous afferent units of cat. Brain Research, 175, 155-159.

Yokokohji, Y., Hollis, R. L., & Kanade, T. (1999). WYSIWYF display: A visual/haptic interface to virtual environment. Presence: Teleoperators & Virtual Environments, 8(4), 412-434.

Yoshikawa, T., & Nagura, A. (2001). A touch/force display system for haptic interface. Presence: Teleoperators & Virtual Environments, 10(2), 225-235.