Virtual Reality and Haptics in
Nano- and Bionanotechnology

Gaurav Sharma
Department of Mechanical and Aerospace Engineering, Rutgers,
The State University of New Jersey, Piscataway, New Jersey, USA

Constantinos Mavroidis
Department of Mechanical and Industrial Engineering,
Northeastern University, Boston, Massachusetts, USA

Antoine Ferreira
Laboratoire Vision et Robotique, ENSI de Bourges,
Université d’Orléans, Bourges, France
2.1. Objectives and Related Problems
2.2. Virtual Reality–Based Perception Solutions at the Nanoscale
2.3. Experimental Virtual Reality Systems for
3.1. Objectives and Related Problems
3.2. Virtual Reality–Based Perception Solutions at the Bionanoscale
3.3. Experimental Virtual Reality Systems for
Copyright © 2005 by American Scientific Publishers
All rights of reproduction in any form reserved.
Handbook of Theoretical and Computational Nanotechnology
Edited by Michael Rieth and Wolfram Schommers
Volume X: Pages (1–33)
Virtual Reality (VR) is a powerful technology for solving today’s real-world problems. It has
been conceived of as a tool to liberate consciousness—a digital mandala for the cyberian age.
VR refers to computer-generated, interactive, three-dimensional environments into which
people are immersed. It provides a way for people to visualize, manipulate, and interact
with simulated environments through the use of computers and extremely complex data.
The scientific community has been working in the field of VR for some years now, having
recognized it as a very powerful human–computer interface. Researchers exploit VR for
visualizing scientific data and for modeling and animating complex engineering systems.
The traditional applications of VR have been in areas such as medicine, education/arts/
entertainment, and the military. Are there any new and emerging domains in which VR could
prove beneficial now, or in the near future?

Emerging VR applications, such as the manipulation of molecules for the development of
nanotechnology devices and chemical systems, scientific and technical visualization, and so
forth, have less maturity and even fewer validation data at this time compared to the traditional
applications mentioned above; therefore, it is important to review such applications here,
for the benefit of students and researchers alike. This article describes some of the emerging
applications of VR recently completed or currently underway in the fields of nanotechnology
and bionanotechnology.
Nanotechnology has emerged as a new frontier in science and technology. The essence of
nanotechnology is the ability to work at the molecular level, atom by atom, to create large
structures or devices with fundamentally new molecular organization. Bionanotechnology
is a branch of nanotechnology that uses biological starting materials or biological design
principles or has biological (life science) applications.
VR techniques are currently being explored in nanoscience and biotechnology research
as a way to enhance the operator’s perception (vision and haptics) by approaching more or
less a state of “full immersion” or “telepresence.” The development of nanoscale devices
or machine components presents difficult fabrication and control challenges. Such devices
will operate in microenvironments whose physical properties differ from those encountered
by conventional parts. Particularly interesting microenvironments are those involved
in nanomedicine and space applications. In nanomedical applications, nanorobots could
operate inside the body to provide significant new capabilities for the diagnosis and treatment
of diseases. In space-related applications, the environment could be the vacuum of outer space,
where protein-based nanomachines could be assembled to form multi-degree-of-freedom
nanodevices that could apply forces and manipulate objects in the nanoworld, transfer
information from the nano- to the microworld, and travel in the nanoenvironment. Because
these nanoscale devices have not yet been fabricated, evaluating possible designs and control
algorithms requires using theoretical estimates and virtual interfaces/environments. Such
interfaces/simulations can operate at various levels of detail to trade off physical accuracy,
computational cost, number of components, and the time over which the simulation follows
the nano-object behaviors. They can enable nanoscientists to extend their eyes and hands
into the nanoworld and also enable new types of exploration and whole new classes of
experiments in the biological and physical sciences. VR simulations can also be used to develop
virtual assemblies of nano- and bionanocomponents into mobile linkages and predict their
performance.
2.1. Objectives and Related Problems
Nanotechnology can best be defined as a description of activities at the level of atoms and
molecules that have applications in the real world. A nanometer is a billionth of a meter; that
is, about 1/80,000 of the diameter of a human hair, or 10 times the diameter of a hydrogen
atom. The size-related challenge is the ability to measure, manipulate, and assemble matter
with features on the scale of 1–100 nm. Figure 1 shows two nanogears in mesh.
Figure 1. Nanogears no more than a nanometer wide could be used to construct a matter compiler, which could
be fed raw material to arrange atoms and build a macroscale structure. Courtesy of NASA Ames Research Center.
To achieve cost effectiveness in nanotechnology, it will be necessary to automate molecular
manufacturing. The engineering of molecular products needs to be carried out by
robots, which have been termed nanomanipulators (NM). On one side, some researchers
are trying to understand more about the physics and chemistry of the nanoworld, and on the other
side, robotics researchers are attempting to construct new tools, new control and sensing
technologies, and human–machine interfaces specific to the nanoworld. The ideal performance
requirement is that a human operator manipulates microparts in the normal-size
world and performs tasks (such as cutting, grasping, transportation, assembly, scratching,
digging, and stretching) that have a direct, similar mapping in the nanoworld [1, 2]. So far,
scanning tunnelling microscopy (STM) [3], scanning electron microscopy (SEM), atomic
force microscopy (AFM) [4], and scanned-probe microscopy (SPM) [5] seem to be the common
tools for scanning and manipulating at the nanoscale. Current work is mainly focused
on using atomic force microscope nanoprobes for teleoperated physical interactions and
manipulation at the nanoscale. The disadvantage is that the AFM is generally less accurate than the
STM, and because the AFM directly contacts the sample, the sharpness and wear of the tool play
an important role. As STM is a noncontact method (under vacuum conditions), it has little
tool wear, but manipulation is limited to conducting and semiconducting materials. Laser
beams can also be used to trap and manipulate small particles. A laser apparatus called
optical tweezers (OT) [6] provides the user with a noncontact method for manipulating objects
and can be applied to viruses, bacteria, living cells, or synthetic micro- and nanoscale particles.
AFM and STM are generally limited to two dimensions with a very limited third dimension,
whereas OT can work in three dimensions. For these nanohandling tools, it quickly becomes
apparent that visual feedback alone is not enough for precise nanomanipulation because of the scaling
effects.
VR technology comes to our aid by providing the experience of perception and interaction
with the nanoworld through the use of sensors, effectors, and interfaces in a simulated
environment. These interfaces transform the signals occurring in nanoscale processes into
signals at the macrolevel and vice versa. The requirement is that communication with the
nanoworld must be at a high level and in real time, preferably in a natural, possibly intuitive
“language.” Considering the nanospecific problems related to the task application, tools, and
the interconnection technologies leads to many flexible nanomanipulation concepts, which
range from pure master/slave teleoperation (through three-dimensional (3D) visual/VR or
haptic force feedback), over shared autonomy control (where, e.g., some degrees of freedom
are teleoperated and others operate autonomously), to fully autonomous operation.
Although many of the described technologies have been developed into more or less
mature products for robots acting in the macroworld, the nanosize of the objects poses
extreme challenges and requires a complete rethinking of the visual and haptic perception
of the nanoworld.
2.2. Virtual Reality–Based Perception Solutions at the Nanoscale

2.2.1. Restricted Visual Information
The working environment must be perceivable by the operator, and information in the
processing scene must be transmitted accurately to the operator. As far as nanotasks are
concerned, tools must be arranged in the observing area (colocality), bilateral magnification
must be stable and fully transparent, and direct and natural perception is required, with 3D
movements, dynamic images, sound, and aural interfaces. From the vision feedback point of
view, virtual environment (VE) interfaces provide insights and useful capabilities to scanned-probe
microscopes. Indeed, the field of view or scanning is in most cases restricted to a
small area, and the distance between nano-objects and the lens or the probe is very short.
Moreover, scanning (which is on the order of seconds, or minutes in some cases) does not
allow online imaging. Furthermore, because the same single probe is used for both functions,
scanning and nanomanipulating, the two cannot physically be carried out in parallel,
whatever the scanning speed is. Finally, nano-operation is executed, in general, within the
field of view. The limitation of this method is that while manipulating the specimen, the
graphic display is static and requires additional scans to see the result of the manipulation.

All of the data about the surface that has been scanned are, in fact, available to the scientist
in a grayscale image. The scientist can measure any feature of the data to exacting precision,
using standard techniques such as cross sections and statistical algorithms. The benefit of
visualization comes from getting multiple naturally controlled views (two-dimensional [2D]
pseudo-color display or 3D VR topology) of the data that lead to specific insights in data
analysis. For very flat surfaces with small nanostructures, a 2D pseudo-color view is superior
to a shaded 3D view for two reasons: the pseudo-coloring devotes the entire intensity range
to depth (revealing features that are sometimes imperceptible in a 3D display), and it obscures
small fluctuations caused by noise in the image. Small features on surfaces with other height
variations are better brought out using specular highlighting of a 3D surface (and are often
imperceptible in a 2D display) [7]. To provide an intuitive interface that hides the details of
performing complex tasks using an AFM or SPM nanomanipulator, a 3D VR topology can be
built and displayed to the user (Fig. 2b). It takes the 2D array of heights, tessellates it with
triangles, and uses a graphics computer to draw it as a surface in 3D [8]. A 3D virtual nanoworld
can be a global view of the real nanoworld or be restricted to the actual working area
[9, 10], according to operator need. Static or intuitively manipulated multiple 2D or 3D
views are then allowed in real time [11, 12]. In addition, remote features may be augmented
with multiple contrasting colors to present data in a comprehensive and easily interpretable
form [13].
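The height-to-mesh step described above (a 2D array of heights tessellated with triangles and drawn as a 3D surface) can be sketched as follows. This is a minimal illustration assuming a NumPy height array; the function name and pixel spacing are hypothetical, not from the cited systems.

```python
import numpy as np

def tessellate_height_field(z, pixel_size=1.0):
    """Convert a 2D array of AFM height samples into a triangle mesh.

    Each 2x2 neighborhood of the height grid is split into two triangles,
    giving the 3D surface that a graphics computer can draw. `pixel_size`
    is an assumed lateral sample spacing (e.g., in nm).
    """
    rows, cols = z.shape
    # One 3D vertex per height sample: (x, y, z)
    ys, xs = np.mgrid[0:rows, 0:cols]
    vertices = np.column_stack([xs.ravel() * pixel_size,
                                ys.ravel() * pixel_size,
                                z.ravel()])
    # Two triangles per grid cell, indexing into the vertex array
    triangles = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            triangles.append((i, i + 1, i + cols))             # upper-left
            triangles.append((i + 1, i + cols + 1, i + cols))  # lower-right
    return vertices, np.array(triangles)

# A tiny 3x3 scan yields 9 vertices and 2 * (2 * 2) = 8 triangles.
heights = np.arange(9.0).reshape(3, 3)
verts, tris = tessellate_height_field(heights)
```

The same vertex array can then be handed to any rendering pipeline; pseudo-color display amounts to mapping the z column onto a color ramp instead of shading the mesh.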
Figure 2. (a) Pseudo-color image of a graphite surface using directional illumination of a 3D surface. Pseudo-color
displays height: higher areas are red and lower areas are blue. From this particular viewpoint, specular reflection
reveals a regular pattern of diagonal stripes caused by layers of graphite sheet poking out of the surface. Reprinted
with permission from [7], R. M. Taylor and M. Falvo, “Pearls Found on the Way to the Ideal Interface for Scanned-Probe
Microscopes,” IEEE Visualization Conference (1997). © 1997, IEEE. (b) 3D VR topology built
and displayed to the user. Reprinted with permission from S. Horiguchi et al., “Virtual Reality User Interface
for Teleoperated Nanometer Scale Object Manipulation,” Proceedings of the 7th International IEEE Workshop on
Robot and Human Communication, ROMAN’98, Takamatsu, Japan, 1998, pp. 142–147. © 1998, IEEE.
2.2.2. Poor Feeling of Handling Forces

The lack of direct 3D vision feedback from the nanoworld, and the fragility of the telemanipulated
nano-objects, make real-time force feedback an absolute necessity of the macro-to-nanoworld
interface. Indeed, it is essential to better understand the condition of the
gripper during operation. An excessive force applied to a nano-object may lead to a non-negligible
degree of probe or object deformation and may destroy the nano-object or
make it flip away. Real-time capable force feedback is therefore indispensable during teleoperated
nanomanipulation.

Sensing at the Nanoscale. Very small gripping and contact forces in the
range of 0.1 µN up to 200 µN and more have to be sensed with nano-Newton resolution.
Only a few results can be found in the literature about the use of AFM-based force sensors in
fields other than scanning probe microscopy. Only Refs. [8] and [14] report on the application
of these sensors: either they are integrated into a microgripper that works under a
light microscope, or they serve simultaneously as a manipulator and sensor for teleoperated
nanohandling in the SEM. The cantilever serves as a force transducer. In AFM, not only the
force normal to the surface but also forces parallel to it have to be considered; therefore,
the response of the cantilever to all three components has to be analyzed. In principle, the
cantilever can be approximated by three springs—one in each direction of space. The force
acting on the tip causes deflection of these springs. To measure the forces acting on the
AFM tip, different measurement methodologies are used. The cantilever deflection can be
measured using a laser beam and a photodetector system. Depending on the cantilever,
forces ranging from a few pico-Newtons to several micro-Newtons can be measured. Forces
can be measured in the normal and the lateral direction (Fig. 3 depicts the geometry for
lateral force measurements). The cantilever can be considered as an elastic spring, and the
force exerted on the sample is simply given by Hooke’s law
F_z = k_z δ    (1)

where F_z is the pseudo-normal force, k_z is the spring constant of the cantilever (a function
of the Young’s modulus E and the cantilever geometry), and δ is the deflection. Once the
deflection and spring constant are known, the normal force can be calculated immediately.
Suppose 3D forces (F_x, F_y, F_z) are applied to the tip. These forces result in 3D torques
(τ_x, τ_y, τ_z) relative to point O:

τ_x = F_y h
τ_y = −F_z l − F_x h    (2)
τ_z = F_y l

where l is the cantilever length and h is the tip height.
Figure 3. Cantilever deflection measurement using a laser beam and a quad-photodetector system, and definition
of the 3D contact forces when the AFM probe tip is (a) at rest and (b) actuating a nanoparticle using a pushing
strategy. Courtesy of Professor Antoine Ferreira, Laboratoire Vision et Robotique of Bourges, France.
The torque τ_x causes the cantilever end to twist at an angle θ_x about the x-axis, τ_y causes the
cantilever to bend θ_y in the z-direction, and τ_z causes the cantilever to bend θ_z in the y-direction.
Using the quad-photodiode detector, θ_x and θ_y can be obtained by Eq. (3), whereas θ_z is not
measurable:

θ_x = K_l S_l
θ_y = K_n S_n    (3)

Here K_n and K_l are constants that need to be calibrated, and S_n and S_l are the normal and
lateral signal outputs of the quad-photodiode detector. Because θ_y is caused by both F_z and F_x,
and we cannot remove the effects of F_x from the signal, we define a pseudo-force F̂_z in
the z direction, as shown in Fig. 3, such that

F̂_z = −τ_y/l = F_z + F_x h/l    (4)

This pseudo-force is the normal force measured by Eq. (1). It is clear that F̂_z ≈ F_z because
h/l is usually very small. Suppose the tip lateral motion direction has an angle φ with
respect to the x-axis of the cantilever frame; then the lateral force F_t must be opposite to
the motion direction, and its amplitude can be found by

F_t = F_y / sin φ    (5)

where F_y is measured by

F_y = τ_x/h = k_θ θ_x/h    (6)

in which k_θ is the torsional constant of the cantilever (a function of the shear modulus G and
the cantilever geometry). The lateral force in the x-direction can be calculated by

F_x = F_y / tan φ    (7)

Then the 3D forces represented in the AFM frame are

F_AFM = R F    (8)

where R is the rotation matrix between the workspace and the cantilever frame. By feeding
the 3D forces F = (F_x, F_y, F_z) to a haptic device, the operator can feel forces that are
proportional to the actual forces acting on the cantilever.
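The chain from detector signals to 3D tip forces can be sketched numerically. This is a minimal illustration of Eqs. (1) and (3)–(8) as reconstructed above; all calibration constants and the cantilever geometry below are illustrative assumptions, not values from the text or from any real instrument.

```python
import numpy as np

# Hypothetical calibration constants and geometry (assumed values)
K_L = 1e-4        # lateral detector calibration K_l (rad per count)
K_THETA = 1e-9    # torsional constant k_theta (N*m/rad)
H_TIP = 10e-6     # tip height h (m)
L_CANT = 225e-6   # cantilever length l (m)
K_Z = 0.1         # normal spring constant k_z (N/m)

def forces_from_detector(s_l, deflection, phi, R=np.eye(3)):
    """Recover the 3D tip force from quad-photodiode readings.

    s_l       : lateral detector output S_l
    deflection: cantilever deflection delta (m), from the normal output S_n
    phi       : angle of tip motion w.r.t. the cantilever x-axis (rad)
    R         : rotation from the cantilever frame to the workspace frame
    """
    theta_x = K_L * s_l                      # Eq. (3): twist about x
    tau_x = K_THETA * theta_x                # torsion torque from Eq. (6)
    f_y = tau_x / H_TIP                      # Eq. (6): F_y = tau_x / h
    f_x = f_y / np.tan(phi)                  # Eq. (7)
    f_z_pseudo = K_Z * deflection            # Eq. (1): pseudo-normal force
    f_z = f_z_pseudo - f_x * H_TIP / L_CANT  # invert Eq. (4)
    return R @ np.array([f_x, f_y, f_z])     # Eq. (8)

f = forces_from_detector(s_l=50.0, deflection=2e-9, phi=np.pi / 4)
```

The resulting vector, suitably scaled, is what would be fed to the haptic device.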
Thus, piezoresistive measurement of the deflection of an AFM-like cantilever is a very
promising approach to implement in nanomanipulation systems. Piezoresistive cantilevers
are advantageous for nanohandling devices, where laser detection systems would limit the
mechanical design and motion capabilities. However, the deflection resolution is almost
10 times worse than that of laser detection. High-precision measurements require, for example,
soft cantilevers with small spring constants, whereas less precise measurements allow the
use of stiffer cantilevers, which are correspondingly better suited for handling micro- and nano-objects.
The deflection can be calculated from the resistance change (ΔR/R) of the implanted piezoresistors
[15]. For a resistor with area A_R, the piezoresistive sensitivity is given as

(ΔR/R) per unit load = (1/A_R) ∫ (π_l σ_l + π_t σ_t) dA    (9)

where π_l is the longitudinal piezoresistive coefficient, π_t is the transverse piezoresistive
coefficient, the load can be a vertical displacement, σ_l is the longitudinal stress, σ_t is the
transverse stress, and A_R is the area of the resistor. The measurement of the resistance change
itself is performed by an integrated Wheatstone bridge that supplies a voltage change as
an output signal. This signal is amplified and provided to the haptic display for further
processing and control.
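The read-out chain just described (piezoresistors in a Wheatstone bridge, whose voltage is scaled back to deflection and force) can be sketched as a round trip. All numeric values below are illustrative assumptions, not calibrated sensor data, and the quarter-bridge small-signal relation is a textbook approximation rather than the specific circuit of Ref. [15].

```python
# Assumed sensor-chain parameters (illustrative only)
V_SUPPLY = 5.0          # bridge supply voltage (V)
GAUGE_FACTOR = 50.0     # (dR/R) per unit strain for the piezoresistor
STRAIN_PER_METER = 4e3  # strain per meter of tip deflection (geometry)
K_Z = 1.0               # cantilever spring constant (N/m)

def bridge_output(deflection):
    """Voltage of a quarter Wheatstone bridge for a tip deflection (m)."""
    dr_over_r = GAUGE_FACTOR * STRAIN_PER_METER * deflection
    # Quarter-bridge small-signal approximation: V_out ~= V_s * (dR/R) / 4
    return V_SUPPLY * dr_over_r / 4.0

def force_from_voltage(v_out):
    """Invert the chain: bridge voltage -> deflection -> normal force."""
    deflection = 4.0 * v_out / (V_SUPPLY * GAUGE_FACTOR * STRAIN_PER_METER)
    return K_Z * deflection

# Round trip: a 10 nm deflection produces a small bridge voltage, and the
# recovered force matches k_z * deflection (Hooke's law, Eq. 1).
v = bridge_output(10e-9)
f = force_from_voltage(v)
```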
The Wheatstone bridge is integrated into the cantilever to achieve a maximally packaged
design (Fig. 4a). Most AFM cantilevers have a rectangular shape with a uniform cross
section, giving a high sensitivity to the normal force and a small sensitivity to lateral forces.
They are not really designed to sense forces along all three dimensions (F_x, F_y, F_z).
In addition, sensors that are sensitive to nanoforces acting in three dimensions are currently
being investigated (Fig. 4b) [16]. This sensor with integrated piezoresistors offers a distinctly
better resolution for forces in the micro- and nano-Newton ranges than does a sensor with
attached semiconductor strain gauges. However, as we can see, the laser-based approach
requires precise alignment of the laser optics with respect to the cantilever, and the piezoresistive
approach requires a cantilever specially embedded with piezoresistive material and
external circuitry to process the output of the sensor.
With OT-based manipulation, typical manipulation can be done through a 2D pointing
device such as a mouse or joystick. Usually the manipulation is done in noncontact mode,
so there is no direct force that can be felt and rendered on a haptic display. Because of the
limited ability of traditional optical microscopy to resolve nanometer-scale structures, OT,
unlike AFM and STM, must use other means and must synthesize virtual sensory inputs,
such as virtual potential fields, as an abstraction of the potential energy field that is induced
by the laser beam, for rendering the haptic display to the operator.

Display. The goal of the force control development is to offer
the operator a straightforward and flexible handling of nano-objects with the help of force
information. The information representation could be realized by visual interfaces, by haptic
devices, or by acoustic interfaces. Using multimodal interfaces for a microrobot-based
nanohandling cell, it is possible to provide visual, haptic, and sound force feedback from
the nanohandling site. Depending on the sensor information and the requirements for
the nanoworld representation, haptic interfaces allow the operator to feel and control the
nanoforces during nanomanipulation (usually through an AFM cantilever beam) by hand in
the case of a lack or absence of visual information. By using a haptic device, it is possible to
perform real-time manual control of the force F with which the nano-object is handled by
the AFM probe tip. It is also possible to control forces that occur during contact of the probe
tip or a nano-object with its environment. Figure 5 presents two multidegree-of-freedom
haptic devices [87, 17] that allow the operator to feel and control the forces in the nanoworld
by hand and to change the position of the microrobot and its AFM-based manipulator by
changing the position of the hand in space. This can be the case when nanohandling has to
be performed in closed tubes, deep inside complex 3D nanostructures, and so on, or when
Figure 4. Structure of an AFM-based piezoresistive force sensor with an integrated Wheatstone bridge: (a) force
sensing in one dimension and (b) in three dimensions. Reprinted with permission from [16], S. Fahlbusch et al.,
“AFM-Based Micro Force Sensor and Haptic Interface for a Nanohandling Robot,” IEEE/RSJ International
Conference on Intelligent Robots and Systems, Lausanne, Switzerland, 2002, pp. 1772–1777. © 2002, IEEE.
Figure 5. Force-feedback systems for haptic display: (a) haptic interface (PHANToM, SensAble Technologies)
with six-degree-of-freedom positional input and three-degree-of-freedom force output for force-feedback interaction,
and (b) a device with six-degree-of-freedom positional input and three-degree-of-freedom translational force
output. Reprinted with permission from [16], S. Fahlbusch et al., “AFM-Based Micro Force Sensor and Haptic
Interface for a Nanohandling Robot,” IEEE/RSJ International Conference on Intelligent Robots and Systems,
Lausanne, Switzerland, 2002, pp. 1772–1777. © 2002, IEEE.

it has to be performed in a subnanometer range, that is, below the limits of the resolution
of the SEM.
Real-time force-feedback control can also be performed through 3D vision-based force
sensing on a computer display. Figure 6 illustrates such nano-Newton-scale sensing applied
to a microscale cantilever beam, using a computer vision approach. The information representation
is visualized directly by the operator through 3D simulation graphics. Vision-based
nanoforce sensing provides a simple method for using an elastic part as a reliable force
sensor. One advantage of measuring nanoforces through computer vision is that
it uses equipment that often already exists in a nanomanipulation or biological manipulation
station, where a CCD camera mounted on a microscope or a scanning electron microscope
exists. Another advantage of this method is that the same technology can be applied to
geometries that are more complex than a simple cantilever beam (e.g., nanotube structures).

Figure 6. Vision-based force sensing using the mechanical deflection of a cantilever, by matching a deflected-beam
template with the deflected beam, and 3D graphic reconstruction for user force interaction. Reprinted with permission
from [17], M. A. Greminger and B. Nelson, “Vision-Based Force Sensing at Nanonewton Scales,” SPIE International
Society of Optics, Microrobotics, and Microassembly III, 2001, Vol. 4568. © 2001, SPIE.
Finally, it should be noted that acoustic interfaces based on 3D sound can help the
operator, through auditory channels, to feel the magnitude of the applied nanoforces [18, 19].
The use of sound information can enhance the user’s perception of the nature of the
nanoworld [20]. For example, sound plays a large role in determining how we perceive events,
and it can be critical to giving the user/viewer a sense of immersion. Vibration feedback
could be used as a tactile display to sense impact events such as tapping and handling in the
nanoworld. For vibration feedback in virtual environments, a model is used to determine the
forces to display [21]. When the above-mentioned techniques are unified, the teleoperator
approaches the state of full immersion, or telepresence.
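A vibration-display model of the kind cited above can be sketched simply. One common choice in the haptics literature is an exponentially decaying sinusoid whose amplitude scales with impact speed; this is a generic illustration, not the specific model of Ref. [21], and all parameters (gain, decay rate, frequency) are illustrative assumptions.

```python
import math

def impact_waveform(v_impact, t, gain=1.0, decay=60.0, freq=250.0):
    """Force command (N) at time t (s) after an impact at speed v_impact (m/s).

    Decaying sinusoid: amplitude proportional to impact speed, ringing at
    an assumed frequency and dying out at an assumed exponential rate.
    """
    return gain * v_impact * math.exp(-decay * t) * math.sin(2 * math.pi * freq * t)

# Sampled at 1 kHz for a short burst after a tapping event
samples = [impact_waveform(0.02, n / 1000.0) for n in range(50)]
```

Played on the haptic device, such a burst conveys the "tap" of the probe contacting a surface even when the steady-state force is too small to feel.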
2.2.3. Inaccurate Models at the Nanoscale

The remote operational environment is effectively hostile to humans because the behavior
of nanoworld components is very complex to understand and to manipulate. Moreover,
because a human operates on the basis of macroworld physics, tasks cannot be easily
executed by the operator. When objects are scaled down to the nanometer scale, there
are many significant changes in the physics and properties of materials: the surface-to-volume
ratio increases (i.e., surface forces such as friction and drag dominate inertial forces, surface
properties can dominate bulk properties, and friction also becomes a function of contact
area); the dynamics of the objects become faster; and heat dissipation increases. For parts with
dimensions of less than 100 µm, adhesive forces such as electrostatic, van der Waals, and
surface tension forces become dominant with respect to inertial forces and make it difficult to grasp
and release nanoparts during operation. Understanding nanodynamics and the effect of
various nonlinear forces, together with a reliable modeling approach, is a critical issue for
the success of manipulation tasks. Assuming that the AFM tip is spherical, the nanoforces
between a sphere, a particle, and an elastic flat substrate are to be modeled for simulating
the nanomanipulation interactions in a VR environment.

Effects. Figure 7 illustrates quantitative information on the forces between
the AFM probe tip and a sample as a function of tip–sample distance under ambient conditions.
If the AFM tip approaches the sample surface, the cantilever is deflected from its original
position. During the approach phase of the tip to the surface, there is at first no tip–sample
contact. Tip–sample contact occurs at the second point, where the tip jumps into contact
with the sample as a result of van der Waals and electrostatic forces. The cantilever is
deflected further under an increasing force in the linear part of the force–distance curve.
Figure 7. Example of a force–distance curve (nanoforce F_s in µN versus tip–sample distance in µm, with
attractive/noncontact and repulsive/contact regions) during approach to and retraction from a flat silicon surface
with a piezoresistive AFM nanoprobe. Courtesy of Professor Antoine Ferreira, Laboratoire Vision et Robotique
of Bourges, France.
When the tip retracts in the z-direction, the force on the cantilever decreases. Adhesive
forces between the tip and the sample keep the tip in contact with the sample beyond
the previous first-contact point. This leads to negative deflection of the cantilever. The
cantilever then breaks free from the surface (“pull-out”) and returns to its starting deflection.
The main forces causing the noncontact attraction and contact repulsion are modeled and
explained here.
Van der Waals forces exist for every material in every environmental condition (like the
gravitational force in the macroworld), and they depend on the object geometry, material
type, and separation distance. They are caused by momentary dipole moments between
atoms, resulting from the interaction between electrons in the outermost bands rotating around
the nucleus. If we assume that the end-tip of an AFM cantilever is ideally spherically shaped,
the interactive force between a spherical probe tip and a spherical particle can be estimated
from the interaction force resulting from van der Waals forces between two spheres [22]:

F_vdW = −(HC/3) [ 2R_1R_2/(C² − (R_1 + R_2)²)² + 2R_1R_2/(C² − (R_1 − R_2)²)²
        − 1/(C² − (R_1 + R_2)²) + 1/(C² − (R_1 − R_2)²) ]    (10)

where R_1 is the radius of curvature of the spherical probe tip, R_2 is the radius of the sphere-shaped
sample, C is the distance between the two centers, and H is the Hamaker constant.
The equivalent Hamaker constant for two different materials is H_12 = √(H_1 H_2), in which H_1
and H_2 are the Hamaker constants of the individual materials. By letting R_2 go to infinity,
the van der Waals force between a spherical probe and a flat surface (Fig. 8a) is given by

F_vdW = −2HR_1³ / (3d²(d + 2R_1)²)    (11)

where d is the distance from the flat surface to the spherical probe tip.
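The sphere–flat expression of Eq. (11) is easy to evaluate numerically and, for separations much smaller than the tip radius, it reduces to the familiar close-range approximation F ≈ HR/(6d²). The Hamaker constant and tip radius below are illustrative assumptions of a typical order of magnitude.

```python
# Assumed physical constants (typical orders of magnitude)
H = 1e-19   # Hamaker constant (J)
R = 20e-9   # probe tip radius R_1 (m)

def vdw_sphere_flat(d, H=H, R=R):
    """Magnitude of the attractive sphere-flat van der Waals force, Eq. (11)."""
    return 2.0 * H * R**3 / (3.0 * d**2 * (d + 2.0 * R)**2)

def vdw_close_range(d, H=H, R=R):
    """Common approximation H*R/(6 d^2), valid for d << R."""
    return H / 6.0 * R / d**2

# Near contact (d = 0.2 nm << R) the two expressions agree to about 1%.
d = 0.2e-9
exact = vdw_sphere_flat(d)
approx = vdw_close_range(d)
```

The ratio of the two forms is (2R/(d + 2R))², which makes explicit how the full expression rolls off once d becomes comparable to the tip radius.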
Capillary forces arise because the water layer on the surfaces of the probe, particle, and substrate
results in an adhesion force. A liquid bridge occurs between the tip and the surface at close
contact, as shown in Fig. 8b. Using the macroscopic theory, the adhesion phenomenon is
expressed by [23]

F_cap = 4πR γ_w (1 − (h − 2e)/(2r)) cos θ · u(−h + L)    (12)

The parameters of this capillary force are θ (the contact angle of the meniscus), u(·) (the
step function), h (the probe tip/substrate distance), γ_w (the liquid surface energy, for water
γ_w = 72 mJ/m²), e (the thickness of the liquid layer), r (the radius of curvature of the meniscus),
and R (the radius of the probe tip). The parameter L is defined as L = 2e during
approach, which is the thickness of the water layer, and as the breaking length of the
meniscus during retraction. This force is attractive for values of θ less
than 90 degrees.
Figure 8. Parabolic nanoprobe tip and a flat surface interacting (a) via the van der Waals and electrostatic force
parameters, (b) via the capillary force parameters during the approach or retraction phases, and (c) via the interacting
forces during positioning of nanoparticles by AFM-tip contact pushing. Courtesy of Professor Antoine Ferreira,
Laboratoire Vision et Robotique of Bourges, France.
Via electrostatic forces, significant amounts of charge may be generated by friction (during
pushing manipulation) and by differences in contact potentials. By grounding a conducting
substrate such as Si or Au, the electrostatic forces can be reduced. However, in the case of
nonconducting particles, there are charges trapped around the perimeter of the particles, and
during pushing or contact, triboelectrification induces local charges. These forces occur during
an adhesion-based manipulation strategy, when the electrostatic force between the sample
and the substrate becomes important. During pushing-based manipulation, the charge of the
sample is transferred to the tip, which can cause an electrostatic force between the sample
and the probe tip (the sample can then stick to the tip during retraction, which is observed
in some cases). Treating the charged tip–sample gap as a parallel-plate capacitor, the force
can be written as

F_el = −(ε S U²)/(2h²) · u(h − h_0)    (13)

where ε is the permittivity and S is the shared area; U is the resulting voltage difference
between the probe tip and the sample. As previously, h is the probe tip/substrate distance,
and h_0 represents the minimal distance down to which Eq. (13) is valid.
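Equation (13) as reconstructed above can be sketched directly, with the step function u(h − h_0) switching the model off below its validity distance. The shared area, voltage, and cutoff below are illustrative assumptions.

```python
EPS0 = 8.854e-12   # vacuum permittivity (F/m)

def electrostatic_force(h, U, S=1e-14, h0=0.3e-9, eps=EPS0):
    """Magnitude of the electrostatic force (N) between tip and sample, Eq. (13).

    h : tip-substrate distance (m); U : voltage difference (V);
    S : shared (facing) area (m^2), assumed; h0 : validity cutoff, assumed.
    """
    if h < h0:            # u(h - h0): the parallel-plate model is not valid here
        return 0.0
    return eps * S * U**2 / (2.0 * h**2)

# The force falls off as 1/h^2: doubling the gap quarters the force.
f1 = electrostatic_force(h=1e-9, U=0.5)
f2 = electrostatic_force(h=2e-9, U=0.5)
```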
Frictional forces are also involved: during pushing (Fig. 8c), the friction between the sample and
the substrate (indexed ss) plays an important role. The friction at the
micro/nanoscale can be defined as

F_ss = µ_ss N_ss    (14)

where N_ss = F_adh + F_ext, with F_adh = 4πRγ at contact, F_ext the external preload, and µ_ss
the particle–substrate friction coefficient. There is also friction between the tip and the
sample (indexed ts), such that F_ts = µ_ts N_ts, with N_ts defined analogously from the
tip–sample adhesion and preload.
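The adhesion-augmented normal load in Eq. (14) explains why nanoparts resist sliding and release even with no applied preload; a minimal sketch, with an assumed friction coefficient and the water surface-energy value quoted earlier:

```python
import math

GAMMA = 72e-3   # surface energy (J/m^2); water value quoted in the text

def friction_force(mu, R, f_ext, gamma=GAMMA):
    """Sliding friction F = mu * (F_ext + 4*pi*R*gamma), per Eq. (14)."""
    f_adh = 4.0 * math.pi * R * gamma   # adhesion contribution at contact
    return mu * (f_ext + f_adh)

# For a 50 nm particle, even with zero external preload, adhesion alone
# yields a nonzero friction force - one reason release is hard at this scale.
f = friction_force(mu=0.3, R=50e-9, f_ext=0.0)
```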
Repulsive contact forces come into play as well. The contact area between the sample
and the AFM probe tip is very small, so only the deformation between the particle and the
substrate along the z-axis is considered. The contact forces are modeled by considering the
repulsive elastic deformation/indentation forces with surface adhesion. If we assume a small
load and high surface forces, the JKR model [24] is commonly used to approximate the
contact area.

Simulation for Nanomanipulation Tasks. These reality-based models
could be used to enhance the haptic display of virtual environments at the nanoscale. VR-domain modeling would consist of using this knowledge to construct a 3D model of the environment that simulates the nanointeractions (i.e., attractive, repulsive, adhesion, and frictional nanoforces [25, 26]) in a realistic manner. When a proper virtual environment is available, it will greatly help in designing nano-operation activities. For example, it visualizes the virtual movement in the nanoworld during manipulation as a way of planning automatic assembly tasks, it gives virtual force reflection that aids the development of force-control methods for nano-telemanipulation, and it facilitates testing of different manipulator and tool structures before constructing them, reducing the development cost. As an illustration,
Fig. 9a shows the physically based simulation of an AFM probe tip contacting a nanosphere. The nanosphere is surrounded by a force field, which corresponds to the minimum distance at which the objects attract each other. From a physical point of view, this force field represents the combination of noncontact forces in the microworld, that is, electrostatic, adhesive, and van der Waals forces. By implementing the physical model of these noncontact attractive forces, the theoretical distance can be estimated with high precision during nano-positioning (Fig. 9b). However, this does not take into account the physical and geometrical imperfections of the objects and the manipulator tip. An experimental calibration procedure allows us to avoid such problems.
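A minimal sketch of such model-based distance estimation, using only the sphere–plane van der Waals term; the Hamaker constant, sphere radius, and the restriction to a single force term are all simplifying assumptions for illustration.

```python
import math

# Sketch: estimating tip-object distance from a noncontact force model
# during nano-positioning. Only the sphere-plane van der Waals term
#   F_vdw = -A * R / (6 * h**2)
# is modeled; Hamaker constant A and radius R are illustrative.

def vdw_force(h, A=1e-19, R=50e-9):
    """Attractive van der Waals force [N] at separation h [m]."""
    return -A * R / (6.0 * h**2)

def distance_from_force(F, A=1e-19, R=50e-9):
    """Invert the model: estimate separation h [m] from a measured
    attractive force F [N] (F must be negative)."""
    return math.sqrt(-A * R / (6.0 * F))

h_true = 5e-9
F = vdw_force(h_true)
h_est = distance_from_force(F)  # recovers h_true up to rounding
```

In practice the chapter's point stands: imperfections of tip and object mean such a model must be completed by an experimental calibration step.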
Furthermore, haptic-rendering algorithms offer the possibility of generating an adequate force field to simulate the contour of the object and surface properties such as friction [27] and texture [28], which enhance the virtual haptic display of nano-objects. Image-based texture maps can be used to modulate any of the surface parameters (friction, viscosity, or stiffness) to create different haptic effects. Accurate physical simulations, which
[Figure 9 plot: interactive force versus distance (2–10 nm) between the AFM tip and the micro-object; the silicon AFM tip and the contact point are marked.]
Figure 9. (a) Interactive adhesive nanoforces when the AFM-based robot closely approaches the nanosized object, and (b) calibrated interactive force for the planned trajectory. Courtesy of Professor Antoine Ferreira, Laboratoire Vision et Robotique of Bourges, France.
have been studied in computational mechanics, can provide a powerful tool for automatically generating elastic deformations. They have traditionally been very expensive in terms of computation time, but they are becoming increasingly affordable with the continually growing performance of computer hardware. The basic approach for contact problems is to detect penetration between objects and compute an appropriate response that eliminates or at least reduces the penetration, which is especially important for the micromanipulation of biological cells. Figure 10 gives an example of force calculation and 3D display during surface deformation of a biological cell. When the cantilever touches the surface and is pushed further into the surface of the nano-object, a force arises between the surface and the tip [29, 30]. Thus, the operator can visualize the effect of the interactive force as well as feel the reflected force. To speed up the graphical display, the full shape variation during object deformation is not computed; instead, the curved surfaces of the deforming object are approximated using linearly varying planes.
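The penetration-based contact response described above can be sketched as a simple penalty force; the stiffness value is an illustrative assumption, not a measured membrane property.

```python
# Sketch of the basic contact-response approach: detect penetration of
# the tip into a deformable surface and return a penalty (spring-like)
# repulsive force that reduces the penetration. The stiffness value is
# an illustrative assumption.

def contact_force(tip_z, surface_z, stiffness=0.05):
    """Return repulsive force [N] along z; zero when not penetrating.

    tip_z, surface_z : heights [m]; penetration occurs when tip_z < surface_z
    stiffness        : penalty spring constant [N/m]
    """
    penetration = surface_z - tip_z
    if penetration <= 0.0:
        return 0.0
    return stiffness * penetration  # pushes the tip back out (+z)
```

A haptic loop would evaluate such a response at every update, so the operator both sees the deformation and feels the reflected force.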
As is well known, nano-interaction is not reproducible enough to automate a nano-procedure. Therefore, the use of physically based simulation techniques for 3D multibody nano-systems would enhance the operator's skills by letting him learn and feel a realistic nanoworld in an off-line user-interaction mode. Then, by practicing the adequate gestures through trial-and-error schemes, the operator would be able to reproduce the nanomanipulation tasks in a real environment.
2.2.4. Complex Strategy of Nanomanipulation
Nanotechnology will involve the planning and scheduling of assembly sequences in an eutactic environment. At the nanoscale, tasks are defined as conventional ones such as positioning, assembling, gripping, releasing, adjusting, fixing in place, pushing, pulling, and so forth of individual nano-objects.
Figure 10. 3D Nano-Graphics: force display and surface deformation. To provide the operator with force feedback, the magnitude of the force is displayed as a cone in the computer model. Courtesy of 3D Nano-Graphics. © Dr. Hideki Hashimoto, University of Tokyo, Japan.
of Nanohandling
It is believed that free-space motion planning
and the geometric assembly constraints of macroworld planners will directly apply in the microdomain. However, fine-motion planning and precise motion will differ from the macroworld. As has been shown previously, assembly in the nanodomain is not reversible: the motions required to pick up a part are not the reverse of the motions required to release it. Figure 11 shows the static forces on the sphere at the various stages of picking up and releasing a nano-object during force-controlled adhesion for three-dimensional (3D) assembly of nanoparticles. The forces involved in the adhesion process are the force of gravity F_g, the force of surface attraction F_s (if we assume that there is no electrical charge on the sphere, this reduces to the van der Waals attraction), and the force of tool attraction F_t. To pick up the part, the force of tool attraction must be greater than the sum of the gravitational force and the force of surface attraction (F_t > F_g + F_s). To hold the part, the force of tool attraction must be greater than the gravitational force (F_t > F_g). Finally, to release the part, the force of tool attraction must be less than the sum of the gravitational force and the surface attraction (F_t < F_g + F_s) [31].
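The pick/hold/release conditions of Fig. 11 can be stated directly as predicates; this is a minimal sketch of the stated inequalities, with example magnitudes that are purely illustrative.

```python
# Sketch of the adhesion-based pick/hold/release conditions of Fig. 11:
#   pick up:  F_t > F_g + F_s
#   hold:     F_t > F_g
#   release:  F_t < F_g + F_s
# F_t = tool attraction, F_g = gravity, F_s = surface attraction.

def can_pick(F_t, F_g, F_s):
    """Tool must overcome gravity plus surface adhesion."""
    return F_t > F_g + F_s

def can_hold(F_t, F_g):
    """Once lifted, only gravity opposes the tool."""
    return F_t > F_g

def can_release(F_t, F_g, F_s):
    """Surface adhesion plus gravity must win the part back."""
    return F_t < F_g + F_s

# A tool attraction between F_g and F_g + F_s can hold a part that it
# could never have picked up from the surface -- the asymmetry that
# makes nanoassembly non-reversible.
assert can_hold(F_t=2.0, F_g=1.0) and not can_pick(F_t=2.0, F_g=1.0, F_s=3.0)
```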
To link the macroworld to the nanoworld, a virtual environment for physical simulation in which assembly and motion-planning strategies can be tested is of real interest. Thus, the reliability and efficiency of manipulation tasks can be tested with respect to various tool shapes, different properties of the materials in contact (tools, objects, and substrate), adequate adhesion-reducing textures, efficient releasing strategies, and so on. As an example, Fig. 12 shows a virtual simulator developed at the Laboratoire de Robotique de Paris 6 [32] for testing a releasing strategy that exploits dynamic effects. Once the pick-up, holding, and release goal regions are determined, normal assembly and path-planning routines can be
used to determine assembly sequences and collision-free paths.

Path and Trajectory Planning Strategies
Originally, the AFM mechanism is based on interatomic force interaction for acquiring topography images of a substrate. It can be applied to imaging all types of particles/samples that are fully or semifixed on a substrate with homogeneous surface stiffness and interatomic force properties. By changing its function from imaging only to both imaging and manipulation, 3D path and trajectory planning strategies for an AFM-based force-controlled system become possible for two-dimensional (2D) positioning of nanoparticles using pushing operations (Fig. 13) [33, 34], for nanocutting and nanolithography [35], or for nanomeasurement of samples such as adenoviruses, DNA fibers, and so forth [7].
(a) Approach; (b) Adhesion; (c) Capture
Figure 11. Capture by adhesion of a 50-µm-diameter object. Reprinted with permission from [32], D. S. Haliyo and S. Régnier, “Advanced Applications Using µMAD, the Adhesion Based Dynamic Micro-Manipulator,” IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Kobe, Japan, 2003, pp. 880–885. © 2003, IEEE.
(a) Adhesion; (b) Capture; (c) Release
Figure 12. Physically based simulation, in a virtual environment, of dynamic effects during the release of an object. Reprinted with permission from [32], D. S. Haliyo and S. Régnier, “Advanced Applications Using µMAD, the Adhesion Based Dynamic Micro-Manipulator,” IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Kobe, Japan, 2003, pp. 880–885. © 2003, IEEE.
Hence, an intuitive VR-based interface that hides the details of performing complex 3D tasks, combined with a 3D topography display, appears to be an attractive solution. The introduction of direct human interaction creates not only an enhanced measurement capability but also an automated technology presaging the nanofabrication or repair of nanostructures. The intuitive interface would include virtual tools or virtual effective probes together with a 3D representation used as a functional intermediary to map operator actions to the nanoworld. Its purpose is to assist a human operator with the different levels of planning and decision making involved in performing remote nanomanipulation operations. A number of features could be provided to make this possible:
• Virtual barriers to “fence off” areas or objects to protect against collisions (repulsive force fields) or contamination with other materials
Figure 13. In this sequence of images, from left to right, a 15-nm gold ball (circled) is moved, avoiding collisions, into a test rig. Force feedback is used to feel when the ball is pushed and when it has slipped off the tip. Reprinted with permission from [7], R. M. Taylor and M. Falvo, “Pearls Found on the Way to the Ideal Interface for Scanned-probe Microscopes,” IEEE Visualization Conference, 1997. © 1997, IEEE.
• 3D stencils to constrain teleoperated motions to within predefined boundaries, improving the reliability and safety of the probe tip
• 3D tape measure to determine straight-line distances between any two points within the virtual 3D image
• 3D protractor to quickly measure angles between any two intersecting 3D lines
• Color changes of objects to warn dynamically of impending overturn, collisions, and so forth
• Surface coloring to indicate the location and quantity of nongeometric information, such as temperature and radiation
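The first listed feature, a virtual barrier implemented as a repulsive force field, can be sketched as follows; the spherical fence geometry, gain, and radii are illustrative assumptions.

```python
import math

# Sketch of a "virtual barrier" feature: a repulsive force field around
# a fenced-off spherical region that pushes the probe cursor away before
# it can collide. Gain and radii are illustrative assumptions.

def barrier_force(cursor, center, fence_radius, gain=1.0):
    """Repulsive 3D force on the cursor; zero outside the fence.

    cursor, center : (x, y, z) positions
    fence_radius   : protected radius around center
    gain           : force scaling factor
    """
    d = [c - o for c, o in zip(cursor, center)]
    dist = math.sqrt(sum(x * x for x in d))
    if dist >= fence_radius or dist == 0.0:
        return (0.0, 0.0, 0.0)
    # Force grows from zero at the fence to its maximum near the center,
    # always directed radially away from the protected region.
    scale = gain * (fence_radius - dist) / dist
    return tuple(scale * x for x in d)
```

Rendered through the haptic device, such a field makes forbidden regions feel like soft walls rather than relying on the operator's visual attention alone.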
However, building patterns with large numbers of particles interactively is tedious and tends to be inaccurate because humans have difficulty performing precise positioning tasks. Automating the process of moving a large number of objects (potentially hundreds or thousands) in near real time is necessary to make such nanorobotic tasks possible [36, 37]. Up to now, automatic motion and manipulation planning (termed a nanoplanner) has not been efficient at the nanoscale because of scale effects. Another extension is human/planner cooperation using a haptic and 3D VR simulator to solve the complex strategy of nanoautomation. Introducing hints from human operators could provide a means of increasing the efficiency of automatic methods while still relieving the operator of the tedious, time-consuming work of producing the path manually. The human operator communicates with the automatic planner by manipulating virtual nano-objects using a haptic interface, capturing path configurations that seem feasible, and passing them to the planner for further processing.
2.3. Experimental Virtual Reality Systems for Nanotechnology
In this section, we give an overview of a few VR systems developed for nanotechnology-related applications.
Researchers at the Institute of Industrial Science, University of Tokyo, have proposed a user interface for teleoperated nanometer-scale object manipulation [2, 8]. A 3D VR computer graphics display presents the topology of the nanoworld to the user, and a 1-DOF haptic device used together with a conventional 2D mouse serves as the master manipulator, which also enables force and tactile feedback from the nanoworld.
Preliminary experiments with the user interface show that it can be used for telenanorobotics applications such as the 2D assembly of nanoparticles and biological objects.
The researchers also proposed that the force-feedback haptic interface for such manipulation tasks can be realized using a haptic device constructed in Hashimoto's Laboratory called the Hand Shake Device [38]. This interface is used to convey nanometric forces on the order of nanonewtons to the operator's hand.
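Conveying nanonewton-scale forces to a human hand implies an enormous scaling gain in the bilateral coupling. A minimal sketch of such force scaling follows; the gain and saturation limit are assumptions for illustration, not specifications of the Hand Shake Device.

```python
# Sketch of macro/nano force scaling for haptic display: nanonewton-scale
# interaction forces are amplified and clipped to the device's output
# limit before being rendered. The 1e9 gain and 5 N limit are
# illustrative assumptions.

def scale_for_haptics(F_nano, gain=1e9, F_max=5.0):
    """Map a nanoworld force [N] to a displayable haptic force [N],
    saturating at the device limit +/- F_max."""
    F = F_nano * gain
    return max(-F_max, min(F_max, F))

f = scale_for_haptics(2e-9)  # a 2 nN interaction force felt as ~2 N
```

Clipping matters in practice: adhesive snap-in forces can spike by orders of magnitude, and an unclipped gain would slam the haptic stylus.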
Scientists at the National Institute of Standards and Technology are developing an Open Architecture for Virtual Reality in Nanoscale Manipulation, Measurement, and Manufacturing [39]. The goal of this project is to develop an open architecture that lets users incorporate powerful VR functionality into nanorelated research in a standard way. The architecture covers generalized VR-related issues and, more important, issues that are specific to the nanoscale.
The Collaboratory for Advanced Computing and Simulations at the University of Southern California is using immersive and interactive virtual environments for the explorative visualization of multiscale simulation data for nanosystems [40]. They have implemented this system in an ImmersaDesk [41] virtual environment. The ImmersaDesk consists of a pivotal screen, an Electrohome Marquee stereoscopic projector, a head-tracking system, an eyewear kit, infrared emitters, a wand with a tracking sensor, and a wand-tracking input/output subsystem. A programmable wand with three buttons and a joystick allows interactions between the viewer and simulated objects. The rendering system is an SGI Onyx with two R10000 processors, 4 Gbytes of system RAM, and an InfiniteReality2 [101] graphics pipeline; it can render walkthroughs of multimillion-atom systems.
[Figure 14 diagram: a macroworld 3D VR interface running on a PC (Pentium, 166 MHz, 32 MB, Windows) coupled to an AFM system (nanorobot) in the nanoworld.]
Figure 14. Telenanorobotics system. Reprinted with permission from S. Horiguchi et al., “Virtual Reality User Interface for Teleoperated Nanometer Scale Object Manipulation,” Proceedings of the 7th International IEEE Workshop on Robot and Human Communication, ROMAN’98, Takamatsu, Japan, 1998, pp. 142–147. © 1998, IEEE.
Also, scientists of the Argonne Futures Lab at the Argonne National Laboratory are exploring the use of spatially immersive virtual reality systems (e.g., CAVE [42, 102] and ImmersaDesk) for the interactive modeling and visualization of nanotechnology-relevant molecular systems [43]. The goal of their work is to characterize the role that immersive virtual reality can play in improving the user's effectiveness in conceiving, modeling, and understanding large-scale molecular nanostructures. Spatially immersive display (SID) devices surround the user in real space with a 3D computer-generated visual and audio scene that is responsive to the user's point of view, orientation, and actions.
Researchers at the University of North Carolina at Chapel Hill have developed a nanomanipulator system [44]. Scanning-probe microscopes (SPMs) allow the investigation and manipulation of surfaces down to the atomic scale. The nanoManipulator (NM) system provides an improved, natural interface to SPMs, including the STM and AFM. The NM couples the microscope to a VR interface that gives the scientist virtual telepresence on the surface, scaled by a factor of about a million to one. It provides new ways of interacting with materials and objects at the nanometer scale, placing the scientist on the surface and in control while an experiment is happening.
Figure 15. A scientist immersed in an atomistic model of a fractured ceramic nanocomposite. Reprinted with permission from [40], A. Nakano et al., IEEE Comput. Sci. Eng. 3, 56 (2001). © 2001, IEEE.
Nanomotion planning for complex nanotasks implies the integration of nanorobotics, 3D sensing, VR systems, and computer-aided design. A hybrid motion-planning approach, which combines vision-based global path planning and model-based local motion planning, is currently being investigated at KIST in Seoul [45].
Researchers at the Laboratoire Vision et Robotique of Bourges have developed human/NM interfaces for real-time automatic motion planning. A 3D topology of the nanoworld is reconstructed, in real time, from real 2D SEM images. The visual and 3D reconstruction refresh times are less than 20 ms.
[Figure 16 diagram labels: Atomic Force Microscope, Force Feedback, Visual Feedback, Engine and Host Processor. © 1997 UNC-CH; Todd Gaul, photographer.]
Figure 16. NM system using a VE interface to SPMs. A student uses the NM to examine carbon nanotubes. Reprinted with permission from [7], R. M. Taylor and M. Falvo, “Pearls Found on the Way to the Ideal Interface for Scanned-probe Microscopes,” IEEE Visualization Conference, 1997. © 1997, IEEE.
Figure 17. Haptic and VR interfaces for real-time human/planner cooperation using physically based models for micro- and nanomanipulation. Repulsive force fields are integrated for the workspace boundary and a microsphere obstacle to guide nanoplanning tasks. Courtesy of Professor Antoine Ferreira, Laboratoire Vision et Robotique of Bourges, France.
Thus, a truly interactive virtual environment (VE) that incorporates efficient, dynamic, physically based simulation (adhesive, repulsive, and attractive nanoforces) and motion-planning techniques applicable to complex nanotasks has been demonstrated. In this system, a haptic device (PHANToM) [99, 100] and visual interfaces (a 3D head-mounted display with a head-tracking system) are used together to exploit the user's skills, and an automatic motion planner cooperates to solve a motion-task query [46], physically placing the scientist at the scale of a nanopart. The operator's planning difficulty is evaluated with easily measurable quantities such as the number of collisions produced by the nanoforces, the NM clearance, and the contact forces acting on the cantilever tip.
Recently, VR has been used to define an intuitive interface to a nanoscale optical tweezers (OT)-based manipulation device [47]. To convey the underlying physical nature of the device, a VE that can simulate the physics of the laser beam and particle interactions is shown in Fig. 18a. The solid sphere at the center is the particle, illuminated by a dithered laser beam shooting in the direction of the plane. The small sphere to the left of the particle is the operator's cursor, a small spherical ball whose position is controlled by the stylus of the haptic device. The operator moves the cursor to the particle of interest and presses the button attached to the stylus, and the particle is then rigidly glued to the cursor. Figure 18b illustrates the instant when the particle is steered to the right and the displacement exceeds the physical capability of the potential field. The virtual environment reacts with a force that counters the steering motion.
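The OT interaction just described can be sketched as a spring-like trap with a finite escape radius; the trap stiffness and escape radius below are illustrative assumptions, not parameters of the device in [47].

```python
# Sketch of the optical-tweezers interaction described above: near the
# beam axis the trap acts like a spring pulling the particle back toward
# the center; beyond the trap's physical capability (escape radius) the
# restoring force is lost. k_trap and r_escape are illustrative.

def trap_force(displacement, k_trap=1e-6, r_escape=0.5e-6):
    """1D restoring force [N] on a trapped particle displaced by
    `displacement` [m] from the beam center."""
    if abs(displacement) > r_escape:
        return 0.0                 # steering exceeded the trap's capability
    return -k_trap * displacement  # spring-like pull back to the center
```

Rendering this force on the haptic stylus lets the operator feel the counter-force while steering and notice, by its sudden disappearance, when the particle has been pulled out of the trap.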
Figure 18. Intuitive interface using a virtual environment to control the motion of OT: (a) initial screen representing the solid sphere at the center as a particle and an ellipse representing the dithered laser beam shooting in the plane; (b) the virtual environment reacts with a force when the displacement exceeds the physical capability of the potential field. Courtesy of Kevin Lyons, J. Res. Natl. Inst. Standards Technol. 108, 2003.
3.1. Objectives and Related Problems
Bionanotechnology is an emerging area of scientific and technological opportunity. It is a new and rapidly growing interdisciplinary field addressing the assembly, construction, and utilization of biomolecular devices on the basis of nanoscale principles or dimensions. Research and product development at the interface of the physical sciences and biology as applied to this area require multiskilled teams and often novel technical approaches for material synthesis, characterization, and applications. This section gives an overview spanning fundamental considerations and applications, with an emphasis on the development and construction of molecular motors. One such motor, the viral protein linear (VPL) motor [48], is shown in Fig. 19.
Almost all physical illnesses that affect the human body, such as cancer and AIDS, are caused by pathology at the atomic level. Bulk chemical reactions were earlier used to study these molecules and their function. The problem with this approach is that the reactions occur stochastically, and we can measure only average quantities. As nanotechnology will allow for atomic assembly, disassembly, and rearrangement, the task of treating a disease can be viewed simply as a task of atomic engineering. Bionanotechnology applies the tools and processes of nano-/microfabrication to build devices for studying such biosystems. For example, it can be used to disassemble cancer cells or invalidate cancerous viral material, alter DNA [49, 50] and RNA chains, or, in the near future, repair damaged tissue or perform restorative dental procedures on a patient's tooth. Precise manipulation could help scientists better understand the principles of molecular motors [51, 52], design chips for gene sequencing and engineering, and develop biosensors and lab-on-chip diagnostic devices.
To achieve this, it is essential for molecular scientists to be able to visualize atom-to-atom interactions in real time and see the results in a fully immersive 3D environment. Also, to facilitate user input in bionano systems, it is essential to develop voice-, gesture-, and touch-recognition features in addition to the conventional visualization and manipulation techniques. VR technology is applied here because it not only provides immersive visualization but also adds the functionality of navigation and interactive manipulation of molecular graphical objects.
One of the active areas of research in bionanotechnology is computer-aided drug design. The goal of drug design is to find a small molecule (ligand) that docks with a receptor cavity in a specific protein [53]. This docking can stimulate or inhibit some biological activity leading to the desired pharmacological or drug effect, so the strategy of drug design is to accurately determine the protein (receptor) structure. Once this structure is determined, it may be possible to design new compounds (ligands), candidates for new drugs, that can
Figure 19. (a) Three titin fibers used as passive spring elements join two platforms and form a single-degree-of-freedom parallel platform that is actuated by a viral protein linear (VPL) actuator (center). (b) The VPL actuator has stretched out, which results in the upward linear motion of the platform. The three titin fibers acting as springs are also stretched out. Their elastic behavior can be used as a passive control element or as the restitution force that will bring the platform back to its original position. Courtesy of the Bio-Nano Robotics Laboratory, Northeastern University.
specifically bind to this protein [54]. Conventional tools from computer science and engineering have been developed for drug-design systems, but they have their limitations. Such tools have no support for the interactive manipulation of molecules, and their functions are insufficient to express the interactions between drug molecules and a protein [55]. These problems can, however, be solved using VR techniques, because VR supports haptic interaction with force-feedback technology.
3.2. Virtual Reality–Based Perception Solutions at the Bionanoscale
3.2.1. Virtual Reality for Structural Biology
Structural information on biological macromolecules is an essential requirement for our understanding of biological functions. From the very beginning of structural biology, visualization has been essential for determining and understanding structures.
The initial work in this area was the use of visualization and virtual reality for bioinformatics, especially for the 3D structural analysis of biomolecular systems. Surface- and volume-based visualization provide 3D concepts of biomolecular structure. VR offers a channel to reach into the molecular space in an immersive and interactive environment.
Progress in x-ray and nuclear magnetic resonance (NMR) instrumentation and in computer and software technology has led to an increasing rate of accumulation of new biological structures such as proteins and DNA. The Protein Data Bank [56] has a database of coordinate entries for 21,248 proteins. Proteins are complex biological structures made up of several hundred atoms. The usual approach to visualizing such a molecule is to retrieve the coordinate files from the database and then use one of the molecular graphics software packages. These packages largely provide monorepresentations of molecular images without any interactivity. Use of the VR Modeling Language (VRML) can provide a 3D and immersive environment with provision for stereo representations of the molecules [57]. This not only helps in better understanding the molecular structure but also provides insight into the chemical and biochemical properties of the molecule, such as the number of hydrogen bonds.
3.2.2. Complex Molecular Structure Modelling for Design and Analysis
Biosystems are made up of atoms, and as atoms cannot be observed directly, we can see them only in our imaginations. Commonly available physical models provide an intuitive representation of structural molecular biology. When applied to large molecules, such as proteins and the molecular assemblies found in cells, computer graphics simulation can be used to accurately portray various molecular computational models, with their varying complexities. Electrostatic field data around the molecules of interest, interatomic forces, the laws of quantum mechanics, and so on represent the variety of biomolecular nanointeractions [58]. However, these methods lack the beneficial tactile and kinaesthetic attributes of real physical models. By coupling an accurate molecular dynamics (MD) simulation code to an immersive VR display with interactive capabilities and manual force feedback, “immersive” visualization of molecular atoms could be improved [59]. VR-based technology is an interesting tool for creating multimodality enhancements of such tangible models. By superimposing additional graphical information on top of fabricated models (i.e., augmented reality), by incorporating support for voice commands, and by providing a haptic interface for sensing electrostatic charges and interatomic collisions, the user would be able to interact with these virtual enhancements haptically while manipulating the physical model.
3.2.3. Unification Problem of Enhanced Vision and Force Display for Automatic Bionanomanipulation
To precisely control and manipulate biomolecules, we need tools that can interact with these objects at the nanoscale in their native environments [60, 61]. Existing bionanomanipulation techniques can be classified as noncontact manipulation, including laser trapping [62, 63] and electro-rotation [64], and contact manipulation, referred to as mechanical stylus-, AFM-, or STM-based nanomanipulation [65]. The rapid expansion of AFM studies
Figure 20. Secondary structure representation of the Bacillus circulans xylanase (PDB code 1bcx) complexed with sulfate and cyclic xylose in the VRML viewer WebSpace (helix: tube; sheet: ribbon). Reprinted with permission from [57], J. Sühnel, “Virtual Reality Modeling for Structural Biology,” Proceedings of the German Conference on Bioinformatics, GCB ’96, Lecture Notes in Computer Science (R. Hofestädt, T. Lengauer, M. Löffler, and D. Schomburg, Eds.), Vol. 1278, pp. 189–198. Springer, Berlin, 1997. © 1997, Springer.
in biology/biotechnology results from the fact that AFM techniques offer several unique advantages: first, they require little sample preparation, with native biomolecules usually being imaged directly; second, they can provide a 3D reconstruction of the sample surface in real space at ultrahigh resolution; third, they are less destructive than other techniques (e.g., electron microscopy) commonly employed in biology; and fourth, they can operate in several environments, including air, liquid, and vacuum. Rather than drying the sample, one can image quite successfully with AFM in fluid. The operation of AFM in aqueous solution offers an unprecedented opportunity for imaging biological molecules and cells in their physiological environments and for studying biologically important dynamic processes in real time [66].
Figure 21. Visual tracking of the tip pattern (left) and real-time construction of the virtual biologic environment (right) with a continuously updated object model. Courtesy of Dr. Toshio Fukuda, University of Nagoya, Japan.
At present, these bionanomanipulations are conducted manually; however, long training, disappointingly low success rates resulting from poor reproducibility in manual operations, and contamination call for the elimination of direct human involvement. Furthermore, there are many sources of spatial uncertainty in AFM manipulation (e.g., tip effects, thermal drift, slow creeping motion, and hysteresis). To improve bionanomanipulation techniques, automatic manipulation must be addressed. Visual tracking of patterns from multiple views is a promising approach that is currently being investigated for autonomous embryo pronuclei DNA injection [66]. Interactive nanomanipulation can be improved by imaging a 3D viewpoint in a virtual environment. Construction of a VR space in an off-line operation mode for trajectory planning, combined with a real-time operation mode for vision tracking of environmental change, ensures a completely “immersed” visual display. Figure 22 shows an example of a 3D biomicro-/nanomanipulation system with a 3D VR model of the environment, including the biological cell, that allows the user to change viewpoint in the virtual space [67].
Since its invention by Binnig [4], the AFM has been the main contact-mode tool and is used to manipulate macromolecular biological assemblies as well as individual biomolecules [68, 69]. However, the contact mode can also be used where higher force is desired, as in cutting or dissection. As an application, the elastic properties of DNA and the identification of the different force-extension regimes in DNA are currently investigated with a haptic interface. However, contact manipulation can damage specimens as a result of uncontrolled forces at the contact interface between the tip and the surface. Introducing physically based models of contact nanoforces (friction, rugosity, elasticity of the cell membrane), liquid manipulation forces (viscosity, density), and continuum models (electrostatic, adhesive) into 3D virtual objects will generate realistic virtual nanoenvironments for guided simulations. As an example, aided by nanomanipulation systems and MEMS multiaxis cellular force sensors, biomembrane mechanical characterization and modeling have been conducted [70] at the University of Minnesota. Figure 23 shows the force measurement process on the zona pellucida of mouse oocytes and embryos. The proposed biomembrane mechanical model enables vision-based biomembrane force sensing, assimilating vision and force-sensing modalities. A real-time 3D virtual simulator with force display based on measurements of actual biocells will enable motion planning, tactile data feeling, and controlled tip/surface interactions.
Figure 22. Haptic force measurement process on the zona pellucida of mouse oocytes and embryos. Reprinted with permission from [66], Y. Sun et al., IEEE Trans. NanoBioSci. 2 (2003). © 2003, IEEE.
Figure 23. (a) Low-magnification image of the living cell using an optical microscope; the AFM tip is adjusted to the top of the cell surface. (b) Low-magnification imaging of the living cell using tapping-mode AFM with a scanning range of 60 µm. Reprinted with permission from [35], G. Li et al., “3-D Nanomanipulation Using Atomic Force Microscopy,” IEEE International Conference on Robotics and Automation, Taipei, Taiwan, 2003, pp. 3642–3647. © 2003, IEEE.
3.2.4.Molecular Dynamics Simulations in Virtual Environment
MD is a computer simulation technique in which the time evolution of a set of interacting
atoms is followed by integrating their equations of motion.In MD,we follow the laws of
classical mechanics, and most notably Newton's law:

F_i = m_i a_i

for each atom i in a system constituted of N atoms. Here, m_i is the mass of atom i, a_i its acceleration, and F_i the force acting upon it, caused by the interactions with other atoms.
MD simulations are based on the calculation of the free energy that is released during the transition from the native to the fusogenic state, and they compute the atomic trajectories and coordinates by numerically solving the equations of motion using an empirical force field that mimics the actual interatomic forces in the molecular system. MD simulations of complex molecular systems require enormous computational power and produce large amounts of data at each step. The resulting data include the number of atoms per unit volume, atomic positions, the velocity of each atom, the force applied on each atom, and the energy contents. These results
of MD simulations need to be visualized to give the user a more intuitive feel of what is
happening.Haptic interaction used in conjunction with VR visualization helps the scientist
to control/monitor the simulation progress and to get feedback from the simulation process
as well [71].Figure 24 shows the VR visualization of MD simulation.Figure 25 shows a VR
representation of MD simulations in CAVE VE.
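The integration loop described above can be sketched in a few lines. The following toy example, which is not taken from any of the systems cited here, uses a Lennard-Jones pair potential and the velocity Verlet integrator:

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces F_i acting on each atom."""
    n = len(pos)
    forces = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = float(r @ r)
            s6 = (sigma * sigma / d2) ** 3
            # vector force from the gradient of 4*eps*(s^12 - s^6)
            f_ij = 24.0 * eps * (2.0 * s6 * s6 - s6) / d2 * r
            forces[i] += f_ij
            forces[j] -= f_ij
    return forces

def velocity_verlet(pos, vel, mass, dt, steps):
    """Integrate Newton's law F_i = m_i a_i with the velocity Verlet scheme."""
    forces = lj_forces(pos)
    for _ in range(steps):
        vel += 0.5 * dt * forces / mass   # half kick
        pos += dt * vel                   # drift
        forces = lj_forces(pos)           # recompute forces at new positions
        vel += 0.5 * dt * forces / mass   # second half kick
    return pos, vel
```

Each step produces exactly the per-atom positions, velocities, and forces that the visualization and haptics side of such a system consumes.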
3.2.5.Computational Steering and Visualization of Complex
Molecular Systems
Computational steering is the ability of the user to design or modify a simulation interactively
in a VE during run time,which gives the user a tremendous advantage over postsimulation
visualization and analysis of results.In run-time steering,the user does not have to wait until
the end of the simulation to see the results of his modifications;instead,he can immediately
see the result of the interactively changed parameters,giving him an opportunity to detect
and modify them to steer the simulation to a desired output.In computational steering,
the user can steer MD simulation by applying external forces into the computations.These
external forces can help a complex molecular system overcome a potential energy barrier
and can even steer the system to a new geometric conformation for further analysis.Thus,
it provides a great advantage over targeted MD [72], which also drives an MD simulation to a desired output but in which the user has no control over the simulation once it has started. Another advantage of steered MD over conventional MD is the possibility of
inducing relatively large conformational changes in molecules on nanosecond timescales [73].
Figure 26 shows the force-induced unfolding of the protein titin [73].
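In constant-velocity steered MD, such as the titin pulling runs, the external force is typically a harmonic spring tethering an atom to a target point that moves at a fixed velocity. A minimal sketch, with illustrative parameter values rather than those of any cited study:

```python
import numpy as np

def steering_force(atom_pos, anchor, pull_dir, v, k, t):
    """Constant-velocity steered-MD force: a harmonic spring of stiffness k
    tethers the atom to a target that starts at `anchor` and moves along the
    unit vector `pull_dir` at speed v. The force added to the MD force field
    is F = -k * (x_atom - target(t))."""
    target = np.asarray(anchor) + v * t * np.asarray(pull_dir)
    return -k * (np.asarray(atom_pos) - target)

# At t = 0 the spring is relaxed; as the target moves away, the tethered
# atom feels a growing force along the pulling direction.
f0 = steering_force([0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                    v=0.5, k=2.0, t=0.0)
f1 = steering_force([0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                    v=0.5, k=2.0, t=10.0)
```

Adding this term to the total force at every integration step is what "applying external forces into the computations" amounts to in practice.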
Computational steering of MD simulation with the aid of VR visualization can thus help
molecular scientists and researchers explore new models and their structural behavior,and
Figure 24. Block diagram showing a virtual environment for MD simulations. Reprinted with permission from [88], Z. Ai and T. Frohlich, Comput. Graphics Forum 17 (1998). © 1998, Eurographics Association.
also to study the elastic/mechanical behavior of bionanotechnology-relevant complex molec-
ular systems.
3.3. Experimental Virtual Reality Systems for Bionanotechnology
The Virtual Molecular Studio at Imperial College London [74–76] and the Technische
Hochschule Darmstadt [77] have done some pioneering research in the bionanotechnology
VR field.In addition,researchers at the Institut für Molekulare Biotechnologie have set up
an Internet-based Image Library of Biological Macromolecules [78].This was one of the
Figure 25. MD simulation visualized in CAVE. Courtesy of Dr. Zhuming Ai, VRMedLab, University of Illinois at Chicago.
Figure 26. (a) Force-extension profile from SMD simulations of the titin I27 domain with a pulling velocity v = 0.5 Å/ps. The extension domain is divided into four sections: I, preburst; II, major burst; III, postburst; IV, pulling of the fully extended chain. (b) Intermediate stages of the force-induced unfolding. All I27 domains are drawn in the cartoon representation of the folded domain; solvating water molecules are not shown. The four figures at extensions 10, 17, 150, and 285 Å correspond, respectively, to regions I–IV in (a). Reprinted with permission from [73], Sergei Izrailev et al., in “Computational Molecular Dynamics: Challenges, Methods, Ideas” (P. Deuflhard, J. Hermans, B. Leimkuhler, A. E. Mark, S. Reich, and R. D. Skeel, Eds.), Lecture Notes in Computational Science and Engineering, Vol. 4, pp. 39–65. Springer, Berlin, 1998. © 1998, Springer-Verlag.
first VRML applications in biology.The VRML division of this library contains more than
650 VRML representations of biopolymer structures.
One of the challenges in computational structural biology is enabling the efficient use
and interoperation of a diverse set of techniques to simulate,analyze,model,and visual-
ize the complex architecture and interactions of macromolecular systems.Scientists at the
Scripps Research Institute,California,in collaboration with the Central Institute for Applied
Mathematics,are developing a 3D graphics package termed SenSitus [79] that is capable
of supporting VR devices such as stereo glasses,3D trackers,and force-feedback (haptic)
devices.This development effort will permit scientists to build models,combine atomic and
volumetric data,and perform morphing and warping (flexible docking) interactively within
a single computational environment.A force-feedback device measures a user’s hand posi-
tion and exerts a precisely controlled force on the hand.The software supports this process
by calculating forces according to the correlation coefficient of density maps and crystallo-
graphic data.The high sampling frequency required for force feedback (refresh rate >1 kHz)
Figure 27. Interacting with a room-sized display of a three-dimensional protein in CAVE. Courtesy of Dr. Karl V. Steiner, Delaware Biotechnology Institute, University of Delaware.
Figure 28. Geometrical representation model for the molecule data. Courtesy of Dr. Juergen Pleiss, HIMM project, University of Stuttgart, Germany.
is achieved by means of the vector quantization algorithm developed by the group,which
reduces the complexity of the data representation to manageable levels.
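The underlying idea of rendering a force from the correlation of density maps can be sketched as the gradient of the correlation coefficient with respect to the probe position. The finite-difference toy below is an illustration only: it uses whole-voxel shifts and omits the vector quantization and interpolation machinery that SenSitus relies on to meet the haptic rate:

```python
import numpy as np

def correlation(map_a, map_b):
    """Cross-correlation coefficient of two density maps on the same grid."""
    a = map_a - map_a.mean()
    b = map_b - map_b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def haptic_force(probe, target, scale=1.0):
    """Force toward better alignment: central-difference gradient of the
    correlation score under one-voxel shifts of the probe map."""
    f = np.zeros(3)
    for axis in range(3):
        c_plus = correlation(np.roll(probe, 1, axis=axis), target)
        c_minus = correlation(np.roll(probe, -1, axis=axis), target)
        f[axis] = scale * (c_plus - c_minus) / 2.0
    return f
```

In a real haptic loop this evaluation must complete within the roughly 1-ms budget implied by the >1 kHz refresh rate, which is precisely why the data representation has to be compressed.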
In addition to automated fitting,microscopists have a need to evaluate and manipulate
docking models interactively,“by eye”.Three-dimensional capabilities and the physics of
touch offer tangible benefits for modelers who wish to explore a variety of docking situations
in a VR environment.
In the Visualization Studio of the Delaware Biotechnology Institute at the University of
Delaware, research is focused on molecular biology, genomics, proteomics, structural and computational biology, and biomedical imaging. The institute's interactive, immersive Visualization Studio consists of an SGI Reality Center system linked to a 15 × 7-foot FakeSpace [80] rear-projection screen. A six-processor Onyx 3200 visualization supercomputer with two graphics pipes drives a pair of Mirage 2000 projectors to deliver an edge-blended image with a total resolution of approximately 2560 × 1024 pixels. The studio is currently being outfitted with a haptic feedback system to further improve the immersive interface with the visualized data.
The Institute of Technical Biochemistry at the University of Stuttgart has an ongoing
project regarding the application of virtual reality technology in the area of molecular mod-
eling [81].The Highly Immersive Molecular Modeling (HIMM) [82] project is aimed at
the integration of computer-aided molecular modeling tools and virtual reality systems like
COVISE [83].
Recently,however,another way of visualizing atoms has become available.Researchers in
the NAS data analysis group have developed an application called Virtual Mechanosynthesis,
or VMS [84].The user of this application sees various collections of atoms floating in the
space above the NAS Visualization Laboratory’s Immersive Workbench,made by Fakespace
Inc.The VMS utility allows the user to see,move,and even “feel” simulated molecular
structures in 3D space (Fig.29).
A haptic,or force-feedback,device can also be used in VMS to interact with the simu-
lation.This device is essentially a mouse mounted on a mechanical arm.It can be pushed
around in three dimensions and,by way of a clever arrangement of small motors,can push
back.Once users “attach” to an atom using the haptic interface,they can feel the attractive
and repulsive forces as this atom pushes and pulls on its neighbors.
The Theoretical Biophysics Group at the University of Illinois at Urbana-Champaign has implemented a system termed Interactive Molecular Dynamics (IMD) [85], which permits
Figure 29.In the graphite sheet being explored,gray spheres represent carbon atoms,and green spheres rep-
resent hydrogen.Using the pointer,the user has “grabbed” the pink atom for manipulation.Courtesy of
Dr.Christopher Henze,NASA Ames Research Center.Available at:
manipulation of molecules in MD simulations with real-time force feedback and graphi-
cal display.IMD consists of three primary components:a haptic device controlled by a
server that generates the force environment,a MD simulation for determining the effects
of force application,and a visualization program for display of the results.Communica-
tion is achieved through an efficient socket connection between the visualization program
Visual Molecular Dynamics (VMD) [86] and the molecular dynamics program NAMD [87]
running on single or multiple machines.A natural force-feedback interface for molecular
steering is provided by a haptic device.
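This division of labor can be illustrated with a toy message exchange over a socket. The tags and wire format below are invented for illustration and do not match the actual IMD protocol spoken by VMD and NAMD:

```python
import socket
import struct

# Invented message tags for this sketch (not the real IMD wire format).
MSG_FORCE, MSG_COORDS = 0, 1

def send_force(sock, atom_id, fx, fy, fz):
    """Haptics/visualization side: push a user-applied force to the MD engine."""
    sock.sendall(struct.pack("!iifff", MSG_FORCE, atom_id, fx, fy, fz))

def recv_exact(sock, n):
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed")
        buf += chunk
    return buf

def recv_coords(sock, n_atoms):
    """Receive one frame of atom coordinates from the MD engine."""
    (tag,) = struct.unpack("!i", recv_exact(sock, 4))
    assert tag == MSG_COORDS
    data = recv_exact(sock, 12 * n_atoms)
    xyz = struct.unpack(f"!{3 * n_atoms}f", data)
    return [xyz[3 * i : 3 * i + 3] for i in range(n_atoms)]
```

Forces flow one way and coordinate frames the other, so the simulation, the haptic server, and the display can run on different machines connected only by this stream.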
Another kind of VR interface for MD simulation has been developed at the Fraunhofer
Institute for Computer Graphics,Germany.This system,called RealMol [88],is implemented
to run on CAVE or any other computer system with a head-mounted display or a stereo
projection screen.The MD simulation program for RealMol is NAMD,and the communi-
cation between the two for exchanging molecular data is achieved by RAPP [89],developed
at the National Center for Supercomputing Applications at the University of Illinois at
Urbana-Champaign.The IDEAL (interaction device abstraction layer) system [90] is used
to interface RealMol with the CAVE rendering system.IDEAL is independent of the choice
of rendering system,provides an easy to use interface,and handles devices for interaction in
a virtual environment.With the use of IDEAL,a cyberglove can be displayed in the virtual
environment as a mapping of the hand (Fig.30).
Modeling proteins as biological motors,as well as designing proteins not found in nature,
is a rapidly developing field.Keeping this in mind,researchers at the University of North
Carolina at Chapel Hill have built SMD [91,98],a system for interactively steering MD
calculations by adding user-specified external forces into the computation on the fly.Steer-
ing implies that the user is able to “tug” an atom or a group of atoms toward a desired
target position. SMD consists of two software components. The first performs MD simulations on a molecular system and studies its response to an externally applied steering force. The display component of SMD is VMD.
An example from the University of Washington is given in Figure 32,where the super-
imposition of multimodal interaction can greatly complement and accelerate the learning
process of the operator.For initial demonstration,a superoxide dismutase (SOD),an essen-
tial enzyme for cellular functioning that exhibits a strong electrostatic funneling effect,is
modeled.In this scenario,the user holds the superoxide radical with the haptic device probe,
and as it nears the charge field of SOD,strong forces pull the superoxide free radical
toward the Cu and Zn ions at the active site of SOD.At the same time,the user sees
(a) (b)
Figure 30.(a) Molecules can be grasped with a cyberglove and moved to a desired position.(b) 3D menu can be
activated and selected by a gesture of the cyberglove.Reprinted with permission from [88],Z.Ai and T.Frohlich,
Comput.Graphics Forum 17 (1998).© 1998,Eurographics Association.
the secondary structure of the SOD enzyme as an augmented reality overlay on top of the
physical model [59].
The researchers have also developed a Virtual Reality Peripheral Network (VRPN) [92].
The VRPN is a set of classes within a library and a set of servers that are designed to
implement a network-transparent interface between application programs and the set of
physical devices (tracker,etc.) used in a VR system.The idea is to have a PC or other
host at each VR station that controls the peripherals (tracker,button device,haptic device,
analog inputs,sound,etc.).VRPN provides connections between the application and all of
the devices using the appropriate class-of-service for each type of device sharing this link.
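A single-process sketch can convey the flavor of such a device abstraction; VRPN itself is a C++ library that runs over the network, and the class and method names here are invented:

```python
from typing import Callable, Dict, List, Tuple

Pose = Tuple[float, float, float]  # x, y, z (orientation omitted)

class PeripheralHub:
    """Toy stand-in for a VRPN-style server: devices are addressed by name,
    applications register callbacks, and each device report is fanned out
    to every subscriber, so the application never talks to hardware."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[Pose], None]]] = {}

    def register(self, device: str, callback: Callable[[Pose], None]) -> None:
        """Application side: ask to be notified of reports from `device`."""
        self._subscribers.setdefault(device, []).append(callback)

    def report(self, device: str, pose: Pose) -> None:
        """Server side: a peripheral (tracker, button box, ...) sends a report."""
        for callback in self._subscribers.get(device, []):
            callback(pose)
```

Because the application only ever names devices and receives callbacks, the same program runs unchanged whether the tracker is local or served from another host.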
STALK [93] is a VR-based system developed at the University of Illinois at Urbana-
Champaign and Argonne National Lab for studying the docking of a ligand molecule with a
protein binding site.STALK,shown in Figure 33,uses a parallel genetic algorithm library to
search for a low-energy conformation. An interface to the CAVE virtual reality system allows
a scientist to visualize the genetic algorithm’s progress and to interact with the algorithm
by,for example,changing the position or structure of the drug molecule and restarting the
algorithm so that the new parameters are incorporated [94].
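The search loop of such a genetic algorithm can be sketched with a generic energy function standing in for the molecular scoring; STALK's parallel GA library is, of course, far more elaborate:

```python
import random

def genetic_search(energy, n_genes, pop_size=30, generations=60,
                   mutation_rate=0.1, seed=0):
    """Toy GA: each individual is a vector of torsion angles (radians);
    lower energy means fitter. Truncation selection, one-point crossover,
    Gaussian point mutation. Assumes n_genes >= 2."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-3.14, 3.14) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=energy)            # best (lowest-energy) first
        parents = pop[: pop_size // 2]  # keep the fitter half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_genes)      # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < mutation_rate:     # Gaussian point mutation
                child[rng.randrange(n_genes)] += rng.gauss(0.0, 0.3)
            children.append(child)
        pop = parents + children
    return min(pop, key=energy)
```

Restarting the loop with user-modified individuals, as STALK allows from the CAVE, amounts to reseeding the population before continuing the search.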
The Virtual Biomolecular Environment (VIBE) [95] is a VR-based system proposed at
Argonne National Lab that would provide an environment for drug design.It consists of a
massively parallel computing system to simulate the physical and chemical properties of a
molecular system,CAVE for immersive display and interaction with the molecular system,
and a high-speed network interface to exchange data between the simulation and CAVE.
VIBE enables molecular scientists to have a visual,auditory,and haptic experience with
Figure 31.(a) Specifying a “tug”:selected atom is highlighted as the red sphere at upper right.(b) Moving the
tug:target position follows pointer.(c) System response after 300 fs of simulation.Reprinted with permission from
[91],J.Leech et al.IEEE Comput.Sci.Eng.3,38 (1996).© 1996,IEEE.
Figure 32. (a) Display of force vector field around the active site of superoxide dismutase (SOD), and (b) user interacting
with SOD models using HMD and PHANToM.Reprinted with permission from [59],G.Sankaranarayanan et al.,
“Role of Haptics in Teaching Structural Molecular Biology,” 11th Symposium on Haptic Interfaces for Virtual
Environment and Teleoperator Systems,Los Angeles,CA,2003.© 2003,IEEE.
a chemical system while simultaneously manipulating its physical properties by steering,in
real-time,a simulation executed on a supercomputer.
The Protein Interactive Theater (PIT) [96] is an integrated virtual environment for steered
MD constructed at the University of North Carolina at Chapel Hill.The PIT is a dual-
screen stereo display system for two operators seated at a table,as shown in Figure 34.
Each operator wears a pair of CrystalEyes liquid crystal shutter glasses with an attached
tracking sensor and views a stereo image projected on the screen across the table. The two operators' views are in registration, so that the operators agree on the apparent position of objects in the physical workspace and can augment their discussions using hand gestures. Each operator also has a tracked, six degree-of-freedom, handheld controller that
provides pointing,picking,and other scene manipulations.Dials and buttons at the table
corner shared by the two operators provide top-level control over the display parameters
Figure 33. A user in STALK. Courtesy of MCS Division, Argonne National Laboratory.
(a) (b)
Figure 34.(a) Details of PIT environment,(b) Two users interacting in the PIT.Courtesy of Dr.Mary Whitton,
Photograph by Todd Gaul,Nanoscale Science Research Group,University of North Carolina at Chapel Hill.
© Dr.Russell M.Taylor.
and the dynamics simulation. A separate, flat-panel LCD monitor in front of each user, visible through their stereo glasses, provides access to a conventional windows, keyboard, and mouse interface for detailed control of the visual representation of the molecule and the MD simulation.
It is hoped that this review will give students and professionals a feel for the applications and utility of VR in computational biology, nanotechnology, and bionanotechnology, where
it is starting to be applied,in some cases,with spectacular results.VR interfaces to real-
time simulations over high-speed networks hold a great deal of potential for molecular and
nanotechnology scientists.The advantages of a VR interface are twofold.First,it allows
the scientist to gain a deeper understanding of the micro-/nanoworld by allowing immersive
visualization in three dimensions.Second,a scientist can use personal intuition to steer a
simulation toward more favorable or realistic results.
However,some problems still remain to be solved.The time delay,or the lag between
a user action in the real world and the corresponding update of the state of the object
in the virtual world,is a major concern because it leads to unrealistic visualization,and
thus to simulation sickness [97]. Physically realistic models of nanoforces are necessary for reliable and guided nanomanipulation. Real-time VR nanodynamic simulations (generic continuum models for van der Waals, capillary, contact deformation, and electrostatic forces) need to be developed for complex tip/object geometries. The lack
of force-feedback input and output modality through haptic devices such as a force-feedback
CyberGrasp [103] glove and so forth is another area for future research and development.
These haptic devices are important because they can offer resistance to the movement of
the molecule during simulation,and thus can reflect the force it can exert,and they offer
resistance proportional to the local temperature in the system to reflect the dissipation of
work through friction. In practice, telenanomanipulation systems and dedicated nano-haptic
interfaces should be developed.Real-time capable force and kinesthetic-feedback interfaces
with multi–degree-of-freedom nanoforce sensing axes are indispensable and should be an
area of future research.The required “feeling” of the nanoworld calls for additional force
display information from,for example,3D acoustic interfaces,noncontact vision techniques,
temperature,and so on.To keep pace with real-time interaction,VR technology must be
supported by high-performance computers,associated software,and high-bandwidth network
capabilities.VR also requires the development of new technologies such as displays that
update in real time with head motion;advances in sensory feedback such as force,touch,
texture,temperature,and smell;and intelligent models of environments.
These limitations must be overcome before a fully functional VR system is useful in practice. Additional development and refinement of the VR interface are needed to make it a robust tool for the study of real nanosystems.
References
1.A.A.G.Requicha,in “Handbook of Industrial Robotics,” 2nd Edn.,pp.199–210.Wiley,New York,1999.
2.M.Sitti,in “Survey of Nanomanipulation Systems,” Proceedings of the IEEE-Nanotechnology Conference,
3.G.Binnig,H.Rohrer,C.Gerber,and E.Weibel,Phys.Rev.Lett.56,930 (1986).
4.G.Binnig,Phys.Rev.Lett.49,57 (1982).
5.H.K.Wickramasinghe,Sci.Am.261,89 (1989).
6.Z.Ulanowski,Proc.R.Microscopy Soc.36 (2001).
7.R.M.Taylor and M.Falvo,in “Pearls Found on the Way to the Ideal Interface for Scanned-Probe Micro-
scopes,” IEEE Visualization Conference,1997.
8.M.Sitti,S.Horiguchi,and H.Hashimoto,in “Nano Tele-Manipulation using Virtual Reality Interface,”
Proceedings of the IEEE International Symposium on Industrial Electronics,South Africa,1998,pp.171–176.
9.C.Cassier,A.Ferreira,Y.Haddab,P.Rougeot,and N.Chaillet,in “Development of Macro-Micro Teleop-
erated System with Virtual Reality and Haptic Feedback,” SPIE International Symposium on Microassembly
and Microrobotics,Newton,MA,2001,Vol.4568,pp.112–123.
10.F.Arai,T.Sugiyama,P.Luangjarmekorn,A.Kawaji,T.Fukuda,K.Itoigawa,and A.Maeda,in “3D Viewpoint
Selection and Bilateral Control for Bio-Micromanipulation,” IEEE International Conference on Robotics and
Automation,San Francisco,CA,2000,pp.947–952.
11.M.Ammi and A.Ferreira,in “Vision-Based Extraction of Micro Object Geometry for Micro/Nano World
Observation,” IEEE Instrumentation and Measurement Technology International Conference,Como,Italy,
12.S.Palm,T.Mori,and T.Sato,in “Teleoperation via Bilateral Behavior Media:Control,Accumulation,and
Assistance,” IEEE International Conference on Robotics and Automation,1999.
13.H.B.Westlund and G.W.Meyer,in “Computer Graphics,Annual Conference Series.” ACM SIGGRAPH,
14.B.J.Nelson,Y.Zhou,and B.Vikramaditya,in “Integrating Force and Vision Feedback for Micro-Assembly,”
Proceedings of SPIE,Boston,MA,1998,Vol.3202,pp.30–41.
15. T. Gotsazalk, P. Grabiec, and I. Rangelow, in “Piezoresistive Sensors for Scanning Probe Microscopy,” Ultramicroscopy 82, Elsevier, 2000, pp. 39–48.
16.S.Fahlbusch,A.Shirinov,and S.Fatikow,in “AFM-Based Micro Force Sensor and Haptic Interface for a
Nanohandling Robot,” IEEE/RSJ International Conference on Intelligent Robots and Systems,Lausanne,
17.M.A.Greminger and B.Nelson,in “Vision-Based Force Sensing at Nanonewton Scales,” SPIE International
Society of Optics,Microrobotics,and Microassembly III,Vol.4568,2001.
18.D.R.Bergault,“3-D Sound for Virtual Reality and Multimedia.” Academic Press,1994.
19.T.Eme,P.Hauert,K.Goldberg,W.Zesch,and R.Siegwart,in “Micro-Assembly using Auditory Display of
Force Feedback,” Proceedings of SPIE,Boston,MA,1999,Vol.3834,pp.203–210.
20.J.F.O’Brien,P.R.Cook,and G.Essl,in “Synthesizing Sounds from Physically Based Motion,” SIGGRAPGH
2001,Los Angeles,CA,2001,pp.529–536.
21.A.M.Okamura,M.R.Cutkosky,and J.T.Dennerlein,in “Reality-Based Models for Vibration Feedback in
Virtual Environments,” IEEE/ASME Transactions on Mechatronics,2001,Vol.6,No.3,pp.245–252.
22.H.C.Hamaker,Physica 10,1058 (1937).
23.J.Crassous,E.Charlaix,H.Gayvallet,and J.Loubert,Langmuir,9,1995 (1993).
24.D.Sarid,J.P.Hunt,and R.K.Workman,Appl.Phys.A 66,283 (1998).
25.M.Falvo,and R.Superfine,J.Nanoparticle Res.2,237 (2000).
26.F.Arai,D.Ando,T.Fukuda,Y.Nonoda,and T.Oota,in “Micro Manipulation Based on Micro Physics,
Strategy Based on Attractive Force Reduction and Stress Measurement,” IEEE International Conference on
Robotics and Automation,1995,pp.236–241.
27.L.Kim,A.Kyrikou,M.Desbrun,and G.Sukhatme,in “An Implicit-Based Haptic Rendering Technique,”
IEEE/RSJ International Conference on Intelligent Robots and Systems,Switzerland,2002.
28. M. Finch, V. Chi, R. Taylor, M. Falvo, S. Washburn, and R. Superfine, in “Surface Modification Tools in a Virtual Environment Interface to a Scanning Probe Microscope,” Proceedings of the Symposium on Interactive 3D Graphics, Monterey, CA, 1995, pp. 13–18.
29. Y. Zhuang, “Real-Time Simulation of Physically-Realistic Global Deformations,” Ph.D. Thesis, University of
30.A.Baris,S.Hiroaki,H.Hashimoto,P.M.Kee,and S.Metin,in “Man-Machine Interface for Micro/Nano
Manipulation with an AFM Probe,” 32nd International Symposium on Robotics,2001,pp.670–675.
31.J.T.Feddema,P.Xavier,and R.Brown,J.Micromechatronics 1,139 (2001).
32. D. S. Haliyo and S. Régnier, in “Advanced Applications Using µMAD, the Adhesion Based Dynamic
Micro-Manipulator,” IEEE/ASME International Conference on Advanced Intelligent Mechatronics,Kobe,
33.M.Sitti and H.Hashimoto,IEEE/ASME Trans.Mechatronics 5,199 (2000).
34.J.H.Makaliwe and A.A.G.Requicha,in “Automatic Planning of Nanoparticle Assembly Tasks,” IEEE
International Symposium on Assembly and Task Planning,Fukuoka,Japan,2001,pp.288–293.
35.G.Li,N.Xi,M.Yu,and W.K.Fung,“3-D Nanomanipulation Using Atomic Force Microscopy,” IEEE
International Conference on Robotics and Automation,Taipei,Taiwan,2003,pp.3642–3647.
36.A.Czarn and C.McNish,in “From Nanotechnology to Nano-Planning,” The Ninth University of Western
Australia Computer Science Research Conference, Nedlands, Western Australia, 1998, pp. 73–85.
37.D.-H.Kim,T.Kim,K.Kim,and B.Kim,in “Motion Planning of an AFM-Based Nanomanipulator in a
Sensor-Based Nanorobotic Manipulation System,” Third IWMF 2002,Minneapolis,MN,2002,pp.137–140.
38.S.Monorutkul,Y.Kunii,and H.Hashimoto,in PEMC Hungary,1996.pp.337–341.
39.Kevin Lyons and Yong Wang,in “An Open Architecture for Virtual Reality in Nano-scale Manipula-
tion,Measurement and Manufacturing (m3),” Eighth Foresight Conference on Molecular Nanotechnology,
and F.Shimojo,IEEE Comput.Sci.Eng.3,56 (2001).
41.M.Czernuszenko,D.Pape,D.Sandin,T.DeFanti,G.Dawe,and M.Brown,Comput.Graphics 31,46 (1997).
42.C.Cruz-Neira,D.J.Sandin,and T.A.DeFanti,in “Surround-Screen Projection-Based Virtual Reality:The
Design and Implementation of the CAVE,” in Proceedings of Sisgraph 93.ACM Press,New York 1993,
43.R.Stevens and I.R.Judson,in “Using Immersive Virtual Reality for Visualizing and Modeling of Molecular
Nanosystems,” Fifth Foresight Conference on Molecular Nanotechnology,California,1997.
44.D.H.Sonnenwald,R.E.Bergquist,K.L.Maglaughlin,E.Kupstas-Soo,and M.C.Whitton,in “Design-
ing to Support Collaborative Scientific Research Across Distances:The Nanomanipulator Environment,”
(E.Churchill,D.Snowdon,and A.Munro Eds.),in Collaborative Virtual Environments,pp.202–224.
45.D.-H.Kim,T.Kim,and B.Kim,in “Motion Planning of an AFM-Based Nanomanipulator in a Sensor-Based
Nanorobotic Manipulation System,” in International Workshop on Micro Factories, Minneapolis, MN, 2002.
46.M.Ammi and A.Ferreira,in “Virtualized Reality Interfaces for Micro and Nanomanipulation,” in IEEE
International Conference on Robotics and Automation,New Orleans,LA,2004.(submitted).
47.Y.-G.Lee,K.W.Lyons,and T.W.Lebrun,J.Res.Natl.Inst.Standards Technol.108,275.
48.A.Dubey,C.Mavroidis,A.Thornton,K.P.Nikitczuk,and M.L.Yarmush in “Viral Protein Linear (VPL)
Nano-Actuators,” in Proceedings of the 2003 IEEE–NANO Conference,San Francisco,CA,2003.
49.T.Yamamoto,O.Kurosawa,H.Kabata,N.Shimamoto,and M.Washizu,IEEE Trans.Industry Appl.36,1010
50.S.Thalhammaer,R.Stark,S.Muller,J.Winberg,and W.M.Heckl,J.Struct.Biol.119,232 (1997).
51.T.Funatsu,Y.Harada,and H.Higuchi et al.,Biophys.Chem.68,63 (1997).
52.K.Svoboda,C.Schmidt,B.Schnapp,and S.Block,Nature 365,721 (1993).
53.S.M.LaValle,P.W.Finn,L.E.Kavraki,and J.C.Latombe,J.Comp.Chem.21:731 (2000).
54.S.Wo-Pong and Y.Rojanasakul Eds.“Biopharmaceutical Drug Design and Development.” Humana Press,
55.H.Nagata,E.Tanaka,M.Hatsuta,H.Mizushima,and H.Tanaka,in “Molecular Virtual Reality System with
Force Feedback Device.” ICAT 2000,pp.161–166.
56.H.M.Berman,J.Westbrook,Z.Feng,G.Gilliland,T.N.Bhat,H.Weissig,I.N.Shindyalov,and P.E.Bourne,
Nucleic Acids Res.28,235 (2000).
57. Jürgen Sühnel, “Virtual Reality Modeling for Structural Biology.” Institut für Molekulare Biotechnologie,
58.G.H.Ouh-Yong,M.Pique,J.Hughes,N.Srinivasan,and F.P.Brooks,in “Using a Manipulator for
Force Display in Molecular Docking,” IEEE Robotics and Automation Conference,Philadelphia,PA,1988,
59.G.Sankaranarayanan,S.Weghorst,M.Sanner,A.Gillet,and A.Olson,in “Role of Haptics in Teaching Struc-
tural Molecular Biology,” 11th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator
Systems,Los Angeles,CA,2003,p.365.
60.K.Castelino,“Biological Object Nanomanipulation,” Review Report,University of California,Berkeley,2002.
61.T.Fukuda,F.Arai,and L.Dong,in “Nano Robotic World—From Micro to Nano,” International Workshop
on Nano-/Micro-Robotics in Thailand, 2002.
62.A.Ashkin,Phys.Rev.Lett.24,156 (1970).
63.T.N.Bruican,M.J.Smyth,H.A.Crissman,G.C.Salzman,C.C.Stewart,and J.C.Martin,Appl.Optics 26,
64.M.Nishioka,S.Katsura,K.Hirano,and A.Mizuno,IEEE Trans.Industry Applications 33,1381 (1997).
65. A. A. G. Requicha et al., in “Manipulation of Nanoscale Components with the AFM: Principles and Applications,” IEEE International Conference on Nanotechnology, Maui, HI, 2001.
66.Y.Sun,K.T.Wan,K.P.Roberts,J.C.Bischof,and B.J.Nelson,IEEE Trans.NanoBioSci.2,2003.
67. F. Arai and T. Fukuda, in “3D Bio Micromanipulation,” International Workshop on Microfactories IWMF’98,
68.Y.Ishii,A.Ishijima,and T.Yanagida,Trends Biotechnol.19,6 (2001).
69.D.Fotiadis,S.Scheuring,and S.Muller et al.,Micron 33,385 (2002).
70.Y.Sun,K.-T.Wan,K.P.Roberts,J.C.Bischof,and B.J.Nelson,IEEE Trans.NanoBioSci.2,4 (2003).
71.M.Disz,M.Papka,R.Pellegrino,Stevens,and V.Taylor,in “Virtual Reality Visualization of Parallel Molecular
Dynamics Simulation,” Proceedings of the 1995 Simulation Multiconference Symposium Phoenix,AZ,Society
for Computer Simulation,1995,pp.483–487.
72.J.Schlitter,J.Mol.Graphics 12,84 (1994).
73.Sergei Izrailev,Sergey Stepaniants,Barry Isralewitz,Dorina Kosztin,Hui Lu,Ferenc Molnar,Willy Wriggers,
and Klaus Schulten,in “Computational Molecular Dynamics:Challenges,Methods,Ideas” (P.Deuflhard,
J.Hermans,B.Leimkuhler,A.E.Mark,S.Reich,and R.D.Skeel,Eds.),Lecture Notes in Computational
Science and Engineering,Vol.4,pp.39–65.Springer,Berlin,1998.
74.O.Casher,C.Leach,C.S.Page,and H.S.Rzepa,J.Mol.Struct.368,49 (1996).
75.O.Casher,C.Leach,C.S.Page,and H.S.Rzepa,Chem.Br.34,26 (1998).
76.O.Casher and H.S.Rzepa,Comput.Graphics 29,52 (1995).
77.J.Brickmann and H.Vollhardt,Trends Biotechnol.14,167 (1996).
78.J.Sühnel,Comput.Appl.Biosci.12,227 (1996).
79.Willy Wriggers and Stefan Birmanns.J.Struct.Biol.133,193 (2001).
80. ImmersaDesk, Fakespace Systems Inc., 2003.
81.H.Haase,J.Strassner,and F.Dai,Comput.Graphics 20,207 (1996).
82.R.C.Drees,J.Pleiss,D.Roller,and R.D.Schmid,in “Computer Science and Biology,” Proceedings of the
German Conference on Bioinformatics (GCB’96).Leipzig,Germany,1996,pp.190–192.
83.Collaborative Visualization and Simulation Environment (COVISE),
84.C.Levit,S.T.Bryson,and C.E.Henze,in “Virtual Mechanosynthesis,” Fifth Foresight Conference on Molec-
ular Nanotechnology,California,1997.
85.J.E.Stone,J.Gullingsrud,K.Schulten,and P.Grayson,in “2001 ACM Symposium on Interactive 3D
Graphics,” ACM SIGGRAPH,New York,2001,pp.191–194.
86.W.F.Humphrey,A.Dalke,and K.Schulten.J.Mol.Graphics 14,33 (1996).
87.M.Nelson,W.Humphrey,A.Gursoy,A.Dalke,L.Kale,R.D.Skeel,and K.Schulten,Int.J.Supercomput.
Appl.High Performance Comput.10,251 (1996).
88.Z.Ai and T.Frohlich,Comput.Graphics Forum 17 (1998).
89.R.Kufrin,“RAPP Software Guide,” University of Illinois,1996.
90.M.Roth and T.Frohlich,“IDEAL Interaction DEvice Interaction Layer User’s Manual,” Fraunhofer-IGD
internal report,1997.
91.J.Leech,J.F.Prins,and J.Hermans,IEEE Comput.Sci.Eng.3,38 (1996).
92.Russell M.Taylor II,Thomas C.Hudson,Adam Seeger,Hans Weber,Jeffrey Juliano,and Aron T.Helser,in
“VRPN:A Device-Independent,Network-Transparent VR Peripheral System,” ACM VRST,2001.
93.D.Levine,M.Facello,P.Hallstrom,G.Reeder,B.Walenz,and F.Stevens,IEEE Comput.Sci.Eng.(1996).
94.N.Akkiraju,H.Edelsbrunner,and J.Qian,IEEE Comput.Graphics Applications 16,58 (1996).
95.C.Cruz-Neira,R.Langley,and P.A.Bash,Comput.Chem.20,469 (1996).
96.K.Arthur,T.Preston,R.M.Taylor II,F.P.Brooks Jr.,M.C.Whitton,and W.V.Wright,in “Designing and
Building the PIT:A Head-Tracked Stereo Workspace for Two Users,” Proceedings of the 2nd International
Immersive Projection Technology Workshop,Ames,Iowa,1998.
97.G.C.Burdea and P.Coiffet,“Virtual Reality Technology,” 2nd Edn.Wiley,2003.
98.Jan F.Prins,Jan Hermans,Geoffrey Mann,Lars S.Nyland,and Martin Simons,Future Generation Comput.
Syst.15,485 (1999).
99.T.H.Massie and J.K.Salisbury,in “Dynamic Systems and Control” (C.J.Radclie,Ed.).pp.295–301.ASME
100.A.Bejczy,W.Kim,and S.Venema,in “The Phantom Robot:Predictive Display for Teleoperation with Time
Delay,” Proceedings of the IEEE Conference on Robotics and Automation,Cincinnati,OH,1999,pp.546–551.
101. SGI InfiniteReality Workstation, Silicon Graphics, Inc., 2003.
102.CAVE Systems Inc.,2003.
103.CyberGrasp Glove, (Jasandre Pty.Ltd.),