66th International Astronautical Congress, Jerusalem, Israel. Copyright ©2015 by the International Astronautical Federation. All rights reserved.
IAC-15-B3.9- YPVF.2 Page 1 of 11
IAC-15-B3.9- YPVF.2
ENHANCING THE HUMAN-MACHINE INTERFACE USING VISR- AN INTERACTIVE 3D
VISUALIZATION/ DESENSITIZATION TRAINING TOOL IN A VARIABLE GRAVITY MODEL
Poonampreet Kaur Josan, Department of Space Studies, University of North Dakota
Pablo de Leon, Associate Professor, University of North Dakota
Chrishma Singh-Derewa, Systems Engineering Division, NASA-JPL
Benjamin Inada, User Research & Interaction Design, NASA-JPL
Priyanka Srivastava, University of Michigan, Ann Arbor
ABSTRACT
The human body is used to a 1-G environment and behaves differently in lower gravity. Mars gravity is about
1/3rd of Earth’s, and Extra Vehicular Activities (EVA) using rovers and spacesuits will be challenging. Lower
gravity planetary environments with distinctive terrain require trained space explorers with interactive human-machine compatibility. Currently available visualization methods, such as Virtual Reality (VR) goggles, can be adapted to display a generalized view of a planetary surface through a visual 3-D interface, yet they lack a truly immersive environment because lower gravity conditions cannot be reproduced. In order to better understand human behavior in alternate environments, it is important to study the psycho-physiological responses of a subject in a real-time immersive environment. With the potential to perfect planetary EVA procedures and address vital human factors associated with planetary exploration, our team, in partnership with the University of North Dakota (UND), presents the concept of the VISR (Visual Immersion for Simulated Robotics) system.
VISR provides an immersive 3-D planetary Virtual Reality model utilizing high performance computing, image
processing, and 3-D rendering. The software manipulates image processing speed with a variable gravity parameter
as an input. VISR is compatible with multiple planetary bodies and asteroids, whereas current system applications are limited to the space station and Mars. The device assists in the study of psycho-physiological effects (limb-eye coordination) associated with deep-space travel and low gravity environments. Mars has a vast, geologically complex terrain with existing environment models allowing topographical visualization. VISR users experience these features in an immersive and interactive environment, enhancing human performance in real situations and assisting scientists and mission planners in system and mission design. VISR integrates the UND-developed NDX-2 planetary suit, currently used as a simulation tool for astronaut training, human factors, and related psycho-physiological studies.
The system design includes different phases such as development of terrain maps, and their integration with
visual device sensor, development of a model which can locate and import physical planetary parameters from
available databases, varying the speed of image output to the user by manipulating the gravity parameter, and
integrating it with already developed terrain models. The device can further feed real-time physiological performance data to a separate sensor-based system to study the related effects on human subjects. VISR will be an effective tool equipping researchers, mission planners, and space travelers for future manned planetary missions.
I. INTRODUCTION
Since the onset of human space exploration missions in outer space and on the Moon, the psycho-physiological experiences of astronauts have directed researchers towards understanding the interaction of human physiology and microgravity. As a result, there has been ground-breaking research in the field of space life sciences in microgravity. The findings of these studies have not only improved astronaut health, but have also contributed to the betterment of life on Earth.
Various experiments involving rats were flown on space shuttle missions to study the effects of microgravity on physiological health [1, 2, 3]. One of the most interesting phenomena was observed in the functioning of the human and rat vestibular systems. In
microgravity, both species had trouble with their sense of direction. Previously, Apollo astronauts at the wheel of NASA's lunar rovers, and during EVAs, reported that they had a hard time estimating the incline of the lunar terrain.
Recent studies have found that humans need at least 15 percent of the level of gravity found on Earth to orient themselves [4]. An experiment conducted in the European Space Agency's (ESA) Short-Arm Human Centrifuge (SAHC) facility in Cologne showed that gravity only starts influencing a person's sense of up and down once it reaches about 0.15g. Lunar gravity is about 17 percent of Earth's, just barely strong enough to provide adequate cues for astronauts to know which way is up.
In the case of asteroid missions, the nearly gravity-free environment requires probes, vehicles, and astronauts to be tethered to the surface. Vestibular issues will increase manifold in such extreme environments. Human Mars missions are a primary focus for the next decade, and it is essential to train explorers before sending them into such a hostile environment. Although Mars gravity (about 38 percent of Earth's) is well above the vestibular threshold identified by these studies, training astronauts is still recommended to ensure overall mission success, as well as crew and system safety.
Image 1: Apollo 16 and Apollo 17 astronauts losing balance and control during lunar EVAs. Credits: NASA
Virtual Reality is a system in which a user is interactively interfaced to a computer and engaged in a three-dimensional (3D) visual task. The computer provides a 'virtual domain' supporting the 3D models or complete environment and, given suitable transducers, the user can interact with the system in 'real time'. VR devices in space are not a new concept, and various dedicated systems have been designed for space-related applications. Two of the most common types of VR devices are the data glove and the head-mounted display (HMD). An HMD cuts off visual and audio sensations from the surrounding world and replaces them with computer-generated three-dimensional images. VISR will be a new-generation HMD device with a user-controlled planetary parameter, e.g. gravitational value, as an input.
II. SCOPE
Over millennia the human brain has evolved highly optimized information processing. Almost 60% of all brain activity is devoted to the processing of visual images. Mental state modelling occurs instinctively, faster than the conscious mind can process scenarios. While the brain's active processes are dedicated to optical data collection, it is the movement of these images that truly engages the brain. Neurologists also find that a form emerging towards the observer creates a powerful stimulus. VISR enables explorers to know before they go, providing them with these trigger mechanisms in advance and ultimately enhancing mission communication and performance.
VISR combines the engineering capability of perceiving detailed mission terrain with the ability to manipulate the environment. The human brain operates by combining resources, perception, and understanding, out of which communication is created. Altering the image projection speed of the simulation according to the gravitational parameters of the setting increases the fidelity of the experience. Coupling detailed terrain imagery with the planetary parameter knowledge of the JPL databases will further enhance the VISR algorithms.
The subsequent images illustrate the VISR avatar's ability to interact with its surroundings and manipulate them. The linked SYSML (System Modelling Language) model allows each object to be connected with real-time physical parameters, in addition to numerous other characteristics, enabling rapid design and collaboration within the workspace in ways not possible in real-life settings. In this sense, interactions in the VISR environment can meet and even exceed the potential of real working spaces.
Reality augmentation using avatars may avoid some of the limitations of VR simulations (explained later) while maintaining the immersive experience for the astronaut. VISR utilizes a schema that enables the mission environment to incorporate additional aspects of systems engineering design, test, and simulation, allowing developers to interact directly with one another and visualize the explorer's interactions with their environments. Electrical, mechanical, thermal, and software models are alive within the VISR model and
their behaviours are represented in the real-time environment.

Image 2: SYSML model depicting the avatar capability of the VR environment. (Credits: NASA/JPL)

Additionally, VISR is able to interact with
software packages directly and represent their outputs
within the environment. Engineers and scientists can
observe one another’s work as they would normally in
two dimensions or interact with the results three
dimensionally as rendered elements.
III. VESTIBULAR SYSTEMS IN
MICROGRAVITY
The human ear is made up of several smaller structures organized into three distinct anatomical regions. This study is focused on the inner ear, which is composed of the cochlea and the vestibular system. The peripheral vestibular apparatus in the inner ear consists of two kinds of sensory receptors: (1) the semicircular canals, signalling rotatory head movements, and (2) the otolith organs, which sense linear forces, including gravity, acting on the head. The activity of the otolith organs is altered in microgravity (because stimulation from gravity is absent during space flight, interpretation of the gravireceptor signals as tilt is meaningless; therefore, during adaptation to weightlessness, the brain reinterprets all gravireceptor outputs). The activity of the semicircular canals is not affected.
In microgravity, human senses react differently. The sense of direction is often conflicted, as there is no 'up' or 'down' in microgravity. During long-duration stays in microgravity, however, astronauts often develop their own sense of direction, which is not necessarily universal. Crewmembers also experience a disruption in their proprioceptive system, which tells them where arms, legs, and other parts of the body are oriented relative to each other. This disorientation is the main cause of so-called Space Motion Sickness (SMS). Many space travellers suffer from space sickness, which brings with it headaches, poor concentration, nausea, and vomiting. Usually, though, the problems disappear within a few days as astronauts adapt.
Image 3: Inner Ear- The Vestibular System [5]
Exposure to microgravity rearranges the relationships among signals from vestibular, visual, skin, joint, and muscle receptors. Until some level of adaptation to the novel sensory conditions encountered in the weightless environment of space flight is achieved, astronauts and cosmonauts often experience the following:
1) Illusory self and/or surround motions;
2) Space motion sickness (SMS);
3) Eye-head coordination impairment; and
4) Equilibrium control disturbance.
Many of the same types of disturbances are observed
during a re-adaptation period following return to Earth.
The magnitude and recovery-time-course of these
neuro-sensory, sensory-motor and perceptual
disturbances tend to vary as a function of mission
duration.
Image 4: Schematic representation of multisensory
interactions in control of eye movements, posture,
perception of orientation and motion. These interactions
provide the basis for sensory reweighting or substitution
that underlies vestibular adaptation to altered gravity.
[6]
In weightlessness, the otolith organs can no longer
serve their usual graviceptive role of indicating static
head position to gravity [7]. As a result, visual
information on the direction of the vertical or
horizontal, which normally interacts with graviceptive
information on Earth, becomes increasingly dominant in
establishing spatial orientation in weightlessness.
Tactile information such as pressure on the soles of the
feet (when strapped to the floor) continues to promote a
sense of the vertical.
Several experiments have been conducted on rats in microgravity to understand the effects on the vestibular system. The effects parallel those in the human vestibular system, and the results are consistent with the effects observed as SMS or SAS (Space Adaptation Syndrome).
IV. 3-D IMMERSION TECHNOLOGY
The Oculus Rift is an HMD for virtual reality applications and gaming developed by Oculus VR. It is the device of primary interest for the VISR design and implementation. Its pros and cons for our use cases are listed below:

Oculus Rift
Pros: Largest current active development community; integrated control with Oculus Touch.
Cons: Required umbilical connection.
The following technologies may prove useful to
explore as alternatives to the Oculus Rift [8, 9, 10, 11,
12]. The pros and cons of these technologies are listed
below:
HTC Vive
Pros: Custom controllers trackable in space; in development with VR sickness in mind, first and foremost.
Cons: Still in development.

PlayStation VR (Sony Morpheus)
Pros: High refresh rate; low latency (<18 ms).
Cons: Large display may not fit well in the NDX-2 helmet.

Vrvana Totem
Pros: On-board cameras will eventually enable the user to see the real world while in use (AR - augmented reality); design compensates for glasses.
Cons: Still in development.
V. INTERACTIVE INTERFACES FOR SPACE
EXPLORATION
Future technological interfaces are expected to support direct human interaction and enable the simulations essential to predict survival on other planetary surfaces. An evolution in the design of human interactive technologies is required in terms of adaptability and the ability to perform human subjective evaluation. NASA as well as several private companies have been working on VR themed after space exploration, with some having direct applications such as training astronauts for EVA on the ISS, flight simulators, robotic arm simulators, etc. (NASA Virtual Reality Lab).
To facilitate subjective evaluation between the spacecraft system and humans, immersion-type simulation environments are used extensively. Immersion-type experimental platforms integrate relatively inexpensive PC-based 3D VR technology to actively perform bio-mimetic studies on human interactive systems. Technically, these platforms can be grouped into two categories. The first category of platforms
integrate a PC-based immersion-type display with a 3D dynamic simulator along with motion detectors. The motion detectors capture motion, and the 3D simulator lets the human subject feel as if he were immersed within the same environment as the robot or system; the user participating in such an interaction is not physically affected at all. The second category of platforms is equipped with real and virtual dual-arm robots with vision and force sensors. Through the force and human-motion displays, the user can feel as if he is embodied as a robot in the system, which facilitates tele-manipulation based on a unilateral control approach [13].
NASA JPL, in its quest to explore natural ways to operate robots and spacecraft in space, has exemplified the use of immersion technology. The organization used the Leap Motion Controller to remotely maneuver a Mars rover, and the Oculus Rift together with a Virtuix Omni to take a virtual tour of it. Plans are in place to use this immersion platform with a combination of an HMD and the Kinect motion sensor to control and monitor the orientation and rotation of a robotic limb. This technology will be deployed to enable full stereo vision from a human-based perspective, wherein the visual input is properly mapped to the limbs of the user in reality [14].
Several virtual reality companies, such as Sensics, Worldviz, Mechdyne, Quantum 3D, Motek Medical, Atlantis Cyberspace, and Vicon, are rushing forward in the domain of space exploration. SpaceVR designed a virtual reality platform that uses a 3D, 360-degree camera capable of livestreaming footage from the International Space Station's Cupola observatory module to Earth. This footage enables customers to envision space travel while immersed in 3D virtual reality.
Immersion products like WorldViz's Vizard support HMDs, LCD shutter glasses, and display technologies such as autostereoscopes and dome projectors. Vizard also features haptic displays, force-feedback systems, built-in high-quality 3D sound, and multi-user networking. Such platforms also have high-level morph controllers that help transition virtual humans into existing environments. They can blend thousands of their designed humans, avatar meshes, and animation suites into the environment as desired [15].
VI. SYSTEM REQUIREMENTS
VR technology has been tested in various analogue environments, such as the Neutral Buoyancy Lab (NBL) and NASA Extreme Environment Mission Operations (NEEMO), as a tool for training astronauts. Microsoft's HoloLens was also part of the cargo flown on the SpaceX Dragon ISS resupply mission in June 2015, but was lost in the launch failure. However, participants often need to use underwater facilities and be subjected to different amounts of physical loading to simulate microgravity equivalent to the Moon, Mars, or an asteroid. These analogue missions are costly and time consuming.
The VISR concept was synthesized from the idea of arming the NDX-2 planetary EVA spacesuit, at the Human Spaceflight Laboratory of the University of North Dakota, with Virtual Reality (VR) devices and studying human performance. The primary requirement of the system is its ability to alter image projection speed with a change in the gravitational parameter value. Current VR devices and applications themed after planetary or space exploration process the graphical image based on the 1-g perception of our vestibular systems. As discussed earlier, the actual sense of confusion in a low gravity environment comes from the way our vestibular system interprets its environment. The proposed system shall also be able to take other planetary features into account, such as albedo, surface inclinations for certain regions, and human and mechanical system interfaces. Data gloves can be used to incorporate interface interaction in the training regime.
Image 5: UND developed NDX-2 advanced lunar
spacesuit. Credits: Human Spaceflight Lab, UND.
The system needs to be integrated with the spacesuit helmet without interfering with or obstructing its field of view. For this purpose, the device needs to be miniaturized. Most previous HMD VR devices have been rather bulky and difficult to integrate with a spacesuit.
The system shall be able to connect to JPL's planetary database to use user-defined, environment-specific data as input. The device will operate on a secure, high-speed wireless network and needs continuous access to the database during its operation time. Specific applications can be designed separately to expand VISR's usability to other sectors.
VISR will not only be used as a training tool, but will also act as an important research testbed for psycho-physiological performance in altered environments. The system can be integrated with body-mounted biomedical sensors and motion tracking devices to get reaction feedback from the limbs in a specific user-defined environment. The system shall be able to take VISR output and feed it to a sensor-based system for collecting real-time data. This data can be recorded, processed, and analysed further to study the reaction of the vestibular system. The output can be plotted as a function of time, gravity value, and other user-defined inputs. The findings of these studies can help establish an understanding of the working of the human vestibular system, or of particular limbs, in relation to different environments.
VII. SYSTEM ARCHITECTURE
The VISR system comprises the NDX-2 spacesuit integrated with an HMD VR device, as shown in Image 6. Computer 1 (C1) will be used to establish a wireless connection with the JPL planetary and small-body database, enabling a controller to select an input from the vast set of planetary parameters in the database and feed it to the VISR system. VISR will also be integrated with separately developed terrain maps specific to a certain planetary body (Moon, Mars, or asteroid). These terrain maps will form the graphic user interface (GUI), or in other words, the immersive environment of the system. The input, e.g. the gravitational parameter, will be integrated in several steps with respect to time in a Gravitational Control Algorithm (GCA). This algorithm will alter the image projection rate so that the user in the immersive environment can relate to the new microgravity environment not just on a visual basis, but on a sensory (vestibular) level.
Users may experience VR sickness in these environments, which is discussed later in this paper. To avoid injury, the user can be strapped into a harness to account for loss of the sense of direction. The vestibular system's reactions can be recorded by placing biomedical sensors on the astronaut's body. The output of these sensors will appear on a second computer (C2). This biomedical data can be processed and analysed for a better understanding of vestibular behaviour in microgravity, and for developing human eye-limb coordination profiles in varied environments.
Image 6: VISR System Architecture for varying gravity parameter.
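The data flow above (C1 selects a planetary parameter, the GCA adjusts the projection rate, sensor readings stream to C2) can be sketched as follows. All class and function names are illustrative, not part of any real VISR software, and the square-root gravity scaling is only one plausible choice for the GCA mapping, which the paper leaves open:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SensorSample:
    t: float            # seconds since simulation start
    heart_rate: float   # one example biomedical channel bound for C2

@dataclass
class VISRSession:
    body: str                 # e.g. "Mars"
    surface_gravity: float    # m/s^2, fed from the JPL database via C1
    base_fps: float = 90.0    # nominal HMD projection rate
    log: List[SensorSample] = field(default_factory=list)

    def projection_rate(self) -> float:
        # GCA placeholder: scale frame pacing with the gravity ratio
        # (assumed sqrt mapping; lower gravity -> slower projection).
        return self.base_fps * (self.surface_gravity / 9.81) ** 0.5

    def record(self, sample: SensorSample) -> None:
        # Samples accumulate here and would be forwarded to C2.
        self.log.append(sample)

session = VISRSession("Mars", 3.71)
rate = session.projection_rate()       # reduced rate under Mars gravity
session.record(SensorSample(0.0, 72.0))
```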
VIII. METHODOLOGY
VIII.I Integration of NDX-2 and VR device sensor
The Human Spaceflight Lab at the University of North Dakota has been involved in developing several technology testbeds for furthering the manned spaceflight program in accordance with the objectives of NASA's global exploration map [16]. The NDX-2 advanced lunar spacesuit was initially developed for the lunar Constellation program, which was later cancelled [17]. However, the spacesuit has been tested regularly and has proved to be a good testbed for expanding, integrating, and testing spacesuit-related systems. This led to selecting the NDX-2 as the primary system to be integrated with a VR HMD for EVA training purposes.
The proposed system architecture involves integrating the spacesuit helmet with HMD goggles and a head tracker. The VR display device will allow users to see graphics superimposed on their view of the real world. Data gloves or other haptic interface devices will be integrated into the system to allow the user to interact with the 3D models in the virtual world.
A human's two eyes have overlapping 140-degree fields of view (FOV); the binocular, or total, FOV is roughly 180 degrees in most people. A feeling of immersion arises when the FOV is greater than roughly 60 to 90 degrees. Hence, the HMD will be positioned and integrated so that it does not obstruct the user's 3-D field of view and has sufficient clearance relative to the eye and the visor of the spacesuit helmet. This configuration will ensure a sharp frustum of vision.
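The geometric relation between display size, eye clearance, and perceived FOV can be sketched with a simple pinhole model (this ignores the lens magnification real HMD optics add, and the 120 mm / 50 mm figures below are illustrative, not measured NDX-2 values):

```python
import math

def horizontal_fov_deg(display_width_mm: float, eye_relief_mm: float) -> float:
    """Angle subtended by a flat display at a given eye relief
    (pinhole model: FOV = 2 * atan(width / (2 * distance)))."""
    return math.degrees(2.0 * math.atan(display_width_mm / (2.0 * eye_relief_mm)))

def feels_immersive(fov_deg: float, threshold_deg: float = 60.0) -> bool:
    # The text places immersion onset roughly between 60 and 90 degrees;
    # 60 is used here as the conservative lower bound.
    return fov_deg >= threshold_deg

fov = horizontal_fov_deg(120.0, 50.0)   # hypothetical 120 mm panel, 50 mm from the eye
```

Increasing the eye relief to clear the helmet visor shrinks the subtended angle, which is why miniaturization and placement inside the helmet matter for immersion.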
VIII.II Importing physical planetary parameters from
the database
JPL’s Solar System Dynamics group maintains an extensive database of physical parameters of different planets, planetary satellites, asteroids, and comets. Planetary parameters of particular interest to VISR include geometric albedo, surface gravity, mean density, and escape velocity, used to simulate EVAs and EDL (Entry, Descent, and Landing) forces. Due to their small size, the physical parameters of comets and certain asteroids are not known. However, physical parameters for well-known asteroids are readily available in JPL’s small-body database.
The system will enable the operator (called the Author) to establish a wireless connection to the database via a secured portal, select an input using C1’s graphic user interface (GUI), and feed it to VISR. This input will be processed by the GCA to alter the image projection speed. Interactivity between the Author and the user is required to successfully derive and feed the required inputs to VISR.
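The Author's selection step can be sketched as a lookup against the database. The real system would query JPL's databases over a secure connection; here a small local dict stands in for that lookup so the select-and-feed step can be shown offline. The values are approximate published figures, and the function name is illustrative:

```python
# Stand-in for the JPL planetary / small-body database reachable from C1.
PLANETARY_DB = {
    "Mars":  {"surface_gravity": 3.71, "geometric_albedo": 0.17,
              "escape_velocity": 5.03},   # gravity m/s^2, velocity km/s
    "Moon":  {"surface_gravity": 1.62, "geometric_albedo": 0.12,
              "escape_velocity": 2.38},
    "Vesta": {"surface_gravity": 0.25, "geometric_albedo": 0.42,
              "escape_velocity": 0.36},
}

def select_input(body: str, parameter: str) -> float:
    """Author-side selection: pick one physical parameter to feed to the
    GCA. A KeyError for an unlisted body mirrors the gap noted above for
    small comets whose parameters are not known."""
    return PLANETARY_DB[body][parameter]

g_mars = select_input("Mars", "surface_gravity")   # fed to the GCA
```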
VIII.III Importing Terrain Maps
Three-dimensional terrain maps of relevant regions of different planets and asteroids can be developed using commercially available software and the global data collected by several fly-by missions. The computer graphics will contain a high level of detail to achieve photo realism. Use of shading, texture, color, interposition, and other visual characteristics will provide depth cues to estimate the distance of an object from the observer in the immersive environment. The Teleos (TM) tool will be used to create Silicon Graphics computer-based real-time interactive environments with "life-like" deformable objects. A bitmap pattern is added to an object to increase realism through texture mapping.
Image 7: 3D terrain maps for Mars and Asteroid Vesta
visualized using NVIDIA Quadro® graphics solutions.
Image courtesy: NASA/JPL, created using NVIDIA
graphics
VIII.IV Varying image processing speed
As discussed earlier, the main objective of VISR is to alter the image projection rate to match how human eyes would perceive the given environment. A control algorithm will perform corrective transformations to alter the rate of image processing by the VR software. For example, the GCA will take the gravity parameter of a specific planetary body as an input and correspondingly change the output to the VISR user. However, the system needs to be tolerant of metallic distortions, which cause noise interference or degraded performance in electromagnetic trackers used near large metallic objects. High-resolution terrain maps with an enhanced refresh rate will provide the desired output to the user. As shown in Image 6, the force feed from the biomedical sensors and C2 to the control algorithm unit will account for delayed response time and automatically improve the latency tolerance of the system.
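One minimal sketch of the corrective transformation the GCA might apply: the time for an object to fall a given height scales as 1/sqrt(g), so scaling the update rate by sqrt(g_target / g_earth) makes vertical motion appear to run at the target body's pace. The sqrt mapping is our assumption; the paper only states that the gravity parameter drives the projection rate:

```python
import math

EARTH_G = 9.81  # m/s^2

def playback_scale(target_g: float, earth_g: float = EARTH_G) -> float:
    """Rate multiplier for the simulated environment: free-fall time
    scales as 1/sqrt(g), so apparent motion is scaled by sqrt(g/g0)."""
    if target_g <= 0:
        raise ValueError("surface gravity must be positive")
    return math.sqrt(target_g / earth_g)

def frame_interval_ms(base_fps: float, target_g: float) -> float:
    """Interval between simulation updates, stretched for low gravity."""
    return 1000.0 / (base_fps * playback_scale(target_g))

mars_scale = playback_scale(3.71)   # roughly 0.61: Mars motion runs slower
```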
VIII.V Integration with Biomedical Sensors
Biomedical sensors will be placed on the VISR user and will provide information about the spatial orientation of the body during the VR simulation. The data obtained will be fed to C2, where it can be processed and analysed as a function of time. This output is further used as force feedback to the GCA to improve system latency. Force feedback transmits pressure, force, or vibrations to provide the VR participant with a sense of resisting force, typically due to weight or inertia. This should not be confused with tactile feedback, which simulates sensation applied to the skin.
In a VR environment, the human body operates in a six Degree of Freedom (6DOF) space, comprising three translational DOFs and three rotational DOFs. The corresponding coordinates track the spatial navigation of the subject. The results derived from the data obtained in these simulations will benefit kinematics and kinesthesis studies in microgravity.
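The 6DOF state described above can be represented as three translational and three rotational coordinates sampled over time, so spatial navigation can be analysed afterwards. The names and the distance helper below are illustrative; a full kinematics study would also difference the rotational coordinates:

```python
from dataclasses import dataclass
import math

@dataclass
class Pose6DOF:
    # Three translational DOFs (metres) and three rotational DOFs (radians).
    x: float
    y: float
    z: float
    roll: float
    pitch: float
    yaw: float

def translation_distance(a: Pose6DOF, b: Pose6DOF) -> float:
    """Straight-line distance moved between two tracker samples,
    using the translational DOFs only."""
    return math.sqrt((b.x - a.x) ** 2 + (b.y - a.y) ** 2 + (b.z - a.z) ** 2)

start = Pose6DOF(0.0, 0.0, 0.0, 0.0, 0.0, 0.0)
end = Pose6DOF(3.0, 4.0, 0.0, 0.0, 0.0, math.pi / 2)  # moved and turned
moved = translation_distance(start, end)
```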
Egocenter is the sense of one's own location in a virtual environment. A VR environment may cause users to experience vection, i.e. an illusory sensation of self-motion caused by motion of the visual environment. The most active parts of the brain during a VR simulation are the occipital cortex and the parietal cortex. The occipital cortex, at the back of the brain, receives retinotopic projections of visual displays, whereas the parietal cortex, an area adjacent to and above the occipital cortex, processes spatial location and direction information. These parts of the brain can be mapped during a VR EVA simulation in microgravity to establish the required relationships.
IX. VR SICKNESS
VISR’s limitations lie primarily in the realm of VR sickness. Past research in simulation and virtual reality shows that many humans are prone to nausea and instability after periods of VR simulator usage. How might we deal with such factors in the design of VISR? First, we define VR sickness in terms of three main categories: subject, simulator, and task. We explain each category and its attributes in depth below. To list the attributes without bias, we present them in alphabetical order.
IX.I Subject
Age
Age is a primary factor in VR and motion sickness.
In 1975, Reason and Brand found that motion sickness
susceptibility is nearly non-existent after around the age
of 50. Conversely, it is greatest between the ages of 2
and 12 years, decreasing from around 12 to 21 years
[18].
Ethnicity
Studies conducted by Stern, Hu, LeBlanc, and Kock
in 1993 show that ethnicity plays a role in the likelihood
of motion sickness. Subjects of Chinese descent
reported significantly more sickness than European-
American and African-American subjects [19].
Experience with real-world task
Kennedy et al. found that, in flight simulation, pilots with more flight experience are more likely to experience sickness than those with less flight experience [20]. Other studies have attempted to explain this finding, with inconsistent results. One concept is that sensitivity to differences between actual and simulated flight is related to a pilot’s experience with the sensory aspects of actual flight. However, another study suggested that degree of control is a larger factor than experience in the occurrence of sickness. For example, student pilots tend to handle the controls more than instructor pilots, which decreases their likelihood of sickness because they exercise more control over the system. Degree of control in relation to sickness is described in more detail in the Task section below. Ancillary to this concept is the fact that the optimal viewing region is often placed at the student pilot’s perspective, which also decreases the likelihood of sickness.
Experience with simulator (adaptation)
Pilots who initially experience sickness with simulators are, in most cases, able to adapt quickly over time. Hence, incremental adaptation correlates with a reduced incidence of sickness. However, this reduction of sickness over time may be due to building a tolerance to sickness-inducing stimuli and learning adaptive behaviours to avoid sickness. This may negatively influence performance in a real-world environment, where such behaviours may be inappropriate.
Flicker fusion frequency threshold
Flicker is discussed in the next section as a property
of the simulator. However, the property of flicker is
dependent on the point at which the individual visually
perceives flicker, the flicker fusion frequency threshold.
This threshold varies among individuals according to
gender, age, and intelligence, and it also follows a
circadian rhythm, increasing by day and decreasing by
night.
Mental rotation ability
Mental rotation is a means for a human to recognize
an object when it is not in its usual orientation. Stimulus
rearrangement is an alteration of normal spatial
relationships among stimuli that are associated with
orientation. For example, a subject may walk forward in
a virtual environment yet remain still in reality; this
incurs a stimulus rearrangement. One study involving
Soviet cosmonauts showed that mental rotation training
prior and during flight significantly improved
performance on procedures in actual microgravity
environments. A virtual environment system could be
adapted for training purposes — specifically where
users could practice mental rotations via stimulus
rearrangements.
XIII.II Simulator
Flicker
Flicker is the perception of fading between display
refresh cycles on headsets and other displays. Research
shows that flicker is a key cause of simulator sickness,
distraction, and eye fatigue. To keep flicker
imperceptible, the refresh rate must be increased as
luminance and field-of-view increase [21].
Field of View
Wide field-of-view simulators generally exhibit
higher incidences of sickness than narrow field-of-view
simulators. A wide field of view also increases the
likelihood that flicker will be perceived, because the
peripheral visual system is more sensitive to flicker than
the fovea at the center of the retina.
Refresh Rate
Refresh rate is directly correlated with flicker.
Refresh rate must increase with field-of-view and
luminance to avoid flicker. However, the requirement of
high refresh rate implies a requirement for higher CPU
processing rates, which in turn increases cost.
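As a rough illustration of this coupling, the Ferry–Porter law approximates the critical flicker fusion frequency as rising linearly with the logarithm of luminance. The sketch below uses illustrative coefficients (the values of `a` and `b` are assumptions for demonstration, not measured data), so it shows the trend rather than a design rule:

```python
import math

def min_refresh_rate_hz(luminance_cd_m2, a=12.5, b=37.0):
    """Estimate the refresh rate needed to keep flicker imperceptible,
    following the Ferry-Porter trend: CFF ~= a * log10(L) + b.
    The coefficients a and b are illustrative placeholders; real values
    vary with the observer and with retinal eccentricity (the periphery
    is more flicker-sensitive than the fovea)."""
    return a * math.log10(luminance_cd_m2) + b

# Brighter displays demand a faster refresh rate to stay above the
# flicker fusion threshold.
dim = min_refresh_rate_hz(10)     # lower required rate
bright = min_refresh_rate_hz(200) # higher required rate
```

The same trend explains the cost argument above: every step up in luminance or field of view pushes the required refresh rate, and with it the processing budget, higher.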
Update Rate (Frame Rate)
Update rate is the number of visual frames displayed
by the simulator per unit time.
Low update rate contributes to visual lag, which may
cause sickness. Update rate is determined by the
complexity of the scene being displayed along with the
computing power provided by the simulator. This is in
contrast to Refresh Rate, which is determined by the
hardware characteristics of the simulator.
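The distinction can be made concrete with a toy frame-time budget. All numbers here (per-draw-call cost, fixed overhead) are assumed illustrative values, not a profile of any real rendering engine:

```python
def update_rate_hz(draw_calls, ms_per_draw_call=0.05, overhead_ms=2.0):
    """Toy model of update (frame) rate: per-frame work grows with scene
    complexity, so the rate the simulator can sustain falls as the scene
    gets richer -- unlike refresh rate, which is fixed by the display
    hardware. Costs are illustrative assumptions."""
    frame_time_ms = overhead_ms + draw_calls * ms_per_draw_call
    return 1000.0 / frame_time_ms

simple_scene = update_rate_hz(draw_calls=100)    # short frame time, high rate
complex_scene = update_rate_hz(draw_calls=2000)  # long frame time, low rate
# A low update rate introduces visual lag between motion and display.
```

In this sketch, doubling scene complexity lengthens the frame time and drops the update rate, even though the display's refresh rate never changes.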
The images projected in virtual reality also have a
major impact on sickness. When VR sickness occurs,
the refresh rate of on-screen images is often too low.
Because the refresh rate lags what the brain processes,
a discord arises between the processing rate and the
refresh rate, causing the user to perceive "glitches" on
the screen. When these two rates do not match, the user
can experience the same feelings as simulator and
motion sickness, discussed below.
Poor animation quality can also cause this
phenomenon: it creates another discord between what is
expected and what actually happens on the screen.
When on-screen graphics do not keep pace with the
user's head movements, a form of motion sickness can
be triggered.
XIII.III Task
Degree of Control
Pausch et al. showed that subjects who themselves
generate input to the virtual environment are less
susceptible to motion sickness than "passengers" or "co-
pilots" who do not generate input [22]. Active control
increases the anticipation and awareness of movement,
which reduces cue conflict and, in turn, the likelihood of
sickness.
Duration
Longer exposure times result in a higher incidence of
sickness and require longer adaptation periods. Ataxia
may also occur, growing in intensity and duration with
increased simulator exposure, as shown by a study
conducted by Fowlkes et al. in 1987 [23].
Global Visual Flow
Global Visual Flow is the rate at which objects flow
through the visual scene, directly related to velocity and
inversely related to altitude and visual range. Kennedy,
Berbaum, and Smith found that altitude is one of the
most influential contributors to sickness [24]. Sickness
often occurs during self-movement over terrain at low
altitude and high speed, because visual flow cues are
strong at low altitudes and fall off sharply at high
altitudes.
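Since global visual flow is directly related to velocity and inversely related to altitude, a minimal proxy for it is the ratio of the two. The function and units below are illustrative assumptions, not a validated metric:

```python
def global_visual_flow(velocity_m_s, altitude_m):
    """Rough proxy for global visual flow: ground speed divided by eye
    height above the terrain. Higher values mean faster apparent motion
    of scene texture across the visual field -- the condition most
    associated with sickness."""
    return velocity_m_s / altitude_m

low_fast = global_visual_flow(velocity_m_s=100.0, altitude_m=10.0)     # high flow
high_fast = global_visual_flow(velocity_m_s=100.0, altitude_m=1000.0)  # low flow
```

At the same speed, flying one hundred times higher cuts this proxy by a factor of one hundred, matching the observation that low, fast flight is the most provocative case.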
Rate of Linear or Rotational Acceleration
Without full initial adaptation, increased
manoeuvring aggressiveness may result in increased
incidence of sickness.
Type of Application
McCauley and Sharkey predict that sickness occurs
primarily in applications involving distant objects, self-
motion through environment, and vection (illusory self-
motion) [25].
Unusual Manoeuvres
Tasks with high rates of linear or rotational
acceleration, including extraordinary situations, have
been found to be unsettling. McCauley and Sharkey cite
abruptly freezing the simulation and flying backwards
as two examples [25]. Situational reset, where the scene
is rapidly reset forward or backward in time, may also
cause nausea.
Implications
Some of the more severe problems with virtual reality
sickness have little to do with the induced sickness itself.
In 1987, Crowley showed that sickness among pilots in
flight simulators reduced the efficacy of training
through distraction and by encouraging adaptive
behaviours that are unfavourable for performance,
ground safety, and flight safety once pilots leave the
simulated training environment [26].
XV. SUMMARY
For centuries the distant red hue of Mars has
captured the imagination of humanity. Now mankind is
ready to embrace whatever lies beneath its rough
surface, and water-rich asteroids harbour resources
essential for the extension and betterment of humanity.
VISR is an innovative tool in a suite of exploration
techniques and devices designed to explore the Martian
environment and beyond. To preserve and disseminate
this valuable knowledge, a virtual space is being forged
to facilitate the development and use of these unique
competencies, and VISR offers an ideal manifestation of
this intelligent data repository. The system allows
astronauts to collaborate with scientists and engineers
around the world in a seamless, better-than-reality
space. Altering the simulation to reflect changes in
gravity fields allows explorers to interact with their
environment and mission hardware in high fidelity. The
environment is based on the extensive SysML terrain
database at JPL and an advanced behavioural workspace
with responsive connectivity, capable of high-fidelity
simulation and test. Visual, intuitive, and synchronized,
VISR will become an essential tool for the future of
exploration on Mars and beyond.
XVI. BIBLIOGRAPHY
1. Demêmes D, Dechesne CJ, Venteo S,
Gaven F, Raymond J. “Development of the
rat efferent vestibular system on the ground
and in microgravity” Developmental Brain
Research, Volume 128, Issue 1, 31 May
2001, Pages 35–44
2. Tanaka K, Gotoh TM, Awazu C, Morita H.
“Roles of the vestibular system in
controlling arterial pressure in conscious
rats during a short period of microgravity”
Neuroscience Letters, Volume 397, Issues
1–2, 10–17 April 2006, Pages 40–43
3. Ross MD, “Morphological changes in rat
vestibular system” Biocomputation Center,
NASA Ames Research Center, Moffett
Field, California 94035-1000. Journal of
Vestibular Research : Equilibrium &
Orientation [1993, 3(3):241-251]
4. Harris LR, Herpers R, Hofhammer T,
Jenkin M (2014) How Much Gravity Is
Needed to Establish the Perceptual
Upright? PLoS ONE 9(9): e106207.
doi:10.1371/journal.pone.0106207
5. http://www.skybrary.aero/index.php/Vestib
ular_System_and_Illusions_
(OGHFA_BN). Last visited on 9/21/2015
6. Young R., "Vestibular Adaptation to
Weightlessness." In: Proc. Symp. Vestibular
Organs and Altered Force Environment,
edited by M. Igarashi and K.G. Nute.
Houston: National Aeronautics and Space
Administration, 1988, p. 85–90
7. http://www.nasa.gov/audience/forstudents/
912/features/F_Human_Vestibular_System
_in_Space.html. Last visited on 9/21/2015
8. https://www.vrvana.com/ Last visited on
9/21/2015
9. https://www.playstation.com/en-
gb/explore/ps4/features/project-morpheus/
Last visited on 9/21/2015
10. http://www.samsung.com/id/gearvr/index.h
tml Last visited on 9/21/2015
11. https://www.google.com/get/cardboard/
Last visited on 9/21/2015
12. http://www.htcvr.com/ Last visited on
9/21/2015
13. Z.W. Luo et al., "Integration of PC-based
3D Immersion Technology for bio-mimetic
study of Human Interactive Robots”
International Conference on Robotics,
Changsha, China, October 2003.
14. http://www.engadget.com/2013/12/23/nasa-
jpl-control-robotic-arm-kinect-2/ Last
visited on 9/20/2015
15. http://www.worldviz.com/industries/acade
mic Last visited on 9/20/2015
16. International Space Coordination Group.
The Global Exploration Roadmap. August
2013. Page 2-4.
17. De Leon P. & Harris G.L., “NDX-2:
Development of an Advanced Planetary
Spacesuit Demonstrator System for the
Lunar Environment.” 41st International
Conference on Environmental Systems,
July 2011.
18. Reason J.T. & Brand J. J, “Motion
Sickness”, London Academic Press, 1975.
19. Stern R.M., Hu S., LeBlanc R., Koch K.L.,
“Chinese Hyper-susceptibility to Vection-
induced Motion Sickness”, Aviation,
Space, and Environmental Medicine, 1993.
20. Kennedy R.S., Lilienthal M.G., Berbaum
K.S., Baltzley D.R., McCauley M.E.,
“Simulator Sickness in U.S. Navy Flight
Simulators”, Aviation, Space, and
Environmental Medicine, 1989.
21. E.M. Kolasinski, “Simulator Sickness in
Virtual Environments (ARI 1027)” U.S.
Army Research Institute for the
Behavioural and Social Sciences. 22 July,
2014
22. Pausch R., Crea T., Conway M., “A
Literature Survey for Virtual
Environments: Military Flight Simulator
Visual Systems and Simulator Sickness”,
Presence, 1992.
23. Fowlkes J.E., Kennedy R.S., Lilienthal
M.G., “Postural Disequilibrium Following
Training Flights”, Proceedings of the 31st
Annual Meeting of the Human Factors
Society, 1987.
24. Kennedy R.S., Berbaum K.S., Smith M.G.,
“Methods for Correlating Visual Scene
Elements with Simulator Sickness
Incidence”, Proceedings of the 37th Annual
Meeting of the Human Factors and
Ergonomics Society, 1993.
25. McCauley M.E., Sharkey T.J.,
“Cybersickness: Perception of Self-motion
in Virtual Environments”, Presence, 1992.
26. Crowley J.S., “Simulator Sickness: A
Problem for Army Aviation”, Aviation,
Space, and Environmental Medicine, 1987.