Motion simulator

Simulator seating at the St. Louis Zoo

A motion simulator or motion platform is a mechanism that encapsulates occupants and creates the effect, or feeling, of being in a moving vehicle. A motion simulator can also be called a motion base, motion chassis or a motion seat.[1] The movement is synchronized with a visual display and is designed to add a tactile element to video gaming, simulation, and virtual reality. When motion is applied and synchronized to audio and video signals, the result is a combination of sight, sound, and touch.[2] All full motion simulators move the entire occupant compartment[2] and can convey changes in orientation and the effect of false gravitational forces. These motion cues trick the mind[3] into thinking it is immersed in the simulated environment and experiencing kinematic changes in position, velocity, and acceleration. The mind's failure to accept the experience can result in motion sickness.[1] Motion platforms can provide movement in up to six degrees of freedom:[1] three rotational degrees of freedom (roll, pitch, yaw) and three translational or linear degrees of freedom (surge, heave, sway).

Types[4]

Motion simulators can be classified according to whether the occupant is controlling the vehicle, or whether the occupant is a passive rider, also referred to as a simulator ride or motion theater.

An example of a Stewart platform
Professional Stewart Hybrid Type Motion System with six degrees of freedom

Historically, motion platforms have varied widely in scale and cost. Those in the category of amusement park rides and commercial and military aircraft simulators sit at the high end of this spectrum; arcade-style amusement devices fall into the middle of the spectrum, while smaller, lower-cost home-based motion platforms occupy the other end.

Modern motion platforms have become complicated machines, but they have simpler roots. Many of the early motion platforms were flight simulators used to train pilots.[6] One of the first motion platforms, the Sanders Teacher, was created in 1910. The Sanders Teacher was an aircraft with control surfaces, attached to the ground by a simple universal joint. When wind was present, the pilot in training was able to use the control surfaces to move the simulator in the three rotational degrees of freedom. Around 1930, a large advance in motion platform technology was made with the creation of the Link Trainer. The Link Trainer used the control stick and external motors to control organ bellows located under the simulator. The bellows could inflate or deflate, causing the simulator to rotate with three degrees of freedom. In 1958 a flight simulator for the Comet IV airliner was designed using a three-degrees-of-freedom hydraulic system. After the Comet IV, both the range of motion and the number of degrees of freedom exhibited by motion platforms increased. The most expensive motion platforms utilize high-fidelity six-degrees-of-freedom motion, often coupled with advanced audio and visual systems. Today motion platforms are found in many applications, including flight simulation, driving simulation, amusement rides, and even small home-based motion platforms.

Fly motion simulator with 6 rotational degrees of freedom

The high-end motion platform has been used in conjunction with military and commercial flight instruction and training applications. Today one can find high-end, multiple-occupant motion platforms in use with entertainment applications in theme parks throughout the world. The systems used in these applications are very large, weighing several tons, and are typically housed in facilities designed expressly for them. As a result of the force required to move the weight of these larger simulator systems and one or more occupants, the motion platform must be controlled by powerful and expensive hydraulic or electromagnetic cylinders. The cost of this type of motion platform exceeds US$100,000, and often goes well into the millions of dollars for the multi-occupant systems found at major theme park attractions. The complexity of these systems requires extensive programming and maintenance, further extending the cost.

Low-cost home motion system with 3 rotational degrees of freedom

A typical high-end motion system is the Stewart platform, which provides full 6 degrees of freedom (3 translation and 3 rotation) and employs sophisticated algorithms to provide high-fidelity motions and accelerations. These are used in a number of applications, including flight simulators for training pilots. However, the complexity and expense of the mechanisms required to incorporate all degrees of freedom have led to alternative motion simulation technology using mainly the three rotational degrees of freedom. An analysis of the capabilities of these systems reveals that a simulator with three rotational degrees of freedom is capable of producing motion simulation quality and vestibular motion sensations comparable to those produced by a Stewart platform.[7] Historically these systems used hydraulics or pneumatics; however, many modern systems use electric actuators.

The middle of the spectrum includes a number of powered motion platforms aimed at arcade-style amusement games, rides, and other arrangements. These systems fall into a price range of $10,000 to $99,000 USD. The space requirements for such a platform are typically modest, requiring only a portion of an arcade room, and a smaller range of motion is provided via control systems similar to, but less expensive than, those of the high-end platforms.

The lower-cost systems include home-based motion platforms, which have recently become a more common device used to enhance video games, simulation, and virtual reality. These systems fall into a price range of $1,000 to $9,000 USD. During the 2000s, several individuals and business entities developed these smaller, more affordable motion systems. Most of these systems were developed mainly by flight simulation enthusiasts, were sold as do-it-yourself projects, and could be assembled in the home from common components for around one thousand US dollars.[8] Recently, there has been increased market interest in motion platforms for more personal, in-home use. The application of these motion systems extends beyond flight training simulation into a larger market of more generalized "craft-oriented" simulation, entertainment, and virtual reality systems.[7]

Common uses

Engineering analysis

Motion platforms are commonly used in the field of engineering for analysis and verification of vehicle performance and design. The ability to link a computer-based dynamic model of a particular system to physical motion gives the user the ability to feel how the vehicle would respond to control inputs without the need to construct expensive prototypes. For example, an engineer designing an external fuel tank for an aircraft could have a pilot determine the effect on flying qualities or a mechanical engineer could feel the effects of a new brake system without building any hardware, saving time and money.

Flight simulators are also used by aircraft manufacturers to test new hardware. By connecting a simulated cockpit with visual screen to a real flight control system in a laboratory, integrating the pilot with the electrical, mechanical, and hydraulic components that exist on the real aircraft, a complete system evaluation can be conducted prior to initial flight testing. This type of testing allows the simulation of "seeded faults" (i.e. an intentional hydraulic leak, software error, or computer shutdown) which serve to validate that an aircraft's redundant design features work as intended. A test pilot can also help identify system deficiencies such as inadequate or missing warning indicators, or even unintended control stick motion. This testing is necessary to simulate extremely high risk events that cannot be conducted in flight but nonetheless must be demonstrated. While 6 degree-of-freedom motion is not necessary for this type of testing, the visual screen allows the pilot to "fly" the aircraft while the faults are simultaneously triggered.

Ride simulators

Main article: Simulator ride

Motion simulators are sometimes used in theme parks to give the park guests a themed simulation of flight or other motion.


Video games

Some driving and flying simulation games allow the use of specialized controllers such as steering wheels, foot pedals or joysticks. Certain game controllers designed in recent years have employed haptic technology to provide realtime, tactile feedback to the user in the form of vibration from the controller. A motion simulator takes the next step by providing the player full-body tactile feedback. Motion gaming chairs can roll to the left and right and pitch forward and backward to simulate turning corners, accelerations and decelerations. Motion platforms permit a more stimulating and potentially more realistic gaming experience, and allow for even greater physical correlation to sight and sound in game play.

How human physiology processes and responds to motion[11]

The way we perceive our body and our surroundings is a function of the way our brain interprets signals from our various sensory systems, such as sight, sound, balance and touch. Special sensory pick-up units (or sensory "pads") called receptors translate stimuli into sensory signals. External receptors (exteroceptors) respond to stimuli that arise outside the body, such as the light that stimulates the eyes, sound pressure that stimulates the ear, pressure and temperature that stimulate the skin, and chemical substances that stimulate the nose and mouth. Internal receptors (enteroceptors) respond to stimuli that arise within the body, for example from blood vessels.

Postural stability is maintained through the vestibular reflexes acting on the neck and limbs. These reflexes, which are key to successful motion synchronization, are under the control of three classes of sensory input:

Proprioceptors[11]

Proprioceptors are receptors located in your muscles, tendons, joints and the inner ear, which send signals to the brain regarding the body's position. An example of a "popular" proprioceptor often mentioned by aircraft pilots is the "seat of the pants". In other words, these sensors present a picture to your brain as to where you are in space as external forces act on your body. Proprioceptors respond to stimuli generated by muscle movement and muscle tension. Signals generated by exteroceptors and proprioceptors are carried by sensory neurons or nerves and are called electrochemical signals. When a neuron receives such a signal, it sends it on to an adjacent neuron through a bridge called a synapse. A synapse "sparks" the impulse between neurons through electrical and chemical means. These sensory signals are processed by the brain and spinal cord, which then respond with motor signals that travel along motor nerves. Motor neurons, with their special fibres, carry these signals to muscles, which are instructed to either contract or relax.

The drawback of our internal motion sensors is that once a constant velocity is reached, these sensors stop reacting. The brain must then rely on visual cues until another movement takes place and the resultant force is felt. In motion simulation, when our internal motion sensors can no longer detect motion, a "washout" of the motion system may occur. A washout allows the motion platform occupant to think they are making a continuous movement when actually the motion has stopped. In other words, washout is where the simulator actually returns to a central, home, or reference position in anticipation of the next movement. This movement back to neutral must occur without the occupant realizing what is happening. This is an important aspect of motion simulators, as the human feel sensations must be as close to real as possible.

Vestibular system[11]

The vestibular system is the balancing and equilibrium system of the body that includes the vestibular organs, ocular system, and muscular system. The vestibular system is contained in the inner ear. It consists of three semicircular canals, or tubes, arranged at right angles to one another. Each canal is lined with hair cells connected to nerve endings and is partially filled with fluid. When the head experiences acceleration the fluid moves within the canals, causing the hair cells to deflect from their initial vertical orientation. In turn the nerve endings fire, and the brain interprets the acceleration as pitch, roll, or yaw.

There are, however, three shortcomings to this system. First, although the vestibular system is a very fast sense used to generate reflexes to maintain perceptual and postural stability, compared to the other senses of vision, touch and audition, vestibular input is perceived with delay.[12] Indeed, although engineers typically try to reduce delays between physical and visual motion, it has been shown that a motion simulator should move about 130 ms before visual motion in order to maximize motion simulator fidelity.[13] Second, if the head experiences sustained accelerations on the order of 10–20 seconds, the hair cells return to the "zero" or vertical position and the brain interprets this as the acceleration ceasing. Additionally, there is a lower acceleration threshold of about 2 degrees per second squared below which the brain cannot perceive rotation. In other words, motion slow and gradual enough to stay below this threshold will not stimulate the vestibular system. As discussed in the preceding "Proprioceptors" section, this shortfall actually allows the simulator to return to a reference position in anticipation of the next movement.

Visual inputs[11]

The human eye is the most important source of information in motion simulation. The eye relays information to the brain about the craft's position, velocity, and attitude relative to the ground. As a result, it is essential for realistic simulation that the motion works in direct synchronization to what is happening on the video output screen. Time delays cause disagreement within the brain, due to error between the expected input and the actual input given by the simulator. This disagreement can lead to dizziness, fatigue and nausea in some people.

For example, if the occupant commands the vehicle to roll to the left, the visual displays must also roll by the same magnitude and at the same rate. Simultaneously, the cab tilts the occupant to imitate the motion. The occupant’s proprioceptors and vestibular system sense this motion. The motion and change in the visual inputs must align well enough such that any discrepancy is below the occupant’s threshold to detect the differences in motion.

In order to be an effective training or entertainment device, the cues the brain receives by each of the body’s sensory inputs must agree.

Putting it together - how simulators trick the body[14]

It is physically impossible to correctly simulate large scale ego-motion in the limited space of a laboratory. The standard approach to simulate motions (so called motion cueing) is to simulate the “relevant” cues as closely as possible, especially the acceleration of an observer. Visual and auditory cues enable humans to perceive their location in space on an absolute scale. On the other hand, the somatosensory cues, mainly proprioception and the signals from the vestibular system, code only relative information. But fortunately (for our purpose), humans cannot perceive accelerations and velocities perfectly and without systematic errors. And this is where the tricky business of motion simulation starts. We can use those imperfections of the human sensory and perceptual systems to cheat intelligently.

Linear movements

In principle, velocity cannot be directly perceived by relative cues alone, like those from the vestibular system. For such a system, flying in space with some constant velocity is not different from sitting in a chair. However, changing the velocity is perceived as acceleration, or force acting on the human body. For the case of constant linear acceleration, a substitute for the real situation is simple. Since the amplitude of the acceleration is not very well perceived by humans, one can tilt the subject backwards and use the gravity vector as a replacement for correct resulting force from gravity and forward acceleration. In this case, leaning backwards is therefore not perceived differently from being constantly accelerated forwards.
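The tilt-coordination trick described above can be sketched numerically: the cab is pitched back so that the component of gravity along the occupant's body substitutes for a sustained forward acceleration. A minimal sketch, in which the function name and the sample accelerations are illustrative rather than from any particular simulator:

```python
import math

def tilt_angle_deg(a_forward, g=9.81):
    """Tilt-coordination angle (degrees): pitch the cab back so that the
    component of gravity along the occupant's chest mimics a sustained
    forward acceleration a_forward (m/s^2)."""
    return math.degrees(math.atan2(a_forward, g))

# A sustained forward acceleration of a given fraction of g can be faked
# by a modest, sustained backward tilt.
for a_g in (0.1, 0.2, 0.5):
    theta = tilt_angle_deg(a_g * 9.81)
    print(f"{a_g:.1f} g forward  ->  tilt back {theta:5.1f} deg")
```

In a real simulator the tilt rate itself must stay below the rotational perception threshold, so the cab is rotated into this attitude slowly.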

Linear accelerations[15]

Linear accelerations are detected by the otoliths. The otolith structure is simpler than the three-axis semicircular canals that detect angular accelerations. The otoliths contain calcium carbonate particles that lag behind head movement, deflecting hair cells. These cells transmit motion information to the brain and oculomotor muscles. Studies indicate that the otoliths detect the tangential component of the applied forces. A transfer function model between the perceived force \hat{f}(s) and the applied force f(s) is commonly given in lead-lag form:

\frac{\hat{f}(s)}{f(s)} = K \, \frac{\tau_a s + 1}{(\tau_1 s + 1)(\tau_2 s + 1)}

where K is a gain and \tau_a, \tau_1, \tau_2 are time constants.

Based on centrifuge experiments, threshold values of 0.0011 ft/s² have been reported; values up to 0.4 ft/s² have been reported based on airborne studies in the USSR. The same studies suggest that the threshold is not a linear acceleration but rather a jerk motion (the third time derivative of position), with a reported threshold value on the order of 0.1 ft/s³. These findings are supported by early studies showing that human movement kinematics is represented by characteristics of jerk profiles.[16]
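The jerk threshold above can be illustrated by numerically differentiating a sampled acceleration profile and comparing the peak jerk to the reported ~0.1 ft/s³ figure. The ramp profile below is invented for the sketch, not measured data:

```python
# Estimate peak jerk (da/dt) of a sampled acceleration profile and compare
# it against the ~0.1 ft/s^3 perception threshold reported above.
dt = 0.1  # s between samples
accel = [0.0, 0.005, 0.02, 0.045, 0.08, 0.125]  # ft/s^2, illustrative ramp

# Finite-difference jerk between consecutive samples.
jerk = [(a2 - a1) / dt for a1, a2 in zip(accel, accel[1:])]
peak = max(abs(j) for j in jerk)

print(f"peak jerk = {peak:.3f} ft/s^3",
      "-> perceptible" if peak >= 0.1 else "-> below threshold")
```

For this profile the peak jerk exceeds the threshold, so the onset of acceleration would be felt even though the accelerations themselves are small.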

Rotational movements

Unfortunately, there is no easy way of cheating for rotations. Hence, many motion simulations try to avoid the problem by avoiding quick and large rotations altogether. The only convincing way of simulating larger turns is an initial yaw rotation above threshold and a back-motion below threshold. For roll and pitch, the static (otolithic) cues cannot be modified easily due to the ambiguity of linear accelerations and changes in gravitational direction. In real life, the ambiguity is resolved by using the dynamical properties of the vestibular and other sensory signals (most importantly, vision).

Angular accelerations[15]

Angular accelerations are detected by semicircular canals while linear accelerations are detected by another structure in the inner ear called the otolith.

The three semicircular canals are mutually orthogonal (similar to a three-axis accelerometer) and are filled with a fluid called the endolymph. In each canal, there is a section where the diameter is larger than the rest of the canal. This section is called the ampulla and is sealed by a flap called the cupula. Angular accelerations are detected as follows: an angular acceleration causes the fluid in the canals to move, deflecting the cupula. The nerves in the cupula report the motion to both the brain and the oculomotor muscles, stabilizing eye movements. A transfer function model between the perceived angular displacement \hat{\theta}(s) and the actual angular displacement \theta(s) is:

\frac{\hat{\theta}(s)}{\theta(s)} = \frac{\tau_1 \tau_2 s^2}{(\tau_1 s + 1)(\tau_2 s + 1)}

where \tau_1 and \tau_2 are the time constants of the cupula dynamics.

A second-order model of the angle of the cupula \theta is given by

\ddot{\theta} + 2\zeta\omega_n\dot{\theta} + \omega_n^2\theta = u(t)

where \zeta is the damping ratio, \omega_n is the natural frequency of the cupula, and u(t) is the input angular acceleration. Values of \zeta have been reported to be between 3.6 and 6.7, while values of \omega_n have been reported to be between 0.75 and 1.9. Thus, the system is overdamped with distinct, real roots. The shorter time constant is 0.1 seconds, while the longer time constant depends on the axis about which the test subject is accelerating (roll, pitch, or yaw). The longer time constants are one to two orders of magnitude greater than the shorter time constant.
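The two time constants of the overdamped cupula model follow directly from the characteristic roots of the second-order system. A short sketch, with the damping ratio and natural frequency chosen as illustrative mid-range values from the ranges quoted above:

```python
import math

def cupula_time_constants(zeta, omega_n):
    """Time constants (s) of the overdamped second-order cupula model
    theta'' + 2*zeta*omega_n*theta' + omega_n**2 * theta = u(t).
    For zeta > 1 the characteristic roots are real and distinct;
    each time constant is -1/root."""
    disc = math.sqrt(zeta**2 - 1.0)
    slow_root = -omega_n * (zeta - disc)   # root near the origin
    fast_root = -omega_n * (zeta + disc)   # root far from the origin
    return -1.0 / fast_root, -1.0 / slow_root  # (short, long)

# Illustrative choice within the reported ranges (zeta 3.6-6.7, omega_n 0.75-1.9).
short, long_ = cupula_time_constants(zeta=4.0, omega_n=1.25)
print(f"short time constant ~ {short:.2f} s, long ~ {long_:.1f} s")
```

For these values the short time constant comes out near 0.1 s and the long one near 6 s, consistent with the figures quoted in the text.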

Experiments have shown that angular accelerations below a certain level cannot be detected by a human test subject. Threshold values for pitch and roll accelerations in a flight simulator have been reported.

Implications

The above studies indicate that the pilot's vestibular system detects accelerations before the aircraft instruments display them. This can be considered an inner control loop in which the pilot responds to accelerations that occur in full-motion simulators and aircraft, but not in fixed simulators. This effect shows that there is a potential negative training transfer when transitioning from a fixed-base simulator to an aircraft, and indicates the need for motion systems in pilot training.

It is physically impossible to precisely simulate large scale egomotion in the limited space of a laboratory. There is simply no way around the physics. However, by exploiting some of the imperfections of the body’s sensory and perceptual systems, it is possible to create an environment in which the body perceives motion without actually moving the subject more than a few feet in any one direction. This is where the tricky business of motion simulation begins.

The standard approach to simulating motion (so called motion cueing) is to simulate the “relevant” cues as closely as possible which trigger motion perception. These cues can be visual, auditory, or somatosensory in nature. Visual and auditory cues enable humans to perceive their location in space on an absolute scale, whereas somatosensory cues (mainly proprioception and other signals from the vestibular system) provide only relative feedback. Fortunately for us, humans cannot perceive velocity and acceleration directly without some form of error or uncertainty.

For example, consider riding in a car traveling at some arbitrary constant speed. In this situation, our sense of sight and sound provide the only cues (excluding engine vibration) that the car is moving; no other forces act on the passengers of the car except for gravity. Next, consider the same example of a car moving at constant speed except this time, all passengers of the car are blindfolded. If the driver were to step on the gas, the car would accelerate forward, pressing each passenger back into their seat. In this situation, each passenger would perceive the increase in speed by sensing the additional pressure from the seat cushion. However, if the car were traveling in reverse and the driver stepped on the brake pedal instead of the gas, the deceleration of the vehicle would create the same feeling of increased pressure from the seat cushion as in the case of acceleration, and the passengers would be unable to distinguish which direction the vehicle is actually moving.


Implementation using washout filters

Washout filters are an important aspect of the implementation of motion platforms as they allow motion systems, with their limited range of motion, to simulate the full range of dynamics of the vehicle being simulated. Since the human vestibular system automatically re-centers itself during steady motions, washout filters are used to suppress unnecessary low-frequency signals while returning the simulator back to a neutral position at accelerations below the threshold of human perception. For example, a pilot in a motion simulator may execute a steady, level turn for an extended period of time, which would require the system to stay at the associated bank angle; a washout filter allows the system to slowly move back to an equilibrium position at a rate below the threshold the pilot can detect. This allows the higher-level dynamics of the computed vehicle to provide realistic cues for human perception, while remaining within the limitations of the simulator.[17][18]

Three common types of washout filters include classical, adaptive and optimal washout filters. The classical washout filter comprises linear low-pass and high-pass filters. The signal into the filter is split into translational and rotational signals. High-pass filters are used for simulating transient translational and rotational accelerations, while the low-pass filters are used to simulate sustained accelerations.[19] The adaptive washout filter uses the classical washout filter scheme, but adds a self-tuning mechanism not featured in the classical washout filter. Finally, the optimal washout filter takes into account models of the vestibular system.[18]

Classical Control Representation

The classical washout filter is simply a combination of high-pass and low-pass filters; thus, the filter is comparatively easy to implement. However, the parameters of these filters have to be empirically determined. The inputs to the classical washout filter are the vehicle's specific forces and angular rates, both expressed in the vehicle-body-fixed frame. Because sustained low-frequency forces would drive the motion base beyond its limits, the force is high-pass filtered, yielding the simulator translations. Much the same operation is done for angular rate.

To identify the tilt of the motion platform, the tilt mechanism first supplies the low-frequency component of the force for the rotation calculation; this low-frequency component is then used to orient the gravity vector g of the simulator platform so that sustained specific forces are reproduced by tilt.

Typically, to find position, the low-pass filter (in a continuous-time setting) is represented in the s-domain by a second-order transfer function of the form:

LP(s) = \frac{\omega_n^2}{s^2 + 2\zeta\omega_n s + \omega_n^2}

The inputs to the high-pass filter are the force inputs f expressed in the body-fixed frame, as described above. The high-pass filter may then be represented according to (for example) the following series:

HP(s) = \frac{s^2}{s^2 + 2\zeta\omega_n s + \omega_n^2} \cdot \frac{s}{s + \omega_b} \cdot \frac{1}{s} \cdot \frac{1}{s}

The two integrators in this series represent the integration of acceleration into velocity, and of velocity into position, respectively. \zeta, \omega_n, and \omega_b represent the filter parameters. It is evident that the output of the filter will vanish in steady state, preserving the location of the open-loop equilibrium points. This means that while transient inputs will be "passed", steady-state inputs will not, thus fulfilling the requirements of the filter.[20]
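The washout behavior can be illustrated with a deliberately simplified sketch: a single discrete-time first-order high-pass filter applied to a sustained acceleration command. This is not the full classical filter (no integrators, no tilt channel), and the break frequency is an arbitrary illustrative value:

```python
# Minimal washout sketch: a first-order high-pass filter on an acceleration
# command. A step (sustained) input is passed transiently, then decays
# toward zero -- the "washout".
dt = 0.01        # s, sample period (100 Hz)
omega_b = 0.5    # rad/s, break (washout) frequency -- illustrative
alpha = 1.0 / (1.0 + omega_b * dt)  # discrete high-pass coefficient

a_in_prev = 0.0
y = 0.0
samples = []
for k in range(1000):  # 10 s of a sustained 1 m/s^2 step input
    a_in = 1.0
    # Standard first-order high-pass recurrence:
    y = alpha * (y + a_in - a_in_prev)
    a_in_prev = a_in
    samples.append(y)

print(f"t=0.01s: {samples[0]:.3f}   t=1s: {samples[99]:.3f}   "
      f"t=10s: {samples[-1]:.4f}")
```

The onset of the step is passed almost unattenuated, then the output decays exponentially toward zero, so a sustained input produces no sustained platform command.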

The present practice for empirically determining the washout filter parameters is a trial-and-error subjective tuning process whereby a skilled evaluation pilot flies predetermined maneuvers. After each flight, the pilot's impression of the motion is communicated to a washout filter expert, who then adjusts the washout filter coefficients in an attempt to satisfy the pilot. Researchers have also proposed formalizing this tuning paradigm and capturing it with an expert system.[21]

Nonlinear Washout Filter

This washout filter can be regarded as a combination of an adaptive and an optimal washout filter. A nonlinear approach is desired to further maximize the available motion cues within the hardware limitations of the motion system, resulting in a more realistic experience. For example, the algorithm described by Daniel and Augusto computes a gain, α, as a function of the system states; thus, the washout is time-varying. The α gain increases as the platform states increase in magnitude, making room for a faster control action that quickly washes out the platform to its original position. The opposite occurs when the magnitude of the platform states is small or decreasing, prolonging the motion cues, which are then sustained for longer durations.[22]

Likewise, the work of Telban and Cardullo added an integrated perception model that includes both visual and vestibular sensation to optimize the human's perception of motion. This model was shown to improve pilots' responses to motion cues.[23]

Adaptive Washout Filter

This adaptive approach was developed at NASA Langley. It consists of a combination of empirically determined filters in which several of the coefficients are varied in a prescribed manner in order to minimize a set objective (cost) function. In a study conducted at the University of Toronto, the coordinated adaptive filter provided the "most favorable pilot ratings" compared with the other two types of washout filters. The benefits of this style of washout filter can be summarized in two major points. First, the adaptive characteristics give more realistic motion cues when the simulator is near its neutral position, and the motion is only reduced at the limits of the motion system's capabilities, allowing for better use of those capabilities. Second, the cost function or objective function (by which the washout filter is optimized) is very flexible, and various terms may be added in order to incorporate higher-fidelity models. This allows for an expandable system that is capable of changing over time, resulting in a system that responds in the most accurate way throughout the simulated flight. The disadvantages are that the behavior is difficult to adjust, primarily due to the cross-fed channels, and that execution time is relatively high due to the large number of derivative function calls required. In addition, as more complex cost functions are introduced, the corresponding computing time required will increase.[24]

Limitations

Although washout filters do provide great utility for allowing the simulation of a wider range of conditions than the physical capabilities of a motion platform, there are limitations to their performance and practicality in simulation applications. Washout filters take advantage of the limitations of human sensing to create the appearance of a larger simulation environment than actually exists. For example, a pilot in a motion simulator may execute a steady, level turn for an extended period of time, which would require the system to stay at the associated bank angle. In this situation, a washout filter allows the system to slowly move back to an equilibrium position at a rate below the threshold the pilot can detect. The benefit is that the motion system then has a greater range of motion available for when the pilot executes the next maneuver.

Such behavior is easily applied in the context of aircraft simulation with very predictable and gradual maneuvers (such as commercial aircraft or larger transports). However, these slow, smooth dynamics do not exist in all practical simulation environments, diminishing the returns of washout filters and a motion system. Take the training of fighter pilots, for example: while the steady, cruise regime of a fighter aircraft may be well simulated within these limitations, in aerial combat situations flight maneuvers are executed very rapidly to physical extremes. In these scenarios, there is no time for a washout filter to react and bring the motion system back toward equilibrium, so the motion system quickly hits its range-of-movement limitations and effectively ceases to accurately simulate the dynamics. It is for this reason that motion- and washout-filter-based systems are often reserved for applications that experience a limited range of flight conditions.

The filters themselves may also introduce false cues, defined as: 1) a motion cue in the simulator that is in the opposite direction to that in the aircraft, 2) a motion cue in the simulator when none was expected in the aircraft, and 3) a relatively high-frequency distortion of a sustained cue in the simulator for an expected sustained cue in the aircraft. This definition groups together all of the cueing errors that lead to very large decreases in perceived motion fidelity. Six potential sources of such false cues have been identified.[21]

Impact

Impact of motion in simulation and gaming[4][11]

The use of physical motion in flight simulators has been a debated and researched topic. The Engineering department at the University of Victoria conducted a series of tests in the 1980s to quantify the perceptions of airline pilots in flight simulation and the impact of motion on the simulation environment. In the end, it was found that there was a definite positive effect on how the pilots perceived the simulation environment when motion was present, and there was almost unanimous dislike for the simulation environment that lacked motion.[25] A conclusion that could be drawn from the findings of the Response of Airline Pilots study is that the perceived realism of the simulation is directly related to the accuracy of the motion presented to the pilot. When applied to video gaming and evaluated within our own gaming experiences, realism can be directly related to the enjoyment of a game by the game player. In other words, motion-enabled gaming is more realistic, and thus more immersive and more stimulating. However, there are adverse effects to the use of motion in simulation that can detract from the primary purpose of using the simulator in the first place, such as motion sickness. For instance, there have been reports of military pilots disturbing their vestibular systems by moving their heads around in the simulator, as they would in an actual aircraft, to maintain their sensitivity to accelerations. Due to the limits on simulator acceleration, this habit becomes detrimental when transitioning back to a real aircraft.

Adverse effects (simulator sickness)

Motion or simulator sickness: simulators work by "tricking" the mind into believing that the visual, vestibular, and proprioceptive inputs it receives correspond to a particular desired motion. When any of the cues received by the brain do not correlate with the others, motion sickness can occur. In principle, simulator sickness is simply a form of motion sickness resulting from discrepancies among the cues from these three sensory sources. For example, when riding in a windowless cabin on a ship, the vestibular system signals that the body is accelerating and rotating in various directions, but the visual system sees no motion because the room moves in the same manner as the occupant. In this situation, many people would feel motion sickness.

Along with simulator sickness, additional symptoms have been observed after exposure to motion simulation. These include feelings of warmth, pallor and sweating, depression and apathy, headache and fullness of the head, drowsiness and fatigue, difficulty focusing the eyes, eye strain, blurred vision, burping, difficulty concentrating, and visual flashbacks. These symptoms have sometimes been observed to linger for a day or two after exposure to the motion simulator.

Contributing factors to simulator sickness

Several factors contribute to simulator sickness; they can be categorized into human variables, simulator usage, and equipment. Common human-variable factors include susceptibility, flight hours, fitness, and medication or drugs. An individual's susceptibility to motion sickness is a dominant contributing factor. Increasing flight hours is also an issue for pilots, as they become more accustomed to actual motion in a vehicle. Factors related to simulator usage include adaptation, distorted or complicated scene content, longer simulation sessions, and freeze/reset. Freeze/reset refers to the starting or ending points of a simulation, which should be as close to steady, level conditions as possible; if a simulation ends in the middle of an extreme maneuver, the subject's vestibular system is likely to be disturbed. Equipment factors that contribute to motion sickness include the quality of the motion system, the quality of the visual system, off-axis viewing, poorly aligned optics, flicker, and delay or mismatch between the visual and motion systems. The delay/mismatch issue has historically been a concern in simulator technology: time lag between pilot input and the responses of the visual and motion systems can cause confusion and generally decreases simulator performance.
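The visual–motion delay described above can be measured offline by cross-correlating logged signals. The sketch below is illustrative only (the signal names, the 1 kHz sample rate, and the 80 ms transport delay are hypothetical, not from the source): it recovers the lag of a delayed copy of a motion command from the peak of the cross-correlation:

```python
import numpy as np

def estimate_lag(reference, delayed, dt):
    """Estimate how far `delayed` lags `reference`, in seconds, from the
    peak of their cross-correlation; positive means `delayed` is late."""
    ref = reference - reference.mean()
    dly = delayed - delayed.mean()
    corr = np.correlate(dly, ref, mode="full")
    lag_samples = int(np.argmax(corr)) - (len(ref) - 1)
    return lag_samples * dt

# Hypothetical logs sampled at 1 kHz: a broadband motion command, and a
# visual-system response that arrives 80 ms later (transport delay).
rng = np.random.default_rng(0)
dt = 0.001
motion_cmd = rng.standard_normal(5000)
delay_samples = 80  # assumed 80 ms delay, for illustration
visual_resp = np.concatenate([np.zeros(delay_samples),
                              motion_cmd[:-delay_samples]])

print(round(estimate_lag(motion_cmd, visual_resp, dt), 3))  # → 0.08
```

A broadband test signal gives a sharp correlation peak; with a narrowband signal (e.g. a slow sinusoid) the peak is ambiguous and edge effects bias the estimate, which is one reason latency measurement in real simulators requires care.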

Debate over performance enhancement from motion simulators

In theory, the concept of motion simulators seems self-explanatory: if the perception of events can be mimicked exactly, the user will have an identical experience. In practice, this ideal is next to impossible to achieve. Although vehicle motion can be simulated in all six degrees of freedom (all that should be required to mimic motion), simulated motion often leaves pilots, and operators in many other fields, with a multitude of adverse side effects not seen in real motion. Furthermore, many scenarios are difficult to reproduce in training simulators, raising the concern that replacing real-world exposure with motion simulation may be inadequate.

Because of the exorbitant cost of adding motion to simulators, military programs have established research units to investigate its impact on skill acquisition. These units have produced results as recently as 2006, despite the use of motion simulators over the last century. One Army study determined that "motion-based simulators are recommended for training when individuals must continue to perform skill-based tasks…while the ground vehicle negotiates rough terrain."[26] However, if individuals are not required to negotiate rough terrain, or motion sickness does not detract from performance in the field, then "motion is not recommended."[26]

The existence of adverse side effects of virtual environments has spawned a plethora of studies from predicting and measuring the impact of the side effects to identifying their specific causes.[27]

Advantages and disadvantages of simulation in training

Advantages

Disadvantages

See also


References

  1. "Motion Platforms or Motion Seats?" (PDF). Phillip Denne, Transforce Developments Ltd. 2004-09-01.
  2. "Motion Systems and Visual Displays" (PDF). Phillip Denne. 1994-01-12.
  3. Scanlon, Charles H. (December 1987). "Effect of Motion Cues During Complex Curved Approach and Landing Tasks" (PDF). NASA: 6–9. Retrieved 2009-07-19.
  4. "SimCraft :: Military Grade Full Motion Simulators for SimRacing and FlightSim". SimCraft Corporation. 2006-06-12.
  5. Rollings, Andrew; Ernest Adams (2003). Andrew Rollings and Ernest Adams on Game Design. New Riders Publishing. pp. 395–415. ISBN 1-59273-001-9.
  6. Page, Ray L. "Brief History of Flight Simulation." In SimTechT 2000 Proceedings. Sydney: The SimTechT 2000 Organizing and Technical Committee, 2000.
  7. Nicolas A. Pouliot; Clément M. Gosselin; Meyer A. Nahon (January 1998). "Motion Simulation Capabilities of Three-Degree-of-Freedom Flight Simulators". Journal of Aircraft. 35 (1): 9–17. doi:10.2514/2.2283.
  8. "XSimulator DIY Motion Simulator Community". xsimulator.net. 2013-09-24.
  9. http://www.nasm.si.edu/visit/concessions/simulators.cfm
  10. http://pulseworks.com/I-360.html
  11. "Motion Platforms". Moorabbin Flying Services. 2006-06-12.
  12. Barnett-Cowan, M.; Harris, L. R. (2009). "Perceived timing of vestibular stimulation relative to touch, light and sound". Experimental Brain Research. 198: 221–231. doi:10.1007/s00221-009-1779-4.
  13. Grant, P.; Lee, P.T.S. (2007). "Motion–visual phase-error detection in a flight simulator". Journal of Aircraft. 44: 927–935. doi:10.2514/1.25807.
  14. Markus von der Heyde; Bernhard E. Riecke (December 2001). "How to Cheat in Motion Simulation – Comparing the Engineering and Fun Ride Approach to Motion Cueing". CiteSeerX 10.1.1.8.9350.
  15. Allerton, D. (2009). Principles of Flight Simulation. John Wiley & Sons, Ltd.
  16. Flash, Tamar; Hogan, Neville (1985). "The coordination of arm movements: an experimentally confirmed mathematical model". The Journal of Neuroscience. 5: 1688–1703.
  17. Chen, S.H.; Fu, L.D. (2010). "An optimal washout filter design for a motion platform with senseless and angular scaling maneuvers". Proceedings of the American Control Conference: 4295–4300.
  18. Grant, P.R.; Reid, L.D. (1997). "Motion washout filter tuning: Rules and requirements". Journal of Aircraft. 34 (2): 145–151. doi:10.2514/2.2158.
  19. Springer, K.; Gattringer, H.; Bremer, H. (2011). "Towards Washout Filter Concepts for Motion Simulators on the Base of a Stewart Platform". PAMM. 11 (1): 955–956. doi:10.1002/pamm.201110448.
  20. R. Graf; R. Dillmann (1997). "Active acceleration compensation using a Stewart platform on a mobile robot". Proc. 2nd Euromicro Workshop on Advanced Mobile Robots, Brescia, Italy: 59–64.
  21. Grant, P.R.; Reid, L.D. (1997). "PROTEST: An Expert System for Tuning Simulator Washout Filters". Journal of Aircraft. 34 (2): 145–151.
  22. Daniel, B. "Motion Cueing in the Chalmers Driving Simulator: An Optimization-Based Control Approach" (PDF). Chalmers University. Retrieved 14 April 2014.
  23. Telban, R.J. (May 2005). Motion Cueing Algorithm Development: Human-Centered Linear and Nonlinear Approaches (PDF). NASA Contractor Report CR-2005-213747.
  24. Nahon, M.A.; Reid, L.D. "Simulator motion-drive algorithms: A designer's perspective". Journal of Guidance, Control, and Dynamics. 13 (2): 356–362. doi:10.2514/3.20557.
  25. Reid, Lloyd D.; Nahon, Meyer A. (July 1988). "Response of airline pilots to variations in flight simulator motion algorithms". Journal of Aircraft. 25 (7): 639–646. doi:10.2514/3.45635.
  26. "Effects of Motion on Skill Acquisition in Future Simulators" (PDF). DTIC.
  27. McGee, Michael K. "Assessing Negative Side Effects in Virtual Environments".
  28. U.S. Army Research Institute for the Behavioral and Social Sciences (April 2005). "Introduction to and Review of Simulator Sickness Research" (PDF).
This article is issued from Wikipedia (version of 11/1/2016). The text is available under the Creative Commons Attribution/Share-Alike license; additional terms may apply for the media files.