Body-Based Technology Alleviates VR Motion Sickness

Aiming to be more effective than Antivert motion-sickness pills, Alka-Seltzer, or Pepto-Bismol, MONKEYmedia offers its patented body-based navigation solution (BodyNav) for hands-free virtual reality interactions. BodyNav leverages the existing on-board sensors of smartphones and advanced 3D headsets in novel ways to engage the body's innate sense of balance. This human-centered interaction approach reduces motion-sickness symptoms and enhances navigation abilities in virtual and augmented reality (VR/AR), as well as first-person view (FPV) drone aviation contexts.


Motion Sickness


Motion sickness has long been a complaint among virtual reality gamers and drone pilots. Traditional stereoscopic headset interfaces use multiple sensor axes (e.g., rotate left/right, pivot up/down, tip left/right) to establish viewer orientation, while requiring handheld controllers (e.g., joysticks, gamepads, keyboards) for locomotion. Visually "moving" through space while in a sedentary posture creates sensory conflicts that can cause dizziness and nausea in the viewer. Oculus' former Chief Scientist goes so far as to call hand controllers "sickness generators." Addressing this problem, MONKEYmedia's patented, hands-free BodyNav technology creates more intuitive virtual interactions by remapping control axes so that natural body movement accomplishes both orientation and locomotion. This provides the organic equilibrium needed to circumvent those sensory conflicts.


How BodyNav Works


Without any custom hardware, BodyNav uses distinct sensor axes for independent functions to maintain equilibrium in the body's proprioceptive system. Viewers simply lean, using either their head or torso, to move themselves through virtual spaces, or to move their drones through remote physical spaces. This allows the proprioceptors (the internal sensory receptors that register the body's position and movement) to engage properly with virtual or remote content, synchronizing the visual and vestibular senses and reducing the factors that induce motion sickness.
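To make the idea concrete, the lean-to-locomotion mapping described above can be sketched in a few lines. The following is a minimal illustration only, not MONKEYmedia's implementation: it assumes the headset or phone exposes pitch and roll tilt in degrees, and the function names and tuning thresholds (dead zone, maximum lean, maximum speed) are hypothetical.

```python
import math

# Illustrative tuning constants; real values would be product-specific.
DEAD_ZONE_DEG = 5.0    # ignore small postural sway while "standing still"
MAX_LEAN_DEG = 25.0    # lean angle that maps to full speed
MAX_SPEED = 2.0        # virtual speed, in scene units per second

def lean_to_velocity(pitch_deg: float, roll_deg: float) -> tuple[float, float]:
    """Map forward/back lean (pitch) and side lean (roll) to a planar
    velocity, leaving head yaw free for looking around independently."""
    def scale(angle_deg: float) -> float:
        # Subtract the dead zone so an upright posture yields zero motion.
        magnitude = abs(angle_deg) - DEAD_ZONE_DEG
        if magnitude <= 0:
            return 0.0
        # Clamp to full speed at MAX_LEAN_DEG and keep the lean's sign.
        magnitude = min(magnitude / (MAX_LEAN_DEG - DEAD_ZONE_DEG), 1.0)
        return math.copysign(magnitude * MAX_SPEED, angle_deg)

    return scale(pitch_deg), scale(roll_deg)
```

Because orientation (yaw) and locomotion (lean) live on separate sensor axes, looking around never moves the viewer, which is the separation the paragraph above credits with keeping the visual and vestibular senses in agreement.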


Use Cases


BodyNav can be readily adapted to modernize a user experience with just a few lines of code. For developers like Garriott, it enhances first-person gaming by freeing the hands from managing avatar movement so they can focus on other tasks. It can also amplify multi-camera performances and sporting events, control of remote vehicles, video conferencing and telepresence applications, street-view maps, augmented reality, architectural simulations, and 3D user interfaces for browsing data models, documents, and images, all while keeping users comfortable, engaged, and entertained.


For more information, visit MONKEYmedia; to learn how to incorporate BodyNav technology into a VR or drone-piloting user experience, contact [email protected].