Augmented Reality in Space

Augmented reality (AR) is an emerging technology that enhances, or augments, a user’s perception of the real world. The enhancement can be delivered through sight, sound, or touch and provides additional information about a person’s environment.

Beyond consumer and entertainment applications, AR is highly useful for improving efficiency, safety, and productivity in the workplace by providing a user with important information (e.g., sensor data, inventory information, heat mapping) that is not naturally perceivable from the environment. Many industries, including the construction, medical, manufacturing, and defense sectors, have begun to invest in and develop AR technologies that enhance a user’s ability to complete a task. Recent partnerships between HTC Vive and AECOM, developments by the US military, and projects supported by Medtronic illustrate how AR heads-up displays offer a powerful method for conveying the information needed to plan a construction site, learn a medical procedure, or successfully execute a military training exercise.

In addition to these on-Earth settings, AR has similar applications for improving operational tasks during space exploration. Astronauts stationed on the International Space Station (ISS) have already experimented with Microsoft’s HoloLens to complete tasks.
Feedback from astronauts who tested the HoloLens, along with NASA’s roadmap plans, indicates that AR could be used to (1) superimpose instructions or illustrations that guide an astronaut through maintenance repairs, and (2) enable remote visibility between an astronaut and a ground operator so they can solve an issue together in real time. The HoloLens is also being used to plan the next Mars mission. Additional projects such as AMARIS and open calls for AR solutions highlight NASA’s interest in developing AR technologies and AR’s growing ability to support the unique conditions of aeronautic and space missions.

AR technologies, however, are not all visually based or delivered through a special headset. We responded to one of NASA’s challenges and offered our own AR solution: communication through the sense of touch. Not all situations allow clear communication through sight and sound, and those situations require a method of information delivery beyond these two channels. Our technology conveys information as tactile, or haptic, feedback felt on the surface of the skin, allowing a person to receive important data even when their eyes and ears are unavailable. More than an ordinary tap or buzz, users feel unique patterns corresponding to the information that matters to them.
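To make the idea of encoding information in touch concrete, here is a minimal, hypothetical sketch of mapping alert types to distinct vibration rhythms. It is not our actual implementation; the `Pulse` and `HapticDriver` classes, the `PATTERNS` vocabulary, and the alert names are all illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List
import time

@dataclass
class Pulse:
    intensity: float   # 0.0 (off) to 1.0 (full strength)
    duration_s: float  # how long the actuator holds this intensity

# Hypothetical vocabulary: each kind of information gets its own
# recognizable rhythm, so a wearer can tell alerts apart by feel alone.
PATTERNS = {
    "low_oxygen":    [Pulse(1.0, 0.2), Pulse(0.0, 0.1)] * 3,  # three sharp taps
    "temperature":   [Pulse(0.5, 0.8), Pulse(0.0, 0.3)],      # one long, soft press
    "incoming_call": [Pulse(0.8, 0.1), Pulse(0.0, 0.1)] * 5,  # rapid flutter
}

class HapticDriver:
    """Stand-in for real actuator hardware; here it just prints the timeline."""
    def play(self, pattern: List[Pulse]) -> None:
        for pulse in pattern:
            print(f"actuator at {pulse.intensity:.1f} for {pulse.duration_s:.1f}s")
            time.sleep(pulse.duration_s)

def notify(driver: HapticDriver, alert: str) -> None:
    """Render the tactile pattern associated with an alert type."""
    driver.play(PATTERNS[alert])

if __name__ == "__main__":
    notify(HapticDriver(), "low_oxygen")
```

The point of the sketch is simply that a pattern's rhythm and intensity, rather than a single buzz, carries the identity of the message.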

Our recent participation in the 2018 NASA iTech Cycle I Forum highlighted a few NASA use cases for our technology. Though haptic feedback has been incorporated into joysticks used in training simulations, our technology expands the range of applications in which haptic feedback provides operational enhancement. Addressing the A in NASA (Aeronautics), our technology could allow pilots to receive unique tactile alerts corresponding to sensor information needed for a safe flight, without requiring their visual attention. In space, our technology could convey similar information to astronauts whose space suits limit the use of most current AR form factors, including headsets.

As AR continues to evolve, it introduces a growing list of potential applications that both improve operational workflow and enhance user experience. We’ll see where the technology goes, but perhaps at the next SpaceX launch, AR will allow viewers who aren’t present at Cape Canaveral to feel a rocket shaking the Earth as it accelerates into orbit.
