NASA’s ISS hybrid reality experience: pushing the boundary of physical and virtual reality

“Using the latest imaging technologies, we are literally able to put our crew in space while they’re still on earth.”
— Matthew Noyes, Aerospace Technologist, Robotics and Automation for NASA

NASA’s Hybrid Reality Lab was created to improve the quality, and reduce the cost and implementation time, of NASA’s astronaut training, engineering design, scientific visualisation, field analog and surface operations, and human performance studies, by developing systems that combine elements of physical and virtual reality.

What’s the philosophy behind it?

Matthew Noyes explains. “Physical and virtual realities both have their strengths and weaknesses. Physical reality allows for a high degree of accuracy in tactile feedback and aids muscle memory development when learning tools and system interfaces for the first time. But the cost and time impact of making things look ‘photorealistic’ in real life – in terms of spacecraft prototype fidelity – is very high. And physical prototypes are not very mobile, which is a challenge for a team that spans cities and nations around the world.”

By contrast, VR makes it easier to have reconfigurable, photorealistic graphics at a lower cost, supporting remote multi-user interaction. This improves collaboration across NASA centers nationwide and with foreign partners, and also allows for a mobile public outreach demonstration. However, artificial haptic feedback technology is only an approximation and cannot replicate the feeling of holding a real object in your hands.

How does the experience work?

The International Space Station (ISS) Hybrid Reality Experience was created to demonstrate this synergy. The experience lets the user roam around virtual US, European, and Japanese ISS module interiors, and seamlessly pass through the airlock to observe the station as a whole in a 1:1 scale representation.


The experience includes a 3D printed replica of the Pistol Grip Tool, a physical prop that feels real in the user’s hands while appearing in the virtual world as the genuine metal tool. It provides tactile rumble when in use and can be used to carry out a panel repair procedure in the virtual environment. Other features include a realistic earth model with rotational motion, provided by Opaque Space, makers of the VR game Earthlight, and NVIDIA Flex integration for realistic fluid motion in microgravity.
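How a physical prop and its virtual twin might be wired together can be sketched in Unreal Engine 4 C++. The class below is purely illustrative: the ‘Special_1’ motion source name, the haptic values and the routing of rumble through the hand controller are assumptions for the sketch, not NASA’s actual implementation.

```cpp
// Illustrative sketch only: a tracked physical prop whose pose drives a
// photorealistic "metal" mesh in VR. Class name and values are hypothetical.
#include "GameFramework/Actor.h"
#include "GameFramework/PlayerController.h"
#include "Components/StaticMeshComponent.h"
#include "MotionControllerComponent.h"
#include "InputCoreTypes.h"
#include "TrackedToolReplica.generated.h"

UCLASS()
class ATrackedToolReplica : public AActor
{
    GENERATED_BODY()

public:
    ATrackedToolReplica()
    {
        // Follows the pose reported by the tracking puck fixed to the 3D printed replica.
        Tracker = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("Tracker"));
        Tracker->MotionSource = FName(TEXT("Special_1")); // assumed Vive tracker source name
        RootComponent = Tracker;

        // Virtual counterpart rendered at the same pose: the metal Pistol Grip Tool mesh.
        ToolMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("ToolMesh"));
        ToolMesh->SetupAttachment(Tracker);
    }

    // Called while the tool is actuated during the panel repair task to give tactile rumble.
    // Frequency and amplitude are placeholder values; rumble is routed through the
    // hand-controller haptics here as a stand-in.
    void StartRumble(APlayerController* PC)
    {
        if (PC)
        {
            PC->SetHapticsByValue(/*Frequency=*/0.8f, /*Amplitude=*/0.6f, EControllerHand::Right);
        }
    }

    UPROPERTY(VisibleAnywhere)
    UMotionControllerComponent* Tracker;

    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* ToolMesh;
};
```

The key idea is that a single tracked pose drives both the object the user is holding and the photorealistic mesh they see, so touch and vision stay in agreement.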

Matthew Noyes explains how NASA uses Unreal Engine to train astronauts.

What are the project objectives?

In the short term, the experience is designed to demonstrate ways in which physical and virtual reality can work together to achieve greater immersion than they could separately. It also shows how NASA can work with the electronic entertainment industry to leverage advances in VR hardware, GPU technology and rendering engines to reduce cost and schedule impact, while increasing simulation quality. In the UK, NASA has discussed the use of its technology with companies that include McLaren in the automotive sector and Epic Games in immersive entertainment.

What technology was used?

NVIDIA GeForce GPU technology was used to accelerate rendering and physics calculations, with NVIDIA Flex providing soft-body simulation. The experience was built in Unreal Engine 4. Tracked objects were produced by additive manufacturing (3D printing), and are currently tracked either with standalone HTC Lighthouse tracking pucks or by integrating tracking photodiodes directly into the objects themselves. Blender and Maya were used for 3D modelling and animation, Substance Painter for texturing, and Artec scanners for photogrammetry.

“At a fundamental level, training astronauts to explore space is kind of like creating a game. We immerse the user in a fabricated 3D environment and have them complete objectives under various constraints.”
— Matthew Noyes

What was the timeframe for completion?

The initial technology demonstration of an interactive International Space Station environment was completed in four months. It continues to be expanded on as a testing ground for new consumer VR and AR technology which may be deployed in other NASA systems.

What challenges were encountered?

CAD models have a high polygon density and no textures, which means they do not look realistic when rendered directly. To overcome this, custom scripts were written to de-feature the models in some cases, while hand retopology was used in others. Custom textures were then painted on and, where possible, models were created by 3D scanning the real hardware.

Another challenge arose from NASA’s position as a government agency working with a wide variety of contractors. “As a government agency we work with a variety of contractors who maintain control over models used by the federal government,” says Matthew. “To address this we created a modding framework similar in concept to how many games work. This allows third-party content to be added as ‘mods’ which can be seamlessly removed later, without breaking the codebase, when sending the ‘base’ project to new collaborators.” He adds that this was considerably aided by the fact that Unreal Engine includes an easy-to-use plugin architecture.
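A rough sketch of how such a ‘mod’ might be packaged, assuming a hypothetical contractor plugin whose module the base project never references directly (the module name and log messages are illustrative, not NASA’s framework):

```cpp
// Illustrative sketch: contractor-controlled content packaged as an Unreal Engine
// plugin module. The base project does not depend on this module, so deleting the
// plugin leaves the core codebase intact.
#include "Modules/ModuleManager.h"

class FContractorContentMod : public IModuleInterface
{
public:
    virtual void StartupModule() override
    {
        // Register this mod's content (e.g. a contractor-supplied ISS rack model)
        // with whatever registry the base project exposes.
        UE_LOG(LogTemp, Log, TEXT("Contractor content mod loaded"));
    }

    virtual void ShutdownModule() override
    {
        UE_LOG(LogTemp, Log, TEXT("Contractor content mod unloaded"));
    }
};

IMPLEMENT_MODULE(FContractorContentMod, ContractorContentMod)
```

Because Unreal Engine discovers plugins through their descriptor files, removing the plugin folder strips out the contractor-controlled content without any changes to the base project’s code.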

How has the experience been received within NASA?

“The ISS Hybrid Reality experience has been an overwhelming success,” Matthew confirms. “Many astronauts who have tried the system have been extremely complimentary about the realism it provides – the photorealistic visuals, the way objects move in the microgravity environment, the tracking fluidity, the lack of simulator sickness and the intuitiveness of using 3D printed replicas in digital space. It has been particularly well received by those who flew on the ISS during its construction, as it gives them a unique opportunity to explore the final configuration – the culmination of their collective achievement – as if they were back aboard.”

In fact, its users have commented that the VR prototypes look more realistic than many of the physical prototypes, because the concessions made when assembling physical mock-ups – to speed up the build and stay within cost constraints – did not have to be made for the VR experience.


What is the current status of AR/VR/MR technology within this sector?

AR/VR/MR technology has come a long way in the last 20 years, says Matthew. It’s finally at the point where the benefits have advanced so greatly, and the former negatives – cost and motion sickness – have diminished so much, that it’s viable for mainstream applications.

“There are of course some things that can still be improved for enterprise use, like screen resolution, field of view, tracking volume, and so on, but those advancements are coming so quickly, it’s worth investing in content for it now. Microsoft HoloLens is actively being used for engineering model evaluation and driving the Mars rovers at the Jet Propulsion Laboratory. HTC Vive is already being used to train astronauts at JSC, for maintenance training at Kennedy Space Center, and with LIDAR visualisations at Goddard Space Flight Center, so it has already proven itself for engineering and scientific applications.”

What’s happening in this area right now?

Matthew explains that this project has directly led to a number of advancements in other projects with real-world applications in spaceflight.

“Hybrid reality is being applied to evaluate architectural designs for next-generation space habitats capable of deployment around or on the Moon or Mars. We visualise 3D CAD models in virtual reality with photorealistic textures, and gather feedback from evaluators as they are immersed in the experience.”

Geologists are also using hybrid reality to create hybrid field analogs for astronaut training. Field analogs are remote locations on earth that simulate the environment and/or harsh living conditions of deep space, such as the middle of the desert or under the ocean. Geologists are collecting Light Detection and Ranging (LIDAR) data from a Hawaiian lava flow, which will be integrated into one of the most physically accurate “open world environments” ever created for VR. This will allow astronauts to practise collecting and processing rock samples virtually, without having to travel to the remote field location.

Human performance researchers are also using an optical animation trick in a VR headset to simulate the disorientation astronauts experience when they transition between gravitational environments. “By translating a checkerboard pattern across the user’s field of view and coupling its motion with the rotation of the user’s head, we can induce a perturbation of balance similar in nature to what astronauts feel when they try to walk again after returning from space,” explains Matthew.
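A minimal sketch of how that coupling might be driven in Unreal Engine 4, assuming a full-view checkerboard mesh whose material exposes a hypothetical ‘UOffset’ panning parameter; the class name, parameter name and gain value are illustrative, not the researchers’ implementation:

```cpp
// Illustrative sketch: translate a checkerboard material in proportion to how fast
// the user turns their head, producing the visual-vestibular mismatch described above.
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "HeadMountedDisplayFunctionLibrary.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "CheckerboardPerturbation.generated.h"

UCLASS()
class ACheckerboardPerturbation : public AActor
{
    GENERATED_BODY()

public:
    ACheckerboardPerturbation()
    {
        PrimaryActorTick.bCanEverTick = true;
        CheckerboardMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("CheckerboardMesh"));
        RootComponent = CheckerboardMesh;
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // Dynamic instance of the checkerboard material; assumed to expose a scalar
        // parameter named "UOffset" that pans the texture horizontally.
        CheckerMID = CheckerboardMesh->CreateAndSetMaterialInstanceDynamic(0);
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        FRotator HeadRotation;
        FVector HeadPosition;
        UHeadMountedDisplayFunctionLibrary::GetOrientationAndPosition(HeadRotation, HeadPosition);

        // How far the head has yawed since the last frame, wrapped to [-180, 180] degrees.
        const float YawDelta = FMath::FindDeltaAngleDegrees(PreviousYaw, HeadRotation.Yaw);
        PreviousYaw = HeadRotation.Yaw;

        // Couple the checkerboard's horizontal translation to the head rotation.
        UVOffset += YawDelta * CouplingGain;
        if (CheckerMID)
        {
            CheckerMID->SetScalarParameterValue(TEXT("UOffset"), UVOffset);
        }
    }

    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* CheckerboardMesh = nullptr;

    UPROPERTY(EditAnywhere)
    float CouplingGain = 0.01f; // placeholder: strength of the induced mismatch

private:
    UPROPERTY()
    UMaterialInstanceDynamic* CheckerMID = nullptr;

    float PreviousYaw = 0.f;
    float UVOffset = 0.f;
};
```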

The goal was to see if it would be possible to create a VR system that could train astronauts how to walk properly before they land, to increase safety when exiting the vehicle. Surprisingly, just 10 minutes of exposure produced a 209% increase in walking performance. The researchers now want to expand this VR experiment into a full hybrid reality obstacle course for performance evaluation.

What does the future look like for VR in space?

In the future, VR/MR/AR will follow a standard progression – in that order – across a wide variety of projects in spaceflight. “VR will be used to refine early engineering prototypes, prototype early AR interfaces, familiarise astronauts with procedures, visualise scientific data in new ways, provide new methods of public outreach, and so on,” says Matthew.

Meanwhile, MR/HR will be used for more realistic and in-depth engineering prototype analysis, high-fidelity mission training, virtual field analogs, and human performance studies where proprioception matters.

AR will be deployed in actual human spaceflight missions, helping astronauts conduct procedures with virtual task lists, providing mission control with an astronaut’s first-person view, identifying important elements in the environment such as soil samples, and supporting telerobotics and the piloting of spacecraft.

“Our ISS Hybrid Reality Project represents the beginning of an effort to create simulations so realistic they are virtually indistinguishable from physical reality. By closely matching the training environment to the operational environment, we can leverage context-dependent memory recall to improve the reaction time of astronauts.

“Highly realistic training will also more accurately elicit emotional reactions to stress, providing deeper insight into the psychological impact of long-duration spaceflight and the viability of mission architectures. By measuring these responses with objective tools like brain wave scans, heart rate monitoring and galvanic skin response, training can be tailored in real time by advanced AI to suitably stress-test astronauts where they need it most, to ensure mission success.”

Words by Bernadette Fallon

