Astronauts aboard the International Space Station (ISS) aren’t space tourists. They are workers, scientists and engineers carrying out critical science missions in an intense operating environment where safety is paramount. On spacewalks they repair equipment, install new instruments and upgrade the largest spacecraft ever flown. Just like workers here on Earth, their gloves can show wear and tear, even rips and cuts, presenting potential safety concerns.
To prevent problems from arising, astronauts working for the National Aeronautics and Space Administration (NASA) must take photos of their spacesuit gloves during and after every spacewalk and transmit them down to Earth for inspection. From there, NASA analysts examine photos of the gloves, looking for any damage that could pose a hazard, and then send the results back to the astronauts on the ISS.
This process gets the job done at the ISS’s orbital altitude of about 250 miles from Earth, but things will be different when NASA once again sends people to the moon, and then to Mars, some 140 million miles from Earth.
From Mars, it will take up to 20 minutes to say “hello” to someone on Earth, and another 20 minutes for someone on Earth to say “hello” back. That means it could take a total of at least 40 minutes to determine if an astronaut’s glove checks out, which is simply too long to wait.
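Those delays follow directly from light-travel time. A quick sketch (the distances are approximate; Earth–Mars separation actually varies from roughly 35 million to 250 million miles depending on where each planet is in its orbit):

```python
SPEED_OF_LIGHT_MPS = 186_282  # speed of light in miles per second

def one_way_delay_minutes(distance_miles: float) -> float:
    """Light-travel time, in minutes, for a radio signal over the given distance."""
    return distance_miles / SPEED_OF_LIGHT_MPS / 60

# ISS at ~250 miles: effectively instantaneous (a small fraction of a second)
iss_delay = one_way_delay_minutes(250)

# Mars at the ~140-million-mile distance cited above: roughly 12.5 minutes one way
mars_delay = one_way_delay_minutes(140e6)

# At Mars's farthest (~250 million miles), one-way delay approaches ~22 minutes,
# so a full round trip can exceed 40 minutes
mars_far_delay = one_way_delay_minutes(250e6)
```

This is why the article's worst-case round-trip figure lands near 40 minutes: the one-way delay scales linearly with distance, and the maximum Earth–Mars separation is nearly double the 140-million-mile figure quoted.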
To solve this, a Microsoft team working with NASA scientists and Hewlett Packard Enterprise (HPE) engineers is developing a system that uses artificial intelligence (AI) and HPE’s Spaceborne Computer-2 to scan and analyse glove images directly on the ISS, potentially giving astronauts onboard greater autonomy with limited support from Earth.
Detecting flaws in a critical safety component
Astronaut gloves have five layers. The outer layer consists of a rubberised coating that provides grip and acts as the first layer of defense. Next comes a layer of a cut-resistant material called Vectran. The remaining three layers maintain the suit’s pressure and protect against the temperature extremes of space, which can range anywhere from minus 180 degrees Fahrenheit to 235 degrees Fahrenheit.
The outer layer is meant to stand up to a good amount of abuse, but problems can start when wear reaches the Vectran layer. After that comes the suit’s pressure bladder, which is essentially the safety layer for the astronaut.
Gloves are most vulnerable between the thumb and index finger, given how often those two digits are used to grip objects. Moreover, some areas on the ISS itself have been exposed to hazards such as micrometeorites for more than two decades. The impacts from these tiny particles have created numerous sharp edges on handrails and other structural components.
Further hazards will be encountered on the moon and Mars, where the lack of natural erosion from wind or water means rock particles are more like broken bits of glass than pebbles or sand granules here on Earth.
To create the onboard glove monitor, NASA’s team began with collections of new, undamaged gloves, and gloves that had seen wear and tear both during spacewalks and terrestrial training. They then photographed the damaged gloves and went through the images to tag specific types of wear: areas where the outer rubberised silicone layer had begun to flake off, or places where the vital Vectran layer was compromised. This was done through Azure Cognitive Services’ Custom Vision: NASA engineers opened the pictures of gloves in a web browser and clicked on examples of damage.
This data was then used to train a Microsoft Azure cloud-based AI system, and the results were compared with actual damage reports and images from NASA. Drawing on Azure’s cloud AI compute capabilities, the tool then generated a probability score for the likelihood of damage at a particular place on the glove.
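Once the model produces per-region probability scores, the downstream logic can be as simple as comparing each score against a review threshold. A minimal sketch, assuming hypothetical region names and a 0.5 cutoff (neither the region labels nor NASA’s actual review criteria are specified in the source):

```python
# Hypothetical per-region damage probabilities from the trained model
scores = {
    "thumb_index_crotch": 0.91,  # high-wear grip area between thumb and index finger
    "index_fingertip": 0.12,
    "palm": 0.05,
}

REVIEW_THRESHOLD = 0.5  # assumed cutoff; not a published NASA figure

def flag_regions(scores: dict, threshold: float = REVIEW_THRESHOLD) -> list:
    """Return glove regions whose damage probability warrants human review."""
    return sorted(region for region, p in scores.items() if p >= threshold)

flag_regions(scores)  # -> ["thumb_index_crotch"]
```

Keeping the threshold a parameter means engineers on the ground could tune how conservative the onboard flagging is without retraining the model itself.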
After a spacewalk, crew members take pictures of astronauts’ gloves while they remove their spacesuits in the airlock. These pictures are then immediately sent to HPE’s Spaceborne Computer-2 onboard the ISS, where the Glove Analyser model rapidly looks for signs of damage live in space. If any damage is detected, a message is immediately sent to Earth, highlighting areas for further review by NASA engineers.
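The flow described above, analyse onboard and downlink only what needs human eyes, can be sketched as a small decision function. This is an illustrative sketch, not the actual Spaceborne Computer-2 software; the `analyze` callable stands in for whatever damage model runs onboard, and the message format is an assumption:

```python
from typing import Callable, List

def onboard_check(
    images: List[bytes],
    analyze: Callable[[bytes], float],
    threshold: float = 0.5,
) -> dict:
    """Run the damage model on each post-spacewalk glove image onboard.

    Returns a compact summary suitable for downlink: a REVIEW status with the
    indices of flagged images if anything crosses the threshold, else OK.
    """
    flagged = [i for i, img in enumerate(images) if analyze(img) >= threshold]
    if flagged:
        return {"status": "REVIEW", "flagged_images": flagged}
    return {"status": "OK", "flagged_images": []}
```

The point of this shape is bandwidth and latency: full-resolution images only need to travel to Earth when the onboard pass has already found something worth a closer look.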
“What we demonstrated is that we can perform AI and edge processing on the ISS and analyse gloves in real time,” says Ryan Campbell, senior software engineer at Microsoft Azure Space. “Because we’re literally next to the astronaut when we’re processing, we can run our tests faster than the images can be sent to the ground.”
HPE contributes space-ready computing hardware and software
Through Microsoft’s partnership with HPE, NASA is able to test this AI technology directly on the ISS by running it on the HPE Spaceborne Computer-2, which is currently on a multi-year mission aboard the station.
The HPE Spaceborne Computer-2, an edge computing and AI-enabled system ruggedised to withstand the harsh conditions of space, can perform more than 2 trillion calculations per second (2 teraflops).
Currently, the damage-assessment tool developed by NASA, Microsoft, and HPE is in a trial stage, meaning it runs analyses on the gloves but is not used to make crucial safety decisions. Still, the technology shows great promise. The goal now is to continue this testing to demonstrate its reliability over time.
Glove program could extend to other capabilities
Although the glove program is new to the ISS, NASA sees ways to extend the technology to other areas where it could look for possible damage to other critical components such as docking hatches. Further, it’s possible that Microsoft HoloLens 2 or a successor could help astronauts rapidly visually scan for glove damage, or even eventually facilitate assisted repairs on complicated machinery.
Space is a powerful laboratory for innovation. By pushing humans and equipment to their limits, space drives engineers everywhere to expand the limits of their ingenuity and skills. For the Microsoft team, this opportunity to apply the power of AI to help keep NASA astronauts’ gloves safer serves as a first step.
“One of NASA’s missions is to explore, discover and expand knowledge for the benefit of humanity. This project hits upon all of that, and it’s just a starting point,” says Jennifer Ott, data and AI specialist at Microsoft. “Bringing cloud computing power to the ultimate edge through projects like this allows us to think about and prepare for what we can safely do next as we expect longer-range human spaceflights in the future and as we collectively begin pushing that edge further out.”