Kent State Interdisciplinary Team Selected as a Top 10 Team for NASA SUITS Challenge
There is a very good chance that technology incubated at Kent State University could play an integral role in improving NASA astronauts’ performance on upcoming space missions to the moon and Mars.
For months, an interdisciplinary team of 12 students and four faculty advisors has been working collaboratively to research and develop new assistive features for spacesuits. Featuring a helmet-based augmented reality (AR) display system, their design could help astronauts better complete extravehicular activities (EVA), such as repairing spacecraft during NASA’s Artemis Program, scheduled to launch in 2024. Their project also includes the design of a “telesuit,” a suit capable of collecting motion data and biometric data (such as heart rate and blood pressure) from the astronaut and showing a 3-D visual representation of this information on the AR helmet display.
“For exploration, it is essential that crewmembers on spacewalks are equipped with the appropriate human-autonomy enabling technologies necessary for the elevated demands of lunar surface exploration and extreme terrestrial access,” NASA officials explained on the agency’s website.
Selected as a top 10 team nationwide for the NASA Spacesuit User Interface Technologies for Students (SUITS) Challenge, Kent State’s “ATR_FLUX” team was invited to present its work at NASA’s Johnson Space Center in Houston, Texas, on April 20-24. Due to the COVID-19 pandemic, the event will be reorganized as a virtual meeting on June 11.
Read about how the team reacted to the COVID-19 crisis by fundraising and contacting international manufacturers in order to purchase medical masks that they personally delivered to Summa Health and Cleveland Clinic hospitals on April 1.
The ATR_FLUX Team
Led by Irvin Cardenas, a computer science Ph.D. student and member of the Advanced Telerobotics Research (ATR) Lab in the College of Arts and Sciences, the team will present its project titled “Coactive and Collaborative Interaction Platform for xEMU”.
The team includes undergraduate and graduate students from the Department of Computer Science, the School of Fashion, and the departments of Mechatronics Engineering, Political Science and History. Its advisors are from the Department of Computer Science and the School of Fashion. Computer Science Professor Jong-Hoon Kim, Ph.D., is the director of the ATR Lab and is serving as the lead advisor for the ATR_FLUX project.
“Having a NASA project, with all these students involved, is like a dream,” Kim said. “My students are working on cutting-edge space technology, which someday we all can use in real life. It was a great honor to get to meet the NASA astronauts who are going to be on the moon in 2024.”
To see the team members and their profiles, visit the ATR_FLUX website at www.atr.cs.kent.edu/projects/nasa-suits-challenge-2020/.
How the Helmet Display Interface Works
The team is currently designing and testing its display systems on a Magic Leap headset with “mixed reality” smart glasses, which superimpose 3D computer-generated images onto real-world environments and objects, augmenting the user’s experience of reality.
The goal is to enable immediate access to instructions, procedures, graphics, spacesuit status and health status information in the form of an audiovisual display so that astronauts can work more efficiently without constant direction from NASA’s mission control.
“If they are performing some kind of science task where they are collecting a sample, we can superimpose some information about the sample,” Cardenas said. “If they are fixing a rover on Mars, we can actually superimpose what region they should be working on or what task they should be performing.”
Conceptually, the ATR Lab is trying to combine these technologies with its approach to telepresence robotics, which focuses on how robots should interact and collaborate with humans in the future.
“Rather than just seeing robots as tools or as our servants, we want to connect with them and collaborate and work as teams,” Cardenas said. “The system that we are developing integrates that, and we want to create a synthetic agent or virtual agent that also communicates with the astronaut and provides them feedback.”
They also want to allow the astronaut to provide feedback to the agent. For example, if the agent is able to capture or recognize an object but labels it incorrectly, they want to allow the astronaut to tell the agent that the object is incorrectly classified. In essence, the agent could actually “learn.”
“If you think about what it takes to perform a task in space, you’ve got to consider that there is always a time constraint and different conditions,” Cardenas said. “So, our belief is that if we are able to allow astronauts to better understand their biosignals or allow their teammates to understand how an astronaut is performing, we can also allow them to get more feedback or give them advice on how to perform the task better.”
Caitlyn Lenhoff is a master's student working on the project’s augmented reality user interface aspects.
“I’m really interested in augmented reality and I want to see it used in more aspects of life, especially in schools,” Lenhoff said. “The NASA SUITS Challenge gives me an ability to try to implement something that I’m very interested in and passionate about. It would mean a lot to me if we can make an impact on space exploration.”
About the Telesuit and School of Fashion Collaboration
Michelle Park Kołacz, a second-year master’s student in the Fashion Industries Study program in the School of Fashion at Kent State, is one of a few students collaborating on the telesuit part of the project. Their goal is to create a proof-of-concept prototype in jacket form. Last year, she collaborated with the ATR Lab group on a different version of the telesuit, which linked a human operator’s gestures to a robot’s gestures and actions.
She recently shared some of the highlights of the aesthetic and functional aspects of the ATR_FLUX telesuit concept, including different types of fabrics that stretch and support the piping for the inserted biometric data sensors that monitor the user’s muscle groups. Along the sleeves, there are zippers that open where the polymer-based strain sensors will be placed. Because this is a wearable technology, they are also considering how to insulate the conductive thread, since pairing it with a fabric such as cotton could create more static.
All the sensors are attached to the base arm layer, a compression sleeve that emulates the pressure of kinesiology tape through a combination of strategically patterned four-way stretch and two-way stretch spandex.
“We were able to interview the Chief Planetary Scientist and astronaut Tyler "Nick" Hague,” Park Kołacz said. “This firsthand input allowed us to accurately consider the user's experience and what we could design to actively benefit the wearer.”
“This opportunity is certainly out of this world,” Park Kołacz joked. “But, seriously, to be able to work on a project that could be implemented or inspire things that would go to the moon or outer space is certainly incredible. I don’t think that I had even thought of an opportunity like this when I was growing up. That’s one of the beautiful things about Kent State having so many different programs; this collaboration enables interdisciplinary interactions to create this vision that one talent alone wouldn’t necessarily be able to bring to fruition.
“It’s been a great experience working with the computer science students,” Park Kołacz continued. “They’re wonderful! Along the way, you learn so many different skills, and you gain a greater appreciation for what other people can do and what you can do together. It exposes you to other 'languages' so that you are able to converse with other people and come up with even greater projects in the future.”
ATR Lab Is Also Competing in the Three-Year World Robot Summit in Tokyo
This isn’t the first time Kim’s ATR Lab has competed on a national or world stage. The lab team unveiled its latest robot, "TeleBot-2," at the World Robot Summit in Tokyo in October 2018. The Kent State team was one of nine teams (and the only team from the United States) invited to compete in the Plant Disaster Robotics Category. That three-year competition, concluding in 2020, requires each team’s robot to complete a set of tasks based on a plant disaster scenario, such as the accident at the Fukushima Nuclear Power Plant during the 2011 tsunami in Japan.
The robot is operated through the gestures of a human operator, using telepresence control that essentially allows the operator to be immersed as the robot. The operator wears a custom-made jacket that relays their movements to the robot. The team hopes to add legs for bipedal movement and more advanced control software, which will allow the operator to hear, see and feel what the robot senses, so they “become” the robot.
To follow the team's progress on this project, follow them on Facebook, Twitter and Instagram at @ATR_LAB.
# # #
Jim Maxwell, 330-672-8028, firstname.lastname@example.org