Irvin Cardenas

A team of students and faculty in the College of Arts and Sciences at Kent State University just took the next major step in becoming an internationally recognized robotics team.  

Kent State’s Advanced Telerobotics Research Lab (ATR_Kent) in the Department of Computer Science recently shared its latest iteration of a fully immersive telepresence robot, Telebot-3R, which allows a human operator to have direct control and perform various tasks through the robot. The World Robot Summit (WRS) Committee selected the Kent State team as one of 11 finalists – the only team from the United States – in the summit’s Plant Disaster Prevention Challenge Category. The ATR_Kent team’s focus is on applying technologies toward the control and navigation of a hybrid humanoid search-and-rescue robot.

Due to the COVID-19 pandemic, the World Robot Summit competition was postponed, and event organizers have yet to announce a rescheduled date. It was originally set to be held in Japan’s Fukushima Prefecture in August 2020. This is the final phase of the three-year World Robot Summit competition, in which teams from around the world compete to advance through a series of steps by building their robots from raw concepts to prototypes, and now to advanced versions.  

Hosted by Japan’s Ministry of Economy, Trade and Industry (METI) and the New Energy and Industrial Technology Development Organization (NEDO), the World Robot Summit is considered one of the top robot challenges in the world. The ATR_Kent team is one of three teams from the United States competing against research labs from universities and corporations from around the world, including teams from the University of Tokyo (Japan), the University of Cambridge (U.K.), Northeastern University (U.S.), the University of Oxford (U.K.), Hitachi (Japan) and Technische Universität Darmstadt (Germany).

“All the ATR_Kent members have been working extremely hard to achieve this qualification,” said Jong-Hoon Kim, Ph.D., assistant professor of computer science and director of the Advanced Telerobotics Research Lab at Kent State. “They spent many days working overnight, even on holidays, weekends and during the winter break, to pass the final WRS qualification examination.”

The ATR_Kent team produced this video to unveil a preview of its robot:



The team’s final robot will need to complete a set of tasks in an extraordinarily adverse environment, such as a plant disaster like that of the Fukushima nuclear power plant during the 2011 tsunami in Japan. In 2018, the ATR_Kent team traveled to Tokyo to present its initial concept prototype, “Telebot-2.” Read about the team’s experience in 2018.

The ATR_Kent team’s approach is unique in that it uses telerobotics and incorporates advanced human-machine interaction technology, including gesture-driven interaction. Key tasks the robot will be required to complete include inspecting and maintaining plant infrastructure, such as opening and closing valves.

Improved Mobility

With its latest iteration, Telebot-3R, the team said it improved on several key features of Telebot-2, including a tank tread design that allows for dynamic movement, improved sensor fusion and autonomous operation. The robot takes on a unique dog-like form and can use its treads to traverse up and down stairs or collapsed structures. It can fold its upper body, complete tasks with its arms and adjust its legs to navigate.

“This gives Telebot-3R the ability to adapt to a wide range of complex terrains without compromising its support polygon and to switch between forms autonomously,” said Irvin Steve Cardenas, a Kent State doctoral student in computer science who serves as ATR_Kent’s system architect.

Innovative Control

The Kent State team also improved the design and functionality of its virtual reality (VR) control system.

“Our team’s core strengths lie not simply on the mechanical side but also in human factors engineering and intelligent human-computer interaction,” Cardenas said. “Our goal is to make the control of such search-and-rescue robots more seamless and intuitive for the everyday user. We believe interfaces such as the one we are developing will help pave the way for that revolution.”

Immersive control is achieved using a virtual reality headset and application that allow the operator to see through the eyes of the robot and work cooperatively with it through natural motion using the VR controllers. The mixed reality control system pairs the VR operator with a second operator, who uses an augmented reality headset to observe the environment, monitor the operator and tasks, and give direction to the team.

The team’s design also allows the robot to grasp objects at different locations without additional lower body displacements. The robot has four main body configurations, and its torso can dynamically fold during navigation, allowing the robot to maintain its center of mass while traversing steep terrain or while climbing stairs.

The team’s engineering efforts were further supported by Kent State’s Design Innovation (DI) Initiative directed by J.R. Campbell, with contributions from Outreach Program Manager Andrea Oleniczak, who facilitated the machining and crafting of critical parts for the new robot. The Scalable Computer Architecture and Emerging Technologies (SCALE) Laboratory, directed by Gokarna Sharma, Ph.D., contributed to the research and development of the robot’s distributed control architecture and software optimization.

About the ATR_Kent Team

The primarily student team consists of Team Leader Xiangxu Lin; System Architect Irvin Steve Cardenas; Saifuddin Mahmud and Redwanul Haque Sourave (Artificial Intelligence Detection); Alfred Shaker and Zachary Law (Visualization); Sara Roman, Michael Nelson and Marcus Arnett (Robot Design); and Faculty Advisors Jong-Hoon Kim, Ph.D., and Gokarna Sharma, Ph.D., both assistant professors in Kent State’s Department of Computer Science.

To learn more about Kent State’s Advanced Telerobotics Lab, visit

In 2020, several members of this student team competed in NASA’s SUITS Challenge (Spacesuit User Interface Technologies) and placed in the top 10 with their helmet-based augmented reality system, which collects motion and biometric data that could be used on future missions to the moon and Mars.

# # #

Media Contacts:
Irvin S. Cardenas, 267-281-3487
Jim Maxwell, 330-672-8028

There is a very good chance that technology incubated at Kent State University could play an integral role in improving NASA astronauts’ performance on the next space missions to the moon and Mars.

For months, an interdisciplinary team of 12 students and four faculty advisors has been working collaboratively to research and develop new assistive features for spacesuits. Featuring a helmet-based augmented reality (AR) display system, their design could help astronauts better complete extravehicular activities (EVA), such as repairing spacecraft during NASA’s Artemis Program scheduled to launch in 2024. Their project also includes the design of a “telesuit,” a suit capable of collecting motion data and biometric data (such as heart rate and blood pressure) from the astronaut and showing a 3-D visual representation of this information on the AR helmet display.

“For exploration, it is essential that crewmembers on spacewalks are equipped with the appropriate human-autonomy enabling technologies necessary for the elevated demands of lunar surface exploration and extreme terrestrial access,” NASA officials explained on the agency’s website.

Selected as a top 10 team nationwide for the NASA Spacesuit User Interface Technologies for Students (SUITS) Challenge, Kent State’s “ATR_FLUX” team was invited to present its work at NASA’s Johnson Space Center in Houston, Texas, on April 20-24. Due to the COVID-19 pandemic, the event will be reorganized as a virtual meeting on June 11.

Read about how the team reacted to the COVID-19 crisis by fundraising and contacting international manufacturers in order to purchase medical masks that they personally delivered to Summa Health and Cleveland Clinic hospitals on April 1. 


Led by Irvin Cardenas, a computer science Ph.D. student and member of the Advanced Telerobotics Research (ATR) Lab in the College of Arts and Sciences, the team will present its project titled “Coactive and Collaborative Interaction Platform for xEMU.”

Their team includes undergraduate and graduate students from the Department of Computer Science, the School of Fashion, and the Mechatronics Engineering, Political Science and History departments. Their advisors are from the Department of Computer Science and the School of Fashion. Computer Science Professor Jong-Hoon Kim, Ph.D., is the director of the ATR Lab and is serving as the lead advisor for the ATR_FLUX project.

“Having a NASA project, with all these students involved, is like a dream,” Kim said. “My students are working on cutting-edge space technology, which someday we all can use in real life. It was a great honor to get to meet the NASA astronauts who are going to be on the moon in 2024.”

To see the team members and their profiles, visit the ATR_FLUX website at

How the Helmet Display Interface Works

The team is currently designing and testing its display systems on a Magic Leap headset with “mixed reality” smart glasses, in which 3D computer-generated images are superimposed onto real-world environments and objects, augmenting the user’s experience of reality.

The goal is to enable immediate access to instructions, procedures, graphics, spacesuit status and health status information in the form of an audiovisual display so that astronauts can work more efficiently without constant direction from NASA’s mission control.

“If they are performing some kind of science task where they are collecting a sample, we can superimpose some information about the sample,” Cardenas said. “If they are fixing a rover on Mars, we can actually superimpose what region they should be working on or what task they should be performing.”

Conceptually, the ATR Lab is trying to combine these technologies with its approach to telepresence robotics, focusing on how robots should interact and collaborate with humans in the future.

“Rather than just seeing robots as tools or as our servants, we want to connect with them and collaborate and work as teams,” Cardenas said. “The system that we are developing integrates that, and we want to create a synthetic agent or virtual agent that also communicates with the astronaut and provides them feedback.”

They also want to allow the astronaut to provide feedback to the agent. For example, if the agent is able to capture or recognize an object but labels it incorrectly, they want to allow the astronaut to tell the agent that the object is incorrectly classified. In essence, the agent could actually “learn.”

“If you think about what it takes to perform a task in space, you’ve got to consider that there is always a time constraint and different conditions,” Cardenas said. “So, our belief is that if we are able to allow astronauts to better understand their biosignals or allow their teammates to understand how an astronaut is performing, we can also allow them to get more feedback or give them advice on how to perform the task better.”

Caitlyn Lenhoff is a master's student working on the project’s augmented reality user interface aspects.

“I’m really interested in augmented reality and I want to see it used in more aspects of life, especially in schools,” Lenhoff said. “The NASA SUITS Challenge gives me an ability to try to implement something that I’m very interested in and passionate about. It would mean a lot to me if we can make an impact on space exploration.”

About the Telesuit and School of Fashion Collaboration

Michelle Park Kołacz, a second-year master’s student in the Fashion Industries Study program in the School of Fashion at Kent State, is one of a few students collaborating on the telesuit part of the project. Their goal is to create a proof of concept prototype in a jacket form. Last year, she collaborated with the ATR Lab group on a different version of the telesuit, which linked a human operator’s gestures to the robot’s gestures and actions.

She recently shared some highlights of the aesthetic and functional aspects of the ATR_FLUX telesuit concept, including different types of fabrics that stretch and support the piping for the inserted biometric data sensors that monitor the user’s muscle groups. Along the sleeves, there are zippers that open where the polymer-based strain sensors will be placed. Because this is a wearable technology, the team is also considering what to insulate the conductive thread with, since materials such as cotton would create more static.

All the sensors are attached to the base arm layer, a compression sleeve that emulates the pressure of kinesiology tape through a combination of strategically patterned four-way and two-way stretch spandex.

“We were able to interview the Chief Planetary Scientist and astronaut Tyler "Nick" Hague,” Park Kołacz said. “This firsthand input allowed us to accurately consider the user's experience and what we could design to actively benefit the wearer.”

“This opportunity is certainly out of this world,” Park Kołacz joked. “But, seriously, to be able to work on a project that could be implemented or inspire things that would go to the moon or outer space is certainly incredible. I don’t think that I had even thought of an opportunity like this when I was growing up. That’s one of the beautiful things about Kent State having so many different programs; this collaboration enables interdisciplinary interactions to create this vision that one talent alone wouldn’t necessarily be able to bring into fruition.

“It’s been a great experience working with the computer science students,” Park Kołacz continued. “They’re wonderful! Along the way, you learn so many different skills, and you gain a greater appreciation for what other people can do and what you can do together. It exposes you to other 'languages' so that you are able to converse with other people and come up with even greater projects in the future.”

ATR Lab Is Also Competing in the Three-Year World Robot Summit in Tokyo

This isn’t the first time Kim’s ATR Lab has competed on a national or world stage. The lab team unveiled its robot, "Telebot-2," at the World Robot Summit in Tokyo in October 2018. The Kent State team was one of nine teams (and the only team from the United States) invited to compete in the Plant Disaster Robotics Category. That three-year competition, concluding in 2020, requires each team’s robot to complete a set of tasks based on a plant disaster, like that of the Fukushima nuclear power plant during the 2011 tsunami in Japan.

The robot’s operation is based on the gestures of the human operator, with telepresence control that essentially allows the operator to become immersed as the robot. The operator wears a custom-made jacket that provides feedback to the robot. The team hopes to add legs for bipedal movement and more advanced control software, which will allow the operator to hear, see and feel what the robot senses, so they ‘become’ the robot.

To follow the team's progress on this project, follow them on Facebook, Twitter and Instagram at @ATR_LAB.

# # #

Media Contact:
Jim Maxwell, 330-672-8028