A team of students and faculty in the College of Arts and Sciences at Kent State University just took the next major step in becoming an internationally recognized robotics team.
Kent State’s Advanced Telerobotics Research Lab (ATR_Kent) in the Department of Computer Science recently shared its latest iteration of a fully immersive telepresence robot, Telebot-3R, which allows a human operator to have direct control and perform various tasks through the robot. The World Robot Summit (WRS) Committee selected the Kent State team as one of 11 finalists – the only team from the United States – in the summit’s Plant Disaster Prevention Challenge Category. The ATR_Kent team’s focus is on applying technologies toward the control and navigation of a hybrid humanoid search-and-rescue robot.
Due to the COVID-19 pandemic, the World Robot Summit competition was postponed, and event organizers have yet to announce a rescheduled date. It was originally set to be held in Japan’s Fukushima Prefecture in August 2020. This is the final phase of the three-year World Robot Summit competition, in which teams from around the world compete to advance through a series of steps by building their robots from raw concepts to prototypes, and now to advanced versions.
Hosted by Japan’s Ministry of Economy, Trade and Industry (METI) and the New Energy and Industrial Technology Development Organization (NEDO), the World Robot Summit is considered one of the top robot challenges in the world. The ATR_Kent team is one of three teams from the United States competing against research labs from universities and corporations from around the world, including teams from the University of Tokyo (Japan), the University of Cambridge (U.K.), Northeastern University (U.S.), the University of Oxford (U.K.), Hitachi (Japan) and Technische Universität Darmstadt (Germany).
“All the ATR_Kent members have been working extremely hard to achieve this qualification,” said Jong-Hoon Kim, Ph.D., assistant professor of computer science and director of the Advanced Telerobotics Research Lab at Kent State. “They spent many days working overnight, even on holidays, weekends and during the winter break, to pass the final WRS qualification examination.”
The ATR_Kent team produced a video to unveil a preview of its robot.
The team’s final robot will need to complete a set of tasks in an extraordinarily adverse environment, such as the plant disaster at the Fukushima nuclear power plant during the 2011 tsunami in Japan. In 2018, the ATR_Kent team traveled to Tokyo to present its initial concept prototype, “Telebot-2.” Read about the team’s experience in 2018.
The ATR_Kent team’s approach is unique in that it combines telerobotics with advanced human-machine interaction technology, including gesture-driven interaction. Key tasks the robot will be required to complete include inspecting and maintaining plant infrastructure using several methods, such as opening and closing valves.
In its latest iteration, Telebot-3R, the team improved on several key features of Telebot-2, including a tank-tread design that allows for dynamic movement, improved sensor fusion and autonomous operation. The robot takes on a unique dog-like form and can use its treads to traverse up and down stairs or collapsed structures. It can fold its upper body, complete tasks with its arms and adjust its legs to navigate.
“This gives Telebot-3R the ability to adapt to a wide range of complex terrains without compromising its support polygon and to switch between forms autonomously,” said Irvin Steve Cardenas, a Kent State doctoral student in computer science who serves as ATR_Kent’s system architect.
The Kent State team also improved the design and functionality of its virtual reality (VR) control system.
“Our team’s core strengths are not simply on the mechanical side but also in human factors engineering and intelligent human-computer interaction,” Cardenas said. “Our goal is to make the control of such search-and-rescue robots more seamless and intuitive for the everyday user. We believe interfaces such as the one we are developing will help pave the way for that revolution.”
Immersive control is achieved by using a virtual reality headset and application that allow the operator to see through the eyes of the robot and work cooperatively with it through natural motion using the VR controllers. The mixed reality control system includes a VR operator as well as a human operator who uses an augmented reality headset to observe the environment, monitor operators and tasks, and give direction to the team.
The team’s design also allows the robot to grasp objects at different locations without additional lower body displacements. The robot has four main body configurations, and its torso can dynamically fold during navigation, allowing the robot to maintain its center of mass while traversing steep terrain or while climbing stairs.
The team’s engineering efforts were further supported by Kent State’s Design Innovation (DI) Initiative directed by J.R. Campbell, with contributions from Outreach Program Manager Andrea Oleniczak, who facilitated the machining and crafting of critical parts for the new robot. The Scalable Computer Architecture and Emerging Technologies (SCALE) Laboratory, directed by Gokarna Sharma, Ph.D., contributed to the research and development of the robot’s distributed control architecture and software optimization.
About the ATR_Kent Team
The primarily student team consists of Team Leader Xiangxu Lin; System Architect Irvin Steve Cardenas; Saifuddin Mahmud and Redwanul Haque Sourave (Artificial Intelligence Detection); Alfred Shaker and Zachary Law (Visualization); Sara Roman, Michael Nelson and Marcus Arnett (Robot Design); and Faculty Advisors Jong-Hoon Kim, Ph.D., and Gokarna Sharma, Ph.D., both assistant professors in Kent State’s Department of Computer Science.
To learn more about Kent State’s Advanced Telerobotics Lab, visit www.atr.cs.kent.edu.
In 2020, several members of this student team competed in NASA’s SUITS Challenge (Spacesuit User Interface Technologies) and placed in the top 10 with their helmet-based augmented reality system that collects motion and biometric data that could be used on future missions to the moon and Mars.
# # #
Irvin S. Cardenas, firstname.lastname@example.org, 267-281-3487
Jim Maxwell, email@example.com, 330-672-8028