“We essentially want to study locomotion in fluids, by learning how fish swim and then use that fundamental knowledge to optimize robotic swimming,” Bo Cheng, an assistant professor of mechanical engineering at Penn State, said.
Cheng and Asok Ray, a professor of mechanical engineering, will serve as principal investigators on a project to develop a bio-inspired robotic fish platform, along with the learning algorithms needed to control and optimize its movements in underwater environments.
As reported by Penn State, researchers from the University of Houston will develop a stretchable skin embedded with sensors that will gather flow and pressure data as the robot moves through the water.
Researchers at the University of Virginia, meanwhile, will study the physics of the fluid and its corresponding impact on the structure of the robot.
According to Cheng, data from both Houston and Virginia will be used to improve the robot they are building at Penn State, making it a truly collaborative project.
For instance, Penn State’s robot fish platform will use the sensors developed by Houston to detect nearby objects and adjust its path accordingly. At the same time, it will use the data from Virginia to optimize the control strategy of the robot’s movement in the fluid.
“Through this collaboration, the key problem we hope to understand is the fundamental nature of fluid-structure interaction in the context of underwater locomotion, investigated as a biologically-inspired, cyber-physical system,” Cheng said.
Cheng, in an earlier statement, noted that the research findings could enhance biologically inspired robots capable of gathering more information in shallow, murky underwater environments where visual and sonar systems are difficult to use.
“The idea is inspired by what animals can do,” Cheng stated, noting that fish can perceive even the smallest changes in pressure, such as a fisherman casting a line on the surface of the water.
As noted in the Penn State report, the project will be funded for three years by the NSF Cyber-Physical Systems (CPS) program, which supports extensive research into systems that focus on the seamless integration of computational and physical components.
“The work is truly in the spirit of CPS; it merges the physical challenges of moving within water and how the different disciplines inside the robot are sensing, controlling and learning,” Ray said.
According to Cheng and Ray, taking this deep dive into the physics of fish locomotion could yield discoveries that shape a new understanding of how robots are developed.
For instance, the researchers said, their studies could help improve search-and-rescue missions in the ocean or in confined underwater environments, or even help develop nanoscale technology that delivers medical treatments by swimming through blood vessels.
Swim to the future?
This won’t be the first fish-inspired creation to make a splash in the robotics scene.
The Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory (CSAIL), for instance, unveiled “SoFi” — a soft robotic fish that can swim independently alongside real fish in the ocean — in 2018. Researchers from Cornell University in New York, meanwhile, revealed a working robotic lionfish powered by artificial blood early last year.
SoFi, according to its creators, can be used to track elusive creatures in the depths of the ocean, making it easier for humans to document endangered species without subjecting them to unnecessary stress.
“The robot is capable of close observations and interactions with marine life and appears to not be disturbing to real fish,” CSAIL’s Daniela Rus said.
The Cornell robot, on the other hand, will be used to test the feasibility of liquid batteries that could power different types of machines in the future.