With the increasing integration of robots in new industries, SUTD researchers are designing a variety of useful robots inspired by nature’s creativity.
From C-3PO in Star Wars to Disney’s WALL-E, robots integrated into the fabric of human society have always captured our imagination. In many Asian societies, robots have been assisting people with housekeeping and cleaning large modern infrastructure such as airports. But for robots to have more wide-ranging applications, they must be capable of performing complex tasks such as navigating around obstacles and pursuing moving targets. To design robots that can do so, researchers from the Singapore University of Technology and Design (SUTD) have taken inspiration from the living world.
They have come up with innovations ranging from soft robots that swim like real fish to robots controlled by hand-worn gloves. As we welcome more robots in our daily lives, we look into the exciting work of SUTD researchers in the field of nature-inspired robotics.
Solving hard problems with soft robots
One of the most important problems the next generation of robots has to solve is locomotion, the act of moving through complex physical environments while avoiding obstacles and pursuing goals.
Luckily, nature has had billions of years to evolve impressive solutions to these challenges, explained Assistant Professor Pablo Valdivia y Alvarado, a robotics researcher at SUTD. Taking cues from the animal kingdom, Valdivia y Alvarado’s team is designing robots that can move and find their way through water like fish do. “Our group studied the body and fin motions of different kinds of fish to achieve energy-efficient forward motions and sharp turning capabilities,” said Professor Valdivia y Alvarado.
However, because of the complex motions involved, traditional rigid robots require very complex mechanisms to mimic real fish movements. This is where the expertise of Professor Valdivia y Alvarado’s team in making soft robots comes in. Using novel methods in additive manufacturing (AM), design, and new hybrid materials, the team is able to fabricate simple, materials-based soft robots that display smooth, fish-like motions thanks to their carefully tailored structural properties.
The team’s novel AM processes include an approach that uses printed liquid polymers as anchors, guiding fibres into various embedded 2D meshes. “Our approach allows the fabrication of a wide variety of fibre-based composite structures,” Professor Valdivia y Alvarado explained. “The embedded fibres help tailor the stretching, twisting, bending, electrical and colour-changing properties in soft robot components.”
Teaching robots how to swim
While Professor Valdivia y Alvarado’s team can successfully manufacture soft robot parts with properties similar to those of marine animals’ body parts, this is only the first step. The next step involves programming the robot to swim underwater.
Leveraging their experience in computer modelling, the team used the fundamental fluid mechanics associated with the movement of various fish to tease out what makes each type of locomotion effective. However, standard simulations can take a long time to run. This is why the team is using trained neural networks to help predict robot movements, an approach that uses machine learning to better control the way their robots swim.
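The idea of replacing a slow simulation with a fast learned predictor can be illustrated with a minimal sketch. Everything below is an assumption for illustration, not the team’s actual model: a tiny feedforward network is fitted to synthetic “simulation” data relating hypothetical fin parameters (beat frequency and amplitude) to forward thrust, so that future predictions take microseconds rather than a full fluid simulation.

```python
import numpy as np

# Illustrative surrogate model only. Inputs/outputs are hypothetical:
# [fin-beat frequency, amplitude] in, predicted forward thrust out.
rng = np.random.default_rng(0)

# Synthetic stand-in for expensive simulation results:
# thrust grows with frequency * amplitude (toy relationship).
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = (X[:, 0] * X[:, 1]).reshape(-1, 1)

# One hidden layer, trained with plain full-batch gradient descent.
W1 = rng.normal(scale=0.5, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)   # hidden activations
    return h, h @ W2 + b2      # hidden layer, prediction

_, pred0 = forward(X)
loss0 = float(np.mean((pred0 - y) ** 2))   # initial mean squared error

lr = 0.1
for _ in range(2000):
    h, pred = forward(X)
    err = (pred - y) / len(X)            # gradient of MSE w.r.t. pred (up to factor 2)
    dW2 = h.T @ err; db2 = err.sum(0)
    dh = err @ W2.T * (1 - h ** 2)       # backprop through tanh
    dW1 = X.T @ dh; db1 = dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

_, pred1 = forward(X)
loss1 = float(np.mean((pred1 - y) ** 2))  # error drops as the surrogate fits
```

Once trained, evaluating the network is just two small matrix multiplications, which is what makes a surrogate attractive for real-time control compared with re-running a fluid solver.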
Soft robots that swim are just one of the many projects Professor Valdivia y Alvarado’s team is working on. By combining their original AM methods with examples from nature, his team is also building a collection of soft robots with a wide range of applications, from grasping delicate objects like human hands to underwater sensors similar to seal whiskers.
“One of our ultimate goals is to be able to directly fabricate complete robots in a single shot via novel AM approaches,” said Professor Valdivia y Alvarado.
Robot interaction solutions that fit in a glove
In addition to moving like animals, robots must also be controllable through more intuitive means before they can see widespread use. One such means is wearable technology, such as the work being done by the team led by Associate Professor Soh Gim Song within the articulated systems and biomechanics group at SUTD.
Among the team’s many efforts on wearable technology is the design of a glove device powered by an algorithm capable of predicting human hand gestures. “The glove has stretchable sensors that can measure deformation data and inertia sensors that can measure motion data,” explained Professor Soh.
A sensor-based glove device
Using machine learning, Professor Soh’s team then trained a model to fuse the different sensor signals and map them to specific kinematic states. Knowing the kinematic state of the hand is useful in many applications: it can serve as an interpreter for hearing- and speech-impaired persons communicating with other individuals, or act as a human-machine interface to operate, control and interact with robots or objects in a digital world. Currently, Professor Soh’s team is using this glove to command the motion behaviour of robots.
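The fusion step described above can be sketched in a few lines. This is a hypothetical toy model, not the team’s pipeline: simulated stretch-sensor (deformation) and inertial (motion) readings are concatenated into one feature vector, and a simple nearest-centroid classifier stands in for the trained model that maps fused signals to a discrete hand state. The sensor counts and gesture names are invented for illustration.

```python
import numpy as np

# Hypothetical glove model: 5 flex (stretch) sensors + 6 IMU channels.
rng = np.random.default_rng(1)

GESTURES = ["open_hand", "clenched_fist", "point_forward"]

def make_sample(gesture):
    """Simulate one glove reading for a given gesture (toy data)."""
    base = {"open_hand": 0.1, "clenched_fist": 0.9, "point_forward": 0.5}[gesture]
    flex = np.full(5, base) + rng.normal(scale=0.05, size=5)      # deformation
    imu = GESTURES.index(gesture) + rng.normal(scale=0.05, size=6)  # motion
    return np.concatenate([flex, imu])   # fused feature vector

# "Training": average many samples per gesture into class centroids.
centroids = {g: np.mean([make_sample(g) for _ in range(50)], axis=0)
             for g in GESTURES}

def classify(sample):
    """Map a fused sensor vector to the nearest known hand state."""
    return min(centroids, key=lambda g: np.linalg.norm(sample - centroids[g]))

print(classify(make_sample("clenched_fist")))  # → clenched_fist
```

A real system would replace the nearest-centroid rule with a trained neural network and handle temporal sequences of readings, but the structure is the same: fuse heterogeneous sensor channels into one vector, then map that vector to a kinematic state.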
For instance, a forward-facing flexion-extension hand gesture commands the robot to follow a human, while an upright hand clenching gesture commands the robot to stop moving. “The next step is to combine this with mixed reality devices with the goal of creating a seamless interaction boundary between humans and machines, just like what you see in Iron Man (the American superhero film),” said Professor Soh.
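The gesture-to-command step is essentially a lookup from recognised gestures to robot behaviours. The sketch below uses hypothetical gesture and command names chosen to mirror the two examples in the article; the real mapping is whatever the team's controller defines.

```python
# Illustrative only: gesture and command names are assumptions.
COMMANDS = {
    "forward_flexion_extension": "follow_human",  # follow the operator
    "upright_clench": "stop",                     # halt the robot
}

def dispatch(gesture, default="idle"):
    """Translate a recognised hand gesture into a robot motion command."""
    return COMMANDS.get(gesture, default)

print(dispatch("upright_clench"))  # → stop
```

Keeping the mapping in a table like this makes it easy to rebind gestures to behaviours without touching the recognition model.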
In addition to making robots controlled by hand-worn gloves, Professor Soh’s team is also working on miniature robots that can collectively navigate commercial buildings and public spaces. Among these robots are the patented spherical robot called Virgo and the climbing robot called Orion.
Like Professor Valdivia y Alvarado’s team, Professor Soh’s is creating biologically inspired robots capable of moving collectively, as many animals do. “We are working with the defence industry to use our robots in search and rescue, exploration and surveillance applications, with an emphasis on humans and robots working together as a team,” he said.