Forget R2-D2, mounted robotic arms, large hydraulic motors and rigid segmented joints – and say hello to the flex, bend and wiggle.

Soft robotic glove interface assists a person to move an apple from a table into a bowl

Words: Lucy Jolin / Photography: Wilson Hennessy

There might have been just the slightest squeeze, but it was enough to reduce everyone in Dr Aldo Faisal’s lab to tears. Testing out a robotic arm and bespoke glove, driven by decoding a paralysed patient’s eye movements, should not have been so emotional.

“Why are you crying?” asked the patient, paralysed from a serious spinal cord injury. “It’s the first time in 20 years that you’ve squeezed my hand,” his wife replied.

“It made me realise the power of human touch,” says Faisal, Professor of AI and Neuroscience at the Department of Computing and the Department of Bioengineering. “Such a small thing. But so important.”

Welcome to the world of soft robotics, where movement is all about the flex, wriggle, bend, deformation and transformation made in response to the environment. (“Humans are soft robots,” says Faisal. “My hand is a soft robot.”) Why do we need them? Traditional ‘hard’ robots are, after all, great at many things: building cars, stirring radioactive waste, neutralising explosives and so on. But they’re not so great at accurately replicating human actions, such as performing surgery.

“Conventional robots have important drawbacks because of their stiffness and rigidity, meaning that they could pose a risk to a patient in close interaction,” says Dr Enrico Franco, Research Associate at the Department of Mechanical Engineering.

Likewise, they aren’t capable of what Faisal’s patient’s wife found so moving – the subtlety of spontaneous human touch. And they don’t like transitions or interactions, points out Mirko Kovac, Professor in Aerial Robotics at the Department of Aeronautics, which houses the Brahmal Vasudevan Aerial Robotics Lab. Today’s drones are quite happy in the air above the sea, but they’re in trouble when they plunge into the ocean. And while they can take pictures of, say, a problematic component on an oil rig, they can’t mend it.

Soft robotics takes inspiration from everyday life: an outstretched hand; a seabird – or a dead fish

In contrast, soft robotics takes inspiration from the ordinary biological structures and processes of everyday life: an outstretched hand, or a seabird diving into the water. Or, indeed, a dead fish ‘swimming’ upstream. This, it turns out, is a perfect demonstration of soft robotic principles: a system whose structural compliance is matched to that of its environment.

“In the right conditions of turbulence and forces in a river, the fish’s softness is tuned to harness that energy so it can drive itself forward,” says Dr Thrishantha Nanayakkara, Professor in Robotics at the Dyson School of Design Engineering.

“Likewise, there are kinds of seaweed which can hold certain structures to survive different frequencies and forces of currents. Despite all the turbulence, they bend, twist and survive. Cephalopods don’t have skeletons, but they can make their body behave like a skeletal body and create solid ‘limbs’ to pull prey to their mouth.”

As Nanayakkara points out, after four decades of robotics, we’re still waiting for the rigid robot you’d trust to hold a live hamster. And that’s because our whole approach has been back to front, he says.

"We frame it thus: the hamster is the problem; therefore, the robot is the solution. But that problem definition itself is wrong. We have separated the environment and the embodiment of the robot. Now think of the hamster and the robot as one single system. If there is no separation between the hamster, then the hamster is part of the solution.

“Soft robotics can make use of the shape of the object or the movement of the object to solve the problem. For example, if I am holding a hamster in my hand, I relax my fingers. Then the hamster relaxes, too. It is the right embodiment to match the hamster’s softness.”
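The article stops at the intuition, but one standard way to realise this kind of compliance matching is impedance control, where a soft gripper is driven as a virtual spring and damper and ‘relaxing the fingers’ simply means lowering the virtual stiffness. The minimal sketch below is illustrative only: the function name, parameters and numbers are invented for the example, not taken from Nanayakkara’s lab.

```python
# Illustrative sketch only: impedance control is one common way to realise
# compliance matching. All names and numbers below are invented for the
# example, not drawn from any lab's actual code.

def grip_force(x, v, x_cmd, k_finger, d_finger):
    """Spring-damper ('impedance') model of one soft finger.

    x, v     : how far the fingertip has actually closed, and its speed (m, m/s)
    x_cmd    : commanded closing displacement, set a little past the object
               surface so the virtual spring stays gently compressed
    k_finger : virtual stiffness (N/m); low values give a soft, compliant grip
    d_finger : virtual damping (N*s/m)
    """
    return k_finger * (x_cmd - x) - d_finger * v


# 'Relaxing the fingers' is just lowering the virtual stiffness: with the same
# 2 mm of spring compression, a soft setting squeezes gently and a stiff one
# squeezes hard -- which is why matching stiffness to the object matters.
print(grip_force(x=0.010, v=0.0, x_cmd=0.012, k_finger=50.0, d_finger=2.0))    # ~0.1 N
print(grip_force(x=0.010, v=0.0, x_cmd=0.012, k_finger=800.0, d_finger=10.0))  # ~1.6 N
```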

Soft robotics requires a completely new way of thinking about problems: circular rather than linear. Faisal calls it a virtuous cycle of understanding.

“On one side, we use the language of engineering to study how the brain controls and generates behaviour. On the other side, we’re using that biological understanding to improve technology – for example, to restore movement to people who are paralysed or have lost limbs. This way gives them agency. They are not just teleoperating something. They are using their own body.”

The traditional development path is: sketch a concept; develop a controller; simulate with that controller; choose the components; and test in the field. But nature doesn’t work like that, says Kovac.

“It grows. And as it grows, it evolves capabilities. The controller, materials, actuators and sensors evolve. It adapts computationally and physically. This physical artificial intelligence approach is about combining sensing, actuation materials, controllers, aerodynamics and autonomy into one coherent system. It is a co-evolution of control, materials, structures, design and learning – a new methodology which is circular rather than linear.”

In Kovac’s lab – which aims to develop a new generation of biologically inspired flying robots – his circular approach might start with structure. A model of a seabird wing with a control system for folding and spreading, for example, is put in a wind tunnel and studied to see how it behaves, and why it behaves like that. “And this is an integration of control, environmental interactions, materials properties and structural behaviour. None of this can be decoupled.”
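To make the contrast with the linear pipeline concrete, a toy version of this circular co-design might search over a morphology parameter and a controller parameter together and score each pair as one system. The sketch below is purely illustrative: the simulate() stand-in and every range in it are invented, and Kovac’s lab works with wind tunnels and far richer models than this, but it shows the shape of the loop.

```python
# Toy sketch of 'circular' co-design: search over wing shape and controller
# gain jointly, rather than fixing one and tuning the other. The simulate()
# stand-in and all parameter ranges are invented for illustration.
import random

def simulate(wing_fold_ratio, gain):
    """Stand-in for a wind-tunnel test or aerodynamic simulation: returns a
    made-up 'gust recovery' score for one wing-shape + controller pair."""
    return -(wing_fold_ratio - 0.6) ** 2 - (gain - 2.0 * wing_fold_ratio) ** 2

best = None
for _ in range(1000):
    design = {
        "wing_fold_ratio": random.uniform(0.1, 0.9),  # morphology parameter
        "gain": random.uniform(0.0, 3.0),             # controller parameter
    }
    score = simulate(**design)
    if best is None or score > best[0]:
        best = (score, design)

print("best co-designed wing + controller:", best[1])
```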

Using something that works with its environment, rather than attempting to transform it, is key

Dr Huai-Ti Lin, Lecturer in the Department of Bioengineering, is also applying soft robotic principles to improve flight: in this case, studying the biomechanics and the sensory system of a dragonfly’s highly deformable wings to investigate the idea of ‘fly-by-feel’.

Today’s aircraft flight controllers are designed around steady states. When turbulence hits, pilots must use their personal experience to cope with this new, unsteady state. But flying animals, Lin points out, are very good at managing unsteady states by controlling their compliant wings.

“And the secret for that is they use a collection of airflow and strain sensors on the wing itself. A dragonfly’s wingblade alone contains nearly a thousand sensors. The wing can ‘feel’ any problems before the body. By learning the fly-by-feel approach from biology, we can enhance the flight control of future flying systems.”
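A toy sketch can make the fly-by-feel idea concrete: if an array of strain sensors along the wing reports how far readings have drifted from their steady-flight baseline, a controller can issue a corrective wing command before the body has moved at all. The sensor counts, numbers and correction law below are invented for illustration and are not taken from Lin’s lab.

```python
# Toy illustration of 'fly-by-feel': the controller reacts to distributed
# strain readings on the wing rather than to body motion. All values and the
# correction law are invented for the example.
import statistics

def wing_correction(strains, baseline, kp=0.5):
    """Map distributed strain readings to a single wing-pitch correction.

    strains  : instantaneous strain readings along the wing
    baseline : expected strains in steady flight
    kp       : proportional gain on the mean strain error
    """
    errors = [s - b for s, b in zip(strains, baseline)]
    return -kp * statistics.mean(errors)

# Steady flight: readings match the baseline, so no correction is needed.
baseline = [0.10] * 8
print(wing_correction([0.10] * 8, baseline))   # ~0.0

# A gust loads the outer sensors first; the wing 'feels' it and counteracts
# before the body has moved at all.
gusty = [0.10, 0.10, 0.11, 0.13, 0.16, 0.18, 0.20, 0.22]
print(wing_correction(gusty, baseline))
```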

Using something that works with its environment – rather than attempting to transform the environment to suit the object – is key to applying soft robotics in healthcare, says Franco, who works on soft robotics for minimally invasive surgery. He has a very personal connection to his work: his grandparents both died of colorectal cancer. Colonoscopies, used to diagnose this type of cancer, have a sub-optimal uptake, he points out. “This is because existing colonoscopes can dislodge the bands of the intestines, creating discomfort and pain.”

His vision: a soft robot made of silicone rubber, inflated with pressurised fluid, controlled with a joystick, but semi-autonomous. The surgeon will direct the robot where to go, but the robot will find, by itself, the easiest way of getting there. An energy-based control approach will harness the friction between the surface of the robot and the internal organs, allowing movement either forwards or backwards. Unlike rigid colonoscopes, this approach works with the structure of the body. And, again unlike conventional robots, it is cheaper to produce and could be operated in a surgery or local hospital, making it ideal for use in developing countries.
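Franco’s controller is described here only in outline, but the flavour of an energy-based approach can be sketched: choose the actuation so the segment’s potential energy has its minimum at the desired bend, and inject damping so it settles there. The one-segment toy model below, with its made-up gains and dynamics, illustrates that general idea rather than the actual surgical controller.

```python
# Minimal sketch of energy shaping plus damping injection on one soft bending
# segment. The dynamics, gains and numbers are toy stand-ins, not Franco's
# controller for the surgical robot.
def pressure_command(q, q_dot, q_des, k_e=4.0, k_d=1.5, k_stiff=2.0):
    """q, q_dot : current bend angle (rad) and rate; q_des : target bend.

    k_stiff models the silicone's own elastic pull back to straight (q = 0);
    the command cancels it and re-shapes the energy minimum to sit at q_des.
    """
    return k_stiff * q - k_e * (q - q_des) - k_d * q_dot

# Crude simulation of the segment settling onto the target shape.
q, q_dot, dt = 0.0, 0.0, 0.01
for step in range(500):
    u = pressure_command(q, q_dot, q_des=0.8)
    q_ddot = u - 2.0 * q - 0.2 * q_dot   # toy elastic + damped dynamics
    q_dot += q_ddot * dt
    q += q_dot * dt
print(f"final bend angle: {q:.2f} rad (target 0.80)")
```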

And soft robotics can also help to train doctors: Nanayakkara’s lab is currently running trials of RoboPatient, the universal patient. “Medical students have no control over the patients they are assigned to practise on,” he explains. “And the patients they meet might not be representative of how conditions and pain present differently in people of different genders and ethnicities. But RoboPatient allows conditions and pain to be replicated. It has a face, too, which shows pain. That means a student can experience a patient with a multitude of different conditions and pain responses, distilling years of experience into a few hours.”

The expression on a face, the flap of a wing, the wriggle of a hamster: who knows where a true understanding of the processes behind these could take us? Small things, indeed. But vital.