Human hands are astonishing tools. Here's why robots are struggling to match them

Illustration of a robotic hand holding a flower (Credit: Estudio Santa Rita)

Our hands perform thousands of complex tasks every day – can artificial intelligence help robots match these extraordinary human appendages?

The human hand is one of the most staggeringly sophisticated and physiologically intricate parts of the body. It has more than 30 muscles and 27 joints, alongside a network of ligaments and tendons, giving it 27 degrees of freedom. There are more than 17,000 touch receptors and nerve endings in the palm alone. These features allow our hands to perform a dazzling array of highly complex tasks through a broad range of different movements.

But you don't need to tell any of that to Sarah de Lagarde.   

In August 2022, she was on top of the world. She had just climbed Mount Kilimanjaro with her husband and was supremely fit. But just one month later, she found herself lying in a hospital bed, with horrific injuries.  

While returning home from work, De Lagarde slipped and fell between a tube train and the platform at High Barnet station in London. Crushed by the departing train and another that then came into the station, she lost her right arm below the shoulder and part of her right leg.  

After the long healing process, she was offered a prosthetic arm by the UK's National Health Service, but it gave her little in the way of normal hand movement. Instead, it seemed to prioritise form over functionality.

"It doesn't really look like a real arm," she says. "It was deemed creepy by my children." 

The prosthetic featured only a single joint, at the elbow, while the hand itself was a static mass on the end. For nine months she struggled to perform the daily tasks she had previously taken for granted, but then she was offered something transformational – a battery-powered bionic arm that uses artificial intelligence (AI) to anticipate the movements she wants by detecting tiny electrical signals from her muscles.

"Every time I make a movement it learns," De Lagarde says. "The machine learns to recognise the patterns and eventually it turns into generative AI, where it starts predicting what my next move is." 


Even picking up something as simple as a pen and turning it in our fingers to adopt a writing position involves seamless integration between body and brain. Hand-based tasks that we perform with barely a thought – from opening a door to playing a piano – require a refined combination of motor control and sensory feedback.

With this level of complexity, it's no wonder that matching the versatility and dexterity of human hands has eluded medical professionals and engineers alike for centuries. From the rudimentary spring-loaded iron hand of a 16th-Century German knight to the world's first robotic hand with sensory feedback, created in 1960s Yugoslavia, nothing has come close to matching the natural abilities of the human hand. Until now.

Advances in AI are ushering in a generation of machines that are getting close to matching human dexterity. Intelligent prostheses, like the one De Lagarde received, can anticipate and refine movement. Soft-fruit picking bots can pluck a strawberry in a field and place it delicately in a punnet of other berries without squishing them. Vision-guided robots can even carefully extract nuclear waste from reactors. But can they really ever compete with the amazing capabilities of the human hand?  

Embodied AI  

I recently gave birth to my first child. Within moments of entering the world, my daughter's small hand wrapped softly around my partner's forefinger. Unable to focus her eyes on anything more than a few inches in front of her, her hand and arm movements are limited, on the whole, to involuntary reflexes that allow her to grip an object when it is placed in her palm. It is an adorable illustration of the sensitivity of our dexterity, even in our earliest moments – and hints at how much it improves as we mature.  

Over the coming months, my daughter's vision will progress enough to give her depth perception, while the motor cortex of her brain will develop, giving her increasing control over her limbs. Her involuntary grasps will give way to more deliberate grabbing actions, her hands feeding signals back to her brain, allowing her to make fine adjustments in movement as she feels and explores the world around her. It will take my daughter several years of determined effort, trial, error and play to attain the level of hand dexterity that adults possess. 

Dexterous robots utilising embodied AI follow a roadmap much like a baby learning how to use its hands. Such robots must co-exist with humans in an environment, and learn how to carry out physical tasks based on prior experience. They react to their environment and fine-tune their movements in response to those interactions. Trial and error plays a big part in this process.

"Traditional AI handles information, while embodied AI perceives, understands, and reacts to the physical world," says Eric Jing Du, professor of civil engineering at the University of Florida. "It essentially endows robots with the ability to 'see' and 'feel' their surrounding environments, enabling them to perform actions in a human-like manner."  

Human sensory systems can detect minute changes, and rapidly adapt to changes in tasks and environments - Eric Jing Du

But this technology is still in its infancy. Human sensory systems are so complex and our perceptive abilities so adept that reproducing dexterity at the same level as the human hand remains a formidable challenge.   

"Human sensory systems can detect minute changes, and rapidly adapt to changes in tasks and environments," says Du. "They integrate multiple sensory inputs like vision, touch and temperature. Robots currently lack this level of integrated sensory perception."  

But the level of sophistication is rapidly increasing. Enter the DEX-EE robot. Developed by the Shadow Robot Company in collaboration with Google DeepMind, it's a three-fingered robotic hand that uses tendon-style drivers to achieve 12 degrees of freedom. The team behind DEX-EE, which is designed for "dexterous manipulation research", hopes to demonstrate how physical interactions contribute to learning and the development of generalised intelligence.

Each one of its three fingers contains fingertip sensors, which provide real-time three-dimensional data on their environment, along with information regarding their position, force and inertia. The device can handle and manipulate delicate objects including eggs and inflated balloons without damaging them. It has even learned to shake hands – something that requires it to react to interference from outside forces and unpredictable situations. At present, DEX-EE is just a research tool, not for deployment in real-world work situations where it could interact with humans.  

Understanding how to perform such functions, however, will be essential as robots become increasingly present alongside people both at work and at home. How hard, for example, should a robot grip an elderly patient as they move them onto a bed?  

One research project at the Fraunhofer IFF Institute in Magdeburg, Germany, set up a simple robot to repeatedly "punch" human volunteers in the arm a total of 19,000 times, to help its algorithms learn the difference between potentially painful and comfortable forces. But some dexterous robots are already finding their way into the real world.
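To give a flavour of what such an experiment teaches a machine, here is a minimal Python sketch of learning a pain boundary from labelled impact data. The forces, labels and model are invented for illustration; the Fraunhofer team's actual data and methods are more sophisticated.

```python
# A minimal sketch of learning a safe-force boundary from labelled
# collision trials. All numbers are hypothetical.
import numpy as np

# Each trial: peak contact force in newtons, and whether the volunteer
# reported it as painful (1) or comfortable (0).
forces = np.array([10.0, 25.0, 40.0, 55.0, 70.0, 90.0, 110.0, 130.0])
painful = np.array([0, 0, 0, 0, 1, 1, 1, 1])

def fit_logistic(x, y, lr=0.01, steps=5000):
    """Fit a one-dimensional logistic regression by gradient descent."""
    w, b = 0.0, 0.0
    x = (x - x.mean()) / x.std()  # normalise for stable training
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(w * x + b)))  # predicted pain probability
        w -= lr * np.mean((p - y) * x)
        b -= lr * np.mean(p - y)
    return w, b

w, b = fit_logistic(forces, painful)
# A robot could then cap its motions so the predicted pain probability
# of any contact stays below a chosen safety threshold.
```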

The rise of the robots  

Roboticists have long dreamed of automata with anthropomorphic dexterity good enough to perform undesirable, dangerous or repetitive tasks. Rustam Stolkin, a professor of robotics at the University of Birmingham, for example, leads a project to develop highly dexterous AI-controlled robots capable of handling nuclear waste from the energy sector. While this work typically uses remotely controlled robots, Stolkin is developing autonomous vision-guided robots that can go where it is too dangerous for humans to venture.

Even the most advanced robotic arms struggle to match the dexterity and adaptability of human hands (Credit: Estudio Santa Rita)

Perhaps the best-known example of a real-world android is Boston Dynamics' humanoid robot Atlas, which captivated the world back in 2013 with its athletic capabilities. The most recent iteration of Atlas was unveiled towards the end of 2024 and combines computer vision with a form of AI known as reinforcement learning, in which a system improves through trial and error, guided by feedback on its successes and failures. According to Boston Dynamics, this allows the robot to perform complex tasks like packing or organising objects on shelves.
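The core idea of reinforcement learning fits in a few lines of code. The toy sketch below, built around a hypothetical grip-strength task, shows an agent trying actions, receiving rewards and gradually favouring what works. It is a deliberately simplified illustration, not Boston Dynamics' actual training setup.

```python
# Toy reinforcement learning: try grip strengths, get rewarded for
# holding an object without dropping or crushing it, learn what works.
import random

ACTIONS = ["light", "medium", "firm"]   # candidate grip strengths
q_values = {a: 0.0 for a in ACTIONS}    # learned value of each action

def reward(action):
    # Hypothetical environment: "medium" holds the object, "light"
    # drops it, "firm" crushes it. Real systems get this feedback
    # from simulation or physical trials.
    return {"light": -1.0, "medium": 1.0, "firm": -1.0}[action]

alpha, epsilon = 0.1, 0.2               # learning rate, exploration rate
for episode in range(1000):
    if random.random() < epsilon:       # explore occasionally
        action = random.choice(ACTIONS)
    else:                               # otherwise exploit the best guess
        action = max(q_values, key=q_values.get)
    # Nudge the tried action's value towards the observed reward
    q_values[action] += alpha * (reward(action) - q_values[action])

print(max(q_values, key=q_values.get))  # -> "medium" after training
```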

But the skills required to perform many of the tasks in human-led sectors where robots such as Atlas could take off, such as manufacturing, construction and healthcare, pose a particular challenge, according to Du.  

"This is because the majority of the hand-led motor actions in these sectors require not only precise movements but also adaptive responses to unpredictable variables such as irregular object shapes, varying textures, and dynamic environmental conditions," he says. 

Du and his colleagues are working on highly dexterous construction robots that use embodied AI to learn motor skills by interacting with the real world.

AI v the Mind

This article is part of AI v the Mind, a series that aims to explore the limits of cutting-edge AI and learn a little about how our own brains work along the way. With some expert help, each article pits different AI tools against the human mind, asking probing questions designed to test the limits of intelligence. Can a machine write a better joke than a professional comedian, or unpick a moral conundrum more elegantly than a philosopher? We hope to find out. 

At present, most robots are trained on specific tasks, one at a time, which means they struggle to adapt to new or unpredictable situations. This limits their applications. But Du argues that this is changing. "Recent advancements suggest that robots could eventually learn adaptable, versatile skills that enable them to handle a variety of tasks without prior specific training," he says.

Tesla also gave its own humanoid robot Optimus a new hand at the end of 2024. The company released a video of the bot catching a tennis ball in mid-air. However, the hand was tele-operated by a human rather than moving autonomously, according to the engineers behind it. It has 25 degrees of freedom, they claim.

But while some innovators have sought to recreate human hands and arms in machine form, others have opted for very different approaches to dexterity. Cambridge-based robotics company Dogtooth Technologies has created fruit-picking robots with highly dexterous arms and precision pincers, capable of picking and packing delicate soft fruits like strawberries and raspberries at the same speed as human workers.

The idea for the fruit-picking robots came to co-founder and chief executive Duncan Robertson while he was lying on a beach in Morocco. With a background in machine learning and computer vision, Robertson wanted to apply his skills to help clean up the litter on the beach, by creating a low-cost robot which could identify, sort, and remove detritus. When he returned home, he applied the same logic to soft fruit farming.   

The robots he developed with the team at Dogtooth use machine learning models to deploy some of the skills that we as humans possess instinctively. Each of the robot's two arms has two colour cameras, much like eyes, which allow it to identify the ripeness of the berries and determine the distance of each target fruit from its "end effector", or gripping device.
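Those paired cameras work on the same principle as binocular vision: a nearby berry appears shifted between the left and right images, and the size of that shift reveals its distance. Here is a minimal sketch of the idea, with made-up camera parameters; Dogtooth's actual vision pipeline is not public.

```python
# Stereo depth estimation: depth is inversely proportional to the
# pixel shift (disparity) of a feature between two cameras.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Distance to a point seen by both cameras: z = f * B / d.

    focal_px: camera focal length in pixels
    baseline_m: distance between the two cameras in metres
    disparity_px: horizontal shift of the feature between images
    """
    if disparity_px <= 0:
        raise ValueError("point must be visible in both cameras")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, cameras 6 cm apart, 35 px disparity
print(depth_from_disparity(700.0, 0.06, 35.0))  # -> 1.2 metres
```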

The robots map the arrangement of ripe fruits on a plant and turn this into a sequence of actions, with precise route planning to guide the picker arm to each fruit's stem to make a cut.

Dogtooth's robot arms each have seven degrees of freedom, the same as a human arm, meaning these appendages can manoeuvre well enough to find the optimal angle for reaching each berry without damaging others still on the plant. The grasping device then gently grips the fruit by the stem, passing it into an inspection chamber before carefully placing the berry in a punnet for distribution. Another strawberry-picking system, created by Octinion, uses soft grippers to grasp the fruit as it transfers it from plant to basket.

Thousands of touch receptors and nerve endings help our hands distinguish between different textures and adjust our grip according to friction (Credit: Estudio Santa Rita)

While many of us would instinctively know how much force is required to handle a strawberry without squishing it, it has taken decades of research and development for robots to achieve the same dexterity. Robertson is keen to stress that his company's robots are not a replacement for human labourers, but that they could help to address the labour shortages facing many parts of the agricultural industry by allowing people and machines to harvest together.

Robots capable of handling some of the more delicate tasks currently carried out by humans could provide an important boost to many industrial sectors, says Pulkit Agrawal, associate professor in the department of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, in Cambridge, Massachusetts.  

"In US manufacturing alone, some estimates forecast [a] two million-plus labour shortage," says Agrawal, who is developing machines able to manipulate objects. "Whether it be industrial applications, search-and-rescue, space exploration or helping the aging population, the impact of AI-powered robotics is going to transformative – much more than ChatGPT in my opinion." 

During the course of a day, however, human hands undertake thousands of different tasks, adapting to handle a variety of shapes, sizes and materials. And robotics has some way to go to compete with that. One recent test of a robotic hand built with open-source components costing less than $5,000 (£4,000) found that it could be trained to reorientate objects in the air. However, when confronted with a challenging object – a rubber-duck-shaped toy – the robot still fumbled and dropped it around 56% of the time.

Prostheses that predict

Perhaps the ultimate application for robotic dexterity, however, is in prosthetics – taking the place of a human hand lost to accident or disease, for instance. The pioneering myoelectric prosthetic arm and hand that Sarah de Lagarde received gives some hints at what might be possible in the future.  

A collaboration between multiple software and hardware companies, her arm uses myoelectric pattern recognition, or neurological intent decoding, which is a form of machine learning that enables her hand to learn her movements and make predictions based on past behaviour. This means De Lagarde is able to move her hand more instinctively.  

"A piece of hardware embedded in Sarah's prosthetic arm records the muscle signals on the surface of her skin as she visualises that particular movement," says Blair Lock, chief executive at Coapt, the developer of the AI algorithm that drives De Lagarde's arm movements. This hardware decodes those muscle signals in order to guess what action De Lagarde intends to make with her hand. "The pattern recognition model can detect the intensity of the determined action, how fast, how hard. It is capable of actioning the commands in less than 25 milliseconds," adds Lock. 

De Lagarde likens the process to using a video game controller, in which you press a sequence of buttons to solicit a particular response from your on-screen avatar. At first, she found it difficult to multitask as all her thoughts went into twitching the right sequence of muscle fibres in her shoulder. But over time the AI algorithms became adept at predicting her intentions, meaning she can now multitask much more easily.

"I can instruct it to have a very light touch so I can pick up an egg without crushing it," says De Lagarde. "But at the same time I can intensify the grip and make it so much stronger that I could actually crush a can of coke."  

Artificial intelligence is also integrated into the app paired with her arm, which makes suggestions on how to use the arm more effectively, based upon previous use. Although a great improvement, the prosthesis will never be quite as good as De Lagarde's original arm, she says. It's heavy, sweaty in the summer months, and needs charging once a day. Plus, she still has some hurdles to overcome regarding its functionality.

The haptic feedback mechanisms in the prosthesis are still fairly rudimentary and De Lagarde mostly relies upon sight when handling objects. She periodically forgets she's holding something and releases her grip, dropping it on the floor. 

While we have made substantial progress in the last few years and human-like dexterity seems achievable, we are at least five years away, if not more – Pulkit Agrawal

Human hands, by comparison, use the networks of touch receptors on our fingers and palms to sense where something is, determine just how hard we need to grip it to pick it up, and detect if the friction starts to change.
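In control terms, that grip reflex resembles a simple feedback rule: if friction can no longer hold the load, tighten the grip. The sketch below uses illustrative numbers and a deliberately simplified friction model.

```python
# A simple grip-reflex loop: friction (friction_coeff * grip_force)
# must exceed the load pulling the object out of the hand, plus a
# safety margin, mimicking how we grip slightly harder than needed.

def adjust_grip(grip_force: float, load_force: float,
                friction_coeff: float, margin: float = 1.3) -> float:
    """Return an updated grip force in newtons."""
    required = margin * load_force / friction_coeff
    if grip_force < required:   # incipient slip detected: tighten
        return required
    return grip_force           # otherwise hold steady

# A slippery (wet) surface demands a harder grip for the same weight:
print(adjust_grip(5.0, 2.0, friction_coeff=0.8))  # -> 5.0 N is enough
print(adjust_grip(5.0, 2.0, friction_coeff=0.3))  # -> ~8.7 N needed
```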

Embodied AI is clearly leading to ever more dexterous robots and prostheses. For now, however, the technology still has some way to go before it matches, let alone supersedes, the incredible design of the human body. According to Agrawal, challenges remain in both the physical robotic hardware and the software. "While we have made substantial progress in the last few years and human-like dexterity seems achievable, we are at least five years away, if not more," he says.

Even as dexterity improves, there are other things to consider, says Du. "Safety is paramount," he notes. "This encompasses both physical safety, ensuring that robotic systems can operate without causing harm to human coworkers, and system safety, involving robust fail-safes and redundancies within AI algorithms to prevent malfunctions or unintended actions." Du also points out ethical considerations, such as the impact on jobs. 

For De Lagarde, improvements in the dexterity of robotic hands have brought back abilities she thought she had lost – simple tasks like pouring herself a glass of water, or giving her children a hug with two arms.

When I ask De Lagarde where she'd like to see the technology go, she envisions a future where robotic body augmentation isn't confined to those with limb difference or a disability, but could also help the elderly remain active in their later years, for example.

Although she may not have chosen to be an ambassador for embodied AI, De Lagarde's willingness to embrace the technology offers a glimpse of just what might be possible.
