This device, created in the United Kingdom, uses computer vision and is ten times faster than current prosthetics.

The prosthetic hand developed by these researchers.

Amputation could be less of a handicap in the future thanks to a new generation of prosthetics that let users reach for objects without having to think about it, just as a real hand would.

Until now, the main advances in this field have consisted of artificially connecting the brain, by means of electrodes or cables, to the impaired parts of the body. However, a group of biomedical engineers at the University of Newcastle, UK, has come up with a better idea: a bionic hand equipped with an optical sensor that photographs the objects in front of it, evaluates their shape and size, and from there decides how it should act.

As Kianoush Nazarpour, professor of Biomedical Engineering at this university in the north of England, explains: “Prosthetic limbs have changed very little in the last 100 years. The design is much better and the materials are lighter and more durable, but they still work the same way.”

These new prostheses spare users from having to stare at an object in order to electrically stimulate the arm muscles and trigger a movement. Instead, the hand itself sees the object and reacts instantly and fluidly.

Some amputees from the Freeman Hospital in Newcastle have already been able to test this system, the results of which appeared this week in the Journal of Neural Engineering.

“Like a real hand, the user can reach out and pick up a cup or a cookie with just a quick glance in the right direction,” explains Nazarpour. “Responsiveness has been one of the main obstacles for artificial limbs.” This is because many amputees still have their healthy arm as a point of reference, so prostheses will always seem slow and cumbersome in comparison.

All-seeing hand

Prosthetic hands currently in use demand concentration and a great deal of practice before they can be used effectively. This learning period has been shortened through the use of neural networks – the technology on which artificial intelligence is based – to teach the prosthetic hand to recognize different objects.

How it works

“We show the computer an image of, for example, a stick,” explains Ghazal Ghazaei, a doctoral student at the School of Electrical and Electronic Engineering and co-author of the study. “But not just one image: many images of the same stick from different angles and orientations, even under different lighting and against different backgrounds. Eventually the computer learns what it needs to grab that stick,” she explains.

Thus, in addition to the sticks, they gathered other objects to teach the hand four basic grips: picking up a cup, picking up a remote control, pinching with thumb and forefinger, and finally picking something up with the thumb and two fingers, which these engineers call a “tripod.”
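The training idea described above – showing the system many views of each object, each labelled with one of four grips – can be sketched in miniature. This is a toy illustration only, not the authors’ actual model (theirs was a deep neural network trained on real photographs): the four grip labels follow the article, while the synthetic images and the nearest-centroid classifier are invented for the sketch.

```python
import numpy as np

# Toy stand-in for the study's vision training: each "object" appears as
# many perturbed 8x8 grayscale views (simulating different angles,
# lighting and backgrounds), and a nearest-centroid classifier learns to
# map views to one of the four grips named in the article.

GRIPS = ["cup", "remote", "pinch", "tripod"]
rng = np.random.default_rng(0)

# One synthetic prototype image per grip class (invented data).
prototypes = {g: rng.random((8, 8)) for g in GRIPS}

def make_views(base, n=20, noise=0.1):
    """Simulate n views of one object: the base image plus random noise."""
    return base + noise * rng.standard_normal((n,) + base.shape)

# Build the training set: many views per class, flattened to vectors.
X, y = [], []
for label, base in prototypes.items():
    for view in make_views(base):
        X.append(view.ravel())
        y.append(label)
X = np.array(X)

# "Training" here just stores the mean feature vector of each class.
centroids = {g: X[np.array(y) == g].mean(axis=0) for g in GRIPS}

def classify(image):
    """Return the grip whose class centroid is nearest to the image."""
    v = image.ravel()
    return min(GRIPS, key=lambda g: float(np.linalg.norm(v - centroids[g])))
```

A new, previously unseen view of a known object should land on the right grip, which is the point of training on many varied views per object.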

When the camera sees an object, it chooses the best of those four ways to grasp it and, in a matter of milliseconds, sends a signal to the hand to execute the grip.
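That per-frame loop – classify the camera image, then command the hand – can be sketched as follows. All names here (`classify_grip`, `HandController`, `on_frame`) are hypothetical stand-ins for illustration, not the Newcastle team’s actual software interface.

```python
import time

GRIPS = ("cup", "remote", "pinch", "tripod")

def classify_grip(frame):
    """Stand-in for the trained vision model: score each of the four
    grips for the current camera frame and return the best one."""
    scores = dict.fromkeys(GRIPS, 0.0)
    scores["pinch"] = 1.0  # dummy: pretend the model saw a small object
    return max(scores, key=scores.get)

class HandController:
    """Minimal stand-in for the hardware that forms the chosen grip."""
    def __init__(self):
        self.last_command = None

    def execute(self, grip):
        self.last_command = grip  # a real controller would drive the motors

def on_frame(frame, hand):
    """Per-frame pipeline: classify, then command the hand.
    Returns the decision latency in milliseconds."""
    t0 = time.perf_counter()
    grip = classify_grip(frame)  # pick the best of the four grips
    hand.execute(grip)           # signal the hand to form that grip
    return (time.perf_counter() - t0) * 1000.0
```

The key design point the article highlights is latency: because the decision is a single forward pass over one frame, the whole loop can run in milliseconds, which is what makes the hand feel responsive.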

However, its authors stress that this hand is not a finished product, but a first step toward a fully connected bionic hand, capable of detecting pressure and temperature and sending that information to the brain.