Developed and led by scientists at Newcastle University, and funded by the Engineering and Physical Sciences Research Council (EPSRC), the technology has already been trialled by a small number of amputees.
Using a camera and a computer, the bionic hand takes a picture of the object in front of it, assesses its shape and size, and triggers the correct series of movements to pick it up.
This bypasses the usual process, in which the user must see the object and consciously stimulate the muscles to move the arm; instead, the hand ‘sees’ and reacts in one fluid movement.
Dr Kianoush Nazarpour, a Senior Lecturer in biomedical engineering at the university, said: “The bionic hand can respond automatically.”
This intuitive hand could pave the way for a new generation of prosthetic limbs, giving wearers the ability to grip objects without conscious effort, researchers said.
Dr Nazarpour commented: “Prosthetic limbs have changed very little in the past 100 years. Responsiveness has been one of the main barriers to artificial limbs.”
He continued: “For many amputees the reference point is their healthy arm or leg, so prosthetics seem slow and cumbersome in comparison. Now, for the first time in a century, we have developed an ‘intuitive’ hand that can react without thinking.”
The university has now teamed up with experts at Newcastle upon Tyne Hospitals NHS Foundation Trust and aims to offer the ‘hand with eyes’ to some patients at Newcastle’s Freeman Hospital.
The hand has been programmed by the team (whose work is reported in the Journal of Neural Engineering) to react within milliseconds and perform four different ‘grasps’.
Current prosthetic hands are controlled via myoelectric signals, the electrical activity of the muscles recorded from the skin surface of the stump; Dr Nazarpour says this takes time, concentration and practice.
The new bionic hand instead uses neural networks, the basis of artificial intelligence. The researchers showed the computer numerous object images and taught it to recognise the ‘grip’ needed for different objects.
Lead author Ghazal Ghazaei explained: “The computer isn’t just matching an image, it’s learning to recognise objects and group them according to the grasp type the hand has to perform to successfully pick it up.”
It groups objects by size, shape and orientation, according to the type of grasp that would be needed to pick them up. The team programmed the hand to perform four different ‘grasps’: palm wrist neutral (such as when you pick up a cup); palm wrist pronated (such as picking up the TV remote); tripod (thumb and two fingers) and pinch (thumb and first finger).
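To make the grouping idea concrete, here is a toy sketch of the final selection step. The real system uses a neural network trained on object images; this stand-in simply maps hand-picked object features (size and orientation) to the four grasp types the hand performs. The function name, features and thresholds are illustrative, not the team’s actual model.

```python
# Toy stand-in for the learned grasp classifier described in the article.
# The real system classifies camera images with a neural network; here we
# map coarse, hand-picked object geometry to the four programmed grasps.

GRASPS = ("palm wrist neutral", "palm wrist pronated", "tripod", "pinch")

def choose_grasp(width_cm, height_cm, lies_flat):
    """Pick one of the four grasps from coarse object geometry.

    All thresholds below are illustrative assumptions.
    """
    if lies_flat:                        # e.g. a TV remote on a table
        return "palm wrist pronated"
    if width_cm < 2 and height_cm < 2:   # tiny object, e.g. a coin
        return "pinch"
    if width_cm < 5:                     # small upright object, e.g. a pen
        return "tripod"
    return "palm wrist neutral"          # larger object, e.g. a cup

print(choose_grasp(8, 10, False))  # cup-sized upright object
```

In the actual device this decision is made in milliseconds by the trained network, directly from the camera image, with no conscious input from the wearer.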
The work is part of a larger research project to develop a bionic hand that can sense pressure and temperature and transmit the information back to the brain, so watch this space.
Dr Nazarpour added: “It’s a stepping stone towards our ultimate goal. But importantly, it’s cheap and it can be implemented soon because it doesn’t require new prosthetics – we can just adapt the ones we have.”