On this website we study the ongoing developments in prosthetic arms with sensory feedback.
The entire process starts in the brain. When a person wants to move their prosthetic hand, the brain sends signals along the nerves to the muscles of the residual limb. Sensors fitted on those muscles detect the contractions, and the prosthesis decodes these signals into movement. As a result, the person can move the hand, pick up and hold different objects, and carry out everyday tasks.
On the fingertips are sensors. So far only pressure sensors have been developed, but in an interview we conducted with the program director of the Revolutionizing Prosthetics program, we were told that the team plans to develop heat, texture and vibration sensors as well. As the person interacts with an object, the pressure sensors detect how much pressure is being applied to it. This information is encoded and sent back through the nervous system to the somatosensory cortex. As a result, the person knows how much pressure is being put on the object and can adjust the grip accordingly.
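The feedback loop just described can be sketched in a few lines. This is an illustrative model only, not any program's actual code: the target pressure, step size, and toy sensor model are assumptions chosen to show how sensed pressure lets the grip be adjusted.

```python
# Illustrative sketch of the closed loop: a fingertip pressure sensor is
# read, and the user (modelled here as a simple controller) tightens or
# loosens the grip toward a safe holding pressure. Units are arbitrary.

TARGET_PRESSURE = 5.0   # assumed safe holding pressure (assumption)
STEP = 0.5              # grip adjustment per feedback cycle (assumption)

def adjust_grip(current_grip: float, sensed_pressure: float) -> float:
    """Tighten the grip if the object is held too loosely, ease off if too hard."""
    if sensed_pressure < TARGET_PRESSURE:
        return current_grip + STEP   # object might slip: squeeze harder
    if sensed_pressure > TARGET_PRESSURE:
        return current_grip - STEP   # risk of crushing: ease off
    return current_grip

grip = 0.0
for _ in range(20):
    # Toy sensor model: sensed pressure grows with grip effort.
    pressure = 1.2 * grip
    grip = adjust_grip(grip, pressure)
print(round(grip, 2))
```

After a few cycles the grip settles near the target pressure, which is the same stabilizing effect the fingertip sensors give the user.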
There are many different programs developing prosthetics with sensory feedback. As mentioned before, there is the Johns Hopkins University Applied Physics Lab's Revolutionizing Prosthetics program, which is funded by DARPA. There are also the Terminator Arm, the Swiss research team's arm and the Case Western Reserve University arm. One arm that has received a great deal of media attention is the DEKA Arm, which is also funded by DARPA. All of the arms above are more advanced than the DEKA Arm: they are more heavily sensorized and have more degrees of freedom.
DARPA has recently started a new program called Hand Proprioception and Touch Interfaces (HAPTIX). This program builds on the developments of the Revolutionizing Prosthetics program and the Reliable Neural-Interface Technology (RE-NET) program, and it focuses specifically on developing prosthetics with sensory feedback.
The DEKA Arm
The DEKA Arm made its debut at the Johns Hopkins University Applied Physics Lab (JHU APL) in Laurel, Maryland, in December 2010, and should be commercially available in the very near future. The overall goal of the prosthetics movement is to attain “near-natural control of a prosthesis with sensory feedback provided directly to the brain,” and the DEKA Arm brings us one step closer. Until recently, prosthetic control has been very non-intuitive: patients had to contract muscles in parts of the body other than the arm to control a given function, imposing an excessive cognitive burden.
How it works
The DEKA Arm comes in three models depending on the level of amputation: the radial model, the humeral model and the shoulder model. The prosthesis comes with six pre-programmed hand-grip gestures. The user interface that provides sensory feedback is non-invasive: feedback is delivered through a tactor, a small vibrating motor positioned directly against the user's skin. A sensor on the hand, connected to a microprocessor, sends a signal to the tactor that changes according to grip strength. If the grip is light, the tactor vibrates gently; as the grip strength increases, the frequency of the vibration increases. This feedback allows the user to gauge and control the grip of the prosthetic arm.
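The grip-to-vibration mapping can be sketched as a simple interpolation. The force and frequency ranges below are illustrative assumptions, not DEKA specifications; the point is only the shape of the mapping: light grip, slow vibration; tight grip, fast vibration.

```python
# Hypothetical sketch of the tactor feedback mapping: a grip-force reading
# from the hand sensor is converted to a tactor vibration frequency.
# All numeric ranges are assumptions for illustration.

MIN_FORCE_N = 0.0     # no grip
MAX_FORCE_N = 20.0    # assumed full-strength grip (newtons)
MIN_FREQ_HZ = 10.0    # gentle vibration for a light grip
MAX_FREQ_HZ = 250.0   # strong vibration for a tight grip

def tactor_frequency(force_n: float) -> float:
    """Map a grip-force reading (N) to a tactor vibration frequency (Hz)."""
    # Clamp the sensor reading to the supported range.
    force_n = max(MIN_FORCE_N, min(MAX_FORCE_N, force_n))
    # Linear interpolation between the two frequency extremes.
    fraction = (force_n - MIN_FORCE_N) / (MAX_FORCE_N - MIN_FORCE_N)
    return MIN_FREQ_HZ + fraction * (MAX_FREQ_HZ - MIN_FREQ_HZ)

print(tactor_frequency(0.0))   # light grip
print(tactor_frequency(10.0))  # medium grip
print(tactor_frequency(20.0))  # full grip
```

A real device might use a nonlinear mapping tuned to how sensitively skin perceives vibration, but the linear version captures the principle the paragraph describes.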