Biomedical engineering researchers from North Carolina State University and the University of North Carolina at Chapel Hill have developed a new approach to prosthetics that uses neuromuscular signals to control powered prosthetic wrists and hands for amputees. The new prosthetics are driven by computer models that mimic the behavior of the natural hand and wrist.
Current state-of-the-art powered prosthetics rely on pattern recognition for control: users must manually teach their prosthetics to recognize patterns in their muscle activity and translate them into commands. This training process is time-consuming and tedious.
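For context, the sketch below illustrates how conventional pattern-recognition control typically works: windowed EMG is reduced to simple time-domain features, and a classifier maps each window to a discrete motion class. The feature set and the LDA classifier are common choices in the EMG literature rather than details taken from the devices described here, and all function and label names are illustrative.

```python
# Minimal sketch of conventional EMG pattern-recognition control.
# Assumed details: time-domain features and an LDA classifier, both
# common in the EMG literature; actual devices may differ.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def extract_features(window: np.ndarray) -> np.ndarray:
    """Standard time-domain EMG features, computed per channel.

    window: (n_samples, n_channels) segment of raw EMG.
    """
    mav = np.mean(np.abs(window), axis=0)                      # mean absolute value
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)       # waveform length
    zc = np.sum(np.diff(np.signbit(window), axis=0), axis=0)   # zero crossings
    return np.concatenate([mav, wl, zc])

def train_controller(windows, labels):
    """Fit a classifier on user-recorded, labelled EMG windows.

    This is the step users must repeat for each posture, sweat
    condition, and so on.
    """
    X = np.array([extract_features(w) for w in windows])
    clf = LinearDiscriminantAnalysis()
    clf.fit(X, labels)  # labels: e.g. "hand_open", "wrist_flex", "rest"
    return clf

def decode_command(clf, window: np.ndarray) -> str:
    """Map a new EMG window to a discrete prosthesis command."""
    return clf.predict(extract_features(window).reshape(1, -1))[0]
```

Because the classifier only recognizes the specific signal patterns it was trained on, any change in posture or skin condition shifts those patterns and degrades performance, which is the limitation Huang describes below.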
The researchers wanted to eliminate this laborious training process and instead base the new prosthetics on what is already known about the human body. For example, even after a limb is amputated, the brain remains wired as if the limb were still there and continues to send out neuromuscular signals.
"We wanted to focus on what we already know about the human body," said He Huang, a professor in the joint biomedical engineering program at North Carolina State University and the University of North Carolina at Chapel Hill and lead researcher. "This is not only more intuitive for users, it is also more reliable and practical. That's because every time you change your posture, your neuromuscular signals for generating the same hand/wrist motion change. So relying solely on machine learning means teaching the device to do the same thing multiple times; once for each different posture, once for when you are sweaty versus when you are not, and so on. Our approach bypasses most of that."
The team developed a user-generic musculoskeletal model. To build it, the researchers placed electromyography (EMG) sensors on the forearms of six able-bodied people and recorded the neuromuscular signals their bodies sent during a variety of hand and wrist actions. This data was then used to create a generic model that translates neuromuscular signals into commands that move the prosthetic.
"We use sensors to pick up the neurological signals and then convey that data to a computer, where it is fed into a virtual musculoskeletal model. The model takes the place of the muscles, joints and bones, calculating the movements that would take place if the hand and wrist were still whole. It then conveys that data to the prosthetic wrist and hand, which perform the relevant movements in a coordinated way and in real time — more closely resembling fluid, natural motion. By incorporating our knowledge of the biological processes behind generating movement, we were able to produce a novel neural interface for prosthetics that is generic to multiple users, including an amputee in this study, and is reliable across different arm postures," Huang said.
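To make the contrast concrete, here is a schematic sketch of model-based control, assuming a drastically simplified virtual wrist: one hinge joint driven by a flexor/extensor pair. The structure (EMG, then muscle activation, then forward dynamics, then a continuous joint command) follows the pipeline Huang describes, but the model itself, along with every gain and time constant, is an illustrative assumption rather than the published model.

```python
# Schematic sketch of model-based prosthesis control: EMG drives a
# virtual musculoskeletal model instead of a classifier. The one-joint,
# two-muscle model and all constants below are illustrative assumptions,
# not the model published by the researchers.
import numpy as np

DT = 0.001          # 1 kHz control loop (assumed)
TAU_ACT = 0.05      # muscle activation time constant, s (assumed)
MAX_TORQUE = 2.0    # peak torque per muscle, N*m (assumed)
INERTIA = 0.01      # hand/wrist inertia about the joint, kg*m^2 (assumed)
DAMPING = 0.05      # passive joint damping, N*m*s/rad (assumed)

class VirtualWrist:
    """A flexor/extensor pair driving a single virtual wrist joint."""

    def __init__(self):
        self.act = np.zeros(2)  # [flexor, extensor] activations in [0, 1]
        self.angle = 0.0        # joint angle, rad
        self.vel = 0.0          # joint angular velocity, rad/s

    def step(self, emg: np.ndarray) -> float:
        """Advance the model one tick; emg holds the two muscles'
        rectified, normalized signal amplitudes."""
        # 1. Activation dynamics: first-order lag on the excitation,
        #    a common simplification of how muscle activation builds up.
        excitation = np.clip(np.abs(emg), 0.0, 1.0)
        self.act += DT / TAU_ACT * (excitation - self.act)

        # 2. Muscle model: activations -> net torque about the joint.
        torque = MAX_TORQUE * (self.act[0] - self.act[1])

        # 3. Forward dynamics: integrate the joint's equation of motion,
        #    standing in for the "muscles, joints and bones".
        acc = (torque - DAMPING * self.vel) / INERTIA
        self.vel += acc * DT
        self.angle += self.vel * DT

        # 4. Stream the angle to the prosthesis as a continuous
        #    position command.
        return self.angle
```

Unlike the classifier above, a model like this outputs a continuous joint trajectory rather than a discrete motion class, which is what lets the prosthetic hand and wrist move in the coordinated, fluid way Huang describes, and why the same model can serve multiple users without per-posture retraining.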
To test the new system, the team fitted able-bodied and amputee volunteers with the new interface and had them attempt hand and wrist movements with little to no prior training. So far, the testing has been successful.
The new model can be used for more than prosthetics. The researchers say it could also drive computer interface devices, object manipulation in computer games and CAD programs, and virtual reality applications.
The researchers want to make it clear that this technology is still years away from being widely available and that they have not yet estimated its cost. The new prosthetics have not reached clinical trials, but the team hopes to get there soon.
The paper on this research was published in IEEE Transactions on Neural Systems and Rehabilitation Engineering.