Researchers have created an armband that combines wearable biosensors with artificial intelligence for better prosthetic control
Back in May of last year, I wrote about the efforts of a team of scientists from the Chalmers University of Technology, Sahlgrenska University Hospital, University of Gothenburg, Integrum AB, Medical University of Vienna, and MIT (Center for Extreme Bionics) to develop a groundbreaking invention: a mind-controlled arm prosthesis, one of the world’s most integrated interfaces between human and machine.
Engineers at the University of California, Berkeley have now created a new device that combines wearable biosensors with artificial intelligence software to recognize which hand gesture a person intends to make, based on the patterns of electrical signals in the forearm. The invention could pave the way for better, more seamless interaction: typing on a computer without a keyboard, playing a video game without a controller, or even driving a car without a steering wheel.
According to the study, the hand gesture recognition system is built around a flexible armband that reads electrical signals at 64 different points on the forearm. These signals are fed into an electronic chip programmed with an advanced AI technique called a hyperdimensional computing algorithm, which is capable of updating itself with new information.
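The study does not spell out the chip's internals, but the general idea of hyperdimensional computing is to turn each multi-channel reading into a single very long, random-looking vector. A minimal sketch of how a 64-channel sample might be encoded is below; the dimensionality (10,000), the number of amplitude levels (16), and the bind-and-bundle scheme are common choices in the hyperdimensional-computing literature, not details from this study.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 64   # electrode sites on the forearm band (from the article)
DIM = 10_000      # hypervector dimensionality (assumed, a common HDC choice)
N_LEVELS = 16     # quantization levels for signal amplitude (assumed)

# A fixed random bipolar "identity" hypervector for each electrode channel,
# and one hypervector per quantized amplitude level.
channel_hvs = rng.choice([-1, 1], size=(N_CHANNELS, DIM))
level_hvs = rng.choice([-1, 1], size=(N_LEVELS, DIM))

def encode(sample):
    """Encode one 64-channel sample (values in [0, 1]) into a hypervector.

    Each channel's amplitude is quantized to a level; the channel and level
    hypervectors are bound (element-wise product); all channels are bundled
    (summed) and binarized back to a bipolar vector.
    """
    levels = np.clip((sample * N_LEVELS).astype(int), 0, N_LEVELS - 1)
    bound = channel_hvs * level_hvs[levels]   # bind channel with its level
    bundled = bound.sum(axis=0)               # bundle across all channels
    return np.where(bundled >= 0, 1, -1)      # binarize

sample = rng.random(N_CHANNELS)               # stand-in for normalized EMG
hv = encode(sample)
```

The appeal of this representation is that similar readings map to similar hypervectors, while comparisons and updates reduce to cheap element-wise arithmetic, which is what makes it practical to run entirely on a small local chip.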
“Reading hand gestures is one way of improving human-computer interaction. And, while there are other ways of doing that, by, for instance, using cameras and computer vision, this is a good solution that also maintains an individual’s privacy.”
~ Ali Moin, Lead Designer & First Co-Author
Hand muscles move in response to electrical signals sent from neurons in the brain to the muscle fibers, and it is this electrical activity that the electrodes in the cuff sense. In the trials conducted, the researchers successfully taught the algorithm to recognize 21 individual hand gestures.
The AI software algorithm “learns” how electrical signals in the arm correspond to individual hand gestures: the user wears the cuff and makes each gesture in turn. When the user later makes one of these gestures, or even thinks about doing so, the system determines which one it is by matching the distinct signal pattern against the ones it has already learned.
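In hyperdimensional-computing terms, this learn-then-match loop is typically very simple: "learning" a gesture means bundling its encoded examples into a prototype vector, and "matching" means finding the most similar prototype. The sketch below illustrates that loop end to end with synthetic 64-channel signals; the encoder, the signal model, and all parameters are illustrative assumptions, not the study's actual pipeline (only the 64 channels and 21 gestures come from the article).

```python
import numpy as np

rng = np.random.default_rng(0)
N_CHANNELS, DIM, N_LEVELS, N_GESTURES = 64, 10_000, 16, 21

channel_hvs = rng.choice([-1, 1], size=(N_CHANNELS, DIM))
level_hvs = rng.choice([-1, 1], size=(N_LEVELS, DIM))

def encode(sample):
    """Quantize, bind channel and level vectors, bundle, binarize."""
    levels = np.clip((sample * N_LEVELS).astype(int), 0, N_LEVELS - 1)
    bundled = (channel_hvs * level_hvs[levels]).sum(axis=0)
    return np.sign(bundled + 0.5)   # +0.5 breaks ties, output is +/-1

# "Learning" a gesture = accumulating its encoded examples in a prototype.
prototypes = np.zeros((N_GESTURES, DIM))

def learn(gesture_id, sample):
    prototypes[gesture_id] += encode(sample)

def recognize(sample):
    """Return the gesture whose prototype best matches the encoded sample."""
    sims = prototypes @ encode(sample)   # dot product as similarity
    return int(np.argmax(sims))

# Synthetic stand-in signals: one random template per gesture, plus noise.
templates = rng.random((N_GESTURES, N_CHANNELS))

def noisy(g):
    return np.clip(templates[g] + rng.normal(0, 0.01, N_CHANNELS), 0, 1)

for g in range(N_GESTURES):       # training: 5 examples per gesture
    for _ in range(5):
        learn(g, noisy(g))

correct = sum(recognize(noisy(g)) == g for g in range(N_GESTURES))
```

Because both training and inference are just vector additions and dot products, this kind of classifier can run, and keep learning, on a small embedded chip, which fits the article's point about all computation staying local.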
A major advantage of the new system is its security. Because all the computation is done locally on the chip, no personal data is transferred to an external device or a nearby computer. This not only keeps personal biological data private but also speeds up computation considerably.
The AI algorithm can also compensate for new variables it encounters, such as sweat on the forearm or the arm being lifted into an unusual position (e.g. above the head). Although similar technology exists elsewhere, this device’s uniqueness comes from integrating biosensing, signal processing and interpretation, and artificial intelligence into a single system.
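This kind of adaptation is natural for an algorithm that can update itself with new information: a gesture's stored prototype can simply absorb a few examples recorded under the new conditions. The toy sketch below models that idea abstractly; the drift pattern, flip rates, and vector sizes are all invented for illustration and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(2)
DIM = 10_000

# A gesture prototype learned under normal conditions (a random bipolar
# vector stands in for the real encoded muscle signal).
normal_hv = rng.choice([-1, 1], size=DIM).astype(float)
prototype = normal_hv * 5.0            # bundle of 5 "normal" examples

# Model a systematic change in conditions (e.g. a sweaty forearm) as a
# fixed pattern that flips 30% of the encoded components, plus 2% jitter.
sweat_mask = rng.random(DIM) < 0.3

def sweaty_example(jitter_frac=0.02):
    hv = np.where(sweat_mask, -normal_hv, normal_hv)
    jitter = rng.random(DIM) < jitter_frac
    return np.where(jitter, -hv, hv)

def similarity(proto, hv):
    return float(proto @ hv) / (np.linalg.norm(proto) * np.linalg.norm(hv))

test_hv = sweaty_example()
before = similarity(prototype, test_hv)

# Online update: bundle a few sweaty-condition examples into the SAME
# prototype, so one stored memory now covers both conditions.
for _ in range(5):
    prototype += sweaty_example()

after = similarity(prototype, test_hv)
```

After the update, the prototype matches sweaty-condition signals much more closely while still containing its original examples, which is one plausible mechanism behind the adaptability the researchers describe.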
Researchers hope to further refine the device into a viable commercial product, for applications such as gesture control of electronic devices, virtual reality, or even the operation of prosthetic hands.
The complete research was published in the journal Nature Electronics.