Hemiparetic patients, such as stroke survivors and people with Parkinson's disease, multiple sclerosis, spinal cord injury, traumatic brain injury, cerebral palsy, poliomyelitis, motor impairments, weakness, spasticity, or arthritis, as well as elderly people, may have limited or no function in one of their hands and face challenges in completing simple everyday activities such as eating, dressing, and carrying a walker, along with other bimanual tasks. As a result, many patients rely on the assistance of family, friends, or staff at skilled nursing facilities, compromising their independence and quality of life. A wearable robot, namely Supernumerary Robotic Fingers, can be developed to augment the capabilities of the human hand so that a variety of prehensile, bimanual, and manipulation tasks can be performed single-handedly. Patients can use the fingers to recover their grasping abilities, providing an active compensatory tool in the initial phases of therapeutic recovery and rehabilitation that promotes use of the arm even when hand grasp function has not been recovered.
The two robotic fingers can be intuitively controlled by the patient using flex sensors attached to the fingers through a hand glove, together with an inertial measurement unit (IMU). These robotic fingers may provide chronic hemiparetic patients with the assistance they need to lead independent and productive lives. The overall proposed system can act both as a motivational tool and as an upper-limb rehabilitation aid involving grasping and arm mobility in task-oriented activities.
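As a minimal sketch of the glove-side sensing described above, the snippet below normalizes a raw flex-sensor reading into a bend fraction. The ADC resolution and the straight/fully-bent calibration values are illustrative assumptions, not the hardware values used in this work.

```python
# Hypothetical sketch: normalize a raw flex-sensor ADC reading from the
# glove into a [0, 1] bend fraction. ADC range and calibration limits
# below are assumed for illustration only.

ADC_MAX = 1023  # assumed 10-bit ADC

def normalize_flex(raw, straight=180, fully_bent=870):
    """Map a raw ADC reading to a bend fraction in [0, 1],
    clamped to the calibrated straight / fully-bent range."""
    span = fully_bent - straight
    bend = (raw - straight) / span
    return max(0.0, min(1.0, bend))

# Example: a reading halfway between the calibration limits
half_bend = normalize_flex(525)
```

Per-user calibration of the `straight` and `fully_bent` limits would be needed in practice, since resting finger posture varies widely across hemiparetic patients.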
We have developed a machine learning algorithm that detects hand gesture and orientation from the data collected from the flex sensors attached to the fingers and the IMU, and actuates the robotic fingers accordingly to accomplish the given task. Haptic sensors attached to the robotic fingers render haptic information, which is interfaced with vibrotactile motors attached to one of the patient's fingers to convey the force exerted by the device, allowing the patient to control it with hand gestures for different types of objects.
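The two pipelines above can be sketched as follows. A nearest-centroid rule stands in for the (unspecified) learned gesture classifier, and a linear clamp stands in for the force-to-vibration mapping; all feature names, centroid values, and force limits are assumptions for illustration, not the system's actual parameters.

```python
import math

# Hypothetical sketch of the two pipelines described above:
# (1) classify a hand gesture from normalized flex-sensor and IMU features;
# (2) map a measured grip force to a vibrotactile motor intensity.

# Assumed feature order: [flex_index, flex_middle, imu_pitch_rad, imu_roll_rad]
GESTURE_CENTROIDS = {          # assumed per-user calibration centroids
    "open_hand":   [0.10, 0.10, 0.00, 0.00],
    "power_grasp": [0.90, 0.90, 0.20, 0.10],
    "pinch":       [0.80, 0.20, 0.05, 0.00],
}

def classify_gesture(features):
    """Return the label of the nearest centroid (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(GESTURE_CENTROIDS, key=lambda g: dist(features, GESTURE_CENTROIDS[g]))

def vibration_duty_cycle(force_n, max_force_n=10.0):
    """Map a haptic-sensor force reading (newtons) to a PWM duty cycle
    in [0, 1] for the vibrotactile motor (linear, clamped)."""
    return max(0.0, min(1.0, force_n / max_force_n))

# Example: a strongly flexed sample is recognized as a power grasp,
# and a 5 N contact force drives the motor at half intensity.
gesture = classify_gesture([0.85, 0.92, 0.15, 0.05])
duty = vibration_duty_cycle(5.0)
```

A learned classifier (e.g. one trained per patient on recorded glove data) would replace the fixed centroids; the nearest-centroid rule is used here only to keep the sketch self-contained.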