DARPA Bionics

Neural Interface for Prosthetics
1. Goals and Impact

ORBAI’s goal is to provide a proof-of-concept third-wave artificial intelligence method that improves and extends the application of next-generation neurotechnology. This technology will benefit prosthetic users by giving their prosthetics enhanced capabilities. Additionally, ORBAI’s technology could be further developed to enhance robotics and powered exoskeletons.

Our innovative solution will provide DARPA with a best-in-class tool to assist military personnel who have suffered limb loss from severe injury. The product will enhance the neurological interface between user and prosthetic, allowing the user to regain the functionality they lost. ORBAI’s revolutionary technology will help DARPA provide defense members, and others, with a tool that lets them live with their prosthetic with the most streamlined user interface and physical assistance possible.

2. Technical Plan

In this application of controlling prosthetics, the control signals come from sensors in the prosthetic's socket, which detect electrical activity from nerves and muscles in the residual limb. These signals are already in the same spatio-temporal format as our spiking neural nets, which are modeled on the human nervous system. The inputs enter one end of BICHNN and emerge from the other as commands that drive the hand's actuators, from simple hands with a single whole-hand grasping motor to precise robotic hands with multiple actuators for each finger and the wrist. The prosthetics controller may be pre-trained on motion capture of a real human hand, providing fine movement control that augments the coarse movement commands coming from the electrodes in the residual limb.
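The control pipeline described above could be sketched roughly as follows. This is a minimal illustrative model, not ORBAI's actual BICHNN: the function names, the weighted-sum decoder standing in for the spiking net, and the linear blend with a motion-capture prior are all assumptions made for the sketch.

```python
# Sketch: socket-electrode activity -> decoded coarse command ->
# refinement with a pre-trained motion-capture prior. All numbers
# and the linear models are illustrative assumptions.

def decode_coarse_command(spike_counts, weights):
    """Map per-electrode spike counts to one coarse actuator command
    via a weighted sum (a stand-in for the spiking neural net)."""
    return sum(c * w for c, w in zip(spike_counts, weights))

def refine_with_mocap_prior(coarse, prior, alpha=0.3):
    """Blend the coarse command with a pre-trained motion-capture
    prior so the fine movement stays smooth."""
    return (1 - alpha) * coarse + alpha * prior

# Example: three electrodes in the socket driving one wrist actuator.
spikes = [4, 0, 2]          # spike counts in the last time window
weights = [0.5, 0.2, 0.1]   # learned electrode-to-actuator weights
coarse = decode_coarse_command(spikes, weights)
command = refine_with_mocap_prior(coarse, prior=2.0)
print(round(command, 3))
```

In a real system the decoder would be the trained spiking network itself and the prior would come from a full motion-capture model of hand kinematics, but the data flow is the same: coarse intent from the electrodes, fine detail from the pre-trained prior.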

3. Military Applications

Additionally, our technology could be applied to future robotics and powered exoskeletons, and advanced further to enhance bionics, brain-machine interfaces, and future DARPA capabilities. ORBAI's BICHNN could map a pilot's thoughts and mental commands to the robot control interface: over several sessions, the pilot watches the robot move and mimics how they would move it. BICHNN trains on the signals at both ends and quickly converges on a precise mapping, so the pilot can control the robot as effortlessly as moving their own body. Again, the movement controller would be pre-trained on a human subject, perhaps a martial artist, so that the coarse movement commands from the EEG input are augmented by this motion capture to move smoothly and precisely with perfect balance. And no combat robot exoskeleton would be complete without a Ninja spirit inside to guide it.
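The watch-and-mimic calibration idea can be illustrated with a toy fit. Here a one-dimensional least-squares regression, trained by gradient descent on paired (EEG feature, observed robot motion) samples, stands in for BICHNN's bidirectional training; all data, names, and the linear model are assumptions for the sketch.

```python
# Sketch: fit a mapping from EEG features to robot motion using
# pairs recorded during mimicry sessions. A linear model trained by
# gradient descent stands in for BICHNN; data are made up.

def fit_mapping(eeg, motion, lr=0.01, steps=2000):
    """Fit motion ~ w * eeg + b by gradient descent on squared error."""
    w, b = 0.0, 0.0
    n = len(eeg)
    for _ in range(steps):
        grad_w = sum((w * x + b - y) * x for x, y in zip(eeg, motion)) / n
        grad_b = sum((w * x + b - y) for x, y in zip(eeg, motion)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Paired samples from a watch-and-mimic session (illustrative numbers
# where the true relationship is motion = eeg + 0.5).
eeg_features = [0.0, 1.0, 2.0, 3.0]
robot_motion = [0.5, 1.5, 2.5, 3.5]
w, b = fit_mapping(eeg_features, robot_motion)
print(round(w, 2), round(b, 2))   # converges toward w ~ 1.0, b ~ 0.5
```

The point of the sketch is the convergence behavior: with signals recorded at both ends, even a simple fit quickly locks onto the mapping, which is the property the proposal relies on for pilot calibration.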