When the research moved to humans, a user's ability to directly observe neural decoder outputs in the form of a moving cursor or robotic arm proved critical. That visual feedback allowed the user's brain to adapt, essentially altering its own function to help the neural decoder achieve the task. Subsequent development of more advanced decoders opened the way for iterative co-adaptation between the system's algorithms and the user's neural activity, further accelerating users' mastery of mind-based motion control. Today, researchers expect that the added ability to convey near-natural touch sensations will further improve feedback-driven learning.
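The co-adaptation idea can be illustrated with a toy model, a minimal sketch in which a linear decoder maps simulated neural activity to a 2-D movement command and updates its weights from the visible error, much as visual feedback lets decoder and brain converge together. All dimensions, rates, and noise levels here are invented for illustration, not drawn from any actual BCI system.

```python
import numpy as np

rng = np.random.default_rng(0)

n_channels = 16
# Hypothetical fixed "encoding": how the user's 2-D intent drives firing rates.
encoding = rng.normal(scale=0.25, size=(n_channels, 2))
W = np.zeros((2, n_channels))        # decoder weights, initially naive
intent = np.array([1.0, -0.5])       # the user's intended 2-D motion
lr = 0.05                            # decoder learning rate

for _ in range(300):
    # Noisy neural activity evoked by the intent.
    neural = encoding @ intent + 0.05 * rng.normal(size=n_channels)
    decoded = W @ neural
    error = intent - decoded         # feedback: where the cursor went wrong
    W += lr * np.outer(error, neural)  # decoder adapts toward the intent

# After training, decoding the noise-free activity recovers the intent.
print(np.allclose(W @ (encoding @ intent), intent, atol=0.1))
```

In a real system the user's own neural encoding also shifts over sessions; this sketch adapts only the decoder side of that loop.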
By the early 2000s, DARPA began investing heavily in neurotechnology. The agency established the Brain-Machine Interfaces program to record patterns of neural activity in animal models and decode the neural states associated with memory formation, sensory perception, and motor intent.
During the in-human studies conducted under the HAPTIX and Revolutionizing Prosthetics programs, study participants were so highly engaged with the work that they effectively became part of the research team. That dynamic made it possible to tailor the system’s performance to the needs and wants of the participant. For instance, Jan, a woman living with quadriplegia, quickly achieved the goal of feeding herself a chocolate bar with a prosthetic arm controlled by a direct interface to her central nervous system. Jan next determined to project herself beyond the confines of her wheelchair and to enter a simulated cockpit. Despite being unable to move from the neck down, Jan used her neural interface to fly a virtual aircraft simply by looking at the plane on a monitor and visualizing it moving one direction or another.
“I could raise the nose of the plane up and down. Then I could bank it right or left,” Jan explained. “I was lost so quickly in that world because I was up in the clouds, and I was flying. And I was out of my chair. I was out of my broken body. I was flying!”
Nathan was similarly able to extend his abilities. Adding more sensors to his prosthetic arm and hand enabled him to detect infrared (IR) signals. Nathan used his brain signals to move the hand over a surface that emitted invisible IR signals only in a specific location. When the prosthesis crossed the target, the sensors converted the IR signal into electrical pulses delivered to Nathan’s somatosensory cortex, enabling him to “feel” infrared radiation. Nathan reported an immediate, touch-like perception of the IR field. Still unknown is whether users’ brains, after long-term use of a bidirectional interface with novel sensors like the IR ones, will adapt to this new type of input and ultimately experience it as a genuine “sixth sense.”
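The sensory-substitution loop described above can be sketched as a simple pipeline: read the IR sensor at the hand's current position, and convert any detected intensity into a stimulation pulse rate. The field shape, threshold, and pulse rates below are illustrative assumptions, not parameters of the actual system.

```python
def ir_intensity(position, target=0.6, width=0.05):
    """Toy IR field: the surface emits only near one hidden location."""
    return 1.0 if abs(position - target) < width else 0.0

def stimulation_pulses(intensity, max_rate_hz=200):
    """Map sensed IR intensity to a cortical stimulation pulse rate."""
    return int(intensity * max_rate_hz)

# As the hand sweeps the surface, stimulation (and hence the touch-like
# sensation) occurs only while the prosthesis is over the target.
print(stimulation_pulses(ir_intensity(0.60)))  # over the target: 200
print(stimulation_pulses(ir_intensity(0.10)))  # away from it: 0
```

The key design point is that the brain never sees "IR" as such; it only receives somatosensory pulses whose rate tracks the sensor, and learning does the rest.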
Prosthetic movement and sensation are currently the best-studied applications for neural interfaces. Through the Neural Engineering System Design (NESD) program, DARPA has even extended its aspirations for higher-resolution upgrades of such systems to potentially restore hearing and vision to people with sensory deficits. What’s more, the agency’s view of human function expands beyond the motor and sensory domains and into the realm of cognitive function. That is why DARPA’s Biological Technologies Office has set out to explore whether implanted neural interfaces might be used to treat individuals with neuropsychiatric and memory disorders.
DARPA’s efforts in treating neuropsychiatric dysfunction kicked off in 2013, leveraging the availability of clinical devices for monitoring brain activity that federal regulators had already approved. DARPA-funded researchers recruited individuals with epilepsy or Parkinson’s disease who, as part of their clinical treatments, had electrodes implanted in various regions of their brains. As many as half of such patients also experience symptoms such as anxiety or depression, making them especially well suited for DARPA’s research.
One of these volunteers, Jane (name changed to protect the participant’s privacy), participated in a study funded under the Systems-Based Neurotechnology for Emerging Therapies (SUBNETS) program while she underwent neurosurgical monitoring for epilepsy at the University of California at San Francisco (UCSF). In addition to epilepsy, Jane was diagnosed with major depressive disorder and bipolar disorder and exhibited symptoms of severe anxiety.
The SUBNETS team fitted Jane with a new type of therapeutic neural interface that records a patient’s neural activity across interconnected subnetworks of the brain and delivers targeted, corrective electrical micro-stimulation designed to mitigate unhealthy brain activity. When the system delivered stimulation to a subnetwork of Jane’s brain responsible for regulating her emotions, she reported, “All of a sudden … I have some energy!” When a clinician asked her whether that emotional state was something she would experience on a good day, she affirmed, “This is normal Jane.”
That question addressed an important aspect of the SUBNETS approach. The program is not pursuing interventions designed to simply flip an emotional switch from sadness to happiness, but rather to maintain a healthy balance between emotional states by detecting and modulating extremes. An optimal therapeutic intervention would relieve a depressed patient from prolonged sadness or apathy, but still allow the individual to feel a normal range of emotion in response to experiences that typically evoke negative feelings in healthy individuals.
The complexity of the brain makes the process of developing such interventions especially challenging. Neuropsychiatric conditions are often associated with abnormal states across multiple cognitive functions, such as emotion regulation, propensity for risk-taking, and cognitive flexibility, each of which is associated with a distinct subnetwork of the brain. Patients may also fall on opposite ends of the spectrum for these conditions. As such, clinicians must tailor interventions to a patient’s symptoms. That is why DARPA required that SUBNETS systems simultaneously record from multiple locations in a patient’s brain and interpret neural signals in real time to determine specific locations, parameters, and timing of therapeutic interventions.
A related approach opened the way for closed-loop cognitive prostheses designed to facilitate memory formation and recall. Foundational studies in rodents, begun in the early 2000s by a research team at Wake Forest University, quickly progressed to non-human primates and have now advanced to humans.