CORE CONCEPTS

Brain–machine interface

Prashant Nair, Science Writer

In a world awash in technology, the line between humans and machines has begun to blur, our thoughts and actions increasingly shaped and substantiated by machines. Perhaps nowhere is the blurring more evident than in a scientific endeavor called “neural interfacing,” a term for technology aimed at bridging the workings of machines and the human brain. Brain–machine interfaces operate at the nexus of thought and action, using the brain’s electrical signals to maneuver external devices such as prosthetic limbs, among other applications. (Noninvasive imaging techniques such as electroencephalography and functional MRI are also examples of brain–machine interfaces.) The hope is that such devices will someday help paralyzed people, who have lost motor control, to lead more independent lives. The idea of tapping into the brain’s electrical activity to control movement is more than two decades old, with attempts to record the neural coordinates of movement from the motor cortex of the monkey brain dating back to the 1960s (1).

Progress has been understandably slow, but, within the last 5 years, impressive gains in technology have helped mark a few milestones. In 2006, Brown University neuroscientist John Donoghue and others reported the result of a clinical trial of a surgically implanted, silicon-based device dubbed BrainGate, which allowed a 25-year-old tetraplegic patient with spinal cord injury to move a cursor on a computer screen, open an e-mail message, operate a television, open and close a prosthetic hand, and perform simple movements using a robotic arm—3 years after paralysis. Despite the advance, the researchers wrote that the use of the device depended on the “assistance of trained experts. The need for this assistance must be eliminated through system automation” (2).

Two years later, University of Pittsburgh neuroscientist Andrew Schwartz and others moved the field another step forward: Macaque monkeys with lightly restrained arms and silicon electrodes implanted in the motor cortex could be taught to use their thoughts to move a mechanical arm, grasp food items, and even feed themselves (3). The advance was notable, partly because the monkeys’ brains appeared to have incorporated the robotic arm as their own, fine-tuning maneuvers to effect precise moves. “It was control in free space. The movements looked natural, and the monkey could reach and grasp like humans do,” says Schwartz.

In 2012, Massachusetts General Hospital neurologist Leigh Hochberg and others reported that a 58-year-old woman and a 66-year-old man, both paralyzed from the neck down for years, learned to use implanted brain electrodes to control a robotic arm to reach and grasp small objects, and, in one case, even drink coffee from a bottle using a straw, suggesting that the technology might someday help paralyzed people carry out everyday activities (4). The same year, Schwartz and colleagues demonstrated that a 52-year-old tetraplegic woman could use a prosthetic limb to routinely execute seven-dimensional movements, including reaching and grasping, following implantation of microelectrodes into her brain’s motor cortex and 13 weeks of training (5).

Researchers working on brain–machine interfaces are no doubt making strides, but their reach has thus far exceeded their grasp. Among the many challenges tied to developing commercially viable devices for use in patients are the bulky size of the equipment, the limited durability of the implanted electrodes, and the difficulty of developing prosthetics that can relay sensory feedback to the brain (6). Yet “the idea that we can extract a meaningful signal from the brain is in a way a breakthrough,” says Schwartz, referring to the association between brain activity and real-world actions that brain–machine interfaces have allowed researchers to unravel.

Woman with tetraplegia uses a robotic arm to drink coffee from a bottle. Image reprinted by permission from Macmillan Publishers Ltd: Nature (4), copyright 2012.

www.pnas.org/cgi/doi/10.1073/pnas.1319310110

1 Evarts EV (1964) Temporal patterns of discharge of pyramidal tract neurons during sleep and waking in the monkey. J Neurophysiol 27:152–171.
2 Hochberg LR, et al. (2006) Neuronal ensemble control of prosthetic devices by a human with tetraplegia. Nature 442(7099):164–171.
3 Velliste M, Perel S, Spalding MC, Whitford AS, Schwartz AB (2008) Cortical control of a prosthetic arm for self-feeding. Nature 453(7198):1098–1101.
4 Hochberg LR, et al. (2012) Reach and grasp by people with tetraplegia using a neurally controlled robotic arm. Nature 485(7398):372–375.
5 Collinger JL, et al. (2013) High-performance neuroprosthetic control by an individual with tetraplegia. Lancet 381(9866):557–564.
6 Carey B (2012) Paralyzed, moving a robot with their minds. NY Times. Available at http://www.nytimes.com/2012/05/17/science/bodies-inert-they-moved-a-robot-with-their-minds.html. Accessed July 25, 2013.

PNAS | November 12, 2013 | vol. 110 | no. 46 | 18343