
Turning Thoughts into Action

Almost 16 years ago, a woman suffered a brainstem stroke that left her quadriplegic and unable to speak but cognitively intact—a condition called locked-in syndrome. Researchers know her as S3: the number she was assigned as a participant in a clinical trial of a neural interface system called BrainGate. Neural interface systems allow people who are paralyzed by disease or injury to control external devices just by thinking about moving their paralyzed limbs. Last year, S3 made news when she operated a robotic arm to serve herself a sip of coffee, a task she accomplished using only her thoughts.

Behind this groundbreaking achievement was a multidisciplinary team of scientists based at Brown University, Massachusetts General Hospital, Stanford University, and Providence VA Medical Center. Here, four BrainGate researchers discuss their contributions to this exciting project that, in the words of neuroengineer David Borton, makes “the thinkable possible.”

AT THE INTERFACE OF BRAIN AND MACHINE
BY DAN BACHER

One day in late 2010, something remarkable happened that changed my life. I had been leading a project to develop a communication system for people with locked-in syndrome as part of my group’s work with the BrainGate Neural Interface System (NIS). For this specific project, our objective was to create an interface that would allow users to communicate using only their thoughts. I was responsible for developing the virtual keyboard software and integrating it with the NIS.

On that day in 2010, the plan was to test my keyboard interface with clinical trial participant S3. At the time, her usual method of communicating was to slowly move her eyes to individual letters printed on a clear piece of plastic, while a person behind the plastic would record each letter she chose. But on this day, she would use only her thoughts to move and click a computer cursor to type with my on-screen keyboard.

S3’s eyes lit up when she saw the keyboard. I was trying to demonstrate some of the features when instead she defiantly started typing on her own: first “thank,” then “you.” Those two simple words—so commonly and automatically exchanged—were the most powerful words that had ever been spoken to me. (I do mean spoken: S3 used the built-in text-to-speech feature I’d integrated to have the computer speak her message.) This transformative moment was the first of what would become a series of exciting, humbling, and emotional experiences with S3 and other participants in the BrainGate clinical trial.

In the following months, I worked with a team of engineers to create software that could translate the BrainGate system’s command signals into coordinated movements of an advanced robotic arm. Months of long hours of developing, refining, and validating our software were put to the test in April 2011. I was by S3’s side once again when she used this robotic arm to give herself a drink of coffee. Controlling the robotic arm only with her imagined movements, she reached out, picked up a bottle, took a drink, and put the bottle back down onto the table—a feat she last performed with her own arm nearly 15 years earlier.



Again I felt humbled and proud while sharing in this emotional moment with S3 and our research team. Again I realized how incredibly cool my job is and how amazing it is to be a part of this effort to create state-of-the-art technology that hopefully one day will help people with locked-in syndrome. If you’re ever searching for inspiration and purpose, I encourage you to seek out those who need help the most, identify what they need, and if you can’t help them find it, go make it for them yourself! As Goethe once wrote, “Knowing is not enough; we must apply. Willing is not enough; we must do.”

Dan Bacher received his BS in biomedical engineering from Syracuse University and his MS in bioengineering from the University of Pittsburgh. Now a senior research and development engineer at Brown University, Dan is also an aspiring entrepreneur who enjoys playing music, trying to stay in shape, and reading in his spare time.

DECODING NEURAL SIGNALS
BY BEATA JAROSIEWICZ, PhD

I am a neuroscientist by training, but during the course of my research career, I have learned computer programming skills that have become crucial to my work on BrainGate. My focus has been on using my neuroscience knowledge to help improve the computer programs that decode neural signals associated with the intent to move a limb.

The starting point of the BrainGate neural interface system is an electrode array placed in the hand/arm area of the motor cortex. These electrodes record action potentials, or “spikes,” from neurons. When the person opens or closes (or imagines opening or closing) her hand, we find some neurons that increase or decrease their spiking rate. Other neurons change their spike rate for different intended directions of movement. For example, one neuron might increase its spiking rate for a rightward arm movement and decrease its rate for a leftward movement. That neuron would be said to have a “preferred direction” to the right. Other nearby neurons might have preferred directions to the left, up, down, forward, backward, or anywhere in between.
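This kind of directional tuning is often described with the classic cosine model from motor neuroscience: a neuron’s firing rate peaks for movements along its preferred direction and dips for movements the opposite way. Here is a minimal sketch in Python (the numbers are illustrative, not measured BrainGate values):

```python
import numpy as np

def firing_rate(theta, baseline=20.0, modulation=10.0, theta_pref=0.0):
    """Cosine-tuning model: the neuron fires fastest when the intended
    movement direction theta (radians) matches its preferred direction."""
    return baseline + modulation * np.cos(theta - theta_pref)

# A neuron whose preferred direction is rightward (theta_pref = 0):
print(firing_rate(0.0))    # rightward intent -> 30.0 spikes/s (peak)
print(firing_rate(np.pi))  # leftward intent  -> 10.0 spikes/s (trough)
```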

turning thoughts into Action

BrAinGATE2.orG

www.cty.jhu.edu/imagine imagine 17

Page 3: turning thoughts into Action · turning thoughts into Action 16 imagine sept/oct 2012. ... to be a part of this eff ort to create state-of-the-art technology that hopefully one day

We begin each research session with our study participants by figuring out how each recorded neuron’s firing rate modulates with intended movements. We do this by displaying a cursor programmed to move to targets that appear one by one on a computer monitor while the participant imagines using her hand to move the cursor. During this calibration, a computer registers the spike rate of each neuron. Then, using the spiking information and the imagined movement information, the computer creates a model of each recorded neuron’s preferred direction.
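Under the cosine-tuning assumption above, building that model reduces to a small regression problem for each neuron. A sketch of one way to do the fit (illustrative only; the team’s actual calibration procedure is more involved):

```python
import numpy as np

def fit_tuning(directions, rates):
    """Least-squares fit of rate ~ b0 + bx*cos(theta) + by*sin(theta)
    for one neuron. directions: cued movement angles (radians), one per
    calibration trial; rates: the spike rates measured on those trials."""
    X = np.column_stack([np.ones_like(directions),
                         np.cos(directions),
                         np.sin(directions)])
    b0, bx, by = np.linalg.lstsq(X, rates, rcond=None)[0]
    # Baseline rate, preferred direction, and modulation depth.
    return b0, np.arctan2(by, bx), np.hypot(bx, by)
```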

Once this model is created, the participant can start controlling the movement of the cursor. A computer algorithm called the “decoder” compares incoming neural activity with the model to figure out in which direction the person wants to move the cursor, and then sends this signal to the computer cursor. In this way, the person can control the cursor’s movement just by thinking about where she wants it to go. This is called “neural control.”
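The simplest decoder of this kind is the classic population vector, in which every neuron votes for its preferred direction in proportion to how active it currently is. This sketch shows the idea; the decoders used in practice are more sophisticated:

```python
import numpy as np

def decode_direction(rates, baselines, theta_prefs):
    """Population-vector decode: sum each neuron's preferred-direction
    vector, weighted by how far its current rate sits above or below
    its baseline; the resultant points where the user intends to move."""
    w = rates - baselines
    vx = np.sum(w * np.cos(theta_prefs))
    vy = np.sum(w * np.sin(theta_prefs))
    return np.arctan2(vy, vx)  # estimated intended direction (radians)
```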

My contribution to this research has been to figure out the best way to calibrate the model and keep it calibrated when neural signals change (which can happen with, for example, tiny movements of brain tissue resulting from local blood pressure changes). Combining my neuroscience knowledge and my computer programming skills, I helped design a method to recalibrate the model using neural data collected during neural control. This not only makes the model more accurate from the start, but also keeps the model calibrated for long periods of time without having to interrupt neural control.
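One way such recalibration can work, sketched here as an assumption rather than the published method: during neural control, treat the direction from the cursor to the current target as the user’s intended direction at each time step, and re-fit each neuron’s tuning model on that data (reusing fit_tuning from the sketch above):

```python
import numpy as np

def recalibrate(cursor_xy, target_xy, rates):
    """Re-fit every neuron's tuning model from data gathered during
    neural control, treating the cursor-to-target direction as the
    (assumed) intended direction at each time step.
    cursor_xy, target_xy: (T, 2) positions; rates: (T, n_neurons)."""
    d = target_xy - cursor_xy
    intended = np.arctan2(d[:, 1], d[:, 0])
    # fit_tuning is the calibration fit from the earlier sketch.
    return [fit_tuning(intended, rates[:, i]) for i in range(rates.shape[1])]
```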

My next challenge is to make the NIS software fully automated or user-controlled so that its use does not depend on a trained technician or caregiver. This will bring us one step closer to helping people with paralysis communicate and interact with the environment more independently—a goal that motivates us all.

Beata Jarosiewicz earned her PhD in neuroscience from the Center for the Neural Basis of Cognition at the University of Pittsburgh and is now an investigator in the neuroscience department at Brown University. In her free time, Beata likes to play volleyball, do gymnastics, and train her cats to impersonate dogs and people.

UNTETHERING THE LOCKED-IN MIND
BY DAVID BORTON, PhD

As a neuroengineer, I try to solve neuroscience problems with the use of modern technology, such as custom electrical circuits, chips, and software. People often say that technological advances have made the unthinkable possible, but in the case of neuroengineering, they’ve made the thinkable possible.

The human brain consists of more than 80 billion neurons making over 100 trillion connections. These neurons communicate with each other by sending electrical pulses, called action potentials, along their long axons and to neighboring neurons. How do we listen to, and make sense of, so many signals? Neuroengineers have already met one part of the challenge by designing specialized “microphones” that can sense the millions of action potentials every second as the neurons communicate with one another.

Currently, using a microelectrode array of 100 recording elements, we can listen to the activity of roughly 100 individual neurons at once. In the BrainGate project, these signals are transmitted outside the body through a long cable, amplified to distinguish them from background noise in the brain, digitized into binary code, and processed with computational algorithms to decode what the signals might mean.
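The front end of that chain can be pictured with a standard spike-detection recipe: band-pass filter the digitized voltage, estimate the noise level, and flag threshold crossings. The sampling rate, filter band, and threshold rule below are typical textbook choices, not published BrainGate parameters:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def detect_spikes(raw, fs=30_000, k=4.5):
    """Band-pass one channel's digitized voltage trace to isolate
    action potentials, then flag downward threshold crossings as
    candidate spikes. Returns the sample indices of detections."""
    b, a = butter(4, [250 / (fs / 2), 5000 / (fs / 2)], btype="band")
    x = filtfilt(b, a, raw)
    # Robust noise estimate (median absolute deviation) sets the threshold.
    thresh = -k * np.median(np.abs(x)) / 0.6745
    return np.flatnonzero((x[1:] < thresh) & (x[:-1] >= thresh))
```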

While this method of neural recording works incredibly well, when we look to a future when locked-in patients are moving their own limbs to walk down the hallway, we realize that the transmission of all this neural data must be done wirelessly. To achieve this, we must reinvent the amplifier, digitizer, and data transmission mechanisms so they can be implanted in the patient.

The amplifiers currently used by the BrainGate team are the size of a hardback book, and the digital signal processors take up the majority of a personal computer’s memory. To create smaller electronics, we leveraged advances in microelectronics ranging from chip design to flexible printed circuit board technology. We have designed custom ultra-low-power application-specific integrated circuits (or ASICs), amplifiers the size of an M&M, and integrated digitization circuitry—and put all of this into a device the size of a U.S. quarter. Through an encoded high-frequency radio transmission scheme similar to 4G LTE, this device transmits the digital neural data from the patient to a computer across the room, where it can be processed into prosthetic control signals. The device is packaged in medical-grade silicone for implantation under the skin. We hope that this wireless neural interface will soon allow locked-in patients to be untethered from their hospital beds.
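A back-of-the-envelope calculation shows why such a high-bandwidth radio link is needed. Assuming typical (illustrative) recording parameters rather than the device’s actual specifications:

```python
channels = 100         # electrodes in the BrainGate array
sample_rate = 30_000   # samples per second per channel (assumed)
bits_per_sample = 12   # ADC resolution (assumed)

raw_rate = channels * sample_rate * bits_per_sample     # bits per second
print(f"{raw_rate / 1e6:.0f} Mbps of raw neural data")  # -> 36 Mbps
```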



[Photo caption: Before each research session, clinical trial participants first perform a calibration task designed to match their neural activity to their intended movements.]


[Illustration caption: In the BrainGate neural interface system, an implanted microelectrode array detects brain signals that are then translated into computer code, allowing the user to operate external devices with only their thoughts.]


Neuroengineers have one of the most exciting jobs in the world. We get to design, develop, and deploy the next generation of brain-recording devices that will one day enable patients with spinal cord injury or neurodegenerative diseases to freely interact with the world around them. We have been working toward this goal for many years, but advances in neurotechnology are bringing it close to reality.

David Borton earned his bachelor’s degree in biomedical engineering from Washington University in St. Louis and his PhD in biomedical engineering from Brown University. Dave is an avid soccer player and played professionally for a year in Brazil. He also plays trumpet and guitar, and enjoys sailing, biking, rock climbing, and running marathons.



PUTTING RESEARCH TO THE TEST
BY ERIN GALLIVAN

One afternoon at work, S3 and I were having a conversation much like one you would have with any friend or coworker. She was telling me a story about her grandson, recalling his reaction to a present he received on his birthday. Our conversation, however, looked anything but typical: I was holding up a clear letter board as she focused her eyes on individual letters. When I met her gaze through the board, I said the letter out loud; if I was correct, we would move on to the next one, spelling out the words and sentences that made up the conversation.

With the BrainGate system, I have since seen S3 use typing interfaces to quickly spell out phrases on a computer. Our goal is that people will someday be able to use the system 24/7, without any assistance. For now, because our study is still in Phase 1, which evaluates the device’s safety, participants can use the BrainGate system only when a trained technician, like me, is present during our twice-weekly research sessions.

My job as a clinical research assistant is to run these sessions. On a typical day, I travel to the participant’s home and ask him or her to consent to a research session. After downloading the session software sent by the research scientists, I set up the neural connection by attaching a cable to the connector implanted on the participant’s head. I then explain the experiment, describing the task and what type of imagery they should use to complete it, which varies from moving a computer cursor to operating a robotic arm. Sessions run for about four hours, and I interact with the participant the whole time, answering their questions and giving instructions, making a detailed log of the session, and recording video. At the end of the session, I send the data to the researchers at Brown for analysis.

It is part of my job to make sure that the technology we are developing is easy and enjoyable to use, and the participants offer a lot of great feedback and suggestions, which I pass on to the rest of the team. For example, when using the keyboard interfaces, participants offered suggestions for making the layout easier to navigate and adding shortcuts to help them type faster. My primary goal is to collect and deliver the data, but my personal interaction with our participants allows me to see how the research will directly impact their lives.

Working with BrainGate has opened my eyes to all the good that can be done through science, medicine, and engineering. I will be leaving this position to go to medical school next month, and although I am sad to be moving on, the work that I have done here has made me realize that this is the correct career path for me.

Erin Gallivan earned a BS in mechanical engineering from Boston University. She worked as a mechanical design engineer at Raytheon and then as a data specialist at Dana-Farber Cancer Institute before joining the BrainGate team. She is now in her first year at the Keck School of Medicine at the University of Southern California.

Learn more about BrainGate and the work these researchers do at www.braingate2.org.