
Sunday, July 31, 2011

Brain Cap Technology Turns Thought Into Motion; Mind-Machine Interface Could Lead to New Life-Changing Technologies for Millions of People


"Brain cap" technology being developed at the University of Maryland allows users to turn their thoughts into motion. Associate Professor of Kinesiology José 'Pepe' L. Contreras-Vidal and his team have created a non-invasive, sensor-lined cap with neural interface software that soon could be used to control computers, robotic prosthetic limbs, motorized wheelchairs and even digital avatars.
University of Maryland associate professor of kinesiology Jose "Pepe" Contreras-Vidal wears his Brain Cap, a noninvasive, sensor-lined cap with neural interface software that soon could be used to control computers, robotic prosthetic limbs, motorized wheelchairs and even digital avatars. (Credit: John Consoli, University of Maryland)

"We are on track to develop, test and make available to the public- within the next few years -- a safe, reliable, noninvasive brain computer interface that can bring life-changing technology to millions of people whose ability to move has been diminished due to paralysis, stroke or other injury or illness," said Contreras-Vidal of the university's School of Public Health.

The potential and rapid progression of the UMD brain cap technology can be seen in a host of recent developments, including a just published study in the Journal of Neurophysiology, new grants from the National Science Foundation (NSF) and National Institutes of Health, and a growing list of partners that includes the University of Maryland School of Medicine, the Veterans Affairs Maryland Health Care System, the Johns Hopkins University Applied Physics Laboratory, Rice University and Walter Reed Army Medical Center's Integrated Department of Orthopaedics & Rehabilitation.

"We are doing something that few previously thought was possible," said Contreras-Vidal, who is also an affiliate professor in Maryland's Fischell Department of Bioengineering and the university's Neuroscience and Cognitive Science Program. "We use EEG [electroencephalography] to non-invasively read brain waves and translate them into movement commands for computers and other devices.

Peer Reviewed

Contreras-Vidal and his team have published three major papers on their technology over the past 18 months, the latest a just-released study in the Journal of Neurophysiology in which they successfully used EEG brain signals to reconstruct the complex 3-D movements of the ankle, knee and hip joints during human treadmill walking. In two earlier studies they showed (1) similar results for 3-D hand movement and (2) that subjects wearing the brain cap could control a computer cursor with their thoughts.
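Neither paper's code appears here, but the general technique the studies describe -- reconstructing continuous limb kinematics from scalp EEG -- is commonly implemented as a linear decoder over time-lagged EEG samples. The sketch below illustrates that idea on synthetic data; the sizes, lag window and ridge penalty are illustrative assumptions, not the authors' actual model.

```python
# Illustrative sketch (not the authors' pipeline): decode a joint-angle
# trace from a sliding window of past EEG samples with ridge regression,
# using synthetic data throughout.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_channels, n_lags = 2000, 32, 10  # assumed sizes

# Stand-ins for band-pass-filtered EEG and the decoder's true weights.
eeg = rng.standard_normal((n_samples, n_channels))
true_w = rng.standard_normal(n_channels * n_lags)

def lag_embed(x: np.ndarray, n_lags: int) -> np.ndarray:
    """Stack each sample with its n_lags-1 predecessors, per channel."""
    n, c = x.shape
    out = np.empty((n - n_lags + 1, c * n_lags))
    for k in range(n_lags):
        out[:, k * c:(k + 1) * c] = x[k:n - n_lags + 1 + k]
    return out

X = lag_embed(eeg, n_lags)
knee_angle = X @ true_w + 0.5 * rng.standard_normal(X.shape[0])  # synthetic target

# Closed-form ridge regression: w = (X'X + aI)^-1 X'y
alpha = 1.0
w = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ knee_angle)
decoded = X @ w

r = np.corrcoef(knee_angle, decoded)[0, 1]
print(f"correlation between actual and decoded trajectory: r = {r:.2f}")
```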

Alessandro Presacco, a second-year doctoral student in Contreras-Vidal's Neural Engineering and Smart Prosthetics Lab, Contreras-Vidal and their co-authors write that the Journal of Neurophysiology study indicates "that EEG signals can be used to study the cortical dynamics of walking and to develop brain-machine interfaces aimed at restoring human gait function."

There are other brain-computer interface technologies under development, but Contreras-Vidal notes that these competing technologies either are very invasive, requiring electrodes to be implanted directly in the brain, or, if noninvasive, require much more training to use than UMD's EEG-based brain cap technology does.

Partnering to Help Sufferers of Injury and Stroke

Contreras-Vidal and his team are collaborating on a rapidly growing cadre of projects with researchers at other institutions to develop thought-controlled robotic prosthetics that can assist victims of injury and stroke. Their latest partnership is supported by a new $1.2 million NSF grant. Under this grant, Contreras-Vidal's Maryland team is embarking on a four-year project with researchers at Rice University, the University of Michigan and Drexel University to design a prosthetic arm that amputees can control directly with their brains, and which will allow users to feel what their robotic arm touches.



"There's nothing fictional about this," said Rice University co-principal investigator Marcia O'Malley, an associate professor of mechanical engineering. "The investigators on this grant have already demonstrated that much of this is possible. What remains is to bring all of it -- non-invasive neural decoding, direct brain control and [touch] sensory feedback -- together into one device."

In an NIH-supported project now underway, Contreras-Vidal and his colleagues are pairing their brain cap's EEG-based technology with a DARPA-funded next-generation robotic arm, designed by researchers at the Johns Hopkins Applied Physics Laboratory to function like a normal limb. And the UMD team is developing a new collaboration with the New Zealand start-up Rexbionics, developer of a powered lower-limb exoskeleton called Rex that could be used to restore gait after spinal cord injury.

Two of the earliest partnerships formed by Contreras-Vidal and his team are with the University of Maryland School of Medicine in Baltimore and the Veterans Affairs Medical Center in Baltimore. A particular focus of this research is the use of the brain cap technology to help stroke victims whose brain injuries affect their motor-sensory control. Originally funded by a seed grant from the University of Maryland, College Park and the University of Maryland, Baltimore, the work is now also supported by a VA merit grant (anklebot BMI) and an NIH grant (stroke).

"There is a big push in brain science to understand what exercise does in terms of motor learning or motor retraining of the human brain," says Larry Forrester, an associate professor of physical therapy and rehabilitation science at the University of Maryland School of Medicine.

For more than a year, Forrester and the UMD team have tracked the neural activity of people on a treadmill performing precise tasks like stepping over dotted lines. The researchers are matching specific brain activity recorded in real time with exact lower-limb movements.

This data could help stroke victims in several ways, Forrester says. One is a prosthetic device, called an "anklebot," or ankle robot, that stores data from a normal human gait and assists partially paralyzed people. People who are less mobile commonly suffer from other health issues such as obesity, diabetes or cardiovascular problems, Forrester says, "so we want to get [stroke survivors] up and moving by whatever means possible."

The second use of the EEG data in stroke victims is more complex, yet offers exciting possibilities. "By decoding the motion of a normal gait," Contreras-Vidal says, "we can then try and teach stroke victims to think in certain ways and match their own EEG signals with the normal signals." This could "retrain" healthy areas of the brain in what is known as neuroplasticity.
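The article doesn't say how such a match would be scored. As a purely hypothetical illustration, one simple choice is to correlate a patient's decoded gait trajectory against a stored normal-gait template and feed the result back as a training score:

```python
# Hypothetical neurofeedback score: how closely does a patient's
# EEG-decoded gait trajectory track a stored normal-gait template?
# Correlation is one simple (assumed) choice of similarity measure.
import numpy as np

def gait_feedback_score(decoded: np.ndarray, template: np.ndarray) -> float:
    """Return a 0..1 score; 1.0 means the decoded trace matches the template."""
    r = np.corrcoef(decoded, template)[0, 1]
    return max(0.0, float(r))  # clip anti-correlation to zero for display

# Toy example: one stride of a knee-angle template, and a noisy,
# slightly lagging attempt at reproducing it.
t = np.linspace(0, 2 * np.pi, 200)
template = np.sin(t)
attempt = np.sin(t - 0.3) + 0.2 * np.random.default_rng(1).standard_normal(t.size)

print(f"feedback score: {gait_feedback_score(attempt, template):.2f}")
```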

One potential method for retraining comes from one of the Maryland research team's newest members, Steve Graff, a first-year bioengineering doctoral student. He envisions a virtual reality game that matches real EEG data with on-screen characters. "It gives us a way to train someone to think the right thoughts to generate movement from digital avatars. If they can do that, then they can generate thoughts to move a device," says Graff, who brings a unique personal perspective to the work. He has congenital muscular dystrophy and uses a motorized wheelchair. The advances he's working on could allow him to use both hands -- to put on a jacket, dial his cell phone or throw a football while operating his chair with his mind.

No Surgery Required

During the past two decades a great deal of progress has been made in the study of direct brain-to-computer interfaces, most of it through studies using monkeys with electrodes implanted in their brains. However, for use in humans such an invasive approach poses many problems, not the least of which is that most people don't want holes in their heads and wires attached to their brains.

"EEG monitoring of the brain, which has a long, safe history for other applications, has been largely ignored by those working on brain-machine interfaces, because it was thought that the human skull blocked too much of the detailed information on brain activity needed to read thoughts about movement and turn those readings into movement commands for multi-functional, high-degree-of-freedom prosthetics," said Contreras-Vidal. He is among the few who have used EEG, MEG or other sensing technologies to develop non-invasive neural interfaces, and the only one to have demonstrated decoding results comparable to those achieved by researchers using implanted electrodes.

A paper Contreras-Vidal and colleagues published in the Journal of Neuroscience in March 2010 demonstrated the feasibility of using Maryland's EEG-based technology to infer multidimensional natural movement from noninvasive measurements of brain activity. In their two latest studies, Contreras-Vidal and his team have further advanced the development of their EEG brain interface technology, and provided powerful new evidence that it can yield brain-computer interface results as good as or better than those from invasive studies, while also requiring minimal training to use.

In a paper published in April in the Journal of Neural Engineering, the Maryland team demonstrated that people wearing the EEG brain cap could, after minimal training, control a computer cursor with their thoughts, achieving performance levels comparable to those of subjects using invasive implanted-electrode brain-computer interface systems. Contreras-Vidal and his co-authors write that the study also shows that, compared with other noninvasive brain-control interface systems, training time with their system was substantially shorter, requiring only a single 40-minute session.
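The study's exact performance measures aren't quoted here. Cursor-control interfaces, invasive and noninvasive alike, are often benchmarked with a Fitts's-law-style throughput in bits per second, computed from target distance, target width and movement time; the sketch below shows that standard calculation on made-up trial data.

```python
# Standard Fitts's-law throughput, a common benchmark for cursor-control
# interfaces. All trial numbers below are made up.
import math

def fitts_throughput(distance: float, width: float, movement_time: float) -> float:
    """Throughput in bits/s: ID = log2(D/W + 1); TP = ID / movement time."""
    index_of_difficulty = math.log2(distance / width + 1.0)
    return index_of_difficulty / movement_time

# (distance to target in px, target width in px, movement time in s)
trials = [(300.0, 40.0, 1.8), (220.0, 40.0, 1.4), (400.0, 60.0, 2.1)]
throughputs = [fitts_throughput(d, w, mt) for d, w, mt in trials]
print(f"mean throughput: {sum(throughputs) / len(throughputs):.2f} bits/s")
```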

Friday, December 17, 2010

Robot Arm Improves Performance of Brain-Controlled Device


The performance of a brain-machine interface designed to help paralyzed subjects move objects with their thoughts is improved with the addition of a robotic arm providing sensory feedback, a new study from the University of Chicago finds.
During the experiment, monkeys used their brain signals to move a computer cursor (red circle) to randomly placed targets (squares). When visual and proprioceptive feedback were included, the monkey's hand was moved by a robotic exoskeleton. The additional sensory information resulted in the cursor hitting the target faster and more directly. (Credit: Courtesy, with permission: Hatsopoulos, et al. The Journal of Neuroscience 2010.)

Devices that translate brain activity into the movement of a computer cursor or an external robotic arm have already proven successful in humans. But in these early systems, vision was the only tool a subject could use to help control the motion.

Adding a robot arm that provided kinesthetic information about movement and position in space improved the performance of monkeys using a brain-machine interface in a study published December 14 in The Journal of Neuroscience. Incorporating this sense may improve the design of "wearable robots" to help patients with spinal cord injuries, researchers said.

"A lot of patients that are motor-disabled might have partial sensory feedback," said Nicholas Hatsopoulos, PhD, Associate Professor and Chair of Computational Neuroscience at the University of Chicago. "That got us thinking that maybe we could use this natural form of feedback with wearable robots to provide that kind of feedback."

In the experiments, monkeys controlled a cursor without actively moving their arms, via a device that translated activity in the primary motor cortex of their brains into cursor motion. When the monkeys also wore a sleeve-like robotic exoskeleton that moved the arm in tandem with the cursor, their control of the cursor improved: they hit targets faster and via straighter paths than without the exoskeleton.
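"Faster" and "straighter" can be quantified with time to target and path efficiency (straight-line distance divided by distance actually traveled). The sketch below computes both for a made-up 2-D cursor trajectory; it illustrates the metrics and is not the study's analysis code.

```python
# Two common cursor-trajectory metrics: path efficiency (1.0 = perfectly
# straight) and time to target. The trajectory below is made up.
import numpy as np

def path_efficiency(traj: np.ndarray) -> float:
    """traj: (n, 2) positions. Straight-line distance / distance traveled."""
    straight = np.linalg.norm(traj[-1] - traj[0])
    traveled = np.sum(np.linalg.norm(np.diff(traj, axis=0), axis=1))
    return float(straight / traveled)

# A wobbly 1-second reach from (0, 0) toward (10, 0), sampled at 100 Hz.
t = np.linspace(0.0, 1.0, 101)
traj = np.column_stack([10 * t, 0.8 * np.sin(6 * np.pi * t)])

print(f"path efficiency: {path_efficiency(traj):.2f}")
print(f"time to target:  {t[-1] - t[0]:.2f} s")
```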

"We saw a 40 percent improvement in cursor control when the robotic exoskeleton passively moved the monkeys' arm," Hatsopoulos said. "This could be quite significant for daily activities being performed by a paralyzed patient that was equipped with such a system."

When a person moves their arm or hand, they use sensory feedback called proprioception to control that motion. For example, if one reaches out to grab a coffee mug, sensory neurons in the arm and hand send information back to the brain about where one's limbs are positioned and moving. Proprioception tells a person where their arm is positioned, even if their eyes are closed.

But in patients with conditions where sensory neurons die out, executing basic motor tasks such as buttoning a shirt or even walking becomes exceptionally difficult. Paraplegic subjects in the early clinical trials of brain-machine interfaces faced similar difficulty in attempting to move a computer cursor or robot arm using only visual cues. Those troubles helped researchers realize the importance of proprioception feedback, Hatsopoulos said.

"In the early days when we were doing this, we didn't even consider sensory feedback as an important component of the system," Hatsopoulos said. "We really thought it was just one-way: signals were coming from the brain, and then out to control the limb. It's only more recently that the community has really realized that there is this loop with feedback coming back."

Reflecting this loop, the researchers on the new study also observed changes in the brain activity recorded from the monkeys when sensory feedback was added to the set-up. With proprioceptive feedback, the cell firing patterns of the primary motor cortex carried more information than in trials with only visual feedback, Hatsopoulos said, reflecting an improved signal-to-noise ratio.
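"More information" in this sense is typically operationalized by asking how well the same decoder can predict movement from each condition's neural data. The toy sketch below compares variance explained (R^2) by a linear decoder fit to synthetic firing rates under two arbitrary noise levels; everything in it is made up to show the comparison, not the study's analysis.

```python
# Toy illustration of "more information in the firing patterns": fit the
# same linear decoder to synthetic firing rates from two conditions and
# compare variance explained (R^2). Noise levels are arbitrary, chosen so
# the "with proprioception" condition is cleaner.
import numpy as np

rng = np.random.default_rng(2)
n_trials, n_neurons = 500, 40
velocity = rng.standard_normal(n_trials)   # stand-in for 1-D cursor velocity
tuning = rng.standard_normal(n_neurons)    # each neuron's velocity tuning

def decode_r2(noise_scale: float) -> float:
    """R^2 of a least-squares decode of velocity from noisy firing rates."""
    rates = np.outer(velocity, tuning)
    rates += noise_scale * rng.standard_normal((n_trials, n_neurons))
    w, *_ = np.linalg.lstsq(rates, velocity, rcond=None)
    residual = velocity - rates @ w
    return 1.0 - residual.var() / velocity.var()

print(f"vision only:            R^2 = {decode_r2(noise_scale=3.0):.2f}")
print(f"vision + proprioception R^2 = {decode_r2(noise_scale=1.5):.2f}")
```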

The improvement seen from adding proprioception feedback may inform the next generation of brain-machine interface devices, Hatsopoulos said. Already, scientists are developing different types of "wearable robots" to augment a person's natural abilities. Combining a decoder of cortical activity with a robotic exoskeleton for the arm or hand can serve a dual purpose: allowing a paralyzed subject to move the limb, while also providing sensory feedback.

To benefit from this solution, a paralyzed patient must have retained some residual sensory information from the limbs despite the loss of motor function -- a common occurrence, Hatsopoulos said, particularly in patients with ALS, locked-in syndrome, or incomplete spinal cord injury. For patients who have lost both motor and sensory function, direct stimulation of the sensory cortex may be able to simulate the sensation of limb movement. Further research in that direction is currently underway, Hatsopoulos said.

"I think all the components are there; there's nothing here that's holding us back conceptually," Hatsopoulos said. "I think using these wearable robots and controlling them with the brain is, in my opinion, probably the most promising approach to take in helping paralyzed individuals regain the ability to move."

Funding for the research was provided by the National Institute of Neurological Disorders and Stroke and the Paralyzed Veterans of America Research Foundation.

Disclaimer: This article is not intended to provide medical advice, diagnosis or treatment. Views expressed here do not necessarily reflect those of this site.