A Robotic Arm That Responds to Brain Waves Captured with a Headset

University of Toronto student Ryan Mintz and his team have created a robotic arm that responds to brain waves captured with their headset. They hope to one day create robotic limbs or prosthetics that the wearer can control with their thoughts.

Source: University of Toronto. Published 14/05/2014

Want proof that you don’t need big, specialized equipment to produce a mind-controlled robot arm? Just look at a recent University of Toronto student project. Ryan Mintz and crew have created an arm that you control using little more than a brainwave-sensing headset (which is no longer that rare) and a laptop. The team’s software is smart enough to steer the arm using subtle facial and head gestures, such as clenching your jaw or winking an eye; it also knows when you’ve relaxed.
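The students have not published their signal-processing code, so the following is only a minimal sketch of the general technique: detect the characteristic artifacts of a jaw clench or a wink in the headset signal and map them to arm commands. The sample rate, thresholds and command names below are hypothetical, not the Toronto team's values.

```python
import numpy as np

# Hypothetical values for illustration only -- not the Toronto team's parameters.
FS = 256                     # headset sample rate in Hz (assumed)
JAW_RMS_THRESHOLD = 40.0     # sustained broadband muscle noise, in microvolts (assumed)
BLINK_PEAK_THRESHOLD = 80.0  # large slow frontal deflection, in microvolts (assumed)

def classify_window(window: np.ndarray) -> str:
    """Classify a one-second window of single-channel headset samples (microvolts)."""
    rms = np.sqrt(np.mean(window ** 2))            # jaw clenches raise overall signal power
    peak = np.max(np.abs(window - window.mean()))  # winks/blinks show up as one big deflection
    if rms > JAW_RMS_THRESHOLD:
        return "close_gripper"   # hypothetical arm command for a jaw clench
    if peak > BLINK_PEAK_THRESHOLD:
        return "rotate_wrist"    # hypothetical arm command for a wink
    return "hold"                # relaxed: leave the arm where it is

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    relaxed = rng.normal(0, 5, FS)      # quiet baseline
    clench = rng.normal(0, 60, FS)      # noisy muscle burst
    blink = relaxed.copy()
    blink[100:120] += 100.0             # brief slow deflection
    for name, win in [("relaxed", relaxed), ("clench", clench), ("blink", blink)]:
        print(name, "->", classify_window(win))
```

A real system would use several channels and per-user calibration rather than fixed thresholds.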

The hardware is designed primarily with mobility and prosthetic limbs in mind. The current head-gesture system could be used to steer a wheelchair, for example. In the long run, the students hope to improve the accuracy to the point where just thinking about an action is enough to get it done; unlike some rival systems, it wouldn’t need a physical connection to muscles or the nervous system. The University of Toronto effort still faces stiff competition, but it shows that quadriplegics and others with little body control could eventually claim some independence with easily accessible (and hopefully affordable) technology.

Source: Engadget 5/15/14

Thought-Controlled Robotic Arm ‘Makes a Big Negative a Whole Lot Better’

Dr. Albert Chi (left) assists patient Johnny Matheny with the robotic Modular Prosthetic Limb. Matheny lost most of his left arm to cancer five years ago

Dr. Albert Chi was an undergraduate studying biomedical engineering when a motorcycle accident nearly cost him his left leg and foot. Hospitalized for a month, undergoing repeated surgeries, he was deeply impressed by his doctors’ compassion and skill.

He realized then that he had two passions: engineering and medicine.

Now Chi – a 2003 graduate of the UA College of Medicine and a trauma surgeon at Johns Hopkins Hospital in Baltimore – is part of a team of engineers and surgeons that has achieved what few of us ever would have thought possible.

The breakthrough is the Modular Prosthetic Limb, a robotic arm and hand that a person can control with their thoughts.

Johnny Matheny of West Virginia lost most of his left arm to cancer five years ago. Since May 2012, wearing the still experimental robotic limb, he has been able to point his prosthetic finger, grasp a ball and flex his wrist. He can distinguish between his index and little finger as well as detect the difference between soft and hard objects. He can feel his wife’s hand touching his artificial hand.

“Getting your arm cut off is a big negative,” Matheny said. “Dr. Chi has made my life a whole lot better.”

Chi began working on brain control algorithms aimed at controlling robotic arms 20 years ago, while studying biomedical engineering at Arizona State University. His faculty adviser was neurobiologist Andrew Schwartz, who first linked the information coming from the sensory and motor neurons in the brain’s cortex to a robotic arm and hand – the same artificial limb Matheny has been testing at the Johns Hopkins Applied Physics lab.

“I was involved with cortical-controlled robotics from the very, very, very beginning,” Chi recalled. After graduating cum laude with a bachelor’s degree in biomedical engineering, Chi then earned his master’s degree cum laude in the same field, both at ASU.

He then started medical school at the UA, “because I wanted to make a greater impact on patients’ lives.”

Chi stayed at the UA for his general surgery residency, which he finished in 2008. He completed a two-year fellowship at Baltimore’s Shock Trauma Hospital, then joined the surgery faculty at Johns Hopkins.

There he is part of the $150 million Revolutionizing Prosthetics project led by neuro-intensivist Geoffrey Ling, MD. Revolutionizing Prosthetics is funded by the Defense Advanced Research Projects Agency in response to the more than 1,300 men and women who have come home from war in Iraq and Afghanistan as amputees.

“Specialty centers like Walter Reed Army Hospital do a great job as far as getting a lot of these soldiers back to active duty – as high as 16 percent today, up from around 2 percent in 1980,” Chi said. “But there is a huge discrepancy between upper-extremity injury and lower-extremity injury. An upper-extremity injury is pretty much a career-ending injury for you.

“So Dr. Ling challenged the world, led by Hopkins and the Applied Physics Lab, to come up with an engineering solution,” Chi said. “And what they came up with is the Modular Prosthetic Limb – modular because it can replace the amputated limb at any injury level. … It is really the world’s most advanced prosthetic limb.”

The modular limb is capable of replacing the natural arm’s motor and sensory function. The 100 sensors built into the arm are capable of “feeding back” temperature, pressure, joint angles and acceleration, Chi explained. “If coupled with all of the modular prosthetic limb’s capabilities,” he said, “the patient could experience feedback not only of temperature and pressure, but also surface texture and proprioception.”
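The article does not describe the limb's telemetry format; as a rough illustration of the kinds of feedback Chi lists, here is a hypothetical per-fingertip record with made-up field names, units and thresholds.

```python
from dataclasses import dataclass

@dataclass
class FingertipSample:
    """One hypothetical feedback sample from a fingertip sensor (illustrative fields only)."""
    pressure_kpa: float        # contact pressure
    temperature_c: float       # surface temperature
    joint_angle_deg: float     # joint deflection at the moment of contact
    acceleration_ms2: float    # fingertip acceleration magnitude

def describe_contact(s: FingertipSample) -> str:
    """Rough heuristic: hard objects produce high pressure with little joint deflection."""
    if s.pressure_kpa > 50 and s.joint_angle_deg < 10:
        return "hard object"
    if s.pressure_kpa > 5:
        return "soft object"
    return "no contact"

print(describe_contact(FingertipSample(60.0, 24.0, 5.0, 0.1)))   # -> hard object
print(describe_contact(FingertipSample(8.0, 31.0, 25.0, 0.1)))   # -> soft object
```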

In patients who are quadriplegic, the Modular Prosthetic Limb requires cortical implants to convey neuronal information to electronic sensors in the prosthesis. But for patients like Matheny, whose spinal cord is intact, Chi has performed a new surgical technique to control the prosthesis. Called Targeted Muscle Reinnervation, the technique utilizes the still viable nerves and muscles in an amputated limb.

“The patient with an amputated limb might think of moving the missing hand and wrist,” Chi explained, “but the signals from his brain go off into space and have nowhere to land – until now.”

Chi reroutes the endings of three nerves in the patient’s stump to adjacent muscles.

“It’s very much like electrical wiring,” he said. “Rewiring that information at the amputation site to residual muscles not only allows people to control the Modular Prosthetic Limb, but they have advanced motor control and sensory feedback also.”

The patient’s arm is given six months to heal before the patient begins what will be a lifelong routine of 15 to 30 minutes a day of mental imagery exercises, which re-establish the cortical signals that can now be transmitted to electrodes in the Modular Prosthetic Limb.

As amazing as all this is, Chi said, “What we really want to do is take the control that patients now have to the next level.”

Chi was commissioned into the Naval Reserve in April, and will now work with amputee patients at Walter Reed, in addition to his work at Johns Hopkins.

“I’ve been really fortunate, in terms of being in the right place at the right time. It was the experience of the motorcycle accident and the hospitalization and my engineering background that’s gotten me to where I am, where I can combine both of my passions of surgery and trauma to really empower people.”

Chi is amazed every day, he said, by how his patients work to “overcome whatever’s thrown at them. It’s really just a privilege to be part of these patients’ lives.”

Robots Controlled by the Mind. Mindwalker

Mindwalker, from the University of Twente (Netherlands). A lack of mobility often leads to limited participation in social life. The purpose of this STREP (an EU Specific Targeted Research Project) is to conceive a system that gives people with lower-limb disabilities walking abilities, letting them perform their usual daily activities in the most autonomous and natural manner.

Update, May 2013: after successful ethical board approval, the whole MINDWALKER setup was shipped from the University of Twente (Netherlands) to the Santa Lucia Foundation (Italy) in March, and clinical evaluation has been carried out since. About 20 trials have been performed so far with 5 spinal-cord-injured patients of the Santa Lucia Foundation. The evaluation results will be reported in the project’s deliverables and will allow road-mapping the improvements required to turn this MINDWALKER prototype into a mature product.

The project addresses three main fields of expertise:

  • BCI technologies
  • Virtual Reality
  • Exoskeleton Mechatronics and Control

The project’s top-level objective is to combine these areas of expertise to develop an integrated MINDWALKER system. In addition, the system shall undergo a clinical evaluation process.

Mindwalker Project Research Objectives

Approaches

New smart dry EEG bio-sensors will be applied to enable lightweight wearable EEG caps for everyday use.

Novel approaches to non-invasive BCI will be investigated in order to control a purpose-designed lower-limb orthosis enabling different types of gait. Complementary research on EMG processing will strengthen the approach. The main BCI approach relies on Dynamic Recurrent Neural Network (DRNN) technology.
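The consortium's actual DRNN decoder is not reproduced here; the sketch below only shows the general idea, a recurrent hidden state that carries temporal context across successive EEG feature frames and is read out as a gait command. The dimensions, the untrained random weights and the three command classes are assumptions for illustration.

```python
import numpy as np

# Illustrative dimensions and classes -- not the MINDWALKER configuration.
N_FEATURES, N_HIDDEN, N_CLASSES = 8, 16, 3   # e.g. band-power features -> {stand, step, stop}
rng = np.random.default_rng(42)
W_in = rng.normal(0, 0.3, (N_HIDDEN, N_FEATURES))
W_rec = rng.normal(0, 0.3, (N_HIDDEN, N_HIDDEN))
W_out = rng.normal(0, 0.3, (N_CLASSES, N_HIDDEN))

def decode_sequence(eeg_features: np.ndarray) -> int:
    """Run a (time x features) sequence through the recurrent decoder and
    return the index of the gait command predicted at the last step."""
    h = np.zeros(N_HIDDEN)
    for x in eeg_features:
        h = np.tanh(W_in @ x + W_rec @ h)    # the recurrent state carries temporal context
    return int(np.argmax(W_out @ h))

# One second of fake 8-channel band-power features sampled at 16 Hz.
features = rng.normal(0, 1, (16, N_FEATURES))
print("predicted gait command index:", decode_sequence(features))
```

In the real system the weights would be trained on recorded EEG, and the output would drive the orthosis controller rather than a print statement.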

A Virtual Reality (VR) training environment will assist the patients in generating the correct brain control signals and in properly using the orthosis. The VR training environment will comprise both a set of components for the progressive patient training in a safe and controlled medical environment, and a lightweight portable set using immersive VR solutions for self-training at home.

The orthosis will be designed to support the weight of an adult, to address the dynamic stability of a body-exoskeleton combined system, and to enable different walking modalities.

Evaluation

The developed technologies will be assessed and validated with the support of a formal clinical evaluation procedure. This will make it possible to measure the strengths and weaknesses of the chosen approaches and to identify the improvements required to build a future commercial system. In addition, the resulting system will be progressively tested in everyday-life environments and situations, ranging from simple activities at home to, eventually, shopping and interacting with people in the street.

Public Material

Project’s Leaflet – May 2013

Robots Controlled by the Mind: Second Nature for People

Small electrodes placed on or inside the brain allow patients to interact with computers or control robotic limbs simply by thinking about how to execute those actions. This technology could improve communication and daily life for a person who is paralyzed or has lost the ability to speak from a stroke or neurodegenerative disease.

Now, University of Washington researchers have demonstrated that when humans use this technology – called a brain-computer interface – the brain behaves much like it does when completing simple motor skills such as kicking a ball, typing or waving a hand. Learning to control a robotic arm or a prosthetic limb could become second nature for people who are paralyzed.

BCI brain image showing activity changes

“What we’re seeing is that practice makes perfect with these tasks,” said Rajesh Rao, a UW professor of computer science and engineering and a senior researcher involved in the study. “There’s a lot of engagement of the brain’s cognitive resources at the very beginning, but as you get better at the task, those resources aren’t needed anymore and the brain is freed up.”

Rao and UW collaborators Jeffrey Ojemann, a professor of neurological surgery, and Jeremiah Wander, a doctoral student in bioengineering, published their results online June 10 in the Proceedings of the National Academy of Sciences.

In this study, seven people with severe epilepsy were hospitalized for a monitoring procedure that tries to identify where in the brain seizures originate. Physicians cut through the scalp, drilled into the skull and placed a thin sheet of electrodes directly on top of the brain. While they were watching for seizure signals, the researchers also conducted this study.

The patients were asked to move a mouse cursor on a computer screen by using only their thoughts to control the cursor’s movement. Electrodes on their brains picked up the signals directing the cursor to move, sending them to an amplifier and then a laptop to be analyzed. Within 40 milliseconds, the computer calculated the intentions transmitted through the signal and updated the movement of the cursor on the screen.
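The study's decoder is not reproduced in this article; the sketch below only illustrates the loop it describes, turning the band power of an electrode signal, relative to a resting baseline, into a cursor velocity once every 40 ms. The sample rate, frequency band and gain are assumptions, not the published parameters.

```python
import numpy as np

FS = 1000      # electrode sample rate in Hz (assumed)
WINDOW = 40    # 40 ms analysis window, matching the update latency mentioned above
GAIN = 0.05    # pixels per unit of band power above baseline (assumed)

def band_power(samples: np.ndarray, fs: int, lo: float, hi: float) -> float:
    """Mean power of `samples` in the [lo, hi] Hz band, via a discrete Fourier transform."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(spectrum[mask].mean())

def cursor_velocity(control_channel: np.ndarray, baseline: float) -> float:
    """Map high-frequency band power above the resting baseline to a horizontal cursor velocity."""
    return GAIN * (band_power(control_channel, FS, 70, 100) - baseline)

rng = np.random.default_rng(1)
baseline = band_power(rng.normal(0, 1, WINDOW), FS, 70, 100)  # estimated while at rest
imagined_move = rng.normal(0, 3, WINDOW)                      # stronger activity while imagining movement
print("cursor velocity:", cursor_velocity(imagined_move, baseline))
```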

Researchers found that when patients started the task, a lot of brain activity was centered in the prefrontal cortex, an area associated with learning a new skill. But often after as little as 10 minutes, frontal brain activity lessened, and the brain signals transitioned to patterns similar to those seen during more automatic actions.

“Now we have a brain marker that shows a patient has actually learned a task,” Ojemann said. “Once the signal has turned off, you can assume the person has learned it.”

While researchers have demonstrated success in using brain-computer interfaces in monkeys and humans, this is the first study that clearly maps the neurological signals throughout the brain. The researchers were surprised at how many parts of the brain were involved.

“We now have a larger-scale view of what’s happening in the brain of a subject as he or she is learning a task,” Rao said. “The surprising result is that even though only a very localized population of cells is used in the brain-computer interface, the brain recruits many other areas that aren’t directly involved to get the job done.”

Several types of brain-computer interfaces are being developed and tested. The least invasive is a device placed on a person’s head that can detect weak electrical signatures of brain activity. Basic commercial gaming products are on the market, but this technology isn’t very reliable yet because signals from eye blinking and other muscle movements interfere too much.

A more invasive alternative is to surgically place electrodes inside the brain tissue itself to record the activity of individual neurons. Researchers at Brown University and the University of Pittsburgh have demonstrated this in humans: patients unable to move their arms or legs have learned to control robotic arms using signals taken directly from their brains.

The UW team tested electrodes on the surface of the brain, underneath the skull. This allows researchers to record brain signals at higher frequencies and with less interference than measurements from the scalp. A future wireless device could be built to remain inside a person’s head for a longer time to be able to control computer cursors or robotic limbs at home.

“This is one push as to how we can improve the devices and make them more useful to people,” Wander said. “If we have an understanding of how someone learns to use these devices, we can build them to respond accordingly.”

The research team, along with the National Science Foundation’s Engineering Research Center for Sensorimotor Neural Engineering headquartered at the UW, will continue developing these technologies.

Distributed cortical adaptation during learning of a brain–computer interface task

What Is a Brain-Computer Interface (BCI)?

Robots Controlled by the Mind. Case: Quadcopter

Researchers at the Department of Biomedical Engineering and the Institute for Engineering in Medicine, both at the University of Minnesota, announced that they have developed a new non-invasive system that allows people to control a UAS (unmanned aircraft system) in three-dimensional physical space using only their minds. The results were published in the June 4 issue of the Journal of Neural Engineering (see the abstract below). The subjects who participated were able to control a quadcopter (a multirotor craft with four rotors for lift and propulsion) quickly, accurately and over a sustained period of time. When people think about a movement or perform one, neurons in the motor cortex, the area of the brain that regulates movement, produce small, distinct electrical currents. Using electroencephalography (EEG) and magnetic resonance imaging (MRI), the researchers mapped the activated neurons through a purpose-built brain-computer interface that records the brain’s electrical activity with a cap fitted with 64 electrodes.

Subjects in the study were asked to imagine using their right hand, their left hand, both hands, or neither to instruct the quadcopter to move right, left, up or down, respectively. The brain signals were decoded and sent to the quadcopter over Wi-Fi, and the craft flew at an average straight-line speed of 0.69 m/s. The interface had previously been tested with subjects controlling a virtual helicopter on a computer screen. After several training sessions, subjects were asked to fly the quadcopter through two large rings suspended from the ceiling of a gym. Here is the result: http://youtu.be/rpHy-fUyXYk
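The Minnesota group's decoding software is not shown here; the sketch below only illustrates the last step the article describes, mapping a decoded motor-imagery class to a flight command and pushing it to the quadcopter over Wi-Fi. The command strings and the UDP endpoint are placeholders, not the AR Drone's real control protocol.

```python
import socket

# Placeholder mapping and endpoint -- not the actual AR Drone protocol.
COMMANDS = {
    "right_hand": "MOVE_RIGHT",
    "left_hand":  "MOVE_LEFT",
    "both_hands": "MOVE_UP",
    "rest":       "MOVE_DOWN",
}

def send_command(imagery_class: str, host: str = "192.168.1.1", port: int = 5556) -> str:
    """Translate a decoded motor-imagery class into a flight command and send it over Wi-Fi (UDP)."""
    command = COMMANDS[imagery_class]
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        try:
            sock.sendto(command.encode(), (host, port))
        except OSError:
            pass  # no drone on the network; the class-to-command mapping is the point of this sketch
    return command

# Example: the decoder just reported sustained right-hand imagery.
print(send_command("right_hand"))
```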

Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain–computer interface

At the balanced intersection of human and machine adaptation is found the optimally functioning brain–computer interface (BCI). In this study, we report a novel experiment of BCI controlling a robotic quadcopter in three-dimensional (3D) physical space using noninvasive scalp electroencephalogram (EEG) in human subjects. We then quantify the performance of this system using metrics suitable for asynchronous BCI. Lastly, we examine the impact that the operation of a real world device has on subjects’ control in comparison to a 2D virtual cursor task. Approach. Five human subjects were trained to modulate their sensorimotor rhythms to control an AR Drone navigating a 3D physical space. Visual feedback was provided via a forward facing camera on the hull of the drone. Main results. Individual subjects were able to accurately acquire up to 90.5% of all valid targets presented while travelling at an average straight-line speed of 0.69 m s−1. Significance. Freely exploring and interacting with the world around us is a crucial element of autonomy that is lost in the context of neurodegenerative disease. Brain–computer interfaces are systems that aim to restore or enhance a user’s ability to interact with the environment via a computer and through the use of only thought. We demonstrate for the first time the ability to control a flying robot in 3D physical space using noninvasive scalp recorded EEG in humans. Our work indicates the potential of noninvasive EEG-based BCI systems for accomplishing complex control in 3D physical space. The present study may serve as a framework for the investigation of multidimensional noninvasive BCI control in a physical environment using telepresence robotics.

Read the article here (PDF).