Monday, February 18, 2013

Brain implants could create sense of touch in artificial limbs


WASHINGTON: Rats can't usually see infrared light, but they have "touched" it after Duke University neurobiologists fitted the animals with an infrared detector wired to electrodes implanted in the part of the brain that processes information related to the sense of touch.

One of the main flaws of current brain-controlled prosthetics for humans is that patients cannot sense the texture of what they touch, said Duke neurobiologist Miguel Nicolelis, who carried out the study with his team.

His goal is to give quadriplegics not only the ability to move their limbs again, but also to sense the texture of objects placed in their hands or experience the nuances of the terrain under their feet.

His lab studies how to connect brain cells with external electrodes for brain-machine interfaces and neural prosthetics in human patients and non-human primates, giving them the ability to control limbs, both real and virtual, using only their minds.

He and his team have shown that monkeys, without moving any part of their real bodies, could use their electrical brain activity to guide the virtual hands of an avatar to touch virtual objects and recognize their simulated textures.

His latest study showed that the rats' cortexes respond both to the simulated sense of touch created by the infrared light sensors and to whisker touch, as if the cortex is dividing itself evenly so that the brain cells process both types of information.
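The article does not describe how the detector's signal was turned into cortical stimulation, but the general idea of feeding a new sensory channel into the brain can be sketched in a few lines. The following is a toy illustration only, with invented function names, ranges and pulse rates, not the Duke lab's actual method: a normalised infrared reading is mapped linearly onto a microstimulation pulse rate, so a brighter source produces more frequent pulses and, in principle, a stronger "touch" sensation.

```python
# Toy sketch (hypothetical names and numbers, not the Duke lab's code):
# map an infrared detector reading onto a stimulation pulse rate.

def ir_to_pulse_rate(ir_intensity, min_hz=0.0, max_hz=400.0):
    """Linearly map a normalised IR reading (0.0-1.0) to a pulse rate in Hz."""
    clipped = max(0.0, min(1.0, ir_intensity))
    return min_hz + clipped * (max_hz - min_hz)

def pulse_times(ir_intensity, window_s=0.1):
    """Evenly spaced stimulation pulse times (seconds) within a short window."""
    rate = ir_to_pulse_rate(ir_intensity)
    n_pulses = int(rate * window_s)
    return [i * window_s / n_pulses for i in range(n_pulses)]

# A half-intensity reading maps to the middle of the pulse-rate range.
print(ir_to_pulse_rate(0.5))   # 200.0
print(len(pulse_times(1.0)))   # 40 pulses in a 100 ms window
```

The linear mapping is the simplest possible choice; a real interface would need calibration against what stimulation rates the cortex can actually discriminate.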

This plasticity of the brain counters the current "optogenetic" approach to brain stimulation, which suggests that a particular neuronal cell type should be stimulated to generate a desired neurological function. Instead, stimulating a broader range of cell types might help a cortical region adapt to new sensory sources, said Nicolelis, who is a professor of neurobiology, biomedical engineering and psychology and neuroscience at Duke University.

His team recently documented the firing patterns of nearly 2,000 individual, interconnected neurons in monkeys. Recording the electrical activity from thousands of neurons at once is important for improving the accuracy and performance of neuroprosthetic devices, he said.

This brain-machine interface work is all part of an international effort called the Walk Again Project to build a whole-body exoskeleton that could help paralyzed people regain motor and sensory abilities using brain activity to control the apparatus.

He and his collaborators expect to first use the exoskeleton in the opening ceremony of the FIFA Soccer World Cup in June 2014.

Nicolelis said infrared sensing might be built into such an exoskeleton so patients wearing the suit could have sensory information about where their limbs are and how objects feel when they touch them.

Thursday, December 27, 2012

Internet overdose may leave kids brain-dead


The Google generation, which relies on the internet for everything, is in danger of becoming brain-dead, one of Britain’s leading inventors has warned. Trevor Baylis, who invented the wind-up radio, said children are losing creativity and practical skills because they spend too much time in front of screens, the Daily Mail reported. The 75-year-old noted that children nowadays are dependent on Google searches.

He warned that a lot of kids would become fairly brain-dead if they become so dependent on the internet, because they will not be able to do things the old-fashioned way.

Baylis said he fears that the next generation of inventors is being lost, with young people often unable to make anything with their hands. But he believes that simple challenges in schools using tools such as Meccano model kits would give children invaluable skills.

Baylis suggested that children should be taught to be practical, and not to become mobile phone or computer dependent.

Brain scan 'can sort dementia by type'

Frontotemporal dementia on an MRI scan: tell-tale shrinkage of the frontal and temporal lobes
Scientists say they have found a way to distinguish between different types of dementia without the need for invasive tests, like a lumbar puncture.

US experts could accurately identify Alzheimer's disease and another type of dementia from structural brain patterns on medical scans, Neurology reports.

Currently, doctors can struggle to diagnose dementia, meaning the most appropriate treatment may be delayed.

More invasive tests can help, but are unpleasant for the patient.

Distinguishing features
Despite being two distinct diseases, Alzheimer's and frontotemporal dementia share similar clinical features and symptoms and can be hard to tell apart without medical tests.

Both cause the person to be confused and forgetful and can affect their personality, emotions and behaviour.

Alzheimer's tends to attack the cerebral cortex - the layer of grey matter covering the brain - whereas frontotemporal dementia, as the name suggests, tends to affect the temporal and frontal lobes of the brain. These differences can show up on brain scans, but they are not always diagnostic.

A lumbar puncture - a needle in the spine - may also be used to check protein levels in the brain, which tend to be higher in Alzheimer's than with frontotemporal dementia.

A team at the University of Pennsylvania set out to see if they could ultimately dispense with the lumbar puncture test altogether and instead predict brain protein levels using MRI brain scans alone.

They recruited 185 patients who had already been diagnosed with either Alzheimer's disease or frontotemporal dementia and had undergone a lumbar puncture test and MRI scanning.

The researchers scrutinised the brain scans to see if they could find any patterns that tallied with the protein level results from the lumbar puncture tests.

They found the density of grey matter on the MRI scans correlated with the protein results.
The MRI prediction method was 75% accurate at identifying the correct diagnosis.
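The paper itself is not reproduced here, but the shape of such a screening method is easy to illustrate. The sketch below is purely illustrative, with invented numbers: it draws synthetic "grey-matter density" scores for two groups (labelled Alzheimer's and frontotemporal dementia here only for the sake of the example), classifies each case with a single threshold, and measures accuracy, the same kind of figure as the 75% reported by the Penn team.

```python
import random

# Illustrative only: one-feature threshold screening on synthetic data.
# Group means, spreads and the threshold are invented for this sketch.
random.seed(42)
alz = [random.gauss(0.40, 0.08) for _ in range(90)]   # synthetic "Alzheimer's" scores
ftd = [random.gauss(0.55, 0.08) for _ in range(95)]   # synthetic "FTD" scores

threshold = 0.475  # midpoint of the two synthetic group means

def predict(score):
    """Classify a grey-matter density score: 1 = FTD, 0 = Alzheimer's."""
    return 1 if score >= threshold else 0

correct = sum(predict(s) == 0 for s in alz) + sum(predict(s) == 1 for s in ftd)
accuracy = correct / (len(alz) + len(ftd))
print(f"accuracy: {accuracy:.0%}")
```

With overlapping distributions like these, a single threshold can never reach 100% accuracy, which is why the researchers position the method as a screening step, with lumbar puncture or PET reserved for borderline cases.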

Although this figure is some way off an ideal 100%, it could still be a useful screening tool, say the researchers.

Lead researcher Dr Corey McMillan said: "This could be used as a screening method and any borderline cases could follow up with the lumbar puncture or PET scan."

Dr Simon Ridley, Head of Research at Alzheimer's Research UK, said: "This small study suggests a potential new method for researchers to distinguish between two different types of dementia, and a next step will be to investigate its accuracy in much larger studies involving people without dementia.

"While this method is not currently intended for use in the doctor's surgery, it may prove to be a useful tool for scientists developing new treatments. The ability to accurately detect a disease is vital for recruiting the right people to clinical trials and for measuring how well a drug may be working.

"Ultimately, different causes of dementia will need different treatment approaches, so the ability to accurately distinguish these diseases from one another will be crucial."

The only drug currently licensed in England and Wales for treating frontotemporal dementia is rivastigmine.
There are four licensed treatments for Alzheimer's - donepezil, galantamine, rivastigmine and memantine.

Dementia
  • There are many causes of dementia, with Alzheimer's the most common
  • More than half a million people in the UK have Alzheimer's disease
  • Frontotemporal dementia tends to affect people who are younger - under 65 - and can affect personality and behaviour
  • Other types of dementia include vascular dementia and dementia with Lewy bodies

Big bets on the brain

Expect rapid advances as science learns to control—and be controlled by—the organ that does not accept stasis
The biggest advances in neuroscience came only in the 1990s. As 2013 rolls in, prepare to change and be changed
Change does not roll in on wheels of inevitability, said the great American civil rights icon, Martin Luther King Jr. King’s inspiration, Mahatma Gandhi, said we must be the change we want to see. The man who never agreed with Gandhi, Winston Churchill, concurred: To improve is to change, Churchill said, to be perfect is to change often.
Humans are defined by their ability to change things. They change their environment, themselves and the course of history. This is because at the core of their being is an organ that does not accept stasis.
The human brain is changing all the time, learning, adapting, reprogramming and rewiring itself. When it experiences something new, it changes. Indeed, reading this article is changing your brain, which means, of course, that we can guide or shape these changes.
If you asked me which scientific frontier excited me the most this year, I would say brain research, specifically the brain-machine interface and neural engineering. These areas are likely to see great advances next year—converting, as it were, science fiction into fact sooner than we imagine.
My year began at the University of California, Berkeley, where Brian Pasley and Jack Gallant offered me varied journeys into the human brain. Pasley, a post-doctoral researcher at the neuroscience programme, was part of a team that decoded brain waves and replayed them as—somewhat slurry—words. Gallant, a neuroscience professor, headed a team that used computers to record neural activity and play back—hazy and grainy—movie clips that volunteers had previously seen.
These are small but significant advances in the great search to unlock the secrets of memory and consciousness, critical elements in understanding how to rewire the brain and guide its neural networks towards new frontiers: coaxing speech from a paralysed person; accessing the mind of a patient in coma; building artificial limbs that respond directly to the brain’s commands; growing neurons artificially and connecting them to the body’s natural, neural pathways.
The convergence of advances in a variety of fields—from engineering to neuroscience—is helping us tinker with the brain. For instance, consider the two challenges in creating a prosthetic directly controlled by the brain. One, human nerves and electronic wires use radically different modes of communication. Two, the body’s immune response to foreign objects, such as wires and other electronics, scars and impairs tissue needed to keep prosthetics in good order.
“Advances in nanotechnology and tissue engineering...are addressing both challenges,” write D. Kacy Cullen and Douglas H. Smith of the University of Pennsylvania’s Center for Brain Injury and Repair in the January 2013 issue of Scientific American. “Rather than trying to force nerves to communicate directly with the standard electronics in modern prostheses, we and others are building new kinds of bridges between nerves and artificial limbs—linkages that take advantage of the nervous system’s inborn ability to adapt itself to new situations.”
Today’s techniques are cumbersome, but advances will come hard and fast, as they always have in science. For instance, it is generally known that Alexander Graham Bell made the world’s first telephone call in 1876. What isn’t as well known is that he demonstrated the first wireless telephone message only four years later. So, Pasley and his colleagues implanted electrodes in the brain, while Gallant’s subjects lay prone for up to three hours in an MRI machine that recorded their neural activity. But as computing power and other techniques develop, it should not be long—perhaps in this decade—before “thinking caps” record and replay what you see and think.
“Once we know what the brain is telling us through patterns of brain activity, we can work backwards and start to get at the fundamental language of the brain—how simple digital outputs from massive populations of neurons code for complex sensations, emotions, thoughts and actions,” Charan Ranganath, a neuroscientist who runs the Dynamic Memory Lab at the University of California-Davis, told me earlier this year.
These patterns have clinical implications, of the kind I referred to earlier—developing prosthetic implants and brain-computer interfaces for people with motor, sensory or cognitive problems.
As always, there are dark sides to these advances. Could discerning patterns from brainwaves lead to the involuntary extraction of information by security agencies and terrorists? The short answer is yes. Brain development has led humans to greater conflict and simultaneously pushed them to new achievements, one often leading to the other. As Cullen and Smith note, “much of the progress in prosthetic design has occurred as a result of armed conflict—most recently the wars in Afghanistan and Iraq.”
Elements of the sciences that probe the brain are not new. Social cognitive theory, which explains how people change by watching others, dates to the 1940s. But the biggest advances in neuroscience came only in the 1990s. As 2013 rolls in, prepare to change and be changed.