Monday, March 1, 2010

Earl Miller: Top-Down and Bottom-Up Brain Processes

Earl Miller is the Picower Professor of Neuroscience at MIT. His paper, "An Integrative Theory of Prefrontal Cortex Function", has been designated a Current Classic as one of the most cited papers in Neuroscience and Behavior. He uses fMRI and implanted microelectrodes in humans and monkeys.
We talked about top-down and bottom-up brain functions and how they apply to brain dysfunction and optimal functioning.
Earl K. Miller, Ph.D. may be one of the most widely covered neuroscientists, media-wise, with his work and wisdom covered by the Washington Post, NY Times, Scientific American, Harvard Business Review, ABC News, The Scientist, MSNBC, Boston Globe, Financial Times, Discover, USA Today and many more.

He is the Picower Professor of Neuroscience and Associate Director of The Picower Institute for Learning and Memory.
The Miller Lab uses experimental and theoretical approaches to study the neural basis of cognition. We investigate how categories, concepts, and rules are learned, how mental flexibility is achieved, how attention is focused, and, more generally, how the brain coordinates goal-directed thought and action.
Our goal is to construct more detailed, mechanistic accounts of how executive control is implemented in the brain and its dysfunction in diseases such as autism, schizophrenia, and attention deficit disorder.
The Picower Institute for Learning and Memory and Department of Brain and Cognitive Sciences at the Massachusetts Institute of Technology

Notes and some transcription from the interview:
Generally, when we talk about top-down and bottom-up in the brain, we're talking about raw sensory input -- what's out there in the outside world -- that's bottom-up.
And top-down is the knowledge you use to decide what's important in the outside world, and what decisions you make based on those sensory inputs.

Deciding what's important to pay attention to.
In bottom-up we're talking about the raw information in the outside world, and in top-down we're talking about the knowledge we have in our heads in order to act on that information.
They study the electrical activity of the brain and look at how the brain processes bottom-up sensory inputs and then uses acquired knowledge to filter things out.
The hierarchical signals that control sensory inputs come from the part of the brain called the prefrontal cortex, behind the forehead -- the executive part of the brain.
Top-down information comes from the prefrontal cortex. It synthesizes all the information coming into the brain and decides what's important.
The goal: treat brain diseases where executive brain control -- where decision making -- is dysfunctional.

One area where there is real interplay between top-down and bottom-up is the world today. 100 years ago there was a lot less distraction than there is now. It was relatively easy to decide what to pay attention to because there weren't many alternatives. In today's day and age, we have all sorts of things vying for our attention: TV, the internet, our handheld devices.
So it takes much more top-down control to deal with the complex, dynamic demanding world we're in now.
Many people have dysfunctions with that. Many people have trouble focusing, staying on task, and filtering out distractions, and they get side-tracked.
Our ultimate goal is to find out what parts of the brain are responsible for top down control and fix that in people who need it strengthened.

Rob: What are some examples of disease or dysfunction?
Earl: ADHD is a primary one -- people who have trouble keeping focused, people who are easily distractible. There are also decision-making problems...
People who have normal cognition can keep goals on track. Some people with prefrontal cortex problems lose sight of the goals, the context, the big picture; their minds are buffeted by bottom-up sensory inputs and they have very little top-down control -- they lose the forest, the big picture, and get lost in the trees.
Rob: My experience with neuroscience and the way the mind works has taught me that there's a need for balancing taking place.
Earl: Absolutely.

Rob: So you need the bottom up. Are there also problems with the ways that information comes in from the bottom up?
Earl: Well, you can have certain types of impairments -- blindness or deafness, or more subtle examples where some people have trouble recognizing faces. And that's an example of a deficit in a bottom-up sensory input. The main thing we're hitting on is the balance, just as you described. There's a balance of top-down and bottom-up. There's always an optimal way of balancing the two.
He describes a theory of balancing top-down and bottom-up considerations: two parts of the brain are involved. The prefrontal cortex (PFC) -- the largest and most developed part of the cortex in humans -- handles top-down control, and it interacts with more primitive brain structures, the basal ganglia, which are so ancient that even reptiles have them. The basal ganglia are involved in learning important details about the outside world, assigning value to all the things around us. They do so by assigning value to individual things, individual details of everyday experience.
The PFC takes all the details learned by the more primitive basal ganglia and puts them together into the big picture -- the big idea about what's going on -- so you have a better idea of what sort of goals are available out there and what sort of behaviors you have to engage in to attain those goals, both long term and short term.
We think normal cognition is the balance between these more detail-oriented, primitive structures that assign value to the details of the things around us and the PFC, which is involved in getting the big picture.
When the Bottom-Up Brain Goes Out of Whack: OCD, Autism
And in some cases this system can go out of whack. If the basal ganglia become too strong relative to the PFC -- if they do their job too efficiently -- then you have a brain that's oriented toward details and misses the big picture.

Rob: Like OCD?
Earl: Exactly. OCD involves dysfunction in the frontal lobes -- the PFC and basal ganglia. And that's a case where the brain focuses on particular details and gets caught in a rut.
Now, another example of a brain where the imbalance is dysfunctional may be the autistic brain. Autistic brains figuratively cannot see the forest for the trees. They focus on every little detail of everyday life. For example, one thing that parents of autistic kids report is that they'll painstakingly teach their kids to do things on their own, like brush their teeth. Then one day they'll come home with a new toothbrush that's a different color -- a blue toothbrush instead of a red toothbrush -- and the kid completely falls apart. It's almost as if they're learning the task all over again just because a little, irrelevant detail of the toothbrush has changed: its color. You and I know that the color of the toothbrush doesn't matter; it's what the toothbrush does. But in an autistic brain every little detail is as important as every other detail, so if something new happens, they completely fall apart -- almost like they're seeing it for the very first time. If you have a brain like that, every time you look at something from a new angle it becomes a new learning experience instead of picking up where you left off, and that's probably why you see such severe learning disabilities in a brain like that.

Rob: So an autistic has a predominance of the basal ganglia effect and the PFC is not up to doing the integration and interpretation?
Earl: Yes. This is just a hypothesis based on data and tests done over the years -- and we're now doing a direct test of it. There's lots of supporting information. The general idea is that what we call plasticity mechanisms in the brain -- the mechanisms of the brain that allow for re-wiring -- are too strong in the basal ganglia. The basal ganglia learn the details too well and overwhelm the ability of the prefrontal cortex to get the big picture.
Normal cognition is a balance between the more detail-oriented, primitive structures and the prefrontal cortex. In some cases the balance gets out of whack.

The autistic brain cannot see the forest for the trees; it focuses on all the details in everyday life.
Example: brushing teeth.
Temple Grandin.
Plasticity mechanisms are too strong in the basal ganglia.
Brain games for improving attention and multitasking -- computer games to improve your brain.
With better understanding...
The grassroots, the outside world, sort of form its structure. 16:00
Overactive prefrontal cortex -- some people are more prefrontal, more goal oriented. Others just live for the day. That's going to be dysfunctional.
Personality disorders; too rigid, totalitarian, otherwise very driven and opinionated.
Stimulus-bound behavior.... 23
Delusions, schizophrenia are top down disorders.
Using fMRI and micro-electrodes, they monitor the activity of individual neurons, in monkeys AND humans.
The main difference between the human and animal brain is the size of the prefrontal cortex: one third of the cortex in humans, 16% in monkeys, 4-5% in other animals.
Humans have a lot of folds in the brain, which add surface area.
The biggest difference between the human mind and the animal mind is that the human mind gets the big picture easily.
Brain synchrony "favored" (38:00) -- phase synchrony: lining up thoughts and organizing them with different synchronies.

Mind Over Matter: Brain Control Interfaces Become a Reality_4

The Future
ET: Can we expect the normal evolution of technology to reduce, if not eliminate, those performance limitations?
GS: That's a complicated question. Just making your headset smaller, with a more compact package, is not going to eliminate the subject-training requirement. Now if someone created a robust dry electrode that can operate well in a noisy environment, that would clearly make things better. That is something that future technology could provide. There are parts that can be mastered with technology, but there are always going to be parts that are difficult when you are recording signals from the scalp.
ET: Do you think we will be able to move away from primarily recognizing motor activity in the brain and start detecting patterns in areas like the visual and auditory regions? Also, what about those stories involving the use of fMRI machines to do just that kind of advanced detection?
GS: In regard to your second question, it is unlikely in my opinion that we will ever see an MRI machine small enough and portable enough to be worn on your head. So that makes those devices an unlikely candidate for a wearable BCI and will keep them confined to the laboratory. In response to your first question, we are already using ECoG to detect what vowels and consonants you are speaking, and when you imagine speaking them. We've had several abstracts published on this research over the past year or so and we are going to submit the study we've done on this very soon. I think that soon we'll be able to detect what words you are speaking and imagining speaking. We're not there yet, but we are getting closer.
ET: How accurate is the system with vowels and consonants?
GS: Only 40 percent, and even that is when using a small set of choices consisting of either four vowels or four consonants and trying to pick one of them out of the set, which isn't all that spectacular especially when you compare that to a typical speech recognition system. However there are a lot of areas in the system that we think we can improve, and we have bright hopes for the future.
(Note: the claimed accuracy of several commercially available speech recognition systems is well into the high 90 percent range.)
ET: Is there anything else you'd like to say about the future of BCI?
GS: I believe that the future of BCI revolves around man and machine meeting each other midway, just like you and I do when talking. I may have a funny accent that you find hard to understand, but over time you learn to adapt to it and to understand me better. Just like with motor imagery, the solution lies with the optimization of both brain and the computer.

Mind Over Matter: Brain Control Interfaces Become a Reality_3

EEG vs. ECoG Control
ET: What is the state of the art in recording neuron activity, and can you compare the detection capability of electroencephalogram (EEG) [non-surgery] versus electrocorticography (ECoG) [invasive] solutions?
GS: At the moment, we don't have a very good way to record from a lot of neurons over long periods of time. A sensor that could be implanted and stay in place and produce a good signal for a long time, while recording a large number of areas in the brain with high fidelity, currently does not exist. The ECoG electrodes are probably the closest to what I just described and have a clear technical trajectory to satisfying those requirements. It is important to understand that the technique does not record individual neurons but records the activity of a lot of neurons at once. However, ECoG can distinguish and detect the activity of much smaller regions of neurons than EEG. EEG, since it is much further away from the neurons, can record only fairly gross patterns of activity in the brain. With ECoG you can presumably pick up signals that are only a millimeter apart, a 1mm spatial resolution in detecting neuron activity. EEG's spatial signal resolution is two orders of magnitude worse since it is in the centimeter range. It's very similar to being in a football stadium and you are sitting next to six other people and you can hear what they say. If you go to a particular sector you can hear some people cheer for one team or the other. If you are outside the stadium you can't even hear that, but you can tell by the crowd when someone has scored a touchdown.
Brain Color-Coded: how much information about the direction of hand movements is encoded by ECoG signals in different brain areas.
ET: Can you comment on the current crop of consumer BCI devices which include NeuroSky's single-electrode, OCZ's three-electrode, and Emotiv's 14-electrode headsets?
GS: All these devices are trying to apply the technology to a particular area: video gaming. I'm not sure that the number of electrodes is important. From an industry and venture capital perspective, what's important is how much money it will make, and that is critically dependent on creating a product that people are willing to spend money on. The jury is still out, and it is not clear whether or not they can create a product that can sustain an industry and has more than just a geek or wow factor. NeuroSky has some big partners like Mattel, which uses its headset with its Mindflex game. And then there's the Uncle Milton Star Wars Force Trainer. But it is too early to gauge their success. As long as the game is fun they'll be successful. Clearly, adding the words "brain control" or "brain device" is going to add to the wow factor, but what they actually record remains to be seen. Given the location of where the electrodes are placed, they are likely to pick up the muscle activity from grimacing or making other facial expressions while concentrating, rather than picking up brain activity. But it is possible that they could be picking up brain activity too. In the end, I feel the commercial success of these games will outweigh the technical details of what signals they are actually picking up.
ET: The videos I've seen of some of the Emotiv headsets show people gesturing while thinking in order to help signal detection. Also, it takes the system about six seconds to detect a response. Can EEG based headsets produce better results than these?
GS: It already does, as we have shown in the laboratory. As I mentioned earlier with the published thought control demonstration using a 3D cursor, the subject could hit one of four to eight targets in a second or two. You move the cursor towards the target with the cursor location being updated approximately every 50 milliseconds.
ET: Is that an EEG or ECoG system?
GS: EEG. ECoG of course can also do this, but that system used scalp-recorded EEG technology. However, that level of success requires a lot of training on the part of the subject. It also requires a very robust, reliable environment like a laboratory, where a highly trained technician takes a half hour to set up the subject and a conductive gel is applied to the sensor before putting it on the scalp. This is an environment quite different from -- and superior to -- what the average consumer will have; consumer devices typically use dry electrodes that do not perform as well. Commercially, if your game requires your customer to have a laboratory environment, and forces them to train for three months before they can play, then it's not going to sell very well. Another factor is the amount of preparation that the consumer has to do when setting up and tearing down the equipment for each session, which will also affect the consumer's enjoyment of the game. The reality is that you have to create a product that can perform in a noisy environment and with a minimal amount of training, preparation, and teardown time. Therefore the level of performance you can expect from a consumer device has to be a lot less.
(Note: The Emotiv headset requires you to moisten and then seat 16 sensor pads before wearing the headset and put them back in their storage case when you are done, a process that combined takes about 10 minutes. Note that although they are moistened, you do not have to apply a messy conductive gel.)

Mind Over Matter: Brain Control Interfaces Become a Reality_2

Current Uses of Brain Control Interfaces

Picture courtesy of Dr. Jonathan Wolpaw, Wadsworth Center, Albany, NY.
Letter Picking
ET: Can you give me an example of a recent research application of your technique?
GS: We're about to publish a study where we demonstrate by far the fastest system that has ever been shown working with human subjects -- possibly with monkeys too, but I'm not sure. The system allows a subject to type over 20 characters per minute purely by thought alone. That's still a lot slower than typing, but compared to a system where, for example, the subject has to blink their eyes as a cursor slowly scans over the alphabet, it's a big improvement.

ET: Was the subject disabled in some way?
GS: No, the subject involved had epilepsy. Since those patients have to have electrodes implanted anyway as part of the pre-operative surgical protocol for treating epilepsy, it gives us the perfect opportunity to do research with them.
ET: Will you ever get an implant yourself?
GS: I don't think my wife will let me so probably not.
ET: I thought most spouses wanted greater control over their partners?
GS: If you're willing to go off the record then I might have some answers for you!
ET: Tell me about the SIGFRIED project.
GS: SIGFRIED is a technology that allows brain activity to be visualized in real time for clinical uses. It's been acquired by and will be put to use by several clinics both in the United States and in Europe for the purpose of localizing activity in the brain prior to invasive brain surgery.
ET: Are you a software programmer or hardware designer yourself?
GS: I've been programming computers for 25 years now.
ET: What language is SIGFRIED written in?
GS: C++.
ET: Are some people naturally more talented at using a BCI than others?
GS: Using a BCI is a skill like any other. Some people are naturally better at it, while others never seem to be able to pick it up. Sometimes it's a natural-talent issue, while in many other cases it's more an issue of motivation.
ET: What is the maximum number of disparate signals you can detect reliably with a BCI?
GS: You have to distinguish between the number of signals being extracted in what we call "open loop" fashion where we just monitor what the brain is doing and the number of signals that people are actually using to control a device. When they actually control the device I think the maximum number is three, at least in humans. For example, controlling a cursor in three dimensions simultaneously requires mapping one signal to the control of one axis. In other examples where you simply monitor what the brain is doing, you can record and decode more than three signals at the same time. We just published a study where we showed it's possible to determine from the brain signals how people are moving each of the five fingers on a hand and detect which ones they were flexing. This means you can not only tell which finger is moving but how it's moving just by looking at brain signals.
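The "one signal per axis" mapping Schalk describes for closed-loop cursor control can be illustrated schematically: three independently decoded brain-signal features each drive exactly one cursor axis, applied as a velocity at each ~50 ms update. This is only a sketch of the idea; the feature values and the gain are invented, and `update_cursor` stands in for a real decoder pipeline.

```python
# Schematic sketch of closed-loop 3D cursor control: each of three
# decoded signal features drives exactly one axis. Values are invented.

def update_cursor(position, features, gain=1.0, dt=0.05):
    # One decoded feature per axis, applied as a velocity every 50 ms (dt).
    return tuple(p + gain * f * dt for p, f in zip(position, features))

pos = (0.0, 0.0, 0.0)
# Simulated decoder output for one update tick: (x, y, z) drive signals.
pos = update_cursor(pos, (2.0, -1.0, 0.5))
print(pos)  # → (0.1, -0.05, 0.025)
```

Decoding more signals in "open loop" works the same way in principle; the constraint Schalk notes is on how many a subject can actively use for control at once.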
ET: If only three signals are currently being detected, how did those people select letters for "thought typing" in the system you mentioned?
GS: That system was actually quite simple and uses only one signal. The subject watches a matrix of characters. Whenever the right row or column is flashed -- the one that contains the character the subject wants -- the brain produces a response that's different from when one of the unwanted rows or columns flashes. Of course, it could potentially be faster if it weren't a binary system and you could detect what word or character you were thinking of. That's something we are working on at the moment, and we are having somewhat encouraging results.
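The row/column scheme Schalk describes can be sketched in a few lines: flash each row and column of a character matrix, note which flashes evoke the distinctive brain response, and take the intersection. This is a hypothetical simulation, not the lab's actual code; the matrix layout is invented, and `evokes_response` is a stand-in for the real EEG classifier.

```python
# Hypothetical sketch of a row/column "thought typing" speller.
# A real system classifies EEG epochs; here evokes_response() is a stand-in
# that simply checks whether the flashed set contains the attended character.

MATRIX = [
    "ABCDEF",
    "GHIJKL",
    "MNOPQR",
    "STUVWX",
    "YZ1234",
    "56789_",
]

def evokes_response(flashed_chars, attended_char):
    # Stand-in for the EEG classifier: the brain responds when the
    # flashed row or column contains the character the subject wants.
    return attended_char in flashed_chars

def pick_letter(attended_char):
    # Flash each row and note which one evokes a response.
    row = next(r for r in range(6)
               if evokes_response(MATRIX[r], attended_char))
    # Flash each column likewise.
    col = next(c for c in range(6)
               if evokes_response([MATRIX[r][c] for r in range(6)],
                                  attended_char))
    # The selected character is the intersection of the two flashes.
    return MATRIX[row][col]

print(pick_letter("P"))  # → P
```

One binary response per flash is all the decoder needs, which is why Schalk calls the system simple despite its speed.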

Mind Over Matter: Brain Control Interfaces Become a Reality_1

The Interview
ExtremeTech: What part of the brain does your technique monitor?
Dr. Gerwin Schalk: In theory, the technique can monitor any part of the brain. The areas most commonly monitored for communication and controlling devices are the areas responsible for controlling motor function. They reflect whether or not you move a hand, or even in what direction you are moving the hand. The basic idea is to first identify the signals that relate to those motor functions. For example, take the brain signals associated with moving your hand, with not moving your hand, with simply imagining moving your hand, or with not thinking of moving your hand. Once you know that signal, you can start associating it with a particular output. In a very simple case you would associate imagining moving your hand with the answer "yes," and another signal with the answer "no." This allows you to answer simple yes-or-no questions just by imagining hand movements. Obviously there are more complicated versions of this.
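The yes/no scheme Schalk outlines reduces to binary classification: extract a feature from the motor-cortex signal and threshold it. This toy sketch uses a crude mean-squared-amplitude feature; the sample values and the threshold are invented for illustration, and real systems use far more sophisticated signal processing.

```python
# Toy sketch of the yes/no mapping: compute a simple power feature over
# an epoch of motor-cortex signal and threshold it. The sample values
# and THRESHOLD are invented for illustration.

THRESHOLD = 0.5  # hypothetical decision boundary on the decoded feature

def epoch_power(samples):
    # Mean squared amplitude as a crude signal-power feature.
    return sum(x * x for x in samples) / len(samples)

def decode_answer(samples):
    # Stronger motor-imagery feature -> subject imagined moving -> "yes".
    return "yes" if epoch_power(samples) > THRESHOLD else "no"

print(decode_answer([0.9, -1.1, 1.0, -0.8]))  # imagery epoch → yes
print(decode_answer([0.1, -0.2, 0.1, 0.0]))   # rest epoch → no
```

The "more complicated versions" Schalk mentions follow the same pattern, just with more signals and richer output mappings than a single binary choice.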
Basic BCI Design
ET: Are the neurons in that part of the brain a lot stronger, making them easier to identify?
GS: I think it's because those areas of the brain are already set up for controlling things and for interacting with the environment, for outputting something which in this case is a motor command. Other areas of the brain are more set up for handling input, like the part of the brain that processes visual input. An area like that would not be particularly well suited to controlling something, because it would be particularly easy for signals there to be modified just by turning on the light. You want an area that can execute a particular function no matter what you're seeing, hearing, or smelling. That's why the motor system is a pretty good area to control devices—because it controls its environment in normal behavior.
ET: Is there a qualitative difference in the brain activity that occurs when you imagine moving a limb and when you actually do it?
GS: There's been a pretty big debate about that. Recent research has shown conclusively that the exact same areas of the brain are active when you imagine moving your hand as when you actually move it. That's not the case with speech, for example. When you imagine speaking you only mentally rehearse it; you don't necessarily imagine how to move your face and your mouth in order to produce that speech. What you would see is activity in auditory areas of the brain as you mentally rehearse that speech, but you will not see activity in the motor areas of the brain involved in facial movement. This indicates that when you imagine speaking you are not mentally simulating how to move your face and mouth. In contrast, when you imagine walking your motor system is simulating how to move your arms and legs without actually doing it. However, there is usually a difference in signal strength between imagining limb movement and doing it, with the latter being noticeably stronger. Paradoxically, a recent study that I participated in demonstrated that once you give actual feedback on those brain signals associated with movement imagery -- for example, using them to control a cursor on the screen -- the brain signals for imagined movement can become stronger than those for actual movement. What that means is not entirely clear, but it does show that if you train the brain you can get it to produce stronger activity for imagined movement than it would for actual movement.
ET: That's a powerful change in the normal pattern of brain activity. Have any of your subjects reported substantial changes in their dreams after working with your interface?
GS: Not that I know of. The tasks that we ask them to do are pretty similar to their normal daily behavior. You're not going to dream differently just by imagining moving your hand a few times a day. Of course, everything we do changes our future behavior. A promising line of research involves rehabilitating the brain after a stroke. The idea is to try and rehabilitate the brain in a manner similar to how you rehabilitate muscles after an injury. You try to train the brain to produce stronger activity where it is needed to control muscle movement and restore normal function. At least that's the theory.
ET: What would the surgery be like for getting an implanted BCI, and how risky is the procedure?
GS: The surgery involves opening up the skull and placing a sheet of electrodes on the surface of the brain. In the future a wireless transmitter would be added, making the implant virtually undetectable to anyone but the patient, and the procedure could almost be done on an outpatient basis -- not completely, but nearly so. The procedure would be less invasive than a breast augmentation. It wouldn't be very risky at all.

Mind Over Matter: Brain Control Interfaces Become a Reality

The ability to influence the physical world merely by thought has been a dream of mankind for many years. Now researchers are making real progress in letting people control a PC simply by thinking, and the first crop of consumer Brain Control Interface (BCI) headsets has arrived. Right now these are being used only for simple games, and hardware and applications to support the technology are scarce. But this still represents a major advance that could significantly change how we all interact with computers.
For doctors this means an opportunity to help those who are so physically incapacitated due to spinal injury, brain trauma, or disease that they literally are prisoners of their own bodies. Patients like this are said to have "locked-in" syndrome. For those people, who can't even speak and therefore cannot take advantage of speech recognition software, BCI may someday be their best hope for making a real connection to the outside world.
Efficient control of a PC via BCI is not here yet, but the journey is finally underway. At the forefront of the expedition is Dr. Gerwin Schalk, a 12-year veteran in the field of BCI. Dr. Schalk is a research scientist at the Wadsworth Center, a public health laboratory that is part of the New York State Department of Health. His current brainchild is the SIGFRIED project, the technology that powers the research he is currently doing into brain control interfaces.
An advanced pattern detection and visualization tool, SIGFRIED allows clinics and researchers to better understand and interpret the massive amount of data that comes from the sensors used in brain control technology. As that is the first step in creating signals robust enough to drive an external device like a PC, SIGFRIED is facilitating the creation of BCI-compatible software that responds faster and with more advanced capabilities than previous generations.

During the course of our hour-long interview, Schalk explained the current state of the art for BCI in both research labs and the real world of computer devices. What he had to say reveals a field that has begun in earnest, and is helping to usher in a future that may be stranger than we think -- and arrive sooner than we expect.

Dr. Gerwin Schalk

Change your TV channel by mind control

In the future, we may well be able to control our televisions simply by thinking of what we would rather be watching!
The ultimate remote control for the lazy
The latest research into thought-controlled technology from the University of Washington suggests that mind-controlled computer input is one big step closer to reality.
The research team has discovered that interacting with brain-computer interfaces lets patients create "super-active populations of brain cells" and that "a human brain could quickly become adept at manipulating an external device such as a computer interface or a prosthetic limb."
Or indeed, a television.
Brain-building exercises
The latest research, showing how the brain can control an on-screen cursor, is published this week in the 'Proceedings of the National Academy of Sciences'.
"Bodybuilders get muscles that are larger than normal by lifting weights," said lead author Kai Miller, a UW doctoral student in physics, neuroscience and medicine.
"We get brain activity that's larger than normal by interacting with brain-computer interfaces. By using these interfaces, patients create super-active populations of brain cells."
Medical breakthroughs
Brain Control Interface (BCI) headsets are being developed primarily for medical applications, such as the possibility of helping stroke victims or those suffering from the tragically debilitating "locked in syndrome" whereby they have no means of communicating with the outside world.
Control of a TV or PC via BCI is a spin-off from this research and, while not yet achievable, is looking like it is now a real future possibility.
Research scientist Dr. Gerwin Schalk is one of the leading authorities in the field of BCI, a 12-year veteran in the burgeoning and fascinating area of brain controlled tech. Dr. Schalk is currently working on the SIGFRIED project which lets researchers understand and interpret data from the sensors used in brain controlled tech.
You can see a fascinating and revealing interview outlining how Schalk's latest research brings mind-controlled computing one step closer over at

Sex addiction divides mental health experts

Is extreme sexual acting-out an obsessive-compulsive disorder, a sign of depression, or just bad behavior? 'If we are looking at a disorder, it's not clear what that disorder is,' says one expert.

Tiger Woods, who recently admitted to multiple extramarital affairs, said he is receiving treatment. David Duchovny, who plays a sex-obsessed professor on the TV show "Californication," underwent rehab in 2008. Pop psychiatrist Dr. Drew Pinsky has launched a reality series dealing with the subject.

"Sex addiction" talk seems to be everywhere. But mental health experts are split on what underlies this kind of behavior.

The American Psychiatric Assn. has proposed that out-of-control sexual appetites be included as a diagnosis in the next edition of the psychiatrists' bible, the Diagnostic and Statistical Manual of Mental Disorders, to be published in 2013.

Unlike compulsive gambling, which also is proposed for addition to the new DSM (to be called DSM-5), the proposed new diagnosis -- "hypersexual disorder" -- stops short of categorizing these problems as addictions, and for a reason.

"If we are looking at a disorder, it's not clear what that disorder is," says Michael Miner, a professor of family medicine and community health at the University of Minnesota who advised the DSM-5 committee on sexual disorders. "There is not an agreed-upon name. The research is in its infancy."

Patterns of extreme sexual acting-out are described variously by therapists as an addiction, as a type of obsessive-compulsive disorder, or as a symptom of another psychiatric illness, such as depression.

The lines specialists draw between what is sexually normal or abnormal have long been in flux. Some behaviors, such as pedophilia, are almost universally considered abnormal and have been described in the DSM for decades. Homosexuality was once considered deviant, but that reference was dropped from the DSM decades ago.

Therapists who see patients -- mostly men -- with problems caused by repetitive sexual behaviors, whether sex with consenting adults, pornography or cybersex, said the addition of a hypersexual behavior category is long overdue.

"There is no doubt in my mind that this condition exists and that it's serious," said Dr. Martin P. Kafka, an associate clinical professor of psychiatry at Harvard University who was a member of the DSM-5 work group on sexual disorders.

"There are definitely men who are consumed by porn or consumed by sex with consenting adults -- who have multiple affairs or multiple prostitutes. The consequences associated with this behavior are very significant, including divorce, pregnancy and STDs," Kafka said.

Some studies suggest that hypersexual behavior is indeed similar to an addiction, akin to the loss of control that seizes compulsive gamblers or shoppers.

For example, in a 1997 survey of 53 self-identified sex addicts in a 12-step recovery program, 98% said they had three or more withdrawal symptoms, 94% that they had tried unsuccessfully to control their behavior and 92% that they spent more time engaging in sexual behavior than they intended to.

In addition, screening tests designed for "sexually addicted" individuals have also been shown to accurately identify people with substance abuse problems, implying that the disorders have similarities.

Based on the addiction model, several sex addiction treatment centers have opened in recent years -- including Pine Grove in Hattiesburg, Miss., where rumors have placed Woods. Twelve-step programs, often the foundation of substance abuse treatment, are a staple of such facilities.

But they may not reach far enough, Kafka said. Many patients with hypersexual behavior relapse after 12-step programs, he said, because they haven't addressed other issues in their lives. He believes that certain moods or psychiatric conditions cause sexual behavior to become disinhibited and abnormal.

In a 2004 study of 31 self-defined sex addicts, for example, researchers at the Kinsey Institute for Research in Sex, Gender, and Reproduction at Indiana University found that most of the individuals had an increased interest in sex when they were in depressed or anxious emotional states.

The ramped-up sexual behavior may be linked to changes in levels of key brain chemicals, such as serotonin, that occur when people experience mood disorders, some scientists think. These chemical changes might lift sexual inhibitions.

Impulsivity scores are also higher in sexually overcharged men, Miner and colleagues found in a study comparing eight men with compulsive sexual behavior to a control group.

The report, published in November in the journal Psychiatry Research, was one of the few studies to examine the brain physiology of such individuals. It showed that the hypersexual men had distinct patterns of activity in the frontal lobe region of the brain. The pattern, however, did not match that of patients diagnosed with other kinds of impulse control problems.

Maureen Canning, director of the sexual disorders program at the Meadows treatment center in Wickenburg, Ariz., has another theory. Based on anecdotal experience, she said, she believes that trauma in childhood, such as sexual abuse or witnessing of sexual behavior, disrupts normal development and drives hypersexuality in adulthood.

"When these children grow up . . . they become obsessed about correcting the trauma," said Canning, author of the 2008 book "Lust, Anger, Love: Understanding Sexual Addiction and the Road to Healthy Intimacy."

Attempting to understand what causes hypersexual behavior goes beyond curiosity: It lies at the heart of crafting effective treatments. But there are few studies on what works, Kafka said.

Meanwhile, some outspoken critics doubt that hypersexual behavior is a disorder at all. They argue against creating a label that can stigmatize people, or provide excuses for what is just plain poor conduct.

It's alarming "for a group of psychiatrists to try to legislate how much sex we can enjoy before we're labeled mentally ill," said Christopher Lane, a Northwestern University literature professor and author of a 2007 book criticizing mental health professionals for ever-expanding ideas of what constitutes abnormal behavior.

Lane suggested that the rush to reclassify some behaviors as treatable conditions is driven in part by business interests: Treatment centers pop up. The pharmaceutical industry offers pills as remedies.

What counts as out-of-bounds sexual activity does vary by culture, Miner said.

"Sex in the United States is a very odd phenomenon. We are probably one of the more sexualized societies in the world and also one of the most puritanical," he said. "You wonder, if Tiger Woods was a French golfer, whether this . . . would have been such a big deal."

Biomarkers of brain damage in HIV infection

HIV budding from an infected cell. Credit: NIAID

The global HIV/AIDS epidemic is still hitting hard. During 2008, there were 33.4 million people living with HIV and an estimated 2.7 million people newly infected, according to the World Health Organization. These high figures inevitably led to a large number of deaths from AIDS, an estimated 2 million in that year.
There have been improvements in HIV treatment that control the infection and lengthen the life expectancy of infected people, but 95% of the infections and deaths occurred in developing countries, where adequate nutrition and health care are hard to come by.
There is still some confusion between HIV and AIDS. The initial infection is brought about by the human immunodeficiency virus (HIV) which attacks the white blood cells called CD4+ cells. These are involved in helping other types of immune cell to seek and destroy germs within the body. As HIV infection progresses, more CD4+ cells are killed and the ability to fight infection is weakened.
The acquired immunodeficiency syndrome (AIDS) represents the later stages of HIV infection when a person has precariously low levels of CD4+ cells and suffers from other infections such as pneumonia, tuberculosis, fungal infections and cancer.
A further late-stage problem is dementia, brought about by cognitive deterioration of the brain. Many autopsied brains from affected individuals have revealed the presence of HIV-encephalitis. However, it is not always easy to determine when someone infected by HIV is suffering from brain injury because the condition is clinically silent. It may show up in brain imaging scans but these are expensive to operate and not amenable to screening.
Some biomarkers of neurological damage have been proposed but there are no well-validated procedures as yet. This deficiency in HIV research has prompted researchers in the USA to undertake a proteomics study in search of biomarkers of HIV-induced brain injury, which might be able to indicate the degree of progression and help develop ways to preserve cognitive function.
Ann Ragin and co-researchers from Northwestern University, Johns Hopkins University, the Harvard-MIT Division of Health Science and Technology, North Shore University Health System (Evanston, IL) and the Children's Memorial Hospital, Chicago, cooperated in the study. Their subjects were eight men and two women aged 38-62 who were in advanced-stage HIV infection.
The neurological status of each individual was determined by several standard tests, including the Karnofsky performance scale for assigning functional status and the Memorial Sloan Kettering procedure for cognitive impairment. In addition, the viral loads, absolute CD4 cell counts, body mass index and haemoglobin levels were measured.
The participants were subjected to two principal tests to assess changes that had taken place over a three year period. The levels of 22 potential biomarkers, measured by a proteomics technique, were correlated with the progression of brain injury, determined by quantitative magnetic resonance imaging in vivo.
The biomarkers were determined in plasma using Luminex-based technology in which fluorescent, colour-coded microspheres were coated with antibodies to trap the specific protein targets. The beads were then passed through a laser beam to measure the fluorescence intensities.
The MRI studies used automated brain segmentation algorithms to derive volume fractions of grey and white matter and cerebrospinal fluid. Diffusion tensor imaging (DTI), which has been shown to be sensitive to changes in the brain induced by HIV, was also employed to measure the diffusivity of water molecules.
The data from the tests at year zero revealed a correlation between the level of the monocyte chemoattractant protein-1 [MCP-1, also known as chemokine (C-C motif) ligand 2] and the loss of white matter integrity due to axonal disruption.
Three years later, the correlation of MCP-1 to brain injury was more extensive. It was linked to whole-brain DTI measurements, as well as brain volumetric measurements, including reduced grey matter and increased CSF volume. It also correlated with changes in brain parenchyma volume. These neurological modifications were consistent with irreversible neuronal loss and atrophy, both common occurrences in HIV-dementia.
The inferred link between MCP-1, brain damage and the resultant cognitive impairment backs up several reported studies. HIV is imported into the brain on activated monocytes in the blood, and MCP-1 enhances their progress. Dementia has been linked with the levels of activated monocytes entering the brain. MCP-1 may also be involved in viral replication within the brain.
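The core of the analysis described above is a correlation between a plasma biomarker level and an imaging measure of brain injury. A minimal sketch of that idea, with invented subject data and names (this is not the study's actual pipeline or its real measurements), might look like:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data for ten subjects: plasma MCP-1 level (pg/mL) and a
# DTI-derived whole-brain diffusivity measure (arbitrary units).
# All values are invented for illustration only.
mcp1 = [210, 340, 180, 420, 300, 260, 390, 150, 310, 280]
diffusivity = [0.72, 0.81, 0.70, 0.88, 0.78, 0.75, 0.85, 0.69, 0.80, 0.76]

r = pearson_r(mcp1, diffusivity)
print(f"MCP-1 vs. diffusivity: r = {r:.2f}")
```

A high positive r here would mirror the reported pattern: subjects with more circulating MCP-1 showing greater water diffusivity, a marker of white matter damage. The real study would also need significance testing and correction for the 22 biomarkers examined.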
The combination of multiplexed proteomics techniques with automated image analysis is ideal for high-throughput analysis, requiring little human intervention. It holds great promise for identifying further factors associated with neurological regression in HIV infection, which can be clinically silent, and could ultimately help in the preservation of brain and cognitive function.

Brain 'hears' sound of silence

Your brain can hear the sound of silence. (Source: iStockphoto)

While we tend to characterise silence as the absence of sound, the brain hears it loud and clear, US researchers have found.
According to a recent study from the University of Oregon, some areas of the brain respond solely to sound termination.
Rather than sound stimuli travelling through the same brain pathways from start to finish as previously thought, studies of neurone activity in rats have shown that the onset and offset of sounds take separate routes.
"This is something we see a lot of in the brain: that features which are important for perception are computed and then explicitly represented," says Michael Wehr, lead researcher and psychologist at the University of Oregon's Institute of Neuroscience.
Knowing how the brain responds to, and organises, sounds could lead to better treatment for those who have hearing loss.
Sound information moves through the cochlea and the auditory cortex, the part of the brain responsible for processing sound, as a series of vibrations.
By measuring the frequency of those vibrations before and after exposure to brief noises, Wehr and his team discovered that neurones sort the start and end of sounds through separate channels.
"In the auditory system, information about the onset and offset of a sound is implicitly contained in the firing of neurones close to the sensory receptors, but is explicitly represented by on-responses and off-responses in higher brain areas," says Wehr.
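The idea of turning implicit timing information into explicit "on" and "off" events can be illustrated with a simple signal-processing sketch (this is illustrative only, not the researchers' method): given an amplitude envelope sampled over time, emit separate onset and offset events when the level crosses a threshold.

```python
def onset_offset_events(envelope, threshold=0.5):
    """Return sample indices where the level crosses the threshold
    upward (onsets) and downward (offsets), as two separate lists --
    loosely analogous to distinct on- and off-responses."""
    onsets, offsets = [], []
    above = False
    for i, level in enumerate(envelope):
        if not above and level >= threshold:
            onsets.append(i)
            above = True
        elif above and level < threshold:
            offsets.append(i)
            above = False
    return onsets, offsets

# A toy envelope: silence, a burst of sound, silence, a second burst.
env = [0.0, 0.1, 0.8, 0.9, 0.7, 0.1, 0.0, 0.6, 0.9, 0.2]
on, off = onset_offset_events(env)
print(on, off)  # onsets at samples 2 and 7, offsets at samples 5 and 9
```

The point of the sketch is that onsets and offsets come out on separate channels, each marking a distinct event, rather than the offset being inferred from the end of a continuous response.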

Sensing speech

These discrete responses are especially important for language processing.
"Examples are the distinction between 'chop' and 'shop,' or between 'stay' and 'say,'" says Dr Marjorie Leek, a research investigator for the National Center for Rehabilitative Auditory Research.
"In both of these examples, there's a short, transient-like difference either on the beginning of one of the words or within the syllable. Onset and offset responses would be critical to perceiving these cues related to silence."
Although different neurones may respond to sound onsets and offsets, the brain relies on all of them equally to correctly decipher the timing, source and motion of sounds.
"One of the major challenges of the entire ear-brain system is to preserve precise timing information that is ubiquitous in human speech, that supports information about localisation of sound in space, that allows a listener to separate sound sources that are occurring simultaneously, that help to suppress echoes in a highly-reverberant space, and that provide cues to auditory motion," says Leek.
For people with hearing problems, the auditory cortex doesn't properly encode frequencies or temporal cues necessary for understanding and recognising sound information.
Better knowing how the brain organises and groups sounds could lead to more effective hearing therapies and devices, although Wehr recognises that there's still much follow-up research to complete.

The Aging Brain Is Less Quick, But More Shrewd

Lifelong learning and brain stimulation can help increase memory and decision-making ability, according to neuroscientists.

For baby-boomers, there is both good news and bad news about the cognitive health of the aging brain.

Brain researcher Gary Small from UCLA conveys the bad news first: "Reaction time is slower," he says. "It takes us longer to learn new information. Sometimes it takes us longer to retrieve information, so we have that tip-of-the-tongue phenomenon — where you almost have that word or that thought. That's typical of the middle-age brain."

As we age, our ability to multi-task diminishes. "We're quick, but we're sloppy when we're in middle-age. We make more errors when we're in middle age," says Small.

The Older, But Wiser, Brain

But Small has found that it's not all bad news. He points to a continued improvement in complex reasoning skills as we enter middle age. Small suggests that this increase may be due to a process in the brain called "myelination." Myelin is the insulation wrapped around brain cells that increases their conductivity — the speed with which information travels from brain cell to brain cell. And the myelination doesn't reach its peak until middle age. By this point, says Small, "the neuro-circuits fire more rapidly, as if you're going from dial-up to DSL." Complex reasoning skills improve, and we're able to anticipate problems and reason things out better than when we were young.

And, Small adds, there's another area of improvement as we age: empathy — the ability to understand the emotional point of view of another. Empathy increases as we age.

'Your Brain On Google'

One of the great discoveries from recent neuroscience research is that the human brain is always changing, from moment to moment and throughout life. It continues to develop, and even continues to grow new brain cells.

"An old myth in neuroscience," says Small, "is that once a brain cell dies off you can't replace it." But many studies have now shown, he adds, that there is, in fact, brain cell growth throughout life. So, he says, the brain can continue to learn throughout the middle age years and beyond.

In a recent study that Small refers to as "your brain on Google," healthy, middle-aged volunteers, all novices on the computer, were taught how to do a Google search. They were told then to practice doing online searches for an hour a day, for seven days. After the week's practice, the volunteers came back into Small's lab and had their brains scanned while doing a Google search.

The scans revealed significant increases in brain activity in the areas that control memory and decision-making.

"The area of the brain that showed the increases was the frontal lobe, the thinking brain, especially in areas that control decision making and working memory," Small says. One interpretation of his findings, he says, is that with practice, a middle-age brain can very quickly alter its neural circuitry and strengthen the circuits that control short-term memory and decision making.

Physical Fitness Helps Brain, Too

Research by neuroscientist Art Kramer, from the University of Illinois, highlights the plasticity — the ability to grow and change — of the aging brain. In his studies on physical exercise, Kramer has found that memory can improve with treadmill workouts.

"Over a six-month to one-year period," Kramer says, "three days a week, working up to an hour a day, people improved in various aspects of both short-term and long-term memory."

After treadmill training, the "aging couch potatoes," as Kramer calls them, were given brain scans. Those who'd trained had larger hippocampi, the brain area key for memory. Other brain regions too — central for decision-making, planning and multi-tasking — were also larger in the treadmill exercisers. "There are a number of regions," says Kramer, "that on MRI scans tend to show not just stability but increases as a function of exercise in middle-age and older brains."

Such research studies underscore that both physical exercise and cognitive brain training contribute to brain health. And these two scientists not only talk the talk, they also, quite literally, walk the walk. Kramer, 56, goes to the gym four or five days a week, getting aerobic exercise on a stationary bike and strength training by lifting weights. Small, 58, does a New York Times crossword and numbers puzzle every morning, as well as a series of toning and stretching exercises and at least 20 minutes of aerobic exercise each day.

Brain images suggest Alzheimer's drug is working

LONDON (Reuters) - New imaging technology suggests an experimental drug for Alzheimer's reduces clumps of plaque in the brain by around 25 percent, lifting hopes for a medicine that disappointed in clinical tests two years ago.


Bapineuzumab -- being developed by Pfizer Inc, Irish drugmaker Elan Corp and Johnson & Johnson -- is a potential game-changer because it could be the first drug to treat the underlying cause of the degenerative brain disease.

Investor confidence in the antibody medicine, however, took a big hit in July 2008 when it failed to meet its main goal in a mid-stage trial and caused brain swelling at higher doses. The new study, which involved only 28 patients, is a modest fillip.

"It demonstrated that the drug has an effect on the pathological hallmark of Alzheimer's disease," lead researcher Juha Rinne from Finland's University of Turku told Reuters.

Rinne and colleagues used a novel imaging substance called carbon-11-labeled Pittsburgh compound B, which sticks to areas of the brain where there is a lot of beta amyloid plaque.

After 78 weeks, they found that patients given bapineuzumab had about a 25 percent reduction in plaque compared with those on placebo. The effect was similar with three different doses of the drug, they reported in the journal Lancet Neurology.

The treatment was generally well tolerated, although two patients on the highest dose had transient brain swelling. The drug's developers have since dropped the top dose from large ongoing Phase III trials.

Commenting on the results, Sam Gandy from New York's Mount Sinai School of Medicine said it was too early to say effective disease-modifying drugs were at hand, but the ability to measure plaque in living subjects was "something of a breakthrough."

Experts are divided on the root cause of Alzheimer's and hence the best way to tackle it.

Most advanced drugs, like bapineuzumab, have focused on removing clumps of amyloid plaques, which are thought to stop brain cells from functioning properly. But a rival school blames toxic tangles caused by an abnormal build-up of the protein tau.

Rinne's imaging study was funded by Elan and Wyeth, which is now part of Pfizer.

(Editing by Jon Loades-Carter)