Thursday, May 6, 2010

New Protection Device Exceeds Expectations in Carotid Stenting

Dual Upstream Balloons Keep Debris from Injuring Brain

SAN DIEGO, May 6 /PRNewswire/ -- A novel device that takes a simple approach to protecting the brain during procedures to improve blood flow through arteries in the neck appears to be safer and more effective than existing protection devices, according to a study presented today at the Society for Cardiovascular Angiography and Interventions (SCAI) 33rd Annual Scientific Sessions.
The ARMOUR study showed that when the MO.MA protection device was placed upstream of a cholesterol blockage in the carotid arteries, the one-month risk of heart attack, stroke or death was approximately 75 percent lower than would be typical with a downstream filter, the most common tool for catching bits of plaque and blood clot that can break loose during angioplasty and stenting.
"The MO.MA device protects the brain before we ever touch the lesion," said Robert M. Bersin, M.D., FSCAI, medical director of endovascular services for Seattle Cardiology and Swedish Medical Center, both in Seattle. "This is one of the best results ever observed for carotid stenting."
Angioplasty and stenting of the carotid arteries in the neck are not safe unless the brain is protected from particles that are knocked loose during the procedure. Filter devices positioned downstream of the blockage are often used to catch debris and prevent it from traveling to the brain or other organs. But pushing a filter across a tight blockage can itself dislodge small pieces of plaque and blood clot, and injure the brain.
The MO.MA device avoids this complication. It consists of two tiny balloons that are threaded into the neck arteries on a slender catheter and positioned upstream of the blockage, one in the common carotid artery and one in the external carotid artery. Inflation of the balloons temporarily stops blood flow to the internal carotid artery, where the blockage is located. After the narrowed artery is widened with an angioplasty balloon and held open with an expandable metal stent, the interventional cardiologist uses a syringe to suction out any debris that has broken loose, repeating until no more particles are visible.
To evaluate the safety and effectiveness of the MO.MA protection device, researchers at 25 medical centers in the United States and Europe recruited 225 patients who were considered too high-risk for open surgery of the neck arteries, usually because of advanced age, other serious medical conditions, or challenging neck anatomy. The procedure was successful without major adverse cardiovascular or cerebrovascular events (MACCE) such as heart attack, stroke or death in 93 percent of patients. Within 30 days of stenting, the combined MACCE rate was 2.7 percent, far lower than the 13 percent that was predicted on the basis of experience with filter protection devices. The major stroke rate was less than 1 percent.
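The headline figure can be checked with a line of arithmetic: a drop from the predicted 13 percent MACCE rate to the observed 2.7 percent is a relative reduction of roughly 79 percent, consistent with the "approximately 75 percent lower" claim above. A minimal sketch (the helper function is illustrative, not study code):

```python
# Relative risk reduction: how much lower the observed event rate is,
# expressed as a fraction of the expected rate. Illustrative helper only.
def relative_reduction(observed: float, expected: float) -> float:
    return (expected - observed) / expected

# ARMOUR's 30-day MACCE rate (2.7%) versus the 13% predicted from
# prior experience with filter protection devices.
print(f"{relative_reduction(2.7, 13.0):.0%}")  # about a 79% relative reduction
```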
In addition, patients who were experiencing stroke-like symptoms at the time of the procedure had no higher MACCE rate than the overall group, whereas a doubling of the MACCE rate is typical in symptomatic patients treated with filter protection devices, Dr. Bersin said. Finally, among patients older than 75 years, the MACCE rate was just 2.3 percent.
"The idea that carotid stenting is not safe in the elderly no longer applies," Dr. Bersin said. "This device is a game-changer."
The ARMOUR study was funded by Invatec. Dr. Bersin reports no conflicts of interest.
Dr. Bersin will present the study "Use of the INVATEC MO.MA® proximal cerebral protection device during carotid stenting (The ARMOUR Trial)" at an oral abstract session on Thursday, May 6, 10:30 a.m. to 10:42 a.m. (Pacific Time).
About SCAI
Headquartered in Washington, D.C., the Society for Cardiovascular Angiography and Interventions is a 4,000-member professional organization representing invasive and interventional cardiologists in approximately 70 nations. SCAI's mission is to promote excellence in invasive and interventional cardiovascular medicine through physician education and representation, and advancement of quality standards to enhance patient care. SCAI's annual meeting has become the leading venue for education, discussion, and debate about the latest developments in this dynamic medical specialty. SCAI's patient and physician education program, Seconds Count, offers comprehensive information about cardiovascular disease. For more information about SCAI and Seconds Count, visit www.scai.org or www.seconds-count.org.

Skull Caps and Genomes

The skull cap is thick and flat. It looks distinctively human, and yet its massive brow ridge, hanging over the eyes like a bony pair of goggles, is impossible to ignore. In 1857, an anatomist named Hermann Schaafhausen stared at the skull cap in his laboratory at the University of Bonn and tried to make sense of it. Quarry workers had found it the year before in a cave in a valley called Neander. A schoolteacher had saved the skull cap, along with a few other bones, from destruction and brought it to Schaafhausen to examine. And now Schaafhausen had to make the call. Was it human? Or was it some human-like ape?
Schaafhausen did not have much help to fall back on. At the time, archaeologists had only found faint hints that humans had coexisted with fossil animals, such as spears buried in caves near the bones of hyenas. Charles Darwin was still two years away from publishing the Origin of Species and providing a theory to make sense of human evolution. Naturalists tended to look at humanity as a collection of races arranged in a rank from savagery to civilization. The most savage races barely ranked above apes, while the naturalists themselves, of course, belonged to the race at the top of the ladder. When anatomists looked at human bodies, they found what they thought was a validation of this hierarchy: differences in the size of skulls, the slopes of brows, the width of noses. Yet all their attempts to neatly sort humanity were bedeviled by the tremendous variation in our species. Within a single so-called race, people varied in color, height, facial features–even in their brow ridges. Schaafhausen knew, for example, about a skull dug up from an ancient grave in Germany that “resembled that of a Negro,” as he later wrote.
To make sense of the “Neanderthal cranium,” as he called it, Schaafhausen tried to fit it into this confusing landscape of human variation. As peculiar as the bone was, he decided it must belong to a human. It was very much unlike the cranium of living Europeans, but Schaafhausen speculated that it belonged to an ancient forerunner. Yet for naturalists of Schaafhausen’s age, such a heavy brow ridge implied not the advanced refinement of European civilization, but wild savagery. Well, Schaafhausen thought, Europeans were pretty savage back in the day. “Even of the Germans,” Schaafhausen wrote in his report on the Neanderthal cranium, “Caesar remarks that the Roman soldiers were unable to withstand their aspect and the flashing of their eyes, and that a sudden panic seized his army.” Schaafhausen found many other passages in classical history that suggested to him a practically monstrous past for Europe. “The Irish were voracious cannibals, and considered it praiseworthy to eat the bodies of their parents,” he wrote. Even in the 1200s, ancient tribes in Scandinavia still lived in the mountains and forests, wearing animal skins, “uttering sounds more like the cries of wild beasts than human speech.”
Surely this heavy-browed Neanderthal would have fit right in.
Some 150 years later, pieces of that original Neanderthal cranium now sit in another laboratory in Leipzig, just 230 miles away from Schaafhausen’s lab. Instead of calipers, that laboratory is filled with a different set of measuring tools: ones that can read out sequences of DNA that have been hiding in Neanderthal fossils for 50,000 years or more. And today a team of scientists based at the Max Planck Institute of Evolutionary Anthropology published a rough draft of the entire Neanderthal genome.
It is an historic day, but it reminds us, once again, that the publication of a genome does not automatically answer all the questions scientists have about the organism to which the genome belongs. In fact, a careful look at the new report is a humbling experience. We gaze at the Neanderthal genome today as Schaafhausen gazed at the Neanderthal skull cap that first introduced us to these ambiguous humans.
Since Schaafhausen’s day, paleoanthropologists have discovered Neanderthals across a huge range stretching from Spain to Israel to Siberia. Their fossils range from about 400,000 years ago to about 28,000 years ago. Instead of a lone skull cap, scientists now have just about every bone in the Neanderthal skeleton. Neanderthals were stocky and strong, with a brain about the size of our own. The isotopes in their bones suggest a diet rich in meat, and their fractured bones suggest a rough time getting that food. There’s no evidence that Neanderthals could paint spectacular images of rhinos and deer on cave walls like humans did. But they still left behind many traces of very sophisticated behavior, from intricate tools to painted jewelry.
Ideas about our own kinship to Neanderthals have swung dramatically over the years. For many decades after their initial discovery, paleoanthropologists only found Neanderthal bones in Europe. Many researchers decided, like Schaafhausen, that Neanderthals were the ancestors of living Europeans. But they were also part of a much larger lineage of humans that spanned the Old World. Their peculiar features, like the heavy brow, were just a local variation. Over the past million years, the linked populations of humans in Africa, Europe, and Asia all evolved together into modern humans.
In the 1980s, a different view emerged: all living humans could trace their ancestry to a small population in Africa perhaps 150,000 years ago. Their descendants spread out across all of Africa, and then moved into Europe and Asia about 50,000 years ago. If they encountered other hominins on their way, such as the Neanderthals, they did not interbreed. Eventually, only our own species, the African-originating Homo sapiens, was left.
The evidence scientists marshalled for this “Out of Africa” view of human evolution took the form of both fossils and genes. The stocky, heavy browed Neanderthals did not evolve smoothly into slender, flat-faced Europeans, scientists argued. Instead, modern-looking Europeans just popped up about 40,000 years ago. What’s more, they argued, those modern-looking Europeans resembled older humans from Africa.
At the time, geneticists were learning how to sequence genes and compare different versions of the same genes among individuals. Some of the first genes that scientists sequenced were in the mitochondria, little blobs in our cells that generate energy. Mitochondria also carry DNA, and they have the added attraction of being passed down only from mothers to their children. The mitochondrial DNA of Europeans was much closer to that of Asians than either was to Africans. What’s more, the diversity of mitochondrial DNA among Africans was huge compared to the rest of the world. These sorts of results suggested that living humans shared a common ancestor in Africa. And the number of mutations in each branch of the human tree suggested that that common ancestor lived about 150,000 years ago, not a million years ago.
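The dating logic in that last step (more accumulated differences imply a deeper common ancestor) can be sketched as a toy molecular-clock calculation. The sequence length and mutation rate below are illustrative assumptions chosen for round numbers, not values from the studies described:

```python
# Toy molecular clock: differences between two lineages accumulate along
# BOTH branches since their common ancestor, hence the factor of 2.
# All numbers here are illustrative assumptions.
def divergence_time_years(differences: int, sites: int, rate_per_site_per_year: float) -> float:
    return differences / (2 * sites * rate_per_site_per_year)

# Hypothetical example: 50 differences across a 16,500-site mitochondrial
# genome, assuming 1e-8 substitutions per site per year.
t = divergence_time_years(differences=50, sites=16_500, rate_per_site_per_year=1e-8)
print(f"common ancestor roughly {t:,.0f} years ago")
```

With these invented inputs the estimate lands near 150,000 years; the real work depends on carefully calibrated rates and many more sites.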
Over the past 30 years, scientists have battled over which of these views–multi-regionalism versus Out of Africa–is right. And along the way, they’ve also developed more complex variations that fall in between the two extremes. Some have suggested, for example, that modern humans emerged out of Africa in a series of waves. Some have suggested that modern humans and other hominins interbred, leaving us with a mix of genetic material.
Reconstructing this history is important for many reasons, not the least of which is that scientists can use it to plot out the rise of the human mind. If Neanderthals could make their own jewelry 50,000 years ago, for example, they might well have had brains capable of recognizing themselves as both individuals and as members of a group. Humans are the only living animals with that package of cognitive skills. Perhaps that package had already evolved in the common ancestor of humans and Neanderthals. Or perhaps it evolved independently in both lineages.
In the 1990s, the German geneticist Svante Paabo led a team of scientists in search of a new kind of evidence to test these ideas: ancient DNA. They were able to extract bits of DNA from bones that were found along with Schaafhausen’s skull cap in the Neander valley cave. Despite being 42,000 years old, the fossils still retained some genetic material. But reading that DNA proved to be a colossal challenge. Over thousands of years, DNA breaks into tiny pieces, and some of the individual “letters” (or nucleotides) in the Neanderthal genes become damaged, effectively turning parts of its genome into gibberish. It’s also hard to isolate Neanderthal DNA from the far more abundant DNA of microbes that live in the fossils today. And the scientists themselves can contaminate the samples with their own DNA as well.
Over the years, Paabo and his colleagues have found ways to overcome a lot of these problems. They’ve also taken advantage of the awesome leaps that genome-sequencing technology has taken since they started the project. They have been able to reconstruct bigger and bigger stretches of DNA. They’ve been able to fish them out of a number of Neanderthal fossils from many parts of the Old World. And today they can offer us a rough picture of all the DNA it takes to be a Neanderthal.
To create a rough draft of the Neanderthal genome, the scientists gathered DNA from the fossils of three individual Neanderthals that lived in Croatia about 40,000 years ago. The scientists sequenced fragments of DNA totaling more than 4 billion nucleotides. To figure out where on which chromosome each fragment belonged, they lined up the Neanderthal DNA against the genomes of humans and chimpanzees. They are far from having a precise read on all 3 billion nucleotides in the Neanderthal genome. But they were able to zero in on many regions of the rough draft and get a much finer picture of interesting genes.
One of the big questions the scientists wanted to tackle was how those interesting genes evolved over the past six million years, since our ancestors split off from the ancestors of chimpanzees. So they compared the Neanderthal genome to the genome of chimpanzees, as well as to humans from different regions of the world, including Africa, Europe, Asia, and New Guinea.
This comparison is tricky because human DNA, like human skulls, is loaded with variations. The DNA of any two people can differ at millions of spots. Those differences may consist of as little as a single nucleotide, or a long stretch of duplicated DNA. Each of us picks up a few dozen new mutations when we’re born, but most of the variations in our genome have been circulating in our species for centuries, millennia, and, in some cases, hundreds of thousands of years. Over the course of history these variants have gotten mixed and matched in different human populations. Some of them vary from continent to continent. It’s possible to tell someone from Nigeria from someone from China based on just a couple hundred genetic markers. But a lot of the same variations that Chinese people have also exist in Nigeria. That’s because Chinese people and Nigerians descend from the same ancestral population. The gene variants first arose in that ancestral population and then were all passed down from generation to generation, even as humans migrated and diverged across the planet. And when Paabo and his colleagues looked at the Neanderthal genome, they discovered that Neanderthals carried some of the same variants in their genome too.
The scientists compared the variants in the Neanderthal genome to those in humans to figure out when the two kinds of humans diverged. They estimate that the two populations became distinct between 270,000 and 440,000 years ago. After the split, our own ancestors continued to evolve. It’s possible that genes that evolved after that split helped to make us uniquely human. To identify some of those genes, Paabo and his colleagues looked for genes that were identical in Neanderthals and chimpanzees, but had undergone a significant change in humans.
They didn’t find many. In one search, they looked for protein-coding genes. Genes give cells instructions for how to assemble amino acids into proteins. Some mutations don’t change the final recipe for a protein, while some do. Paabo and his colleagues found that just 78 human genes have evolved to make a new kind of protein, differing from the ancestral form by one or more amino acids. (We have, bear in mind, 20,000 protein-coding genes.) Only five genes have more than one altered amino acid.
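The distinction between mutations that do and don’t change a protein comes down to the genetic code: different codons can spell the same amino acid. A minimal sketch with a few hand-picked codons from the standard code (purely illustrative, not the study’s pipeline):

```python
# A tiny slice of the standard genetic code, enough to illustrate
# synonymous vs. nonsynonymous mutations. Illustrative only.
CODON_TABLE = {
    "CTT": "Leu", "CTC": "Leu",   # both codons encode leucine
    "AAA": "Lys", "GAA": "Glu",   # lysine vs. glutamate
}

def mutation_effect(old_codon: str, new_codon: str) -> str:
    """Classify a codon change by whether it alters the encoded amino acid."""
    old_aa, new_aa = CODON_TABLE[old_codon], CODON_TABLE[new_codon]
    return "synonymous" if old_aa == new_aa else "nonsynonymous"

print(mutation_effect("CTT", "CTC"))  # synonymous: the protein is unchanged
print(mutation_effect("AAA", "GAA"))  # nonsynonymous: Lys becomes Glu
```

The 78 human genes mentioned above are cases of the second kind: changes that produced an altered protein.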
The scientists also found some potentially important changes in stretches of human DNA that do not encode proteins. Some of these non-coding stretches act as switches for neighboring genes. Others encode tiny single-stranded RNA molecules, called microRNAs. MicroRNAs can act as volume knobs for other genes, boosting or squelching the proteins they make.
Another way to look for uniquely human DNA is to search for stretches of genetic material that still retain the fingerprint of natural selection. In the case of many genes, several variants of the same gene have coexisted for hundreds of thousands of years. Some variants found in living humans also turn up in the Neanderthal genome. But there are some cases in which natural selection has strongly favored humans with one variant of a gene over others. The selection has been so strong sometimes that all the other variants have vanished. Today, living humans all share one successful variant, while the Neanderthal genome contains one that no longer exists in our species. The scientists discovered 212 regions of the human genome that have experienced this so-called “selective sweep.”
You can see the full list of all these promising pieces of DNA in the paper Paabo and his colleagues published today. If you’re looking for a revelation of what it means to be human, be prepared to be disappointed by a dreary catalog of sterile names like RPTN and GREB1 and OR1K1. You may find yourself with a case of Yet Another Genome Syndrome. In all fairness, the scientists do take a crack at finding meaning in their catalog. They note that a number of evolved genes are active in skin cells. But does that mean that we evolved a new kind of skin color? A new way of sweating? A better ability to heal wounds? At this point, nobody really knows.
If you believe the difference between humans and Neanderthals is primarily in the way we think, then you may be intrigued by the strongly selected genes that have been linked to the brain. These genes got their links to the brain thanks to the mental disorders that they can help produce when they mutate. For example, one gene, called AUTS2, gets its name from its link to autism. Another strongly-selected human gene, NRG3, has been linked to schizophrenia. Unfortunately, these disease associations just tell scientists what happens when these genes go awry, not what they do in normal brains.
The most satisfying hypothesis the scientists offer is also the one with the deepest historical resonance. It has to do with the brow ridge that so puzzled Schaafhausen back in 1857. One of the strongly selected genes in humans, known as RUNX2, has been linked to a condition known as cleidocranial dysplasia. People who suffer from this condition have a bell-shaped rib cage, deformed shoulder bones, and a thick brow ridge. All three traits distinguish Neanderthals from humans.
Paabo and his colleagues then turned to the debate over what happened when humans emerged from Africa. Scientists have debated for years what happened when our ancestors encountered Neanderthals and other extinct hominin populations. Some have argued that they kept their distance and never interbred. Others have scoffed that any human could show such self-restraint. After all, humans have been known to have sex with all sorts of mammals when given the opportunity, so why should they have been so scrupulous about a very human-like mammal?
The evidence that scientists have gathered up till now has been very confusing. If you just look at mitochondria, for example, all the Neanderthal sequences form tiny twigs on a branch that’s distant from the human branch. If Neanderthals and humans had interbred often enough, then some people today might be carrying mitochondrial DNA that was more like that of Neanderthals than like other humans.
On the other hand, some scientists looking at other genes have found what they claim to be evidence of interbreeding. They found gene variants in living humans that had evolved from an ancestral gene about a million years ago. One way to explain this pattern was to propose that modern humans interbred with Neanderthals or other hominins. Some of their DNA then entered our gene pool and has survived till today. In one case, a team of scientists proposed that a gene variant called Microcephalin D hopped into our species from Neanderthals and then spread very quickly, driven perhaps by natural selection. Making this hypothesis even more intriguing was the fact that the gene is involved in building the brain.
Paabo and his colleagues looked for pieces of the Neanderthal genome scattered in the genomes of living humans. The scientists found that on average, the Neanderthal genome is a little more similar to the genomes of people in Europe, China, and New Guinea, than it is to the genomes of people from Africa. After carefully comparing the most similar segments of the genomes, the scientists propose that Neanderthals interbred with the first immigrants out of Africa–perhaps in the Middle East, where the bones of both early humans and Neanderthals have been found.
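The kind of comparison described above is commonly formalized as the ABBA-BABA test, which summarizes genome-wide site patterns in a single D-statistic: an excess of sites where a non-African genome matches the Neanderthal over sites where an African genome matches it points to gene flow. The counts below are invented purely for illustration; the real analysis draws on millions of sites:

```python
# Sketch of the ABBA-BABA (D-statistic) logic for detecting gene flow.
# "ABBA" sites: the non-African sample matches the Neanderthal allele;
# "BABA" sites: the African sample matches it. Counts here are invented.
def d_statistic(n_abba: int, n_baba: int) -> float:
    return (n_abba - n_baba) / (n_abba + n_baba)

d = d_statistic(n_abba=105_000, n_baba=95_000)
print(f"D = {d:.3f}")  # a positive D suggests Neanderthal-non-African affinity
```

Under no gene flow the two counts should balance and D should hover near zero; a consistently positive D across chromosomes is what supports the interbreeding interpretation.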
Today, the people of Europe and Asia have genomes that are 1 to 4 percent Neanderthal. That interbreeding doesn’t seem to have meant much to us, in any biological sense. None of the segments our species picked up from Neanderthals was favored by natural selection. (Microcephalin D turns out to have been nothing special.)
While working on this post, I contacted two experts who have been critical of some earlier studies on hominin interbreeding, Laurence Excoffier of the University of Bern and Nick Barton of the University of Edinburgh. Both scientists gave the Neanderthal genome paper high marks and agreed in particular that the interbreeding hypothesis is a good one. But they do think some alternative hypotheses have to be tested. For example, interbreeding is not the only way that some living humans might have ended up with Neanderthal-like pieces of DNA. Cast your mind back 500,000 years, before the populations of humans and Neanderthals had diverged. Imagine that those ancestral Africans were not trading genes freely. Instead imagine that some kind of barrier emerged to keep some gene variants in one part of Africa and other variants in another part.
Now imagine that the ancestors of Neanderthals leave Africa, and then much later the ancestors of Europeans and Asians leave Africa. It’s possible that both sets of immigrants came from the same part of Africa. They might have both taken some gene variants with them that did not exist in other parts of Africa. Today, some living Africans still lack those variants. This scenario could lead to Europeans and Asians with Neanderthal-like pieces of DNA without a single hybrid baby ever being born.
If humans and Neanderthals did indeed interbreed, Excoffier thinks there’s a huge puzzle to be solved. The new paper suggests that genes flowed from Neanderthals to humans only at some point between 50,000 and 80,000 years ago–before Europeans and Asians diverged. Yet we know that humans and Neanderthals coexisted for another 20,000 years in Europe, and probably about as long in Asia. If humans and Neanderthals interbred during that later period, Excoffier argues, the evidence should be sitting in the genomes of Europeans or Asians. The fact that the evidence is not there means that somehow humans really did find the self-restraint not to mate with Neanderthals.
Because interbreeding involves sex, it dominates the headlines about Paabo’s research. But I’m left wondering about the Neanderthals themselves. We now have a rough draft of the operating instructions for a kind of human that has been gone from the planet for 28,000 years, which had its own kind of culture, its own way of making its way through the world. Yet I found very little in the paper about what the Neanderthal genome tells us about its owners. It’s wonderful to use the Neanderthal genome as a tool for subtracting away our ancestral DNA and figuring out what makes us uniquely human. But it would also be great to know what made Neanderthals uniquely Neanderthal.

Memo to H.R: Older Brains = Smarter Brains

Baby Boomers looking to keep working well into their 60s and beyond may have just found their patron saint in New York Times science editor Barbara Strauch. In her new book, The Secret Life of the Grown-Up Brain, Strauch makes a compelling case that older brains are smarter brains.
A few weeks ago I reported on the job-hunting mistakes Baby Boomers are making in today’s tough market. In addition to avoiding those job-hunting missteps, perhaps one of the best moves for job seekers is to get this book into the hands of every hiring manager in America. Far from a liability, your aging brain can be a valuable asset.
Think Again: The Value of the Aging Boomer Brain
Strauch draws on a long-term longitudinal study that has been tracking the brain performance of study subjects as they age. Far from deteriorating, it seems that our brains, if not our knees, hit their stride in middle age:
…what the researchers found is astounding. During the span of time that constitutes the modern middle age - roughly age forty through the sixties - the people in the study did better on tests of the most important and complex cognitive skills than the same group of people had when they were in their twenties. In four out of six of the categories tested - vocabulary, verbal memory, spatial orientation, and, perhaps most heartening of all, inductive reasoning - people performed best, on average, between the ages of forty to sixty-five.
In a recent interview on NPR’s Fresh Air, Strauch expounded on the upside of aging:
We think we’re sort of the smartest in college or in graduate school, but when we do the tests we find that’s not true in many areas, including inductive reasoning…We are better than we were in our 20s. And that to me is amazing.
Amen. Interestingly, it’s not as if employers are clueless about the talents of the well-seasoned brain. In a 2006 study of employer attitudes toward older workers, fewer than 10 percent viewed seasoned white-collar workers as “less productive.”


Clearly skill isn’t the issue. Salary is another matter. There’s no getting around the fact that older workers tend to be more expensive (salary and benefits). And there’s no question that many older workers may need to readjust their salary expectations as they change jobs later in their careers. But as Strauch’s book makes clear, at the right salary, older workers can provide valuable brain power to an organization.
Mind Games
While the middle-aged brain can continue to operate at a high skill level when it comes to reasoning and problem-solving, some brain skills do in fact deteriorate with age. Just ask anyone eligible for AARP membership about their ability to recall names quickly. The good news is that the brain skills that don’t age well seem to be less vital than the cognitive skills that actually improve with age.
As for keeping your neurons firing at a high level, Strauch says the research points to two forms of exercise. Turns out good old physical exertion that gets the blood flowing can help keep your brain toned. You also want to give your brain plenty of mental exercise. As I mentioned in an earlier post, pushing your brain out of its comfort zone - exploring new ideas, trying new hobbies, studying a new language - helps an older brain stay smart.

Deep Brain Stimulation Beneficial in Parkinson's

THURSDAY, May 6 (HealthDay News) -- Patients with advanced Parkinson's disease who undergo deep brain stimulation (DBS) in addition to the best medical therapy report better quality of life than patients who receive only best medical therapy, though they are at increased risk of serious adverse events, according to a study published online April 29 in The Lancet Neurology.
Adrian Williams, M.D., of Queen Elizabeth Hospital in Birmingham, U.K., and colleagues randomized 366 Parkinson's disease patients to receive surgery and best medical therapy (which involved standard Parkinson's disease drug treatments, including apomorphine when appropriate) or best medical therapy only. One year after treatment, the patients assessed changes to their quality of life on the Parkinson's disease questionnaire (PDQ), and functioning and cognitive status on the unified Parkinson's disease rating scale.
The researchers found that DBS surgery plus medical treatment improved quality of life more than medical treatment alone on the 100-point PDQ scale (a 5.0-point improvement for surgery compared to a 0.3-point improvement for medical treatment only). Also, patients in the surgery group had significant improvements in mobility, daily living activities, and bodily discomfort, as well as reduced dyskinesia and a shorter duration of uncontrolled motor symptoms. However, 19 percent of surgery patients had serious surgery-related adverse events, and one of these patients died. In addition, 20 patients in the surgery group had serious adverse events related to the disease and drug treatment, compared to 13 patients in the medical treatment group.
"At one year, surgery and best medical therapy improved patient self-reported quality of life more than best medical therapy alone in patients with advanced Parkinson's disease. These differences are clinically meaningful, but surgery is not without risk and targeting of patients most likely to benefit might be warranted," the authors write.
Two study authors reported receiving travel grants or professional fees from Medtronic.

Endometrial stem cells restore brain dopamine levels

Mouse study may lead to new therapies for Parkinson's disease

Endometrial stem cells injected into the brains of mice with a laboratory-induced form of Parkinson's disease appeared to take over the functioning of brain cells eradicated by the disease.
The finding raises the possibility that women with Parkinson's disease could serve as their own stem cell donors. Similarly, because endometrial stem cells are readily available and easy to collect, banks of endometrial stem cells could be stored for men and women with Parkinson's disease.
"These early results are encouraging," said Alan E. Guttmacher, M.D., acting director of the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), the NIH Institute that funded the study. "Endometrial stem cells are widely available, easy to access and appear to take on the characteristics of nervous system tissue readily."
Parkinson's disease results from a loss of brain cells that produce the chemical messenger dopamine, which aids the transmission of brain signals that coordinate movement.
This is the first time that researchers have successfully transplanted stem cells derived from the endometrium, or the lining of the uterus, into another kind of tissue (the brain) and shown that these cells can develop into cells with the properties of that tissue.
The findings appear online in the Journal of Cellular and Molecular Medicine.
The study's authors were Erin F. Wolff, Xiao-Bing Gao, Katherine V. Yao, Zane B. Andrews, Hongling Du, John D. Elsworth and Hugh S. Taylor, all of Yale University School of Medicine.
Stem cells retain the capacity to develop into a range of cell types with specific functions. They have been derived from umbilical cord blood, bone marrow, embryonic tissue, and from other tissues with an inherent capacity to develop into specialized cells. Because of their ability to divide into new cells and to develop into a variety of cell types, stem cells are considered promising for the treatment of many diseases in which the body's own cells are damaged or depleted.
In the current study, the researchers generated stem cells using endometrial tissue obtained from nine women who did not have Parkinson's disease and verified that, in laboratory cultures, the unspecialized endometrial stem cells could be transformed into dopamine-producing nerve cells like those in the brain.
The researchers also demonstrated that, when injected directly into the brains of mice with a Parkinson's-like condition, endometrial stem cells would develop into dopamine-producing cells.
Unspecialized stem cells from the endometrial tissue were injected into mouse striatum, a structure deep in the brain that plays a vital role in coordinating balance and movement. When the researchers examined the animals' striata five weeks later, they found that the stem cells had populated the striatum and an adjacent brain region, the substantia nigra. The substantia nigra produces abnormally low levels of dopamine in human Parkinson's disease and the mouse version of the disorder. The researchers confirmed that the stem cells that had migrated to the substantia nigra became dopamine-producing nerve cells and that the animals' dopamine levels were partially restored.
The study did not examine the longer-term effects of the stem cell transplants or evaluate any changes in the ability of the mice to move. The researchers noted that additional research would need to be conducted to evaluate the safety and efficacy of the technique before it could be approved for human use.
According to the researchers, stem cells derived from endometrial tissue appear to be less likely to be rejected than stem cells from other sources. As expected, the stem cells generated dopamine-producing cells when transplanted into the brains of mice with compromised immune systems. However, the transplants also successfully gave rise to dopamine-producing cells in the brains of mice with normal immune systems.
According to Dr. Taylor, because women could provide their own donor tissue, there would be no concern that their bodies would reject the implants. Moreover, because endometrial tissue is widely available, banks of stem cells could be established. The stem cells could be matched by tissue type to male recipients with Parkinson's to minimize the chances of rejection.
Dr. Taylor added that endometrial stem cells might prove easier to obtain and easier to use than many other types of stem cells. Women generate new endometrial tissue with each menstrual cycle, so the stem cells are readily available, and even after menopause, women taking estrogen supplements are capable of generating new endometrial tissue. Because doctors can gather samples of the endometrial lining in a simple office procedure, it is also easier to collect than other types of adult stem cells, such as those from bone marrow, which must be collected surgically.
"Endometrial tissue is probably the most readily available, safest, most easily attainable source of stem cells that is currently available. We hope the cells we derived are the first of many types that will be used to treat a variety of diseases," said senior author Hugh S. Taylor, M.D., of Yale University. "I think this is just the tip of the iceberg for what we will be able to do with these cells."
###
The NICHD sponsors research on development, before and after birth; maternal, child, and family health; reproductive biology and population issues; and medical rehabilitation. For more information, visit the Institute's Web site at http://www.nichd.nih.gov/.
The National Institutes of Health (NIH) — The Nation's Medical Research Agency — includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. It is the primary federal agency for conducting and supporting basic, clinical, and translational medical research, and it investigates the causes, treatments, and cures for both common and rare diseases. For more information about NIH and its programs, visit http://www.nih.gov.


Tongue Piercings Could Lead to Severe Brain Infections

She was young and wild, a bit of a rebel. The 22-year-old woman's body art announced that message as subtly as a billboard on a highway. No stranger to illegal drugs, she had inhaled or injected the worst of them, including cocaine and heroin. But recently she had turned a corner. Last month she removed her new tongue piercing only two days after getting it, and she hadn't injected any drugs for the last five months. Ironic that after cleaning up her act she should now find herself stricken with a throbbing headache that was unfazed by aspirin. Severe nausea, vomiting and vertigo drove her to the hospital.
The results of an HIV test were negative. A neurological exam found mild ataxia (incoordination) in her left leg, which she had already noticed. This alerted her doctors to a potential problem in the right side of her brain, in the region of motor coordination called the cerebellum. A CT scan and an MRI delivered the diagnosis with alarming clarity: the woman was suffering from a festering brain abscess in her cerebellum. Emergency brain surgery was scheduled to remove the diseased brain tissue and drain the infection, and treatment with strong antibiotics was begun immediately. Laboratory analysis revealed that the infected brain tissue was a harrowing cesspool of infectious bacteria, including Streptococcus, Peptostreptococcus, Actinomyces and Eikenella. If she hadn't injected any drugs in five months, how did these flesh-eating bugs get into her brain?
On the other side of the world from the New Haven, Connecticut hospital where the woman was treated, a 22-year-old Israeli man, who in outward appearance might have made a compatible match to the rebellious young woman, was suddenly stricken with a high fever and profound fatigue. The young man had always been blessed with youthful vigor and unlike the woman, he never abused illegal drugs. Without warning the man rapidly developed global aphasia; that is, the inability to speak or write or understand written or spoken language. His neurologists knew with certainty that his left cerebral cortex was impaired. This part of the brain is where Broca's and Wernicke's areas are located, which control speech and language comprehension. Soon the right side of his body became paralyzed.
A CT and MRI scan revealed 13 horrifying ring-shaped bleeding abscesses the size of ping-pong balls in the man's brain. Lab tests showed that he was negative for HIV and cystic fibrosis, but his blood count confirmed that his body was fighting a life-threatening infection. There were too many abscesses to remove surgically. A brain biopsy revealed the pus-filled brain tissue was swarming with the nasty bacteria Streptococcus intermedius.

This bacterium is part of the normal flora of the mouth and upper respiratory tract, but when the germ gets inside the body it forms life-threatening abscesses in the liver, brain or inner lining of the heart. Streptococcus intermedius infections of the brain are usually the result of head trauma or complications after brain surgery, but this man had been perfectly healthy until just weeks earlier. What had happened to him?
A medical history revealed that nothing remarkable had occurred to the man recently, except that he had received a tongue piercing two weeks before. A common thread tied the young man and woman's fate together in the medical literature--a link through lingual baubles to brain infection.
The woman would survive, but after suffering three more weeks in the hospital, the young man would lose his life to the germs that entered through the piercing in his tongue. They invaded without causing any local infection of the tongue or producing a fever, and they silently worked their way into his brain where they turned it to pus.
Thirty-six percent of college-age males and 62 percent of college-age females have body piercings (not including earlobe piercings in women). In women, 10 percent of these piercings are in the nose and 11 percent are through the tongue; for college-age men, the rates are 1.2 percent and 4 percent, respectively. Seventy percent of people with tongue piercings report complications, ranging from local infections, eroded gums, and chipped teeth to more serious systemic infections, including hepatitis B and C.
When you think about it, this really should come as no surprise. The mouth and nasal passages are a veritable incubator of nasty disease-causing bacteria. We all suffer sore throats, respiratory and nasal infections as a result, some of them quite serious. The surprising thing is how resistant our vulnerable tongue is to infection--unless you poke a hole through it. The tongue is shielded with a thick, tough outer layer of skin and it is bathed continually with saliva containing antimicrobial proteins. In contrast to earlobes, the tongue is richly supplied with blood, which provides an invading germ ready access to the blood stream, to spread infection throughout the body. The veins that drain the tongue connect directly to the internal jugular vein, which is a direct route into the brain. Earlobes are cleaned with surgical antiseptic before piercing them, but the tongue is not prepped before stabbing a hole through it. Mouthwash usually precedes the needle, but that is more for the benefit of the person doing the piercing.
Disease attacking the brain is perhaps the most dreaded of all disorders for most people. You can't avoid most of them, but some of them you can.
---
From a study by Herskovitz, et al., published in the October 2009 issue of the journal Archives of Neurology, and Martinello and Cooney, published in the January 2003 issue of the journal Clinical Infectious Diseases.

Treatment-Resistant Depression Responded To Magnetic Brain Stimulation In Trial

A treatment that uses magnetic currents to stimulate parts of the brain appeared to induce remission in patients with treatment-resistant depression, concluded researchers who tested the method in a randomized trial.

You can read a report on the National Institute of Mental Health funded trial online in the May issue of Archives of General Psychiatry. The research was the work of first author Dr Mark S. George, from the Brain Stimulation Division, Department of Psychiatry, at the Medical University of South Carolina, Charleston, and colleagues from this and other research centers in the US.

For some patients with depression, a disabling disease that is costly to treat, psychotherapy and drugs don't work, and some researchers have suggested that daily repetitive transcranial magnetic stimulation (rTMS) of the left prefrontal brain may be an effective alternative.

However, while rTMS has been studied as a potential treatment for depression, the quality of previous research is questionable, said the authors, explaining that one of the problems is how to mask the "sham" condition.

When designing a trial to test a device as opposed to a drug, the equivalent of controlling for the placebo effect is to mask the "sham" condition, i.e., hiding from participants whether they are receiving "real" exposure or a "sham".

To mask the "sham" condition in this study, George and colleagues blanked off the magnetic field with a metal plate inserted in the rTMS device. Thus some participants were exposed to a "real" magnetic field (the treatment group), and others were exposed to a "sham" magnetic field (the control group), but none of them knew which group they were in.

They recruited 190 patients with depression who were not on medication, attending 4 US university hospital clinics, and randomly assigned 92 to the treatment group and 98 to the control or "sham" group.

The treatment group received magnetic stimulation of the left prefrontal cortex for 37.5 minutes every day for three weeks.

The control group received a sham treatment that gave them the same sensory feeling of being stimulated using a similar coil and electrodes attached to the scalp but no exposure to a magnetic field: the plate was in place inside the device.

Eighty-six per cent of the treatment group and 90 per cent of the sham group completed the treatment. Among these, the results showed that:
  • Depression went into remission in 14.1 per cent of the treatment group.
  • This compared with 5.1 per cent in the sham group.
  • The odds of achieving remission were 4.2 times greater in the treatment group.
  • In other words, for every 12 patients receiving this treatment, 1 would remit from depression.
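The "1 in 12" figure above follows directly from the two remission rates; a quick arithmetic sketch (not from the paper, just a check on the reported percentages):

```python
import math

# Remission rates reported in the trial
p_treatment = 0.141  # active rTMS group
p_sham = 0.051       # sham group

# Absolute risk reduction: extra remissions attributable to treatment
arr = p_treatment - p_sham  # 0.090, i.e. 9 extra remissions per 100 patients

# Number needed to treat: patients treated per one extra remission,
# rounded up to a whole patient
nnt = math.ceil(1 / arr)

print(f"ARR = {arr:.3f}, NNT = {nnt}")  # prints: ARR = 0.090, NNT = 12
```

An absolute risk reduction of 9 percentage points gives 1/0.09 ≈ 11.1, which rounds up to the 12 patients quoted in the study. (The 4.2 odds figure is the trial's own adjusted estimate, not recoverable from these two percentages alone.)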
The researchers concluded that:

"Daily left prefrontal rTMS as monotherapy produced statistically significant and clinically meaningful antidepressant therapeutic effects greater than sham."

They noted that "patients, treaters, and raters were effectively masked", and in fact this was one of the most important aspects of the study:

"... no one who knew the randomization status of the patient ever came in contact with the patient or interacted with the data," they wrote, explaining that they developed a new sham system that simulated the stimulation experience so it felt the same as the real thing.

At the end of the treatment phase, patients, treaters and clinical raters (who assessed the patients' depression) were asked to guess which group they were in: the treatment or the sham group. Only the treaters guessed at a rate that was more accurate than chance, but they were not very confident of their responses, said the authors.

"Daily Left Prefrontal Transcranial Magnetic Stimulation Therapy for Major Depressive Disorder: A Sham-Controlled Randomized Trial."
Mark S. George; Sarah H. Lisanby; David Avery; William M. McDonald; Valerie Durkalski; Martina Pavlicova; Berry Anderson; Ziad Nahas; Peter Bulow; Paul Zarkowski; Paul E. Holtzheimer III; Theresa Schwartz; Harold A. Sackeim.
Arch Gen Psychiatry, Vol. 67, No. 5, May 2010.

Exercise 'rebuilds parts of brain'

Physical exercise may help to rebuild parts of the brain that are lost with age, a study has suggested.
Epileptic seizures might also trigger brain cell regeneration, according to animal studies.
Scientists believe the discovery may lead to new ways of tackling age-related memory loss or the effects of brain injuries or Alzheimer's disease.
It used to be thought that from birth onwards brain cells died off but were not replaced. Now it is known that at least some nerve cells can be replenished in the hippocampus, the brain region that plays a key role in learning and memory.
However, a large proportion of the stem cells that give rise to new neurons remain dormant in adults. The new research in mice shows how these cells can be "kick-started" into action by physical activity or epileptic seizures.
Scientists in Germany found that physically active mice developed more newborn hippocampal neurons than inactive animals.
"Running promotes the formation of new neurons," said study leader Dr Verdon Taylor, from the Max Planck Institute of Immunobiology in Freiburg.
Abnormal brain activity, as occurs during epileptic seizures, also appeared to trigger neuron generation.
Excessive formation of new nerve cells is thought to play a role in epilepsy, said Dr Taylor, whose research appears in the journal Cell Stem Cell.

Doctor tests new drugs on brain cancer patients

Researcher receives $1M in grants
Dr. Peter Forsyth (centre) and grad student David Liu stand with patient Lisa Sangregorio in Forsyth's lab in the University of Calgary medical centre on Tuesday, May 5, 2010. Forsyth has received special funding from the Canadian Cancer Society to help continue his groundbreaking brain cancer research. Sangregorio is a mom of two who is still fighting a brain tumour, but is still alive today thanks to Dr. Forsyth.
Lisa Sangregorio -- a mother of two young girls, still breast-feeding the youngest at 13 months of age -- awoke one morning to a terrible headache. She was in so much pain she could hardly walk, and she was seeing double.
Knowing she had to take care of her kids, she shook it off, until the same condition, only slightly worse, happened again three days later.
A trip to emergency ended in the worst news possible: a malignant brain tumour was growing above her left eye, surrounded by a large cyst.
"I was so shocked, so confused. I was completely healthy, I was active, a marathon runner, busy taking care of my two girls," says Sangregorio.
"For our family, it was beyond traumatic, devastating, hard on my husband and the girls."
She survived two difficult surgeries that were able to remove up to 60 per cent of her tumour. She also endured radiation and two rounds of chemotherapy, the last of which forced her to a bloated weight of nearly 245 pounds.
But for more than two years, her tumour has stopped growing, and remains manageable.
The 37-year-old stay-at-home mom applauded the Canadian Cancer Society on Wednesday for granting her oncologist, Dr. Peter Forsyth, $300,000 for a study testing a new class of drugs on patients with brain tumours.
"He is such an amazing doctor -- his persistence, his patience amazes me. And he fights everyday to save the lives of people with brain tumours."
Forsyth, well-known for his progressive research, says brain cancer continues to be one of the most devastating cancers.
Close to 2,600 Canadians will be diagnosed with brain cancer this year. The most common form, malignant glioma, cuts life expectancy to about one year.
"Brain cancer is really nasty. Not only do we want to be able to treat it better, but we want to move to a place where we can match the type of treatment to a particular tumour."
Forsyth is also hoping to discover how to identify whether a specific patient will respond to a specific type of cancer drug by looking at proteins and genetic makeup.
"Every patient is different, and responds differently to certain drugs.
"We want to be able to make sure we are giving a patient a drug that will work, so they're not wasting their time," Forsyth says.
"And when we know a certain drug won't work, we can also tell them, 'You only have a year, you may as well go to Hawaii.' "
Sangregorio was one of the lucky ones who discovered her second chemotherapy drug inhibited the growth of her tumour.
But that was only after she lived through six months of radiation that had no impact, and nine months of an initial chemotherapy drug that inhibited some growth but allowed the tumour to spread elsewhere in her brain.
"It's almost like pouring chocolate syrup into milk, stirring it, and then trying to find where the chocolate milk started," Forsyth said.
"It's very difficult, but we have to keep trying."
Forsyth says the new class of drugs may one day benefit patients such as Sangregorio, but he adds, knocking on a wooden counter, that she will hopefully continue to stay in remission and never have to find out.
She can't drive for fear of seizures, but is able to walk her girls to and from school, and enjoy a somewhat more normal family life.
Seven years after her original diagnosis, she's still in remission, and happy to tell her story with a smile and an inspiring sense of humour.
Forsyth is also receiving a $705,000 Canadian Cancer Society grant to test the ability of a common virus to destroy brain cancer cells.
The Canadian Cancer Society, the largest national charitable funder of cancer research, contributed nearly $50 million to research projects across the country last year.
Today, more than 62 per cent of Canadians will survive a cancer diagnosis -- nearly double the survival rate of the 1960s.

Brain Develops Differently in Fragile X Syndrome

MRI scans reveal anatomical alterations in children with genetic disorder

THURSDAY, May 6 (HealthDay News) -- Brain development in very young boys with fragile X syndrome differs from that in boys without the genetic disorder, a new study has found.
Fragile X syndrome, which is triggered by a mutation in a gene on the X chromosome, is the leading cause of inherited intellectual disability and autism. The syndrome affects about one in every 4,000 people, and males with the disorder experience more significant symptoms than females.
U.S. researchers used high-resolution MRI to track long-term changes that differentiated the brain anatomy of 41 boys with fragile X syndrome from that of a control group of 21 healthy boys and seven other children with developmental delays not caused by fragile X syndrome.
Detailed images of the children's brains were first taken when they were 1 to 3 years old. Follow-up images were taken up to two years later. The first sets of images revealed that the children with fragile X syndrome had an overabundance of gray matter in some brain regions (caudate and thalamus) and a reduced amount of gray matter in a part of the cerebellum called the vermis.
The findings suggest that the genetic mutation had already started to cause identifiable, consistent alterations in brain development, perhaps even before birth, the study noted.
The researchers also found that other areas of the brain, such as the basal forebrain and many regions of the cerebral cortex, were the same in children with fragile X syndrome and those in the control group at the first imaging session. However, differences were seen two years later, which suggests that certain effects of the X chromosome mutation become evident only later in brain development.
The study findings were published online May 3 in the Proceedings of the National Academy of Sciences.
"A number of years ago, we saw new treatments [for fragile X syndrome] quickly coming down the line," Dr. Allan Reiss, a professor of psychiatry and behavioral sciences and radiology at the Stanford University School of Medicine and the study's senior author, said in a university news release. "We wanted to provide information that could be used to guide those treatments."
Knowing where and when fragile X syndrome affects brain development can help researchers monitor the effects of new treatments, Reiss explained.

Dark chocolate 'can reduce risk of brain damage after stroke'

Dark chocolate can reduce the risk of brain damage following a stroke, a new study has found.
Researchers discovered that a compound called epicatechin, commonly found in dark chocolate, protects the brain against stroke by shielding nerve cells. 
A team of researchers based their findings on tests in mice and it is hoped the effects can soon be replicated in humans.
The researchers gave the mice a dose of epicatechin and then induced a stroke in the animals by cutting off the blood supply to their brains.
They found that the animals that had taken the epicatechin had significantly less brain damage than the ones that had not been given the compound.
And researchers found epicatechin was a better treatment for stroke than current methods.

THE WONDERS OF DARK CHOCOLATE

It may be bad for the waistline but most Britons can't resist the lure of a bar of chocolate.
The people of Britain are Europe's top chocoholics, munching their way through 605,000 tonnes of it a year - around a stone and a half each - and a quarter of the Continent's entire ration of chocolate.
While most will have been aware that their favourite sweet treat is packed with fat and sugar, many will have consoled themselves with the plethora of health benefits linked to chocolate.
One of the key attractions for many is that chocolate simply makes us feel good - stimulating the release of chemicals more normally associated with sex and exercise.
Researchers have even gone as far as to claim that the smell of chocolate alone can protect against colds.
Another ingredient of chocolate, theobromine, has proved to be better at suppressing a tickly throat than the medication used in cough mixtures.
A flavanol called epicatechin appears to be able to stave off illnesses from heart disease to cancer.
British research has shown that epicatechin also boosts blood flow to the brain - a property which could cut the risk of dementia, as well as staving off fatigue.
Flavanols - which are most abundant in dark chocolate - can also help keep diabetes and high blood pressure under control.
Studies have also shown that snacking on 20g of dark chocolate - roughly half a small bar - morning and evening - helps keep stress at bay.
But, sadly for chocolate lovers, the treat's high fat and sugar content means dieticians recommend it is eaten as part of a balanced diet, rich in less appealing foods such as brown rice, pulses, and fruit and vegetables.
Researchers from America's Johns Hopkins University say the findings could be important in the possible treatment of strokes.
Associate Professor Sylvain Dore said: 'Animals that had preventively ingested the epicatechin suffered significantly less brain damage than the ones that had not been given the compound.
'While most treatments against stroke in humans have to be given within a two- to three-hour time window to be effective, epicatechin appeared to limit further neuronal damage when given to mice 3.5 hours after a stroke. 
'Given six hours after a stroke, however, the compound offered no protection to brain cells.'
Prof Dore said the finding could be a step forward in our understanding of strokes.
'I hope this research into these pathways could lead to insights into limiting acute stroke damage and possibly protecting against chronic neurological degenerative conditions, such as Alzheimer's disease and other age-related cognitive disorders.
'The amount of dark chocolate people would need to consume to benefit from its protective effects remains unclear, because we have not studied it in clinical trials.  
'People shouldn't take this research as a free pass to go out and consume large amounts of chocolate, which is high in calories and fat.
'In fact, people should be reminded to eat a healthy diet with a variety of fruits and vegetables.'
The study has been published in the Journal of Cerebral Blood Flow and Metabolism.
Prof Dore said scientists have been intrigued by the potential health benefits of epicatechin by studying the Kuna Indians, a remote population living on islands off the coast of Panama.  
He added: 'The islands' residents had a low incidence of cardiovascular disease. Scientists who studied them found nothing striking in their genes, and realized that when residents moved away from the islands, they were no longer protected from heart problems.
'Researchers soon discovered the residents of Kuna regularly drank a very bitter cocoa drink, with a consistency like molasses, instead of coffee or soda. The drink was high in the compound epicatechin.'
But Prof Dore said the amount of epicatechin needed could end up being quite small because the suspected beneficial mechanism is indirect.
He explained: 'Epicatechin itself may not be shielding brain cells from free radical damage directly, but instead, epicatechin, and its metabolites, may be prompting the cells to defend themselves.'