Monday, May 29, 2017

Kids' vitamin gummies—unhealthy, poorly regulated and exploitative

Sugary gummy vitamins are no substitute for a healthy diet.

There are many brands of kids' "gummies" on the market. They are promoted as deliciously flavoured and a great way for growing bodies (and fussy eaters) to get the nutrients they need.

The "active" ingredients are usually listed as vitamins, minerals and sometimes omega-3 fats and vegetable powders. They may say "contains sugars" or they may not. Rarely, some list an amount of sugar and other ingredients such as food acids like citric acid, lactic acid and ascorbic acid.

In our opinion, these products are unhealthy and exploitative. Their high sugar content may appeal to young children, but they're not a good introduction to a healthy diet.

The problem of tooth decay

Dental caries are a significant Australian public health problem. In 2014-15, A$9.5 billion was spent on dental services in Australia, up from $6.1 billion in 2007-08. In Australia, around 50% of children start primary school with largely untreated cavities. In Victoria, 7.1% of children aged under 12 have had a general anaesthetic for dental treatment.

Sugars provide food for the bacteria that dissolve tooth enamel. As sugar consumption increases, so do cavities. This damage is irreparable and individuals are left with life-long problems that require fillings, and possibly root canal work or extractions. In addition, food acid (especially citric acid) causes dental erosion that can lead to the progressive loss of the surface of the tooth. This may require complex and lengthy treatment involving fillings, veneers and crowns. The sticky consistency of "gummies" adds to the problem.

The World Health Organisation (WHO) says higher rates of dental caries occur when the intake of free sugars (added sugar plus honey, syrups and sugars in fruit juices) exceeds 10% of total energy intake. This holds even with fluoride in drinking water and in toothpaste.

Dental caries rates decline progressively as sugar intake is reduced to less than 5% of total energy intake. Hence, for a range of health reasons, the WHO recommends we get no more than 5 to 10% of our daily energy from free sugars.

So, two- to three-year-olds with a daily energy intake of 4,300 to 5,450 kilojoules (kJ) shouldn't consume more than a maximum 430 to 545 kJ, or about six to eight teaspoons (25-32g) of free sugar a day, and preferably half that amount. And four- to eight-year-olds, with a daily energy intake of 5,700 to 7,100 kJ, shouldn't consume more than 570 to 710 kJ, or about eight to ten teaspoons (33-42g) a day, and again, preferably half that.
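The teaspoon figures above follow directly from the energy numbers. Here is a minimal sketch of the conversion, assuming roughly 17 kJ per gram of sugar and 4 g per level teaspoon (standard approximations, not figures taken from the WHO guideline itself):

```python
# Convert a child's daily energy intake into a free-sugar limit,
# using the WHO guideline of at most 10% of energy from free sugars.
KJ_PER_GRAM_SUGAR = 17   # approximate energy density of sugar
GRAMS_PER_TEASPOON = 4   # approximate weight of a level teaspoon

def sugar_limit(daily_kj, fraction=0.10):
    """Return the free-sugar limit as (kJ, grams, teaspoons)."""
    limit_kj = daily_kj * fraction
    grams = limit_kj / KJ_PER_GRAM_SUGAR
    teaspoons = grams / GRAMS_PER_TEASPOON
    return limit_kj, grams, teaspoons

# Two- to three-year-olds consume roughly 4,300-5,450 kJ per day.
for kj in (4300, 5450):
    _, g, tsp = sugar_limit(kj)
    print(f"{kj} kJ/day -> max {g:.0f} g ({tsp:.0f} tsp) of free sugar")
```

Running this reproduces the article's 25-32 g (six to eight teaspoon) range for two- to three-year-olds; the same function with 5,700-7,100 kJ gives the four- to eight-year-old range.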

Contrary to this advice, 50% of Australian children aged two to three, and 67% of four- to eight-year-olds, consumed more than 10% of their total energy from free sugars in 2011-12. The top 10% of two- to three-year-old boys consumed 18 teaspoons (70g), rising to 23 teaspoons (90g) in the top 10% of four- to eight-year-olds.


Knowing how much sugar is in what we eat

Part of the problem is there is currently no clear way of knowing how much sugar has been added to a product (including gummies) by looking at the ingredients listed on the label. Choice (the Australian Consumers' Association) is campaigning for food and health ministers to act on added sugar labelling so consumers can limit their consumption, as advised by the WHO and other authorities.

"Gummies" also exemplify the problem of regulating products at the food-medicine interface. Some of these products, such as the Kids Smart Vita Gummies above, are listed with the Therapeutic Goods Administration (TGA) as complementary medicines.

For complementary medicines, there is a requirement to declare the presence, but not the quantity, of sugars on the label.

For no apparent reason, other "gummies" such as Bioglan Omega 3 Fish Oil Kids Gummies have not been listed with the TGA and may be classified as foods by their sponsor.

For food, there is a requirement by Food Standards Australia New Zealand (FSANZ) to disclose the total content of sugars on the nutrition information panel on the product label.

The Bioglan website states each bottle of 60 gummies contains 168g of product; an average serving is two gummies (5.6g), which the formulation states contains 3g of sugar (54% by weight). The website also stated there was 3mg of sugar per 100g of product, which is clearly a mislabelling: 100g of product must contain about 54g of sugar, not 3mg.
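The mislabelling is easy to verify from the numbers quoted on the label (a two-gummy serving weighing 5.6g and containing 3g of sugar):

```python
# Verify the sugar content implied by the label figures:
# a two-gummy serving weighs 5.6 g and contains 3 g of sugar.
serving_weight_g = 5.6
sugar_per_serving_g = 3.0

sugar_percent = sugar_per_serving_g / serving_weight_g * 100
print(round(sugar_percent, 1))   # 53.6 -> roughly 54% sugar by weight

# So 100 g of product must contain about 54 g of sugar, not 3 mg.
sugar_per_100g = 100 * sugar_per_serving_g / serving_weight_g
print(round(sugar_per_100g))     # 54
```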

Using the TGA Food-Medicine Interface Guidance Tool, we determined this product was a food, so we sent a complaint about mislabelling to the NSW Food Authority. However, they advised us to send the complaint to the TGA. The TGA response ignored our concern about mislabelling. We also asked why there were different sugar labelling requirements for foods compared to medicine. The TGA stated the warning statement, "contains sugar", serves as an advisory without unnecessarily deterring general consumers from taking a medicine they may need.

It is our view "gummies" that contain food acids, and have a high sugar content, are not medicines consumers need, and their sale should be prohibited on public health grounds. At the very least, the amount of sugar (and the presence of food acids) should be disclosed.

Health benefits dubious

In addition to the high and damaging sugar content, we argue these are exploitative products that mislead consumers about the benefit of dietary supplements.

Both the website and the label of Kids Smart Vita Gummies Multivitamin for Fussy Eaters say the zinc content will boost the appetite of a "fussy eater". Zinc is readily available in foods such as meat, fish and poultry while cereals, grains and dairy foods also contribute substantial amounts. We are unaware of any evidence that zinc boosts the appetite of "fussy eaters".

Kids Smart omega-3 supplements claim "to help support brain function, growth and development". The US Food and Drug Administration recommends eating oily fish two to three times a week. It does not recommend taking omega-3 supplements, reflecting findings that randomised controlled trials of fish oil supplementation have generally been disappointing, and that fish contain many more nutrients than omega-3 supplements.

Gummy vitamins are unhealthy and exploitative products that mislead parents about the benefits of dietary supplements. The TGA and FSANZ should urgently review the regulation of these products.

Neurons can learn temporal patterns

Individual neurons can learn not only single responses to a particular signal, but also a series of reactions at precisely timed intervals. This is what emerges from a study at Lund University in Sweden.

"It is like striking a piano key with a finger not just once, but as a programmed series of several keystrokes," says neurophysiology researcher Germund Hesslow.

The work constitutes basic research, but has a bearing on the development of neural networks and artificial intelligence as well as research on learning. Autism, ADHD and language disorders in children, for example, may be associated with disruptions in these and other basic learning mechanisms.

Learning is commonly thought to be based on strengthening or weakening of the contacts between the brain's neurons. The Lund researchers have previously shown that a cell can also learn a timed association, so that it sends a signal with a certain learned delay. Now, it seems that a neuron can be trained not only to give a single response, but a whole complex series of several responses.

The brain's learning capacity is greater than previously thought

"This means that the brain's capacity for learning is even greater than previously thought!" says Germund Hesslow's colleague Dan-Anders Jirenhed. He thinks that, in the future, artificial neural networks with "trained neurons" could be capable of managing more complex tasks in a more efficient way.

The Lund researchers' study focuses on the neurons' capacity for associative learning and temporal learning. In the experiments, the cells learned during several hours of training to associate two different signals. If the delay between the signals was a quarter of a second, the cells learned to respond after a quarter of a second. If the interval was half a second, the cells responded after half a second, and so on.

The researchers now show that the cells can learn not only one, but several reactions in a series. "Signal – brief pause – signal – long pause – signal" gives rise to a series of responses with exactly the same time intervals: "response – brief pause – response – long pause – response".

The cells studied by the researchers are called Purkinje cells and are located in the cerebellum. The cerebellum is the part of the brain that controls bodily position, balance and movement. It also plays an important role in learning long series of complicated movements which require precise timing, such as the movements of the hands and fingers when playing the piano.

Lyme Isn’t the Only Disease Ticks Are Spreading This Summer

IT STARTED WITH vomiting and a fever. But a few days later, five-month-old Liam was in the emergency room, his tiny body gripped by hourly waves of seizures. X-rays and MRIs showed deep swelling in his brain. When an infectious disease specialist at Connecticut Children’s Medical Center diagnosed Liam with Powassan virus in November, he became the first recorded case in state history. Doctors think Liam picked up the rare neurological disease from a tick his father brought back after a deer hunting trip.

The infant survived with some scar tissue—but not everyone who gets Powassan, POW for short, is so lucky. With no treatment available, half of all people who contract the virus suffer permanent brain damage; 10 percent die. And while POW is nowhere near as prevalent as that other tick-borne summer scourge—Lyme—it is starting to show up more often.

Scientists disagree on whether that’s because doctor awareness and improved testing tools are just turning up more cases, or whether anthropogenic forces like climate change, reforestation, and suburban development are driving up the likelihood that humans will come in contact with POW. But one thing’s for sure: the only way to get those answers is to collect more data. Entomologists and virologists have been saying this for years. Yet current surveillance efforts are limited to counting cases only once they’ve reached hospital beds.

A Taste for Flesh

As far as emerging diseases go, Connecticut has had more than its fair share. Liam lives in the town of Griswold, just 30 miles down the road from Lyme, where in 1975 a mysterious outbreak of swollen knees, skin rashes, headaches, and severe fatigue swept through the town. Lyme disease has reached epidemic proportions in recent years; cases have tripled in the US over the last two decades as the tick that spreads the illness has expanded its territories from a few northern pockets into half of all US counties.

At some point along its manifest destiny tour, Ixodes scapularis, the blacklegged (or deer) tick, seems to have picked up POW. Maybe it was in a skunk burrow or a badger den—historically, POW is carried around by a different kind of tick that prefers members of the weasel family. And unlike their weasel-loving brethren, deer ticks, in addition to their namesake host, have a real taste for human flesh.

This is one of the reasons Phil Armstrong, a virologist and medical entomologist with the Connecticut Agricultural Experiment Station, believes the risk for getting POW has increased over the last few decades. The research station, which has one of the country’s longest-running tick-borne disease programs, has data going all the way back to the ’70s.

Back then, scientists collected and screened thousands of ticks looking for Lyme and other diseases. Armstrong says none of them were carrying the POW virus. “If it was present then we would have detected it,” he says. “Now we go to those same locations and 2 to 3 percent of the ticks have POW.”

In another study, Connecticut researchers analyzed deer blood collected from animals that hunters had killed and brought in to DNR check stations, again going back to the ’70s. They found the prevalence of deer antibodies to the POW virus had increased substantially over a 40-year period. “Deer are always heavily parasitized by the deer tick,” says Armstrong. “But it’s only recently that they’re also getting exposed to this virus.”

Scientists like Armstrong estimate that POW is only prevalent in about 4 percent of deer ticks, way lower than the 30 to 40 percent prevalence of Lyme disease. But here’s the thing. Lyme disease, which is caused by a spiral-shaped bacterium, takes about 48 hours to transmit; if you find a tick on your body and remove it within a day or two, you can usually escape a Lyme infection. POW, on the other hand, goes from the tick’s body, through its saliva, and into your bloodstream within a few minutes of a bite. So even though it’s not in many ticks, if the right one gets you, there’s not much you can do.

The most public health officials can do is recommend wearing long sleeves and pants when hiking, and using repellents on your skin, gear and clothing. That, and staying away from high-tick areas. Of course, that’s easier said than done. Because most health agencies, including the Centers for Disease Control and Prevention, don’t have great ideas about where and when disease-carrying ticks are going to be out in full force.

“It’s very difficult to predict from year to year,” says Marc Fischer, a medical officer with CDC’s Division of Vector-Borne Diseases. “Changes in weather, temperature, humidity, in host species abundance, how many ticks are infected, where they are and how likely they are to bite humans, all these things are constantly changing and have to be factored in.”

Modeling Mayhem

A model that can incorporate all those parameters could actually tell you something useful about disease risk when it comes to ticks. And that’s exactly what Goudarz Molaei is building right now. Molaei runs the state of Connecticut’s unique and long-running tick-testing program. Residents who get bitten send in their ticks and Molaei screens them for diseases like Lyme.

The problem is he can’t screen for POW because of cuts to the state budget. He’s working on trying to find other partners at academic institutions who might be able to change that. But for now, the only real data on POW in humans comes in from hospitals that are seeing patients with the most serious symptoms of the disease. They then pass samples on to state health departments and eventually the CDC, which records the cases. Molaei’s screening protocol could help catch a diagnosis earlier, or provide information about how often people contract the virus but don’t get the worst symptoms. But since he doesn’t have the funds to test for POW, for now he’s focusing on Lyme.

He’s using 20 years worth of data—from weather patterns to rodent and deer populations to habitat changes and disease abundance—to make a Lyme disease risk map of Connecticut. For the first time, it would tell residents where tick-borne disease hot spots are, and help public health authorities focus their efforts. He’s hoping to finish by this fall. And in the meantime he’s taking a much more blunt approach; he’s telling people they should all be on high alert. This spring, his mailbox has been inundated with ticks.

“Usually I can return test results to people in a few days,” says Molaei. “This month it’s been taking me more like two weeks because the number of submissions keeps going up.” In spring of 2014, he says he would get maybe 50 ticks a month. Now, he’s getting up to 200 ticks a day.

What changed? The weather. Two years of mild winters have left tick populations booming. And Molaei expects to see more of the same, as climate change makes warmer winters in the northeastern US more common. Studies show that warmer-than-normal temperatures increase tick reproduction two to five times. As agricultural land increasingly turns back to forest and suburbs continue to encroach on these newly wild areas, people are also getting more and more opportunities to encounter ticks, and POW.

At least for now though, official numbers don’t reflect these increasing risks. Fischer says that based on the national reporting system currently in place—where state authorities send any cases they think might be POW virus to CDC labs for testing—the disease doesn’t appear to be on the rise. Or, if it is, it’s happening very slowly, and could just be an artifact of better testing technologies and physician awareness. “It’s very tough to tease out surveillance bias,” he says. “Especially with so little data to work with.”

Epidemiological data has historically been counted in numbers of sick people and dead bodies. But with technological advances in genetic sequencing, immunological screening, and computational biology, there’s no reason to wait for hospital admissions to begin structuring a public health response. First-of-their-kind risk maps like the one Molaei is building can help officials understand the environmental and economic drivers of tick-borne disease, in close to real time. Connecticut may be in the best position to lead the way, given its 20-year headstart on data collection. But other states should take note. Because ticks aren’t going anywhere, except almost everywhere.

How to deal with your mental health when the world feels like a scary place

Having anxiety and obsessive thoughts often boils down to a simple misunderstanding of the level of danger in a situation.

My brain tells me that leaving switches turned on will result in the house burning down, that if I don’t go and check the door seven times before bed I’ll get burgled, and – in the worst times – that if I go outside I’ll get murdered.

A big part of getting better has been reassuring myself that these thoughts aren’t rational. I tell myself that the world is not as dangerous as my brain makes it out to be, that I’m safe, that there’s no reason to be scared.

But last week things did get genuinely, truly scary.

I woke up to news that there had been an explosion in Manchester, where lots of my friends live, at an Ariana Grande concert that a few of them could well have been at.

Once I’d checked that those closest to me were safe, the horrible pit of fear in my stomach didn’t go away.

Martyn Hett, someone I barely knew outside of his social media presence but who’d showed me how truly lovely he was when we spoke about his mum for a story, was missing for the entire day. It was later discovered that he had died in the explosion.

That has, honestly, hit me hard.

I don’t in any way want to compare how I feel with what Martyn’s friends and family must be going through, or what the friends and family of other victims of the explosion are dealing with, so I won’t focus on that part of things.

Instead I’ll talk about the emotions that come not with being in any way involved in terrorist attacks, but of reading about them, watching the events unfold, spotting every single breaking news alert and feeling more and more afraid.

When every scroll through Twitter, every check of your phone, and every breaking news alert tells you of another horrible development in a news cycle – another death announced, a security threat level upped, a stabbing in your hometown – it’s easy for mental health issues to come up to the fore.

Quickly, nowhere feels safe.

Your brain tells you that everyone you know is in danger. It tells you that if you step outside, you’re going to get killed.

Every single part of normal daily life becomes terrifying – your commute, a journey through a crowded place, sitting at your desk at work. It’s scary enough if you don’t have mental illness to deal with. But when your brain takes threats and magnifies them, real-world issues can become genuinely debilitating.

Then you add in the guilt. The guilt of struggling so much mentally when there are people who’ve directly been affected by whatever awful thing is happening in the news.

Feeling lost, scared, and tearing up at every fresh news alert has made me feel like a massive drama queen, as though I’m making a genuine crisis all about me.

I feel ridiculous struggling with mental health issues when there are people in hospitals fighting for their lives.

There’s no quick fix, but here’s what’s been helping me to deal with mental health in the aftermath of terrifying real-life events. First off, I’m learning that it’s okay to turn off the news, stop looking at Twitter, and be a little out of the loop.

A lot of the time, we can feel like we have a duty to be informed, a responsibility to know everything that’s going on. We think it’s disrespectful to stop paying attention. I get that.

But when breaking news alerts become too much to handle, give yourself permission to turn off and look after yourself first.

There’s been a lot of research into the impact of watching traumatic events unfold through the news, most of which suggests that the more information you find out, the more horrible details you read, the higher your stress levels become. Being exposed to horrific events through news coverage can make you feel more vulnerable and at risk, even if you don’t have existing mental health issues.

If you can feel yourself getting too stressed by the news to cope – whether that means you’re having panic attacks, can’t get on with your daily routine, or find yourself thinking that the world is doomed and nothing will ever be good again – there’s nothing wrong with switching off.

At the end of the day, unless it’s your job, you don’t need to know all the details of a terror attack or be aware of every new development as it happens.

Over the last ten years, there have been 1.4 deaths per year caused by terrorism in the U.K.

In 2011 alone, there were 693 deaths caused by falling down steps in the U.K.

So really, your staircase is a much bigger threat to your life than terrorism.*

*Please don’t let that make you scared of stairs, I’m just pointing out that the risk posed by terror threats isn’t as huge as your brain may make it appear.

If you’re struggling with the mental effects of bad news, ask for help.

That might mean booking in an extra session with a therapist, even as a one-off. That’s perfectly normal and can be a massive help.

It’s important to let people know that you’re struggling – that you’re scared to leave the house, that your worries are becoming overwhelming, or that you can’t stop obsessing over the news. You’re not being self-obsessed or needy.

Some people will find it harder to deal with horrible events than others, but letting people know you’re finding things hard is crucial.

One other thing that’s helpful for everyone: look for the good.

In the aftermath of terrible things, there will always be stories of people coming together to support victims, rebuild the community, or just to remind everyone that there’s still good in the world.

Look for those stories. Take them in. Remember that while the world can be scary, people are overwhelmingly lovely. We’re going to be okay.


Extended sleep loss can send the brain’s neural cleaning process into overdrive.

A recently released study has identified a potential new effect of extended sleep loss. Apparently, the brain can carry out the cleaning process usually performed during sleep even while awake. But this happens at an accelerated rate which, in the long term, can seemingly cause more harm than good.

The research was based on observations of four groups of mice and their brains while sleeping, or trying to. The results, published in a paper in the Journal of Neuroscience, showed that astrocytes and microglial cells go into overdrive when the mice are sleep deprived or severely sleep deprived. In the long term, this increased activity also started damaging the brain.

The research was carried out by scientists at Marche Polytechnic University in Italy, led by Michele Bellesi. Together with his colleagues, he analyzed the brains of well-rested or recently awoken mice, and also looked at the brain structures of sleep-deprived and severely sleep-deprived mice. Unexpectedly, these too showed the brain-cleaning process usually carried out during sleep.

As the body rests, the brain cleans itself up by renewing its cells and getting rid of the day’s unnecessary neural activity. It does so with help from astrocytes, which refresh and renew synapses, and from microglial cells. The removal of old or worn-out cells happens through phagocytosis, a process whose name literally means “to devour”.

These processes were detected even in severely sleep-deprived mice. But in such cases they were also far more active, and potentially dangerous: by going into overdrive, the cleaning becomes excessive, and microglial cells can seemingly move on from old and worn-out cells to healthy ones. Astrocytes, too, are far more active in sleepless brains.

“Like many other stressors, extended sleep disruption may lead to a state of sustained microglia activation, perhaps increasing the brain’s susceptibility to other forms of damage,” stated the researchers.

The study team will now try to determine whether the same process also occurs in humans, and whether getting enough sleep can help recover the connections lost during the excessive cleaning.

The revision diet: what's the best food and drink to help students focus?

Even at exam time, eating well is easy and can have a real effect on your concentration levels, nutrition experts say

What’s the best library lunch to give your brain the fuel it needs?

It’s heads-down revision time for exams and dissertations. The pressure’s on, so you’ll want all the help you can get to aid your memory and raise your grades (without smart drugs or cheating). Nutrition experts say that eating well can make a real difference to your revision regime – so what brain-boosting food and drink do they recommend?

How much caffeine is too much?

Coffee, green tea and energy drinks are staples of the all-night library stint. But how much caffeine is too much?

“Caffeine – particularly coffee – can have numerous benefits extending to cardiovascular health, insulin sensitivity, prevention of type 2 diabetes and acting as a potent antioxidant,” says nutritional therapist Daniel O’Shaughnessy. “However, while caffeine may make you more alert, individuals can build up a tolerance meaning this is short-lived. Caffeine can also increase blood sugar and eventually lead to dips causing lack of focus and energy.”

“It’s also worth bearing in mind that people react differently to caffeine,” says nutritional therapist Joanne Crovini. It has the potential to increase levels of the stress hormone cortisol. “Some people can drink it at midnight and go straight to sleep, whereas other people get teeth clenching and feelings of anxiety after a small amount.”

Most adults can tolerate single doses of caffeine up to 200mg and a daily intake of up to 400mg without any concerns, nutrition scientist Sarah Coe says; a mug of instant coffee is around 100mg and a cup of tea is 75mg of caffeine. “Remember that energy drinks and some soft drinks contain caffeine too, and coffee from a coffee shop may be stronger than coffee made at home. As broad advice I’d say stop drinking caffeine by 2pm and have a maximum of two cups of coffee or equivalent a day, but be aware of your own reaction to it.”
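Those per-drink figures make a day’s intake easy to tally. A minimal sketch (the milligram figures are the ones quoted above; the drink names and helper function are ours, for illustration):

```python
# Tally a day's caffeine against the quoted limits:
# up to 200 mg in a single dose, up to 400 mg per day.
CAFFEINE_MG = {"instant coffee": 100, "tea": 75}  # figures quoted in the article
MAX_SINGLE_MG = 200
MAX_DAILY_MG = 400

def within_limits(drinks):
    """Return (total_mg, ok) for a list of drinks consumed in one day."""
    doses = [CAFFEINE_MG[d] for d in drinks]
    total = sum(doses)
    ok = total <= MAX_DAILY_MG and all(d <= MAX_SINGLE_MG for d in doses)
    return total, ok

print(within_limits(["instant coffee", "tea", "instant coffee"]))  # (275, True)
print(within_limits(["instant coffee"] * 5))                       # (500, False)
```

Stronger coffee-shop drinks and caffeinated soft drinks would need their own entries, which is exactly Coe’s point about watching the equivalents.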

Wholegrain foods slowly release energy over the day.


Wholegrain foods will stave off hunger. Examples include porridge and wholemeal bread. Crovini explains that combining wholegrains with protein will help keep blood sugar levels balanced, which is essential for mood and concentration.

O’Shaughnessy agrees. Buying grains in bulk with your housemates is a great way to save money, as is avoiding the more overpriced “fad” grains, he says. “Brown rice, oats and buckwheat are good, cheap alternatives,” he says, adding that the high levels of magnesium in buckwheat also helps to calm nerves.

Nuts and berries

Berries and nuts are a convenient snack that pack a nutritional punch. “Blueberries, like many dark coloured fruits and vegetables, have a high antioxidant content, which is thought to protect the brain from oxidative damage and slow age-related decline,” explains Crovini. Frozen berries are usually cheap, last longer and don’t lose their nutrients when frozen. Less healthy are flavoured and coated nuts, which contain added oil, salt and sugars.

Ditch the supplements

Doctors often recommend taking vitamin supplements to top up on the nutrients you need – but these can be expensive. Fortunately, they’re not the only option. “Food should always come before supplements and the key to getting as many nutrients as possible is to eat as varied a diet as possible, with lots of different colours,” says Crovini. “Use frozen berries and dark green vegetables like savoy cabbage, which are reasonably priced.”

Coe agrees: it’s better to get everything you need from food and drink: “For example, oranges not only contain vitamin C [which boosts the immune system] but also fibre and other components that you can’t get packaged together in a tablet.”


Dark chocolate

Dark chocolate has a mild effect on increasing blood flow and reducing blood pressure, due to the polyphenol content, says Crovini. “It’s also a good source of magnesium, which is an essential mineral for relaxation.”

O’Shaughnessy recommends choosing chocolate that’s 80% cacao or more to avoid negative effects on teeth, skin and weight. The darker the chocolate, the less sugar it contains.


A recent study by the University of East London and University of Westminster found that keeping hydrated can boost attention by almost 25%. “We found that drinking even a really small amount of water (25 ml) resulted in improved performance on a test of attention,” says Dr Caroline Edmonds, who co-authored the research. Drinking 300 ml improves memory performance and can improve your mood as well.

The experts’ recommended library lunch

Base your lunch on starchy foods, particularly wholegrain varieties, Coe says. Sandwiches, wraps and bagels are quick and easy to prepare, or you could use leftovers from the night before to make a pasta, rice or couscous salad.

Grainy salads with canned fish and vegetables are good if you don’t fancy bread. Tinned mackerel with beetroot, roasted sweet potato cubes, lots of green leaves like rocket or watercress and some pumpkin seeds, are ideal, Crovini says. Or try canned salmon with brown rice, canned chickpeas, chopped cucumber and tomato.

For sweetness, you’ll want the usual healthy stuff: a small pot of natural yogurt with either an apple, some berries or a chunk of dark chocolate.

Don’t skip meals, Crovini adds. Eating regularly will help keep blood sugar balanced and feed the brain with the fuel it needs.

A New Drug for A.L.S., but the Diagnosis Remains Dire

A neighbor of mine was recently told he has a devastating neurological disorder that is usually fatal within a few years of diagnosis. Though a new drug was recently approved for the illness, treatments may only slow progression of the disease for a time or extend life for maybe two or three months.

He is a man of about 60 I’ve long considered the quintessential Mr. Fix-it, able to repair everything from bicycles to bathtubs. Now he is facing amyotrophic lateral sclerosis, or Lou Gehrig’s disease — a disease that no one yet knows how to fix.

I can only imagine what he is going through because he does not want to talk about it. However, many others similarly afflicted have openly addressed the challenges they faced, though it is usually up to friends and family to express them and advocate for more and better research and public understanding.

A.L.S. attacks the nerve cells in the brain and spinal cord that control voluntary muscle movements, like chewing, walking, breathing, swallowing and talking. It is invariably progressive. Lacking nervous system stimulation, the muscles soon begin to weaken, twitch and waste away until individuals can no longer speak, eat, move or even breathe on their own.

Last year, the Centers for Disease Control and Prevention estimated that between 14,000 and 15,000 Americans have A.L.S., which makes it sound like a rare disease, but only because life expectancy is so short. A.L.S. occurs throughout the world, and it is probably far more common than generally thought.

Over the course of a lifetime, one person in about 400 is likely to develop it, a risk not unlike that of multiple sclerosis. But with the rare exception of an outlier like the brilliant physicist Stephen Hawking, who has had A.L.S. for more than 50 years, it usually kills so quickly that many people do not know anyone living with this disease. Only one person in 10 with A.L.S. is likely to live for a decade or longer.

The disease is most commonly diagnosed in middle age, among people in their 50s or 60s, though it sometimes afflicts young adults. Dr. Hawking was found to have it at age 21.

Early symptoms can be very subtle and thus are often overlooked or attributed to a minor problem like lack of sleep, undue stress, overwork or poor diet. However, the underlying damage can start long before the symptoms are noticed. Given the redundancy built into the brain, about a third of motor neurons are destroyed before signs of muscle loss become apparent.

Initial symptoms depend on which group of motor neurons is affected first. In about 70 percent of people, the first symptoms involve muscle weakness in the legs or arms that can result in frequent tripping, instability, stiffness, difficulty walking or inability to open a jar or turn a key. About one-quarter of cases start with muscle loss in the face, mouth and throat, resulting in slurring of speech and swallowing difficulties, and in 5 percent, the muscles of the trunk are first affected. However, in most people the disease soon spreads to affect nearly all voluntary movements.

Patients usually retain control over bladder and bowel function and eye movement until very late in the disease. In fact, after losing the ability to speak or write, many learn to communicate by looking at letters or words on a computer and using a voice synthesizer.

Sensory nerves and the autonomic nervous system are usually spared so that most with the disease can hear, see, touch, smell and taste. But as patients lose the ability to swallow, oral feeding creates a choking hazard — some, in fact, choke on their own saliva — and tube feeding becomes the only option for maintaining nutrition.

Half or more of patients remain mentally sharp, bearing painful witness to their physical decline, although mild cognitive and behavioral changes are fairly common, and 10 percent to 15 percent of patients develop symptoms of frontotemporal dementia. They may become withdrawn, apathetic, uninhibited, distractible and repeat words or gestures.

The cause or causes of A.L.S. are unknown in 90 percent to 95 percent of cases. The remaining cases are inherited from a parent who carries a mutation in one or more genes. Researchers are studying these genes in patients and engineered mice in hopes of developing drugs or stem cells that slow, stop or even reverse progression of the disease.

For example, a recent study published online in JAMA Neurology by researchers at Methodist Neurological Institute in Houston suggests that reducing inflammation by modifying certain abnormal immune cells may prove helpful to patients, especially if the treatment could be applied early in the disease.

Other recent studies in mice, yeast and fruit flies by researchers at Stanford University School of Medicine suggest that suppressing a certain protein called ataxin-2 may foster resistance to A.L.S. In mice genetically engineered to have A.L.S., Lindsay Becker, a graduate student, found that completely removing ataxin-2 enabled some of the animals to live “hundreds and hundreds of days,” instead of only a month.

Another promising avenue of research involves the abnormal behavior of an enzyme called RIPK1, which can damage neurons by disrupting the production of the myelin sheath that insulates axons, the neuron extensions that transmit signals from one cell to the next. Researchers at Harvard Medical School showed that in genetically engineered mice with A.L.S., a substance called necrostatin-1 not only restored the myelin sheath and stopped axon damage but also prevented limb weakness.

Currently, only two drugs have been approved for treating A.L.S. One is Rilutek (riluzole), which counters the elevated levels of the neurotransmitter glutamate that arise in the brains and spinal fluid of A.L.S. patients. Its limited effect on life span — an extension of a few months — suggests that excess glutamate is hardly the only noxious factor involved in the disease. The Food and Drug Administration just approved a second drug, Radicava (edaravone), said to slow progression of the disease in a six-month study in Japan, though its effects on survival are not yet known. It must be administered intravenously for 10 days every two weeks at a cost of more than $145,000 a year for the medication alone.

Aside from genetically transmitted familial cases, potential risk factors for A.L.S. include traumatic brain injury and exposure to toxic substances like lead and certain pesticides. The risk is higher than expected among military veterans, professional football players and athletes who took dietary supplements containing branched-chain amino acids. The Department of Veterans Affairs recognizes A.L.S. as a service-connected disease.

Big Data: a big boon for mental healthcare

The amount of data being digitally collected and stored is vast and expanding exponentially. As a result, the science of data management and analysis is also advancing, so much so that data is no longer just analysed in retrospect but collected and used to make predictions. Computer scientists coined the term big data to describe this evolving technology.

Big data has been successfully used by search engines to customise your searches; by political analysts to understand the voting patterns of different demographics in political cycles; and especially by pioneers in healthcare. Google Flu Trends aggregated Google search queries and made predictions about influenza outbreaks and activity in more than 25 countries (the service no longer analyses the data itself but now provides it to institutions specialising in infectious disease research). Twitter has also been used for population-level influenza surveillance, for understanding public sentiment towards vaccination, and for investigating prescription drug abuse, to name a few.

GV (formerly known as Google Ventures) is investing 36% of its sizable $2 billion portfolio in life sciences start-ups. The same trend is now starting to manifest in mental health, thanks to the widespread application of big data, which, together with the maturation of Natural Language Processing and Machine Learning technologies, offers exciting possibilities for improving both population-level and individual-level mental health.

Pioneering work by Munmun De Choudhury and colleagues has applied computational methods to monitor population health and identify risk factors for individuals across a number of mental health issues, using various social media platforms including Twitter, Facebook and Reddit. For the population health domain, a crowdsourced data set of tweets from Twitter users with depression-indicative CES-D (Center for Epidemiological Studies – Depression) scores was collected and used to train a statistical Machine Learning algorithm capable of identifying depression-indicative tweets. The classifier was then applied to geocoded Twitter data from all 50 US states, and the results correlated well with the Centers for Disease Control depression data.
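The broad shape of that pipeline (label tweets, train a classifier, apply it to new text) can be sketched in miniature. This is not the authors' actual model: the toy corpus, the word lists and the simple add-one-smoothed Naive Bayes classifier below are all illustrative assumptions.

```python
# A toy version of "train a classifier on labelled tweets, then apply it".
# The corpus and labels are invented; real work would use thousands of
# tweets and a far richer feature set.
from collections import Counter
import math

def train(labelled_tweets):
    """Count word frequencies and total word counts per class."""
    counts = {"depressed": Counter(), "control": Counter()}
    totals = Counter()
    for text, label in labelled_tweets:
        words = text.lower().split()
        counts[label].update(words)
        totals[label] += len(words)
    return counts, totals

def classify(text, counts, totals):
    """Pick the class with the higher add-one-smoothed log-likelihood."""
    vocab = set(counts["depressed"]) | set(counts["control"])
    scores = {}
    for label in counts:
        score = 0.0
        for w in text.lower().split():
            score += math.log((counts[label][w] + 1) / (totals[label] + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

corpus = [
    ("feeling hopeless and exhausted again", "depressed"),
    ("cant sleep worthless night again", "depressed"),
    ("great run this morning feeling good", "control"),
    ("lovely dinner with friends tonight", "control"),
]
counts, totals = train(corpus)
print(classify("so exhausted and hopeless", counts, totals))  # depressed
print(classify("good morning friends", counts, totals))       # control
```

Applied at scale to geocoded tweets, per-region rates of "depressed" classifications are what get correlated with public health statistics.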

For identifying risk factors at the individual level, Twitter data was used again, this time to investigate experiences of postpartum depression in new mothers. Birth announcements were automatically identified in public Twitter data using phrases such as ‘it’s a boy/girl’, and the new mothers’ pre- and post-birth Twitter feeds were then analysed. Machine Learning techniques combined with analysis of pre-birth behaviour patterns could predict postnatal emotional and behavioural changes with 71% accuracy.
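The birth-announcement step can be illustrated with a simple pattern match. The regular expression and the example tweets below are hypothetical, not taken from the study, which would have used a more careful filter.

```python
# Hedged sketch: flag tweets that look like birth announcements
# using an "it's a boy/girl" phrase pattern (illustrative only).
import re

BIRTH = re.compile(r"\bit[’']?s a (boy|girl)\b", re.IGNORECASE)

def is_birth_announcement(tweet: str) -> bool:
    return bool(BIRTH.search(tweet))

tweets = [
    "So thrilled to announce: it's a girl! 7lbs 2oz",
    "Its a boy!! welcome to the world little one",
    "it is raining boys and girls today",
]
print([is_birth_announcement(t) for t in tweets])  # [True, True, False]
```

Note the pattern also tolerates the common misspelling "its"; the last tweet shows why a word-boundary match matters.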

The World Well Being Project is another example of the fascinating ways in which big data can be used to augment population level mental healthcare. One of the studies conducted used 148 million tweets along with the Centers for Disease Control mortality data and found a high correlation between words characteristic of negative emotions and heart disease mortality figures—more highly correlated than official socio-economic, demographic, and health statistics.
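The kind of correlation that study computed can be shown with invented numbers. The `pearson` helper and the toy per-county data below are illustrative assumptions, not the project's data.

```python
# Toy illustration of correlating a language feature (rate of
# negative-emotion words) with a health outcome (mortality rate).
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented numbers: per-county rate of negative-emotion words (per 100
# tweets) vs. heart-disease deaths per 100,000 people.
neg_word_rate = [0.8, 1.1, 1.4, 1.9, 2.3]
mortality = [150, 160, 175, 190, 210]
r = pearson(neg_word_rate, mortality)
print(round(r, 3))
```

A value of r near 1 is what "highly correlated" means here; the study's claim was that this language-derived signal tracked mortality better than official socio-economic statistics did.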

Big data is also being used to augment mental healthcare at a more individual level. Researchers at the University of Chicago are developing an app which monitors sleep and activity patterns to combat depression in university students. When the app picks up on behaviours that match certain symptoms of depression, such as irregular movement and physical activity, disturbed or abnormal sleep patterns, social isolation, or a drop in class attendance, it not only gives real-time suggestions and activities, much as a counsellor would; the data is also analysed and transmitted so that the university counsellor can keep better track of more students and extend services to those in need.

In the world of individual therapy as well, big data is making an entry. Feedback-informed treatment, or FIT, uses psychotherapy metrics that draw on historical data to predict when clients are at risk of deterioration. The metrics come from surveys that clients fill in as part of their therapy, charting their progress through the weeks of treatment. An algorithm then predicts which clients are at high risk of dropout, of relapse in the case of substance abuse, or of deterioration otherwise. These metrics are said to work in two ways. First, they provide an element of blunt feedback that therapists often lack from their patients. Second, risk alerts allow therapists to adjust treatments and can compensate for clinical blind spots or overconfidence.
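A minimal sketch of how such a metric might raise an alert, assuming (as a simplification of real FIT algorithms) that a flat or rising distress score across sessions signals risk; the scores and threshold below are invented.

```python
# Toy risk flag: fit a least-squares trend line to weekly distress
# scores and alert when the client is not improving fast enough.
def slope(scores):
    """Least-squares slope of score against session number."""
    n = len(scores)
    xs = range(n)
    mx, my = (n - 1) / 2, sum(scores) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, scores))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def at_risk(scores, min_improvement=0.5):
    """Higher score = more distress; flag flat or worsening trends."""
    return slope(scores) > -min_improvement

improving = [28, 25, 22, 18, 15]  # distress falling week over week
worsening = [22, 23, 25, 27, 29]  # distress rising: alert the therapist
print(at_risk(improving), at_risk(worsening))  # False True
```

Real FIT systems compare each client against trajectories learned from thousands of past cases rather than a fixed threshold, but the alerting logic has this basic shape.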

Despite the advancements in the field of mental health and big data, it is far from perfect. While there is an abundance of mental healthcare apps (honestly, just type ‘depression’ into your app store and a list of a hundred programs will pop up, and that's just for depression), they have come under scathing indictment from the American Psychiatric Association’s (APA’s) Smartphone App Evaluation Task Force. Studies by the APA and the University of Liverpool have found that, despite their ubiquity, many smartphone apps suffer from a lack of an “underlying evidence base, a lack of scientific credibility, and limited clinical effectiveness.”

Even so, this burgeoning industry meets an important need: getting treatment to those with limited or no access to it. Given the pervasiveness of smartphones, these apps might serve as a portable therapist, particularly in rural and low-income regions. Public health organisations seem to be buying into the concept: in its Mental Health Action Plan 2013-2020, the World Health Organization (WHO) recommended “the promotion of self-care, for instance, through the use of electronic and mobile health technologies” as part of its agenda.

Given the personal and private nature of mental healthcare, it is difficult to imagine something as disconnected as big data anywhere near it. This is probably why the pace of acceptance of big data in mental health is glacial, especially among mental health professionals. A particular issue of serious concern is privacy: there is no real framework in place to ensure that the privacy and confidentiality of records remain intact, nor are there clear boundaries for ethical data collection.

A question of ethics is also raised when we consider the possibility of computer algorithms making clinical diagnoses or recommendations. The technology seems to be moving faster than lawmakers can keep up. Despite the errors and risks, the big data analytics boom holds significant promise for understanding and improving mental health at both the individual and population level.

Friday, May 26, 2017

Dieters note: six signs of dangerous malnutrition

Losing weight and starving yourself are not one and the same.

In trying to lose weight as quickly as possible, many people begin to literally starve themselves, not realising that overly harsh calorie restriction is more likely to stall weight loss than to speed it up.

Restricting calories to the point of starvation almost always produces the opposite of the desired effect. To burn calories, our bodies need calories. It is like lighting a fire: you need something to start it with, at least a match. Food is not only fuel for the body but also kindling: it stokes the metabolism and triggers the process of burning calories. When food is scarce, the body begins not to burn energy but primarily to store it.
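To see where daily calorie targets come from, the widely used Mifflin-St Jeor equation estimates resting energy needs, which are then scaled by an activity factor. The sketch below is an illustration of that arithmetic, with an invented example person and activity factor; it is not medical advice.

```python
# Mifflin-St Jeor estimate of daily energy needs (kcal/day):
# BMR = 10*weight(kg) + 6.25*height(cm) - 5*age + 5 (men) or -161 (women),
# then multiplied by an activity factor (~1.2 sedentary to ~1.9 very active).
def daily_calories(weight_kg, height_cm, age, sex, activity=1.2):
    bmr = 10 * weight_kg + 6.25 * height_cm - 5 * age
    bmr += 5 if sex == "male" else -161
    return bmr * activity

# Hypothetical example: a lightly active 35-year-old woman, 65 kg, 165 cm.
needs = daily_calories(65, 165, 35, "female", activity=1.375)
print(round(needs))  # roughly 1850 kcal/day
```

Even this modest example comes out well above crash-diet intakes, which is exactly the point: cutting far below your estimated needs starves the metabolism rather than speeding it up.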

Although daily calorie needs depend on activity level, age and gender, most women should eat 1,200 to 1,500 calories a day. If you fall below this level, the body struggles to perform the basic biological functions that support overall health. Not sure whether your numbers are right? Several symptoms point to a critical lack of calories:

You always think about food

Can’t concentrate on your everyday duties because all your thoughts are about food? You are either eating too little or choosing the wrong foods, ones that stoke the feeling of hunger. Healthy snacks between meals (yogurt, nuts, dried fruit) will help you reach an adequate energy intake and focus on the task at hand instead of on eating. To avoid constant hunger, pair protein-rich foods with foods high in fibre: top yogurt with fruit, serve chicken breast with vegetables or grains, and add nuts to cottage cheese.

You eat irregularly

Think that the less often you eat, the faster you will lose weight? Gaps of more than four hours between meals are, first, a health risk; second, a brake on metabolism; and third, more likely to end in a broken diet and overeating. If you regularly skip breakfast or the evening meal, try to eat on a more regular schedule: three main meals and two small snacks a day are recommended. If you follow a "no eating after 6" strategy, at least drink a glass of nonfat yogurt before bed.

Your periods have stopped

If you lose weight properly, eating healthy, wholesome foods with enough vitamins and minerals and taking reasonable exercise, you should not notice any critical changes in your menstrual cycle. A lack of calories, however, can disrupt menstruation or even stop it altogether: the body simply does not have enough fat to produce the hormones that trigger it. Sensibly, nature does not allow for procreation under conditions of perceived famine.

You suffer from migraines

Our brain runs on glucose from the blood, and blood glucose comes from the carbohydrates we eat. Cut calories and carbohydrates down to a deficit, and the brain simply does not get enough energy. If you feel involuntary trembling, dizziness or weakness, if your thinking is clouded, or if a migraine comes on for no apparent reason, it may mean you are short of calories and your blood sugar has dropped too low. Diabetics know hypoglycemia well, but those who have never experienced it can easily miss the threatening symptoms. In that case, eat some "fast" carbs, simply put, something sweet, and avoid similar episodes in future by returning a few more calories to your diet.

You are too irritable

It is no accident that plump people are said to be good-natured. Hunger driven by a lack of calories triggers production of the stress hormone cortisol, which causes irritability. Incidentally, the same hormone is responsible for storing fat, so if you are angry all the time, you may start gaining weight even under strict calorie restriction.

You fall asleep on the go

Haven’t done anything yet and already tired? If the mere thought of work makes you sleepy, and you doze off constantly during the day, the culprit may be not a shortage of sleep but a shortage of calories. Starved of energy, the body goes into "sleep mode". To prevent this, never skip breakfast or replace it with a cup of coffee. Breakfast should supply ample calories from protein and complex carbohydrates, which provide a feeling of satiety and sustain you through the working day.

Sugar IS fuelling various forms of cancer by giving tumours the energy they need to multiply, scientists admit

  • Range of previous studies have suggested that tumours actually thrive off sugar
  • They pointed to it using glucose as energy to mutate and spread across the body
  • Now experts have shown 1 type of cancer has more of a sweet tooth than others
  • The findings are worrying because 'we are very addicted to sugar', they said 

A sugar rich diet may be fuelling various forms of cancer, as new research confirms a long suspected belief.

Previous studies have suggested that tumours thrive off sugar, using it as energy to mutate and spread across the body. Now scientists have shown one type of cancer - which can be found in the lungs, head and neck, oesophagus and cervix - has more of a sweet tooth than others.

Squamous cell carcinoma (SqCC) was more dependent on sugar to grow, University of Texas experts found.

This form of the disease used higher levels of a protein that carries glucose to cells to enable them to multiply, they discovered.

Lead author Dr Jung-whan Kim said: 'It has been suspected that many cancer cells are heavily dependent on sugar as their energy supply. 'But it turns out that one specific type - squamous cell carcinoma - is remarkably more dependent.

'This type of cancer clearly consumes a lot of sugar. One of our next steps is to look at why this is the case.'

Writing in the journal Nature Communications, he warned the findings were worrying because 'as a culture, we are very addicted to sugar'.

He added: 'Excessive sugar consumption is not only a problem that can lead to complications like diabetes, but also, based on our studies and others, the evidence is mounting that some cancers are also highly dependent on sugar.

'We'd like to know from a scientific standpoint whether we might be able to affect cancer progression with dietary changes.'

Health officials across the world have stood firm on their stance towards sugar in recent years, despite growing evidence showing it to potentially fuel tumour growth.

Instead, they highlight the fact that all cells, not just cancerous ones, require energy, which is found in the form of glucose, to survive.

Without a sufficient supply of the sugar, each cell in the body would struggle to perform its duties.

Cancer Research UK makes clear that cancerous cells aren't dependent on sugar alone for their growth; they also rely on amino acids and fats.

The new findings came after researchers looked into the differences between two major subtypes of non-small cell lung cancer - adenocarcinoma (ADC) and SqCC.

About one quarter of all lung cancers are SqCC, which has been difficult to treat with targeted therapies. The study first tapped into The Cancer Genome Atlas, which maps information about 33 types of cancer gathered from more than 11,000 patients.

Based on that data, it found a protein responsible for transporting glucose into cells was present in significantly higher levels in lung SqCC than in lung ADC.

The protein, called glucose transporter 1, or GLUT1, takes up glucose into cells, where the sugar provides a fundamental energy source and fuels cell metabolism.


How sugar affects the rest of the body

Brain

Sugar interrupts the supply of important neurotransmitter precursors through the blood-brain barrier, particularly ones that help produce serotonin and dopamine, which influence mood. Too much sugar can increase the risk of anxiety and depression due to a mix of energy rushes after ingestion followed by subsequent sugar crashes.

Heart

Increased sugar levels can decrease the amount of good cholesterol in the bloodstream and increase the amount of bad cholesterol, as well as blood fats. These factors all lead to an increased risk of heart disease. Sugary foods convert to glucose, which causes insulin to be released in a matter of minutes. This rapid process raises the heart rate, as well as the risk of high blood pressure.

Liver

The liver struggles to process excessive amounts of sugar. The unprocessed sugars are converted to fat cells, which are distributed throughout the body. As a result, you can gain weight and are at risk of fatty liver disease and even obesity. Over time, the liver can become resistant to insulin, which can lead to elevated insulin levels throughout the body.

Pancreas

The pancreas regulates blood sugar levels, either lowering them with insulin or raising them with glucagon. Stable blood sugar levels help several organs function, including the brain, heart and liver.

Gut

Too much sugar in the body can cause bacteria to migrate from the colon to the small intestine, where almost no bacteria are normally present. As the bacteria proliferate on food digesting in the small intestine, they can cause bloating, acid reflux, gas and abdominal cramping.

GLUT1 is also necessary for normal cell function, such as building cell membranes.

As high levels of GLUT1 were implicated in SqCC's appetite for sugar, the researchers examined human lung tissue, isolated lung cancer cells and animals to find evidence of the link.

Professor Kim added: 'We looked at this from several different experimental angles, and consistently, GLUT1 was highly active in the squamous subtype of cancer.

'Adenocarcinoma is much less dependent on sugar.

'Our study is the first to show systematically that the metabolism of these two subtypes are indeed distinct and unique.'

The study also investigated the effect of a GLUT1 inhibitor in isolated lung cancer cells and mice with both types of non-small cell lung cancer.

When the mice were given the inhibitors, their SqCC growth slowed down, but this was the opposite for the adenocarcinoma. The findings indicate that GLUT1 could be a potential target for new lines of drug therapy, especially for SqCC.

The study also found GLUT1 levels were much higher in four other types of squamous cell cancer of the head and neck, oesophagus and cervix.

Study finds gray matter density increases during adolescence

MRI-derived gray matter measures, density, volume, mass, and cortical thickness, show distinct age and sex effects, as well as age-dependent intermodal correlations around adolescence

For years, the common narrative in human developmental neuroimaging has been that gray matter in the brain - the tissue found in regions of the brain responsible for muscle control, sensory perception such as seeing and hearing, memory, emotions, speech, decision making, and self-control—declines in adolescence, a finding derived mainly from studies of gray matter volume and cortical thickness (the thickness of the outer layers of brain that contain gray matter). Since it has been well-established that larger brain volume is associated with better cognitive performance, it was puzzling that cognitive performance shows a dramatic improvement from childhood to young adulthood at the same time that brain volume and cortical thickness decline.

A new study published by Penn Medicine researchers this month and featured on the cover of the Journal of Neuroscience may help resolve this puzzle, revealing that while volume indeed decreases from childhood to young adulthood, gray matter density actually increases. Their findings also show that while females have lower brain volume, proportionate to their smaller size, they have higher gray matter density than males, which could explain why their cognitive performance is comparable despite having lower brain volume. Thus, while adolescents lose brain volume, and females have lower brain volume than males, this is compensated for by increased density of gray matter.

"It is quite rare for a single study to solve a paradox that has been lingering in a field for decades, let alone two paradoxes, as was done by Gennatas in his analysis of data from this large-scale study of a whole cohort of youths," said Ruben Gur.

"We now have a richer, fuller concept of what happens during brain development and now better understand the complementary unfolding processes in the brain that describe what happens."

The study was led by Ruben Gur, PhD, professor of Psychiatry, Neurology, and Radiology in the Perelman School of Medicine at the University of Pennsylvania, Raquel Gur, MD, PhD, a professor of Psychiatry, Neurology, and Radiology, and Efstathios Gennatas, MBBS, a doctoral student of neuroscience working in the Brain Behavior Laboratory at Penn.

According to Gur, the study findings may better explain the extent and intensity of changes in mental life and behavior that occur during the transition from childhood to young adulthood.

"If we are puzzled by the behavior of adolescents, it may help to know that they need to adjust to a brain that is changing in its size and composition at the same time that demands on performance and acceptable behavior keep scaling up," Gur added.

In the study, the researchers evaluated 1,189 youths between the ages of 8 and 23 who completed magnetic resonance imaging as part of the Philadelphia Neurodevelopmental Cohort, a community-based study of brain development with rich neuroimaging and cognitive data. They looked at age-related effects on multiple measures of regional gray matter, including gray matter volume, gray matter density, and cortical thickness. Neuroimaging allowed the researchers to derive several measures of human brain structure noninvasively.

Observing such measures during development allowed the researchers to study the brain at different ages to characterize how a child's brain differs from an adult's. "This novel characterization of brain development may help us better understand the relationship between brain structure and cognitive performance," Gennatas said.

"Our findings also emphasize the need to examine several measures of brain structure at the same time. Volume and cortical thickness have received the most attention in developmental studies in the past, but gray matter density may be as important for understanding how improved performance relates to brain development."

Further study is required to fully characterize the biological underpinnings of different MRI-derived measures by combining neuroimaging and brain histology. The study's findings in healthy people can also help researchers understand the effects of brain disorders in males and females as they evolve during adolescence.

Can hand amputation and reattachment affect the brain?

The effects of amputation on the brain may continue even as amputees recover sensory and motor functions in transplanted hands.

Changes in the brain can persist in individuals who receive hand transplants.

The amputation of a limb severs nerves that control sensation and movement, but also leads to dramatic changes in areas of the brain that controlled the functions of the lost limb. Researchers from the University of Missouri have found evidence of specific neurochemical changes associated with lower neuronal health in these brain regions. Further, they report that some of these changes in the brain may persist in individuals who receive hand transplants, despite their recovered hand function.

“When there is a sudden increase or decrease in stimulation that the brain receives, the function and structure of the brain begins to change,” said lead author Carmen M. Cirstea. “Using a noninvasive approach known as magnetic resonance spectroscopy (MRS) to examine areas of the brain previously involved with hand function, we observed the types of changes taking place at the neurochemical level after amputation, transplantation or reattachment.”

Cirstea, with co-author Scott Frey, used MRS to evaluate the neuronal health and function of nerve cells in current hand amputees, former amputees and healthy subjects. The researchers instructed volunteers to flex their fingers to activate sensorimotor areas in both sides of the brain. The research team then analyzed N-acetylaspartate (NAA) levels, a chemical associated with neuronal health. The researchers found that NAA values for the reattachment and transplant patients were similar to levels of amputees and significantly lower than those of the healthy control group.

Frey noted, “These findings show that after surgical repairs, the effects of nerve injuries on the mature brain may continue even as former amputees recover varying degrees of sensory and motor functions in replanted or transplanted hands.” Due to the small number of reattachment and transplant patients studied, the researchers said that the results should be interpreted with caution until more work is completed.

Thursday, May 25, 2017

Learning to read in adulthood transforms brain: study

The human brain can adapt and transform itself based on external stimuli well into adulthood, researchers have found.

A study of women in India who learned to read in their 30s shows the human brain's incredible capacity to reorganise and transform itself, researchers said today.

Researchers recruited women in India, a country with an illiteracy rate of around 39 percent, to see what they could learn about the areas of the brain devoted to reading.

At the start of the study, most of the women could not read a word of their mother tongue, Hindi.

But after six months of training, the women reached a level comparable to first-grade proficiency.

"This growth of knowledge is remarkable", said Falk Huettig from the Max Planck Institute for Psycholinguistics, lead author of the study in the journal Science Advances.

"While it is quite difficult for us to learn a new language, it appears to be much easier for us to learn to read. The adult brain proves to be astonishingly flexible."

Specifically, researchers found that the exterior of the brain - known as the cortex, which is able to adapt quickly to new challenges - was not the main area where transformation occurred.

Instead, researchers found that reorganisation took place deep inside the brain, particularly in the brainstem and thalamus, a walnut-sized structure that relays sensory and motor information.

"We observed that the so-called colliculi superiores, a part of the brainstem, and the pulvinar, located in the thalamus, adapt the timing of their activity patterns to those of the visual cortex," said co-author Michael Skeide, a scientific researcher at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig.

"These deep structures in the thalamus and brainstem help our visual cortex to filter important information from the flood of visual input even before we consciously perceive it." Researchers found that the more signals aligned in the affected brain regions, the better the women's reading skills became.

"These brain systems increasingly fine-tune their communication as learners become more and more proficient in reading," Skeide added.

"This could explain why experienced readers navigate more efficiently through a text." The finding could also have implications for the treatment of dyslexia, which some researchers have blamed on a malfunctioning thalamus.

"Since we found out that only a few months of reading training can modify the thalamus fundamentally, we have to scrutinise this hypothesis," Skeide said.

Co-authors on the study came from India's Centre of Bio-Medical Research (CBMR), Lucknow, and the University of Hyderabad.

No evidence that brain-stimulation technique boosts cognitive training

Transcranial direct-current stimulation (tDCS)--a non-invasive technique for applying electric current to areas of the brain--may be growing in popularity, but new research suggests that it probably does not add any meaningful benefit to cognitive training. The study is published in Psychological Science, a journal of the Association for Psychological Science.

"Our findings suggest that applying tDCS while older participants engaged in daily working memory training over four weeks did not result in improved cognitive ability," explains researcher Martin Lövdén of Karolinska Institutet and Stockholm University.

"The study is important because it addresses what has arguably been the most promising cognitive application of tDCS: the possibility of long-term cognitive enhancement from relatively limited practice on select cognitive tasks," Lövdén adds. "Cognitive enhancement is of interest not just to scientists, but also to the student studying for final exams, the gamer playing online games, and the retiree remembering which pills to take. Because of this large audience, it is of utmost importance to conduct systematic research to disentangle hype from fact."

Working memory--our capacity for holding information in mind at any given moment--underlies many fundamental cognitive processes and is linked with some aspects of intelligence. Research has shown that working memory training improves working memory performance but it's unclear whether this specific training can yield improvements to broader cognitive abilities.

Recent interest and publicity surrounding the potential effects of tDCS--which involves conducting a weak electrical current to the brain via electrodes on the scalp--led Lövdén and colleagues to wonder: Could using tDCS during cognitive training enhance brain plasticity and enable transfer from working memory to other cognitive processes?

The researchers enrolled 123 healthy adults who were between 65 and 75 years old in a 4-week training program. All participants completed a battery of cognitive tests, which included tasks that were incorporated in the training and tasks that were not, at the beginning of the study and again at the end. Those randomly assigned to the experimental group trained on tasks that targeted their ability to update mental representations and their ability to switch between different tasks and rules, while those in the active control group trained on tasks that focused on perceptual speed.

As they completed the training tasks, some participants received 25 minutes of tDCS current to the left dorsolateral prefrontal cortex, an area of the brain that plays a central role in working memory; other participants were led to believe they were receiving 25 minutes of current, when in actuality the current was active for a total of only 30 seconds.

Comparing participants' performance before and after training indicated that those who received working memory training did improve on the updating and switching tasks they had encountered during training and on similar tasks that they had not encountered previously.

But there was no evidence that tDCS produced any additional benefit to the working memory training--at the end of the study, participants who received tDCS did not show greater improvement than their peers.
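
A comparison like this is typically run on pre-to-post gain scores. The sketch below illustrates the idea with synthetic numbers and an independent-samples t-test — a generic analysis choice standing in for the authors' actual pipeline, with invented group sizes and score scales:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic pre/post working-memory scores for two groups of 30:
# both groups improve with training, and tDCS adds no extra benefit.
pre_tdcs = rng.normal(50, 8, 30)
post_tdcs = pre_tdcs + rng.normal(5, 4, 30)   # training gain only
pre_sham = rng.normal(50, 8, 30)
post_sham = pre_sham + rng.normal(5, 4, 30)   # same gain distribution

gain_tdcs = post_tdcs - pre_tdcs
gain_sham = post_sham - pre_sham

# Independent-samples t-test on the gain scores: did tDCS add anything?
t, p = stats.ttest_ind(gain_tdcs, gain_sham)
print(f"t = {t:.2f}, p = {p:.3f}")
```

With no tDCS effect built into the data, the test will usually return a large p-value — the same pattern as the study's null result, where both groups improved but the stimulated group improved no more than the sham group.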

When the researchers pooled the data from this study with findings from six other studies, they again found no evidence of any additional benefit from working memory training that was combined with tDCS.
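
Pooling estimates across studies like this is commonly done with inverse-variance (fixed-effect) weighting: each study's effect estimate is weighted by the reciprocal of its squared standard error. A minimal illustration with made-up effect sizes, not the actual values from the seven studies:

```python
import math

# Hypothetical per-study effect sizes (standardised mean difference,
# tDCS vs sham) and their standard errors -- illustrative numbers only.
effects = [0.10, -0.05, 0.02, 0.08, -0.12, 0.04, 0.00]
ses     = [0.20,  0.25, 0.18, 0.22,  0.30, 0.19, 0.21]

# Fixed-effect pooling: weight each study by 1 / SE^2.
weights = [1 / se**2 for se in ses]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

# 95% confidence interval for the pooled effect.
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled d = {pooled:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

Because the pooled standard error shrinks as studies accumulate, a confidence interval that still straddles zero after pooling — as it does here by construction — is stronger evidence of no meaningful effect than any single null study.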

Given strong public interest in cognitive enhancement, Lövdén and colleagues urge caution when it comes to this as-yet-unproven application of tDCS:

"A growing number of people in the general public, presumably inspired by such uninhibited optimism, are now using tDCS to perform better at work or in online gaming, and online communities offer advice on the purchase, fabrication, and use of tDCS devices," the researchers write. "Unsurprisingly, commercial exploitation is rapidly being developed to meet this new public demand for cognitive enhancement via tDCS, often without a single human trial to support the sellers' or manufacturers' claims."

"These findings highlight exactly how limited our knowledge is of the mechanisms underlying the potential effects of tDCS on human cognition and encourages the research community to take a step back and focus its resources on developing strategies for uncovering such mechanisms before using the technique in more applied settings," Lövdén concludes.

Co-authors on the study include Jonna Nilsson, Alexander V. Lebedev, and Anders Rydström of Karolinska Institutet and Stockholm University.

This research received funding from the European Research Council (ERC) under the European Union's Seventh Framework Programme (FP7/2007-2013) and ERC Grant No. 617280 -REBOOT. Martin Lövdén was also supported by a Distinguished Young Researchers grant from the Swedish Research Council (446-2013-7189).

The full article, "Direct-Current Stimulation Does Little to Improve the Outcome of Working Memory Training in Older Adults," is available online from Psychological Science.

Your Brain Is Trying to Show You The Future - And It Might Save Your Life

Our brains are pretty good at filling in the blanks when it comes to our sense of perception - often to the point that we have a mental movie of an entire event before it even finishes unfolding.

New research has shown this 'mind's eye' prediction of future motion occurs at a higher speed than in reality - a trait we could have evolved to compensate for our relatively sluggish sense of vision.

Unless you have a condition called aphantasia, which makes it impossible to summon up mental images, you'll be familiar with how your visual cortex builds imagined scenes in your mind.

Until now, most research on the imagery that arises in anticipation of an ongoing event - or "preplay" - has been conducted on animals. This new study takes a close look at what's going on in the visual cortex of humans.

Researchers from Radboud University in the Netherlands put 29 university students into a functional magnetic resonance imaging (fMRI) scanner to map their brain activity as they watched a white dot step across a screen.

Participants were asked to watch the same animation repeat 108 times over a number of short sessions. By the end, their brains were very well primed to know what to expect as the dot travelled left to right and right to left in about half a second.

Now that they had built up these expectations, the participants watched a random sequence of 24 'dot' movies. Some were just like the previous ones, with the dot moving across the screen, while others had the dots in the starting or ending positions only, plus a few 'oddball' trials delaying the final step in the movement sequence.

The entire experiment was conducted twice with each student, while another four volunteers acted as controls to rule out residual effects between the trials.

A series of fMRI scans was taken of their brains at ultra-fast speed to capture the blood flow in certain tissues.

As the volunteers watched the dots jump, a corresponding part of their visual cortex lit up with each step.

When shown just the starting dot, the same parts of the brain were activated, mentally completing the sequence in anticipation, though at twice the speed of the actual dot sequences.


That isn't to say we can put a number on how fast 'fast-forward' is, since the fMRI scanner can only take snapshots at a certain speed, even on ultra-fast.

But it does suggest that we have a way to quickly visualise relatively simple movements, such as a ball rushing at our head, in around half the time it would take for the event to actually occur.

Previous studies have estimated a need to look at an image for at least 150 milliseconds in order for our brains to capture enough information to make a judgement on what to look at next.

Then a study a few years ago found we can actually accomplish the task a lot faster - in just 13 milliseconds - at the risk of making some mistakes. That's quick, though processing lots of visual information quickly comes at a cost of chewing up more energy.

Nonetheless, it still means we're living up to a tenth of a second in the past, which could make all the difference between life and death.

It's possible that we evolved this ability to predict the future in fast-forward to save time and effort, helping us act sooner.

"Imagine you are standing at a road, a car is approaching and you need to decide: 'Do I cross, or do I wait for the car to pass first?'" lead researcher Matthias Ekman told MailOnline.

"Our study suggests that our visual system can fast-forward the trajectory of the car and thereby help us with our decision whether to wait or not."

The research calls into question the role of our visual cortex not just in perceiving current events, but in using past experiences to build perceptions of future ones.

"Thus, the notion of preplay processes in the visual system blurs the boundaries between memory and perception, and underscores the integrated nature of these two cognitive faculties," the researchers write in their report.

It pays to remain a little sceptical about conclusions linking higher blood flow across specific parts of the brain to those parts' roles in a cognitive task, especially while questions remain about how closely metabolism is linked with neural activity.

But if you've ever experienced the illusion of time standing still, you'll know our brains have some weird and remarkable tricks to cope with a fast paced world rushing past our eyes.

Wednesday, May 24, 2017

Researchers aim to explore role of physical activity on aging trajectories of the brain

Most people know that regular exercise can keep a body looking and feeling young.

What about the brain?

"There has been a wealth of evidence from past studies that physical activity has beneficial effects on neurocognitive functions, such as memory and regulatory control," says Mark Peterson, Ph.D., M.S., FACSM, assistant professor of physical medicine and rehabilitation at Michigan Medicine. "Essentially, those studies show that physical activity alters the brain's aging trajectories to preserve cognitive health."

Peterson and colleagues were recently awarded a two-year grant from the University of Michigan's Exercise & Sport Science Initiative to further examine the role physical activity plays on the brain. The grant is one of four recently awarded by the U-M initiative to study physical activity.

A shortage of comprehensive analyses propelled the new effort.

"Those previous studies did not examine the effects in a large cohort," Peterson says. "We're hoping this study fills that knowledge gap and can validate and extend the previous claims."

Peterson and his U-M colleagues in psychiatry, Chandra Sripada, M.D., Ph.D., and computer science, Jenna Wiens, Ph.D., will obtain and study the cohort and data from the United Kingdom Biobank.

"The U.K. Biobank is the world's largest prospective epidemiological study," Peterson says. "It gathers extensive questionnaires and physical and cognitive measures from 500,000 participants."

"We'll be incorporating deep-learning techniques to predict brain age from raw neuroimaging, and will examine the independent effects of objectively measured physical activity on brain age and cognitive function in the cohort," he adds.

Peterson, also a member of the U-M Global Research, Education and Collaboration in Health and Institute for Healthcare Policy and Innovation, spoke more about the study.

What made you decide to research this topic?

Peterson: There is a wealth of evidence that in middle-age and older individuals, physical activity has beneficial effects on neurocognitive functions (working memory, declarative memory, attention, etc.).

It has been conjectured that physical activity produces these effects by altering typical aging trajectories of critical brain circuits that underlie major cognitive functions. However, at this time, there have been no rigorous, large-scale studies to examine the effects of objectively measured physical activity on aging trajectories of the human brain.

What will be your focus?

Peterson: We expect to better understand the role of physical activity participation and dosage on deviations in brain health and cognitive function. It is now known that distinct brain circuits associated with specific neurocognitive functions exhibit distinct trajectories of change during aging.

Thus, we will generate maps of neurotypical change in order to assess how physical activity influences an individual's position along the expected aging trajectories. To generate these maps of neurotypical change, we will apply deep-learning methodologies.
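
In practice, a 'brain age' model of this kind is supervised regression: fit a model mapping imaging features to chronological age in a reference sample, then read off each person's 'brain-age gap' (predicted minus actual age) as a deviation from the neurotypical trajectory. Below is a minimal sketch using a plain linear model and synthetic features standing in for real neuroimaging data — the study itself plans deep-learning methods on UK Biobank data, which this does not reproduce:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "imaging features" for 1,000 people aged 45-80:
# each feature drifts linearly with age, plus measurement noise.
n_people, n_feat = 1000, 20
age = rng.uniform(45, 80, n_people)
slopes = rng.normal(0, 1, n_feat)
X = np.outer(age, slopes) + rng.normal(0, 10, (n_people, n_feat))

# Split into a normative training sample and a held-out test sample.
train, test = slice(0, 700), slice(700, None)

# Fit a linear brain-age model by ordinary least squares
# (with an intercept column); real pipelines use deep networks.
A = np.column_stack([np.ones(n_people), X])
coef, *_ = np.linalg.lstsq(A[train], age[train], rcond=None)

# Brain-age gap = predicted minus chronological age;
# a positive gap suggests "older-looking" brain features.
pred = A[test] @ coef
gap = pred - age[test]
print(f"mean absolute gap: {np.abs(gap).mean():.2f} years")
```

The gap, rather than the raw prediction, is the quantity of interest: the study's question is whether physically active people sit systematically on the "younger" side of the expected trajectory.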

How will your work differ from prior research efforts?

Peterson: Previous studies have used machine learning to predict brain age in various disease processes (e.g., Alzheimer's). However, this work will be the first ever to use deep-learning algorithms to better understand deviations in brain age by physical activity and functional profiles in otherwise healthy middle-aged adults.

Moreover, we will go substantially beyond previous studies in multiple ways:

  • The inclusion of participants from the world's largest epidemiologic study
  • Physical activity will be measured objectively (and not just by self-report)
  • Utilization of multimodal imaging (not just imaging of brain volume) 
  • Examination of circuit-specific aging trajectories 
  • Employment of advanced deep-learning methods for constructing brain age trajectories 
  • Analyzing complex relationships between physical activity, brain aging and cognitive functioning to gain evidence about potential causal pathways

What are you hoping to accomplish?

Peterson: Our proposed study is poised to decisively answer two critical unanswered questions: What is the effect of physical activity on circuit-specific brain aging, and does this effect mediate the effect of physical activity on improved cognitive functioning?

We anticipate that this work will lead to subsequent cross-disciplinary collaboration within U-M to secure federal funds for prospectively studying the effects of exercise for physical and cognitive health preservation or improvements in adults with and without deficits.

Who might benefit most from this research?

Peterson: This work is poised to inform public health and clinical audiences regarding the benefits of exercise and physical activity on brain health. Therefore, we expect that our findings will support and bolster the movement for integrating recommendations for exercise in clinical care.

Cambridge company ARM could restore limb movement with brain implants

ARM is looking to help people who have suffered a stroke or spinal cord injury.

ARM computer chips are found in over half the world's mobile devices.

The technology could help those with debilitating conditions like Parkinson’s and Alzheimer’s – and even enable those with paralysis to move again.

Cambridge technology giant ARM is working with an American engineering research centre to develop brain-implantable chips to tackle neurodegenerative diseases and help people who have suffered from a stroke or spinal cord injury.

The company has signed an agreement with the Center for Sensorimotor Neural Engineering (CSNE), based at the University of Washington, which wants to use its understanding of how the brain processes information to design implantable devices that can restore sensation and limb function, and even augment the brain’s natural healing power.

The aim of the 10-year project is to design a system-on-a-chip (SoC) for bi-directional brain-computer interfaces (BBCI). The implant will be designed to take neural signals representing movements that a person with a neurological condition or paralysis wants to make and direct them to a stimulator implanted in the spinal cord, enabling those movements to be carried out.

Further advances could enable the system to send information in the other direction, allowing the person to feel what their hand is touching, for example.

The approach could be used to enable those with artificial limbs to get feedback from them – so they could feel how tightly they are holding a loved one’s hand or how hot their cup of coffee is, for example.

And temporary implants could help individuals recover from strokes.

Dr Scott Ransom, the CSNE’s director of industry relations and innovation, said: “We are very excited to be collaborating with a company like ARM.

“ARM’s strong expertise in power-efficient microprocessors complements the CSNE’s work in computational neuroscience and brain-computer interfacing, and we expect the partnership to lead to advances in not only medical technology but other applications as well, such as consumer electronics.”

The system will decode signals in the brain and digitise them so they can be processed. But brain-implantable chips need to be very small and capable. Among the challenges is the power efficiency required and the heat generated.

ARM said: “Our industry-proven ARM Cortex-M0 processor, the smallest ARM processor available, will contribute to this very important area of research by being an integral part of the CSNE’s brain-implantable SoC.

“The project is a natural fit for ARM and our vision of improving lives around the globe by shaping a smarter, happier and healthier world with technology.

“Our ongoing goal of increasing the power-efficiency of ARM products aligns with CSNE’s advanced research work in developing low-power, efficient and implantable neural devices for medical applications.”

It is thought use of such a system could even help to coax brain neurons to rewire in ways that will help the brain recover from stroke.

While the concept sounds like science fiction, there have been some steps towards it already.

In March, researchers, including a team at Case Western Reserve University in Cleveland, Ohio, became the first to restore brain-controlled hand and arm motion in a person with complete paralysis.

Bill Kochevar, who was injured in a cycling accident, used thoughts to send messages from implants in his brain to 36 electrodes in his arm and hand, enabling him to feed himself.