How Behavior Is Shaped; Who's an Orchid, Who's a Dandelion
Researchers are making strides in understanding how genes work with the environment to shape behavior, adding a new twist to the age-old debate over whether nature or nurture is mostly responsible for how people develop.
They are finding that sensitivity to the environment resides in the biology of the nervous system. And some people, because of their genetic makeup and life experiences, are more sensitive to outside influences than others. Scientists point to a type they call orchids—people who wilt under poor conditions but flourish in supportive climes. Meanwhile, dandelions aren't much affected by the world around them, whether supportive or harsh.
Who Wilts, Who Flourishes
The interplay of the brain's biology and the environment makes some people more sensitive to outside influences than others, research shows. People more susceptible to external factors, both positive and negative, are called orchids. Others are hardy dandelions.
Orchids
Prone to anxiety, depression, aggression in difficult conditions
Trouble maintaining attention amid distractions
Act out more when parents fight, but pay more attention in school when home life is happy
Better students, more likely to share when praised
Poverty, marital conflict, negativity speed up puberty
Puberty starts later and progresses more slowly given a higher-quality relationship with parents
Meditation likely to reduce stress levels dramatically
Dandelions
Affected relatively little by adversity
Can focus amid distractions
Learning doesn't vary with home life
Sharing doesn't improve more when praised
Whatever the family life, puberty tends to proceed at about the same pace
Meditation cuts stress levels, but not as much as in orchids
Sources: Bruce Ellis, of the University of Arizona; Jay Belsky, of the University of California, Davis
Part of the difference stems from variation in genes like DRD4, which helps regulate a chemical in the brain called dopamine, a neurotransmitter that helps people experience pleasure and reward. Evidence suggests that people who produce less dopamine—the orchids—don't learn as well from negative feedback or in a distracting environment, but do perform well in a warm but strict setting.
About 30% of Caucasians could be called orchids as a result of this genetic variation in DRD4, one review of research on the subject has shown. Prevalence in other ethnicities is less well known.
Researchers say the most startling discovery is that while sensitive orchids are hurt by bad outside influences, they can benefit profoundly from positive environments. Children who acted out more and did worse in school than classmates while coping with fighting parents, for example, shared more and performed better than peers after an intervention to promote a happier home life, according to a 2010 study of 338 children in the journal Child Development.
"The very characteristics that were often thought of as children's greatest frailties can also be their greatest strengths," says Bruce Ellis, a University of Arizona professor of family studies and human development who helped coin the orchid and dandelion designations and develop the theory.
The most recent study, published in August in the Proceedings of the National Academy of Sciences, looked at the impact of the economy on mothers' parenting.
The study found that mothers with a particular genetic variation yelled, cursed and slapped their children more as the economy plunged during the 2008 downturn, yet parented less harshly than mothers without the variation when the economy was improving in the early 2000s.
Mothers with the sensitive kind of gene do parenting "worse when conditions are deteriorating," says Irwin Garfinkel, a professor at Columbia University's School of Social Work who helped author the study. "But those with the sensitive gene do better when conditions are improving."
The findings that only certain people may be sensitive to outside influences have triggered a spirited debate about how best to help troubled youths and adults. Some say treatment might need to be different for those identified as orchids than those who are dandelions.
Much is still unknown about the mechanics behind people's environmental susceptibility. It is likely that most people aren't either an orchid or a dandelion, but have the qualities of each to varying degrees.
Critics like Glenn Roisman, a professor at the University of Minnesota's Institute of Child Development, question the strength of the evidence implicating particular genetic hitches in environmental sensitivity and say more rigorous study is needed.
Dr. Roisman says the research must better distinguish how good or how bad outside influences need to be to have a significant effect, and whether a person's susceptibility is specific to certain factors.
"If you're an orchid, you may be an orchid susceptible to specific environmental circumstances," such as parenting but not peer pressure, Dr. Roisman says.
Jay Belsky, a University of California, Davis, professor of human development, was among those who pioneered the idea that certain people are developmentally malleable.
Researchers had long thought that childhood experiences shaped how people turned out later in life. Dr. Belsky figured it made evolutionary sense that some children would be more susceptible to early influences than others because the future is uncertain.
If the future turned out as anticipated, these developmentally malleable children would be in a great position to flourish because they wound up fitting the environment in which they found themselves. But if the future turned out differently than expected, these same kids would be mismatched, perhaps disastrously so.
To ensure survival over generations regardless of what the future brought, parents would have both orchid and dandelion offspring, Dr. Belsky thought.
Evidence fleshing out the biology behind the theory and supporting its validity began pouring in about five years ago, once the technology for parsing genetic data became more widely available to researchers.
Researcher Marinus Van IJzendoorn and colleagues at Leiden University in the Netherlands took a sample of 157 children at risk for aggression and disobedience. They swabbed the inside of the study subjects' cheeks and analyzed the cells to see who had a variation of DRD4, the dopamine-regulating gene.
At a laboratory, Dr. Van IJzendoorn filmed the study subjects' mothers working with their at-risk children. Half of the parents in the study were visited six times by a social worker who reviewed the video and discussed how to be warmer while setting limits more strictly; the other parents didn't receive such training. The mothers answered questionnaires designed to assess the children's behavior.
"We found clear-cut evidence" that the children with the DRD4 variant "were more open to the changes in their parents' behavior: These children who showed most aggressive behavior without the parent training, displayed least problem behaviors after the training," Dr. Van IJzendoorn said in an email. The study was published in 2008 in the journal Developmental Psychology.
In 2011, Dr. Van IJzendoorn and colleagues published in the journal Development and Psychopathology an analysis of 15 studies involving more than 1,200 children confirming the hypothesis that dopamine-system related genes mark a person's susceptibility to the environment.
Bed rest, once a key part of treating back pain, has a limited role in healing sore backs. In very small doses, bed rest can give you a break when standing or sitting causes severe pain. Too much may make back pain worse. Here is how to do bed rest “right.”
To get the most from staying in bed, limit the time you are lying down to a few hours at a time, and for no longer than a day or two. You can rest on a bed or sofa, in any comfortable position. To ease the strain on your back, try putting pillows under your head and between your knees when lying on your side, under your knees when lying on your back, or under your hips when lying on your stomach. These positions reduce forces that sitting or standing impose on the back — especially on the disks, ligaments, and muscles.
An extended period of bed rest isn’t helpful for moderate back strain at any stage of therapy. While your back may feel a little better in the short term, too much time in bed can trigger other problems. Muscles lose conditioning and tone, you may develop digestive issues such as constipation, and there is some risk of developing blood clots in the veins of your pelvis and legs. And being on prolonged bed rest does nothing for your mental health and sense of well-being. Depression, as well as an increased sense of physical weakness and malaise, is common among people confined to bed.
Is it okay to try to get active as quickly as possible? Well-designed clinical trials suggest that an early return to normal activities — with some rest as needed — is better than staying home from work for an extended period.
Brain fitness has basic principles: variety and curiosity. When anything you do becomes second nature, you need to make a change. If you can do the crossword puzzle in your sleep, it's time to move on to a new challenge in order to get the best workout for your brain. Curiosity about the world around you, how it works and how you can understand it, will keep your brain working fast and efficiently. Use the ideas below in your quest for mental fitness.
1. Play Games
Brain fitness programs and games are a wonderful way to tease and challenge your brain. Sudoku, crosswords and electronic games can all improve your brain's speed and memory. These games rely on logic, word skills, math and more. They are also fun. You'll benefit more by playing a little bit every day -- spend 15 minutes or so, not hours.
2. Meditation
Daily meditation is perhaps the single greatest thing you can do for your mind/body health. Meditation not only relaxes you, it gives your brain a workout. By creating a different mental state, you engage your brain in new and interesting ways while increasing your brain fitness.
3. Eat for Your Brain
Your brain needs you to eat healthy fats. Focus on fish oils from wild salmon, nuts such as walnuts, seeds such as flaxseed, and olive oil. Eat more of these foods and less saturated fat. Eliminate trans fats completely from your diet.
4. Tell Good Stories
Stories are a way that we solidify memories, interpret events and share moments. Practice telling your stories, both new and old, so that they are interesting, compelling and fun. Some basic storytelling techniques will go a long way in keeping people's interest both in you and in what you have to say.
5. Turn Off Your Television
The average person watches more than 4 hours of television every day. Television can stand in the way of relationships, life and more. Turn off your TV and spend more time living and exercising your mind and body.
6. Exercise Your Body To Exercise Your Brain
Physical exercise is great brain exercise too. By moving your body, your brain has to learn new muscle skills, estimate distance and practice balance. Choose a variety of exercises to challenge your brain.
7. Read Something Different
Books are portable, free from libraries and filled with infinite interesting characters, information and facts. Branch out from familiar reading topics. If you usually read history books, try a contemporary novel. Read foreign authors, the classics and random books. Not only will your brain get a workout by imagining different time periods, cultures and peoples, you will also have interesting stories to tell about your reading, what it makes you think of and the connections you draw between modern life and the words.
8. Learn a New Skill
Learning a new skill works multiple areas of the brain. Your memory comes into play, you learn new movements and you associate things differently. Reading Shakespeare, learning to cook and building an airplane out of toothpicks all will challenge your brain and give you something to think about.
9. Make Simple Changes
We love our routines. We have hobbies and pastimes that we could do for hours on end. But the more something is 'second nature,' the less our brains have to work to do it. To really help your brain stay young, challenge it. Change routes to the grocery store, use your opposite hand to open doors and eat dessert first. All this will force your brain to wake up from habits and pay attention again.
10. Train Your Brain
Brain training is becoming a trend. There are formal courses, websites and books with programs on how to train your brain to work better and faster. There is some research behind these programs, and the basic principles are memory, visualization and reasoning. Work on these three concepts every day and your brain will be ready for anything.
"Scarcity captures the mind. Just as [people who are hungry can only think about food,] when we experience scarcity of any kind, we become absorbed by it. The mind orients automatically, powerfully, toward unfulfilled needs. For the hungry that need is food. For the busy it might be a project that needs to be finished. For the cash-strapped it might be this month's rent payment; for the lonely, a lack of companionship. Scarcity is more than just the displeasure of having very little. It changes how we think. It imposes itself on our minds . . . Scarcity is not just a physical constraint. It is also a mind-set. When scarcity captures our attention, it changes how we think. By staying top of mind, it affects what we notice, how we weigh our choices, how we deliberate, and ultimately what we decide and how we behave. When we function under scarcity, we represent, manage, and deal with problems differently."—Sendhil Mullainathan and Eldar Shafir,Scarcity: Why Having Too Little Means So Much
Few environmental factors are as reliable as the 24-hour day, and an evolutionary argument can be made for why the diurnal rhythms of the Earth’s rotation are so coupled with human metabolism. Our behavior, our physiology, and our biochemistry reflect the daily cycles of the planet, and people who fall out of sync with these cycles are more likely to suffer from diabetes, obesity, and heart disease. Gastrointestinal disorders, depression, and other ailments are also more common among people who don’t have normal sleep habits. But according to new research, it’s not just disrupted sleep that can lead to these myriad physiological symptoms; it’s also the altered patterns of food consumption that go along with keeping such strange hours.
Shift workers who punch in in the evening have offered epidemiologists a glimpse into the importance of keeping normal sleep-wake patterns—that is, with activity coinciding with daylight. It’s been shown repeatedly that these employees are prone to developing metabolic disorders, and one review of the research concluded that night-shift workers are 40 percent more likely to develop cardiovascular disease.1
The mechanisms for these associations have been less clear, but a wealth of animal studies and emerging research on humans implicate the timing of eating as an important factor in maintaining energy balance and good health. In rodents, “simply restricting feeding to incorrect times has adverse consequences,” says Joe Bass of Northwestern University. Mouse studies have shown that a high-fat diet, freely available around the clock, will make the animals obese and unhealthy. But if mice are fed only at night—when these nocturnal animals are normally active—the untoward metabolic effects are drastically reduced, even though the animals consume the same number of calories.
Even less dramatic affronts to our normal circadian cycles may affect the way we process food. Earlier this year, Frank Scheer of Harvard Medical School and Marta Garaulet of Murcia University published the results from a study of 420 dieters in Spain. The participants had signed up for a weight-loss program, and the investigators tracked their eating habits. Half of the participants ate their main meal earlier in the day, before 3 p.m., while the other half ate later. Both groups followed a similar diet, exercised about the same amount, slept the same number of hours, and even produced similar levels of hunger-related hormones. Yet the early eaters lost weight faster and by the end of the study had shed a greater percentage of their body weight than the late eaters.2 “These data indicated that the timing of the main meal, which [for Spaniards] is lunch, predicted the success of weight loss,” says Scheer.
Scheer’s findings add to the growing recognition that our metabolisms are primed by the circadian machinery written in our genes, and that discord between the two can wreak havoc on our systems. According to Satchidananda Panda of the Salk Institute, “we are very different animals between the day and night.”
“That picture changed pretty rapidly in the late ’90s after the first clock genes were cloned,” says Joseph Takahashi, an investigator with the Howard Hughes Medical Institute and a professor at the University of Texas Southwestern Medical Center. Upon identifying the key genes that synchronize organisms’ behavior and bodily functions with the Earth’s rotation, Takahashi and others began finding clock genes expressed in nearly every tissue of the body. “That sort of threw everybody into kind of a quandary,” says Vincent Cassone, a biology professor at the University of Kentucky: Was the SCN really our primary pacemaker, or were cells throughout the body keeping their own time? The search was on to discover what these genes and the proteins they encode were doing outside of the brain. (See “Time and Temperature,” The Scientist, February 2011.)
Sure enough, researchers discovered that the SCN is not the body’s only timepiece. Additional oscillators in the peripheral tissues help adjust the daily rhythmic functions of organs. In the gut, for instance, intestinal motility and absorption differ depending on the time of day. Like all of the body’s clocks, these rhythms are guided by clock genes that operate in a transcriptional feedback loop. Transcription factors such as CLOCK and BMAL1 activate the expression of a large number of genes, including Period and Cryptochrome, whose proteins, in turn, inhibit CLOCK and BMAL1, causing daily oscillations in their expression.
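For readers who want to see the wiring, the loop can be caricatured in a few lines of code. The sketch below is written for illustration rather than drawn from any of the studies described here: it treats transcription, translation, and nuclear feedback as a minimal Goodwin-style oscillator, and every rate constant is invented, tuned only so that the repressor cycles on a roughly daily schedule.

```python
import numpy as np

# A Goodwin-style caricature of the clock's feedback loop: activators such
# as CLOCK/BMAL1 drive transcription of a clock gene (x = mRNA), the protein
# (y) matures into a nuclear repressor (z, standing in for PER/CRY), and the
# repressor shuts transcription back off. All rate constants are
# illustrative, not measured values.
def clock_loop(hours=480.0, dt=0.01, n=12, k=0.15):
    steps = int(hours / dt)
    x, y, z = 0.1, 0.1, 0.1              # mRNA, protein, nuclear repressor
    trace = np.empty(steps)
    for i in range(steps):
        dx = 1.0 / (1.0 + z**n) - k * x  # repressible transcription
        dy = k * x - k * y               # translation
        dz = k * y - k * z               # maturation / nuclear import
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        trace[i] = z
    return trace

dt = 0.01
z = clock_loop(dt=dt)[int(100 / dt):]    # drop the start-up transient
peaks = [i for i in range(1, len(z) - 1) if z[i - 1] < z[i] > z[i + 1]]
print(f"repressor peaks every ~{np.diff(peaks).mean() * dt:.1f} hours")
```

The point of the toy model is simply that a delayed negative-feedback loop of this shape oscillates on its own, which is why any tissue carrying these genes can keep time.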
Circadian clocks in the periphery are guided by the SCN, and all of the clocks are vulnerable to the influence of zeitgebers (from the German for “time giver”), environmental stimuli that tell the body what time it is. The SCN’s primary zeitgeber is light. Clocks of peripheral tissues, on the other hand, can take their cues from other inputs, such as food consumption.
In the mouse liver, for instance, about 300 different transcripts oscillate when mice are prohibited from eating. Give the animals access to food throughout the day and night, and the number of oscillating transcripts jumps to about 3,000. If you then consolidate the availability of food to 8 or 9 hours during the day—when mice should be sleeping—that number surges to 5,000.3 “This means that eating has a big effect,” says Panda.
Similarly, Cassone has shown that some of the rhythmically expressed genes driving the circadian clock in the mammalian gastrointestinal tract are sensitive to the timing of eating. Clock proteins in the colon peak in abundance at dramatically different times during a 24-hour cycle, depending on whether the animals eat throughout the day or during a restricted time period.4 Furthermore, animals with a dysfunctional master clock—those with a lesioned SCN, for example—can use food consumption as a way to get back on schedule. “If we give animals a timed feeding, the gastrointestinal system learns the time of day,” Cassone says.
Despite the seemingly strong influence of food intake on the body’s peripheral clocks, the SCN appears much less affected. Thus, researchers speculate that at the heart of the health problems seen in shift workers and in mice fed during their normal sleeping periods is an uncoupling of the SCN and the peripheral clocks. “We suspect that eating at the inappropriate time of the day ends up with peripheral clocks—in the liver, in fat, in the pancreas, in the muscle—being in a phase which is now different from the SCN,” says Georgios Paschos, a researcher at the University of Pennsylvania. “This, we think, can be the initiation of issues in energy homeostasis.”
Metabolism and the clock
Taking a closer look at the genes whose expression can be altered by mistimed eating, Panda has found effects on glucose metabolism, fatty acid synthesis and breakdown, cholesterol production, and liver function.5 He argues that some proteins require a period of fasting to operate properly. PhosphoCREB (pCREB), for example, regulates the release of glucose while animals are sleeping. “This [gene] should only be on during the day when the mice are fasting,” says Panda. Instead, in animals fed throughout the day and night, pCREB levels remain high, and the consequence is a high blood-sugar level. (See “Feeding Time,” The Scientist, January 2013.)
Indeed, studies that directly disturb peripheral tissue clocks by ablating clockwork genes yield dramatic metabolic problems. Paschos has found that knocking out Bmal1 in the fat cells of mice, for instance, leads to obesity and changes in the concentration of circulating polyunsaturated fatty acids. Additionally, Bass found that mice whose pancreatic clocks are knocked out by a mutant Bmal1 or Clock specific to the pancreas can’t produce insulin properly and develop diabetes.6 The animals maintained normal feeding rhythms and body weight, but they ended up with impaired glucose tolerance and decreased insulin secretion. “The clock is a very dominant regulator of gene expression in the pancreas, and that has a very big effect on function,” Bass says.
One striking example of metabolism’s marriage to the body’s clocks came to light about 25 years ago, when the University of Pennsylvania’s Mitch Lazar discovered Rev-erbα, a nuclear receptor that regulates gene expression through an epigenomic modulator, histone deacetylase 3 (HDAC3).7 In Lazar’s long quest to understand the role of Rev-erbα, he became fascinated by the remarkable circadian oscillation in its expression. In the case of the liver, “it’s almost like in a mouse every day the molecule gets knocked out by 5 a.m., and by 5 p.m. it’s one of the more highly expressed genes in the cell,” says Lazar.

From these studies it’s clear that the clocks in peripheral tissues—vulnerable as they are to the timing of eating—are vital to metabolism in the body’s organs. “I would say the clock is playing a very fundamental role regulating all metabolic pathways,” says Takahashi, “not just in organ systems, but at a cellular level.”
In 2011, Lazar’s team found that when they knocked out HDAC3 in the liver, they got “a really dramatic” result, Lazar says: the liver filled up with fat.8 The study provided a molecular explanation for what had been known for decades—that there is a circadian rhythm for lipid storage and synthesis. During sleeping periods, the body burns lipids, and during waking, the liver stores them up. HDAC3, which is highly expressed during the day when the rodents are sleeping, apparently helps mediate the use of lipids while the animals fast. When Rev-erbα and HDAC3 are shut down at night, when the animals are awake and presumably eating, glucose precursors are shunted towards lipid synthesis and storage. Later, when the animals are sleeping, they can reverse the process so that their livers make glucose for use by the rest of the body, Lazar says. He and his colleagues suspect that the circadian cycling of Rev-erbα and HDAC3 “is one of these protective mechanisms for allowing the liver to produce glucose at times when the mammal is not eating,” says Lazar.
Subsequent work in Panda’s lab, published last year, found that mice fed a high-fat diet throughout the day had blunted oscillations of Rev-erbα expression, as well as increased fat deposits in liver cells and markers of liver disease.5
Again, researchers suspect that the root of the problem is the asynchrony of the master clock of the SCN and the peripheral clocks in the liver, gut, pancreas, and other organs involved in metabolism. The brain may be getting the signal from one zeitgeber, light, that it’s time to sleep (and, say, burn lipids), says Lazar, while another zeitgeber, food, is telling the cell that it’s time to be active (and store lipids). “Now you’re going to be giving conflicting signals to that animal, and the net result could be dysregulating metabolism,” he says. “I think a lot of the pathology here, when we finally understand it, will be about dissonance between signals.”
Circadian metabolites
A METABOLIC CLOCK: Circadian function is married to metabolism through a variety of pathways, most notably by its relationship to the histone deacetylase SIRT1 and the metabolite it depends upon, NAD+. The well-known clock components CLOCK and BMAL1 initiate the expression of NAMPT (1), a key enzyme in the production of NAD+ (2). This contributes to the circadian-dependent availability of NAD+ and, in turn, the daily rhythm in the activity of SIRT1 (3). SIRT1 is not only involved in myriad cellular processes, including insulin secretion, gluconeogenesis, decreased adipogenesis, and mitochondrial biogenesis, but it can inhibit the activity of CLOCK as well (4).

Taken together, Panda’s and Lazar’s experiments show how the clock can influence metabolism and how eating can influence the clock. “It’s like a thermostat, almost, in that it’s maintaining timing, but it can be adjusted according to the energy environment,” says Bass.
Now, the question is: What’s mediating that feedback? Research by Paolo Sassone-Corsi of the University of California, Irvine, and others has exposed the intimate links between energy metabolites and circadian clock function, which could explain how food signals are translated into time.
In 2006, Sassone-Corsi’s group discovered that CLOCK itself is a histone acetyltransferase, which adds acetyl groups to histones. The corresponding deacetylase, SIRT1, can remove acetyl groups from histones and other proteins, including BMAL1. As part of these discoveries, Sassone-Corsi found that SIRT1’s function requires NAD+ (nicotinamide adenine dinucleotide), an energy metabolite. “That was the moment where I realized it’s a molecular link between the clock system and epigenetics and metabolism,” he says.
NAD+ itself cycles in a circadian rhythm. Sassone-Corsi’s group, concurrently with Takahashi, Bass, and their colleagues, showed in a pair of 2009 papers that the clock system controls NAMPT, an enzyme that catalyzes a rate-limiting step in the production of NAD+.9,10 “It’s a perfect example” of how inseparably metabolism and the clock function, says Sassone-Corsi.
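To make that logic concrete, here is a back-of-the-envelope sketch (an illustration, not the researchers’ model): NAD+ is made to rise and fall over the day, as clock-driven NAMPT would have it, and SIRT1 activity is treated as a simple saturating function of NAD+ availability. Every number in it is invented; the point is only that a rhythmic metabolite yields rhythmic enzyme activity, while a blunted rhythm yields none.

```python
import numpy as np

# Toy version of the feedback pathway described above: clock-driven NAMPT
# makes NAD+ cycle over the day, and SIRT1 activity follows NAD+ with
# Michaelis-Menten saturation. Concentrations and Km are made up.
t = np.linspace(0.0, 24.0, 97)                        # hours across one day
nad_rhythmic = 50 + 30 * np.sin(2 * np.pi * t / 24)   # clock-driven NAD+
nad_flat = np.full_like(t, 50.0)                      # rhythm blunted, e.g.
                                                      # by round-the-clock feeding

def sirt1_activity(nad, km=100.0):
    """Relative SIRT1 deacetylase activity as a saturating function of NAD+."""
    return nad / (km + nad)

for label, nad in [("rhythmic NAD+", nad_rhythmic), ("flat NAD+", nad_flat)]:
    act = sirt1_activity(nad)
    print(f"{label}: SIRT1 activity swings {act.min():.2f}-{act.max():.2f}")
```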
Acetyl-CoA, a metabolite vital to the energy balance within cells, also appears to be intimately intertwined with the circadian clock. Preliminary results from Sassone-Corsi’s lab suggest that acetyl-CoA synthase 1, the enzyme that regulates acetyl-CoA’s production, is itself activated by circadian acetylation. That’s because SIRT1 is the deacetylase of acetyl-CoA synthase 1, and SIRT1’s activity, again, depends on the metabolite NAD+.
The intertwining of metabolites and circadian clockwork is likely extensive. Sassone-Corsi and his colleagues have since found that of about 600 metabolites in the liver, more than half oscillate in a clock-controlled manner. He and his colleagues have developed an online resource, called Circadiomics, to catalog metabolites that have a circadian rhythm in the liver, and they plan to expand their database to the muscle. His group is also now exposing animals to various diets to see how networks of cellular pathways affiliated with a particular metabolite are affected.
Overall, his research and others’ have revealed the ubiquitous and complex interplay of regulation and feedback between metabolism and the clock. “The clock controls metabolites, and then metabolites feed back on the clock system,” says Sassone-Corsi. How this interplay is affected by different diet regimes remains to be seen.
It’s possible, then, that presenting food at times when the genome is hunkered down for fasting and energy storage might lead to weight gain and metabolic disorders. Lazar says the experiment has yet to be done to connect the dots between inappropriate food timing, epigenetic activity dysregulated by the clock, and metabolic diseases. But humans, particularly those in developed countries with abundant artificial light, late-night TV, and 24-hour diners, have been putting themselves through an inadvertent experiment over the last few decades. No longer does daylight dictate the times when we eat. “That is the cycle that has gone wrong in the last 50 years,” says Panda.
With caution and caveats, one could speculate that this is, in part, why obesity and metabolic disorders have escalated to epidemic levels, particularly when mistimed eating is coupled with a high-fat, high-carbohydrate diet. It stands to reason that our metabolic functions, as controlled by the circadian clock, evolved to cycle in harmony with the Earth’s daily rhythms, to optimize processes such as energy use and storage. In doing so, we became adapted to eat during the daytime, and maladapted for eating at night. Opposing these rhythms, as many of us now do, may challenge our bodies’ normal cycles and set us up for disease. “Like many evolutionary arguments, it’s hard to prove,” says Lazar. “But otherwise it’s hard to imagine why else we would need things so tightly linked to the Earth’s rotation.”
Eat too many calories and expend too few, and you will become obese and sickly. This is the conventional wisdom. But increasingly, it looks too simplistic. All calories do not seem to be created equal, and the way the body processes the same calories may vary dramatically from one person to the next.
This is the intriguing suggestion from the latest research into metabolic syndrome, the nasty clique that includes high blood pressure, high blood sugar, unbalanced cholesterol and, of course, obesity. This uniquely modern scourge has swept across America, where obesity rates are notoriously high. But it is also doing damage from Mexico to South Africa and India, raising levels of disease and pushing up health costs.
Metabolic syndrome can still be blamed on eating too much and exercising too little. But it is crucial to understand why some foods are particularly harmful and why some people gain more weight than others. Thankfully, researchers are beginning to offer explanations in a series of recent papers.
One debate concerns the villainy of glucose, which is found in starches, and fructose, found in fruits, table sugar and, not surprisingly, high-fructose corn syrup. Diets with a high “glycaemic index”, raising glucose levels in the blood, seem to promote metabolic problems. David Ludwig of Boston Children’s Hospital has shown that those on a diet with a low glycaemic index experience metabolic changes that help them keep weight off, compared with those fed a low-fat diet. This challenges the notion that a calorie is a calorie. Others, however, blame fructose, which seems to promote obesity and insulin resistance. Now a study published in Nature Communications by Richard Johnson, of the University of Colorado, explains that glucose may do its harm, in part, through its conversion to fructose.
Dr Johnson and his colleagues administered a diet of water and glucose to three types of mice. One group acted as a control and two others lacked enzymes that help the body process fructose. The normal mice developed a fatty liver and became resistant to insulin. The others were protected. The body’s conversion of glucose to fructose, therefore, seems to help spur metabolic woes.
You are what you eat, maybe
Even more intriguing is the notion that the same diet may be treated differently by different people. Four recent papers explored this theme. In one, published in Science in July, Joseph Majzoub, also of Boston Children’s Hospital, deleted in mice a gene called Mrap2, which he and his colleagues showed helps to control appetite. Surprisingly, however, even when the mutant critters ate the same amount as normal mice, they still gained more weight. Why that is remains unclear, but it may be through Mrap2’s effect on another gene, called Mc4r, which is known to be involved in weight gain.
The second and third papers, published as a pair in Nature in August, looked at another way that different bodies metabolise the same diet. Both studies were overseen by Dusko Ehrlich of the National Institute of Agricultural Research in France. One examined bacteria in nearly 300 Danish participants and found those with more diverse microbiota in their gut showed fewer signs of metabolic syndrome, including obesity and insulin resistance. The other study put 49 overweight participants on a high-fibre diet. Those who began with fewer bacterial species saw an increase in bacterial diversity and an improvement in metabolic indicators. This was not the case for those who already had a diverse microbiome, even when fed the same diet.
Jeffrey Gordon, of Washington University in St Louis, says these two studies point to the importance of what he calls “job vacancies” in the microbiota of the obese. Fed the proper diet, a person with more vacancies may see the jobs filled by helpful bacteria. In the fourth paper, by Dr Gordon and recently published in Science, he explores this in mice. To control for the effects of genetics, Dr Gordon found four pairs of human twins, with one twin obese and the other lean. He collected their stool, then transferred the twins’ bacteria to sets of mice. Fed an identical diet, the mice with bacteria from an obese twin became obese, whereas mice with bacteria from a thin twin remained lean.
Dr Gordon then tested what would happen when mice with different bacteria were housed together—mouse droppings help to transfer bacteria. Bacteria from the lean mice made their way to the mice with the obese twin’s bacteria, preventing those mice from gaining weight and developing other metabolic abnormalities. But the phenomenon did not work in reverse, probably because the lean twin’s more diverse microbiota left few vacancies for invaders, consistent with Dr Gordon’s theory. Interestingly, the invasion did not occur, and obesity was not prevented, when the mice ate a diet high in fat and low in fruits and vegetables. The transfer of helpful bacteria therefore seems to depend on diet.
Dr Gordon hopes to be able to identify specific bacteria that might, eventually, be isolated and used as a treatment for obesity. For now, however, he and other researchers are exposing a complex interplay of factors.
One type of calorie may be metabolised differently than another. But the effect of a particular diet depends on a person’s genes and bacteria. And that person’s bacteria are determined in part by his diet. Metabolic syndrome, it seems, hinges on an intricate relationship between food, bacteria and genetics. Understand it, and researchers will illuminate one of modernity’s most common ailments.
Sit-ups once ruled as the way to tighter abs and a slimmer waistline, while “planks” were merely flooring. Now planks — exercises in which you assume a position and hold it — are the gold standard for working your core, while classic sit-ups and crunches have fallen out of favor. Why the shift?
One reason is that sit-ups are hard on your back — by pushing your curved spine against the floor and by working your hip flexors, the muscles that run from the thighs to the lumbar vertebrae in the lower back. When hip flexors are too strong or too tight, they tug on the lower spine, which can be a source of lower-back discomfort.
Second, planks recruit a better balance of muscles on the front, sides, and back of the body during exercise than sit-ups, which target just a few muscles. Remember, your core goes far beyond your abdominal muscles.
Finally, activities of daily living, as well as sports and recreational activities, call on your muscles to work together, not in isolation. Sit-ups or crunches strengthen just a few muscle groups. Through dynamic patterns of movement, a good core workout helps strengthen the entire set of core muscles — the muscles you rely on for daily activities as well as sports and recreational activities.
Core workout can cause muscle soreness
Many popular workouts that aim to strengthen your arms, legs, and abs give short shrift to many of the muscles that form your body’s core (the group of muscles that form the sturdy central link connecting your upper and lower body). Strong core muscles are essential to improving performance in almost any sport — and are the secret to sidestepping debilitating back pain.
If you haven’t been working your core muscles regularly — or if you challenge yourself with a new set of exercises — expect to feel a little soreness as you get used to your new routine.
Extremely sore muscles a day or two after a core workout means you probably overdid it and might need to dial down your workout a bit. Next time, try to finish just one full set of each exercise in the workout. You might also do fewer repetitions (reps) of the exercises you find especially hard. Once you can do reps without much soreness, build strength by adding one more rep of the harder exercises in each session until you’re doing the full number of reps comfortably. Then try adding a second set.
If your muscles feel really sore within 24 to 48 hours of adding a burst of core work, cut back on the number of reps. For example, say you are doing planks, the modern alternative to sit-ups. Instead of trying to do four front planks a day, start with one. Stick with that for a few days, then add a second plank. When you’re comfortable at that level — that is, not feeling a lot of muscle soreness — add a third plank. And so on. If even one plank knocks you out, cut back on how long you hold it: instead of 30 seconds, try 10 seconds for several days, then try 15 or 20 seconds, and so on.
Delayed-onset muscle soreness is a normal response to working your muscles. Usually, it peaks 24 to 48 hours after a workout before gradually easing, then disappearing entirely in another day or so. But if you experience sudden, sharp, or long-lasting pain, check with your doctor.
In the war on distraction, a new long-term study of disrupted attention, multitasking, and aging shows dramatic results in improving working memory for older participants through use of an online game
Begin with a Twist
Let’s start with the twist: A specially designed video game helped reverse signs of aging in the brains of players in their 60s and 70s. So, even though competing claims on our attention, including from all those devices that bleep and burp and screech, often swamp our ability to focus, evidence from a major study to be published tomorrow indicates that training on a video game not only improved the ability to stay on task but also shored up short-term memory in aging adults. You may have to read the latter half of that last sentence over again if your email flashed in the background while you skimmed, texts pinged through to your cell phone while you absorbed this new information, and the television erupted with the sound of shelling in Syria while you wondered if you should read on. These are among the wages of rapid-fire disruptions that frequently hobble cognitive functioning in so-called “normal aging.”
Decline in our ability to filter out distraction and focus attention, unfortunately, begins not in middle age but rather in our 20s. Ongoing research on memory over the past five years in the laboratory of Dr. Adam Gazzaley at the University of California-San Francisco has identified underlying neural mechanisms that characterize this process of decline. The connections between paying attention, filtering out interference, and remembering are critical because it’s obviously far more difficult to retrieve something never properly imprinted in the first place. Here’s one of the ways we get derailed: Even that casual mention of unfolding catastrophe in Syria in the last paragraph may set off a distracting internal dialogue as other manifestations of external distraction tug on your attention. In that case, it may be hard to accept the idea that relatively modest use of an immersive, even fun, video game will help in the war on disruption. Stick with me, here. Maybe you could turn down the volume on a couple of the competing channels? A little more focus, please.
The Results
Here’s the breaking news. In a major, long-term, well-controlled study, published tomorrow on the cover of the prestigious journal Nature, researchers in Dr. Gazzaley’s laboratory show that modest exposure to NeuroRacer, the customized video training game, helped participants improve their ability to screen out distraction and stay on task. These are the essential building blocks for successful multitasking. Three linked experiments in the Gazzaley Lab involved 174 subjects grouped evenly along the age continuum, from 20 to 85 years old. Effects of the training were robust, showing notable improvement in the ability of older players to keep on task. Study subjects demonstrated markedly better scores after just 12 hour-long training sessions, taken three times a week. One 79-year-old participant even outscored members of a control group made up of 20-somethings who did not engage in training.
The experiments reported on this week are the latest from the research program of Dr. Gazzaley, a physician and noted brain imager who set out, a decade ago, to create interventions to reverse the tide of aging in the brain. The neuroscientist explains that he’s out to reinforce “top-down modulation” in human cognition, which means strengthening the sense of greater control over cognitive power in aging adults. What’s bound to draw the most attention is the claim that playing the game spilled over into a more general shoring up of cognitive abilities.
Most significantly, monitoring through electroencephalography (EEG) showed evidence of markedly increased activity of a particular kind in the prefrontal cortex, part of the brain responsible for cognitive control. Before you speed out to scoop up a raft of commercial video games, though, there’s a caveat from the lead author on the study, Joaquin Anguera. “Video games aren't a panacea,” he told me. “Playing Medal of Honor is not going to solve all your problems. The game we designed, NeuroRacer, was sculpted to target a specific ability.” In other words, the neuroscientists built a game designed specifically to promote “interference-resisting abilities,” and they’d gathered evidence of both targeted and larger gains.
The Controversy Over Brain Training
Findings in the study by Anguera, Gazzaley, and nine other colleagues at UCSF build on the work of others in laboratories around the world on the dynamic neuroscientists call transfer. That’s the contested idea that brain training in a specific area can yield more general benefits in cognitive functioning. The study adds another layer to accumulating evidence that video training and other interventions can, under certain circumstances, shore up not only performance in a narrow channel – doing crossword puzzles, remembering larger blocks of numbers, and so on – but also general thinking capacities.
An earlier study, also published in Nature back in 2010, dumped all over the idea that brain training games yielded anything more than the ability to improve performance on narrow grounds. The publicity the earlier study generated left the impression that transfer was quite rare, perhaps even illusory. In fact, “Game Over for Brain Training” was the headline on a video about the 2010 study of 11,500 participants, still posted on the website of neuroscientist Adrian Owen, who oversaw the earlier study evaluating claims of beneficial effects of other video games.
“The holy grail of brain training is that they will transfer to other brain functions,” he explained on a television show, broadcast by BBC, built around the experiment as it unfolded. “We all know that if you practice something you will get good at it. There’s no question that if you practice on a brain training game you’ll get better at that brain training game. The question we were trying to get at is: Does it lead to any general benefit in other areas of life, in other aspects of cognitive function?” In clarifying the point, Owen added, “If you want to get better at playing the violin, practice the violin. But you’re not going to get any better at playing the trumpet by practicing the violin.”
The conclusion that brain training had no transfer effect, and the publicity it generated around the world, only fueled rising skepticism about the outsize claims of those promoting everything from supplements to quick-fix gimmicks in the multimillion-dollar enterprise aimed at the burgeoning population of aging adults concerned about fading memory. In an overall review of those claims, the Stanford Center on Longevity previously warned that the claims behind an exploitative hard sell for products from computer training to supplements ranged “from reasonable though untested to blatantly false.” There were few provable links between discrete activities and broader benefits, the review concluded. In other words, taking your vitamins or doing a variety of brain training exercises was unlikely to help you retrieve the name of your boss’s partner, painfully tripping along the tip of your tongue.
In the current report in Nature, however, 11 researchers in Gazzaley’s laboratory presented persuasive evidence of transfer. They ventured one step further, detailing the underlying neural mechanisms at play. “… (A)ge-related deficits in neural signatures of cognitive control, as measured with electroencephalography were remediated by multitasking training (ie. enhanced midline frontal theta power and frontal-posterior theta coherence),” they write. Use of the specially-designed game, in other words, markedly improved performance of older adults not merely on the game itself but also led to robust increases in activity in those parts of the brain’s prefrontal cortex associated with greater cognitive control.
“We got the whole story this time,” Dr. Gazzaley told me, sounding rather triumphant, when I reached him on the phone in mid-August as he was going through final edits of the article. “We demonstrated transfer from the training to other types of cognitive abilities and showed that the multitasking improvements on the game were sustainable.” He felt especially chuffed that older adults who participated in a version of the game that required multitasking were also the ones who demonstrated heightened ability in fighting distraction and sustaining their attention.
In an email several days later, he summed up the significance of this breakthrough: “What we have here is a link between neural plasticity and behavioral plasticity, pointing to a neural basis of transfer effects…Transfer has become the holy grail for training studies, but it is not a magic trick. Our data suggest that there must be a common neural mechanism of cognitive control that underlies working memory, sustained attention and multitasking and we put pressure on it with our game.”
This was a point fleshed out by the lead author of the study, Joaquin Anguera. “Many groups have found transfer to other abilities, other groups have shown neural changes following training, but we are the first to show both – and that they are correlated with each other,” he said. “That’s something no other cognitive training study has previously shown, ever – hey, with video game training!” The next step, he said, would be refining NeuroRacer, fleshing out the role of the prefrontal cortex, and devising interventions that even more effectively reverse the “costs” of distraction for older adults. If they can succeed in these goals, the implications for helping people navigate the disruptiveness of modern life would be immense.
Backdrop
The current study builds on earlier research by Gazzaley, a physician and neurologist at the Mission Bay campus of UCSF. He trained originally in an MD/PhD program at Mt. Sinai Medical Center, where he focused on the patterns of cognitive decline in aged monkeys. Monkeys, like humans, suffer from memory loss as they age, but unlike humans they don’t contract Alzheimer’s disease, and that set him on the search for solutions to commonly experienced cognitive slippage. When he moved to a medical residency in clinical neurology at the University of Pennsylvania, Gazzaley regularly conducted extensive histories on aging patients, exploring the common but often quite troubling lapses in cognition related to so-called “normal aging.” He routinely asked his patients, “Have you noticed any change in your thinking abilities?” and used the responses to shape his research goals.
In a postdoctoral fellowship in the laboratory of Dr. Mark D’Esposito, a renowned expert on working memory at U.C. Berkeley, Gazzaley was perhaps best known as part of the team that identified separate neural markers for focusing attention and filtering out distraction – two functions once thought to be a single, indivisible process. “Not two sides of the same coin, but different coins,” the neuroscientist said, in describing this breakthrough. Followup work detailed the connection between paying attention and simultaneously warding off interruptions. The current study builds on this earlier finding, which launched Gazzaley on his quest to identify the underlying neural mechanisms controlling what we remember and how we forget.
This focus on the underlying mechanics of brain systems is the aspect of the research program that his one-time mentor, Mark D’Esposito, values most. “To me, too much emphasis is being placed on the mode of training (video games, therapists, traditional cognitive training, etc.),” he wrote me, after reviewing the new study. Instead, the monitoring of brain activity that will allow researchers to use “brain systems (rather than the cognitive symptoms) to develop cognitive interventions” held the greatest promise, according to D’Esposito.
The Game
NeuroRacer itself is a variant of a driving-while-distracted interactive experience. Back in the spring of 2011, mid-way through the current study, I spent several weeks embedded with researchers and participants because I wanted to follow an experiment – no matter its results – as a major neuroscience effort of this kind unfolded over time. I followed participants through steps of the study, watched as the experiment continued, sat in on evaluation sessions as preliminary results tumbled in, and even tried my hand at the game. The game, developed with the help of designers at LucasFilms, seemed visually rich. It provided an immersive experience, certainly more fun than testing programs purposely stripped down to their most basic elements. The colors were appealing and the game had the feel of something dynamic and interesting, even if the action doesn't include rapid-fire explosions or the need to throttle or shoot at anybody.
It was this immersive feature of NeuroRacer that most impressed other researchers in the field. “It’s the first real attempt to meaningfully harness the power of games in a dedicated cognitive enhancement tool,” commented C. Shawn Green, a neuroscientist at the University of Wisconsin-Madison and an expert on brain training. “Many of the ‘brain trainers’ out there are really nothing more than slightly dressed-up versions of classic psychology paradigms,” Green went on. “You’re simply not going to create a useful paradigm just by labeling something a ‘video game’ (and maybe having a few stars appear when people do well). The game used here is reasonably playable as a real game, but also tractable as a research tool, thus allowing the authors to ask solid questions about the mechanics underlying the behavioral and neural changes.”
For users, the game relies on familiar conventions. When the screen lights up, the road reveals itself. On my first run at NeuroRacer, done sitting behind a computer in a darkened lab, I thought: Piece of cake, what idiot couldn’t master this? Steering with a control operated by my right hand, I managed to hold my sporty auto avatar in its lane for several seconds. Then, unfortunately, the road kept slipping away from me, winding around curves, and unexpectedly rising and falling in elevation. The multitasking version of the game required me to drive and, simultaneously, respond accurately through a button on the left to signs that popped up at random on the screen.
Designers of NeuroRacer used an “adaptive staircase algorithm” to ratchet up the level of difficulty, meaning that each time a participant improved enough to respond accurately at least 80 percent of the time, the challenge increased incrementally. This component of the research is the study’s “special sauce,” Gazzaley told me. The adaptation allows participants of different ages and abilities to start at an equivalent level of comfort and provides a basis for valid comparison across age groups.
“It leads to continuous challenge over the training period,” he explained. “Unlike the real world, where as you get better at something, it gets easier – here, as you get better it gets harder.” In effect, Gazzaley and his co-inventors had produced a more useful training program simply by paying closer attention to the fundamental principles of gaming. The algorithm “adjusts the game to keep you in the zone,” Gazzaley went on. “Game designers call it ‘flow.’”
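The staircase logic itself is simple enough to fit in a dozen lines of code. The sketch below illustrates the general idea rather than NeuroRacer’s actual implementation: the 80 percent accuracy target comes from the study’s description, while the level scale and step size are invented.

```python
# A generic adaptive staircase: after each block of trials, difficulty steps
# up when accuracy reaches the target and steps down when it falls short,
# keeping every player near their own limit. Levels and steps are invented.
def adjust_difficulty(level, hits, trials, target=0.80, step=1, lo=1, hi=100):
    """Return the next difficulty level given performance on the last block."""
    accuracy = hits / trials
    if accuracy >= target:
        level += step                    # player is comfortable: make it harder
    else:
        level -= step                    # player is struggling: ease off
    return max(lo, min(hi, level))       # clamp to the allowed range

# Example: a player who identifies 17 of 20 signs moves up a level...
level = adjust_difficulty(level=12, hits=17, trials=20)      # -> 13
# ...and drops back down after a rough block.
level = adjust_difficulty(level=level, hits=13, trials=20)   # -> 12
```

Because the rule pushes everyone toward the same accuracy target, a 25-year-old and a 75-year-old each play at the edge of their own ability, which is what makes their scores comparable.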
Trial Runs
My own experience with the game involved few periods of happy flow. My performance added up to a series of rather spectacular crashes and disheartening scores on sign identification. In brief, I flunked multitasking in this particular form. Upright, behind the computer in a darkened room, I felt a little like Charlie Chaplin in Modern Times, out of sync with the machinery. In the second stage of the experiment, I also played NeuroRacer while prone inside a tube and being monitored by functional Magnetic Resonance Imaging (fMRI). Prone, in the tube, I felt more like a monkey in a cage, flailing around ineffectively and mostly blurting out, “Oh, crap – missed again.”
When I met with him in his office the next day, Gazzaley excused these awful initial scores as just one more piece of mounting evidence about the grand generational mystery: why even healthy adults suffer progressively greater deficits in responding to competing claims on attention. Older adults typically experience more difficulty in switching between tasks, partly because of what Gazzaley called “stickiness of perception.” This stickiness means that, as we age, we leave multiple channels of perception and attention cracked open when we switch from doing one thing to another.
“Older adults appear to have a problem in re-engaging, but also in disengaging,” Gazzaley explained. “When you switch tasks, you’re basically going from one complex network to another. If that transition is sluggish from disuse or changes in the brain – in the neurochemistry – you’ll feel the consequences.” Limiting the costs of disengagement, and easing the transition to re-engagement, is the aim of ongoing research in his laboratory. If you bring down the costs of distraction, Gazzaley has argued, you could bolster the capacity to sustain attention, stay on task, and increase cognitive performance. This was part of what he’d set out to prove with NeuroRacer.
Next Steps
Next steps in the laboratory center on a series of ongoing experiments intended to flesh out a more complete understanding of the dynamics underlying those transitions – engagement, disengagement, and re-engagement – that end up smoothing, or hampering, the ability to switch tasks. Through follow-up studies, Gazzaley’s colleagues are already on the hunt for varied ways of shoring up working memory, not only for aging adults but younger people as well. Gazzaley speculates that effective treatments to enhance brainpower will involve turning down the intensity of signaling from the so-called “default network” in the prefrontal cortex in addition to amplifying activation in the midline theta area.
He suggested that the current study will also prove relevant for those seeking new treatments for disease, including Alzheimer’s, dementia, depression, and ADHD. “All complex systems are susceptible to noise, or interference, and our brains are the most complex system we are aware of. And so, interference through distraction and interruption is a major vulnerability of our brains,” he wrote to me. “If you have something wrong with your brain, then you are likely to have a greater susceptibility to interference compared to healthy individuals.”
In other words, discovery of underlying mechanics that make healthy aging adults feel as though they’re slipping mental gears should translate into methods for shoring up cognitive functioning even in people with diagnosable illnesses. “Those conditions – ADHD, depression and dementia – all involve trouble with interference and general deficiencies in cognitive control,” Gazzaley went on. “And so it’s reasonable to hypothesize that this intervention may also be effective in improving their cognitive control abilities by challenging them with such a high-interference game.”
Relevance to diagnosable illnesses, rather than the common complaints of healthy adults, will have to be tested, of course. Torkel Klingberg, an expert in working memory at the Karolinska Institute, cautioned that independent studies will be required “to confirm the effects, and also investigate the effect in clinical samples (eg. ADHD or dementia).” Klingberg thought the most significant contribution of the current study was its role in providing “a substantial rebuttal of the view that the cognitive capacity of an individual is a fixed trait that cannot be affected by environment or training.” The challenge now would be to design interventions with robust results in improving the daily lives of aging adults.
Other researchers in the field praised the approach and quality of the study. “The distinctive contribution of the current study was that it opened a unique window into the mechanisms underlying multitasking, and more generally, cognitive control,” wrote Bornali Kundu, lead author of an influential recent study on attention and short-term memory in the Journal of Neuroscience. She pointed out that the benefits of currently available treatments are limited by the narrow scope of their effects. “So far we treat such disorders with drugs that target one or more neurotransmitters,” Kundu said. “By training the particular set of neurons (and by default also regulating the relevant neurochemistry) we are specifically treating the patient’s ‘problematic’ brain circuit, however complicated that circuit may be.”
Arthur Kramer, director of the Beckman Institute for Advanced Science and Technology at the University of Illinois, credited the researchers with making considerable strides in demonstrating how training, transfer and retention of information occur. He felt the study should be replicated with larger numbers of participants with more varied skills, and that there was much more to be learned by fleshing out the effects of transfer through more detailed batteries of tests before and after the training. The ultimate test that matters, he added, was whether “the training effects transfer to real-world skills and behaviors, e.g. safely driving an automobile, maintaining independence as we age…”
Four new prototypes based on the NeuroRacer game are under development in the lab, and a souped-up version of NeuroRacer is also being worked on at Akili Interactive Labs, a company spun off from Gazzaley’s laboratory several years ago. Gazzaley figures that some people will raise questions about his role in simultaneously conducting the research and inventing training programs in a for-profit venture, but he sees only complementary interests. The company, after all, will not produce entertainment versions of the game, but rather will test and market a video game derived from NeuroRacer as an FDA-approved device. “Hey, one of my main goals from the beginning was to see this research leave the lab,” he said. Eventually, “medicine will not be equated with drugs only,” the neuroscientist added. One day NeuroRacer might be remembered “not as a game, but rather as medicine.” Here was the latest installment in the war on distraction, bound to build to an even greater crescendo as Boomers clamor for more help in beating back the effects of aging on the brain.