Kevin Lewis

October 24, 2012

Chocolate Consumption, Cognitive Function, and Nobel Laureates

Franz Messerli
New England Journal of Medicine, 18 October 2012, Pages 1562-1564

"There was a close, significant linear correlation (r=0.791, P<0.0001) between chocolate consumption per capita and the number of Nobel laureates per 10 million persons in a total of 23 countries...When recalculated with the exclusion of Sweden, the correlation coefficient increased to 0.862. Switzerland was the top performer in terms of both the number of Nobel laureates and chocolate consumption...A second hypothesis, reverse causation - that is, that enhanced cognitive performance could stimulate countrywide chocolate consumption - must also be considered. It is conceivable that persons with superior cognitive function (i.e., the cognoscenti) are more aware of the health benefits of the flavanols in dark chocolate and are therefore prone to increasing their consumption...Finally, as to a third hypothesis, it is difficult to identify a plausible common denominator that could possibly drive both chocolate consumption and the number of Nobel laureates over many years. Differences in socioeconomic status from country to country and geographic and climatic factors may play some role, but they fall short of fully explaining the close correlation observed."
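The headline statistic here is a Pearson product-moment correlation. As a minimal sketch of how such a coefficient is computed, the snippet below implements Pearson's r from scratch and runs it on made-up illustrative numbers; the data points are hypothetical and are not Messerli's actual country figures.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical illustrative data (NOT the study's actual figures):
# chocolate consumption (kg/yr per capita) vs. laureates per 10 million.
chocolate = [2.0, 4.5, 6.4, 8.5, 11.9]
laureates = [1.5, 5.0, 11.0, 25.0, 32.0]
print(round(pearson_r(chocolate, laureates), 3))
```

A strong positive r, as in the paper, says only that the two country-level series move together; as the authors themselves note, correlation alone cannot separate causation, reverse causation, or a common driver.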


Motivational Versus Metabolic Effects of Carbohydrates on Self-Control

Daniel Molden et al.
Psychological Science, October 2012, Pages 1137-1144

Self-control is critical for achievement and well-being. However, people's capacity for self-control is limited and becomes depleted through use. One prominent explanation for this depletion posits that self-control consumes energy through carbohydrate metabolization, which further suggests that ingesting carbohydrates improves self-control. Some evidence has supported this energy model, but because of its broad implications for efforts to improve self-control, we reevaluated the role of carbohydrates in self-control processes. In four experiments, we found that (a) exerting self-control did not increase carbohydrate metabolization, as assessed with highly precise measurements of blood glucose levels under carefully standardized conditions; (b) rinsing one's mouth with, but not ingesting, carbohydrate solutions immediately bolstered self-control; and (c) carbohydrate rinsing did not increase blood glucose. These findings challenge metabolic explanations for the role of carbohydrates in self-control depletion; we therefore propose an alternative motivational model for these and other previously observed effects of carbohydrates on self-control.


The Sweet Taste of Success: The Presence of Glucose in the Oral Cavity Moderates the Depletion of Self-Control Resources

Martin Hagger & Nikos Chatzisarantis
Personality and Social Psychology Bulletin, forthcoming

According to the resource-depletion model, self-control is a limited resource that is depleted after a period of exertion. Evidence consistent with this model indicates that self-control relies on glucose metabolism and that glucose supplementation to depleted individuals replenishes self-control resources. In five experiments, we tested an alternative hypothesis that glucose in the oral cavity counteracts the deleterious effects of self-control depletion. We predicted a glucose mouth rinse, as opposed to an artificially sweetened placebo rinse, would lead to better self-control after depletion. In Studies 1 to 3, participants engaging in a depleting task performed significantly better on a subsequent self-control task after receiving a glucose mouth rinse, as opposed to participants rinsing with a placebo. Studies 4 and 5 replicated these findings and demonstrated that the glucose mouth rinse had no effect on self-control in nondepleted participants. Results are consistent with a neural rather than metabolic mechanism for the effect of glucose supplementation on self-control.


Rational snacking: Young children's decision-making on the marshmallow task is moderated by beliefs about environmental reliability

Celeste Kidd, Holly Palmeri & Richard Aslin
Cognition, forthcoming

Children are notoriously bad at delaying gratification to achieve later, greater rewards (e.g., Piaget, 1970) - and some are worse at waiting than others. Individual differences in the ability-to-wait have been attributed to self-control, in part because of evidence that long-delayers are more successful in later life (e.g., Shoda, Mischel, & Peake, 1990). Here we provide evidence that, in addition to self-control, children's wait-times are modulated by an implicit, rational decision-making process that considers environmental reliability. We tested children (M = 4;6, N = 28) using a classic paradigm - the marshmallow task (Mischel, 1974) - in an environment demonstrated to be either unreliable or reliable. Children in the reliable condition waited significantly longer than those in the unreliable condition (p < 0.0005), suggesting that children's wait-times reflected reasoned beliefs about whether waiting would ultimately pay off. Thus, wait-times on sustained delay-of-gratification tasks (e.g., the marshmallow task) may not only reflect differences in self-control abilities, but also beliefs about the stability of the world.


Prolonged myelination in human neocortical evolution

Daniel Miller et al.
Proceedings of the National Academy of Sciences, 9 October 2012, Pages 16480-16485

Nerve myelination facilitates saltatory action potential conduction and exhibits spatiotemporal variation during development associated with the acquisition of behavioral and cognitive maturity. Although human cognitive development is unique, it is not known whether the ontogenetic progression of myelination in the human neocortex is evolutionarily exceptional. In this study, we quantified myelinated axon fiber length density and the expression of myelin-related proteins throughout postnatal life in the somatosensory (areas 3b/3a/1/2), motor (area 4), frontopolar (prefrontal area 10), and visual (areas 17/18) neocortex of chimpanzees (N = 20) and humans (N = 33). Our examination revealed that neocortical myelination is developmentally protracted in humans compared with chimpanzees. In chimpanzees, the density of myelinated axons increased steadily until adult-like levels were achieved at approximately the time of sexual maturity. In contrast, humans displayed slower myelination during childhood, characterized by a delayed period of maturation that extended beyond late adolescence. This comparative research contributes evidence crucial to understanding the evolution of human cognition and behavior, which arises from the unfolding of nervous system development within the context of an enriched cultural environment. Perturbations of normal developmental processes and the decreased expression of myelin-related molecules have been related to psychiatric disorders such as schizophrenia. Thus, these species differences suggest that the human-specific shift in the timing of cortical maturation during adolescence may have implications for vulnerability to certain psychiatric disorders.


Child Height, Health and Human Capital: Evidence using Genetic Markers

Stephanie von Hinke Kessler Scholder et al.
European Economic Review, forthcoming

Height has long been recognized as being associated with better outcomes: the question is whether this association is causal. We use children's genetic variants as instrumental variables to deal with possible unobserved confounders and examine the effect of child/adolescent height on a wide range of outcomes: academic performance, IQ, self-esteem, depression symptoms and behavioral problems. OLS findings show that taller children have higher IQ, perform better in school, and are less likely to have behavioral problems. The IV results differ: taller girls (but not boys) have better cognitive performance and, in contrast to the OLS, greater height appears to increase behavioral problems.


The Prevalence of ADHD: Its Diagnosis and Treatment in Four School Districts Across Two States

Mark Wolraich et al.
Journal of Attention Disorders, forthcoming

Objective: To describe the epidemiology of ADHD in communities using a DSM-IV-TR case definition.

Method: This community-based study used multiple informants to develop and apply a DSM-IV-TR-based case definition of ADHD to screening and diagnostic interview data collected for children 5-13 years of age. Teachers screened 10,427 children (66.4%) in four school districts across two states (SC and OK). ADHD ratings by teachers and parent reports of diagnosis and medication treatment were used to stratify children into high and low risk for ADHD. Parents (n = 855) of high risk and gender frequency-matched low risk children completed structured diagnostic interviews. The case definition was applied to generate community prevalence estimates, weighted to reflect the complex sampling design.

Results: ADHD prevalence was 8.7% in SC and 10.6% in OK. The prevalence of ADHD medication use was 10.1% (SC) and 7.4% (OK). Of those medicated, 39.5% (SC) and 28.3% (OK) met the case definition. Comparison children taking medication had higher mean symptom counts than other comparison children.

Conclusions: Our ADHD estimates are at the upper end of those from previous studies. The identification of a large proportion of comparison children taking ADHD medication suggests that our estimates may be conservative; these children were not included as cases in the case definition, although some might be effectively treated.


The Impact of Time Between Cognitive Tasks on Performance: Evidence from Advanced Placement Exams

Ian Fillmore & Devin Pope
NBER Working Paper, October 2012

In many education and work environments, economic agents must perform several mental tasks in a short period of time. As with physical fatigue, it is likely that cognitive fatigue can occur and affect performance if a series of mental tasks are scheduled close together. In this paper, we identify the impact of time between cognitive tasks on performance in a particular context: the taking of Advanced Placement (AP) exams by high-school students. We exploit the fact that AP exam dates change from year to year, so that students who take two subject exams in one year may have a different number of days between the exams than students who take the same two exams in a different year. We find strong evidence that a shorter amount of time between exams is associated with lower scores, particularly on the second exam. Our estimates suggest that students who take exams with 10 days of separation are 8% more likely to pass both exams than students who take the same two exams with only 1 day of separation.


Caffeine prevents cognitive impairment induced by chronic psychosocial stress and/or high fat-high carbohydrate diet

K.H. Alzoubi et al.
Behavioural Brain Research, 15 January 2013, Pages 7-14

Caffeine alleviates cognitive impairment associated with a variety of health conditions. In this study, we examined the effect of caffeine treatment on chronic stress- and/or high fat-high carbohydrate Western diet (WD)-induced impairment of learning and memory in rats. Chronic psychosocial stress, WD and caffeine (0.3 g/L in drinking water) were simultaneously administered for 3 months to adult male Wistar rats. At the conclusion of the 3 months, and while the previous treatments continued, rats were tested in the radial arm water maze (RAWM) for learning, short-term and long-term memory. This procedure was applied on a daily basis to all animals for 5 consecutive days or until the animal reached days to criterion (DTC) in the 12th learning trial and memory tests. DTC is the number of days an animal takes to make zero errors on two consecutive days. Chronic stress and/or WD impaired learning, and this impairment was prevented by chronic caffeine administration. In the memory tests, chronic caffeine administration also prevented memory impairment during chronic stress conditions and/or WD. Furthermore, DTC values for the caffeine-treated stress, WD, and stress/WD groups indicated that caffeine normalizes memory impairment in these groups. These results showed that chronic caffeine administration prevented stress- and/or WD-induced impairment of spatial learning and memory.


Storm in a coffee cup: Caffeine modifies brain activation to social signals of threat

Jessica Smith et al.
Social Cognitive and Affective Neuroscience, October 2012, Pages 831-840

Caffeine, an adenosine A1 and A2A receptor antagonist, is the most popular psychostimulant drug in the world, but it is also anxiogenic. The neural correlates of caffeine-induced anxiety are currently unknown. This study investigated the effects of caffeine on brain regions implicated in social threat processing and anxiety. Participants were 14 healthy male non/infrequent caffeine consumers. In a double-blind placebo-controlled crossover design, they underwent blood oxygenation level-dependent functional magnetic resonance imaging (fMRI) while performing an emotional face processing task 1 h after receiving caffeine (250 mg) or placebo in two fMRI sessions (counterbalanced, 1-week washout). They rated anxiety and mental alertness, and their blood pressure was measured, before and 2 h after treatment. Results showed that caffeine induced threat-related (angry/fearful faces > happy faces) midbrain-periaqueductal gray activation and abolished threat-related medial prefrontal cortex wall activation. Effects of caffeine on extent of threat-related amygdala activation correlated negatively with level of dietary caffeine intake. In concurrence with these changes in threat-related brain activation, caffeine increased self-rated anxiety and diastolic blood pressure. Caffeine did not affect primary visual cortex activation. These results are the first to demonstrate potential neural correlates of the anxiogenic effect of caffeine, and they implicate the amygdala as a key site for caffeine tolerance.


What Eye Movements Can Tell about Theory of Mind in a Strategic Game

Ben Meijering et al.
PLoS ONE, September 2012

This study investigates strategies in reasoning about mental states of others, a process that requires theory of mind. It is a first step in studying the cognitive basis of such reasoning, as strategies affect tradeoffs between cognitive resources. Participants were presented with a two-player game that required reasoning about the mental states of the opponent. Game theory literature discerns two candidate strategies that participants could use in this game: either forward reasoning or backward reasoning. Forward reasoning proceeds from the first decision point to the last, whereas backward reasoning proceeds in the opposite direction. Backward reasoning is the only optimal strategy, because the optimal outcome is known at each decision point. Nevertheless, we argue that participants prefer forward reasoning because it is similar to causal reasoning. Causal reasoning, in turn, is prevalent in human reasoning. Eye movements were measured to discern between forward and backward progressions of fixations. The observed fixation sequences corresponded best with forward reasoning. Early in games, the probability of observing a forward progression of fixations is higher than the probability of observing a backward progression. Later in games, the probabilities of forward and backward progressions are similar, which seems to imply that participants were either applying backward reasoning or jumping back to previous decision points while applying forward reasoning. Thus, the game-theoretical favorite strategy, backward reasoning, does seem to exist in human reasoning. However, participants preferred the more familiar, practiced, and prevalent strategy: forward reasoning.


Associations between early life adversity and executive function in children adopted internationally from orphanages

Camelia Hostinar et al.
Proceedings of the National Academy of Sciences, 16 October 2012, Pages 17208-17212

Executive function (EF) abilities are increasingly recognized as an important protective factor for children experiencing adversity, promoting better stress and emotion regulation as well as social and academic adjustment. We provide evidence that early life adversity is associated with significant reductions in EF performance on a developmentally sensitive battery of laboratory EF tasks that measured cognitive flexibility, working memory, and inhibitory control. Animal models also suggest that early adversity has a negative impact on the development of prefrontal cortex-based cognitive functions. In this study, we report EF performance 1 y after adoption in 2.5- to 4-y-old children who had experienced institutional care in orphanages overseas compared with a group of age-matched nonadopted children. To our knowledge, this is the youngest age and the soonest after adoption that reduced EF performance has been shown using laboratory measures in this population. EF reductions in performance were significant above and beyond differences in intelligence quotient. Within the adopted sample, current EF was associated with measures of early deprivation after controlling for intelligence quotient, with less time spent in the birth family before placement in an institution and lower quality of physical/social care in institutions predicting poorer performance on the EF battery.


Early life adversity reduces stress reactivity and enhances impulsive behavior: Implications for health behaviors

William Lovallo
International Journal of Psychophysiology, forthcoming

Altered reactivity to stress, either in the direction of exaggerated reactivity or diminished reactivity, may signal a dysregulation of systems intended to maintain homeostasis and a state of good health. Evidence has accumulated that diminished reactivity to psychosocial stress may signal poor health outcomes. One source of diminished cortisol and autonomic reactivity is the experience of adverse rearing during childhood and adolescence. The Oklahoma Family Health Patterns Project has examined a cohort of 426 healthy young adults with and without a family history of alcoholism. Regardless of family history, persons who had experienced high degrees of adversity prior to age 16 had a constellation of changes including reduced cortisol and heart rate reactivity, diminished cognitive capacity, and unstable regulation of affect, leading to behavioral impulsivity and antisocial tendencies. We present a model whereby this constellation of physiological, cognitive, and affective tendencies is consistent with altered central dopaminergic activity leading to changes in brain function that may foster impulsive and risky behaviors. These in turn may promote greater use of alcohol and other drugs along with the adoption of poor health behaviors. This model provides a pathway from early life adversity to low stress reactivity that forms a basis for risky behaviors and poor health outcomes.


Interpersonal trauma exposure and cognitive development in children to age 8 years: A longitudinal study

Michelle Bosquet Enlow et al.
Journal of Epidemiology & Community Health, November 2012, Pages 1005-1010

Background: Childhood trauma exposure has been associated with deficits in cognitive functioning. The influence of timing of exposure on the magnitude and persistence of deficits is not well understood. The impact of exposure in early development has been especially under-investigated. This study examined the impact of interpersonal trauma exposure (IPT) in the first years of life on childhood cognitive functioning.

Methods: Children (N=206) participating in a longitudinal birth cohort study were assessed prospectively for exposure to IPT (physical or emotional abuse or neglect, sexual abuse, witnessing maternal partner violence) between birth and 64 months. Child intelligence quotient (IQ) scores were assessed at 24, 64 and 96 months of age. Race/ethnicity, gender, socioeconomic status, maternal IQ, birth complications, birth weight and cognitive stimulation in the home were also assessed.

Results: IPT was significantly associated with decreased cognitive scores at all time points, even after controlling for socio-demographic factors, maternal IQ, birth complications, birth weight and cognitive stimulation in the home. IPT in the first 2 years appeared to be especially detrimental. On average, compared with children not exposed to IPT in the first 2 years, exposed children scored one-half SD lower across cognitive assessments.

Conclusion: IPT in early life may have adverse effects on cognitive development. IPT during the first 2 years may have particular impact, with effects persisting at least into later childhood.


The Gambler's Fallacy Is Associated with Weak Affective Decision Making but Strong Cognitive Ability

Gui Xue et al.
PLoS ONE, October 2012

Humans demonstrate an inherent bias towards making maladaptive decisions, as shown by a phenomenon known as the gambler's fallacy (GF). The GF has been traditionally considered as a heuristic bias supported by the fast and automatic intuition system, which can be overcome by the reasoning system. The present study examined an intriguing hypothesis, based on emerging evidence from neuroscience research, that the GF might be attributed to a weak affective but strong cognitive decision making mechanism. With data from a large sample of college students, we found that individuals' use of the GF strategy was positively correlated with their general intelligence and executive function, such as working memory and conflict resolution, but negatively correlated with their affective decision making capacities, as measured by the Iowa Gambling Task. Our result provides a novel insight into the mechanisms underlying the GF, which highlights the significant role of affective mechanisms in adaptive decision-making.


A Show of Hands: Relations Between Young Children's Gesturing and Executive Function

Gina O'Neill & Patricia Miller
Developmental Psychology, forthcoming

This study brought together 2 literatures - gesturing and executive function - in order to examine the possible role of gesture in children's executive function. Children (N = 41) aged 2½-6 years performed a sorting-shift executive function task (Dimensional Change Card Sort). Responses of interest included correct sorting, response latency, spontaneous gestures, and verbal and gestural explanations for sorts. An examination of performance over trials permitted a fine-grained depiction of patterns of younger and older high gesturing versus low gesturing children. Relevant gesturing was positively associated with correct sorting, even more strongly than was age, and had its greatest impact right after the shift to a new relevant dimension. Generally high gesturers outperformed low gesturers even on trials in which the former did not gesture. Results were discussed in terms of theories of gesturing and of possible processes (e.g., scaffolding, adding a second representation) by which gestures might facilitate executive function, and vice versa. Possible preexisting differences between high and low gesturers also were considered. The findings open up a new avenue of research and theorizing about the possible role of gesturing in emerging executive function.


Human handedness: An inherited evolutionary trait

Gillian Forrester et al.
Behavioural Brain Research, 15 January 2013, Pages 200-206

Our objective was to demonstrate that human population-level right-handedness is not species specific, precipitated from language areas in the brain, but rather is context specific and inherited from a behavior common to both humans and great apes. In general, previous methods of assessing human handedness have neglected to consider the context of action, or to employ methods suitable for direct comparison across species. We employed a bottom-up, context-sensitive method to quantitatively assess manual actions in right-handed, typically developing children during naturalistic behavior. By classifying the target to which participants directed a manual action as animate (social partner, self) or inanimate (non-living functional objects), we found that children demonstrated a significant right-hand bias for manual actions directed toward inanimate targets, but not for manual actions directed toward animate targets. This pattern was revealed at both the group and individual levels. We used a focal video sampling, corpus data-mining approach to allow for direct comparisons with captive gorillas (Forrester et al. Animal Cognition 2011;14(6):903-7) and chimpanzees (Forrester et al. Animal Cognition, in press). Comparisons of handedness patterns support the view that population-level human handedness, and its origin in cerebral lateralization, is not a new or human-unique characteristic. These data are consistent with the theory that human right-handedness is a trait developed through tool use that was inherited from an ancestor common to both humans and great apes.


Genome-Wide DNA Methylation and Gene Expression Analyses of Monozygotic Twins Discordant for Intelligence Levels

Chih-Chieh Yu et al.
PLoS ONE, October 2012

Human intelligence, as measured by intelligence quotient (IQ) tests, demonstrates one of the highest heritabilities among human quantitative traits. Nevertheless, studies to identify quantitative trait loci responsible for intelligence face challenges because of the small effect sizes of individual genes. Phenotypically discordant monozygotic (MZ) twins provide a feasible way to minimize the effects of irrelevant genetic and environmental factors, and should yield more interpretable results by finding epigenetic or gene expression differences between twins. Here we conducted array-based genome-wide DNA methylation and gene expression analyses using 17 pairs of healthy MZ twins discordant for intelligence. ARHGAP18, related to Rho GTPase, was identified in pair-wise methylation status analysis and validated via direct bisulfite sequencing and quantitative RT-PCR. In expression profile analysis, gene set enrichment analysis (GSEA) comparing the twins with higher IQ against their co-twins revealed up-regulated expression of several ribosome-related genes and DNA replication-related genes in the group with higher IQ. To focus more on individual pairs, we conducted pair-wise GSEA and leading edge analysis, which indicated up-regulated expression of several ion channel-related genes in twins with lower IQ. Our findings imply that these groups of genes may be related to IQ and should shed light on the mechanism underlying human intelligence.


Are systemizing and autistic traits related to talent and interest in mathematics and engineering? Testing some of the central claims of the empathizing-systemizing theory

Kinga Morsanyi et al.
British Journal of Psychology, November 2012, Pages 472-496

In two experiments, we tested some of the central claims of the empathizing-systemizing (E-S) theory. Experiment 1 showed that the systemizing quotient (SQ) was unrelated to performance on a mathematics test, although it was correlated with statistics-related attitudes, self-efficacy, and anxiety. In Experiment 2, systemizing skills, and gender differences in these skills, were more strongly related to spatial thinking styles than to SQ. In fact, when we partialled the effect of spatial thinking styles, SQ was no longer related to systemizing skills. Additionally, there was no relationship between the Autism Spectrum Quotient (AQ) and the SQ, or skills and interest in mathematics and mechanical reasoning. We discuss the implications of our findings for the E-S theory, and for understanding the autistic cognitive profile.


Abstract Concepts: Data from a Grey Parrot

Irene Pepperberg
Behavioural Processes, forthcoming

Do humans and nonhumans share the ability to form abstract concepts? Until the 1960s, many researchers questioned whether avian subjects could form categorical constructs, much less more abstract formulations, including concepts such as same-different or exact understanding of number. Although ethologists argued that nonhumans, including birds, had to have some understanding of divisions such as prey versus predator, mate versus nonmate, food versus nonfood, or basic relational concepts such as more versus less, simply in order to survive, no claims were made that these abilities reflected cognitive processes, and little formal data from psychology laboratories could initially support such claims. Researchers like Anthony Wright, however, succeeded in obtaining such data and inspired many others to pursue these topics, with the eventual result that several avian species are now considered "feathered primates" in terms of cognitive processes. Here I review research on numerical concepts in the Grey parrot (Psittacus erithacus), demonstrating that at least one subject, Alex, understood number symbols as abstract representations of real-world collections, in ways comparing favorably to those of apes and young human children. He not only understood such concepts, but appeared to learn them in ways more similar to humans than to apes.


Dolphins Can Maintain Vigilant Behavior through Echolocation for 15 Days without Interruption or Cognitive Impairment

Brian Branstetter et al.
PLoS ONE, October 2012

In dolphins, natural selection has developed unihemispheric sleep, in which alternating hemispheres of the brain stay awake. This allows dolphins to maintain consciousness in response to the respiratory demands of the ocean. Unihemispheric sleep may also allow dolphins to maintain vigilant states over long periods of time. Because of the relatively poor visibility in the ocean, dolphins use echolocation to interrogate their environment. During echolocation, dolphins produce clicks and listen to returning echoes to determine the location and identity of objects. The extent to which individual dolphins are able to maintain continuous vigilance through this active sense is unknown. Here we show that dolphins may continuously echolocate and accurately report the presence of targets for at least 15 days without interruption. During a total of three sessions, each lasting five days, two dolphins maintained echolocation behaviors while successfully detecting and reporting targets. Overall performance was between 75% and 86% correct for one dolphin and between 97% and 99% correct for a second dolphin. Both animals demonstrated diel patterns in echolocation behavior. A 15-day testing session with one dolphin resulted in near perfect performance with no significant decrement over time. Our results demonstrate that dolphins can continuously monitor their environment and maintain long-term vigilant behavior through echolocation.
