Category Archives: Human development

NEWSWEEK on the ‘new (neuro)science of feelings’

NEWSWEEK reports on new neuroscience research into human emotion, which appears to be more closely linked to cognition than was thought to be the case just a couple of decades ago:

From the earliest days of brain mapping—determining which regions are responsible for which functions—neuroscientists traced feelings and thoughts to structures that were barely within hailing distance of each other. The limbic system deep in the brain, including the amygdala and hippocampus, seemed to be the brain’s holy terror of a 2-year-old, the site of anger, fear, and anxiety, as well as positive emotions. The frontal cortex, just behind the forehead, was the exalted thinker, where forethought and judgment, reason and volition, attention and cognition came from. As recently as the 1980s, neuroscientists focused almost exclusively on cognition and the other functions of the frontal cortex; emotions were deemed of so little interest that neuroscience left them to psychology…

[Subsequent research has found that] both prefrontal-cortex activity and the number of pathways sending calming signals to the amygdala determine just how easily a person will bounce back from adversity. Through these two mechanisms, our “thinking brain” is able to calm our “feeling” self, enabling the brain to plan and act effectively without being distracted by negative emotion.

The relationship between technology use and cognition

At least as far back as Lev Vygotsky, psychological scientists and others interested in human cognition have considered the relationship between humans' technology use and cognition. Vygotsky, for example, saw mediation through 'tools and signs' (i.e., technological innovations, both concrete and abstract) as a key part of the learning process, while psychologist Jerome Bruner argued that using technology has a recursive relationship with cognition – as we use tools to accomplish some task, our thinking about the task itself can change.

In this article from The Atlantic, writer Ross Andersen examines the growing body of technologies and methods explicitly designed for 'cognitive enhancement:'

It could be that we are on the verge of a great deluge of cognitive enhancement. Or it’s possible that new brain-enhancing drugs and technologies will be nothing compared to how we’ve transformed our minds in the past. If it seems that making ourselves “artificially” smarter is somehow inhuman, it may be that similar activities are actually what made us human.

New York Times on “the hormone surge of middle childhood”

The New York Times reports:

Said to begin around 5 or 6, when toddlerhood has ended and even the most protractedly breast-fed children have been weaned, and to end when the teen years commence, middle childhood certainly lacks the physical flamboyance of the epochs fore and aft: no gotcha cuteness of babydom, no secondary sexual billboards of pubescence.

Yet as new findings from neuroscience, evolutionary biology, paleontology and anthropology make clear, middle childhood is anything but a bland placeholder. To the contrary, it is a time of great cognitive creativity and ambition, when the brain has pretty much reached its adult size and can focus on threading together its private intranet service — on forging, organizing, amplifying and annotating the tens of billions of synaptic connections that allow brain cells and brain domains to communicate.

New science of bilingualism offers fresh insights into language acquisition, learning

The New York Times reports:

Once, experts feared that young children exposed to more than one language would suffer “language confusion,” which might delay their speech development. Today, parents often are urged to capitalize on that early knack for acquiring language…

But there is more and more research to draw on, reaching back to infancy and even to the womb. As the relatively new science of bilingualism pushes back to the origins of speech and language, scientists are teasing out the earliest differences between brains exposed to one language and brains exposed to two.

New research describes “pathological altruism”

The New York Times reports:

The author of “On Being Certain” and the coming “A Skeptic’s Guide to the Mind,” Dr. [Robert A.] Burton is a contributor to a scholarly yet surprisingly sprightly volume called “Pathological Altruism,” to be published this fall by Oxford University Press. And he says his colleague’s behavior is a good example of that catchily contradictory term, just beginning to make the rounds through the psychological sciences.

As the new book makes clear, pathological altruism is not limited to showcase acts of self-sacrifice, like donating a kidney or a part of one’s liver to a total stranger. The book is the first comprehensive treatment of the idea that when ostensibly generous “how can I help you?” behavior is taken to extremes, misapplied or stridently rhapsodized, it can become unhelpful, unproductive and even destructive.

New study quantifies influence of genes on intelligence

The Los Angeles Times reports:

Intelligence is in the genes, researchers reported Tuesday in the journal Molecular Psychiatry.

The international team, led by Ian Deary of the University of Edinburgh in Scotland and Peter Visscher of the Queensland Institute of Medical Research in Brisbane, Australia, compared the DNA of more than 3,500 people, middle aged and older, who also had taken intelligence tests. They calculated that more than 40% of the differences in intelligence among test subjects were associated with genetic variation.

The genome-wide association study, as such broad-sweep genetic studies are known, suggested that humans inherit much of their smarts, and a large number of genes are involved.

New research finds altruism and cooperation among humans may have emerged from prehistoric warfare

The New York Times reports:

Compared with other species, humans are highly cooperative and altruistic, at least toward members of their own group. Evolutionary biologists have been hard pressed to account for this self-sacrificing behavior, given that an altruist who works for the benefit of others will have less time for his family’s interests and leave fewer surviving children. Genes for altruistic behavior should therefore disappear…

Warfare “may have contributed to the spread of human altruism,” [economist Samuel Bowles] and his colleague Herbert Gintis write in their new book, “A Cooperative Species” (Princeton, 2011). “We initially recoiled at this unpleasant and surprising conclusion. But the simulations and the data on prehistoric warfare tell a convincing story.”

Archaeology lends some support to the idea. “Groups that successfully organize themselves to raid others will acquire external resources and, in the long run, will be at a selective advantage against groups that are less well organized,” [archaeologists Charles Stanish and Abigail Levine] write of their findings in the Central Andes.