As an educational psychologist, my scholarly training has focused on cognition rather than on emotion and affect; however, I'm becoming more interested in the neuroscience of emotion since reading a few papers arguing that, to the degree affective phenomena draw upon cognitive resources, affect does impact cognition. Newsweek reports here on recent research that examines the neurological underpinnings of hypocritical behavior.
Scientists have long bickered over whether hypocrisy is driven by emotion or by reason—that is, by our gut instinct to cast a halo over ourselves, or by efforts to rationalize and justify our own transgressions. In other moral judgments, brain imaging shows, regions involved in feeling, not thinking, rule. In “the train dilemma,” for instance, people are asked whether they would throw a switch to send an out-of-control train off a track where it would kill 10 people and onto one where it would kill one. Most of us say we would. But would we heave a large man onto the track to derail the train and save the 10? Most of us say no: although the save-10-lose-one calculus is identical, the emotional component—heaving someone to his death rather than throwing an impersonal switch—is repugnant, and the brain’s emotion regions scream “Don’t!”
The role of emotion in moral judgments has upended the Enlightenment notion that our ethical sense is based on high-minded philosophy and cognition. That brings us to hypocrisy, which is almost ridiculously easy to bring out in people. In a new study that will not exactly restore your faith in human nature, psychologists David DeSteno and Piercarlo Valdesolo of Northeastern University instructed 94 people to assign themselves and a stranger one of two tasks: an easy one, looking for hidden images in a photo, or a hard one, solving math and logic problems. The participants could make the assignments themselves, or have a computer do it randomly. Then everyone was asked, how fairly did you act?, from “extremely unfairly” (1) to “extremely fairly” (7). Next they watched someone else make the assignments, and judged that person’s ethics. Selflessness was a virtual no-show: 87 out of 94 people opted for the easy task and gave the next guy the onerous one. Hypocrisy, however, showed up with bells on: every single person who made the selfish choice judged his own behavior more leniently—on average, 4.5 vs. 3.1—than that of someone else who grabbed the easy task for himself, the scientists will report in the Journal of Experimental Social Psychology.
The LA Times reports on a study that joins the existing literature on the alleged biological underpinnings of homosexuality.
The brains of gay men resemble those of straight women, according to research published today that provides more evidence of the role of biology in sexual orientation.
Using brain-scanning equipment, researchers said they discovered similarities in the brain circuits that deal with language, perhaps explaining why homosexual men tend to outperform straight men on verbal skills tests — as do heterosexual women.
The area of the brain that processes emotions also looked much the same in gay men and straight women — and both groups have higher rates of depressive disorders than heterosexual men, researchers said.
The study in Proceedings of the National Academy of Sciences, however, found that the brain similarities were not as close in the case of gay women and straight men.
This other LA Times article reviews some of this ‘homosexuality as biology’ literature.
I don’t really follow animal behavior research very closely but this Live Science report (via MSNBC) raises some interesting issues in the old “what is intelligence?” debate.
Chimps and orangutans plan for the future just like us.
They are capable of exercising self-control to postpone gratification and to imagine future events via "mental time travel," according to new research from Lund University Cognitive Science in Sweden.
The skill of future planning was commonly thought to be exclusive to humans, although some studies of apes and crows have challenged this idea, say researchers Mathias and Helena Osvath. Now, for the first time, there is “conclusive evidence of advanced planning capacities in non-human species,” they say.
The results are detailed online this week in the journal Animal Cognition.
The NY Times examines how young is ‘too young’ for children to begin using technological devices. Although the article focuses on information and communication technology, I would argue that those of us in the industrialized West begin using ‘high’ technology earlier than we often realize. Light switch, anyone?
EVERYONE knows that babies crawl before they walk, and that tricycles come before two-wheelers. But at what age should children get their first cellphone, laptop or virtual persona?
These are new questions being faced by 21st-century parents, and there is no wisdom from the generations for guidance. You can’t exactly say to your teenager, “When I was a boy, I didn’t have an unlimited texting plan until I was in high school.”
Some parents eagerly provide their children with technology. “My 4-year-old has been on the Web since he could sit up,” said Samantha Morra, a mother of two from Montclair, N.J. “My 6-year-old has an iPod and wants a cellphone, although my husband and I aren’t sure who he’d call.”
New research suggests that not all synaptic connections are the same:
Evolution’s recipe for making a brain more complex has long seemed simple enough. Just increase the number of nerve cells, or neurons, and the interconnections between them. A human brain, for instance, is three times the volume of a chimpanzee’s.
A whole new dimension of evolutionary complexity has now emerged from a cross-species study led by Dr. Seth Grant at the Sanger Institute in England.
Dr. Grant looked at the interconnections between neurons, known as synapses, which until now have been regarded as a standard feature of neurons.
But in fact the synapses get considerably more complex going up the evolutionary scale, Dr. Grant and colleagues reported online Sunday in Nature Neuroscience. In worms and flies, the synapses mediate simple forms of learning, but in higher animals they are built from a much richer array of protein components and conduct complex learning and pattern recognition, Dr. Grant said.
The finding may open a new window into how the brain operates. “One of the biggest questions in neuroscience is to answer what are the design principles by which the human brain is constructed, and this is one of those principles,” Dr. Grant said.
The NY Times reports on the perception of so-called optical illusions:
Staring at a pattern meant to evoke an optical illusion is usually an act of idle curiosity, akin to palm reading or astrology. The dot disappears, or it doesn’t. The silhouette of the dancer spins clockwise or counterclockwise. The three-dimensional face materializes or not, and the explanation always seems to have something to do with the eye or creativity or even personality.
That’s the usual cue to nod and feign renewed absorption in the pattern.
In fact, scientists have investigated such illusions for hundreds of years, looking for clues to how the brain constructs a seamless whole from the bouncing kaleidoscope of light coming through the eyes. Brain researchers today call the illusions perceptual, not optical, because the entire visual system is involved, and their theories about what is occurring can sound as exotic as anyone’s.
In the current issue of the journal Cognitive Science, researchers at the California Institute of Technology and the University of Sussex argue that the brain’s adaptive ability to see into the near future creates many common illusions.
“It takes time for the brain to process visual information, so it has to anticipate the future to perceive the present,” said Mark Changizi, the lead author of the paper, who is now at Rensselaer Polytechnic Institute. “One common functional mechanism can explain many of these seemingly unrelated illusions.” His co-authors were Andrew Hsieh, Romi Nijhawan, Ryota Kanai and Shinsuke Shimojo.
Newsweek reports here on the concept of executive function and its applications in designing and carrying out instruction. Given the increasing visibility within the academy of theories of multi-dimensional intelligence, such as Howard Gardner's theory of multiple intelligences and Daniel Goleman's work on emotional intelligence, it's nice to see this article in a 'pop' news weekly.
Most people can recall a kid from grade school who couldn’t stay seated, who talked out of turn and fidgeted constantly, whose backpack overflowed with crumpled handouts and who always had to ask other kids what the homework assignment was. Those kids weren’t bad kids, but they seemed to have absolutely no self-control, no internal disciplinarian to put a brake on their impulses, to keep their attention focused. Not surprisingly, they were almost always lousy students as well.
This kind of student has been tagged with a variety of labels over the years: antisocial personality, conduct disorder, stupid. But recent advances in psychology and brain science are now suggesting that a child’s ability to inhibit distracting thoughts and stay focused may be a fundamental cognitive skill, one that plays a big part in academic success from preschool on. Indeed, this and closely related skills may be more important than traditional IQ in predicting a child’s school performance.
The scientific name for this set of skills is "executive function," or EF. It's an emerging concept in student assessment and could eventually displace traditional measures of ability and achievement. EF comprises not only effortful control and cognitive focus but also working memory and mental flexibility—the ability to adjust to change, to think outside the box. These are the uniquely human skills that, taken together, allow us to keep our more impulsive and distractible brain in check. New research shows that EF, more than IQ, leads to success in basic academics like arithmetic and grammar. It also suggests that we can pump up these EF skills with regular exercise, just as we do with muscles.