NEWSWEEK reports on new neuroscience research into human emotion, which appears to be more closely linked to cognition than was believed just a couple of decades ago:
From the earliest days of brain mapping—determining which regions are responsible for which functions—neuroscientists traced feelings and thoughts to structures that were barely within hailing distance of each other. The limbic system deep in the brain, including the amygdala and hippocampus, seemed to be the brain’s holy terror of a 2-year-old, the site of anger, fear, and anxiety, as well as positive emotions. The frontal cortex, just behind the forehead, was the exalted thinker, where forethought and judgment, reason and volition, attention and cognition came from. As recently as the 1980s, neuroscientists focused almost exclusively on cognition and the other functions of the frontal cortex; emotions were deemed of so little interest that neuroscience left them to psychology…
[Subsequent research has found that] both prefrontal-cortex activity and the number of pathways sending calming signals to the amygdala determine just how easily a person will bounce back from adversity. Through these two mechanisms, our “thinking brain” is able to calm our “feeling” self, enabling the brain to plan and act effectively without being distracted by negative emotion.
The New York Times reports:
Workers in the digital era can feel at times as if they are playing a video game, battling the barrage of e-mails and instant messages, juggling documents, Web sites and online calendars. To cope, people have become swift with the mouse, toggling among dozens of overlapping windows on a single monitor.
But there is a growing new tactic for countering the data assault: the addition of a second computer screen. Or a third…
David E. Meyer, a psychology professor at the University of Michigan whose research has found that multitasking can take a serious toll on productivity… warned that productivity can suffer when people keep interrupting their thoughts by scanning multiple screens rather than focusing on one task.
At least as far back as Lev Vygotsky, psychological scientists and others interested in human cognition have considered the relationship between humans’ technology use and cognition. Vygotsky, for example, saw humans’ use of mediating ‘tools and signs’ (i.e., technological innovations, both concrete and abstract) as a key part of the learning process, while psychologist Jerome Bruner argued that using technology has a recursive relationship with cognition – as we use tools to accomplish some task, our thinking about the task itself can change.
In this article from The Atlantic, writer Ross Anderson examines the growing body of technologies and methods explicitly designed for ‘cognitive enhancement’:
It could be that we are on the verge of a great deluge of cognitive enhancement. Or it’s possible that new brain-enhancing drugs and technologies will be nothing compared to how we’ve transformed our minds in the past. If it seems that making ourselves “artificially” smarter is somehow inhuman, it may be that similar activities are actually what made us human.