by Jason Kornwitz
Northeastern University psychology professor Iris Berent delivered the 52nd annual Robert D. Klein Lecture on Tuesday afternoon in the Raytheon Amphitheater. In her talk—titled “How do human brains give rise to language?”—Berent argued that human language is a product of a specialized biological system, that we are innately equipped with a language instinct.
“People know how to talk in more or less the sense that spiders know how to spin webs,” she explained, quoting the cognitive scientist Steven Pinker. “Spiders spin webs because they have spider brains, which give them the urge to spin and the competence to succeed.”
Here are five takeaways from the lecture series, which was established in 1964 and renamed in 1979 in tribute to the late Robert D. Klein, professor of mathematics and vice chairman of the Faculty Senate.
What makes human language so special?
“We humans are good at language,” Berent explained on Tuesday. “We acquire language rapidly and spontaneously, and we are apparently unique in our ability to do so.”
For more than a decade, her research has focused on why, probing into one central question: What makes human language so special? As an expert in the phonological structure of language, she has tackled the inquiry from a broad interdisciplinary perspective, using a diverse set of methods, languages, and research populations, including infants and children.
Her findings have been published in top scientific journals—including Language, Cognition, and the Proceedings of the National Academy of Sciences—and her work has been funded by both the National Institutes of Health and the National Science Foundation.
A big brain
At the beginning of her lecture, Berent summarized the standard answer to why humans can acquire language while nonhuman animals cannot, appealing to three generic yet distinguishing properties of our species.
One is our brain size. Another is our capacity to engage in social interactions. And the third is our highly refined auditory and articulatory motor control systems.
None of these capacities is specific to language—a large brain, for example, also allows you to solve math problems—and Berent argued that they’re entirely insufficient to explain the language gap between a human and, say, a dog.
For her, it’s more plausible to posit that language is a specialized biological system, to argue that we are innately equipped with a “chip” specifically designed for language.
After ruling out two prominent objections to this theory—including the so-called blank slate hypothesis, which suggests that all knowledge derives from experience and perception—Berent went on to show that the brain seems to behave like a specialized organ.
‘Blif v. lbif’
“If language is a specialized biological system,” she told the audience, “then distinct languages should likewise share aspects of their design.” And that’s exactly what her research has found.
Regardless of our mother tongue, she said, we prefer certain linguistic structures to others. Despite significant differences between languages as unrelated as Korean and Spanish, all of them seem to share the same set of unwritten rules that dictate how sounds can be arranged to form words.
In a study published in 2007, her team showed that all spoken languages favor certain syllables. For instance, syllables such as “lbif” are much less common across languages than syllables such as “blif.” A later study showed that people are sensitive to this rule even if neither of those syllables occurs in their language.
“In this view, the language system in the brain of every speaker is equipped with abstract universal principles that favor ‘bla’ over ‘bna’ and so on,” she explained. “Critically, these principles are active even if you have never heard either.”
What babies can tell us about language
Later in her talk, Berent pointed to her 2014 study of newborn babies to show that the human brain forms languages based on an innate set of linguistic rules.
One-day-old neonates were presented with blocks of auditory syllables—either well-formed ones like “blif” or ill-formed ones like “lbif”—while their brain activity was monitored using near-infrared spectroscopy.
Berent found that ill-formed syllables elicited higher activation than well-formed syllables, suggesting that the brain has to work harder to identify syllables that are less well formed. “These results do not tell us precisely why ‘lbif’ is disliked,” she explained, “but the fact that you find it basically at birth suggests that this preference is likely innate, rather than one that is acquired by learning.”
Dyslexia and the brain
Understanding how the brain computes language—how neural activity gives rise to cognitive structure—could have big implications for language-based disorders.
Dyslexia is a prime example. Although it is most often classified as a reading disorder, it is also well known to affect how individuals process spoken language.
In a 2013 study of both skilled and dyslexic readers, Berent and her colleagues were surprised to find that the dyslexic readers’ phonological system was considerably stronger than their phonetic system.
“If language and speech are one and the same, then a deficit to speech would surely compromise the language system as well,” she explained on Tuesday. “But if the systems are distinct, then it is conceivable that despite the problems to speech processing, the core of the language system is intact.”
She added: “I think these results show how a detailed interdisciplinary approach to the language system can help illuminate the origins of speech and language disorders.”