http://www.sciencemag.org/cgi/content/full/322/5904/1057
Science 14 November 2008:
Vol. 322, No. 5904, pp. 1057-1059
DOI: 10.1126/science.1167437
Perspectives
BEHAVIOR:
A Biolinguistic Agenda

Marc D. Hauser¹ and Thomas Bever²

When we transform thoughts into speech, we do something that no other animal ever achieves. Children acquire this ability effortlessly and without being taught, as though discovering how to walk. Damage to specific areas of the brain that are critical to language shows the profound selectivity of cerebral organization, underlining the exquisite biological structure of language and its computational features. Recent advances bring new insights into the neurogenetic basis of language, its development, and evolution, but also reveal deep holes in our understanding.

There are about 7000 living languages spoken in the world today, characterized by exceptional diversity as well as significant similarities. Despite many controversies in the field, scholars of language generally agree on two points (1-8). First, language as a system of knowledge is based on genetic mechanisms that create the similarities observed across different languages, culturally specific experience that shapes the particular language acquired, and developmental processes that enable the growth and expression of linguistic knowledge. Second, the neural systems that allow us to acquire and process our knowledge of language are separate from those underlying our ability to communicate.

To fulfill a biolinguistic agenda--study of the computational systems inherent to language--we must address the rules and constraints that underlie a mature speaker's knowledge of language; how these rules and constraints are acquired; and whether they are mediated by language-specific mechanisms. We also need to distinguish which rules and constraints are shared with other animals and how they evolved, and to ask how knowledge of language is used in communicative expressions.

There has been little research linking the formal linguistic principles that describe the mature speaker's knowledge of language to the evolutionary, neurobiological, and developmental factors that lead to their instantiation in the adult mind. These principles include computational devices such as hierarchies and dependencies among syntactic categories (e.g., the relationship between determiners such as "the" and "a" followed by nouns), recursive and combinatorial operations, and movement of parts of speech and phrases (e.g., to create a question, many languages move constructions such as "what" or "where" to the front of the sentence). This gap is slowly narrowing, but the separation remains great. It is thus important to clarify the appropriate targets of analysis. In particular, examination of the evolutionary, neurobiological, and developmental aspects of language often focuses narrowly on speech, or in some cases, on the separate issue of communication. Instead, these aspects should be considered in light of the principles discussed, helping to align formal approaches to linguistics with the biological sciences.
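As a purely illustrative aside (the structures and labels below are simplified assumptions, not an analysis from this article), a few lines of Python can make these devices concrete: a recursively embedded phrase in which determiner-noun dependencies nest inside larger phrases, and a toy "movement" operation that fronts a wh-word to form a question.

    # Illustrative sketch only: simplified hierarchical phrase and toy wh-movement.
    # Labels (NP, Det, N, RelClause, VP, V, C) and functions are invented for illustration.

    # "the cat [that saw the dog]" as a nested (recursive) structure.
    NP = ("NP", ("Det", "the"), ("N", "cat"),
          ("RelClause", ("C", "that"),
           ("VP", ("V", "saw"), ("NP", ("Det", "the"), ("N", "dog")))))

    def depth(node):
        """Recursion over the tree mirrors the combinatorial embedding of phrases."""
        return 1 + max((depth(c) for c in node[1:] if isinstance(c, tuple)), default=0)

    def front_wh(words):
        """Toy 'movement': displace a wh-word to the front to form a question."""
        wh = [w for w in words if w in {"what", "where", "who"}]
        rest = [w for w in words if w not in {"what", "where", "who"}]
        return wh + rest

    print(depth(NP))                         # embedding depth of the nested phrase
    print(front_wh(["you", "saw", "what"]))  # ['what', 'you', 'saw']

Even this caricature shows why such principles are stated over nested constituents rather than adjacent words: the fronted "what" is still interpreted as the object of "saw", the position it vacated.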

Formal approaches to examining linguistic structure are marked by disagreement about the necessary or sufficient computations required to create the expressed languages of the world. Some linguists argue that linguistic form relies on abstract, generative operations that allow phrases and sentences (syntactic structures) to interface with meanings (the semantic system) to create a categorization (lexical terms) in which single words and groups of words convey a specific meaning. Such lexical terms then interface with speech sounds (phonology) to create expressed words in speech or sign. Language has been suggested as an optimal solution to the syntax-semantics interface, achieved by a small number of computational operations. By comparison, current evolutionary models suggest that the variation in animal body form can be explained by different activation patterns for a few master genes during development. The corresponding idea in linguistics is that the cross-cultural variation in expressed human languages can be explained by a universal set of mental operations, some specific to language, others shared across domains including music, mathematics, and morality (4, 9).

Comparative evolutionary studies suggest that birds, rodents, and primates compute some components of human grammatical competence, but cannot attach this capacity to their own communication systems (10-12). For example, birds and primates can compute a first-degree finite state grammar, where elements in a string of sounds have specific orders, each predicted by simple statistical associations. This grammar is one of the simplest within a hierarchy of computational operations of increasing complexity and expressive power (10, 13). The biggest puzzle, however, is why nonhuman animals cannot integrate these computational capacities with their capacity to communicate. So, although songbirds can combine different notes into a variety of songs, they don't integrate this combinatorial capacity with conceptual abilities to create sounds with varied meaning. Understanding what neural connections are absent, or poorly developed, may help account for this evolutionary bottleneck, and explain why human infants readily produce an infinite variety of meaningful expressions.
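To make the contrast concrete, consider the two artificial grammars used in comparative work of this kind (10, 11): (AB)^n strings, which a scanner that tracks only the previous element can accept, and A^n B^n strings, which require keeping count and therefore exceed finite-state power. The Python recognizers below are a minimal sketch for illustration, not procedures from the cited studies.

    # Illustrative recognizers for the two artificial-grammar classes.

    def accepts_ab_n(tokens):
        """(AB)^n: A and B simply alternate, so local (finite-state) transitions suffice."""
        return (len(tokens) > 0 and len(tokens) % 2 == 0 and
                all(t == ("A" if i % 2 == 0 else "B") for i, t in enumerate(tokens)))

    def accepts_an_bn(tokens):
        """A^n B^n: all As precede all Bs and the counts match; checking the count
        requires unbounded memory, beyond any finite-state grammar."""
        n = len(tokens) // 2
        return (n > 0 and len(tokens) == 2 * n and
                tokens[:n] == ["A"] * n and tokens[n:] == ["B"] * n)

    print(accepts_ab_n(list("ABABAB")))   # True: the simpler, finite-state pattern
    print(accepts_an_bn(list("AAABBB")))  # True: requires matching counts
    print(accepts_an_bn(list("ABABAB")))  # False

In experiments of this kind, the question is whether an animal that masters the first pattern can also learn the second and, as noted above, whether it can ever couple such computations to meaning.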

Damage to Broca's area and Wernicke's area in the human brain results in distinct patterns of language loss, suggesting that properties of the neocortex make language unique to humans. Artificial language studies show that these cortical areas execute the computations that obey language universals (the principles accessed by all languages, such as specific word orders), but other brain areas are also activated by these computations (14, 15). In fact, different cortical areas may compute different kinds of grammars, but such localization does not provide insight into linguistic theories aimed at uncovering principles that guide the mature state of language competence and its acquisition during development.

Does language have its own dedicated brain circuits, or is much or all of this circuitry shared across domains (such as music and language)? For example, language and music rely on hierarchical representations, make use of combinatorial and recursive computations, and generate serially represented structures. But does each domain recruit a general-use ensemble of these processes or does each domain have its own set of processes? Further studies of selective brain damage and brain-imaging experiments should be informative.

Genes associated with particular linguistic deficits can help pinpoint the molecular basis for language, and link issues in evolution with those in development. Yet, we are far from understanding how normal genes are associated with linguistic features. When the gene FOXP2 was linked to families with a particular language deficit, it seemed that genomics might account for linguistic structure. But the relationship between FOXP2 and language turns out to be weak. For example, FOXP2 exists in songbirds and echo-locating bats; although songbirds have richly structured sound systems that might be properly characterized by a finite state grammar, such grammars are not hierarchically structured, lack syntactic categories (e.g., nouns and determiners), and do not productively generate meaningful variation. Further, the disorders associated with FOXP2 in humans include articulatory disabilities and are not clearly syntactic, semantic, or computational (16, 17). The weak connection between FOXP2 and these aspects of language should not, however, come as a surprise given that most gene-phenotype relationships involving complex phenotypes (such as language) are weak. Nonetheless, by breaking language down into its component parts and finding potential homologs in other animals (especially those that can be genetically manipulated), we may better understand the evolution, development, and neurobiological breakdown of linguistic function.

[Figure 1; illustration credit: Peter Hoey]

Current research on hemispheric lateralization (division of the brain into left and right halves) and language acquisition provides one example of how interdisciplinary work relates to specific theories in linguistics. All right-handed people have strong left-hemisphere lateralization of syntactic function. However, classic investigations of aphasia (the inability to produce or comprehend language) reveal that familially "mixed" right-handers (right-handers with left-handed family members) show more right-hemisphere involvement in language than pure right-handers (18, 19). In familially mixed right-handers, this right-hemisphere involvement in language may be specific to lexical representations (20). Familially mixed right-handers access individual words more readily than global sentence structure, whereas the reverse is true of familially pure right-handers (21). Their critical period for language learning is also earlier than that of familially pure right-handers (22), which suggests that mixed right-handers are more likely to base their language learning on the acquisition of words as opposed to syntactic structure. These findings are supported by brain-imaging research showing that familially pure right-handers have left-hemisphere activation during lexical access, whereas familially mixed right-handers show more bilateral activation (23). At the same time, all subjects show left-hemisphere activation for syntactic processes. This confirms the basic hypothesis that mixed right-handers have more distributed representations of lexical knowledge.

What are the implications of such population-level differences in lexical use, access, and representation for linguistic theory? In recent decades, syntacticians have struggled with the role of the lexicon in syntactic architectures. Proposals range from the traditional view that the lexicon is distinct from the computations of syntax, to the view that syntax itself is driven by lexical structures. The observed variability in how the lexicon is accessed and represented suggests that it is indeed a biologically separable component of linguistic knowledge.

Brain imaging, genomics, and new methods for comparative studies have provided the means for better understanding the shared and uniquely human components of language. As some linguists argue, the variation in linguistic form among the world's languages may be as superficial as the variation in animal body forms. The superficiality arises, in each case, because of universal computations that provide the necessary suite of developmental programs to generate the variation. As the biolinguistic agenda advances, however, new generations of linguists will be required to translate their formalisms into hypotheses that biologists and psychologists can test experimentally. For example, language deploys recursive operations and generates hierarchical representations with specific configurations. It is not yet clear how to design experiments to test whether nonlinguistic organisms can acquire these representations, or what factors limit either their acquisition or their implementation in communicative expression. Conversely, psychologists and biologists will need to be sensitive to the limitations of their methods and the extent to which they can test linguistic theories. Thus, neuropsychological studies showing deficits in language need to be accompanied by comparable tests in nonlinguistic domains to show that they are language-specific deficits. And studies using brain imaging must acknowledge that localization of function does not provide explanatory power for the linguist attempting to uncover principles underlying the speaker's knowledge of language. These cautions aside, the biolinguistic approach is clearly benefiting from modern technologies to advance our knowledge of what language is, how it is represented, and where it came from.

References

1. M. A. Arbib, Behav. Brain Sci. 28, 105 (2005).
2. E. Bates, Discuss. Neurosci. 10, 136 (1994).
3. T. G. Bever, in Cognition and the Development of Language, J. R. Hayes, Ed. (Wiley, New York, 1970), pp. 277-360.
4. N. Chomsky, Linguist. Inq. 36, 1 (2005).
5. T. W. Deacon, The Symbolic Species: The Coevolution of Language and the Brain (Norton, New York, 1997).
6. R. Jackendoff, Foundations of Language (Oxford Univ. Press, New York, 2002).
7. E. H. Lenneberg, Biological Foundations of Language (Wiley, New York, 1967).
8. S. Pinker, Language Learnability and Language Development (Harvard Univ. Press, Cambridge, MA, 1984).
9. C. Boeckx, M. Piattelli-Palmarini, Linguist. Rev. 22, 447 (2005).
10. W. T. Fitch, M. D. Hauser, Science 303, 377 (2004).
11. T. Q. Gentner, K. M. Fenn, D. Margoliash, H. C. Nusbaum, Nature 440, 1204 (2006).
12. R. A. Murphy, E. Mondragon, V. A. Murphy, Science 319, 1849 (2008).
13. N. Chomsky, Syntactic Structures (Mouton, The Hague, 1957).
14. A. Friederici et al., Proc. Natl. Acad. Sci. U.S.A. 103, 2458 (2006).
15. M. Musso et al., Nat. Neurosci. 6, 774 (2003).
16. W. Enard et al., Nature 418, 869 (2002).
17. S. Haesler et al., J. Neurosci. 24, 3164 (2004).
18. A. R. Luria, Traumatic Aphasia (Mouton, The Hague, 1969).
19. J. T. Hutton, N. Arsenina, B. Kotik, A. R. Luria, Cortex 13, 195 (1977).
20. T. G. Bever, C. Carrithers, W. Coward, D. J. Townsend, in From Neurons to Reading, A. Galaburda, Ed. (MIT Press, Cambridge, MA, 1989).
21. D. J. Townsend, C. Carrithers, T. G. Bever, Brain Lang. 78, 308 (2001).
22. D. S. Ross, T. G. Bever, Brain Lang. 89, 115 (2004).
23. S. Chan, thesis, University of Arizona (2007).

¹Department of Psychology, Human Evolutionary Biology, Harvard University, Cambridge, MA 02138, USA. E-mail: mdh@wjh.harvard.edu

²Departments of Linguistics and Psychology, and Cognitive Science Program, University of Arizona, Tucson, AZ 85721, USA.