3. Theories of grammar and language acquisition

3.2. Generative Grammar

In this textbook, the model that we will be learning together belongs to the family of models called Generative Grammar. It was Noam Chomsky who came up with the idea that models of grammar should be generative. He defines generative grammar as “a system of rules that in some explicit and well-defined way assigns structural descriptions to sentences” (Chomsky 1965: 8). In other words, a generative grammar uses rules to generate or “build” the structure of sentences. Santorini and Kroch (2007) define it as “an algorithm for specifying, or generating, all and only the grammatical sentences in a language.”

What’s an algorithm? It’s simply any finite, explicit procedure for accomplishing some task, beginning in some initial state and terminating in a defined end state. Computer programs are algorithms, and so are recipes, knitting patterns, the instructions for assembling an Ikea bookcase, and lists of steps for balancing your checkbook.

An important point to keep in mind is that it is often difficult to construct an algorithm for even trivial tasks. A quick way to gain an appreciation for this is to try to describe how to tie a bow. Like speaking a language, tying a bow is a skill that most of us master around school age and that we perform more or less unconsciously thereafter. But describing (not demonstrating!) how to do it is not that easy, especially if we’re not familiar with the technical terminology of knot-tying. In an analogous way, constructing a generative grammar of English is a completely different task from speaking the language, and much more difficult (or at least difficult in a different way)! Just like a cooking recipe, a generative grammar needs to specify the ingredients and procedures that are necessary for generating grammatical sentences.
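To make this idea concrete, here is a small sketch of a generative grammar as an algorithm, written in Python. The rules and the tiny vocabulary are our own invented illustration, not part of any actual theory of English: a handful of explicit rewrite rules generates all and only the sentences of a miniature language fragment.

    # A toy generative grammar: a finite set of explicit rewrite rules that
    # generates all and only the sentences of a tiny invented fragment.
    RULES = {
        "S":  [["NP", "VP"]],           # a sentence is a noun phrase + a verb phrase
        "NP": [["D", "N"]],             # a noun phrase is a determiner + a noun
        "VP": [["V", "NP"]],            # a verb phrase is a verb + a noun phrase
        "D":  [["the"], ["a"]],
        "N":  [["cat"], ["dog"]],
        "V":  [["chased"], ["saw"]],
    }

    def generate(symbol):
        """Yield every string of words the grammar assigns to this symbol."""
        if symbol not in RULES:          # a plain word: nothing left to rewrite
            yield [symbol]
            return
        for expansion in RULES[symbol]:  # try each rule for this symbol
            results = [[]]
            for part in expansion:       # build each part, combining alternatives
                results = [left + right
                           for left in results
                           for right in generate(part)]
            yield from results

    for sentence in generate("S"):
        print(" ".join(sentence))
    # Prints all 32 sentences this grammar generates, such as
    # "the cat chased a dog". Strings the rules cannot build, like
    # "cat the saw", are never produced: all and only the grammatical sentences.

Because the rules are finite and explicit, the procedure terminates, and we can say for any string of words whether the grammar generates it or not. A generative grammar of a real human language works on the same principle, just with vastly more rules and words.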

Not all models of grammar use a generative framework. In other kinds of grammar models, language is produced by repeating memorized fragments or by probabilistic modeling; the latter approach is similar to how Large Language Models produce language.
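For contrast, here is an equally small sketch of the probabilistic alternative just mentioned (again our own illustration, with an invented two-sentence “corpus”; real language models are vastly larger). Instead of rules, it memorizes which word followed which, then produces language by sampling:

    import random

    # A toy probabilistic model: it records which word followed which in a
    # tiny "training corpus" and produces language by sampling word-by-word.
    corpus = "the cat saw the dog . the dog chased a cat .".split()

    # For each word, list every word observed immediately after it.
    follows = {}
    for w1, w2 in zip(corpus, corpus[1:]):
        follows.setdefault(w1, []).append(w2)

    def babble(start="the", max_words=8):
        """Produce a word sequence by repeatedly sampling an observed next word."""
        words = [start]
        while len(words) < max_words and words[-1] in follows:
            # random.choice over the observed followers samples each one in
            # proportion to how often it occurred in the corpus.
            words.append(random.choice(follows[words[-1]]))
        return " ".join(words)

    print(babble())  # e.g. "the dog chased a cat . the"

Notice that nothing here guarantees grammaticality: the model outputs whatever sequences are statistically likely given its corpus, whereas a generative grammar defines exactly which sentences belong to the language.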

 

Noam Chomsky

Figure 4: Noam Chomsky in 2004. Photo by Duncan Rawlinson. Used under CC BY-NC 2.0 license.

Noam Chomsky (1928- ) is perhaps the best-known linguist in the world. Beginning with his 1955 dissertation Transformational Analysis and his 1957 book Syntactic Structures, Chomsky revolutionized the way we think about language and founded the modern field of linguistics.

As we learned earlier in this section, Chomsky was the first to explicitly model language as a rule-governed system, launching the study of Generative Grammar. Another of his foundational proposals is Universal Grammar, the idea that humans are genetically endowed with the capacity for language. We will learn more about Universal Grammar in the remainder of this chapter.

Grammatical theory has changed a lot since 1955, but Chomsky has been a key player throughout. Most of the grammatical models developed since then arose either within a Chomskyan framework, sometimes through Chomsky’s own work, or in direct opposition to it.

Chomsky is also well-known for his political writings and activism.

At the time of this writing, Chomsky, although retired, is still publishing papers and giving talks. He is professor emeritus at the Massachusetts Institute of Technology and a laureate professor at the University of Arizona.

 

Some useful distinctions

When we are talking about our model of grammar, there are some useful distinctions we should make.

Competence vs. performance

Sometimes when we produce language, words don’t come out exactly the way we intend. Because of this, we need a distinction between competence and performance. If you have linguistic competence in a language, then you have acquired the grammatical rules necessary to produce that language. If you have linguistic performance, the language you produce conforms to the rules of the grammar you are using.

Most of the time, we have both competence and performance. But it is possible to have competence without performance. For example, if you are drunk or sleepy, you are more likely to misspeak. You still have the rules of grammar in your head, so you have competence. But you may have trouble accessing or implementing those rules; the end product of your language use then fails to conform to the grammar in your head, and you do not have performance.

You can also have performance without competence. For example, say you memorize a sentence from a language you don’t speak. You can repeat it, perhaps even flawlessly, so you have linguistic performance. However, you do not have the grammatical rules in your mind necessary to construct that sentence from scratch — you are only repeating what you have memorized — so you don’t have competence in that language.

I-language vs. E-language

The next distinction we should make is between I-language and E-language. I-language stands for internal language and refers to the system of grammatical rules that an individual language user has in their mind. Everyone has a slightly different I-language. E-language, on the other hand, stands for external language and refers to how language is externalized, including how it is used in a community. Because everyone’s I-language is slightly different, and because of the effects of linguistic performance, E-language may not be consistent. In Generative Grammar, what we are trying to model is the properties of human I-languages. However, we cannot access I-language directly, since it is a cognitive object. Instead, we infer its properties from the properties of particular E-languages.

I-language is also sometimes called Language (with a capital L), while E-language can be called language (with a small l).

Synchronic vs. diachronic

It is sometimes useful to look at how language changes over time, which is called the diachronic study of language. Although historical linguistics can be very interesting indeed, our model of grammar needs to be a model of language at a particular time, which is called the synchronic study of language. Often, this means studying modern languages, but it can also mean studying a historical language at a particular period.

The history of a language is not encoded in its grammar. Most speakers, unless they have specifically studied it, do not know the history of the languages they speak (and if they have studied it, they very likely did so long after acquiring the language in early childhood). Because the history of the language is not part of what most speakers know, we cannot use a historical explanation in our model of grammar. A historical explanation can be useful for explaining why the grammar has one set of rules and not another, but the rules themselves need to work as a system independent of where they came from.

Let’s use riding a bicycle as an analogy. You could know the history of your bicycle: where each piece of metal was mined and smelted, and where the bike was assembled. All of those processes had to happen in order for the bike to exist for you to ride, but that knowledge isn’t necessary to be able to ride the bike. What is necessary is that your bike is properly assembled, with the pedals linked to one of the wheels by a chain. When you pedal your bike, the pedals move the chain, which in turn rotates the wheel and moves the bike forward. This chain reaction between the parts of your bike is kind of like a grammar. The pedals, the chain, and the wheel are different parts of a system that work together to make your bike go. It doesn’t matter whether the chain on your bike is the original one or has been replaced; it just matters that it is working now.

In the same way, it is not necessary to know the history of your language in order to use the language. When we are trying to explain how our model of Language works, it doesn’t matter if a particular rule was original to the language or borrowed from a different language — it is part of the system now, and we need to explain how it works now.

 

Figure 5: A bicycle. Photo by LUM3N. Used under Pixabay Content License.

 


References and further resources

Attribution

Portions of this section are adapted from the following CC BY-NC source:

↪️ Santorini, Beatrice, and Anthony Kroch. 2007. The syntax of natural language: An online introduction. https://www.ling.upenn.edu/~beatrice/syntax-textbook

For a general audience

Enos, Casey. No date. Noam Chomsky. Internet Encyclopedia of Philosophy. https://iep.utm.edu/chomsky-philosophy

Academic sources

Chomsky, Noam. 1955. Transformational Analysis. Doctoral dissertation, University of Pennsylvania.

Chomsky, Noam. 1957. Syntactic Structures. The Hague/Paris: Mouton.

Chomsky, Noam. 1965. Aspects of the Theory of Syntax. Cambridge, MA: MIT Press.
