Noam Chomsky will go down in the history of linguistics as the inventor of generative grammar, one of the most influential theories of this century, but also one of the most controversial. The ambition is nothing less than to construct a universal grammar, a grammar whose foundations would apply to every language in the world, dead, living, and yet to be born, that is, to human language itself. All that would then remain would be to uncover the rules of detail that account for the incredible variety of languages. Syntactic Structures, published in 1957, sets out the broad lines of this ambitious program in compact form. Every sentence of a language can be seen as the result of applying a certain number of transformation rules to a base recognized as its deep structure. Later, generativists would ask about the nature of deep structure: is it an abstract construct of the linguist, or does it correspond to a psychological reality? --Guillaume Segerer
Avram Noam Chomsky is an American professor and public intellectual known for his work in linguistics, political activism, and social criticism. Sometimes called "the father of modern linguistics", Chomsky is also a major figure in analytic philosophy and one of the founders of the field of cognitive science. He is a laureate professor of linguistics at the University of Arizona and an institute professor emeritus at the Massachusetts Institute of Technology (MIT). Among the most cited living authors, Chomsky has written more than 150 books on topics such as linguistics, war, and politics. In addition to his work in linguistics, since the 1960s Chomsky has been an influential voice on the American left as a consistent critic of U.S. foreign policy, contemporary capitalism, and corporate influence on political institutions and the media.

Born to Ashkenazi Jewish immigrants in Philadelphia (his father was William Chomsky), Chomsky developed an early interest in anarchism from alternative bookstores in New York City. He studied at the University of Pennsylvania. During his postgraduate work in the Harvard Society of Fellows, Chomsky developed the theory of transformational grammar, for which he earned his doctorate in 1955. That year he began teaching at MIT, and in 1957 emerged as a significant figure in linguistics with his landmark work Syntactic Structures, which played a major role in remodeling the study of language. From 1958 to 1959 Chomsky was a National Science Foundation fellow at the Institute for Advanced Study. He created or co-created the universal grammar theory, the generative grammar theory, the Chomsky hierarchy, and the minimalist program. Chomsky also played a pivotal role in the decline of linguistic behaviorism, and was particularly critical of the work of B. F. Skinner.

An outspoken opponent of U.S. involvement in the Vietnam War, which he saw as an act of American imperialism, in 1967 Chomsky rose to national attention for his anti-war essay "The Responsibility of Intellectuals". Becoming associated with the New Left, he was arrested multiple times for his activism and placed on President Richard M. Nixon's list of political opponents. While expanding his work in linguistics over subsequent decades, he also became involved in the linguistics wars. In collaboration with Edward S. Herman, Chomsky later articulated the propaganda model of media criticism in Manufacturing Consent, and worked to expose the Indonesian occupation of East Timor. His defense of unconditional freedom of speech, including that of Holocaust denial, generated significant controversy in the Faurisson affair of the 1980s. Chomsky's commentary on the Cambodian genocide and the Bosnian genocide also generated controversy.

Since retiring from active teaching at MIT, he has continued his vocal political activism, including opposing the 2003 invasion of Iraq and supporting the Occupy movement. An anti-Zionist, Chomsky considers Israel's treatment of Palestinians to be worse than South African–style apartheid, and criticizes U.S. support for Israel. Chomsky is widely recognized as having helped to spark the cognitive revolution in the human sciences, contributing to the development of a new cognitivistic framework for the study of language and the mind. Chomsky remains a leading critic of U.S. foreign policy, contemporary capitalism, U.S. involvement and Israel's role in the Israeli–Palestinian conflict, and mass media.
Chomsky and his ideas are highly influential in the anti-capitalist and anti-imperialist movements. Since 2017, he has held the Agnese Nelms Haury Chair in the Agnese Nelms Haury Program in Environment and Social Justice at the University of Arizona.
A difficult book to review... On the one hand I admire Chomsky's ability to think out of the box in a period (the fifties... a long time ago already!) when the rest of linguistics was perhaps staying in that same box a bit too much. He looks at the description of language in a fresh way that offers all scholars of language food for thought.
Nevertheless, as a modern linguist, I can't help but see this book as 'the one that started it all', by which I mean many decades of linguistics based on a certain assumption that has yet to be proven - Universal Grammar. It's a construct, and the abstract way in which it is treated in generative linguistics gives our study a scientific air, as if we were studying types of rocks. The study of grammar (as opposed to meaning, or even actual language use) is promoted to be the main object of study.
Somehow, I think linguistics would have been better off if other areas of study (phonetics, morphology, dialectology, sociolinguistics) had become or remained equally 'hip'. Sadly, they did not, and only in recent times is some of the damage being repaired...
I've never read a work of Chomsky's that was well-thought-out and methodologically sound. From his political writings to his linguistic concepts, he seems to favor grand, unsubstantiated ideas. I have written a fuller account of my disappointing experiences with Chomsky's work here.
Reading this book today, not much of it excites us, simply because of how much recent research in linguistics we have already read.

But 60 years ago, it was this very work that issued a new call to connect science with the study of language, and to explain the workings of syntax with scientific proof.
The book that started it all off - yet it is clear how far we have come. For Chomsky, this is a relatively clear and simple explanation of transformational grammar. As a linguistics student, it was nice to deepen my knowledge of the background of syntax. The structure of the book was good, and it is certainly a refutation of previous beliefs; nevertheless, as a student I had no knowledge of syntax prior to Chomsky, so this was enlightening.
Thrilling. I have no formal background in linguistics and so was completely unable to situate _Syntactic Structures_ in its proper historical or argumentative context, but I found it lucid, engaging, and completely convincing.
the linguistic revolution begins... until he changes his mind ... and then it begins again!!! ... until he changes his mind ... an- well, we'll get there eventually. these things take time.
This book appears on several lists of the hundred most influential books in world history. It is one of the handful of books that shaped modern linguistics, and it is said that the entire field changed after its publication in 1957, with Noam Chomsky's transformational-generative theory replacing the previously dominant structuralist theory (of Ferdinand de Saussure). I therefore consider it required reading for anyone who hopes to specialize or go deep into linguistics; it is one of the earliest serious attempts, and among the most successful, to put the study of languages and their grammatical structures on a scientific footing.

The book is very short, a little over 100 pages, but reading it is mentally exhausting: you can spend five or ten minutes on a single page working through its highly intricate logical arguments and the mathematical references the author loves to throw in to give his book the flavor of the natural sciences rather than the humanities. It is very hard to read without a solid background in syntax, especially constituent analysis and its tree-based study of sentence components. Strangely, despite how hard it is to absorb, it is written in clear sentences and very simple logical steps, divided into very short chapters with many subheadings to ease the reading.

To be completely transparent, I am not sure I learned anything tangible from this book, or anything useful for the study of language; its value (for me) may be limited to grasping the academic context of generative theory and going deeper into the literature of linguistics. Practically speaking, most of what the book contains are foundational ideas in the syntax I studied at university and that is taught today in universities everywhere, only seen from an earlier and more complicated vantage point. It also clarifies the source of many recent shifts in linguistics, such as the idea that grammar is an innate phenomenon all humans are born predisposed to, although the book does not lay out a comprehensive basis for such large changes so much as it offers quick but detailed examples in support of its idea and then moves from one topic to the next, in a way that makes it hard for an uninitiated reader to grasp its aim.
Chomsky was not content to revolutionize the history of linguistics only once; he did it three times (and, at 90, there is still time for a fourth). The revolution that produced this book was, however, the first Chomskyan revolution, and arguably the most important of all. The only way to properly gauge the importance of Syntactic Structures is to compare it with the founding works of other sciences: this book is to linguistics what The Sceptical Chymist was to chemistry, De revolutionibus to astronomy, On the Origin of Species to evolutionary science. Like all those works, it represented not so much an advance along an established line of knowledge as an entirely new way of understanding the phenomenon it dealt with. Paradoxically, given its title, it meant the end (more or less) of structuralism and the beginning of everything else: both the properly Chomskyan currents and the anti-Chomskyan ones (since 1957, you have to be one or the other). Its influence was felt even in other fields, such as computing and neuroscience. Syntactic Structures also catapulted Chomsky to a rather peculiar stardom, which the author unfortunately used to make himself heard on topics far outside his field of expertise. Well, if Newton spent the last two thirds of his life on alchemy and biblical prophecy, we can forgive our Chomsky his forays into international politics.
Practically useless without a primer on the history of linguistics, or a version of an exam or mental discipline that forces you to learn it rather than just read it. I was choosing between this and Shannon’s mathematical theory of communication, and for some reason thought this might be simpler, less technical, and more accessible. Maybe if he’d used Ramu/Shamu instead of John to eat an apple (or the more ominous ‘has a chance to live’ and its blithely terrifying negation transformational analysis). Still, mindblowing to consider that the concept of simplifying grammar by transformational rules atop terminal strings didn’t exist before this, something our post-PC generations would find an obvious programming feature.
Notes
Independence of Grammar: From the set of all possible sentences L, study the clearly grammatical ones and the clearly ungrammatical ones, and produce a grammar capable of separating sentences from non-sentences. Strong version: linguistic theory can generate different grammars, all of which handle the clear cases properly. What counts as grammatical? a) It can't be based only on what has been observed, because from a finite set we are able to project and understand indefinitely many new sentences. b) Meaning is irrelevant: we can pick out meaningless but grammatically correct sentences ("colorless green ideas sleep furiously"). c) Statistical approximation doesn't work, because even when we come across a completely novel utterance we can still pick out the grammatical one, and learn it faster (linguists have taken the analog logistic curve of probable sentences and made it digital, where rare ones are impossible and common ones are possible): a schematization, because a language cannot be 'completed'.
Elementary Linguistic Theory: Levels of representation: apply Shannon-style analysis to language rather than an exhaustive grammar (the grammar must generate all sequences of morphemes without just being a list of all the morphemes). So a sentence is a structure of morphemes, and a morpheme a structure of phonemes. Finite-state Markov process: a machine that moves step by step from an initial state through a finite number of states, producing a sentence word by word, like a human choosing a first word, then being constrained in the choice of the second, and so on until the full sentence; it generates the sentence left to right. English is not a finite-state language: "If A, then B"; "If either (if A, then B) or C, then D"; ad infinitum. A finite-state grammar will therefore fail to produce all the grammatical sentences and will also produce many non-sentences, making it an inadequate linguistic theory. Similarly, we need a more general concept of linguistic level.
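To make the finite-state picture concrete, here is a minimal sketch (the states and vocabulary are my own illustration, not from the book) of a machine that produces a sentence word by word, with each choice constrained only by the current state. A device of this kind has no memory of earlier choices, which is why it cannot guarantee that an "if" opened earlier is eventually matched by a "then".

```python
# Toy finite-state sentence producer: each next word depends only on the
# current state, never on the history of earlier choices.
import random

transitions = {
    "START": [("the", "DET")],
    "DET":   [("man", "NOUN"), ("book", "NOUN")],
    "NOUN":  [("sleeps", "END"), ("falls", "END")],
}

def generate(state="START"):
    words = []
    while state != "END":
        word, state = random.choice(transitions[state])
        words.append(word)
    return " ".join(words)

print(generate())   # e.g. "the book falls"
```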
Phrase Structure: Parsing (constituent analysis) provides linguistic description on the syntactic level. A grammar here is a finite set of initial strings E and a finite set of instruction formulas F (with an order in which to apply them) of the form X -> Y, i.e. 'rewrite X as Y'. Terminal string: a derived string that cannot be rewritten further using the rules. The set of terminal strings given (E, F) is the terminal language. All finite-state languages are terminal languages, but not all terminal languages are finite-state. Then write instruction formulas for turning morphemes into phoneme sequences (take + past -> /tuk/). This generates the sentence top to bottom rather than left to right.
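A minimal sketch of the (E, F) idea, using toy rules and words of my own: start from the initial string and apply the rewriting instructions until no symbol can be rewritten, which leaves a terminal string.

```python
# Each entry reads "rewrite this symbol as this sequence of symbols".
rules = {
    "Sentence": ["NP", "VP"],
    "NP": ["the", "N"],
    "VP": ["V", "NP"],
    "N": ["man"],
    "V": ["saw"],
}

def derive(string):
    steps = [string[:]]
    while True:
        for i, symbol in enumerate(string):
            if symbol in rules:                                  # leftmost rewritable symbol
                string = string[:i] + rules[symbol] + string[i + 1:]
                steps.append(string[:])
                break
        else:                                                    # nothing rewritable: terminal string
            return steps

for step in derive(["Sentence"]):
    print(" ".join(step))
# Sentence / NP VP / the N VP / the man VP / ... / the man saw the man
```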
Limitations of Phrase Structure: Distributive rule: Z+X+W and Z+Y+W can be merged if X and Y are constituents of the same type. E.g. "the scene, of the movie, was in Chicago" and "the scene, of the play, was in Chicago" give "the scene, of the movie and the play, was in Chicago". X and Y need to have been derived similarly for this to be applicable; in an (E, F)-type phrase-structure grammar it is not possible to determine their derivational history, so we cannot be sure. Phrase structure produces a sentence not left to right but top to bottom, like the derivation of a mathematical proof, and each step depends only on the previous step; but a powerful rule like the distributive rule needs to look back at much earlier steps. Similarly, a rule that inverts a sentence (John admires sincerity -> sincerity is admired by John), testing that the result is also grammatical, cannot be handled by (E, F). Transformational grammar: take the terminal string of morphemes from the phrase-structure grammar (E, F), apply a set of transformations (some obligatory, some optional) to get a string of words, then apply morphophonemic rules to get a string of phonemes. If only obligatory transformations are applied, the result is a kernel sentence. The grammar is simplified, because the (E, F) phrase-structure rules now only need to be given for kernel sentences; all other sentences are derivable from these by transformations. Is the grammar speaker-centric, i.e. about how to generate sentences (synthesis) but not how the hearer reconstructs them (analysis)? Neither: it is a description of the sentences it generates, like a chemical theory that generates the physically possible compounds and thereby serves as the basis for both qualitative analysis and synthesis.
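As a toy illustration of how a transformation operates on an analyzed kernel rather than on a raw word string, here is a crude optional "passive" sketch; the triple representation and the morphology are placeholders of my own, not Chomsky's notation.

```python
# Optional passive transformation on a pre-analyzed kernel:
# NP1 - V - NP2  becomes  NP2 - is V-ed by - NP1.
def passive(kernel):
    np1, verb, np2 = kernel                   # assumes an already-analyzed triple
    return (np2, "is " + verb + "d by", np1)  # "admire" -> "is admired by"

kernel = ("John", "admire", "sincerity")
print(" ".join(kernel))             # John admire sincerity (kernel morpheme string)
print(" ".join(passive(kernel)))    # sincerity is admired by John
```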
Goals of Linguistic Theory: Grammar as a theory of the language: it takes finite observations (sentences), establishes relations between constituents, and can therefore predict/generate new sentences. Conditions: adequacy (native speakers must accept the new sentences) and generality (it should use notions such as phrase and morpheme that are independent of the particular language); otherwise multiple internally consistent grammars are possible for any corpus of sentences. General theory: discovery procedure (find a grammar from a corpus), decision procedure (is grammar G1 the best possible one?), evaluation procedure (is G1 better than G2?).
English Transformations: Terminal string: John - C - eat + an + apple. Kernel sentence: John ate an apple (only the obligatory transformation applied). Tq: did John eat an apple? Tw1: what did John eat? Tw2: who ate an apple? Falling intonation goes with kernel sentences and rising intonation with yes/no questions, so a transformation that converts one into the other also converts the intonation. Wh-interrogatives take yes/no questions and transform them (did John eat an apple -> what did John eat), which converts the rising intonation back to a falling one, as in a declarative. The simplest transformational grammar leads to consequences that would otherwise appear inexplicable ('have you a book on X?' but not 'read you a book on X?'): apparent arbitrariness is actually higher-level regularity. Similarly, a pattern like T + N + is + Adj must generate 'the book seems interesting' and 'the very interesting book' but not 'the child seems sleeping' or 'the very sleeping child'; a Tadj transformation distinguishes V from Adj. The grammar is more complex if kernels contain both actives and passives; it is simpler to keep only actives plus a powerful transformation capable of converting them to passives.
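A rough sketch of the derivation in these notes, with the representation and function names my own and the morphology hard-coded for this single example:

```python
terminal = ["John", "C", "eat", "an", "apple"]   # terminal string of morphemes

def kernel(s):
    # obligatory transformation: C attaches to the verb as past tense (eat + past -> ate)
    out, i = [], 0
    while i < len(s):
        if s[i] == "C":
            out.append({"eat": "ate"}[s[i + 1]])
            i += 2
        else:
            out.append(s[i])
            i += 1
    return out

def t_q(s):
    # Tq: front the element C, spelled out as "did"; the verb stays in its bare form
    return ["did"] + [w for w in s if w != "C"]

def t_w(q):
    # Tw: delete the questioned object NP and front "what"
    return ["what"] + [w for w in q if w not in ("an", "apple")]

print(" ".join(kernel(terminal)))     # John ate an apple
print(" ".join(t_q(terminal)))        # did John eat an apple
print(" ".join(t_w(t_q(terminal))))   # what did John eat
```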
Explanatory Power of Linguistic Theory: Constructional homonymity: a phoneme sequence can have two interpretations (an aim / a name); the grammar should be able to resolve the ambiguity (morphology and phrase structure such that one clear interpretation emerges). Even higher levels than phrase structure are required to resolve phrasal ambiguities. Transformational ambiguity: kernels 1) I found the boy, 2) the boy is studying in the library; transforming these kernels yields either 1) I found, studying in the library, the boy, or 2) I found the boy studying in the library. When phrases are ambiguous, the ambiguity can be resolved by looking at the level of transformations (what kind, how many, etc.), as with yes/no interrogatives (Tq only) vs. who/what interrogatives (Tq + Tw1/2).
Syntax and Semantics: We are asking the wrong question (how to construct a grammar that takes account of meaning) as opposed to asking how grammar can be used to enhance the use of language. Phonemic distinctness defined by meaning (phonemes are distinct when they change meaning) fails in cases of synonyms, homonyms, accents, and multiple or metaphorical meanings. Instead, the linguist does not appeal to meaning in order to establish phonemic distinctness (medal vs. metal): simply experiment, see whether someone can consistently distinguish the two, and elaborate and vary the test until you have an operational criterion.
When Syntactic Structures appeared in 1957, published by Mouton & Co., it marked not merely a turning point in linguistic theory but a cognitive revolution across the human sciences. Noam Chomsky, then a young linguist at MIT, produced in under a hundred pages what became the intellectual equivalent of a Copernican shift.
Syntactic Structures did not simply propose a new grammar; it redefined what it meant to study language scientifically. To understand the magnitude of this transformation, one must read the book not only as a technical model but also as a philosophical manifesto—a radical reorientation of linguistics from descriptive empiricism to formal generativism.
Before Chomsky, the dominant paradigm in American linguistics was that of Bloomfieldian structuralism, rooted in the empiricist philosophy of behaviorism. Language was studied as a set of observable habits — sound patterns and distributions — to be analyzed through inductive generalization. The linguist’s task was to catalogue data, not to theorize mental processes.
In psychology, B.F. Skinner’s Verbal Behavior (1957) argued that speech was learned through conditioning — a stimulus–response mechanism reinforced by reward. Chomsky’s Syntactic Structures, appearing the same year, was an intellectual counterattack. He claimed that no amount of behavioral conditioning could account for the infinite creativity of human language — our ability to produce and comprehend sentences never heard before.
Thus, the book was revolutionary not only linguistically but cognitively: it reasserted the mental reality of grammar, introducing what would later be known as the “Cognitive Revolution.”
Syntactic Structures is deceptively concise. Across its ten chapters, Chomsky constructs a formal theory of grammar grounded in mathematical precision. The work’s structure itself mirrors his argument — beginning with simple assumptions and building toward abstract generative principles.
The argument progresses roughly as follows:
Introduction — definition of linguistic theory’s aim: to produce a finite description capable of generating all and only the grammatical sentences of a language.
The Independence of Grammar — grammar as autonomous from meaning and use.
Immediate Constituents — hierarchical phrase structure over linear sequencing.
Phrase Structure Grammar — rules that recursively generate sentence constituents.
Transformations and the Auxiliary — transformational operations that alter structures while preserving meaning.
A Model for Syntactic Description — integration of phrase-structure and transformational rules.
Some Transformations in English — concrete examples (e.g., negation, question formation).
Summary and Conclusion — implications for linguistic theory.
In this progression, Chomsky builds a formal system where syntax is generative — capable of producing an infinite number of grammatical sentences from a finite set of rules.
At the heart of the book lies Chomsky’s concept of transformational-generative (T-G) grammar. He distinguishes between two layers of structure:
Phrase Structure Rules, which generate basic “kernel sentences.”
Transformational Rules, which derive new sentences from these kernels (e.g., active → passive, declarative → interrogative).
For example:
Kernel: The boy is reading the book.
Transformation: Is the boy reading the book?
These operations are not random; they obey strict formal rules. The idea that syntax could be described as a formal generative system—a set of algorithms—was revolutionary. It introduced the notion that the human brain must contain an internal grammar generator, capable of infinite creativity within finite means.
This insight paralleled discoveries in mathematics (Gödel), logic (Carnap), and computer science (Turing), situating Chomsky’s linguistics within the broader intellectual movement toward formal systems.
Though only implicit in Syntactic Structures and made explicit in later works like Aspects of the Theory of Syntax (1965), the distinction between competence (internalized knowledge of language) and performance (actual language use) originates here.
For Chomsky, linguistic theory should describe the ideal speaker-hearer’s competence—the mental system that underlies linguistic intuition—rather than the messy surface data of performance, which are affected by memory, distraction, and context.
This distinction repositioned linguistics from being a social or behavioral science to a branch of cognitive psychology. Language was no longer merely an external behavior but an internal mental faculty—the first serious model of “mind as computation.”
Chomsky’s formalism in Syntactic Structures borrows heavily from symbolic logic. He employs recursive rules, phrase-structure trees, and transformational notations reminiscent of logical calculus.
For instance, he writes:
S → NP + VP
NP → Det + N
VP → V + NP
Phrase-structure rules of this kind define how smaller constituents build into larger ones, generating hierarchical structure; when such rules are allowed to apply recursively, a finite rule set yields unboundedly many sentences. Transformational rules then operate on these structures, creating syntactic diversity while maintaining grammaticality.
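As a concrete (and entirely unofficial) illustration of the generative idea, the sketch below expands rules of the shape shown above top-down from S; the toy lexicon and the random choice among alternatives are my own additions.

```python
# Top-down expansion of toy phrase-structure rules.
import random

grammar = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["boy"], ["book"]],
    "V":   [["reads"], ["finds"]],
}

def expand(symbol):
    if symbol not in grammar:            # a word: nothing left to rewrite
        return [symbol]
    children = random.choice(grammar[symbol])
    return [word for child in children for word in expand(child)]

print(" ".join(expand("S")))             # e.g. "the boy finds a book"
```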
This mathematical approach gave linguistics a new scientific precision. It allowed grammars to be tested for generative adequacy (their ability to generate all and only grammatical sentences). In effect, Chomsky turned language into a formal object of inquiry—a system governed by rule-based computation rather than descriptive observation.
Beyond its technical apparatus, Syntactic Structures represents a profound philosophical stance: the revival of Cartesian rationalism against the empiricism of the mid-20th century.
Chomsky argued that humans are born with an innate linguistic capacity—a “language faculty” or “universal grammar” that constrains possible human languages. This idea reasserted the notion of inborn mental structures, a view reminiscent of Descartes and Leibniz but radical in a behaviorist age.
By claiming that children acquire language not through imitation but through the internal activation of an innate system, Chomsky shifted the focus of linguistics from the external to the internal—from surface utterances to the deep architecture of mind.
The implications were immense. Linguistics became a window into cognition, and Chomsky’s ideas laid the groundwork for modern cognitive science, artificial intelligence, and neurolinguistics.
Despite its groundbreaking nature, Syntactic Structures has drawn significant criticism.
a) Empirical Thinness: The book provides few examples beyond English and rarely engages with cross-linguistic data. Its formal models, elegant though they are, rest on idealized intuitions rather than large-scale evidence. Later linguists, especially typologists and corpus linguists, have argued that such abstraction risks detaching theory from linguistic reality.
b) Meaning Neglected: Chomsky’s famous assertion that “Colorless green ideas sleep furiously” is grammatically correct though meaningless. His point was to separate syntax from semantics — to show that grammaticality does not depend on meaning. However, critics like George Lakoff and the cognitive linguists later argued that meaning and structure are inseparable, that syntax itself is meaningful.
c) Competence vs. Performance Dichotomy: While theoretically useful, the separation between competence and performance ignores how social and pragmatic contexts shape language. Sociolinguists like Dell Hymes and Labov rejected the “ideal speaker-hearer” as an abstraction divorced from reality.
d) Excessive Formalism: By modeling grammar mathematically, Chomsky inspired both admiration and alienation. Many linguists felt his approach privileged formal elegance over descriptive richness.
Nonetheless, these critiques only underscore the depth of Chomsky’s provocation: he redefined what questions linguists must ask, not merely how they should answer them.
Influence and Legacy: The impact of Syntactic Structures cannot be overstated. It transformed linguistics from a subfield of anthropology into a formal cognitive science. Generative grammar became the dominant paradigm for decades, spawning entire subfields — Government and Binding Theory, Minimalism, and more.
Outside linguistics, its influence rippled through:
Psychology: the birth of cognitive psychology (e.g., Miller, Fodor).
Computer Science: the foundations of formal language theory and programming language design.
Philosophy: renewed debates about rationalism, intentionality, and mind-body dualism.
Education: reshaping theories of language acquisition and grammar pedagogy.
Chomsky’s ideas also sparked intellectual opposition—Halliday’s systemic functional linguistics, Lakoff’s cognitive linguistics, and the pragmatic turn of the 1980s all define themselves partly against him. Yet even opposition confirms his foundational status.
To a modern reader, Syntactic Structures may appear terse, even austere. Its notation can feel archaic compared to later elaborations like the Minimalist Program. Yet its enduring power lies in its conceptual clarity. Every page radiates intellectual daring — the conviction that language, the most mysterious of human faculties, can be formally modeled and mentally explained.
Today, computational linguistics and AI models like GPT and BERT are distant descendants of Chomsky’s original insight: that language generation is rule-based, hierarchical, and combinatorial. Even those who reject his specific theories remain indebted to his methodological legacy — the idea that to understand mind, one must model structure.
Syntactic Structures was not just a linguistic treatise; it was a philosophical intervention in the theory of knowledge. It dethroned empiricism, revived mentalism, and provided the first serious scientific basis for studying mind.
Chomsky’s claim that language acquisition is “an example of the poverty of the stimulus” — that children know more than they are taught — became a cornerstone of modern epistemology. It reintroduced the idea of innate structures into scientific discourse.
Critics like Quine and Skinner resisted this nativism; others, like Fodor and Dennett, embraced and extended it. Whether one agrees or not, the fact remains: Syntactic Structures changed the direction of twentieth-century thought.
If Yule’s The Study of Language is a map of linguistics, Chomsky’s Syntactic Structures is its earthquake. It reconfigured the intellectual terrain, forcing all subsequent inquiry to position itself relative to it — whether in alignment or rebellion.
In less than a hundred pages, Chomsky transformed the study of language from a catalog of patterns to a theory of mind. His formal grammar became not just a linguistic tool but a metaphor for human creativity: finite rules generating infinite possibilities.
Syntactic Structures remains a work of profound audacity — terse, technical, yet tectonic in consequence. Its brilliance lies not in the minutiae of its rules but in its intellectual ambition: to show that the secret architecture of human thought is written, quite literally, in syntax.
For linguistics, psychology, and philosophy alike, it remains the book that taught us that grammar is not a manual of correctness — it is a mirror of the mind.
"We can describe circumstances in which a 'quantificational' sentence such as "everyone in the room knows at least two languages" may be true, while the corresponding passive "at least two languages are known by everyone in the room" is false, under the normal interpretation of these sentences -- e.g., if one person in the room knows only French and German, and another only Spanish and Italian. This indicates that not even the weakest semantic relation (factual equivalence) holds in general between active and passive."
Grammar is a device for generating the sentences of a language. Those sentences are grammatical which are acceptable to speakers of the language in which they are expressed. Thus we have, respectively, a theoretical component and an empirical component of a linguistic theory. Understanding grammar as a device makes the purpose of linguistic theory the exploration of a grammar's particular mechanisms for sentence generation, while its ability to generate sentences that sound grammatical to a speaker's ear validates, to the extent that it generates all and only the grammatical sentences, that grammatical device for a specific language.
An example of a simple yet insufficient grammatical device is the Markov process, which uses a matrix of frequencies (derived from some stock of observed sentences) that can be used to generate a phonetically ordinary sentence by random sampling. This is the same device used in information theory to demonstrate the level of redundancy in natural languages like English. The reason for its insufficiency is that even though the sentences it generates are phonetically ordinary, they are often not grammatical. What is required is to further constrain this device with additional rules such that it generates fewer ungrammatical sentences while retaining all of the grammatical ones.
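A minimal sketch (my own, not from the book) of the frequency-based idea described above: estimate bigram counts from a tiny "corpus" and generate by random sampling. The output tends to be locally plausible but is often not grammatical as a whole, which is the insufficiency at issue.

```python
import random
from collections import defaultdict

corpus = "the man saw the book . the book was old . the man was old .".split()

bigrams = defaultdict(list)
for prev, word in zip(corpus, corpus[1:]):
    bigrams[prev].append(word)            # frequency enters via repeated entries

def sample(start="the", max_len=8):
    words = [start]
    while words[-1] != "." and len(words) < max_len:
        words.append(random.choice(bigrams[words[-1]]))
    return " ".join(words)

print(sample())   # e.g. "the book was old ." or "the man saw the book was old"
```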
Phrase structure is one such additional constraint. If one denotes "noun phrase" and "verb phrase" by NP and VP respectively, then one can characterize a sentence as NP+VP. Noun and verb phrases will have a noun or verb as their root with a possible affix on either end. Hence NP = X+N+Y and VP = X+V+Y, where N or V are the roots, X is a prefix variable, and Y a suffix variable. Simple examples of prefixes of a noun phrase are the definite or indefinite articles ("AN apple" or "THE house"). Phrases can also be recursive, such that an affix of an NP or VP can itself be an NP or VP.
A Markov process which is further constrained to generate only sentences which are decomposable into a recursive phrase structure will eliminate many of the ungrammatical sentences that might have otherwise been generated, but not all of them. For example, the tense used to express a statement will require consistency across parts of the phrase structure in order to be grammatical. The sentence "When I woke this morning I brush my teeth" is not grammatical; we require "brush-ed" in the past tense for consistency with "woke". Another example would be the choice of copula used to bind a noun to a predicate depending on the noun's plurality (a choice of "is" or "are" for "he" or "they").
Chomsky proposes that we consider a subset of "kernel" sentences which are grammatical by dint of having started only with elements that will always be consistent in the phrase structure, but that we then allow a set of transformation rules which, when applied, take the kernels to new sentences in the language in a way that preserves the property of being grammatical. The transformation rules are what principally define this work, as does the chain of reasoning that leads us here, to wit, the notion of adequacy of a device for generating grammatical sentences.
A large section of this monograph is devoted to providing transformation rules for the English language, merely as an example, and purports to show by this that the method can simplify our understanding of some of English grammar's more unusual locutions. It's a bit of a slog. I think what makes this monograph really worth reading are some of the introductory and summary remarks in each section and around the whole work, which outline quite clearly what the aims of the theory are and why the author believes they are the correct ones to take, at least in the context of the philosophical trends in research science that prevailed at the time it was written. If you are at all familiar with Carnap, Shannon, Turing, Quine or Goodman, you should recognize the influences here. This is linguistics as a hard, computational science, not the branch of humanities that fell out of philology ages ago. I don't know of any other work coming out of the analytic tradition which has had so great an impact on a branch of science as the current work has had on linguistics.
"In 2005, I published a paper in the journal Current Anthropology, arguing that Pirahã – an Amazonian language unrelated to any living language – lacked several kinds of words and grammatical constructions that many researchers would have expected to find in all languages"
"...it is astounding that the point that has so inflamed my academic critics was my claim that the Pirahãs lacked subordinate clauses." Dan Everett
The book where it all began. It's a (relatively) clearly written book, and very useful in reviewing higher concepts about what we do when we work in transformational grammar. The chapter on syntax and semantics is especially lovely---that one gets 5 stars.
Word of warning: the book is very technical, especially from Chapter 5 onwards. The book presents Chomsky's theory of a "transformational grammar", which is very...cool (I guess). But currently I can only understand about 40 per cent of the book. Will need a reread (or lots of rereads).
Almost ten years ago I sat in a linguistics class in undergrad, and I distinctly remembered the "Colorless green ideas sleep furiously" sentence Chomsky uses in this text. I also remembered the talk about theoretical grammars with only 2 letters and very simple rules, like "aaabbb" or "ababab" or "abbbba"...It's strange to me how that was already nearly 10 years ago, because I'm a totally different person. I bet Chomsky feels like a totally different person several times over since he wrote this, given that this book is something like 70 years old now.
More than anything, reading this book teaches me that I probably shouldn't go into linguistics. Sure, there's some overlap with my computer science interests, but I'm not passionate enough about algorithms and grammars to want to pursue this further than as an occasional curiosity. Perhaps the most relevant excuse for reading this text in 2025 is the comparison between what Chomsky describes here and what AI can do. Thankfully, after those 70 or so years, Chomsky has honed his skills as a communicator, and he explains nicely the distinction between natural intelligence (humans can extrapolate from very partial examples and context) and artificial intelligence (computers require an excessively large data set to only get mediocre answers). And to clarify my complaints, Max, I'm sorry I said AI doesn't know anything but 'averages.' I realized I was being too kind. AI isn't even as good as the average; instead, it only understands the lowest common denominator, the Median, which in all cases of language, is incredibly lower than the Mean, which is raised up artificially by the geniuses of past ages.
One of the other Goodreads reviewers complained that Chomsky started linguistics down a fruitless path of hyperfixating on universal grammars, but I don't even know enough about linguistics to know if A) that's even true in the first place, and B) whether that's a bad thing. It seems that one of his main claims is that "no theory of linguistic structure based exclusively on Markov process models and the like, will be able to explain or account for the ability of a speaker of English to produce and understand new utterances, while he rejects other new sequences as not belonging to the language."
He describes these "finite state Markov processes" earlier in the text, drawing a helpful diagram of the options for which words a "machine" (proto-computer?) might choose to complete a sentence. In this sense, I don't think AI has proven him wrong at all, but I could totally be mistaken. I take that diagram more as inspiration for my own thinking about cliched thinking, and how you can treat the average person more as an algorithm than as an original thinker, especially when concerning politics or other overwrought topics.
I do think the other reviewer who complained about Chomsky was on to something, since looking back at my bookmarks, I noticed this same claim recurring: "More generally, linguists must be concerned with the problem of determining the fundamental underlying properties of successful grammars. The ultimate outcome of these investigations should be a theory of linguistic structure in which the descriptive devices utilized in particular grammars are presented and studied abstractly, with no specific reference to particular languages".
In other words, why grammar uber alles? Why not "phonetics, morphology, dialectology, [or] sociolinguistics", to quote the top Goodreads reviewer? I don't know. It seems Chomsky was hellbent on a scientific approach to language, in the most cutting and dryly philosophical way he could think of. As he writes in the introduction: "By pushing a precise but inadequate formulation to an unacceptable conclusion, we can often expose the exact source of this inadequacy and, consequently, gain a deeper understanding of the linguistic data". In other words, he wanted to take the Markov processes to task and refute certain ideas about grammar being finite. As he puts it, "In this respect, a grammar mirrors the behavior of the speaker who, on the basis of a finite and accidental experience with language, can produce or understand an indefinite number of new sentences."
The main thing Chomsky is fascinated with is how elegant the human mind is when learning and using grammar; it's decidedly unlike a computer with every possibility mapped out ahead of time. In that way, Occam's razor gives a close shave: "any attempt to present directly the set of grammatical phoneme sequences would lead to a grammar so complex that it would be practically useless." In other words, the challenge is how to deal with something as complex and theoretically infinite as human language, but to put it into a quasi-mathematical/scientific language. I have no clue how successful he was, but it's something that gave undergrad Mark a lot to think about, and it's still something interesting to chew on, though I'm not sure how to swallow it...
I'm still reading the classics of linguistics. Syntactic Structures is one of Chomsky's earliest works and perhaps his most original; this is the book that established transformational grammar as a major trend in linguistics. It is the fourth volume in the Janua Linguarum series, of which I read the first volume last week. Chomsky is concerned with grammar as a device for distinguishing grammatical from ungrammatical sentences, or generating any grammatical utterance without generating any ungrammatical ones (as defined operationally by the behavior of native speakers). He begins by considering and rejecting two simpler models for grammar: the idea that grammars can be modeled by finite Markov processes (like a finite-state machine, where each state is determined only by the previous one), which he shows is inadequate with examples like either...or, if...then, etc., and immediate constituent analysis, which is similar to traditional grammar.
He then interrupts the argument for a general consideration of the goals of a linguistic theory, which was very interesting from a philosophical viewpoint. After this, he goes on to propose that there is another level distinct from phrase structure which he calls the transformational level; he argues that there are certain "kernel" sentences which are terminal states of the phrase structural level (all simple affirmative declarative sentences) which are then transformed by an ordered series of transformational rules, obligatory or optional. He shows how this simplifies the explanation of things like the passive, interrogations, and negations, and compound expressions. He then gives detailed examples from English. The book is difficult at the beginning; I almost wished that I could have somehow read it backwards, because the earlier chapters make more sense in the light of the later ones.
The transformational approach is certainly a powerful way to consider grammar. It has since been very much modified, in large part by Chomsky himself, but this was the beginning. It is a book that anyone with a serious interest in language should read. I only wish that he had given examples from other languages with grammars very differently structured than English, because he claims that the transformational approach is a universal which should apply to all language grammars.
For the most part, lucidly written, and it articulates its main points rather well. Chomsky sets out the goals of transformational grammar and generative linguistics very modestly, to the point where it clashes with his public persona in the present, showing the humble foundations within linguistics from which we eventually arrived at Minimalism, Move and Merge as the toolkits of the modern language scientist. However, those seeking a philosophical or more scientifically (in an empirical sense) inclined discussion of the nature of language will have to dig through the scraps of subtext afforded here. While I personally think Chomsky's conception of language leaves a lot to be desired, and that in the end it sets up a house of cards that many a breeze has since succeeded in shaking (though most if not all of those critiques have factored into the conception of Minimalism), the foundations of thought laid bare here are still an open target. That, though, has little to do with the work in question, its structure, or its presentation, and more with the habitus it established and survives in. In that sense, it won't convince you one way or the other if you already have an informed opinion of generativism. For better or for worse, it also suffers from a poverty of any substantial philosophy. It is the manual it hopes to be.
I was a little surprised by how little time was allocated towards syntax and Chomskyan linguistics in my education, so I wanted to explore a little primer in both. At times, there were lines of reasoning I found difficult to follow, proofs I would have liked to see expanded, and examples that I thought were flawed for one reason or another. All the same, I do think his argument is compelling, and I am interested to engage with more work on transformational syntax. As I am anything but fluent in symbolic logic, I don’t profess to have understood every word, but I see now why Chomsky is so highly esteemed in the field. I am sure his critics have valid points of their own, but I can’t help but admire the intellectual rigor with which he attempts to make his case. His voice struck me as both wise and humble, and his language was about as clear and succinct as a book on this subject could be. At times I was dubious, but given my background I don’t feel qualified to dispute any of his claims. How’s that for humility? Great read, I really enjoyed it, 4.5 stars
I enjoyed several chapters of this book including 2. The Independence of Grammar, 5. Limitations of Phrase Structure Description, 6. On the Goals of Linguistic Theory, 9. Syntax and Semantics. On the other hand, the raw syntactic conversation in 7. Some Transformations in English wasn't for me.
I found his arguments against the theory that "semantic information is required for discovering or selecting a grammar" particularly interesting and convincing.
"Our ultimate aim is to provide an objective, non-intuitive way to evaluate a grammar once presented, and to compare it with other proposed grammars. We are thus interested in describing the form of grammars (equivalently, the nature of linguistic structure) and investigating the empirical consequences of adopting a certain model for linguistic structure, rather than in showing how, in principle, one might have arrived at the grammar of a language."
Fascinating analysis of how to model the syntax of human language, which revolutionised the field of linguistics. It is worth reading not just to better understand language, but to see how Chomsky approached the problem. He initially lays out an approach based on Markov chains, showing the benefits of this approach and also its limitations. He then goes on to show an alternative model based on assuming there is a finite set of phrase structures, before finally building toward a "transformational grammar" approach relying on underlying phrase structures (kernel sentences), plus transformations on these phrase structures to produce a wide range of (ultimately all) grammatical sentences.
Note that this book does not go into the concept of "universal grammar", which Chomsky later developed, hypothesizing that certain elements of grammar in human language are innate to humans rather than learned (and could be thought of as genetic, or a priori, knowledge).
It really makes me wonder when Chomsky discovered Alan Turing, since it's obvious he has by now but...seems like he hadn't yet. And whether he's ever encountered Chaitin. Between Chaitin and Turing some of this stuff seems...weird but he can't be accountable for things that happened since this book was published (and Turing's work was secret/obscure for much of his life). This is a work of computer science, imho, even if he didn't know that's what he was grasping for.
Grammar is quite obviously divorced from semantics. This was stressed a little more than necessary. The formulaic nature of his syntactic analysis is quite inventive and was no doubt very useful at the time, given that this is regarded as a progenitor for much of the linguistic theory that followed. It is however probably less relevant today and only concerns the English language, meaning it isn’t suitable for those seeking to learn more about Chomsky’s theory of a universal grammar. Three stars.
One of the top 100 most influential books ever written. Now available in Vietnamese :3

The book is brutally hard to read, but a friend told me that happiness lies in understanding, not in knowledge, so I kept at it as a challenge for the start of the year =)) the topic of language still fascinates me, so no problem :3
Some parts were very clear and understandable, and they gave me some useful insights. However, other chapters were so complex I lost the will to live lol. But I’m not giving up on you, Chomsky, no worries
I didn't understand most of it (This was my first exploration into linguistics), but it was interesting. Will probably come back to it once I am more versed in the subject.