Generative grammar is a linguistic theory that considers grammar to be a system of rules that generate exactly those combinations of words which form grammatical sentences in a given language. The term was originally used in relation to the theoretical linguistics of grammar developed by Noam Chomsky, beginning in the late 1950s. Linguists who follow the generative approach have been called generativists. The generative school has focused on the study of syntax, but has also addressed other aspects of a language's structure, including morphology and phonology.
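The core idea, that a finite set of rewrite rules generates exactly the set of grammatical word strings, can be made concrete with a toy example. The following sketch is purely illustrative (the rules and vocabulary are invented here, not drawn from any particular linguist's analysis): a tiny set of phrase-structure rules expanded exhaustively from the start symbol S.

```python
# A toy generative grammar: each nonterminal maps to its possible
# expansions; anything not in RULES is a terminal word.
RULES = {
    "S":  [["NP", "VP"]],          # sentence -> noun phrase + verb phrase
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["cat"], ["dog"]],
    "V":  [["sees"], ["chases"]],
}

def generate(symbols):
    """Yield every terminal string derivable from a list of symbols."""
    if not symbols:
        yield []
        return
    head, rest = symbols[0], symbols[1:]
    if head in RULES:                      # nonterminal: try each expansion
        for expansion in RULES[head]:
            yield from generate(expansion + rest)
    else:                                  # terminal word: keep it and recurse
        for tail in generate(rest):
            yield [head] + tail

sentences = [" ".join(words) for words in generate(["S"])]
# With two nouns and two verbs, the grammar licenses exactly
# 2 x 2 x 2 = 8 sentences, such as "the cat chases the dog",
# and nothing else: ungrammatical strings are simply never produced.
```

This is the sense in which the grammar "generates" the language: grammaticality is membership in the set of derivable strings, not a judgment applied after the fact.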
Early versions of Chomsky's theory were called transformational grammar, a term still used to cover his subsequent theories, the most recent of which is the Minimalist Program. Chomsky and other generativists have argued that many of the properties of a generative grammar arise from a universal grammar that is innate to the human brain, rather than being learned from the environment (see the poverty of the stimulus argument).
There are a number of versions of generative grammar currently practiced within linguistics. A contrasting approach is that of constraint-based grammars: where a generative grammar attempts to list all the rules that result in all well-formed sentences, constraint-based grammars allow anything that is not otherwise constrained. Constraint-based grammars that have been proposed include certain versions of dependency grammar, head-driven phrase structure grammar, lexical functional grammar, categorial grammar, relational grammar, link grammar, and tree-adjoining grammar. In stochastic grammar, grammatical correctness is taken as a probabilistic variable, rather than a discrete (yes vs. no) property.
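The probabilistic view of grammaticality can be sketched by attaching weights to production rules, so that a sentence receives a graded score rather than a yes/no verdict. The grammar and probabilities below are invented for illustration only; real stochastic grammars are induced from corpora.

```python
# Each production carries a probability; the probabilities for each
# nonterminal's alternatives sum to 1. A derivation's score is the
# product of the rules it uses.
PRULES = {
    "S":  [(["NP", "VP"], 1.0)],
    "NP": [(["the", "N"], 1.0)],
    "VP": [(["V", "NP"], 1.0)],
    "N":  [(["cat"], 0.7), (["dog"], 0.3)],
    "V":  [(["sees"], 0.6), (["chases"], 0.4)],
}

def score(symbols, words):
    """Return the total probability of deriving `words` from `symbols`
    (0.0 when no derivation exists, i.e. the string is ungrammatical)."""
    if not symbols:
        return 1.0 if not words else 0.0
    head, rest = symbols[0], symbols[1:]
    if head in PRULES:                         # nonterminal: sum over expansions
        return sum(p * score(exp + rest, words)
                   for exp, p in PRULES[head])
    if words and words[0] == head:             # terminal must match the next word
        return score(rest, words[1:])
    return 0.0

p_good = score(["S"], "the cat sees the dog".split())  # 0.7 * 0.6 * 0.3 = 0.126
p_bad  = score(["S"], "cat the sees dog the".split())  # 0.0: no derivation
```

On this view the discrete notion of grammaticality reappears as the limiting case of probability zero, while well-formed sentences differ from one another in how likely the grammar makes them.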