The critical period hypothesis is the subject of a long-standing debate in linguistics and language acquisition over the extent to which the ability to acquire language is biologically linked to age. The hypothesis claims that there is an ideal time window to acquire language in a linguistically rich environment, after which further language acquisition becomes much more difficult and effortful.
The critical period hypothesis states that the first few years of life constitute the crucial window during which an individual can acquire a first language if presented with adequate stimuli. If language input does not occur until after this time, the individual will never achieve a full command of language—especially of grammatical systems.
The evidence for such a period is limited, and support stems largely from theoretical arguments and analogies to other critical periods in biology, such as visual development; nonetheless, the hypothesis is widely accepted. The nature of such a critical period, however, has been one of the most fiercely debated issues in psycholinguistics and cognitive science more broadly for decades. Some writers have suggested a "sensitive" or "optimal" period rather than a strictly critical one; others dispute the underlying causes, whether physical maturation or cognitive factors. The proposed duration of the period also varies greatly across different accounts.
In second-language acquisition, the strongest empirical evidence for the critical period hypothesis comes from the study of accent, where most older learners do not reach a native-like level. However, native-like accents have been observed under certain conditions, suggesting that accent is shaped by multiple factors, such as identity and motivation, rather than solely by a biological critical period constraint.
The critical period hypothesis was first proposed by Montreal neurologist Wilder Penfield and co-author Lamar Roberts in their 1959 book Speech and Brain Mechanisms, and was popularized by Eric Lenneberg in 1967 with Biological Foundations of Language.