Nuclear winter is the severe global climatic cooling effect hypothesized to occur after the ignition of widespread firestorms. Such fires, which can inject soot into the stratosphere, have historically occurred in a number of cities, with nuclear winter researchers using the Hamburg firestorm and the less ferocious one at Hiroshima as the principal examples. However, because firestorms are most frequently encountered in much larger wildfires, observations of wildfires are today the most significant evidence for assessing both the initial and present-day "nuclear winter" computer models.
"Nuclear winter," and its progenitor, "nuclear twilight," both refer to nuclear events as both were formulated at a time when attempts to quantify the climatic effects of large-scale nuclear war scenarios had hit a stumbling block, namely, the prior popular hypotheses that fireball generated NOx emissions would devastate the ozone layer was becoming increasingly unrealistic. It was within this early 1980s context, that the climate effects of soot from fires, was "chanced upon" and became the new focus of the climatic effects of "nuclear war". In these modelled scenarios, various "soot cloud" events, containing uncertain quantities of soot, were assumed to form over cities, oil refineries, and the more rural missile silos. Once the quantity of soot is decided upon by the researchers, the climate effects of these soot clouds are then modeled. The term "nuclear winter" was specifically coined in 1983 by Richard P. Turco to refer to computer model results where this soot/smoke remained in the air for almost a year, and during this time it caused monumentally severe planet-wide temperature drops ("winters")—apocalyptic models that Turco would later temper and distance himself from.