A global catastrophic risk is a hypothetical future event that could damage human well-being on a global scale. Some such events could cripple or destroy modern civilization. An event that could cause human extinction, or permanently and drastically curtail humanity's potential, is known as an existential risk.
Potential global catastrophic risks include but are not limited to hostile artificial intelligence, nanotechnology weapons, climate change, nuclear warfare, total war, and pandemics.
Researchers have difficulty studying human extinction directly, since humanity has never been destroyed before. While this does not mean that it will not be destroyed in the future, it does make modelling existential risks difficult, due in part to survivorship bias.
Philosopher Nick Bostrom classifies risks according to their scope and intensity. A "global catastrophic risk" is any risk that is at least "global" in scope and is not subjectively "imperceptible" in intensity. Those that are at least "trans-generational" (affecting all future generations) in scope and "terminal" in intensity are classified as existential risks. While a global catastrophic risk may kill the vast majority of life on Earth, humanity could still potentially recover. An existential risk, on the other hand, either destroys humanity entirely (presumably along with all but the most rudimentary non-human lifeforms and plant life) or prevents any chance of civilization recovering. Bostrom considers existential risks to be far more significant.
Similarly, in Catastrophe: Risk and Response, Richard Posner singles out and groups together events that bring about "utter overthrow or ruin" on a global, rather than a "local or regional", scale. He regards such events as worthy of special attention on cost-benefit grounds because they could directly or indirectly jeopardize the survival of the human race as a whole. Posner's examples include meteor impacts, runaway global warming, grey goo, bioterrorism, and particle accelerator accidents.