Global annihilation


A global catastrophic risk is a hypothetical future event that has the potential to damage human well-being on a global scale. Some events could cripple or destroy modern civilization. Any event that could cause human extinction or permanently and drastically curtail humanity's potential is known as an existential risk.

Potential global catastrophic risks include anthropogenic risks (technology risks, governance risks) and natural or external risks. Examples of technology risks are hostile artificial intelligence, biotechnology risks, and nanotechnology weapons. Insufficient global governance creates risks in the social and political domain, potentially leading to a global war with or without a nuclear holocaust, bioterrorism using genetically modified organisms, cyberterrorism destroying critical infrastructure such as the electrical grid, or the failure to manage a natural pandemic. It also creates problems and risks in the domain of earth system governance, with risks arising from global warming, environmental degradation (including the extinction of species), and famine resulting from inequitable resource distribution, human overpopulation, crop failures, and non-sustainable agriculture. Examples of non-anthropogenic risks are an asteroid impact event, a supervolcanic eruption, a lethal gamma-ray burst, a geomagnetic storm destroying all electronic equipment, natural long-term climate change, and extraterrestrial life impacting life on Earth.
