Centre for the Study of Existential Risk

Founders: Huw Price, Martin Rees, Jaan Tallinn
Purpose: Higher education and research
Headquarters: Cambridge, England
Parent organization: University of Cambridge
Website: cser.org

The Centre for the Study of Existential Risk (CSER) is a research centre at the University of Cambridge, intended to study possible extinction-level threats posed by present or future technology. The centre's co-founders are Huw Price (a philosophy professor at Cambridge), Martin Rees (a cosmologist, astrophysicist, and former President of the Royal Society), and Jaan Tallinn (a computer programmer and co-founder of Skype). CSER's advisors include philosopher Peter Singer, computer scientist Stuart J. Russell, statistician David Spiegelhalter, and cosmologists Stephen Hawking and Max Tegmark. The centre's stated "goal is to steer a small fraction of Cambridge's great intellectual resources, and of the reputation built on its past and present scientific pre-eminence, to the task of ensuring that our own species has a long-term future."

The centre's founding was announced in November 2012. Its name stems from Oxford philosopher Nick Bostrom's concept of existential risk, or risk "where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential". This includes technologies that might permanently deplete humanity's resources or block further scientific progress, in addition to ones that put the species itself at risk.

Among the global catastrophic risks to be studied by CSER are those stemming from possible future advances in artificial intelligence. The potential dangers of artificial general intelligence were highlighted in early discussions of CSER, and were likened in some press coverage to a robot uprising à la The Terminator. Price explained to the AFP news agency, "It seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology". He added that when this happens "we're no longer the smartest things around," and humanity will risk being at the mercy of "machines that are not malicious, but machines whose interests don't include us". Price has also described synthetic biology as dangerous because "[as a result of] new innovations, the steps necessary to produce a weaponized virus or other bioterror agent have been dramatically simplified" and, consequently, "the number of individuals needed to wipe us all out is declining quite steeply."
