References

My views on "doom"

Paul Christiano (2023)

AI Alignment Forum.

URL: https://www.alignmentforum.org/posts/xWMqsvHapP3nwdSW8/my-views-on-doom

Abstract. Christiano's published estimates of catastrophic-risk probabilities, with the reasoning that underlies them. He gives a roughly 22% probability of AI takeover; an 11% probability that AI kills more than a billion people; and a 46% probability that humanity has irreversibly damaged its future within 10 years of building powerful AI. He sketches the scenarios he finds most plausible and the conditions under which each becomes more or less likely. The post is widely cited as a reference point in the alignment community's discussion of risk, though Christiano stresses that the figures are subjective and may have shifted since publication.

Tags: safety alignment existential-risk
