AI development may be driven not only by competition but also by an elite impulse toward lifespan extension or quasi‑immortality: powerful actors tolerate very high aggregate risks because the upside for their own longevity or survival is personally transformational. If true, this motive helps explain why organizations accept nontrivial extinction probabilities, and why messaging about catastrophe can be instrumental rather than merely alarmist.
— If elites seek life‑extension or immortality via advanced AI, that reshapes regulatory debates, incentive design, and public trust: it reframes risk as a distributional and moral problem, not only a technical one.
Noah Smith
2026.03.26
100% relevant
Noah Smith cites Altman’s survival prepping and invokes historical quests for immortality (Genghis Khan), using those facts to hypothesize an immortality‑seeking driver behind tolerance for high existential risk.