Carl Shulman & Elliott Thornley argue that the goal of longtermists should be to get governments to adopt global catastrophic risk policies based on standard cost-benefit analysis rather than arguments that stress the overwhelming importance of the future.
https://philpapers.org/archive/SHUHMS.pdf
Note: Tables, notes and references in the original article have been omitted.
Holden Karnofsky — Success without dignity: a nearcasting story of avoiding catastrophe by luck
Larks — A Windfall Clause for CEO could worsen AI race dynamics
Otto Barten — Paper summary: The effectiveness of AI existential risk communication to the American and Dutch public
Elika Somani — Advice on communicating in and around the biosecurity policy community
Riley Harris — Summary of 'Are we living at the hinge of history?' by William MacAskill
Riley Harris — Summary of 'Longtermist institutional reform' by Tyler M. John and William MacAskill
Hayden Wilkinson — Global priorities research: Why, how, and what have we learned?
Piper — What should be kept off-limits in a virology lab?
Ezra Klein — This changes everything