Session 03 - Part 1
Pandemics and other Existential Risks
“With the advent of nuclear weapons, humanity entered a new age, where we face existential catastrophes—those from which we could never come back. Since then, these dangers have only multiplied, from climate change to engineered pathogens and [transformative] artificial intelligence. If we do not act fast to reach a place of safety, it will soon be too late.”
—Toby Ord
Required materials
Introduction
Humanity appears to face existential risks: risks that we could permanently destroy our long-term potential. We’ll examine why reducing existential risk might be a moral priority, and explore why these risks are so neglected by society. We’ll also look into one of the major risks we might face: a human-made pandemic worse than COVID-19.
Alongside this, we'll introduce you to the concept of “expected value” and explore whether you could lose all of your impact by missing one crucial consideration.
Key concepts from this session include:
Expected value: We’re often uncertain about how much an action will help. In such circumstances, it can make sense to weigh each possible outcome by the likelihood that it occurs and pick the action that looks best in expectation (see the short worked example below).
Crucial considerations: It can be extremely hard to figure out whether some action helps your goal or causes harm, particularly if you’re trying to influence complex social systems or the long term. This is part of why it can make sense to do a lot of analysis of the interventions you’re considering.
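To make the idea concrete, here is a minimal sketch of an expected-value comparison. The numbers and project names are purely illustrative assumptions, not taken from the readings:

```python
# Minimal expected-value sketch with made-up numbers (illustrative only).
# Each action is represented as a list of (probability, value) pairs,
# one pair per possible outcome; the probabilities should sum to 1.

def expected_value(outcomes):
    """Weigh each outcome's value by its probability and sum the results."""
    return sum(probability * value for probability, value in outcomes)

# Hypothetical comparison: a reliable project vs. a long shot with a huge payoff.
safe_project = [(0.9, 100), (0.1, 0)]      # 90% chance of helping 100 people
long_shot = [(0.01, 20_000), (0.99, 0)]    # 1% chance of helping 20,000 people

print(expected_value(safe_project))  # 90.0
print(expected_value(long_shot))     # 200.0 -> better in expectation despite long odds
```

The same logic underlies the “hits-based giving” reading later in this session: an action with only a small chance of success can still look best in expectation if its potential payoff is large enough.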
1. Existential risks
- The Precipice (To read: Chapter 2 only) (42 mins.)
3. Strategies for improving biosecurity
4. Expected value & Hits-based giving
More to explore
1. Biosecurity
- Current Topics in Microbiology and Immunology, Volume 424, Chapter 7 - Does Biotechnology Pose New Catastrophic Risks? - A description of the challenges of managing dual-use capabilities enabled by modern biotechnology. (60 mins.)
- Explaining Our Bet on Sherlock Biosciences' Innovations in Viral Diagnostics - Open Philanthropy Project - A report on their investment in Sherlock Biosciences to support the development of a diagnostic platform to quickly, easily, and inexpensively identify any human virus present in a sample. (15 mins.)
2. Biosecurity careers
- Apply for career advising from 80,000 Hours (though you might want to wait until later in the program, after you’ve learned more about other causes too)
- If you’d like, fill out this Google Form to stay up to date with longtermist biosecurity projects (from the post Concrete Biosecurity Projects)
- Apply to relevant positions on the EA Internships board
3. Reducing global catastrophic risks
- Reducing global catastrophic biological risks § Tentative career advice (10 mins.)
- Biotechnology and existential risk (15 mins. at 2× speed)
- Comparative advantage does not mean doing the thing you're best at (5 mins.)
- Reducing Global Catastrophic Biological Risks Problem Profile - 80,000 Hours (60 mins.)
- Global Catastrophic Risks, Chapter 20 - Biotechnology and Biosecurity - Biotechnological power is increasing exponentially, at a rate as fast as or faster than Moore's law, as measured by the time needed to synthesise a given DNA sequence. This has important implications for biosecurity. (60 mins.)
- Open until dangerous: the case for reforming research to reduce global catastrophic risk (Video - 50 mins.)
- Dr Greg Lewis on COVID-19 & the importance of reducing global catastrophic biological risks