In this week of the programme, you will grapple with how to think about risks that could permanently destroy humanity’s long-term potential. If we want to improve the long-term future, should addressing these risks be our top priority?
You will also discuss why emerging technologies, such as the ability to create genetically engineered pathogens and advanced artificial intelligence, could be major sources of existential risk.
Curriculum
Core materials
Recommended reading
More to explore
Specific risks:
General:
Exercise: Reducing extinction risk on current priorities
In this week's exercise, you will reflect on whether the case for putting more resources towards reducing extinction risk is compelling only if we value future generations.
If focussing more on extinction risks seems valuable even when we consider only the interests of the present generation, then it should seem even more valuable once we also take the interests of future generations into account.
Suppose an asteroid is about to hit the Americas and kill every US citizen. How much, at most, should the US government be willing to spend to avert this disaster?
In this case, we’re not asking for your view on how much the US government should put towards avoiding this outcome! Imagine instead that the government simply extended its evaluation of the badness of one US death to the case of every US citizen dying.
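To make the arithmetic concrete, here is a minimal sketch in Python. It assumes a value of a statistical life of roughly $10 million (in the range US agencies actually use) and a US population of roughly 330 million; both figures are illustrative assumptions, not part of the exercise itself.

```python
# Back-of-the-envelope: extend a single-death valuation to every US citizen.
# Both inputs below are assumed, illustrative figures.

VSL_USD = 10_000_000          # assumed value of a statistical life, ~$10 million
US_POPULATION = 330_000_000   # assumed US population, ~330 million people

# Willingness to pay implied by scaling one death's valuation to all citizens.
willingness_to_pay = VSL_USD * US_POPULATION

print(f"Implied willingness to pay: ${willingness_to_pay:,}")
# Prints: Implied willingness to pay: $3,300,000,000,000,000
```

On these assumptions, the implied figure is about $3.3 quadrillion, far larger than annual US GDP. The point is that extending a standard single-death valuation to everyone dying yields an enormous willingness to pay, even before future generations enter the picture.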