I think Leopold Aschenbrenner’s argument here is interesting to consider: https://worksinprogress.co/issue/securing-posterity/?ref=forourposterity.com
(full paper https://globalprioritiesinstitute.org/wp-content/uploads/Philip-Trammell-and-Leopold-Aschenbrenner-Existential-Risk-and-Growth.pdf)
Regarding the discount rate, a 0% pure time discount is pretty common in EA circles, I think, although many recognize it should be at least slightly above 0% to account for epistemic uncertainty about how long humanity will continue to exist.
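To make the link between survival uncertainty and the discount rate concrete (my own sketch, not something from the linked paper): if humanity faces a constant annual extinction probability p, then the expected value of a constant per-period utility stream u is

$$\sum_{t=0}^{\infty} (1-p)^t\, u = \frac{u}{p},$$

which is approximately the same as valuing the stream with a pure time discount rate of p per year. So even with zero intrinsic time preference, a hazard rate of, say, 0.1% per year behaves like a 0.1% annual discount rate.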