Most of the current attempts that interact with everyday people are random greenwashing trying to get people to sort recycling or use paper straws. Yes, solar panel tech is advancing, but that's mostly in the background of most people's day-to-day lives.
And all this goal promises is that things won't get worse via climate change. It isn't actually a vision of positive change.
A future with ultra-cheap energy, electric short-range aviation in common use, etc.
Building true artificial intelligence (AGI, or artificial general intelligence)
Half the experts are warning that this is a poisoned chalice. Can we really unite around this goal unless and until we conclude that the risk of human extinction from AGI takeover is low?
Also, if we do succeed in AGI alignment, the line from AGI to good things is very abstract.
What specific nice thing will the AGI do? (The actual answer is also likely to be a bizarre world full of uploaded minds or something. Real utopian futures aren't obliged to make sense to the average person within a five-minute explanation.)
Colonizing Mars
Feels like a useless vanity project. (See the moon landings: lots of PR, not much practical benefit.)
How about something like curing aging? Even the war on cancer was a reasonable vision of a positive improvement.
Thanks Donald, good feedback. I agree about maximizing good over minimizing bad. Curing aging, or extending healthspan, is a great one. Certainly an easier sell than becoming a multiplanetary species.