Sunday, August 21, 2022

Longtermism, a bizarre philosophy with some very influential adherents

You may or may not have come across a philosophy called "longtermism", a quasi-religious belief system promulgated by the likes of William MacAskill, Nick Bostrom and Carl Shulman, and supported by powerful and influential figures like Elon Musk and Jason Matheny.

In a nutshell, longtermism is the idea that, in the long-term future (think millions or billions of years), there will be so many digital people (think 10^58, according to one estimate) living in vast computer simulations that we have a moral obligation to ensure that as many of these people come into existence as possible. Part of that belief includes the need to colonize space as soon as possible, and to convert other planets into computers running simulations for this unfathomable number of future digital beings. It sees the earth, nature and all the planets, stars, asteroids, etc., as our "cosmic endowment", to be exploited to the limit in pursuit of this futuristic vision. Some, like Bostrom, take it even further and recommend genetic engineering to create super-high-IQ beings with "desirable traits".

I confess it all makes no sense to me. If you think the future is going to be a transhuman dystopia of this magnitude, why would you feel obliged to hasten it into reality as soon as possible? But then, who ever expected anything like logic from the likes of Elon Musk?
