An AI Company Just Fired Someone for Endorsing Human Extinction

by AndrewKemendo on 8/6/2025, 3:51 AM with 16 comments

by KingOfCoders on 8/6/2025, 4:45 AM

What people often get wrong (this was xAI): when Musk talks about "humanity" and "Mars," about needing to survive and not go extinct, he means himself and his many (dozens? hundreds?) of children, not you and me. Many people seem to misunderstand Mr. Musk there.

by monero-xmr on 8/6/2025, 5:41 AM

I’ve met tons of people who are pro-extinction, even if they don’t frame it that way. I’ve run into the “global warming means I can’t have kids” crowd numerous times. I guess we just need to all die so a planet full of unfeeling matter can cool by a few degrees (?)

by Animats on 8/6/2025, 5:54 AM

We can see the beginning of the end of humans now. "Peak baby" worldwide was in 2013. Essentially all of the developed world has a birth rate below replacement rate.[1] This could just be a slow phaseout of humans over the next century as AIs take over.

Milton Friedman's position that the sole purpose of the corporation is to maximize shareholder value is now accepted economic wisdom in the US.[2] So, once AIs get better at running businesses than humans, investment decisions will put AIs in charge. It's the free market in action.

The next big milestone to watch for: the first big company with an AI in charge that outperforms its competitors. This outcome is inherent in capitalism.

[1] https://desapublications.un.org/publications/world-populatio...

[2] https://www.nytimes.com/1970/09/13/archives/a-friedman-doctr...

by aitchnyu on 8/6/2025, 6:20 AM

I thought of AGI as a motivation-free program that can turn into a paperclip maximizer. Should an AGI necessarily "want" something?

by readthenotes1 on 8/6/2025, 6:10 AM

So the "worthy successors" of people are basically cucks for AI?