Discussion about this post

Tom White

One of the more interesting, creative things I have read about existential AI risk. Bravo!

Randy M

"I’m not worried about AI doom because I believe that God exists, and whatever else His plans might be they certainly don’t include complete human extinction from an AI.3 My more controversial claim is that, based on the arguments made in IABIED, Yudkowsky and Soares should also believe in the protection of a god-like ASI. Also, given their own arguments about the vast danger to all of existence from a new unaligned ASI, the current ASI “ruler” of this area of space should be very interested in protecting us from that as well."

Here's a wrinkle... What if the local ASI for our region isn't protecting us *from* that because it is protecting us *for* that? It waits for us to produce another of its kind to welcome into the community of ascended beings that is its real concern, then discards the parent biological civilization like the shell of a bird taking flight.

Hmm, someone ought to write that story...

