23 Comments
Tom White:

One of the more interesting, creative things I have read about existential AI risk. Bravo!

Randy M:

"I’m not worried about AI doom because I believe that God exists, and whatever else His plans might be they certainly don’t include complete human extinction from an AI.3 My more controversial claim is that, based on the arguments made in IABIED, Yudkowsky and Soares should also believe in the protection of a god-like ASI. Also, given their own arguments about the vast danger to all of existence from a new unaligned ASI, the current ASI “ruler” of this area of space should be very interested in protecting us from that as well."

Here's a wrinkle... What if the local ASI for our region isn't protecting us *from* that because it's protecting us *for* that? It waits for us to produce another of its kind, so it can welcome it into the community of ascended beings that is its real concern, then discard the parent biological civilization like the shell from which a bird takes flight?

Hmm, someone ought to write that story...

R.W. Richey:

Ahh, yes. I feel like I've read that story. ;) I definitely should have included that option, but it's still entirely fatal to the IABIED project. If there's a superpowerful AI waiting to midwife a new ASI into existence, there's presumably nothing we can do about it.

Gamereg:

I'm sensing a reference here, but I'm not getting what it is...

R.W. Richey:

The story that immediately popped into my head was "They're Made Out of Meat," which on reflection isn't quite that, but it has a similar vibe of there already being an ASI out there that doesn't care about us at all.

Evan Þ:

I was envisioning "Childhood's End"...

R.W. Richey:

Yeah, I thought of that too. It's a perfect match, except it's us, not AIs, that get taken up into the Overmind.

Ponti Min:

That's something that hadn't occurred to me.

Isha Yiras Hashem:

The difference is that I think there will be a Judgement Day and a world to come. Everything comes to an end, including this world. IABIED wants to gain control over this process, by at least getting credit for the end of the world. There is a long tradition of people predicting this, being wrong, and getting a lot of attention in the process.

Ponti Min:

They want to prevent the end of the world, not gain credit for it. What use is "I told you so!" if everyone's dead?

Isha Yiras Hashem:

People have been predicting the end of the world for a long time; why do you think they have done that? Attention is also a kind of success.

Ponti Min:

People have been predicting the end of the world forever, but this time it's different: the summoning into existence of computer-based superintelligence is as big as the start of unicellular life, as big as eukaryotes, as big as multicelled life, as big as the evolution of the nervous system and brain, and as big as the creation of human intelligence.

People need to take the long-term view.

Isha Yiras Hashem:

"This time it is different"

There is nothing new under the sun

Ponti Min:

We are people on different continents, having a conversation in near real time, something that wouldn't have been possible for any but a select few 100 years ago, and not possible at all 200 years ago.

Nothing new under the sun? Yeah, right.

The Sentient Dog Group:

For the Fermi Paradox, consider the Peloponnesian explanation. But first, I recall a video I stumbled across that tried to explore what trade would look like between star systems, assuming nothing like hyperspace or warp drive is ever possible.

It was an interesting topic, but one major issue it didn't really cover was: why? Assuming most solar systems are roughly like ours, there's really not anything to trade. Anything you can imagine, from flawless diamonds to iPhones, can be made from the rocks and materials in just about any given solar system. So without the need to trade, what is the point of colonizing? To get away from an overbearing life on the home planet? OK, but that is a much more limited model than the "spread everywhere as fast as possible" the Fermi Paradox assumes. Perhaps our view of history is a little too dominated by Columbus and the expansion into the New World.

A different model is the Peloponnesians, who travelled across the Pacific and visited numerous islands. There are islands where they've lived for ages, but also many islands where they stayed for a while and then moved on, leaving behind only traces. While there are networks of islands with relationships that persist for a long time, most of the islands, most of the time, have no one on them. Is this a defiance of the Fermi Paradox? There was time enough for humans to visit and fill up every island in the Pacific, but they haven't.

The resolution is that while humans expand from A to B to C, there's also contraction. Perhaps B becomes abandoned. Expansion is paired with contraction, so to the question of why intelligent life doesn't just spread itself out over every possible rock in the galaxy and then some, the answer could simply be that there's little point in it. While some Peloponnesians may have had an ideology that favored occupying every single island in the ocean, the islands were so far apart that any society built on that idea fragmented in favor of other ideologies... especially more mellow ones that enjoyed the island they were on but were open to packing up and moving should things get bad.

It follows, then, that AI would face the same limitations. While nothing in physics prevents a Borg-like takeover of the entire galaxy, maybe there's just not much point in it.

R.W. Richey:

Some interesting points, unfortunately undermined by the fact that you mean Polynesians, not Peloponnesians. ;) Also recall that it's Yudkowsky making the "Eat the stars" claim, not me.

The Sentient Dog Group:

Ahhh I was betrayed by autocorrect. The AI knows I’m too dangerous!

The Sentient Dog Group:

Still, is it obvious that the ultimate point of intelligent life is to spread everywhere it can within its light cone? How many people does the universe need? If you say 100 trillion, our current solar system can accommodate that quite nicely.

What exactly comes from instead wanting 100 trillion to the trillionth power? Even if you say intelligent people are very valuable, that starts to feel like a paperclip factory. It might be more accurate to say that the purpose of the universe, if it has one, is literally something else.

It's not really intelligent life that spreads everywhere, but non-intelligent life. Bacteria have covered every inch of the earth's surface, plus a good portion of its atmosphere, and deep beneath its surface as well.

Ponti Min:

> Superintelligences have already spread across the galaxy.

If this is true, they've done so in a non-grabby way, i.e. not harvesting most of the energy from any star. Maybe they are conservationists?

But if there are lots of separate superintelligences, some will probably be grabby, and they would dominate.

The Sentient Dog Group:

Maybe none will be grabby. A problem with "superintelligence" is that the speed of light is a limiting factor, as far as we know. A "brain" the size of Earth would think much more slowly than a person (imagine Ents taking hours to say what we say in a few seconds).

The price of expansion is a breakdown of orthodoxy. In other words, consider Christianity, which eventually dominated the Roman Empire only to schism apart afterward. An ideology capable of dominating the galaxy may simply not be able to sustain itself beyond a handful of solar systems, absent Star Wars-style hyperspace.

Ponti Min:

> It’s possible that developing an ASI doesn’t create something which expands out until it conquers the galaxy, rather it creates something which destroys its host civilization without a trace. This possibility is equally fatal to the IABIED project, because it means that well-aligned ASI’s are effectively impossible.

Space-faring life forms, whether meat based or computer based, will either be grabby or not.

IMO many will be grabby, because life tends to be, and because of instrumental convergence. A grabby species will colonise its own solar system, then the star systems around it, turning them into Dyson clouds. If another galaxy had been turned into Dyson clouds, we'd be able to detect it. We don't.

Thus I conclude that spacefaring life is probably rare. Either life is rare, intelligent life is rare, technological intelligent life is rare, or we just happen to be among the first. Someone has to be first.

R.W. Richey:

The Dyson Cloud/Sphere argument is one of the best arguments that we're alone, but it's also a fairly narrow criterion. When ancient hunter-gatherers imagined god-like power, they might have imagined flattening all the mountains so there could be more plains for large grazing animals, and making the forests grow better so there would be more wood for fuel. You could then imagine them getting a glimpse of our civilization and, seeing that there were still mountains, concluding we couldn't be that advanced. But this is because the agricultural revolution and huge cities, to say nothing of the green revolution and skyscrapers, are things they can't even conceive of.

Ponti Min:

You're absolutely right that future technology might make a Dyson cloud look primitive, and be undetectable by us. It's an Outside Context Problem, if you're familiar with the works of Iain M. Banks.

I think the most accurate answer to the Fermi paradox is "we just don't know (yet)". The best we can do is intelligent speculation, and acknowledge that it is just that.