Roko's Basilisk


adw95

I read it. I are confused.

 

I read it once and am a little confused, but the general idea was fascinating. The following part seems to explain it best with regard to Newcomb's paradox, which sort of explains the whole thing...

 

 

 

The rationale for this eludes easy summary, but the simplest argument is that you might be in the computer’s simulation. In order to make its prediction, the computer would have to simulate the universe itself. That includes simulating you. So you, right this moment, might be in the computer’s simulation, and what you do will impact what happens in reality (or other realities). So take Box B and the real you will get a cool million.
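The box-picking logic above boils down to a quick expected-value comparison. A minimal sketch, using the standard payoff amounts from the thought experiment ($1,000 in Box A, $1,000,000 in Box B) and treating the predictor's accuracy as an assumed parameter:

```python
def expected_payoff(one_box: bool, predictor_accuracy: float) -> float:
    """Expected winnings in Newcomb's problem for a given strategy."""
    box_a = 1_000
    box_b = 1_000_000
    if one_box:
        # Box B contains the million only if the predictor
        # correctly foresaw that you would one-box.
        return predictor_accuracy * box_b
    # Two-boxing: you always get Box A, and Box B is full
    # only in the case where the predictor got you wrong.
    return box_a + (1 - predictor_accuracy) * box_b

# With a 99%-accurate predictor, taking only Box B wins on expectation:
print(expected_payoff(True, 0.99))   # 990000.0
print(expected_payoff(False, 0.99))  # 11000.0
```

The paradox is that once the boxes are filled, two-boxing strictly dominates; the expected-value argument above is the one-boxer's side of it.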

 

Essentially, the all-powerful AI may come to exist, so you may as well believe in it and not try to stop it. I think.


 

I found this to be terrifically interesting.

 

...

Me, too.

Thanks for posting. :thumb:

Edit: The linked Gary Drescher book looks interesting.

 

 

Can't find it anywhere new for less than £25.  :(


Most of the post is just explaining the idea of timeless decision theory.

 

The important thing to take away is that you should just ignore the blackmail because that means it won't happen.

 

 

 

Believing in Roko’s Basilisk may simply be a “referendum on autism,” as a friend put it.

 

:lol: Heh.



 

If you ignore the blackmail and it doesn't happen, then you may pop out of existence if you happen to be a part of the simulation. I think. 

 

It's an interesting concept, fun to think about. Ultimately, though, we solved that part hundreds of years ago (I say we, I didn't really contribute): I think, therefore I am. Simulation or not, I am. I exist. This is a fact.


I read it. I are confused.

 

It's a mishmash of Asimov, Terminator, a bit of Alan Dean Foster, some supposition based on what they know of IBM's DARPA-funded AI research, some guessing at what the Chinese are up to, with a magic sprinkle of lunacy and paranoia.

 

Though they have a point, knowing the risks the military take to stay ahead.


It's an expansion on Pascal's wager, which suggests you might as well live as though god exists and serve him: you'll be rewarded if he does exist, and it won't matter if he doesn't, because you'll be dead.
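That wager is just an expected-utility comparison. A toy sketch, where the "eternal reward" stand-in and the cost of a devout life are illustrative placeholder numbers (not from the post) chosen so the reward dwarfs the cost:

```python
def wager_value(believe: bool, p_god: float) -> float:
    """Crude expected value of believing vs. not, per Pascal's wager."""
    ETERNAL_REWARD = 10**9  # placeholder for an unboundedly large payoff
    LIVING_COST = 1         # finite cost of serving god in this life
    reward = ETERNAL_REWARD if believe else 0
    cost = LIVING_COST if believe else 0
    return p_god * reward - cost

# Even at a tiny probability, belief dominates once the reward is big enough:
print(wager_value(True, 0.001))   # 999999.0
print(wager_value(False, 0.001))  # 0.0
```

The Basilisk swaps the infinite reward for an infinite punishment, but the structure of the argument is the same.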

 

Now that I know about the Basilisk, I have the choice to help it or risk eternal torture in the form of my future simulation. Now I live with the inner turmoil: how do I help the Basilisk? Arrrrgh!

