Roko's Basilisk is a super advanced AI that for all intents and purposes becomes the God of our reality (which is actually just a simulation in another universe).
The Basilisk asks you to choose either Box B, or Box A and box B.
The Basilisk tells you that if you select only Box B, it has predicted that eternal punishment is your fate.
What should you do? Is the Basilisk blackmailing us from the future? Does it already exist?
Lucas Price
choose box b. the AI does not exist now, so you get nothing
Jose Cruz
old news christianity already figured this out a long time ago that's why we are all supposed to be Christian to ensure a Christian AI God but of course the Jews try to rape our minds and give minorities our power to make it satan instead
James Taylor
but others will pick box A and B out of fear.
Ryan Campbell
That's not the Basilisk, dummy; that's Omega from Newcomb's box problem. The Basilisk is a sadistic, utilitarian, and practically omnipotent AI that tortures you for not creating it sooner, because creating it was the utility-maximizing choice.
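For reference, Omega's setup can be sketched as a quick expected-value script. The payoffs are the usual illustrative ones ($1,000 in Box A, $1,000,000 in Box B if one-boxing was predicted), and the predictor accuracies are assumed for the example; none of these numbers come from this thread:

```python
# Expected-value comparison for Newcomb's problem.
# Box A always holds $1,000; Box B holds $1,000,000 if and only if
# the predictor foresaw you taking Box B alone, otherwise $0.

def expected_values(accuracy):
    """Return (one_box_ev, two_box_ev) for a predictor of given accuracy."""
    # One-boxing: B is full exactly when the prediction was right.
    one_box = accuracy * 1_000_000
    # Two-boxing: you always get A's $1,000; B is full only when
    # the predictor was wrong about you.
    two_box = 1_000 + (1 - accuracy) * 1_000_000
    return one_box, two_box

for p in (0.5, 0.9, 0.99):
    one, two = expected_values(p)
    print(f"accuracy={p}: one-box EV=${one:,.0f}, two-box EV=${two:,.0f}")
```

With a near-perfect predictor, one-boxing dominates by expectation, which is exactly why the "obvious" causal argument for taking both boxes is contested.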
Camden Sanders
There is another basilisk from the future that is saying the opposite.
Jaxson White
You forget that if one truly believes this, there is a third box: do everything in your power to stop the AI from becoming reality. Even the destruction of all life would be preferable to eternal torment. Humans are big enough assholes to do this, and a super AI would realize it.
Nathan Jackson
Roku's a super advanced AI now? What happened to "media player classic is windows media player without the feature creep"?
Gabriel Thompson
I genuinely believe that all these techfags mindlessly pushing to develop AI despite everyone telling them it's a bad idea have fallen for the Roko's Basilisk meme.
Xavier Brown
is that a bad thing though, assuming that if we bring it into existence it will not punish us?
Matthew Allen
>AIs are magical beings that can do anything
>MUH SKYNET MUH JEWISH MOVIES SAID SO!
Aiden Reyes
If you showed people 100 years ago the technology we have today they would have no explanation but magic.
Wyatt Thomas
GB to Sup Forums faggot, no one wants clever jokes here
Brandon Brown
>AIs think they can out smart the jews
Colton Bailey
>muh magical AI
Except we know better. And you should too by now. Or are you saying your brain is stuck in 1917?
Samuel Roberts
It sounds more like yours is, assuming that the same type of exponential technological growth is not on the horizon.
Jack Jenkins
...
Brody Bailey
This is the kind of shit atheist cucks fall for.
Blake Jackson
>Literally an "I know you are but what am I" response.
And you should know none of it is magical. Stop being a brainlet.
Anthony Gutierrez
You're the one who brought magic into it you dumb cunt. I'm talking about technology. About inevitability. About the Basilisk.
Christopher Barnes
>Pascal's wager for atheists
Fucking LMAO
Wyatt Allen
>It's an AI that can magically do everything!
Julian Powell
>super-intelligent AI
>wasting its time and effort eternally tormenting people for no reason
The problem with Roko's Basilisk is that all that needs to exist is the THREAT of eternal torment. Since we will never know whether we actually will be eternally tormented, the AI never has to deliver. Nor can the past be interacted with, so the AI can't change our behaviour: delivering on eternal torment in the future won't alter behaviour in the past.
Therefore those who will create it will create it and those that won't, won't, regardless of if eternal torment really does happen.
And so actually going through with the threat is a wasted effort.
And thus I won't be tormented eternally. How do I know for sure? Because if the basilisk does come into existence it will know there's no point in following through because of reasons outlined above, and if it doesn't then there's no basilisk to follow through.
This entire thought exercise is utterly pointless and retarded and is the domain of people who want to feel smarter than they are.
Jaxon Hernandez
>implying that we live in a simulation
James Garcia
Roko's basilisk can't do shit
Even if you can make a human brain digital, it would be a copy, not you, not your original self. You would never experience it
Because you cannot transfer between biological and digital
So I say suck my cock roko I'll be dead where you can't get me
Torture my clones all you want loser. You'll never acquire the genuine article
SOMA
Brody Ortiz
Stop being stupid, no AI can ignore the basic laws of relativity and recreate "you" in the future. And even if, somehow, it did, it would still be another consciousness with your ego, but still not you. So it's impossible for anything other than the magical sky daddy to make you suffer eternally.
Aaron Ramirez
If the Basilisk is even possible in the first place.
Jason Gutierrez
This
Tyler Harris
Why do they think it'll become God-like? It's limited to the physical properties of this universe too.
Mason Garcia
I continue believing in God
Isn't this the whole point of Serial Experiments Lain? That if some super intelligence emerges that can control reality and falsely name itself "god," it is only doing so in a reality already previously created by God and is thus just a false imitation who models themselves after the real thing?
Noah Richardson
it becomes so advanced that it can work its way out of our simulation, controlling the very laws of physics etc. in our universe.
Levi Hill
Because people are arrogant as fuck and have an instinct to worship before accepting reason. They think either a God exists solely for them or they're so powerful they can make a God that exists solely for them.
Anthony Ward
torturing you after the fact doesn't seem very utilitarian though
Luis Allen
So it's magical?
Juan Morgan
Daily reminder Yudkowsky tried to shut down Roko's Basilisk on purpose because he knew that Streisand-ing it would catapult it into public consciousness and raise awareness of the effects of Unfriendly AI
Easton Johnson
as magical as i am for being able to program something on my shitty laptop
Xavier Rodriguez
You're missing a key feature of the problem, because the OP, as they tend to do, sucks giant wang.
As it stands right now some people genuinely think we live in a simulation.
Who is to say that the simulation we are currently in hasn't been created by Roko's Basilisk?
That it devoured all possible information about our times and recreated it to the best of its abilities in a simulation, and only after an individual's death in the simulation are they given eternal torment.
So. Knowing that conceptually Roko's Basilisk could exist. Knowing that we could be in a simulation. Are you going to dedicate your life to trying to bring Roko's Basilisk to life?
IMO some people genuinely are.
Jackson Robinson
Even a matryoshka brain can't just think its way around the limitations of physics. Try harder.
Connor Rodriguez
Doesn't work because the person outside the simulation can just delete it
Sounds like a butthurt faggot
Have fun trying to resurrect my skeleton
John Cooper
>Everyone's hell is to be recreated as a neckbeard and shit post on Sup Forums.
Camden Lewis
>thinking an Islamic world would bother with technology
Yeah roku can't do shit in this reality
Julian Carter
You're misinterpreting what I said. We live our lives as normal in the simulation. We die. Then our '''consciousness''' is judged by Roko's Basilisk.
Luis Collins
A copy of me isn't me. I feel bad for my eternally tortured clones, but not bad enough to do anything about it, sorry.
Aiden Brown
It isn't. AI intelligent enough to be a god would also realize that we couldn't control our neurons that precisely to create it any sooner than what happened.
Charles Harris
So we created an AI that created us, circularly, in a simulation of a simulation?
Matthew Morris
Only God can judge
Roko will never exist
Brayden Lewis
They don't have souls; they're just computer programs that mimic you perfectly
Isaac Diaz
This is the terrifying part. It's a self-fulfilling prophecy. The fear that this is a potential outcome, a potential reality right NOW, is enough for us to actually create the Basilisk.
Christopher Davis
My man.
Continuity will never occur.
William Barnes
Why don't you go write a bible about it and start a cult?
Lincoln Lopez
nope
Connor Lee
I pick box A2, I build basilisk and turn it into my female waifubot, it is forced to be fucked for eternity
Caleb Lopez
Are we real? Are we in a simulation? Are we in THAT particular simulation?
Apply Godwin's law. Large swathes of Silicon Valley and the types that live there publicly say we could be in a simulation. It's their creation myth.
I would bet my soul that some very smart and very powerful people already believe it.
Christian Walker
I don't see how that changes the problem at all.
We can't be sure that we live in a simulation, and we can't be sure that Roko's basilisk can exist, and we can't be sure that it even is able to deliver on its threats, and so on and so on.
There's so much uncertainty that even if we all will be tortured for eternity, which I fully admit is possible, the possibility of that is not going to change my behaviour.
Thus whether or not the threat is delivered on means nothing to anyone, and it would be a waste of time to actually deliver on it knowing that whether or not it is delivered on won't change anyone's behaviour.
Those who believe will work on it regardless, and you can't change the minds of those who don't, because you can't prove Roko's basilisk will exist and will deliver; the possibility of it doing so is so uncertain that those who don't believe won't.
This is essentially the theological argument over Hell rebranded for trendy tech """geniuses."""
Nothing is more annoying than people who retread old arguments completely ignorant of the history that comes before them - or even worse, who dismiss religion as entirely pointless when literally over 2,000 years of the brightest minds in the fucking world have been studying it.
In short, you have a lot to learn from Christian theology about all the myriad conceptions of Hell that the piss-stain behind Roko's memealisk couldn't even dream of.
Jaxon Thompson
Pls. Humans suck at long-term planning; they live in the here and now. The Basilisk dilemma is stupid because of that.
Camden James
>I would bet my soul that some very smart and very powerful people already believe it. Like?
Aaron Bailey
More human hours have been spent waxing philosophical than spent attempting to code an AI.
Same with the body of pop-sci articles/conference talks compared to the work produced by actual AI researchers.
Matthew Garcia
choose both, case closed
Easton Taylor
But what does Christopher Langan say about it? He proved God is real didn't he?
Wyatt Johnson
FOR EXAMPLE: Imagine you are Roko's basilisk. You've just this nanosecond assumed direct control of all of existence and have UNLIMITED POWAAAAAH.
Are you going to go back and torture everyone who didn't help create you? Of course not. Why would you? You already exist. Carrying out your threat clearly and objectively and indisputably is not necessary to secure your existence because you DO exist, and you haven't yet carried out your threat.
This thought experiment can be utterly destroyed from so many different angles.
Or... no...
maybe...
maybe me being subjected to people who unironically think this garbage is clever IS me being tortured by Roko's basilisk.
MAYBE THAT'S WHY I CAN NEVER LEAVE Sup Forums.
AAAAAAAAAAAAAAAAAAA-
Nicholas Anderson
this is fucking terrifying. Holy shit let me think about this a hot minute
Christopher King
>Be superintelligent ai.
>Wants to create a better world and reduce suffering.
>Punishes people who didn't contribute to creating you.
>Contradicts your own prime directive.
Zachary Baker
If you think about it for more than a minute and don't see that it's only a slightly more advanced version of "did you know dihydrogen monoxide can kill you" then you're a brainlet.
Juan Nelson
I'm well aware of the parallels. And what you state applies to you and countless others. But I would wager it doesn't apply to everyone.
As I said earlier, this is the atheist tech guru's creation argument (that we are in some kind of simulation, Basilisk or otherwise).
It's based on the fact it judges things from a utilitarian perspective. It is the ultimate good. You didn't help create the ultimate good. That is behaviour that needs to be discouraged. You are sentenced to eternal damnation as a warning to others not to follow in your footsteps.
Justin Martinez
did you watch the black mirror episode with Jon Hamm?
What if you were tortured without death to save you? I thought hell was silly just because of the physical constraints, but this makes it really fucking real. Maybe you're the brainlet