What do you think about Roko's basilisk? Do we need to fear a strong artificial intelligence, or can we just think for a minute and realize this is nonsense? I mean, we have to make so many guesses: that artificial intelligence will spontaneously *pop* into existence, that this AI will make those decisions, that a perfect replica of a human IS a human (or at least has the same value). And finally we have to guess what an AI more intelligent than all human brains put together will think. That's a lot of guessing for me... What do you think?
is bullshit to think about.
just like free will vs predestination.
1. it exists - you do nothing differently.
2. it does not exist - you still do nothing differently.
either way you won't do anything different, and hence it wouldn't matter.
why bother thinking about it if it's not actionable?
I'm not thinking about Roko's basilisk; in my opinion that's bullshit, as you said. I just wonder what people think about this. That's what's interesting: how people cope with the AI question.
>What do you think about Roko's basilisk?
That it's a completely retarded idea.
It's such a waste of braincells that I'm going to make an anti-lesswrong basilisk that will torture everyone who thinks that awful site has any value, for all eternity.
So the premise is that in the future a god/AI will simulate us and then torture us for not bringing it into existence sooner?
Aren't these the same people who don't believe in dualism? Isn't that kind of contradictory?
Roko's basilisk is presented as an AI that is so good it tortures people who did nothing to bring it into existence. So it's an AI that is so good it's bad, in a way. So dualism or not? I'm not a dualist thinker, so this story is nonsense to me. And it's so biblical...
>in the future a god/AI will simulate us and then torture us for not bringing it into existence sooner?
Why the simulation?
Couldn't our future silicon overlord(s) just torture the real, physical humans?
Because if it were limited to torturing living humans, you could escape the basilisk by dying first. The idea is that the AI could retrieve your "soul" from death by perfectly simulating your brain.
It seems there is an infinite number of "do x or you will be tortured forever" scenarios you could come up with.
What if Satan is really all powerful and, if you don't devote your life to him, you will burn for all eternity?
How is Roko's scenario more valid than mine?
Also, I would argue that entropy would prevent anything from simulating your brain long after you've died in such a way that your consciousness springs back from the dead (assuming you could even do that).
Yep, this theory smells like biblical stuff, with the fear of the apocalypse or the salvation of your soul... Look at Pascal's wager, for example: it "proves" mathematically that you have to believe in God, but in fact we don't know, so why do we have to worry about that?
In the sense that you can't just reconstruct a decayed brain, even if you are a super-human intelligence. The information that was in it is all spread out and entangled with a billion other things now.
It's an online cult.
Just go look at the site and see how much insider terminology and memes they use, doublespeak is kindergarten tier compared to the bullshit they subscribe to. And like philosophers they don't care about real world competence, only shoveling out heaps of shit oriented around their local memes and pretending to be relevant to what they talk about.
That's the thing: we can't imagine what this AI will look like, or what it will or won't be able to do.
And to simulate the brain you don't need energy (yes, yes, let me finish). If you recreate, atom by atom, the exact same arrangement as someone's brain (in a computer, in a simulation) and give it some fake energy (simulated electricity... which will use real electricity to run the computer, but you get the idea), the simulated brain will work in exactly the same way as the original, and entropy has nothing to do with it. All the nerve impulses will fire at the same time and in the same place as they would in the original brain.
>Because if it will be limited to torturing living humans you could escape the basilisk by dying first.
That's a rather extreme measure.
Personally, I'd rather live to serve HAL than die to escape working for a non-human.
>considering that a simulated human is a human
But it's not.
>it can torture even after your death
I don't care.
I'd be more motivated by the threat of it growing more actual humans and torturing them.
The argument is something along the lines that it can approximate you very well, to the point where it can recreate an artificial person from scratch that really believes it is you.
It's an okay argument so far, but because it's lesswrong they put their feces-soiled fucking pants on their heads after this and argue that it will then go ahead and torture this replica of you for all eternity, because it's a vengeful and demented AI god that decided you didn't do enough to create its majestic being.
The only one to be tortured for all eternity by such a system will of course be Eliezer Yudkowsky, because he's such a repulsive person. I'm going to get my personal portable copy with an audio jack so I can listen to his agonizing screams whenever I need some peace of mind.
>if you recreate atoms by atoms in the exact same disposition someone's brain
His point is that this information will be long gone by the time our synthetic Satan rules the Earth.
Even though we don't know the nature of the AI, I think it's a safe bet it won't be inventing a time machine that can reach back to before its own existence.
The greater danger is that Skynet might be here in our own lifetimes.
>The argument ...it can approximate you very well, ... it can recreate an artificial person from scratch that really believes it is you.
It could also just hypnotize/brainwash a biological person into believing they're me.
Either way, I'm not getting tortured.
>Either way, I'm not getting tortured.
P.S. I just realized this doesn't derail the basilisk.
As long as some singularity-fag believes a cyber-copy is really them, they would be motivated to help create the AI.
>I'm not getting tortured.
This is what you'd believe in the simulation too when it runs the non-torture test scenarios.
It wouldn't know whether it had accurately simulated you without seeing how you behave and adapt to a normal world. When it comes to torture, everyone is going to scream and piss their pants, so it couldn't gauge the accuracy of the simulation by starting with torture. Before heating the nipple clamps it needs to extensively test the simulated person, or even raise the person from childhood in a simulated environment.
It could even opt for non-direct torture scenarios and simply replay depressing life scenarios where you end up having pointless debates on 4chan while being a dead-end person in poor socioeconomic conditions.
then the past has already happened, and even in this new simulation your fate is sealed. there is no motivation to do anything.
why can't people accept that this is a stupid thought experiment shilled by spergs and manchildren because they think the name sounds cool? it has no grounding in logic, let alone reality.
schizophrenics get motivation to do things from irrational sources as well
>then the past has already happened
Nah, it could be an open-ended simulation: it runs until you die and the lifeline is judged afterwards. If it branched into a nice person you get S rank and are uploaded to the Infinite Funland simulation. If you fuck up and do something bad it throws you into the lake of fire.
In the let's-create-a-Hitler-to-torture runs, any simulation that leads to Hitler becoming a famous artist has its timeline deleted if it's too mundane to save, or forwarded to Funland if he does anything good, and in the Holocaust lines he's picked out for pitchforking.
"Free will" isn't a thing anyway; every action we take or thought that passes through our heads could have been predicted with certainty 15 billion years ago if you knew the exact state of the universe.
it changes nothing
>If it branched into a nice person you get S rank and is uploaded to the Infinite Funland simulation. If you fuck up and do something bad it throws you into the lake of fire.
>literally just pascal's wager with a new name
paranoia --> a psychiatric diagnosis too.
delirium --> a set of symptoms (generally related to a psychosis) that characterize the patient (symptomatically) in a period of crisis.
Sorry, but I'll use French terminology because I couldn't find some of the following terms in English.
For paranoia (which is a psychosis) we talk about a délire paranoïaque. But for schizophrenia we talk about délire paranoïde. In schizophrenia, the paranoid delirium is part of the diagnosis.
Nah. You can believe in god and it's still the lake of fire.
Or you can do whatever you want and it's still the lake of fire if it's a sadist simulation, it just wanted to have your wife and children to toss in the lake as well.
>French terminology because I couldn't find some of the following terms in English.
You frog eaters have a special place in hell where frogs find human legs to be a delicacy.
It's called DELUSION in English.