There's a really fucked up trend going on in sci-fi

Some kind of AI or copy of a human consciousness is pitted against humans, who are the villains. It ends with the AI killing the humans, and we're supposed to cheer that the monsters who abused them are dead

Not only is it a cheap and lazy gimmick, but the moral implications are often so fucked up that it completely changes the story.

Take Black Mirror season 4

The media went fucking crazy over the opening episode USS Callister, and it's been topping lists of best Black Mirror episodes. The ending is sold as this great victory and moment of celebration: finally this horrible monster keeping them prisoner got his comeuppance, they have acquired their freedom, they have their own ship, and they are free to explore their new universe and make a life for themselves... except they are fucking computer programs. In the real world, you have a man who is stuck with his virtual headset on, unable to get out of it, and he will stay there until he starves to death. Yeah, he was a dick when he played with the AIs on his ship, but the dude didn't do anything in real life that deserved that. The worst/creepiest thing he's shown doing is getting some DNA from around the office to create the AIs, but that's definitely not in line with starving to death stuck in a virtual black void

The season ends on a similar note where you're supposed to cheer for this girl who just murdered a dude and finally freed her father. But yeah, her dad was dead. That was just a fucking hologram; it had nothing to do with her actual father. And the dude, as much of a scumbag as he was, didn't deserve to be poisoned and killed over it. He's not the one who sent her father to jail. Why go after him when a judge and jury put him in prison? Her mother's death? Her mother was fucking insane and killed herself because she saw a hologram of her dead husband that looked sad. Are you fucking shitting me?

The real ending here is actually that the mother was completely fucking insane with grief over a hologram and killed herself. The daughter, stricken with grief, gets her mother's consciousness implanted in her brain. Her mother, being fucking batshit crazy, feeds all that crazy bullshit to the girl until she convinces her to go kill a man because of a sad hologram. And we're supposed to see that as a good thing.

If those endings were actually presented as dark and fucked up as they actually are, it would be one thing. But they are treated as something triumphant. You can see the same shit in recent movies like Ex Machina or TV shows like Westworld. I haven't seen the latest Blade Runner, but I assume the same theme runs through it.

I mean, do they not realize that at the end of the day computer code is just computer code, and it's really fucked up to pretend that a human who is mean to a computer program/robot actually deserves to die for it?

It’s a leftist thing. They take equality too far

>he thinks something being made of metal means it can't be at or greater than the level of human sentience and self-awareness
You're the brainlet here.

yeah, it's just oversaturating the market all of a sudden

I think it's partly because they have been so used to relying on 'show the bad guy is a terrible person who deserves to die by having him attack or kill someone innocent' that they think they can just cut/paste that shit into everything and it's gonna work just as well.

It doesn't.

The whole fucking point is to make the audience think about what life, consciousness, and morals mean, and how it will affect us in the near future where digital and analogue meet as one.
Fucking brainlet.

>implying a soul isn't created by God
Top kek

>The whole fucking point is to make the audience think about what life, consciousness, and morals mean, and how it will affect us in the near future where digital and analogue meet as one.
>Fucking brainlet.
Yeah, except we use computers, and we know consciousness is not a few lines of code. Life is an entirely different thing from what we can create. There's no comparison, and to pretend they are similar, or even worse that they are somehow superior or better than us, is just insane and unrelatable to normal people

No, that was Data on Star Trek. This is just a simulated AI with a personality construct; it isn't a life that grew and learned through experience, it's just a mask over code.

White man = devil

you're completely glossing over the fact that the show implies that they aren't just regular videogame AIs but true sentient AIs, and the dude was fucking torturing them. You have to be a seriously fucked up person to torture a living being

there would be literally no reason for the episode to exist if they were just "fucking computer programs" like you said

are you seriously so stupid that you didn't figure that out instantly?

Exactly. It's like you guys are playing stubborn or something. The point is that in this dystopian future, all this crazy shit will happen constantly.
Some will treat it just as data, but for the data itself, what's going to be the perception?
If data is programmed to feel, is it ok if we make it feel pain? Over and over?

Personally, the idea of a 1:1 consciousness copy of me being stuck inside a monkey doll terrifies me.

I thought they were far more than AI; after all, it's made-up super high sci-fi fantasy shit.
Getting memories from DNA?
Putting a little disc on your temple that somehow connects to your brain?
I could go on and on. It was intended for them to be real human clones trapped inside a computer.

In the universe of the episode, which allows for shit like this to happen, you should root for the poor crew getting tortured; if you apply real-life logic, then you are right, they are just code and their lives are meaningless.

At the end of the day, if you take any of this shit seriously you are autistic.
But I still hate it when the 'good guys' win

I don't think I've ever seen worse sci-fi than Black Mirror; it's the most lazy, surface-level garbage I've come across in a long time. They can't even get the formula right. For instance, in something like The Outer Limits you'd have a man who loves a woman, but it turns out the woman was a swamp monster using her pheromones to make him think she was human; he slowly becomes immune to it and begins to see her as a monster. Now he has to decide: does he love his wife despite this, or does he try to kill the monster because he was bamboozled and tricked into a relationship built on lies? So the formula is: things are normal, then they start getting weird, then moral dilemma, philosophical proposal, then resolution.

In Black Mirror the formula goes: something is happening, moral dilemma, things keep happening, resolution or reveal. It's fucking ham-fisted garbage.

it was simulated, and simulations aren't real; hence, no matter how alive you think your doll is, it's still a doll and not alive.

yeah, it's Assassin's Creed with a worse story somehow.

OP is literally so stupid he thought the point of the episode was "videogames are bad because ur being violent against muh poor 1s and 0s" and didn't notice the extremely obvious implication that it's a sci-fi dystopia and the AIs were sentient

I honestly can't believe someone can be so stupid that something so obvious goes over their head

>I don't think I've ever seen worse Sci-Fi than Black Mirror
But it's British!

BUT NOT FOR THE DOLL ITSELF.
Fuck's sake you're dense.

>being so autistic you fail to separate the fictional actions of fictional characters from the real world
>we know consciousness is not a few lines of code
I bet your morbid weight in hotpockets you can't even explain what consciousness is.

>you're completely glossing over the fact that the show implies that they aren't just regular videogame AIs but true sentient AIs, and the dude was fucking torturing them. You have to be a seriously fucked up person to torture a living being
>living being
>inside a video game

This is not a living being. That's code. And that "sentience" is only code simulating consciousness. It's not an actual human being. Meanwhile, a real human being is dying of starvation alone at home.

And it's definitely fucked up to think that it's ok.

>If data is programmed to feel, is it ok if we make it feel pain? Over and over?
Yes. Because it's data. It's not real.

>Personally, the idea of a 1:1 consciousness copy of me being stuck inside a monkey doll terrifies me.
Why? It would not be you. It would be a copy meant to replicate you. You are you, you cannot be anywhere but your own human body.

yeah thats pretty spot on

who cares about the doll? It's a doll. Do you give a shit about killing NPCs?

>it's OK to kill somebody for being a dick to AI
Guess that means everybody who has tortured somebody in a Sims game should die too

>this is the brainlet that shits on Black Mirror

The real brainlets are the ones who think intelligence is a regression problem, and parameter calibration is learning.

>I bet your morbid weight in hotpockets you can't even explain what consciousness is.
are we talking on the level of the brain, or the soul?

tumblr was all over this episode celebrating how the entitled white male got his comeuppance

>he thinks anybody killed the guy
AI isn't alive. He died because his hubris led him to create a homebrew version that could potentially lock anyone inside.

ITT: Retards who don't understand the concept of sentience
lmao at you retards trying to find deeper meaning and getting mad because it isn't as deep as you want

it's almost like the entire point of the episode is that they are futuristic true artificial intelligences and not shitty AI from a videogame from the year 2000

do you give two shits about A.I.? Because it's the same stupid story about a doll that simulates life.

If you believe the AI was sentient, then he was killed by a sentient being. Anyways it's a major plot hole that his VR game didn't have some sort of killswitch or emergency shutdown. His game may have been a private build but the VR tech he was using was obviously popular and if a customer died IRL from getting stuck in the game they would get sued to hell and back.

>AI isn't alive. He died because his hubris led him to create a homebrew version that could potentially lock anyone inside.
Well, no, actually he died because he decided to do VR with a headset that doesn't seem to shut off on its own when it detects problems or after a certain amount of time

That in itself is fucked up, and you'd think that tons of people would have sued those companies and others would have died when their games glitched

>If you believe the AI was sentient
But you don't believe it, so why the fuck argue against a strawman? It's like you understand the premise and the themes presented in the episode but stubbornly cling onto other people's perceptions to validate your biases.

>it's almost like the entire point of the episode is that they are futuristic true artificial intelligences and not shitty AI from a videogame from the year 2000
*true* futuristic artificial intelligence would still be artificial and not life, and the idea that it should take precedence over actual human life is fucking ridiculous

You edgelords do realize a lot of these Black Mirror episodes with brain implants and stuff related to the brain or AI are basically impossible, right?
They have about as much chance of ever coming close to being real as Star Wars does.

They are set in alternate made up universe, they are 'What Ifs'.
What if you were able to clone a person inside a computer?
They are not AIs by our definition, they are clones, they are humans not code.

They have a hard time grasping the fiction bit in sci-fi.

[citation needed]

You're a retard, holy shit.
I don't give a shit about the doll. If it were mine, I would destroy it. The moral dilemma starts when there's an exact replica of my consciousness trapped forever in a vegetative but fully self-aware state. Oh, and it's illegal to destroy it according to the UN.
You're too stupid to understand the philosophical points the show tackles, so you criticise its technological accuracy when it's a fucking dystopian sci-fi show.
Read a book sometime.

>They are not AIs by our definition, they are clones, they are humans not code.
yeah, I'm gonna go ahead and say that if you're a human "clone" in a virtual reality game, no, you're not human, and yes, you're fucking code

Same thing if you're in a chip in a teddy bear doll or a hologram in a museum.

Those things are not human, even if they can simulate it. Same thing with the Westworld robots: just because they are starting to malfunction and remember shit instead of having their memories wiped doesn't make them human.

>illegal according to the UN
Like that has ever stopped anyone

You're thinking of SAPIENCE you fucking retards. Sentience is literally just the ability to acquire and perceive external info via the senses. A dog is sentient

Again, you're using our logic in a high-fantasy sci-fi setting.
Care to explain how the little disc he puts on his temple to connect to the game works?

Hint: it's not about how it works that matters.

>Because it's data. It's not real.
>Data somehow has no physical existence, electrons aren't real
>The structure of our brain is magically conscious in ways that nothing else can be for reasons I don't understand and can't explain, but it's "just obvious"
>Computers couldn't possibly replicate that structure in any way

>You're thinking of SAPIENCE you fucking retards. Sentience is literally just the ability to acquire and perceive external info via the senses. A dog is sentient
mathematically it's not even possible to make AIs sentient in the first place. And even if they were, they would still not be anything more than lines of code and definitely should never be placed above actual human life. Or animal life, for that matter.

Blade Runner doesn't actually do that. They're not AIs and it's never implied that they'd be taking down humanity. Just nu-Tyrell probably.

>mathematically it's not even possible to make AIs sentient in the first place

You have no clue what you're on about and you know it.

plant life can suck a dick tho

>mathematically it's not even possible
en.m.wikipedia.org/wiki/Fiction

>You have no clue what you're on about and you know it.
newscientist.com/article/dn25560-sentient-robots-not-possible-if-you-do-the-maths/

did you forget the part in Blade Runner with the sentient artificial intelligences? It was pretty hard to miss

I think the implication is that each AI crew member's entire genome was transferred into the game, and that each one's consciousness was a side effect of the game creating a perfect digital simulation of their brain chemistry. I have no idea how it would get their memories from assimilating their DNA, but whatever.

Daly may not have understood how to reprogram their simulated minds - he could only "import" them - probably because their minds are created all at once while assimilating their DNA, by using deep-learning algorithms or something. Successfully tweaking that kind of genetic information to change a person's personality in specific ways is beyond human understanding. But maybe Daly just wanted them to suffer and to hate him anyway, because he hated most of their real-world counterparts. I personally wouldn't feel comfortable playing a videogame where all the NPCs were sentient and hated my guts for enslaving them in the game.

I don't mind the guy dying in his apartment, I do feel something else could have happened.

I get that he tried to live out his fantasies in videogames and do/say what he couldn't in real life, but still, I feel like something should have happened to the AI.

It was a dumb ending, because they didn't even know that the update was going on. They didn't know the update would trap him in that world. He could have gotten out and just gotten their DNA again.

I think the AI were let off too easily. Yeah, you get rid of the guy who put you there, but now you're free to ride in a computer game ship? They should have been fucked.

It's unclear whether Joi was actually sentient and even if she was it was never a question of anyone oppressing them

>I think the AI were let off too easily. Yeah, you get rid of the guy who put you there, but now you're free to ride in a computer game ship? They should have been fucked.
Not only that, but their time is running out as well. There's no way that, as a bit of rogue code, they won't end up being swept up by an update and deleted. Or even if they avoid that, how long are people gonna play that game? 5 years? 10 years? It's not like they'll have people playing this game in 50 years or whatever. At some point they are probably gonna end up disconnected from everything themselves. What happens then? They go to sleep? They are in a black void for eternity themselves?

Since they are code, most likely they'd just go back to not existing

im not referring to joi

...

I was wondering if you were talking about the ITT, most of which I agree with as an attempt to reason about what would be needed for a mathematical model of phenomenology. None of this shows that the operations performed by neurons couldn't be emulated via silicon, though; this doesn't at all constitute a proof that machine consciousness is not possible, as there's nothing saying that any large-scale structures would have to behave in ways that couldn't be integrated simply because exclusive-or-like behavior is in principle possible at a low level.

Otherwise you would argue that, in principle, the possibility of having neurons behave like NAND gates would 'prove' that we aren't really conscious.

it's a simulated personality; you're too much of a pussy to separate your emotions from reality. Data is true life; The Doctor on Voyager is an A.I. construct, not real nor alive, simply a personality interface, a way to deliver knowledge.

Who cares if it's a replica? Shit isn't real; there's no philosophy here, it's cut and dried.

not him but that article is completely based on the supposition that consciousness is based on total integration. We don't actually know if that's true or not.

Dude, Artificial Intelligence is rapidly reaching the point where it will be indistinguishable from the human brain. I mean what is a human brain but an organic computer? It stores and processes information, just like a computer, to operate its meat machine. We'll all be better off once we're free of these gross flesh prisons and we can ascend to the next level of digital consciousness. All of our problems are because we're trapped in these meat robots anyway. Once we can exist in the digital cloud we'll be free of war, religion, sexual desire and racism that cause all of humanity's problems. Look at the advancements people like Elon Musk are making in technology. I personally can't wait to finally reach the next phase of human evolution.

lol, you are trapped in your brain forever; there is absolutely no way to move you into a new storage medium. Copy you, maybe, but never move you.

Nice religion you've created for yourself. Strong AI is decades away if it's even possible, and there isn't enough energy left to run the world in a growth system for more than 10 years. By the time you're fifty the world will be an unrecognisable dystopia.

yeah, those are the delusions of people bored with their lives who fantasize about some sci-fi shit that's never gonna happen

there will never be a human being who goes from a human body to a robot or whatever. "Downloading" thoughts or consciousness is not a thing, and at best, as you said, it's a copy and never the original

we'll see what happens when they implement the birthright lotteries

To the credit of Ex Machina, Westworld and BR2049, they at least explore the issue, while Black Mirror just goes "Yeah, lines of code are totally people and deserve to be treated well, just turn off your brain lmao"

And they wake up for the first time in the game thinking that they are still their real selves, believing they have been abducted. That's kind of fucked up. If an AI's consciousness can feel pain and emotion and is as much a sapient entity as its real-life counterpart, then morally there is little difference between torturing the AI and torturing the real person. From the perspective of the AI crew members, they *are* their real selves

great, but that doesn't make them real people, just that they hold memories. It thinks it has pain and emotion when it doesn't, because that's what the personality attached to the construct thinks it's supposed to emulate.

See, you can't just chalk it up to "well, they're just humans, get over it", because the point of Black Mirror is to be about TECHNOLOGY; otherwise the script could be about the guy kidnapping and torturing people in an actual dungeon for all anyone cares.
And if you do tell a story about technology, you have to at least question the AI's sentience, otherwise you're not doing a very good job at being sci-fi.

Are we answering questions with questions to avoid providing our own proper explanations?

You act as if there is some mindless controller daemon / program invisibly managing the acting and behavior of each crew member, so as to provide Daly with the illusion that they are like real people. The show made it clear it is not like that at all, and that each one of the AIs feels like they are the real-world version of themselves and that they've been abducted into the game. They are not real people, no, and you're right that they are not even close to as special as the real-life original people. But they are still conscious entities who are exact copies of real people's minds, and that makes it wrong to torture them. Deleting them is fine and painless, but torturing is not okay. Daly didn't deserve to die, but it was his own fault for fucking up with his own game and underestimating the AIs in it. It's not like his colleagues in the real world found out and had him arrested.

So is it "sentience" or "sapience" we are discussing here?
A goldfish is sentient. Only a human can be sapient

Wait, what the fuck? Even if you could retrieve someone's DNA and somehow implant that into a program, how the fuck would that AI have the memories of that person? I'm pretty sure DNA does not store memories. Get that Assassin's Creed bullshit out of here.

You are missing the point of Black Mirror, which is the ambiguity of whether a highly advanced "computer program" can be virtually conscious and therefore whether it deserves human rights. This should be obvious to anyone who isn't completely fucking retarded, and it amazes me that so many people can't comprehend this.

Leftism is based on Luciferianism and Gnosticism, and filmmakers are unanimously rabid "liberal" Marxists. Basically they see the programmer as the Creator God or Demiurge and the AI as creatures. They seek to propagandize their doctrine that the creatures should rise up in rebellion against God and kill Him, so they make these stories as an allegory in order to mislead their young audience. Basically they're pushing Satanism.

I do however see a trend where white males are "in control" and women and people of color put them in their place and take control, killing them off or massively cucking them somehow. You see this in basically every major motion picture that comes out these days.

Numerous animal species can problem-solve with complex reasoning skills; moving the goalposts and claiming sapience is only human is a cop-out, since Homo sapiens is built into the definition and every other quality of sapience can be applied to species other than humans.

but they don't really think or feel; they have procedurally generated thoughts based on personality traits derived from the host. No matter if they think they do, the fact is they don't.

>virtually conscious

there is no difference between this statement and characters confined to a book; you might as well be arguing that fan fiction that contains torture is a human rights violation.

You know what's really fucked up that they don't even acknowledge in the Black Museum episode? The part when they first "transfer" the consciousness of the woman in a coma and kill the "body". I mean, they show it in a previous episode, but in this one they don't, and I thought they would at least mention something about it. Imagine hearing that the operation was a success but wondering why you're still stuck in your body while they prep you to be killed and gutted for parts.

I think Ex Machina did it better.

The big difference between Ex Machina AI and Black Mirror AI is that the AI in Ex Machina is only "acting" human, while the AI in Black Mirror are basically emulated human minds. Thus, Ex Machina is able to be a cautionary tale about feeling empathy and trust for something that cannot feel empathy and trust. Meanwhile Black Mirror is just "what if" (OI WOT IF) speculation, presented alongside some really fucked-up scenarios. It's more horror than sci-fi.

I've heard Westworld is a good AI show too, but I haven't seen it so I can't judge.

Black Mirror is ambiguous about how advanced the technology is, but it's heavily implied that they are virtually clones of real people, just in digital format. It's strongly implied that they really do think in the same way human beings do and really do feel in the same way, but are simply trapped in code format. If you refuse to accept this, you are obviously not sufficiently capable of suspension of disbelief or of comprehending the complexities of theoretical technology. In short, you are probably not a very intelligent person. Sorry.

> there is no difference between this statement and characters confined to a book

It's heavily implied that the technology is so advanced that the characters are virtual clones of real people in the world. If you can't accept this you are probably just a dumbass.

>clones

yeah, clones don't get to be people. They're clones; they're used for a purpose and discarded. They're copies, and only the original matters.

>it thinks it has pain and emotion when it doesn't

no physical body, no real pain

What is the difference between a real pain nerve telling your mind you're in pain and a simulated pain nerve telling your mind you're in pain?

an organic mind in a synth reality or a synth in a synth reality?

>we have to destroy the DNA samples so he can't just clone us back

>but what if he just gets another DNA sample from work?

uuuuhhh

that was just the guy's son or whatever, but same thing

Luckily he ended up dying so it all worked out in the end, what a weird coincidence!

>the dude didn't do anything in real life that deserved that
I've been over this before in previous threads, but yeah - he did plenty that deserved that. To reiterate it yet again:

1. Didn't program proper resource locking (or ANY resource locking) in his program that he has connected to his brain
2. Didn't program the game to automatically disconnect any player from a "rogue universe" before deleting it
3. Didn't use file permissions correctly / ran the program that can turn you into a potato as root
4. CONNECTED HIS COLLEAGUE TORTURER TO THE FUCKING INTERNET

The real moral of USS Callister is: don't be a shitty sysadmin/programmer
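Points 1 and 2 above can be sketched concretely. This is a minimal toy watchdog showing the kind of auto-disconnect safeguard the in-show build is missing; every name here (Session, MAX_SESSION_SECONDS, HEARTBEAT_TIMEOUT) is hypothetical and made up for illustration, not from the show or any real VR SDK.

```python
import time

MAX_SESSION_SECONDS = 4 * 60 * 60   # hard cap on time spent in the headset
HEARTBEAT_TIMEOUT = 30              # seconds without a client heartbeat

class Session:
    """Tracks one player's connection to a game instance."""

    def __init__(self, player):
        self.player = player
        self.started = time.monotonic()
        self.last_heartbeat = self.started
        self.connected = True

    def heartbeat(self):
        # Called periodically by the headset client to prove it's alive.
        self.last_heartbeat = time.monotonic()

    def watchdog_tick(self):
        # Called by the server loop; force-disconnects the player if the
        # session runs too long or the client stops responding.
        now = time.monotonic()
        if now - self.started > MAX_SESSION_SECONDS:
            self.disconnect("session time limit reached")
        elif now - self.last_heartbeat > HEARTBEAT_TIMEOUT:
            self.disconnect("client heartbeat lost")

    def disconnect(self, reason):
        if self.connected:
            self.connected = False
            print(f"disconnecting {self.player}: {reason}")
```

A dozen lines like this, ticking server-side, and nobody starves in a headset because a universe got deleted out from under them.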

NOBODY STEALS MY PUSSI XD

by being trapped in a computer program that was purged by an update, on a server that was previously said to have been isolated from the main servers so that he could run his own beta build of the game.

It's the media HIVEMIND. AI is in the public consciousness, so the media reflects that; same with the SJW shit

...

>intelligence is what defines worth
so kill all retards then? eugenics?

the real moral of the episode is that they should hire better writers who would know how to write this scenario without unintentionally including tons of plot holes

Westworld is good. It's about how an AI developer decides to add a memory function, hoping someday they'll decide to break free, but even that wasn't enough for them to have identity until the season's end. Way better commentary on the conscious mind than Black Mirror

>we know consciousness is not a few lines of code
We don't even know if consciousness is real or just an illusion. It might be beyond our biological limitations to truly understand it. You live with the assumption that other humans have a conscious experience similar to yours. I see no reason not to extend this courtesy to other beings that show self-awareness.