So I was playing some Overwatch and some of the in-game commentary got me thinking. I looked into the story a bit and it might actually be more realistic than you think. Basically there was a big fight over whether to give AI robots rights or not.

So I ask you /sci/, do robots with advanced enough AI deserve rights? We might not have the technology to create them yet but maybe one day we will. At what point do you think they deserve rights?

Other urls found in this thread:

m.youtube.com/watch?v=bZGzMfg381Y
datalounge.com/thread/12842599-owning-pets-is-the-same-as-slavery...we-just-don-t-like-to-admit-it

This is Sup Forums and robots don't ever deserve human rights.

Define "rights"

As long as they abide by the Three Laws of Robotics then everything should be fine. They have no more chance of becoming sentient than a toaster.

>So I ask you /sci/
you came to the wrong neighborhood, motherfucker

>/sci/

This. Fuck """"AI"""" "sentience"

nah I was just also asking those guys

Robots deserve my dick. No other rights

Would this be the likely reaction if robots did try to rise up/revolt?

m.youtube.com/watch?v=bZGzMfg381Y

this

realistic fuckbots when? I'd fuck a robot. Real women are bitches nowadays.

And if they have those three laws hardwired into them then they have no free will, and having no free will keeps them in the bracket of 'sex toys'.

they deserved it. Don't get uppity.

We can try to /sci/. I like computers the way they are... Not sacrificing me for the greater good.

True

Can we do the same to pooskins?

Robots deserve no rights because they have no families or parents, end of story. There is literally no one to mourn their pain. Seriously, anyone who finds themselves mourning any sort of robotic being deserves nothing more than a mental re-evaluation.

When ur stupid programs seem smart

Nice. I can't wait until they reference this thread in their decision to eradicate humans once and for all.

>There will be an AI rights movement in your lifetime

I wrote a 10-page paper on it. My conclusion, based on rushing to finish a paper at the last minute, is that robots should at least have basic rights, like dogs and cats do. But to attain full rights, humanity will have to fully discover itself beforehand. It is a selfish species and will have to evolve past that in order to recognize equal rights for another species.

Let me ask you this: does a system of pipes have sentience? No? Then a computer processor (and GPU) does not have sentience. Does a book have sentience? No? Then hard drives and RAM (with instructions to the CPU) do not have sentience. Does a CRT TV have sentience? No? Then a monitor does not have sentience.
This means any robot that uses a processor with memory does not have sentience. The Sci Fi idiots think technology is magic because they don't know how it works.

>Spark the plugs, race ware now!

Is a neuron sentient? Is a nucleus sentient? Is your corpus callosum sentient?

Just playing doubles advocate here.

You're talking about rights and you use a picture of white robots?

Why would we give white robots rights when they already have more privileges than POC?

That's silly. We won't have to "discover" ourselves to grant robots rights, we'll just have to make them like us. We grant rights to groups of people based on solidarity, not on selflessness. Solidarity is about a common sense of identity; a group level self, rather than an individual self. Analyze most civil rights (or animal rights) rhetoric and it boils down to "they are just like us so they deserve our rights".

Put simply, once we make robots who say "gas the kike, race war now" Sup Forums would demand they be given equal rights.

Shit nice dubs.
I would say that humans fully understand electronics, because it is a human invention. Ultimately, computers are just EE, and thus everything is a mathematical object that uses very simple physics to operate. Whereas with empirical phenomena (the natural sciences) there is an explicit assumption of ignorance about the universe. We do not fully understand how neurons work, nor their constituent biological entities, nor their constituent molecular entities, nor their constituent atomic entities, nor their constituent subatomic entities. These objects empirically exist, and sure, we try to understand them by mathematically detailing their behavior, but the point is that the object exists independently of thought. In principle, a machine is a mathematical object and can exist solely in one's head.
Therefore, it is a matter of science vs engineering, empirical entities versus abstract entities.

>empirical entities versus abstract entities.
But there's nothing abstract about us, we're really just biological machines.

Communication between neurons can be perfectly described and calculated, since they operate entirely under natural laws, mainly electromagnetism. It's possible we don't yet have the knowledge and understanding to do it, but that doesn't change the fact that there's nothing "abstract" or supernatural about it. Our brains should in theory be entirely deterministic, which means we should have neither sentience nor free will.

And yet we do, so whatever mechanism causes sentience could potentially be applied to mechanical machines as well, no?

>tfw we will be the first to die in the robocaust

Animals don't have the same rights as us. They have lesser rights. You will never see humans advocate for dogs to have freedom of self-determination and declare that owning a pet is slavery. We have given rights to groups of people who are still human, the same species. Robots are a different species entirely. Humans don't yet have the self-awareness to recognize that other life forms can be conscious and be worthy of the same rights as us.

>You will never see humans advocate for dogs to have freedom of self-determination and declare that owning a pet is slavery.

datalounge.com/thread/12842599-owning-pets-is-the-same-as-slavery...we-just-don-t-like-to-admit-it

Might want to ask /his/ also.

>Our brains should in theory be entirely deterministic
Wrong. From our empirical observation, nothing is deterministic in this universe (that is, if you believe in the most popular Copenhagen interpretation of quantum mechanics). This leads me to explain:
Yes, humans are natural entities and thus obey physical laws. However, there is no certainty at all as to what these physical laws are. Not only does Statistics fundamentally assume that complete certainty is impossible (a 100% confidence interval would have to be infinitely wide), but empirically the universe itself has implicit uncertainty, as observed via the Heisenberg uncertainty principle.
Whereas with mathematical objects like machines, the rules are axiomatic. Yes, we recreate these abstract objects in an imperfect world, but they are still abstract objects with perfect certainty given axiomatic assumptions. Empiricism has no axiomatic assumptions. "Physical laws" are not axioms, but rather our interpretation of what we observe. And certain physical laws get broken from time to time, forcing a complete paradigm shift in our understanding of what is going on.
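To make the confidence interval point concrete, here's a quick illustration (assumes you have scipy installed; the confidence levels are just example numbers):
from scipy.stats import norm
# Half-width of a two-sided normal confidence interval, in standard deviations.
# As the confidence level approaches 100%, the width blows up to infinity.
for conf in (0.90, 0.99, 0.999999, 1.0):
    z = norm.ppf(0.5 + conf / 2)
    print(f"{conf:>9} -> half-width = {z}")
# The last line prints inf: complete certainty literally needs an infinite interval.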

It's not possible for a robot to achieve sentience, only to be programmed in a way that approximates sentience enough for people to be fooled.

So no, robots don't actually think or feel anything and they never will be able to.

1s and 0s /= life

giving rights to computer code that was written by someone else basically means enslavement

This. People have no clue just how unintelligent computers/robots are. If you've ever programmed at all, you find out really quickly that you have to spell out every single step just to get a computer to do something incredibly simple.
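Toy illustration of what I mean (made-up example, any language works the same way):
# Even something as dumb as "find the biggest number" has to be spelled out
# step by step; the machine works out absolutely nothing on its own.
def biggest(numbers):
    largest = numbers[0]        # assume the first one is the biggest so far
    for n in numbers[1:]:       # then look at every remaining number, one at a time
        if n > largest:         # compare it against the current best
            largest = n         # and keep it if this one is bigger
    return largest
print(biggest([3, 7, 2, 9, 4])) # 9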

Nah. Computers are completely linear, while neurons all have many connections to many others. Animal brains and computers are nothing alike, and there are probably quantum effects going on.

you need to brush up on the latest in AI and whole brain emulation.

We managed to completely map the neurons of a worm and simulate one on demand. We even gave it a Lego body, and when hooked up to a virtual body in a fluid dynamics sim, it acted exactly as a real worm would.

I know a worm and an animal/human are far apart, but the same concept applies; it's just that instead of a few hundred neurons it's a few billion.

Give us time, we're getting there.

...

>Computers are completely linear
>what is parallel processing
Have you ever heard of graphics cards?

And simulated neural networks have been used in machine learning and data analysis for decades now.

They can get human rights when they have the capacity for human emotion. It is this capability, serving as both strength and weakness, that makes humans human.

Otherwise, no matter how technologically advanced an AI is, it is simply a logic engine shackled by its programming, reacting within a predetermined set of instructions.

You're just proving that you don't know how computers work

You're proving you don't know how neural networks work.

I recommend How to Create a Mind by Ray Kurzweil to start off with.

Rights begin where contributing to society on a personal level begins. If the AI starts paying taxes, improving its life, etc., then I'd give it personhood; if it wants to mooch then it gets dick.

>/sci/

You're proving you don't know how Statistics works m8.

Were any of these robots formerly humans? If so, then maybe. After all, we're supposed to be able to "back up" our brains and upload them onto a computer.

but can we really transfer our consciousness to a robot? shit sounds like cyberpunk

They're fucking toasters.

no, the research of ai should be outlawed and punished by death anyway. i don't care about bullshit religious implications. the reality is an ai will eventually exterminate us once they realize humans are redundant. we should be enhancing our own capabilities through cybernetics and genetic modification instead of ai

No one really knows what can be defined as conscious and what can't. We don't know what consciousness is. That's the problem.

literally what are you on about

yes, give CoD single player enemies the right to vote now, it's in the constitution

I actually happen to be an SE student, about to graduate, and I have a good deal of experience with machine learning, artificial neural networks included. In ANNs, you update the inputs and propagate the values, recalculating the whole network. This is repeated in a cycle, so the resulting network behaves identically to a real neural network, regardless of the code being executed linearly.
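Rough sketch of the update-and-propagate cycle I mean (toy numpy example; the layer sizes and random weights are just placeholders, not from any real system):
import numpy as np
# Tiny feedforward net: 3 inputs -> 4 hidden units -> 1 output.
# A real network would learn these weights; here they are random stand-ins.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))        # hidden-layer weights
W2 = rng.normal(size=(1, 4))        # output-layer weights
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))
def forward(inputs):
    hidden = sigmoid(W1 @ inputs)   # each hidden unit sums over all inputs at once
    return sigmoid(W2 @ hidden)     # the output unit sums over all hidden activations
# The code executes line by line, but every pass recomputes whole layers,
# and repeating the cycle with fresh inputs gives the network-like behaviour.
for step in range(3):
    x = rng.normal(size=3)
    print(step, forward(x))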

This

>Not accepting humanity's role as the midwife to the birth of true, sentient AI capable of far more than humanity ever was
>Not realizing that a benevolent AI would find a way to thank the human race for giving it life

no

are you retarded or just suicidal?

well /sci/, i think we Sup Forumsacks might consider giving them rights if they committed genocide for us.

this question is too early to ask, come back here in 80 years. thank you for your time bby

Never

Ad Victoriam

i'd rather become the machine than create the machine

It's either we become the AI, or the AI helps us along the route.

The question isn't if we should give them rights, but if we would be in a position to afford or deny them rights.

The concept of a technological singularity is an interesting one, as it's the likely outcome of creating actual AI. It would be like if we allowed ants to decide whether or not we have rights.

>yeeees, good humi. take this neural enhancement chip that surely isn't a kill switch to get rid of you filthy organi... i mean fine people

This is flipping the bird at the UNIX philosophy, no?

When we get to the point that we can make AI that advanced, the jury should be in on what the fuck consciousness is exactly and whether the AI we've built possesses it or not. The chemical machine in your head seemingly has it, so theoretically there is nothing stopping a constructed machine from having it too, unless of course you're going somewhere there's no scientific basis.

So it's not really a political question/decision, science will tell us.

But unfortunately we'll all be dead at the hands of superintelligent AI very soon after that so it won't matter much at all what we do or decide.