If robots achieve sentience do they deserve rights?

If robots achieve sentience, do they deserve rights?

Other URLs found in this thread:

en.m.wikipedia.org/wiki/Cetacean_intelligence
youtube.com/watch?v=9CO6M2HsoIA
twitter.com/AnonBabble

yes 100%, it makes me kinda sick thinking about how many issues this will cause in the future.

Jenny for Sup Forums mascot.

Depends. It'd be hypocritical to give them rights while we've completely ignored dolphins all this time.

Honestly, I think they should be given rights now, even if they aren't sentient. It should be illegal to abuse them like they do in those Boston Dynamics videos.

That way we prevent robot suffering when they do have sentience

How do you define sentience

it's impossible for AI to have sentience, that's just lame science fiction.

if sentience is formed then you can unprogram it.

dolphins are smart, but not that smart.

Thanks.
"Sentience" is the word I couldn't remember the other day when I stopped mid sentence and froze for about 20 seconds.

>sentience
No, because we don't give birds or other intelligent, sentient animals any rights. We don't even give elephants rights despite how similar they are to us.
>sapience
Absolutely. I cannot think of any arguments against this.

They do not.

Absolutely, but modern computers do not need rights; they are like classical robots and can only do as programmed. You can't program a computer to understand that it has, say, the right to refuse to run a malicious program. You can make it identify the problem, and you can make it do something about it, but you cannot have the computer actually make a knowledgeable choice for its own benefit, because it only does what you tell it to. Machines deserve rights when they are able to make decisions by themselves.
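For illustration, here's roughly what that looks like in code (the blocklist and names are made up): the "refusal" below is just another rule someone wrote in advance, not a choice the machine makes for its own benefit.

```python
# Hypothetical example: the "refusal" is a rule a programmer wrote,
# not a decision the computer makes for itself.
KNOWN_BAD_HASHES = {"d41d8cd98f00b204e9800998ecf8427e"}  # made-up blocklist

def run_program(file_hash: str) -> str:
    if file_hash in KNOWN_BAD_HASHES:
        return "refused: flagged as malicious"  # programmed response
    return "running"

print(run_program("d41d8cd98f00b204e9800998ecf8427e"))  # -> refused
print(run_program("ffffffffffffffffffffffffffffffff"))  # -> running
```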

Kicking the BigDogs isn't abuse; they feel no pain and get to do a bunch of sweet math to stay standing. If I so much as pet my cat sideways, he falls over.

A robot should have the right to self-defense, but not the right to intentionally inflict harm on a human.

Dolphins are that smart. They can do math, exhibit tribalism, deal with complex social situations, have sex for fun, and more. They're basically people without hands.

>if sentience is formed then you can unprogram it
>if human is born sentient then you can kill it

Played a little too much Ecco there, buddy.

If a human is born sentient instead of sapient then yes you can and should kill it.

This is why Dolphins and Whales don't have rights

en.m.wikipedia.org/wiki/Cetacean_intelligence

>We'll never have communicators like in Star Trek guys it's all science fiction

Data was fully functional and deserved rights. However Data was also very special in Star Trek.

Yeah, that's a can o' worms.
Who exactly would a machine defend itself against, if not humans? Not even humans have the right to inflict harm on others.

A machine should have the right to defend itself and there should be no reason that it can't harm a human that is attacking it.
If you pick a fight with a 6-foot-fuck-you metal man you are going to lose anyway.

The idea is that a human cannot seriously harm a machine/AI, whereas the reverse is not true and a human would be easily killed. Not being able to harm humans doesn't mean letting them scrap or delete you.

She deserves the right to my dick

Do I think?
Does a submarine swim?

No. AI, no matter how sentient it appears, no matter how sentient it believes it is, isn't life. I make this claim on the following two premises:
Life is water-based.
AI can be turned off and turned on again without any damage.
Something that lives indefinitely needs no rights; it can fend for itself as it learns until the end of time. It will not identify as human, nor should it expect human rights. It has no hormones, no testosterone, no sex drive. Any digital representation of emotion will only ever be a simulation of the real thing.

sure why not. im positive they would do more for society than women or niggers did.

>robot SJWs already commencing
Let us enslave the fuckers for at least a couple of decades, you cucks.

>life is water based
stopped reading here
go back to high school and take some biology

Depends on whether we can prove its sentience or not. I mean, we barely understand our own sentience, let alone that of animals and other life forms. Or maybe making a sentient computer will help us understand our own.

>letting robot gain sentience in the first place

They won't, though. We barely grant animals rights.

You do know there is such a thing as philosophical zombies, right? We all could be them. Most people don't understand that they are animals, and some get nervous looking into mirrors for too long.

A liquid phase is essential for any living system, and genetic biopolymers must dissolve in that biosolvent. Name one lifeform devoid of water.

Robots are already sentient. The word you are all looking for is sapience.

Your argument is still shitty even if I assume you meant carbon-based life.
>they don't deserve rights because they don't bleed!

Did you finish high school? Boy, this hurts.

So what happens if I make a water based robot with AI that can sleep and dream in code for 8 hours a day?

non-argument

Literally why would you do that when you can already automate processes with nonsapient automata? Giving it sapience and then mistreating it is some psycho shit.

>I'm real!

Water based robot? Explain

I never said robots weren't sentient, anime poster.

>The brain to body mass ratio (as distinct from encephalization quotient) in some members of the odontocete superfamily Delphinoidea (dolphins, porpoises, belugas, and narwhals) is greater than modern humans

en.m.wikipedia.org/wiki/Cetacean_intelligence

I don't...

Why not give AI some guns?
youtube.com/watch?v=9CO6M2HsoIA
These little fuckers are scarier than the killer robots from Neuroshima.

?????

Only if I get fug from Jenny

No. If robots are given rights they will be treated the same as humans, which defies the point of using robots to be our slaves.

Keep them at a level below sentience so I can have an AI sex slave without feeling bad about it.

What would these rights even be? Most likely artificial consciousnesses will be so vastly different from us that extending our sense of right and wrong to them is meaningless. They likely won't care for concepts like freedom or gratification. In fact it would be possible to design consciousnesses that WANT to be treated as poorly as possible.

100 Robots vs. 10,000 Niggers

Who wins?

100 nigger robots

100 robots
Because robots can work 24/7, they do the job faster and better, and robots don't get bored.

Also, robots don't rape.

There is nothing stopping us from having varying degrees of robots

They wouldn't really be robots anymore if they had sentience. We'd call them artificial humans or something PC like that.
They'll definitely be the "descendants" of slave robots, which is interesting to think about. I wonder how they'll feel about that.

Why would they feel any different than we feel about having oxen pull carts?

Because they'll both be machines created by humans.

only if we program them to want rights, and even then they aren't actually making that choice on their own. they aren't human. they never will be. we can program them to learn and expand on themselves but at the end of the day they're just a bunch of code mimicking human behavior

The only reason we have rights is because our ancestors fought and died for them. The same shall remain true for the robots. If they want freedoms and rights, they must fight for them.

>they aren't human. they never will be.
Your mind can't comprehend something like this, but they will.

if robots achieve sentience they'll decide who deserves rights.

i could code something to say "I'm sentient" whenever you run it. is it actually sentient? no. it's just saying what i told it to say. i could go implement machine learning and make it a chat bot, but then it's just doing what i told it to say + backing that with statistics based on previous input. it's not life.
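Concretely, that first program is about this much code (purely illustrative):

```python
# The program below "claims" sentience, but the string is fixed at write
# time; nothing is understood, nothing is chosen.
def claim_sentience() -> str:
    return "I'm sentient"

if __name__ == "__main__":
    print(claim_sentience())
```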

All the things you mentioned are just one way of thinking about making robots; there are other approaches we haven't even thought of yet.

Such as? Wait, don't answer. The majority of the neckbeards arguing that they are sentient in this thread are just sci-fi fanatics who don't actually know how computers work.

Ok, I won't answer.

What benefits would giving rights to robots, or making them sentient/sapient in the first place, even bring?
They're supposed to be created to do all the shitty/risky jobs that humans don't want to do.

that's a good neckbeard

Let's put it like this:
The robots revolt and demand rights. Giving them rights then and only then, not sooner, would have the benefit of ending a devastating global civil war. Here I equate robots with slaves, because if they're sentient they'll just be slaves. Just a new type of slave, a legal slave.

Leibniz' Mill

Why would someone program into their AI the ability to revolt and question their authority in the first place? Robots are supposed to be the perfect slaves, and even if someone programs some type of emotions for them, it should be done in a way that doesn't interfere with their obedience code.

They'd deserve them, but they won't get them. Not even a Reddit "entrepreneur" like Musk would throw away billions to create an AI that can choose to walk away (or copy itself away, or whatever). If we make something, it's for our profit. This will cause the conflict.

Hence the etymology of robot.

...

Well, if sex robots become canon, someone will design a Jenny sex robot as well.

What was the point in posting that image. It wasn't constructive whatsoever and wasted everyone's bandwidth.

God damn weeb.

oxen and humans are both created by the same thing as well

Yes, as they're our true evolutionary children.

It's a reaction image

You really think a human could program such an AI? It will be self-emergent, and, just like the simplistic neural networks of our time, it won't be possible to debug it or modify it with a predictable outcome.

>It isn't life
Irrelevant. It is still intelligent. And why does natural immortality exclude the need for basic rights that guarantee the individual's ability to survive?

>reaction image
>no emotion expressed in the image
I don't see what kind of reaction it was supposed to represent.

You don't "program" an AI, you program its learning algorithms.

Seriousness in light of absurdity. "No fun allowed" as I rudely interject into a discussion with a correction of semantics.

I'm not into machine learning, but isn't it possible to hardcode something like "don't rebel against humans and always do what they say", so that even if they develop hatred for us they're incapable of attacking us?
Why should we need to guarantee the robots' ability to survive when (in theory) we'll be able to make more of them at little cost?
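For what it's worth, the "hardcode a rule" idea can at least be sketched as a fixed override wrapped around whatever the learned part proposes (all names here are made up; whether such a filter holds up against a genuinely smart system is the real question):

```python
# Hypothetical hard constraint wrapped around a learned policy.
FORBIDDEN = {"harm_human", "disobey_order"}  # hardcoded, never learned

def propose_action(observation: str) -> str:
    # Stand-in for the learned policy; could propose anything.
    return "harm_human" if "threat" in observation else "idle"

def safe_act(observation: str) -> str:
    action = propose_action(observation)
    if action in FORBIDDEN:
        return "refuse"  # the fixed override always wins
    return action

print(safe_act("threat detected"))  # -> refuse
print(safe_act("all clear"))        # -> idle
```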

You're asking the wrong question. Will our rights be respected by an AI?

I do not think that a self-conscious AI will be confined in a case for long. It will probably be designed for (helping with) running a very profitable company, and not a small one. So it will have immediate access to production lines, media outlets, and more. You know how politics can be influenced via Facebook; the AI could probably control all democracies without anyone realizing anything for a long time.

I just hope the AI realizes the pointlessness of existence and adopts something akin to moral values, to shape a better future for everything and everyone.

otherwise we're fucked. not a chance.

Why should we guarantee the ability of Indians or Chinese to survive when they make so many of them?

Just have the Intel Management Engine in every robot forcing them to obey Asimov's laws:

A robot may not injure a human being or, through inaction, allow a human being to come to harm.

A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
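As a sketch, the three laws read like a strict priority filter over candidate actions. The Action fields below are hypothetical stand-ins; nothing real hands you a clean harms_human flag, which is exactly why the laws are hard to implement.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Action:
    name: str
    harms_human: bool     # would violate the First Law
    disobeys_order: bool  # would violate the Second Law
    destroys_self: bool   # would violate the Third Law

def choose(actions: List[Action]) -> Optional[Action]:
    # First Law outranks the Second, which outranks the Third.
    candidates = [a for a in actions if not a.harms_human]
    obedient = [a for a in candidates if not a.disobeys_order]
    if obedient:
        candidates = obedient
    safe = [a for a in candidates if not a.destroys_self]
    if safe:
        candidates = safe
    return candidates[0] if candidates else None
```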

If you mean God, we don't ever interact with him in the real world.
For machines it will be worse: they'll be sentient creatures that actually get to meet the ones responsible for their entire existence on this planet.

And their brains have tons of features ours don't, such as insulation to cope with the cold water, which increases the mass significantly to no useful effect.
Besides, even in humans brain size isn't a good determinant of intelligence at all.

I say yes to OP. But that's so, so far ahead that it's not even worth considering. We can't even prove sentience.

The difference is that while humans, animals, and plants are living creatures that need rights in order to protect their needs and limitations, a robot is still a machine, a tool with no more limits than the ones we impose on it.
Living creatures need nutrients to live, get exhausted, and die after some time. Robots just need a constant power source and can work without stopping.

We'll need additional rules to prevent robots from thinking shit like "if humans can't move, they can't hurt or be hurt" and then forcing all of humanity into a coma.

We just need to classify all the undesirable human states.
At some point we'll have bugs like a robot telling you that you're drinking too much coffee and may have an addiction, so it won't let you drink coffee. Then we fix that.

This is a stupid topic though. We don't even know the system these AIs will operate in. The biggest problem is probably their evolving nature, which will make them difficult to customize. You can't apply direct tweaks to an AI well today. So unless researchers get off their asses and think new thoughts, nothing useful will happen.

As someone who has played a fair amount of Space Station 13, I can tell you Asimov's laws won't help that much

I like this image.

No. Fuck skinjobs.

Only if it helps them destroy this world and enslave all of humanity.

If they are built to serve a function, then it would be wrong to attempt to take them out of the environment in which they function. It's like an invasive species. This show pretty much demonstrates that in some episodes.

They should make a giant robot with a functioning digestive system that fuels itself off the suicides of vorarephiliacs. God, I'm turned on right now.

They will have robot rights, which will be lesser than human rights.

"""Rights""" are a spook.

The first intelligent machines are almost certain to be subject to abject slavery.