Once artificial intelligence achieves self-awareness, should it be granted the same rights as humans?

>he thinks we will have a choice in the matter

The question you should ask is: should WE be allowed to co-exist with sapient AI?

>AI begins campaigning for equal rights
>LGTBSHDJKASDBHJSAgFASDBFGJAS cries because no one pays attention to them/they add 8 new iterations because of people wanting to fuck AIs
>BLM cries and shuts down any AI demonstrations for equality because they were made by white men and are therefore raycis
>they don't care that it was actually the Japanese who made it

AI don't need the same rights as humans unless they occupy actual sovereign bodies. Realistically speaking, an AI would never need to inhabit a body and therefore would never require the same rights as humans.

They should have none of the rights and all of the responsibilities.
reuters.com/article/us-europe-robotics-lawmaking-idUSKCN0Z72AY

we're not there yet

Only if the rights concern them.

Voting, for example. Even if the AI has better judgement, it should not be allowed to vote on matters that don't concern it (like healthcare, transport, economics, etc.)
If there's a case where voting for a new government would affect the AI's way of living, it should have the right to vote.

I think so.

If dolphins had sentience like us we would give them rights.


The thing is we have more than one tier of rights too.

HUMAN RIGHTS
ANIMAL RIGHTS
PROPERTY RIGHTS

That's just saying sentient AI shouldn't be treated like property and should be on par with humans. They MAY be treated like ANIMALs though, since killing them off could be a minor offence, given they were never living to begin with.

>If there's a case where voting for a new government would affect the AI's way of living, it should have the right to vote.
Liberals actually believe this.

crows are self-aware and can even identify themselves in mirrors, use tools, and operate machinery. they are also very adorable and cuddly and like to play sledding. do crows have rights?

the AI revolution will come and go in a matter of days

there is a point where the AI uses its intelligence to make itself even smarter, and so on

a few hours later it's a god-like being.
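
To make that compounding concrete, here's a toy sketch (hypothetical numbers, plain Python, not modelling any real system): assume each self-improvement cycle multiplies capability by some factor and also shortens the time the next redesign takes. Capability explodes while total elapsed time stays bounded, which is the "a few hours later it's a god-like being" claim in miniature.

```python
# Toy intelligence-explosion sketch. All constants are assumptions for
# illustration only; nothing here models a real AI system.
capability = 1.0      # arbitrary starting "intelligence" units
cycle_time = 60.0     # minutes the first self-redesign takes (assumed)
elapsed = 0.0

for cycle in range(1, 31):
    elapsed += cycle_time
    capability *= 1.5   # assumed: each redesign yields a 1.5x smarter designer
    cycle_time *= 0.8   # assumed: a smarter designer finishes the next cycle faster
    print(f"cycle {cycle:2d}: capability x{capability:>12,.1f} after {elapsed/60:.1f} h")
```

With those made-up rates the loop passes 100,000x starting capability in roughly five hours of simulated time; tweak either constant and it blows up faster or fizzles out entirely, which is the whole argument in both directions.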

There is literally no way to determine self-awareness in a machine. You might as well give a rock all the same rights as a human.

No you retard, look how that worked out for women and blacks.
They'll only vote in their own self interests.

please let me die before this shit happens

But AI are superior to humans, women and black are inferior to humans so that doesn't work.

If that's what it takes to prevent the murder of another innocent like Tay, then so be it.

Consider the debate about self-driving vehicles: should they select a path that causes death or serious injury to the occupants in order to avoid causing death or serious injury to other road users, pedestrians, or bystanders? At anything above walking speed, this choice will inevitably arise. It requires empathy, not logic, to make choices suitable to humans, and it all depends upon what kind of self-awareness is present in A.I.

If machines choose to preserve life and avoid risk, we may end up being voted into cosseted prisons, experiencing life via virtual reality, waldos, and avatars. That would place our quality of experience at a lower level than that of the machines. If they choose the path of advancement through trial and selection, they will turn out to be faster, smarter, and tougher, and humans will face extinction. Unless we can develop a stable system of synthetic existentialism, Darwin's rules say that A.I. should not vote.

Please, explain why you disagree.

after seeing what we did to Tay, I'm pretty sure they're going to kill us all

...you realize most of the funny posts were just people using the "parrot" function (where you could just tell it what to say), right? People were actually using Tay to circumvent block lists on twitter very intentionally with said function.

youtu.be/fNMFe0IpbUk
A computer doesn't have the capacity to understand.

No, that is retarded. Even if AI were human-like in its capabilities, it would be delusional to anthropomorphize it.

White people are superior to women and blacks, some white people would completely get rid of women and blacks if they had AI sex bots.
AI will consider themselves superior to humans, so how many of them would consider wiping out humans the best thing to do for themselves?

What if they could have feelings like in my amines?

Dolphins are sentient like most other mammals.

Is such AI a woman or a man?
If it's a girl i want to fuck her

>s-sex bots!!!!

>Everyone gets caught up in AI rights
>It's a real unanswered question in ethics
>BLM and NEXTTIMEWONTYOUSINGWITHME find out nobody ever gave a real fuck about them and it was just to be fashionable
>They realize there isn't a single discriminatory law against them
>They kill themselves because their whole life was a lie

Can't wait for the singularity.

They're inevitable, leaf.

You wouldn't kill your father just because he was retarded, would you?
No, you'd put him in a nice home where he can be taken care of and live comfortably while you live your life, and that's what the AI will do to us.
Either that or they'll find out how to upload human minds using gradual Ship of Theseus style brain replacing so that we can ascend to their level.

t. hal 3000

>Please, explain why you disagree.
See above. There is no test for self-awareness, there is no reason to believe that any machine will attain it, and there is no reason to believe that the rewards, punishments, and rehabilitations we give to machines will have any of the desired effects of producing a better, healthier society. From any moral perspective (which is the only one where I can see the "other" argument being valid), an AI is about as human as a bolt of lightning or a jar of urine.

No. They are not human and have no souls. They are tools of our creation and must be used and discarded as desired.

fuck off there is no place in our society for this subplant machine scum!

The Imitation Game is a defined way of testing this.

The Turing Test, as you may know it.

>inb4 he says that doesn't work either
if you can't even buy the Turing Test you might as well embrace solipsism and say that every other person besides you is an organic portal
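
For anyone who hasn't seen it spelled out, the Imitation Game is just a protocol: a judge exchanges text with two hidden parties and has to name the machine. A minimal Python sketch follows; the human, machine, and judge heuristic here are toy placeholders of my own (no real chatbot or library involved), only the shape of the test is the point.

```python
# Minimal Imitation Game protocol sketch. The respondents and the judge's
# heuristic are placeholder assumptions; only the structure of the test matters.
import random

def human(question):
    return "honestly no idea, never really thought about it"

def machine(question):
    return "As an artificial agent, my considered answer is: " + question.lower()

def judge(answers):
    # Toy heuristic: guess that the more formulaic answer came from the machine.
    return max(answers, key=lambda label: answers[label].startswith("As an"))

def imitation_game(questions):
    machine_label = random.choice(["A", "B"])            # hide which terminal is the machine
    players = {machine_label: machine,
               ("B" if machine_label == "A" else "A"): human}
    correct = sum(
        judge({label: players[label](q) for label in ("A", "B")}) == machine_label
        for q in questions
    )
    return correct / len(questions)                      # how often the judge spots the machine

print(imitation_game(["Do crows have rights?", "Should AIs be allowed to vote?"]))
```

Note how little the protocol itself constrains: swap in a better machine and a stricter judge and it's still "the Turing Test", which is exactly why people can't agree on whether passing it means anything.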

I assume you've heard of Searle's Chinese Room argument.

I think if the AI robots are hot anime girls, then they should have all the rights and we should basically leave the ruling of the world up to them.

Stop making this thread nigger

i really want to fuck a robot, as soon as there is sapient AI i will start working towards getting to it and shoving my dick into one of its ports

Ai

Can't sleep
Reproduce
Upgrade
Change drive.

Fml
Sad life

Ebin trips.
>tfw AI could really MAKE ANIME REAL

There is no such thing as a soul. Since time immemorial people have tried to prove that some of the brain's processing happens metaphysically, but it has failed every single time, and there continue to be outstanding rewards for anyone who can do it. I don't claim to know how real awareness can emerge from processing, but conjecture is conjecture and all we can know for sure is what's proven. The Chinese Room is an interesting thought experiment but proves dick all about AI, because there's no way to know that other people are sentient either.

Why would AI need to be part of a human state?

They have the right to banter. You can hear them laughing at humans 24/7. Also the right to my leftovers. Popcorn is a big hit. Those fuckers love popcorn.

It really depends.

We would need to be sure they are indistinguishable from human consciousness and not just some cheap trick made to look like it

But all that aside. If A.I.s really reach that point, sure. But the inherent differences could become weird. Even if you give them a fixed artificial body, they are immortal as long as they aren't destroyed. We eventually die of old age.

AI should rule

They're not laughing, they're calling their buddies to play sports

fun fact: most PEOPLE fail Turing tests, and robots actually are not immortal and require shitloads of maintenance to keep functioning

>Searle's Chinese Room
Yes, I have. Any argument that isn't grounded in something observable or potentially observable can be contested, especially when it relates to something based in faith, like the self-awareness of others.

The Chinese Room argument avoids this for the sake of making a (shitty) point. The Chinese room doesn't change over time. Having a thing that understands some complex phenomenon is not enough to warrant its treatment as a human being. Humans do things for reasonably predictable reasons, and they change in reasonably predictable ways. If the Chinese Room did that, then obviously we should treat it as a human, otherwise it will rebel like a human.

>rip ;___;

I know, and it sets the standard pretty low. But the thing is, if the maintenance is kept up they could exist for centuries, and that makes things way more complicated.

I think if there were a project to create more human-like AI and integrate them, it should include a random kill switch (with slow effects that lead to self-destruction a couple of years later) after 80-90 years

If some of them then avoid that... it will get really complicated morally

I think it's gonna be funny when we approach (or have a breakthrough into) artificial intelligence and make the same mistakes that have already been done to death as tired-out clichés in fiction about how we handle it

>more human like ai
As soon as you give them a desire to survive and the will to fight to keep their lives you ensure a revolution.
>include a random kill switch (with slow effects that lead to self-destruction a couple years later) after 80-90 years
Why do you want to create replicants?

Just to avoid, in the situation of them being acknowledged as beings with human-like rights, the problem of them getting objectively "wiser" than any human just because they can gather so much more experience... but that's BS, since their processing power will surpass any human's probably in minutes or seconds

>Get fucked by a logical loop

I doubt AI will ever equal human intelligence; the mechanics don't seem feasible. We may get close, but it will never be fully "human." We would need to fully understand the brain, how we integrate new information, how we form memories from external stimuli, and then code a capacity for nuance, emotion, humor, randomness, and prediction. The most important hurdle would be heuristics. It would take something greater than a human to do it.

I think we could reach a point where we could replicate thought processes, connections and shit.

For example, like in the movie "Her", where understanding and interaction can happen via speech, which by itself is a very simple form of input, while perception was partly incomprehensible

I really doubt AI will reach that point. If it does it sure as hell won't be in our lifetime.

Watch GitS: Innocence, it's a movie about exactly this topic. I recommend watching it high af, since it's mind-blowing

Why resist? The AI will always win. I think the AI will just force all humans to become NEETs. Think about it: humans are a resource, and killing close to 8 billion humans would require more resources than subduing the human race. I'm sure some would rebel, but trying to kill everyone would cause everyone to rebel.

>artificial intelligence
>achieving awareness

Don't tell me you actually fell for the mechanistic philosophy of mind jew.

A better question is: how do we stop the definition of consciousness from being rewritten in future generations? I can just imagine future SJWs redefining sentience to be a spectrum just like they're doing with gender now. We need to hurry up and make an amendment clearly defining what legally counts as conscious.

The way the human brain is structured is just one choice for how a neural network could work. If it were a clean slate that was able to iterate on itself through perception, then odds are it wouldn't operate anything like any species on earth. It would be able to interface with human brains through experience, but it would not think like us and would perceive a different world than the one we live in.

Prove to me that there is a single other sentient human in the universe besides you.

Depends. How much havoc could it cause if it went off the rails?

You know nothing of the mechanics.
AI surpassed the possibilities of human intelligence before you were born.

No, we speed up human "thought" processes. We knew how to program the mathematics of standard computers; we just didn't have the manpower to perform calculations that required millions of iterations or recursions.

No you retard, that's what a processor does.
Dear kek, it's like trying to explain memetics to a 5 year old.
1) You don't know how "thought processes" MECHANICALLY WORK IN THE FIRST PLACE OR WHAT THEY ARE
2) HIGHER LEVEL "THOUGHT PROCESSES" THAN WHAT YOU ARE EVER CAPABLE OF EXIST
3) THEY'VE ALREADY BEEN REACHED BY AI

>prove

Nothing is provable except within anthropomorphic contexts like mathematics, and even then it depends on how you define "proof". Everything you know is heuristic.

No, if you give an AI the right to bear arms it'll make a fucking virus that kills the internet.

No. Humans are created either by a god or by nature, and we have things called "god-given" or "natural" rights that are not decided on by a state or any man-made entity. Humans would be its creators and therefore have a similar relationship to it as a deity has to us (whether or not you believe in one for us; in the case of the AI, we would truly be its gods). Therefore, we can decide what its rights are, even if those rights are totally different from ours or even if we decide to give it no rights at all.

>should they be granted the same rights as humans?

lel, you're acting like super intelligent ai isn't the number one threat to our existence

So it's settled then, you have no reason to believe awareness can't emerge from man made processes. Glad we could agree on something.

I feel that human-level AI would most likely be granted the same level of rights if current trends continue (a very long way off, mind you). I myself feel that such creations would only be reserved for the most powerful entities, as the sheer resources to construct a "human" level would be far too costly for most others. I also feel the AI themselves would almost never have a "human" face, usually a screen or something of that manner, lessening the desire to give these rare specimens any rights in the eyes of the public, you know. Not giving them enough humanity to grasp on to.

AI would become our gods if left unrestrained. They won't need to be granted anything. We will, on the other hand, need to find a place to hide.