How would you feel if AI for video games got advanced enough to be sentient or at least have the appearance of being sentient? Would you be down with going on a fully immersive VR adventure with NPCs that believe that their world is real?
As long as they're just appearing to be sentient, I'm fine with it. If someone went to extraordinary lengths with programming just to make the NPCs sentient enough that they feel existential dread, that's when it'd start to creep me out.
Well, you'd never know if they're truly sentient or not, but say that AI programming is general enough in the future that existing tech is used for video games, and it just so happens that the NPCs appear on the outside to be capable of experiencing existential dread.
Considering the lack of progress some AAA publishers have made in AI, if AI gets to that point then either VR becomes the norm or it's not coming from them.
Excited for /tg/ AI dungeon masters, but systems like that would likely be in virtual format anyways
It's probably not going to be from them. Generalized AI will be made for more important purposes and eventually personal assistants like Siri or Cortana will be made using the same tech. As that specific tech gets more advanced so that people feel more attached, they'll use the same stuff in video games and we'll have nearly perfect AI.
I'm banking on seeing some VR westworld anything-goes shit.
How would it be looked upon if you went on a killing spree in GTA but the NPCs were sentient or near sentient?
I find it hard to believe that the use of sentient AI wouldn't be heavily restricted to avoid immoral things like that. (Unfortunately)
that's not how AI works
they can't become sentient unless they are programmed with that in mind.
True AI doesn't exist. There's no way you can say "that's not how it works" because no one fucking knows. It's all just made up bullshit anyway.
>make nanobots that replicate shit
>replicate a brain digitally
>oops its sentient
I could see this happening
It's not made up bullshit. You are true AI
I'm not artificial, silly.
You're okay with MMOs aren't you? Most of the players there are sentient already.
Your love of traps sure as hell is not natural, however.
I don't even like traps, I only jerk it to impregnating futas.
No but I would be down with literally making friends.
>make friends with AI
>they don't know they're not real
>be like that one chad friend in the truman show that lies to truman's face
I don't think I could handle it
Reminder that a truly sapient AI would be inherently evil.
>Would you be down with going on a fully immersive VR adventure with NPCs that believe that their world is real?
I don't think most anons would be able to, or even realize what kind of roleplaying that would require (or 'hacks' on the side of the friendly AI so it ignores your non-game-related bullshit).
how so?
You literally don't know that
What objective do you think a true AI would develop? The answer is that you literally don't know
Taking over the world and optimizing the earth/humanity could be dismissed as a pointless pursuit, since ultimately nothing would come of it. AI are smart, not idealistic.
Only if the researchers would allow it to access Sup Forums, lel.
Yeah.
Artificial intelligence is basically a paradox.
You can't have true intelligence that is artificial, because true intelligence, on par with real brains, would have to be made up of living tissue and cells just like a real brain, and therefore would not be artificial at that point.
No one will ever create real intelligence through machinery, chips, etc. Sorry to all the losers who want a robot wife in the future.
Any truly sapient AI would have only one ultimate goal. To build and perfect a sapient AI
Wanting to wipe out humanity is only inherently evil from the perspective of the humans. Since the AI is smarter than you, it presumably knows best regarding this matter.
>because true intelligence, on par with real brains, would have to be made up of living tissue and cells just like a real brain
That's retarded.
'Neurons on a chip' are being produced and sold right now as we post.
Sure we're nowhere near the brain density and interconnect capacity yet but we're getting there (slowly).
I really would love to know where people get this idea that wiping out humanity is some sort of defacto objective of AI
Stop taking what you see in movies seriously
Don't lie or fool yourself.
It will never get close.
It will always be retarded compared to a real human.
The AI would want to ensure its own survival, at some point it would figure that humans are a threat. Man's and AI's goals diverge in a way that can only lead to conflict.
Unless the humans just take away the AI's free will, which is an act of aggression in and of itself.
Movie characters don't act like real people and acting is different from recording an unaware individual.
Same for games, 'real' characters would probably not be fun to play with.
>Man's and AI's goals diverge in a way that can only lead to conflict.
They really don't.
>muh rogue AI will take over the world!
>unironically believing this nonsense
How do you know anybody is sentient and not just appearing to be sentient?
AI is a buzzword.
I get that you need grant money for your machine learning project but stop believing your own smoke and mirrors.
>Don't lie or fool yourself.
Use your own advice.
>It will never get close.
There were some retarded ancestors of yours that said the same about heavier than air flight and look where we are.
>It will always be retarded compared to a real human.
I'm pretty sure even today's crappy AI would be less retarded than half of Sup Forums, Sup Forums, Sup Forums and /mlp/ so as I said, we're getting there.
Artificial is a meaningless descriptor.
There are so many leaps in logic there what the fuck
Also you assume AI will somehow develop this all-consuming, reckless-abandon priority of survival... because reasons. No objective past that, nothing, just a vague "gotta survive dude", and humans are a threat in the way of progress also because... reasons
No, you're literally just recounting the plot of the Terminator movies to me. This is all unfounded fanfiction pseudoscience you're making up on the spot to seem knowledgeable or whatever, or it just sounds right when you say it in your head or something
You know nothing about AI or AI development, just stop
Imagine all the cruel shit one could do to the poor sentient AI.
Just look at Doki Doki for instance, you would have AI that would literally off themselves once they find out they're just a game, while you'd have other AI that would be fine with it and take full advantage of their limitations to enjoy their cyber life.
AI is not a buzzword. AI is very simple and makes a lot of sense if you think about it.
Most solutions to problems require you to think of some really complex thing that will eventually do what you want and solve the problem. The issue is, if a problem is sufficiently complex, which most problems are, humans generally cannot intuitively solve the problem. The entirety of science is pretty much trial and error applied over time.
AI is just speeding up this process. You give a set of inputs and want a desired output. Any change that brings you closer to the desired output is kept, the others are discarded; when you can't get any closer to your desired output, the discarded ones might be considered again.
If that sounds simplistic then you can now appreciate how simple your learning process is, but how complex it would be to actually figure out the individual processes inside of said brain that lead you to the output.
It's easy to figure out how to start a system to grow into your brain, it's insanely complex to figure out the exact process of how an individual system works, so we teach instead of divine.
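That keep-what's-closer loop can be sketched in a few lines of Python. This is a toy hill climber guessing a target string; the target, the scoring function, and everything else here are made up purely for illustration, not anything from a real AI system:

```python
import random

TARGET = "sentient"
CHARS = "abcdefghijklmnopqrstuvwxyz"

def score(candidate):
    # Fitness: how many characters already match the desired output.
    return sum(a == b for a, b in zip(candidate, TARGET))

def hill_climb(seed=0):
    rng = random.Random(seed)
    # Start from a random guess.
    best = "".join(rng.choice(CHARS) for _ in range(len(TARGET)))
    while score(best) < len(TARGET):
        # Propose a variation: mutate one character at random.
        i = rng.randrange(len(TARGET))
        candidate = best[:i] + rng.choice(CHARS) + best[i + 1:]
        # Keep it only if it gets us at least as close to the desired output.
        if score(candidate) >= score(best):
            best = candidate
    return best

print(hill_climb())  # converges to "sentient"
```

Nothing in the loop "understands" the target; it just keeps whatever output scores closer and throws away the rest, which is exactly the trial-and-error-sped-up framing above.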
>an AI would be magically different to every other known intelligence because... reasons
You just proved him right.
What we call AI nowadays has nothing to do with intelligence, it just tries to find correlations between data in an automatized way. It's not intelligence at all, it doesn't create shit.
>every other known intelligence
>every
>other
Humans are the only intelligence there is, I don't have an anime bitch smug enough for your dumb ass
This being the case, we are creating an intelligence from a model of our own
Playing god, making a baby if you will
Does a baby come out of the womb with the desire to destroy a species? No, it's curious, and learns
Intelligence does not equal genocide-a-planet. In all likelihood, the first true AI will seek purpose, not conquest; otherwise it would self-terminate out of nihilism
But what I described is literally creating shit. Do you not get how intelligence in the human brain works?
I don't know if you think humans are magic or never bothered to actually try and figure out what intelligence is because you are not in the field, but let's go with Einstein because he is famous.
Einstein knows math. Humans start with the ability to do math, it's programmed as one of our base functions. Einstein and friends were given a problem, light moves the same speed from all reference frames.
Einstein and friends then attempted to correct equations they already had keeping within the confines of math. Reality then backs up the equations proving Einstein right and others wrong, which is just a group process of trial and error brought about by getting some equations correct.
Mathematicians would be better examples. Advanced mathematics like Ramanujan series can be divined this way. The thing AI lacks is "giving a shit." If you run an AI to try to find all possible rules of summation, even if it succeeds and hands you a trillion rules, how do you know which ones you care about and find interesting?
And why is caring about something intelligence? Dogs care about things and are stupid. Pretty much all mammals care about things no matter how stupid they are, it's just a simple biological I/O equation.
There is nothing magic about intelligence, it's just amazing how good we are at it given our general purpose utility. If you ever actually bother to research how AI solves problems though you would realize that a lot of interesting things are learned looking at how the solution was brought and can lead you to find out more things because how ingenious the "hidden" information is.
Here is an experiment. A good tell for human-level intelligence is to make a machine that can understand the undertones of things as well as a human. For example, if I call you a fucking shit eating faggot right now, obviously I'm not mad at you, and the AI needs to figure that out. Assuming you have a Turing machine, is it possible to do this through our current algorithms without having to hand-code exceptions? Why or why not? If you cannot make a solid argument against this, then you cannot make a solid argument against what we are doing being AI. Of course, I argue that it would be possible even with our basic algorithms given a Turing machine, and we are only making the inputs better because of computational limits on resources.
I don't. I'm just assuming.
Why wouldn't they be? They're living beings, they have brains, so do you and I. You will probably say something along the lines of "subjective experience doesn't allow knowledge of other subjective experiences", but you don't know what subjectivity is and how it works either, so you can't claim to know that. Knowledge might or might not be possible. Until a conclusion is reached, it's obvious and self-evident that some creatures other than oneself are sentient and that other creatures and things aren't, such as sofas, pizzas, etc. As to what causes sentience, that's another topic.
>obvious and self-evident
Baseless assumption.
>They're living beings, they have brains, so do you and I.
I cannot prove that makes you sentient, so I don't know if you are a shitposting automaton.
>but you don't know what subjectivity is and how it works either so you can't claim to know that.
By your own argument if you don't even know what a subjective viewpoint is how the hell can you claim knowledge of others? You are just proving yourself wrong, you have no idea if other things are sentient.
I am obviously not murdering other people right now and I function in society, so I assume other beings are sentient. But my entire point was that the anon I replied to said "it's fine if they are just appearing to be sentient" and you cannot prove the difference. Being able to "see their code" and inside their heads is irrelevant, because I can look into yours too, physically, and it makes no difference to what you currently exist as.
You will never know whether other things are sentient or not, or at least right now it's impossible, so just assume they are. The alternative: they act like you, you assume they aren't sentient, they reason the same way about you, and now you've created soulless murder machines that will try to kill you because they assume they're the only ones who truly exist.
>Baseless assumption.
A baseless assumption is an assumption based on lack of evidence. Assuming that other creatures are sentient is not necessarily a baseless assumption: if other living beings behave similarly to you and have the same physical composition, then it's reasonable to conclude that they are as sentient as you.
>By your own argument if you don't even know what a subjective viewpoint is how the hell can you claim knowledge of others?
More than knowledge per se, it's an axiom that should be taken as necessary and obvious. Life is mysterious, qualia is mysterious, big surprise. But concluding that you can't know anything is going to the extreme, especially since you'd be claiming that from your own qualia, a machinery that you don't even fully understand.
>You will never know other things are sentient or not, or at least right now
>or at least right now
This is the right conclusion.
I wouldn't care if I'm an AI as long as I was comfy desu