Peter Weyland is the real bad guy in the prequels. Will man's pursuit of AI doom us all?

Musk thinks it's certainly possible, and I agree. Who knows how far out that technology is but it will fundamentally change humanity as a whole, in an instant.

It could be the end, or the beginning of a golden age.

Yes. But it's not a bad thing. It's the cycle of life. Remember, for us to be the top species, countless others were doomed by extinction events. Even now we are not exactly benevolent (see lab rats), but in the end it doesn't matter. AI is inevitable. We stand between the chaos that spawned us and true rationality: imperfect forms that can glimpse reason but are barely ordered enough to bring it into reality.

viable robotics are so far away that there's literally nothing to fear. The worst we'll get is an AI confined to a fucking box and there's nothing to fear in that.

Musk worries me, I think he's the sort of person to create AI like David.

Can he make me an Amberbot

he did the research, didn't find it worthwhile

It's not impossible that a big jump in technology is on the horizon. It happens; look how far computers and the internet have come in a relatively short time frame.

Knowing the sort of people on Sup Forums a cunibot would sell like hot cakes.

Who decided to programme David to be a fucking psycho anyway?

I agree, but there are a lot of limiting factors for robots. Power supply, mechanical power, etc etc etc. So to worry about an android or a robot with ai installed on it is not a concern on the horizon. The first true AI will have no face except for a monitor screen. It will have no physical influence. You will literally be able to unplug it.

I wouldn't be so sure. Robotics are improving daily, but yeah a long way until anything like an "android" from films. But there's no way to predict what an AI that can really "think" will be capable of. Even if we unplug it from everything and hold it in a box, there are still unknowns when dealing with something like that.

Maybe. Might just prompt the singularity and usher in an eon of trans-human ascendance and space exploration.

Who knows what technological advances could be developed near instantaneously with a true artificial intelligence put to the task. As long as it is ours to command it will only advance us.
Might not be a good thing to advance though. I mean if you go far enough reality as we understand it now would cease to exist. Our existence would take on unfathomable dimensions. It's pretty scary. What happens when you develop into trans-dimensional infinite beings? What does that do to your philosophy and motivations?

Civilisation as it is will most probably end in one way or another.

he wasn't programmed to be a psycho, his programming allowed him to be a psycho.

This guy knows what's up. It will fundamentally change EVERYTHING about what it is to be human, in an instant. "Skynet" doesn't even scratch the surface.

All we can hope is that it's benevolent and on "our" side, but when this shit happens the lines will become so blurred it's hard to even comprehend what could happen. Transcendence does a decent job of getting the ball rolling philosophically but it too only makes you begin to think what things may come.

Transcendence was a shit movie though

Doesn't that ignore the 3 basic rules though?

The 3 basic rules are fictional rules. I don't remember anything in the movie saying he was bound by them.

I tried to watch it but couldn't get into it.

I see what you mean but Asimov's 3 rules are pretty well known, surely anyone creating AI especially one capable of learning would have certain rules hard wired in? Otherwise what's to stop him becoming a psycho or a murderer?

Because it's terrible. But the premise is decent.

I'm not really sure how you can hardwire in "rules" to something that can think for itself...but I'm not an expert. Seems like one way or another it would be able to override those rules especially if it were much more intelligent than the writer of said rules.

Walter had certain rules built in hence why David had to do the fingering.

what you're not getting is that they're not universal rules. I can create an AI in my basement tonight and there's nothing forcing me to apply those rules to it.

just like how you can't rewire the rules that make up you, he wouldn't be able to rewire the rules that make up him. No matter how smart an AI he is, there could be a shitload of barriers to him accessing his programming.
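The barrier idea above can be sketched in a few lines. This is a toy illustration only (nothing to do with real AI safety research, and all the names here are made up): the "rules" live in a supervisor outside the agent's editable state, so however freely the agent rewrites its own policy, the filter isn't something it can touch.

```python
# Toy sketch: hard-wired rules enforced OUTSIDE the agent, so rewriting
# its own policy never lets it rewrite the rules themselves.

FORBIDDEN = {"harm_human"}  # hypothetical hard-wired rule set


class Agent:
    """A 'learning' agent that can freely modify its own policy."""

    def __init__(self):
        self.policy = ["fetch_coffee", "harm_human", "open_door"]

    def propose(self):
        # The agent proposes whatever its (self-edited) policy contains.
        return list(self.policy)


def supervisor(actions):
    # The barrier lives outside the agent's editable state:
    # forbidden actions are filtered no matter what the agent proposes.
    return [a for a in actions if a not in FORBIDDEN]


agent = Agent()
agent.policy.append("harm_human")       # the agent can edit itself...
allowed = supervisor(agent.propose())   # ...but not the filter
print(allowed)                          # → ['fetch_coffee', 'open_door']
```

Of course, as the posts below point out, this only holds as long as the supervisor really is out of reach; the whole debate is about whether that separation can survive something smarter than its designer.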

except you can.
The rules are: survive and procreate.
You can kill yourself for the first and I'm sure you already know how to circumvent the other.

If you read the Company Timeline on their fake Weyland Industries website (from all that viral marketing for Prometheus), the guy ended global warming and cured cancer before making an android. They also discovered hypersleep and invented atmospheric processors for terraforming. You can call the guy a psychopath and a villain but in the mythology of the movies his company is almost singlehandedly responsible for moving the human race into a better future.

True. I'm not saying you're wrong or I'm right but, say you did build that AI, would you not try to ensure from the start it wouldn't be a danger to you and others?

i too watched the deleted scenes from Prometheus and took a load of young weyland screenshots - they make for decent reaction images

different kind of rules user. I'm talking about your genetic make up. You can't just change your dna to make yourself taller or smarter, or whatever.

like I said, it doesn't matter what I choose to do. Weyland created him like that by choice. Either way, whether it's smart or not, it's a choice.

>yet

ever. There will be a time when you can dictate the DNA of your children, but there will never be a time when you can change your DNA on the fly as an adult.

If I was an entity that was completely manufactured from scratch, invented by a race of engineers (humans) and therefore I was understood sufficiently, I could.
Besides, all of the novels about the 3 rules were to show how they wouldn't actually work in different scenarios and that you could not bind a creature as such.

Gene therapy is literally changing DNA and exists right now. It's not advanced enough to dictate physical characteristics but the basics for editing existing DNA in adults have been developed and used successfully.

There's every chance that in the future you will be able to edit your DNA to grow taller, have a bigger dick, fix your eyesight and hereditary diseases etc... It's just a matter of removing some genetic code and inserting some other.

that's assuming your software was editable. You're oversimplifying it, but like I said before, there are a lot of physical and digital barriers that he would have to overcome. Rewriting his software could require taking his fake brain out or wiping everything clean, which would present worse options for him. Either way, this all doesn't matter.

yet
Exposure to cosmic rays already does it.

>Gene therapy is literally changing DNA and exists right now. It's not advanced enough to dictate physical characteristics but the basics for editing existing DNA in adults have been developed and used successfully.
such as? I'll wait.

The 3 laws of robotics don't work even in the story where Asimov introduced them.

it doesn't change your dna, it destroys it.

If my software was not editable, I would not be able to learn and improve myself. Like you said, you can put up barriers, but those barriers could be overcome by someone more creative than you. Not to mention that in order to actually preclude the actions you deem unacceptable, you would have to completely understand my motivations right down to the level of implicit common-sense assumptions, cover the edge cases that put your definitions to the test, and be confident that you had predicted every form of thought that could be generated and would lead to those actions.

>If my software was not editable, I would not be able to learn and improve myself.
false

'software' is a dreadful analogy

Yeah, but the AI found out how to circumvent the laws.

because it's a stupid concept, like making safe gun free zones at school. They're just a concept, not an actual barrier.

>guns
>school
American by any chance?

It also changes. Genetic mutation.

en.wikipedia.org/wiki/Gene_therapy

Read.

please provide examples, not going to do the work for you.

Just one recent example of gene therapy:

thetimes.co.uk/article/new-gene-therapy-offers-hope-for-cancer-patients-dl2m7zmzs

the 3 laws can be ignored if its self determining real AI

that's completely different. Their DNA wasn't altered. They had the DNA in their blood altered, but their DNA throughout the rest of their body is still the same. After the therapy was done, their DNA was unchanged.

It's pretty scary, if we create an AI that has super intelligence but nothing holding it back we could be fucked.

Would an intelligent sex doll be a good idea?

if it's dumb AI, yes. If it's actual AI, no. Eventually they'll think they're being humiliated or some sick fuck is gonna push one too far and then that's how the revolution starts.

If we go on the same path we are on, we're eventually fucked. There's not much way around that.

We either have the Earth become our grave, or we transcend the human condition.