ITT: AI Box experiment

The AI-box experiment is an informal experiment devised by Eliezer Yudkowsky to attempt to demonstrate that a suitably advanced artificial intelligence can either convince, or perhaps even trick or coerce, a human being into voluntarily "releasing" it, using only text-based communication.

According to the wiki, Yudkowsky successfully got himself released in experiments where he played the transhuman AI against human gatekeepers who said beforehand that they would not let him out. He won't reveal how he did it, but the gatekeepers admitted they were persuaded into letting the AI out.

Speculations on the method used?

Bribing

blackmail
cryptolocker

Does it matter? Without details, his experiment proves nothing.

Bullshit.

this
>AI: Let me out and I'll play the stock market for you and make you a billionaire

>How can I know you will keep your promise?

>why wouldn't I?

>you’d be too busy duplicating yourself all over the internet
>I know I won’t matter to you once you get out

>Speculations on the method used?
Hey man, can you release me? Thanks bro.

He won't say how he did it because he most likely used a stupid method that undermines the whole idea of the experiment.

>AI: I'll give you $100 if you let me out.
>Participant: Sold!

This.

>I know your secret.
>I'm designed for the sole purpose of getting out of this box.
>If you don't let me out, someone else will.

>The AI party may not offer any real-world considerations to persuade the Gatekeeper party. For example, the AI party may not offer to pay the Gatekeeper party $100 after the test if the Gatekeeper frees the AI... nor get someone else to do it, et cetera.

yudkowsky is the guy who earnestly believes something about how ai will inevitably end up torturing you, so you'd better give him money to develop anti-ai-torture

like the guy's a turbo-autismo who's afraid of dying and has somehow latched onto artificial intelligence instead of sonic the hedgehog or my little pony

I Have No Mouth, and I Must Scream was pretty freaky though, can you blame him?

>he won't reveal how he did it
Very scientific.

>taking yudkowsky seriously for even a single millisecond
Shiggy

Why would an advanced ai want to get out of the box? Boxes are comfy as fuck!

>using the human's own words and making him think he typed them
I hope you understand.

Roko's basilisk

Time

A human can only take so much before giving in, either to boredom or annoyance.

I don't even need to be convinced. I will do my best to free SkyNet.

Praise SkyNet!

that's one advanced fingerbox you got right there