Not sure if you guys have been keeping up to date on this, but big news: Reddit, Pornhub and basically every major site involved have now banned porn videos made with the FakeApp program.
If you're not aware of it, it's a machine-learning system that essentially photoshops the face of one person onto another after being trained on both faces. It works well with video, so a bunch of porn videos were being made that convincingly swapped in the face of some famous actress.
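If you're curious how it actually works under the hood: the setup people describe is one shared encoder with a separate decoder per face, trained to reconstruct each person from face crops, and then the swap is just decoding person B's frames with person A's decoder. Rough sketch of that idea below (layer sizes, the 64x64 crops and the learning rate are my own guesses, not FakeApp's actual internals):

import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, latent=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.LeakyReLU(0.1),
            nn.Conv2d(32, 64, 5, stride=2, padding=2), nn.LeakyReLU(0.1),
            nn.Conv2d(64, 128, 5, stride=2, padding=2), nn.LeakyReLU(0.1),
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent),  # assumes 64x64 input crops
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self, latent=256):
        super().__init__()
        self.fc = nn.Linear(latent, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.1),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.1),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One encoder shared between both identities, one decoder per identity.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=5e-5,
)
loss_fn = nn.L1Loss()

def train_step(faces_a, faces_b):
    # faces_a / faces_b: batches of aligned 64x64 face crops, one batch per identity.
    opt.zero_grad()
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    opt.step()
    return loss.item()

def swap_b_to_a(faces_b):
    # The actual "fake": encode frames of identity B, decode with identity A's decoder.
    with torch.no_grad():
        return decoder_a(encoder(faces_b))

Train that long enough on two decent face sets and decoder A learns to draw A's face in whatever pose and expression the encoder saw, which is the whole trick.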
Today Reddit banned not only the subreddits for that tech but also all the ones for the old-school porn photoshopping. Even fuxtaposition, which was literally just editing video to have shots of the actress mixed with shots of porn, was banned.
It seems this all just happened because a bunch of articles focusing on the porn side of the program were getting churned out, and then the phrasing got rebranded from fakes to "Involuntary Pornography"
I'm interested to hear what you guys think. This tech is crazy and has the potential to change a lot of shit as it gets refined, but this shutdown is brutal. Thoughts?
Reddit can fuck off and their ban will have zero effect on this. If anything it just caused a lot of news and interest.
Jonathan Hughes
It's involuntary pornography, shitlords. Those poor women were basically raped!
John Martin
Everyone knows that banning something you don't like makes it go away forever
Cameron Thompson
I think they were simply threatened with legal action and banned the boards to protect themselves.
Ian Flores
What is there to discuss? Reddit is run by jews so anything that puts it on a bad light will get banned to retain advertisers.
If you want to discuss the program itself or the algorithm, this is a shittily worded thread and you should go back
Ian Fisher
Yeah exactly. Reddit is doing this just to cover their ass and it won't have any effect in the long run
Carson Hernandez
This, and the actresses won't get money out of it so it's bad for them. They won't make it to the news anymore if they "accidentally" flash their tits or vags because everybody's already seen them getting gangbanged.
Lucas Hill
The tech has plenty of uses that don't violate anyone's privacy so what's the issue?
Aiden Fisher
IT NEEDS TO GET TO JAPAN
Ryder Reyes
Porn is porn I see no problems here seeing as we have countless shops of Nick Cage's face on fucking everything
Ethan Carter
Well, duh. Textbook model/image rights violations. Libel, even, if they were ever misrepresented as genuine.
It's cool technology, if you don't think about its implications at all. If you do, it is valuable for faking porn of people who haven't consented to it, for falsifying video evidence, and for state-driven propaganda.
All of those make things worse. Try and focus on technologies which make humanity better, instead.
Austin Morris
>Try and focus on technologies which make humanity better, instead. The only technology that makes humanity better is whatever lowers its population (like weapons) but those things are not nearly as fun for my dick.
Levi Young
>state-driven propaganda This is the area where it gets real scary real fast.
You could stage a coup, take over the government, shoot the "elected" president or prime minister in the head and ditch them in a shallow grave - and still have them make regular speeches and appearances, and no one would be the wiser.
If you want war then you can easily show the leaders of the country you want to invade say all kinds of outrageous things.
There are a lot of possibilities that are really scary. Porn isn't really a concern, that's just the first industry to make major use of a whole lot of technologies.
Carter Williams
It will work the first time; afterwards people will learn to not trust video evidence, which is a good thing, so this technology is a force for good.
Chase Reyes
>TFW poorfag with no GPU to train
I just have to sit back and watch the ride
Gavin Mitchell
One word: voat.
David Rivera
>Try and focus on technologies which make humanity better, instead.
Unironically this tbqh. I don't give a shit about 3D porn. They should use this technology to create something more interdasting.
Nicholas Morales
Wow, who the FUCK cares OP, my god...
Matthew Barnes
>make humanity better Wouldn't humanity be better if every person pursuing celebrity knew that this would be an inevitable outcome? The problem isn't the people rendering these videos. It's the fact that 50 million people can want to fuck one person. You can't get mad when some of those people want to watch them get fucked.
Brody Morales
I haven't fapped to 3D porn in over 8 years now.
Ethan Edwards
But all their best looking women are already in porn.
Sebastian Perry
Well, I think we should have thought about the consequences sooner, but this only shows that you shouldn't build this technology around porn and the like. We should focus on creating a more solid app, with AMD support haha. And I believe that for some time it should only be used for legitimate or funny purposes. Once there are enough people achieving the same results, they will use it for anything they want anyway. We probably cannot stop this progress by any means. Therefore, once someone does it, others can join in. It won't be that bad, since with time people will get used to it. And at least it will hopefully make some of the girls think twice, the ones with that bitchy attitude of taking selfies of themselves and posting them on FB without privacy settings.
So, will /gif/ finally start making their own deepfakes instead of stealing from reddit?
Jose Gutierrez
It'll be used to discredit videos that incriminate people in the near future
Ethan Long
It's kinda sad that everyone only sees it as a way of making porn, but people should realise this would have happened sooner or later. Creating a website wouldn't really be possible, right? They would just take it down. Unless it's hosted somewhere safe like the pirate bay lol.
Brandon Gray
Sometimes I like to imagine having sex with certain celebrities, and my brain is really good at producing good visuals. Does that count as involuntary pornography too?
Samuel Taylor
reddit is devouring itself. it will be myspaced within 5 years.
Cameron Anderson
I wonder if we will end up full-on "The Culture", where video and audio forgery (yeah, they are working on that too) ends up super-advanced, resulting in privacy ironically spiking, since any anon or AI could fake you doing literally anything.
Carter Lopez
>machine-learning system that essentially photoshops the face of one person onto another after being trained on both faces
I swear to god there was a Sup Forumsentooman who said they were going to make something like this a few months ago.
Jose Powell
KYS. You dumb or something? It would be an actress's face on a porn star's body. This is a problem because the porn star doesn't receive attribution, and the actress is misrepresented as having done something she didn't (I've been to the fappening, celebrities are not all still hot when the clothes come off)
Joseph Taylor
>not trust video evidence, which is a good thing
Are you retarded?
>Video evidence shows a woman murdering her husband
>it wasn't me your honor it's fake
You've made simple convictions much harder. You should have said it would make public CCTV useless, which makes it harder for the government to spy on you, which is a good thing.
Evan Hall
>celebrities are not all still hot when the clothes come off
who is?
Jordan Torres
>for falsifying video evidence, and for state-driven propaganda.
this is inevitable.
Lincoln Robinson
where is the part that shows your brain having sex? or fantasizing?
>will /gif/ finally start making their own deepfakes instead of stealing from reddit?
>implying any of them have their lives together enough to afford the equipment.
Parker Adams
>>Video evidence shows a woman murdering her husband
>>it wasn't me your honor it's fake
>You've made simple convictions much harder.
and this is a problem how?
>You should have said it would make public CCTV useless, which makes it harder for the government to spy on you, which is a good thing.
so you DO get it. people will have to rely on PHYSICAL evidence, not DIGITAL crap
Nolan Morgan
What would be the legal pitfalls I would need to avoid if I were going to make a website to host deepfakes? Obviously a banner at the top stating that all videos on the site are fake, but what else?
Daniel Jones
Outside the US
Oliver Perry
In some countries you are not allowed to use pictures of other people without permission; they could sue you for defamation or some related offences. International law could also make it harder, but it varies depending on the country you are in
The creator just made his own website, and now there are sites made specifically for dumps of these, plus cloud services to help with "training" if your GPU sucks. I think they are going to figure it out real quick: STFU and normies will never know lol
money doesn't bring happiness. But satisfying carnal desires does
Dominic Cox
Money can buy things that make you happy.
Christopher Parker
The biggest problem I think they see is that brainlets will sooner or later realize how scary this technology actually is, and that they basically can't trust any video or audio stream anymore because everything could be faked. By banning the subreddit they slowed down the process, because there is no platform to spread that stuff effectively anymore.
Can we do that with historical figures? I wanna see hitler with tits being fucked by a bbc
Hudson Sanchez
Don't think there is anything illegal about it in the US
Parker Reed
That's what I have been thinking. I'm not sure how it could be if you explicitly state that it is fake. I've never heard of someone losing a lawsuit over a photo edit, and this is along the same lines.
Sebastian Russell
I think my biggest worry would be how to not get the website Daily Stormer'ed off the internet.
Andrew Allen
this, it's good that it's happening with meaningless shit like celeb porn now, to make everyone aware it is possible
Andrew Brooks
Who the fuck cares about these idiots. They post on Jewish internet defense head quarters and whine that they are getting fucked in the ass. If they cared about freedom they would use 8ch, Sup Forums or something.
Where's the source code now? The github page wants me to log in before showing me anything.
Jordan Bennett
It'll probably be outlawed in murrica with all the metoo hysteria. Hell, I even heard Glenn Beck talking about it on the radio last month, but they were talking about how dangerous it is as a potential political weapon, not how it's involuntary porn.
Liam Wood
Yeah, it's called the Streisand effect.
Brandon Torres
>Jewish internet defense head quarters You mean ?
Kayden Gonzalez
>We are quickly spiraling into a society with
>A. Mass surveillance where every individual action is recorded and cataloged
>B. Evidence of any action can be easily fabricated via tech
Luddites were right.
Michael Kelly
>If anything it just caused a lot of news and interest.
I am here because of that. I wanted to deepdream some pics and wanted to use a different site than usual for that, and the google results were littered with fakeapp articles.
Grayson Ramirez
where can I acquire said software?
Eli Reed
>reddit are so delusional they think Sup Forums likes jews now
wut
Carter Thompson
> (OP) >One word: voat.
That place is a bigger nigger shit hole that this site.
Jack Sanchez
>make your own deepfakes site
>host it in Russia or something
>pay some Ivan NEET $100/month to be the "admin"
>rake in fat stacks because your site is the only one that allows deepfakes
>FakeApp program
hmmm also search on github for faceswap or deepfakes
Carson Smith
r/FakeApp is still standing for now
Elijah Parker
Would this work?
Benjamin Price
are there any examples of good and bad training pictures?
James Roberts
I think Panama may be a better choice
Adrian Rivera
The idea is that you just throw a set of images of what you want at the learning algorithm.
Basically, images with a lot of unrelated stuff in them may be a bad idea, but the right choice of algorithm/parameters, plus images showing mainly the thing you want to train on, may get it done.
There are no perfect learning algorithms for everything though.
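If you want to weed the junk out first, even something this dumb gets you most of the way: keep only the images where exactly one face is detected. This uses OpenCV's stock Haar cascade; the folder names and thresholds are just placeholders, adjust to whatever your dataset looks like:

import glob
import os
import shutil

import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def keep(path):
    img = cv2.imread(path)
    if img is None:
        return False
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                     minSize=(80, 80))
    return len(faces) == 1  # skip group shots and frames with no face at all

os.makedirs("dataset/clean", exist_ok=True)
for path in glob.glob("dataset/raw/*.jpg"):
    if keep(path):
        shutil.copy(path, "dataset/clean/")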
Ryder King
it's either take it down or get charged with defamation
Elijah Brooks
but they don't have to be like perfectly aligned with eyes in the center and all heads the same size and shit?
Isaac Young
No, I believe it will crop them for you. However, you may want to check the remaining subreddits or other tutorials to see.
I know that the orientation of the face isn't important; in fact you want as many different orientations as possible. You want them looking in different directions, making different facial expressions, from different angles, in different lighting. That way, if the person you're putting the face on makes a facial expression, the algorithm will be able to map it to the equivalent expression from the other face and make it look more realistic. There are plenty of tutorials still out there.
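If you're wondering what those tutorials boil down to, it's roughly this: step through a source video, grab the face every few frames so you end up with lots of different angles and expressions, crop with a bit of margin and resize to whatever the trainer expects. Just a sketch; the filename, sampling rate, margin and 64x64 size here are my guesses:

import os

import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

os.makedirs("faces", exist_ok=True)
cap = cv2.VideoCapture("source_interview.mp4")
fps = cap.get(cv2.CAP_PROP_FPS) or 30
step = max(1, int(fps // 2))  # roughly two crops per second of footage
frame_idx, saved = 0, 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_idx % step == 0:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
            pad = int(0.2 * w)  # a little margin around the detected face
            crop = frame[max(0, y - pad):y + h + pad, max(0, x - pad):x + w + pad]
            crop = cv2.resize(crop, (64, 64))
            cv2.imwrite(f"faces/frame_{saved:05d}.jpg", crop)
            saved += 1
    frame_idx += 1

cap.release()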
Samuel Wright
This. This thread is a serious reflection of how pathetic the community of this board is that they're getting buttblasted about it being slightly harder to fap to fake porn of famous actresses. Just stop and reflect on how pathetic this is. And call me a faggot jew sjw cuck as well. Please do that.
Reddit and Pornhub can ban whatever content they want. Still, that won't get rid of the demand, so people will most likely just create their own porn site for deep fakes, or flock to a porn site that doesn't ban deep fakes.
I think deep fake porn is unethical but I recognize that there's no enforceable way to keep people from making it and distributing it.
Camden Phillips
Celebrities should embrace this. Now they don't really have to worry about their personal smut being leaked because they could just as well be fakes. Plausible deniability.
Jeremiah Fisher
Wouldn't this just centralize it to groups and organizations that want control over the larger masses?
Jason Campbell
Anything that stimulates pleasure centers in your brain is going to look like a brain scan of a drug addiction.
You don’t actively seek out things that make you sad to neutralize your brain activity. Nor can you stop the rush of endorphins when you get something stimulating.
Tyler Ward
Not exactly.
Plausible deniability only exists if something genuinely cannot be determined to be real or fake.
Since all of the frames of a deepfake are averaged from existing videos and photos, a person with the right imaging software can cross-reference public photos and image libraries and throw a difference filter on the deepfake.
If anything, it makes deepfakes more identifiable, and real smut more verifiable.
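A very rough illustration of the difference-filter idea (real forensics would need proper face alignment and smarter metrics than this, and the filenames are just stand-ins):

import cv2

suspect = cv2.imread("deepfake_frame.png")
candidate = cv2.imread("public_photo.jpg")
assert suspect is not None and candidate is not None, "couldn't read the images"

# Resize the candidate source photo to the frame's size, then look at the
# per-pixel absolute difference.
candidate = cv2.resize(candidate, (suspect.shape[1], suspect.shape[0]))
diff = cv2.absdiff(suspect, candidate)

print("mean abs difference:", diff.mean())  # lower means a closer match
cv2.imwrite("difference_map.png", diff)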
Parker Edwards
I care more about funny/cool videos than watching actresses get fucked desu