So lemme get this straight...

So lemme get this straight... Facebook (a place where there shouldn't be any nudes at all) is asking for your nudes, to prevent your nudes from being posted on Facebook (which, again, shouldn't have ANY nudes).

So if I don't send Facebook my nudes, then they can be posted... uhh.

I get it... but why not just train a neural net to identify naked humans and remove the image?

Why should nudes be allowed on Facebook at all? This seems pretty ridiculous, but maybe it's just me. Idk, what do y'all think?


You knew his plan from the very start.

look at this face...should've seen this coming

Since when does Facebook ask for nudes?

since this week
geek.com/tech/send-facebook-your-nudes-so-no-one-else-can-see-them-1722059/

this article is so biased it makes me sick

Jesus fucking Christ

I seriously don't see how this is a viable answer. Can somebody explain?

So if they're going to store the image "encrypted", do they mean making a hash of the image? Couldn't someone just add a dot or tint it or some shit if they really wanted to post the nudes anyway?

daily reminder to Sup Forums that hashing is not encryption

Facebook will just use it for extortion and leverage in the future.

>People just submitted it.
>I don't know why.
>They "trust me"
>Dumb fucks

On a technical note: they don't actually mean encrypted, and they don't mean hashed in any conventional way. PhotoDNA isn't a cryptographic hash; it's a perceptual (fuzzy) hash.

Like many other fuzzy hashes, it performs a lossy transform on the data before reducing it to a numerical form: in this case, some transform involving desaturation to grayscale, resampling onto a low-resolution pixel grid, and then some kind of quantisation/analysis of the gradients between adjacent cells of that grid.
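
To give a feel for that kind of pipeline, here's a minimal difference-hash (dHash) sketch in Python. To be clear, this is a toy illustration of the general idea, not PhotoDNA; the function name and parameters are mine.

from PIL import Image

def dhash(img, size=8):
    # Toy perceptual hash: desaturate, shrink to a (size+1) x size grid,
    # then keep one bit per horizontal gradient between adjacent cells.
    # Fine detail is discarded on purpose, so re-encoding, resizing, and
    # mild edits barely change the bits.
    small = img.convert("L").resize((size + 1, size), Image.LANCZOS)
    px = list(small.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            i = row * (size + 1) + col
            bits = (bits << 1) | (px[i] < px[i + 1])  # 1 if brighter to the right
    return bits  # a 64-bit integer for size=8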

I'm being fuzzy about the description because so is everyone else: nobody is publishing exactly how it works, largely for fear that if they did, or presented an oracle that let you test against it easily (without fairly serious consequences), it would be trivially exploitable. They really don't want that, because the same system is also being used to recognise images of child porn.

As far as I know, the Microsoft research paper on it hasn't been published, nor is there any binary or source available.

It's easy to see that if you knew exactly how it worked, or had an oracle to test against, those fears would indeed be well founded, even if there's a convolutional neural net inside the hashing step. Fuzzy hashes are, by definition, imperfect tools, but they can be useful as low-specificity, high-sensitivity first passes that flag possible, but not certain, matches for human review.
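
To make that concrete with the toy dHash above: matching is a Hamming-distance comparison against a tunable threshold, not an equality check. The 10-bit threshold here is an arbitrary illustration, not a published Facebook number.

def hamming(a, b):
    return bin(a ^ b).count("1")  # number of bits that differ

def flag_for_review(candidate_hash, known_hash, max_bits=10):
    # High sensitivity, low specificity: a loose threshold catches
    # near-duplicates (re-encodes, light edits) at the cost of false
    # positives, which is exactly why human review matters downstream.
    return hamming(candidate_hash, known_hash) <= max_bits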

The problem comes when there's no human review; Facebook does actually have human reviewers, whereas, for example, YouTube (with ContentID) does not.

They're certainly not getting me to send them my nudes. (They could search the internet for them if they're that desperate.)

The Xzibit meme has become funny again.

The article explicitly states that the photo will be encrypted and stored on Facebook's servers.

It also talks about signatures, and since it's an article for normies, it might not actually mean encryption. It's just wording to help normies believe that their pictures won't be seen by Facebook employees.

Kind of. I read a more detailed article somewhere else (don't remember where, but I think it was linked in the thread) that explained they run their own histogram-style analysis on the image and then create a hash from the results. Despite what the geek.com article kind of alludes to, they don't necessarily have to keep the image: they can just run the same analysis on each newly uploaded image and compare hashes.
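
If that's right, the upload-time check would look something like this sketch (reusing the toy dhash/hamming from earlier in the thread; the stored-hash value and threshold are made up for illustration, not Facebook's actual pipeline):

from PIL import Image

REPORTED_HASHES = [0x9F3A5C7E1D2B4680]  # hypothetical hashes of reported images

def matches_reported(path, max_bits=10):
    # Hash the new upload and compare against every stored hash.
    h = dhash(Image.open(path))
    return any(hamming(h, r) <= max_bits for r in REPORTED_HASHES)

Only the hashes need to be stored server-side; the original image can be deleted after hashing.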

Defeating it is going to take a little more than changing a couple of pixels or the tint, but it's still theoretically possible. I think their main goal here is just to add enough friction that people are more inclined to post revenge porn elsewhere, effectively washing Facebook's hands of the issue.

Since you seem to have read into it more than I have, here's my question: would basic geometric transformations throw a wrench into it? Say flipping or inverting the image, skewing it, or just rotating it 45°, for example.
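
Not the guy you're replying to, but it's easy to test empirically with the toy dHash above. For a naive gradient hash, a mirror flip or a big rotation scrambles roughly half the bits, i.e. no better than comparing two random images, so a real matcher would have to hash mirrored/rotated variants too. Whether PhotoDNA itself survives those transforms isn't publicly documented. Sketch (the filename is a placeholder):

from PIL import Image

img = Image.open("example.jpg")
h0 = dhash(img)
h1 = dhash(img.transpose(Image.FLIP_LEFT_RIGHT))  # mirror flip
h2 = dhash(img.rotate(45, expand=True))           # 45 degree rotation
print(hamming(h0, h1), hamming(h0, h2))  # expect roughly 32 of 64 bits to differ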

It will be the motherlode when some hacker penetrates that system.

Is it really that hard to identify a naked human? We don't need people to report every nude or snuff film that shows up on Facebook... we could just have a trained AI checking each file as it's uploaded.

But the Zuck is more concerned with Instagram face filters and not being human, huh?

>And let’s be real, the NSA is already looking at your nudes all the time anyway as part of the infinite digital trash pile they constantly sort through.
they aren't even trying anymore

.. and they have to save the noods for scientific purposes. Saving only the hash just isn't enough ..

hmmmm... uploading incriminating photos which are explicitly linked to your personal details, so Facebook can then sell this information to the highest bidder. brb running for public office while my opponent pays FB enough to gain access to all the incriminating material I tied to my own name.

top kek indeed

I drew the line with FB after they no longer allowed you to share videos with friends if they contained copyrighted music.

Mark is trying to find the perfect physique in time for his 2020 Presidential campaign.

I thought it was some hacker posing as Facebook to get nudes.