We don't need no automated porn detectors
October 1, 2006 10:50 AM
Detecting the erotic (like detecting the humorous) is one of those things that people can do better than any known machine. This new patent seems largely based on "Finding Naked People", a paper in the field of computer vision.
posted by GeorgeHernandez at 10:55 AM on October 1, 2006
Handee, how right you are:
"Except for extremely hairy subjects, which are rare, skin has only low-amplitude texture."
posted by TonyRobots at 11:46 AM on October 1, 2006
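[The line TonyRobots quotes is the heart of the paper's skin filter: skin reads as a smooth, low-texture region within a narrow color band. Here is a minimal Python sketch of that idea, assuming numpy and scipy are available; the log-opponent color construction follows the paper, but the threshold values are illustrative guesses rather than the published ones:

```python
import numpy as np
from scipy.ndimage import median_filter

def skin_mask(rgb, texture_thresh=0.1):
    """Sketch of a Fleck/Forsyth-style skin filter: hue/saturation in a
    loose skin-tone band, plus low texture amplitude. All thresholds
    here are illustrative, not the paper's tuned values."""
    rgb = rgb.astype(np.float64) + 1.0  # avoid log(0)
    logrgb = np.log(rgb)
    # Log-opponent channels: intensity, red-green, blue-yellow
    I = logrgb.mean(axis=2)
    Rg = logrgb[..., 0] - logrgb[..., 1]
    By = logrgb[..., 2] - (logrgb[..., 0] + logrgb[..., 1]) / 2.0
    hue = np.degrees(np.arctan2(Rg, By))
    sat = np.hypot(Rg, By)
    # Texture amplitude: how far intensity strays from its local median.
    # Per the quote above, skin (hairy subjects aside) scores low here.
    texture = median_filter(np.abs(I - median_filter(I, size=5)), size=9)
    return (texture < texture_thresh) & (sat > 0.05) & (sat < 0.6) \
        & (hue > 100) & (hue < 180)
```

The paper then tries to group the surviving skin regions into limb and torso assemblies with geometric rules, which is exactly where the hard cases discussed below slip through.]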
"Except for extremely hairy subjects, which are rare, skin has only low-amplitude texture."
posted by TonyRobots at 11:46 AM on October 1, 2006
Nudity does not automatically make something erotic.
posted by graventy at 11:47 AM on October 1, 2006
...and lack of nudity does not automatically make something unerotic. This will just give us perverts and fetishists a leg up on the rest of y'all.
posted by nebulawindphone at 12:08 PM on October 1, 2006
I had this very discussion with someone who was doing Thai government funded research on this very issue.
There's the huge problem of false positives: things which are detected as porn or erotica but which anyone with a brain would agree were OK. What about dermatologists? They need to see skin. Gynaecologists - surely trainees use e-learning? And what about paintings? There are some fairly racy bits of classical art that would be caught by such a system, but which are hanging in some of the world's most prestigious galleries.
And the false negatives, those things thought OK by the system but which are proper racy: the patented system uses shape as well as skin type, but it'd still be caught out by the aforementioned furry porn, leather stuff, pretty much anything involving costumes, erotic literature, extreme close-ups, unusual camera angles...
posted by handee at 12:38 PM on October 1, 2006
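[For what it's worth, the two failure modes handee lists are the off-diagonal cells of a confusion matrix, and any such filter trades one against the other. A toy sketch, with hypothetical boolean labels standing in for human judgment:

```python
def error_rates(truth, flagged):
    """False-positive and false-negative rates for a binary filter,
    where 'truth' is the human call and 'flagged' is the detector's."""
    fp = sum(1 for t, f in zip(truth, flagged) if f and not t)  # dermatology, classical art
    fn = sum(1 for t, f in zip(truth, flagged) if t and not f)  # costumes, close-ups, prose
    negatives = sum(1 for t in truth if not t) or 1  # guard against empty classes
    positives = sum(1 for t in truth if t) or 1
    return fp / negatives, fn / positives
```

Tune the detector to miss less of the properly racy stuff and it will inevitably flag more gynaecology e-learning, and vice versa.]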
Will skin-detecting censorship push people to more unusual 'pleasures'? (both are really not safe for work)
Or maybe all porn will be done by extremely tattooed people.
posted by Kickstart70 at 1:10 PM on October 1, 2006
If it's just looking for skin and not trying to detect actual naughty bits, it would flag practically any image at the beach, or anywhere where guys have their shirts off. And I hope I don't live to see the day when I can't look at guys with their shirts off.
posted by obvious at 1:27 PM on October 1, 2006
The cool thing about Finding Naked People is that its goal is to, you know, Find Naked People. For fun. This is a low-stakes mission, so the enormous numbers of false positives and false negatives don't really matter so much. With censorship, false positives (and true positives for that matter) even further degrade the intellectual freedom that makes democracy function. I personally put a higher priority on that than some unproven "save the children" BS, but, whatever, most people are morons.
Anyhoos, to self-link and pull a paragraph from a relevant undergrad paper I wrote once upon a time:
The universally poor performance of approaches that attempt to block offensive imagery on the Internet indicates that this particular use of artificial intelligence is misguided. Although it is generally conceded that there are images on the Internet that are harmful to children, actual agreement on what images this category entails is not possible within human reasoning, let alone machine reasoning. This trend is particularly disturbing considering the high, intangible costs associated with suppressing intellectual freedom in our society. Community decency standards vary between communities and as a function of time. There was a time when it was illegal to send information about contraception through the mail. Therefore, a task domain that cannot be clearly defined by people will be similarly difficult to define for a computer. Image recognition capable of reliably finding people and determining their state of dress would require significant advances in AI and "Heck, pinning down a quantifiable difference between ‘dirty’ and ‘clean’ might well get you prizes in philosophy, too" (Rutter 2000).
posted by Skwirl at 1:33 PM on October 1, 2006
Will skin-detecting censorship push people to more unusual 'pleasures'?
The author's previous work, identifying horses in images, combined with this naked people detector may spell bad news for at least one unusual pleasure.
posted by StickyCarpet at 3:40 PM on October 1, 2006
I wonder what would happen if you ran it backwards. Would people agree that the product was pornographic?
posted by owhydididoit at 3:56 PM on October 1, 2006
What your computer would take mere seconds to do, I can do by hand in half an hour.
posted by hal9k at 4:50 PM on October 1, 2006 [1 favorite]
This thread has been archived and is closed to new comments