NYT: The Internet Is Overrun With Images of Child Sexual Abuse (CW)
September 29, 2019 1:35 PM Subscribe
In its reporting the New York Times "reviewed over 10,000 pages of police and court documents; conducted software tests to assess the availability of the imagery through search engines; accompanied detectives on raids; and spoke with investigators, lawmakers, tech executives and government officials. The reporting included conversations with an admitted pedophile who concealed his identity using encryption software and who runs a site that has hosted as many as 17,000 such images." Their shorter "What You Need To Know" article summarizes: reports are increasing, victims are younger and the abuse is worse, the Justice Department has neglected its duties for years, the police and non-profits are overwhelmed, and tech companies are often slow to react if they do at all. Almost 66% of reports were from Facebook Messenger.
(CW) The full article is fascinating and very in-depth but it includes graphic descriptions of abuse.
If AI is looking for a big win, automating that process so that humans no longer have to do it would be one. But that requires training with the kind of images it needs to be able to identify, which humans would need to select.
posted by tommasz at 1:55 PM on September 29, 2019 [2 favorites]
I don't know if there's a "win", but there are certainly things that could be done. More resources could be allocated to the problem.
We spend nearly uncountable amounts of money policing nonviolent drug offenders; perhaps this would be a better use of that time and money.
The State of Maryland recently found the time and resources to put a teenager on the sex offender registry for consensually sending photos of herself to another teen; perhaps there are better ways those law enforcement resources could be allocated, emphasizing predatorial behavior.
At around the same time, the Commonwealth of Virginia pursued a woman for over a year for an alleged self-induced abortion; perhaps the effort spent going after a phantom crime could have gone toward hunting the very real pedophiles that doubtless exist in the same jurisdiction.
These crimes exist because people are evil, but they are allowed to happen in such volume because we as a society decide not to prioritize going after the perpetrators.
posted by Kadin2048 at 2:40 PM on September 29, 2019 [114 favorites]
Kadin2048 is right, resources are sorely lacking, local jurisdictions don’t know or care about the harm done. My local missing and exploited children’s task force has a donation page because they can’t get enough funding from the state government.
I’ve been working in and around child exploitation prevention and related law enforcement for years, and it’s hard not to come to the conclusion that people love to talk about protecting children but very few people actually care enough to spend time or money.
posted by LastAtlanticWalrus at 3:08 PM on September 29, 2019 [25 favorites]
So as not to abuse the edit window, if I have to hear one more judge or prosecutor call sharing images of child sexual exploitation a “victimless crime” my head is going to explode.
posted by LastAtlanticWalrus at 3:09 PM on September 29, 2019 [29 favorites]
To be fair, it’s much harder to use child abuse to precisely beat minority communities while leaving white communities alone. Priorities, indeed.
posted by GenjiandProust at 3:15 PM on September 29, 2019 [20 favorites]
Adjudicating these cases is also quite resource-intensive. When I was clerking, the judge wouldn't even ask the clerk on the case to look at the images (in these ominous sealed-up manila envelopes); the judge alone did that. It frays everyone's nerves in a way that is almost unique among crimes, particularly since even other sex offenses usually don't by their very nature generate traumatizing physical evidence, as child porn obviously does.
posted by praemunire at 3:48 PM on September 29, 2019 [16 favorites]
"If AI is looking for a big win, automating that process so that humans no longer have to do it would be one."
Systems for automatically detecting known child porn have been in place for years... (standard joke: this is why the sentient ai will kill us all.)
posted by kaibutsu at 4:06 PM on September 29, 2019 [5 favorites]
Systems for automatically detecting known child porn have been in place for years... (standard joke: this is why the sentient ai will kill us all.)
In 2018, NCMEC received more than 18 million reports of exploitative imagery.I'm less familiar with the follow through on the law enforcement side, though. Presumably, most of these 18 million reports aren't being acted upon.
Strikingly, 99 percent of those reports come directly from the tech platforms. Through the use of artificial intelligence, hashed images, and partnerships between companies, we’re now arguably much better informed about the scope and spread of these images — and are better equipped to catch abusers.
posted by kaibutsu at 4:06 PM on September 29, 2019 [5 favorites]
Lemme also throw out that the people I've met working on these systems in the tech companies are goddamn heroes, who have chosen to take on a hard, traumatizing job with incredible dedication.
posted by kaibutsu at 4:09 PM on September 29, 2019 [16 favorites]
Getting funding for preventing child abuse, or for working with families at extremely high risk for abuse, was incredibly hard. It was also very cost-efficient: about $5K per family per year, I think, which worked out to about $700-$1000 per child, and this was for a child who was already experiencing some kind of known abuse and was extremely likely to wind up in a tragic situation.
Post-abuse shelters were needed too, sure, but the difference in how readily they got their funding was brutal. Post-trafficking shelters were so much more expensive to run, something like 10x more (when correctly run, with proper staffing), and took years of aftercare in comparison.
I am talking to an NGO about doing some work again but I don't think I can ever work in child abuse cases directly again. It's just too fucking infuriating to see how people will happily talk and talk and spend money afterwards and not do a damn thing before when it's possible to spare children the trauma - WE KNOW WHAT TO DO. We have the knowledge, we have the resources. It's not fucking rocket science. It's not even that expensive. We just don't care enough because they're not our kids and it's not headlines.
posted by dorothyisunderwood at 4:29 PM on September 29, 2019 [38 favorites]
And now I want to vomit for an hour.
posted by dorothyisunderwood at 4:33 PM on September 29, 2019 [6 favorites]
I think we have a hard time admitting that this is really happening, and so a hard time devoting funding to it -- it's maybe a facet of the just-world hypothesis. And not unrelated to why we can't manage to fund school systems properly, so that schools don't have to start lunch shifts at 10 a.m.
I'm trying to figure out how some people are so broken that images of child torture are the thing they are willing to risk everything for. So many people, apparently.
But then I think about what our current U.S. President does for fun.
Oh, humanity.
posted by allthinky at 5:22 PM on September 29, 2019 [6 favorites]
I'm trying to figure out how some people are so broken that images of child torture are the thing they are willing to risk everything for. So many people, apparently.
If I understand the article correctly, these sites are bartering these images, not trading them for money. They're demanding newly made material to verify that a new poster is not an undercover agent. (Makes a good justification for deep fakes in this particular case.)
posted by ocschwar at 5:32 PM on September 29, 2019 [1 favorite]
Presumably, most of these 18 million reports aren't being acted upon.
The article points out that it can be difficult for investigators to work with the resources they do have because of non-responsiveness from tech companies. So this is not just a law enforcement issue.
posted by Anonymous at 6:06 PM on September 29, 2019
Yeah, the article's precise wording is /some/ tech companies, with a focus on Tumblr. My guess is that what we're seeing here is greatly improved detection on the part of motivated groups within /other/ tech companies, paired with insufficient enforcement.
But I am trying to puzzle out why the article is painting tech companies with such a broad brush... There's certainly a well-known desire of law enforcement to get backdoors on encryption and storage, which many tech companies don't want to grant. So perhaps some enforcement requests are over-broad, leading to foot-dragging, leading to enforcement complaints about uncooperative companies to the reporters?
After all, why would a company dedicate whole teams to produce millions of leads and then be uncooperative? "Because they're evil" isn't an explanation (why produce the leads in the first place, then?), and the article doesn't really seem to probe the question at all.
posted by kaibutsu at 6:47 PM on September 29, 2019 [1 favorite]
Don’t forget that any crackdown also requires having regular, caring people look at these horrible pictures day after day. Look at the rates of depression, burnout, and PTSD in the teams at Facebook and Twitter, etc., that review these image reports. Even more difficult is that these are relatively low-wage jobs and there is nothing for the employees after they inevitably burn out and quit.
posted by interogative mood at 7:39 PM on September 29, 2019 [12 favorites]
Look at the rates of depression, burnout, and PTSD in the teams at Facebook and Twitter, etc., that review these image reports.
My takeaway from previous FPPs on this topic was that a major factor in the depression, burnout, and PTSD among people doing "community standards" reviews at Facebook was not simply the horrendous nature of the material they were exposed to, but the fact that nothing seemed to be done about it. One former employee interviewed in one of the articles I read described seeing the same video of an animal being abused over and over again, with no apparent action from Facebook to take it down or prevent it from being shared and reposted. I think a certain kind of person might tolerate being exposed to the very worst in humanity, even child sexual abuse, if they knew and could see how their reports led to the material being taken down and the perpetrators brought to justice. But Facebook and the other social media giants seem more interested in a sort of Potemkin moderation style, with heinous content brushed aside and its source never addressed, forcing their contractors to encounter the same filth over and over again, helpless to stop it. Small wonder 66% of the reports came from Facebook Messenger.
posted by biogeo at 9:49 PM on September 29, 2019 [29 favorites]
cops are just "requesting" information from these companies. There are a LOT of legal tools that law enforcement can use in order to demand evidence. But they're just, I dunno, politely requesting it for some reason? It's a bit boggling that they're pretending like they are not powerful or capable here
The SCA (Stored Communications Act) turns out to be a real pain in the ass in this context.
posted by praemunire at 11:24 PM on September 29, 2019
In the nineties my dad worked in the US government in a department full of lawyers where his lawyery job was to be the prosecutor representing the department against the employees who had screwed up or were objecting to their treatment in some way. Sometimes my dad got to do good things: he was very meticulous about transgender employees who had complained about not being able to use the locker rooms or restrooms for their gender, and he ended up helping to write up new policy alongside their defense afterwards because his work as prosecutor was so difficult for even the bigots to find fault with, despite him "losing" the cases.
Other times though, I remember him coming home drawn, exhausted, wanting nothing more than to eat dinner and collapse into bed - he could never talk about his cases but he would usually be chatty at dinner or try to work on his music. I remember him coming home one night and just hugging my mom, who is not a hugger and she didn't know really what to do with her arms there in the kitchen. He gained weight and aged a bunch and evidently almost went back to smoking, which would have been a huge failure for him.
A couple years later when I was in college and going through some shit, my dad told me what had happened. An employee of the federal government, working in DC in the office on a government issued and monitored computer network, had been fired for saving and distributing child porn - he'd had 2TB of it saved on his work computers in the nineties. And my dad had to LOOK AT IT for EVIDENCE. It was an awful case for everyone. They apparently kept finding more? And everything had to be done exactly by the book, but the book hadn't been quite cemented for digital crimes. This was the case that my dad says is the worst thing he ever had to deal with in his entire career, but I think that the worst thing of it all is that it was just a standout in volume, not subject matter. He called it "not uncommon".
Just knowing what it took out of my dad to deal with this one incident, I'm not at all surprised that people are unwilling to tackle it as a larger issue. Any sort of action we can take to stop child abuse before it starts not only saves the children from trauma, it also saves the time, money, and psyches of everybody who has to deal with the resulting fallout. I don't know what we can do to hammer this point home.
posted by Mizu at 1:26 AM on September 30, 2019 [36 favorites]
The NYT is looking at the high-level systems; the Miami Herald is going after the criminals. The Army officer in charge of all the President's communications at Mar-a-Lago apparently uploaded child porn while he was working there.
posted by rednikki at 1:42 AM on September 30, 2019 [8 favorites]
How sure are people that the tech exists to identify child porn? My impression is that visual recognition wasn't nearly that good.
posted by Nancy Lebovitz at 2:00 AM on September 30, 2019
PhotoDNA, the tool described above, doesn't operate on content recognition (trust me, I regularly see how poor the state of the public art is on non-illegal nudity detection), but on effectively coming up with a hash/fingerprint of known photos that's stable if the image is recompressed, resized, mangled, etc.
Think YouTube's ContentID system. Not good at new photos, but it can catch duplicates.
This was the heart of that tool Facebook proposed for dealing with non-consensual photo distribution. "Hey, upload those photos to us so we can block them" is something few sites have the trust to call for, though, and Facebook definitely isn't one of them.
posted by CrystalDave at 2:12 AM on September 30, 2019 [7 favorites]
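[A minimal sketch of the fingerprinting idea CrystalDave describes: a toy "average hash" in Python, not PhotoDNA itself (whose algorithm isn't public). Filenames are placeholders. It shows why a perceptual fingerprint survives recompression, where an exact cryptographic hash would not.]
```python
# Toy "average hash": a crude stand-in for PhotoDNA-style fingerprinting.
# Requires Pillow (pip install Pillow); the file names are placeholders.
from PIL import Image

def average_hash(path, size=8):
    """64-bit fingerprint: shrink to 8x8 grayscale, threshold at the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# A re-saved, recompressed copy lands within a few bits of the original,
# so a near-match against a database of known fingerprints flags the file;
# an exact cryptographic hash would miss it entirely.
if hamming(average_hash("original.png"), average_hash("recompressed.jpg")) <= 5:
    print("likely the same image")
```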
Perhaps it's time to eliminate the ability to share images/videos privately, at both the legal and infrastructural level.
posted by acb at 2:36 AM on September 30, 2019
Project Arachnid is a Canadian organisation but collaborates with international agencies. It crawls the web and sends notifications to ISPs when CSAM is found. They've got a new program called Shield which service providers can use pro-actively as well.
posted by harriet vane at 2:46 AM on September 30, 2019
This Project Arachnid approach sounds exactly right, but they're going to have to be careful, because the moment someone untrustworthy gets access to their tool, abusers are presumably going to be able to check their material against the tool and modify it until it passes.
posted by value of information at 3:15 AM on September 30, 2019
The term “child sexual abuse material” is increasingly being used to replace the term “child pornography”.[165] This switch of terminology is based on the argument that sexualised material that depicts or otherwise represents children is indeed a representation, and a form, of child sexual abuse, and should not be described as “pornography”.[166] Pornography is a term primarily used for adults engaging in consensual sexual acts distributed (often legally) [167] to the general public for their sexual pleasure. Criticism of this term in relation to children comes from the fact that “pornography” is increasingly normalised and may (inadvertently or not) contribute to diminishing the gravity of, trivialising, or even legitimising what is actually sexual abuse and/or sexual exploitation of children. [168] Furthermore, as with the terms discussed above, “child prostitution” and “child prostitute”, the term “child pornography” risks insinuating that the acts are carried out with the consent of the child, [169] and represent legitimate sexual material.
Note, this is partly in a legal context: the term “child sexual abuse images” has also sometimes been used in this context. However, it is important to note that, by limiting the terminology to “images”, the risk exists of excluding other forms of material representing child sexual abuse and exploitation, such as audio files...
-- luxembourgguidelines.org. Via Appropriate terminology on the Interpol website.
posted by sourcejedi at 3:39 AM on September 30, 2019 [11 favorites]
"overrun" is a bit strong. That would imply child porn being all over the place, constituting a majority of digitally shared images.
posted by doctornemo at 4:51 AM on September 30, 2019 [4 favorites]
I came in to say what sourcejedi said. The Canadian Special Report "Every Image, Every Child" calls this Internet-facilitated child abuse in Canada. Source.
posted by warriorqueen at 5:03 AM on September 30, 2019
Perhaps it's time to eliminate the ability to share images/videos privately, at both the legal and infrastructural level.
It's not.
posted by Mitheral at 5:37 AM on September 30, 2019 [21 favorites]
Shocking how everyone in Congress is all about protecting kids right up to the point where money needs to be put into actually doing so. And my surprised face at hearing that money to stop this abuse was pulled to fund border security.
Wonderful that over the past decade Google has donated $4 million or so to the National Center for Missing and Exploited Children... what’s that, an infinitesimal fraction of their earnings? Filthy rich assholes like Zuckerberg could donate more than that annually out of their pocket change and not even miss it.
If tech is helping create the problem, the people running the tech companies should have the integrity to support a solution, either by actually taking proactive action or by funding the groups tasked with prevention. If Congress actually gave two shits about children maybe things would be better, but the moral outrage is reserved for drumming up the vote rather than actually improving anyone’s lives.
posted by caution live frogs at 5:38 AM on September 30, 2019 [4 favorites]
Perhaps it's time to eliminate the ability to share images/videos privately, at both the legal and infrastructural level.
It's not.
Yeah talk about a reductio ad absurdum.
posted by aspersioncast at 6:01 AM on September 30, 2019
CrystalDave, thanks. So people would still have to look at the images, but at least they wouldn't have to keep seeing the same images.
value of information, even if they don't have access to the software, they'll be experimenting to see what they can get past it.
posted by Nancy Lebovitz at 6:34 AM on September 30, 2019
Shocking how everyone in Congress is all about protecting kids right up to the point where money needs to be put into actually doing so.
"Think of the children" is such a handy excuse to attack all manner of freedoms, that actually doing something to protect kids could take the wind out of that particular sail.
posted by Thorzdad at 6:43 AM on September 30, 2019 [9 favorites]
"Think of the children" is such a handy excuse to attack all manner of freedoms, that actually doing something to protect kids could take the wind out of that particular sail.
posted by Thorzdad at 6:43 AM on September 30, 2019 [9 favorites]
From a tech standpoint it's trivial to automate finding previously identified images of child porn, and as noted the tech companies have been doing this for years now.
Basically you take what's called a hash of the file, a string of gibberish that's mathematically derived from the actual file. The hash isn't the image and can be distributed to everyone (Google's library of hashes is more or less available to everyone who wants it). Then you simply compare all the images against the library of hashes and see what matches. It's fairly quick and doesn't rely on anyone actually needing to manually look through images.
Problem of course is that it only finds previously identified images of child porn; checking hashes does nothing at all to find newly produced images. That would require serious AI, and so far training AI to even just distinguish non-porn from regular porn has proven to be really difficult.
Cryptography adds a whole new level to the problem.
And breaking cryptography is worse than just invading privacy. Any cryptographic method with a built-in backdoor is a cryptographic method that's inherently unsafe, because everyone will instantly start looking for that backdoor. And that means things like online banking and purchasing become vulnerable. It's not just privacy that's at risk if we ban real cryptography; it's our entire system of online money handling.
Plus, of course, you can't really ban real cryptography. It's not exactly trivial to hack together a serious crypto program, but it's not difficult or impossible, and there are plenty of geeks who would do so just for the privacy reason even if they knew their program would also benefit pedophiles. Anyone talking about banning cryptography is just blowing hot air; it can't be done any more than Prohibition could ban alcohol, and for basically the same reason: you can make your own with easily obtained materials.
It's also worth noting that part of the problem is due to the *chan boards and their ilk creating a culture of actual pedophiles coupled with nominally non-pedophile users who post child porn images for the shock value. The result is that child porn is being spread far and wide by the same troll army that spreads Trump propaganda, often by the exact same individuals.
I'm beginning to wonder if we will eventually see the end of online pseudonymity and a mandate that people have their online personae linked to their real ID, at least with law enforcement (which means it will inevitably be hacked and people will be doxxed and stalked and/or killed by right wing terrorists).
Of course then you run into the dark net and the difficulty and/or impossibility of ending it.
Freenet exists, ostensibly, for the use of people in repressive polities to communicate without government interference. It actually has a fairly decent method of assuring true anonymity. It's not quite perfect, but it's prohibitively expensive for any law enforcement to break through that anonymity unless the user does something stupid. Meaning that it contains a lot of child pornography, and it's basically impossible to stop unless you ban all virtual private networking, which isn't really possible because VPNs have a huge number of necessary legitimate purposes.
Putting more resources into finding the people producing the images might be the only thing that's remotely productive. That would necessitate taking police resources away from oppressing minorities so it's really a win/win if we took that approach.
posted by sotonohito at 6:45 AM on September 30, 2019 [6 favorites]
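[A minimal sketch of the hash-library matching sotonohito describes, assuming a SHA-256 exact-match workflow. KNOWN_HASHES is a hypothetical placeholder for the industry-shared lists; note that changing a single byte of a file defeats an exact match, which is why production systems prefer perceptual hashes like the PhotoDNA approach sketched earlier.]
```python
# Minimal sketch of exact-hash matching: fingerprint every file and check
# membership in a set of known hashes. KNOWN_HASHES is a placeholder here.
import hashlib
from pathlib import Path

KNOWN_HASHES = set()  # would be loaded from a shared hash list

def sha256_of(path):
    """Hash a file in 1 MiB chunks so large files don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def scan(root):
    """Yield files under `root` whose exact hash matches a known entry."""
    for p in Path(root).rglob("*"):
        if p.is_file() and sha256_of(p) in KNOWN_HASHES:
            yield p
```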
Hash-based content id certainly isn't a panacea, but it does seem like a powerful and underutilized tool if paired with actual law enforcement. Yes, it can't beat encryption, but people are rarely perfect about using encryption. Comprehensive or even stochastic sampling scans to identify child sexual abuse material when it's transmitted or stored in the clear will capture some fraction of it, and if shared with and subsequently acted on by law enforcement, this should be sufficient to identify and prosecute a reasonable fraction of the individuals producing and propagating this material. Because of the Pareto-distribution nature of this kind of behavior in general (that is, a small number of individuals tend to be responsible for the majority of the material), even this will be enough to dramatically impact the network of child sexual abuse material distributors.
And at the very least it seems like it should be an important tool for preventing the people who have to review these images from being retraumatized by the same images over and over again.
posted by biogeo at 8:28 AM on September 30, 2019 [1 favorite]
Don’t forget that any crackdown also requires having regular, caring people look at these horrible pictures day after day.
I currently work with a former leader of our state police's Crimes Against Children Task Force, and another former member worked here for a while, too -- and they're both still very troubled by having performed that job.
They claimed there was a pail in the operations room for politicians who visited and had to throw up. Whether that's true or not I can't say -- but I feel badly that their assigned duty was several hours daily of looking at CP. No one should be having to scan the stuff hour after hour, to say nothing of the ghouls who trade in it by choice.
Ugh, people are awful.
posted by wenestvedt at 8:31 AM on September 30, 2019 [1 favorite]
There’s a lot of tension that comes from just searching someone’s material when there’s a reasonable suspicion you may encounter CSAM. You sit there, for hours at a time usually, going through the most banal detritus of someone’s life knowing that even if you don’t find it the report that puts you there searching their shit means that this person is probably horrible.
Yeah they’re not convicted, yet, but you heard about what got uploaded from the computer you’re dissecting, so here’s a slideshow: graduations, galas, vacations, weddings. Meanwhile the hammer could drop at any time, which means you will both see something awful and have a lot more work to do.
The last case I was involved in, the perpetrator uploaded the materials to a public blog clearly connected to his real name. He’s no anomaly, I’ve seen a couple others and heard of scores more like him. Normal functioning guy otherwise at that, advanced degree and fairly well-liked by his peers... thought he could post that shit in the open.
I think we need to get really good at dealing with guys like that, technologically and procedurally – because I think there are lots, they may even be the majority – before it makes sense to argue about encryption, VPNs, etc.
posted by Matt Oneiros at 9:00 AM on September 30, 2019 [6 favorites]
Finding this is nowhere near, thank God, my job, but we've had cases of even faculty caught with CSAM on University owned computers. Why would they do that - it's the fastest way to get caught, isn't it? The best theory I have heard is that they wanted it away from the computers their family used so that they didn't get caught by them. I don't know, maybe they are that ignorant, and persuaded themselves that no one would find it in all the traffic and data of our network.
posted by thelonius at 9:28 AM on September 30, 2019 [2 favorites]
Perhaps it's time to eliminate the ability to share images/videos privately, at both the legal and infrastructural level.
It's not.
what's frightening to me is that this suggestion is probably not new at all in the lawmaking realm, and from what I know of the hi-tech world, there's always going to be some sales guy who, if the contract is put out, will basically lie and say, "yeah, my team can do that" ... and then millions of dollars (and immense amounts of effort) later, nothing tangible will have been accomplished. The problem will still exist. The images will still be out there, hanging with the genii they're attached to ...
posted by philip-random at 10:00 AM on September 30, 2019
Cryptography adds a whole new level to the problem.
It's not hard to make an encrypted system that still allows for identifying known images, though: just log the hash (i.e., content fingerprint) of the image before encrypting. You can then compare to the database easily, and get leads on people trading the images. This seems to me like a pretty good compromise; it generates leads for investigation, without putting in a backdoor.
posted by kaibutsu at 10:52 AM on September 30, 2019
thelonius, I had a similar experience in a previous job some years ago--not academia but public-service-adjacent, and none of us could wrap our heads around the fact that the dude in question had chosen to do/save what he was doing on computers which were audited regularly by a government agency. The going theory was basically what you said above--he preferred that risk to the possibility of his wife or kids (YUP) finding it. Thinking about it, and having shared office space with him, makes me want to retch.
The officials who finally caught him had had to secretly observe/track him for months to finally incriminate him sufficiently to make the arrest. I cannot imagine having to witness what he was doing in real time. I am grateful he was caught obviously, but the toll that sort of work must take on investigators and officials is terrible to think about.
posted by peakes at 11:57 AM on September 30, 2019 [3 favorites]
The people quoted in the article say that "those people have always been out there", which is obviously true, but I find myself wondering about the increased availability of images/communities on the internet socializing people into actually perpetrating abuse - people who thought they'd be caught and didn't start until it was normalized, people who saw their horrible urges normalized, etc. When there's a cohesive social world with its own norms, "famous" participants, and durability, it provides a setting for social practices that people might not have done on their own, and it provides ideas for people who go further than they would on their own.
The theme of the present is the realization of the networks, structures and practices which are invisible in everyday bourgeois life but which hold together whole worlds of vile cruelty and suffering under the surface, like evil rhizomes. Whether it's Nazis or pedophiles or the corrupt rich, they're not solitary actors; they have networks to support and enable each other, they have practices that they follow, they have norms. There's the bright daylight world, and if you live there and only there, you don't see this stuff, it might as well not exist - you're safe most of the time, you don't abuse or suffer abuse, when you work you get paid, you have a safe place to live, you don't rip people off or try to hurt them. And then these powerful other worlds. And you know, when you think about it, that all around you must be people who are part of those other worlds, as victims or perpetrators, people who have this bad knowledge. It's like discovering that your house is built on a mass grave.
posted by Frowner at 12:17 PM on September 30, 2019 [9 favorites]
It's not hard to make a an encrypted system that still allows for identifying known images, though: just log the hash (ie, content fingerprint) of the image before encrypting.
I don't quite understand this. Is the child pornographer supposed to run a hash on all his images before encrypting them and then make those hashes available to law enforcement?
posted by Mitheral at 12:27 PM on September 30, 2019
No, but facebook messenger/whatsapp/whatever certainly can. The 44 million images mentioned in the nytimes article were identified by tech platforms. What I'm saying is that it's feasible to have end-to-end encryption in the app with a side channel that allows identifying CSAI. Since only the hash is logged, enforcement doesn't know the content of the communications unless it's already in the database.
posted by kaibutsu at 12:48 PM on September 30, 2019 [2 favorites]
So the database of known images grows only by hashing of confiscated images when an arrest is made, or a server is seized?
posted by thelonius at 12:53 PM on September 30, 2019 [1 favorite]
Exactly. Which I believe is largely how it works today, FWIW.
posted by kaibutsu at 12:55 PM on September 30, 2019 [3 favorites]
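[A minimal sketch of the hash-before-encrypt side channel kaibutsu outlines, under stated assumptions: encrypt_for_recipient and KNOWN_HASHES are hypothetical placeholders, and no real messenger's API is implied. The server relays ciphertext it cannot read but can still recognize fingerprints already in the database, matching the thread's point that enforcement learns nothing about content unless it was already known.]
```python
# Hypothetical sketch: hash-before-encrypt side channel for an E2E messenger.
import hashlib

KNOWN_HASHES = set()  # fingerprints of known abuse imagery, held server-side

def client_send(plaintext, encrypt_for_recipient):
    """Client: fingerprint the image before end-to-end encryption."""
    fingerprint = hashlib.sha256(plaintext).hexdigest()
    ciphertext = encrypt_for_recipient(plaintext)  # server can't read this
    return ciphertext, fingerprint

def server_receive(ciphertext, fingerprint):
    """Server: relay the ciphertext; flag a database match for review."""
    if fingerprint in KNOWN_HASHES:
        return "flag for review"
    return "relay"
```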
the Justice Department has neglected its duties for years, the police are inherently corrupt, unapologetically self-protective, and doggedly committed to systemic racial oppression based on a prohibition-era view of drug enforcement, and non-profits are overwhelmed
FTFY
posted by allkindsoftime at 2:24 PM on September 30, 2019 [1 favorite]
Ah, gotcha, I thought you were talking about using a hash to detect content that has been encrypted on local storage.
posted by Mitheral at 7:10 PM on September 30, 2019
This thread has been archived and is closed to new comments