the EARN IT Act
February 11, 2022 10:46 AM Subscribe
The Washington Post on the EARN IT Act:
"Under the Earn It Act, tech companies would lose some long-standing protections they enjoy under a legal shield called Section 230, opening them up to more lawsuits over posts of child sexual abuse material on their platforms. The bill, which was first introduced in 2020, would also create a national commission of law enforcement, abuse survivors and industry experts to develop best practices to address child abuse online." But the EFF is not a fan:
The EFF: The goal is to get states to pass laws that will punish companies when they deploy end-to-end encryption, or offer other encrypted services. This includes messaging services like WhatsApp, Signal, and iMessage, as well as web hosts like Amazon Web Services. We know that EARN IT aims to spread the use of tools to scan against law enforcement databases because the bill’s sponsors have said so.
Slate
Sex Workers Project (from 2020)
The Verge
A couple days ago I wrote to one of my senators, Patty Murray, and received this in response:
Thank you for contacting me regarding Section 230 of the Communications Decency Act (“Section 230”) and online speech. I appreciate hearing from you on this matter.
As you know, Section 230, created in 1996, states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In practice, Section 230 has played a critical role in ensuring that the internet remains open and accessible to all, ensuring freedom of expression, speech, and innovation on the internet. However, the legal shield for online platforms has also meant that individuals using the internet can at times be victim to exploitation, hate speech, and misinformation that undermines public health, our democracy, and promotes violence.
Recently, Section 230 was modified by a legislative package known as FOSTA-SESTA, which created an exception to give publishers responsibility over third parties found to be posting ads for prostitution on their platforms. It created a pathway for victims of online sex trafficking to hold online platforms accountable for knowingly facilitating trafficking. While the bill was not perfect, it took an important step to curb online trafficking and bring accountability to internet platforms that facilitate criminal activity. Some in Congress are pursuing further reforms. I appreciate how important Section 230 is to the internet and believe any changes warrant scrutiny and robust debate. I’m closely assessing how any proposed legislation could impact free speech online as well as the websites and services Americans use and rely upon every day.
As the Senate considers the EARN IT Act and other Section 230 reforms, please know that I will keep your thoughts in mind. I will continue to work with my colleagues in the Senate to ensure that any changes to Section 230 are used to crack down on illegal and harmful activity and not on content and services protected by the First Amendment. If you would like to know more about my work in the Senate, please feel free to sign up for my updates through the subscribe button below. Again, thank you for taking the time to share your thoughts with me.
Sincerely,
Patty Murray
United States Senator

I would have preferred a firmer, "Yeah, we need to crack down on sex trafficking, but this legislation as written is just awful, so no." This feels like a wishy-washy response to me.
posted by xedrik at 11:01 AM on February 11, 2022 [3 favorites]
I emailed both my senators earlier this week but have not received a response yet. This bill is horrible. Between eliminating Sec 230 protections and killing encryption, it has the very real potential to kill the Internet as we know it.
A question. If Metafilter, Inc. is potentially liable for everything that gets posted here, how long will it take Cortex to shut down commenting on the site after the new laws go into effect?
posted by COD at 11:05 AM on February 11, 2022
I'd take the "the government wants to scan your messages" argument more seriously if there was discussion about the tech industry's private anti-CSAM scanning system which has little oversight and is designed in part as an end run around Constitutional protections.
posted by NoxAeternum at 11:06 AM on February 11, 2022
The destruction of encryption, and the broad availability of these records to US law enforcement, seems like it would violate GDPR, and force a whole bunch of US companies out of the EU entirely, as it would not be possible to comply both with this law and with GDPR.
posted by Eyebrows McGee at 11:24 AM on February 11, 2022 [11 favorites]
A question. If Metafilter, Inc. is potentially liable for everything that gets posted here, how long will it take Cortex to shut down commenting on the site after the new laws go into effect?
It's nearly time, it seems, for MetafilTOR
posted by BigHeartedGuy at 11:25 AM on February 11, 2022 [6 favorites]
Metafilter will not go away, just because Facebook has to be reined in.
posted by They sucked his brains out! at 11:27 AM on February 11, 2022
Why does Facebook have to be reined in in this specific way? Because they are already planning to use the "unintended" consequences against vulnerable groups. Count on it.
I'm less concerned about MeFi the website than I am about pregnant people in Texas right now. They want to put them on a list, did you know that? And they will want to track what they're talking about next. They will.
posted by Horkus at 11:41 AM on February 11, 2022 [5 favorites]
I'd take the "the government wants to scan your messages" argument more seriously if there was discussion about the tech industry's private anti-CSAM scanning system which has little oversight and is designed in part as an end run around Constitutional protections.

Sure, there are bad actors,* but this legislation would effectively make it illegal for companies *not* to snoop.
* I’m not convinced that the tech giants who use anti-CSAM systems actually qualify as bad actors. The systems have been in place for quite some time, and appear to have done far more good than harm.
posted by schmod at 11:44 AM on February 11, 2022 [1 favorite]
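[An aside on how the scanning systems discussed above actually work, since the thread keeps coming back to them: at their core they are hash-matching pipelines. The sketch below is a minimal, hypothetical illustration, not any vendor's real implementation; production systems such as PhotoDNA use proprietary perceptual hashes matched against hash lists maintained by clearinghouses like NCMEC, and every name, value, and function in this sketch is an assumption made up for illustration.]

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for a clearinghouse-maintained hash list of known
# prohibited images. Real lists contain perceptual hashes; this entry is a
# placeholder, not a real hash.
KNOWN_BAD_HASHES: set[str] = {
    "0" * 64,
}


def file_hash(path: Path) -> str:
    """Exact SHA-256 of the file's bytes.

    Production systems use *perceptual* hashes so that resized or re-encoded
    copies still match; an exact cryptographic hash keeps this sketch short
    but would miss trivially modified copies.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_upload(path: Path) -> bool:
    """Return True if the uploaded file matches a known hash.

    A platform doing server-side scanning runs a check like this on every
    upload and reports matches to the relevant authorities.
    """
    return file_hash(path) in KNOWN_BAD_HASHES
```

The policy question running through the thread is whether EARN IT's liability rules would effectively pressure providers to run a check like scan_upload() on content they currently cannot read at all, i.e. end-to-end encrypted messages.]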
FOSTA-SESTA is great for crushing sex workers. Not much else.
posted by j_curiouser at 11:55 AM on February 11, 2022 [2 favorites]
I'd take the "the government wants to scan your messages" argument more seriously if there was discussion about the tech industry's private anti-CSAM scanning system which has little oversight and is designed in part as an end run around Constitutional protections.
I'm a little confused by this take. One of the major concerns about this bill is that it could effectively make it mandatory for companies to implement this kind of scanning to avoid legal liability. If it's already a privacy issue that many of them do it, that doesn't make the bill a better idea - it just lowers the stakes somewhat.
posted by atoxyl at 12:01 PM on February 11, 2022 [3 favorites]
One of the major concerns about this bill is that it could effectively make it mandatory for companies to implement this kind of scanning to avoid legal liability.
Given that most of the major players are doing this already, it feels like a distinction without a difference. Not to mention that the current system was designed in part to get around the Fourth Amendment (and this is used as an argument in opposition, since the requirement would make that aspect explicit to the point that the courts would no longer be able to turn a blind eye).
posted by NoxAeternum at 12:13 PM on February 11, 2022
Sort of a meta-comment (which the mods can feel free to remove if it's better handled in Talk): I appreciate anti-paywall links as a way to ensure everyone can productively participate in the discussion, but should the anti-paywall link (12ft.io) really be the main one? As a paying WaPo subscriber, I don't mind them getting my click, which presumably drives some of their decisions around what content readers are interested in, etc.
Maybe we could have something like:
The Washington Post (no-paywall link) on the EARN IT Act...

Just spitballing. Off to read the article and the bill.
posted by Kadin2048 at 12:36 PM on February 11, 2022 [9 favorites]
they are already planning to use the "unintended" consequences against vulnerable groups
2024:
...opening them up to more lawsuits over posts of ~~child sexual abuse~~ abortion material on their platforms.
2026:
...opening them up to more lawsuits over posts of ~~child sexual abuse~~ social justice material on their platforms.
2028:
...opening them up to more lawsuits over posts of ~~child sexual abuse~~ democracy material on their platforms.
posted by CynicalKnight at 12:46 PM on February 11, 2022 [13 favorites]
but should the anti-paywall link really be the main one?
Yes. For any given content, the majority of us are not going to be subscribers to that site. The "main link" should be the one that is most accessible to all, with a "click here if you want the full WaPo experience" being the additional link.
posted by explosion at 1:51 PM on February 11, 2022 [1 favorite]
Some good info here from Techdirt - The Top Ten Mistakes Senators Made During Today's EARN IT Markup
posted by COD at 2:06 PM on February 11, 2022 [2 favorites]
Okay, so having read the WaPo articles and the text of the bill itself, here's my hot take:
S.3538 - Section 1: Title. Can we just stop with these terrible backronyms, already? Sometimes I think the Revolution was a mistake, just because we have to deal with these fucking awful names, while the British at least get sensibly-named (if not sensibly-drafted) stuff like the "Online Safety Bill 2022" or whatever.
Section 2: Definitions. Boooring.
Section 3: National Commission on Online Child Sexual Exploitation Prevention. This is the stated purpose of the bill, the creation of a 19-member (why 19? who knows, maybe we just like prime numbers) commission, consisting of the US AG, the Secretary of Homeland Security (nope, not Orwellian at all), the Chairman of the Federal Trade Commission (kinda makes sense given the regulatory history of the Internet, but weren't we moving away from this towards multilateral Internet governance?), and 4 each nominated by the Senate Majority Leader, Sen Minority Leader, Speaker of the House, and minority leader of the House. This strikes me as a rather generous power-sharing arrangement towards the minority party. Then there's a bunch of stuff about terms (5 years), and pay (no pay but you get Fed per diem, so that's nice I guess).
Section 4: Duties of the Commission. So once the committee gets populated by our best-and-brightest, they have 18 months to develop and deliver a "best practices" report, with a bunch of requirements that basically read like a grading rubric for a term paper. At least 14 of the 19 members have to concur with the final version, which sounds like a real nightmare of a group project, although it's not clear what happens if 18 months passes and they can't reach agreement. Maybe they all get an incomplete and have to take summer school?
Section 5: Protecting Victims of Online Child Sexual Abuse. This is where we get into the meat of the issue.
First, under "NO EFFECT ON CHILD SEXUAL EXPLOITATION LAW", it creates a new civil liability on "interactive computer service[s]" for violations of 18 USC 2252, which covers visual materials of "a minor engaging in sexually explicit conduct" in basically any form, or 18 USC 2252A, which covers "child pornography" (which is later changed to "child sexual abuse material") more generally—including "obscene visual depiction[s] of a minor engaging in sexually explicit conduct" regardless of whether an actual minor was involved in its creation.
(When I took Con Law in undergrad, it was seemingly assumed that the laws related to "obscene" materials that don't originate from actual, prohibited conduct—i.e. pornography not created by actually photographing a minor in a sexual situation—were probably unconstitutional and ripe for being overturned. I do not think that is a safe assumption anymore, given the current Court and its probable direction. So "obscene" content which doesn't necessarily involve a criminal act in its creation, e.g. hentai, might be covered under 2252A, although I don't know quite how you'd show damages. N.B. this is also a real issue for people in the sex industry who may have a youthful appearance, despite being legally of age.)
Then, under "ENCRYPTION TECHNOLOGIES", it says: "none of the following actions or circumstances shall serve as an independent basis for liability of a provider of an interactive computer service[…]: The provider utilizes full end-to-end encrypted messaging services, device encryption, or other encryption services[,] The provider does not possess the information necessary to decrypt a communication[, or] The provider fails to take an action that would otherwise undermine the ability of the provider to offer full end-to-end encrypted messaging services, device encryption, or other encryption services."
This last bit appears to be a sop to the objections raised by privacy advocates and the tech industry. Basically—my reading, anyway—is that it doesn't let you go after an information service solely because it's end-to-end encrypted, but courts are free (and this is explicitly set out in the following sentence) to consider the usage of encryption in determining the liability of an information service.
Then the rest of the bill basically does a find/replace on "child pornography" with "child sexual abuse material", I guess because as long as they have the hood open, so to speak, they might as well update it, and makes some changes to how long the "Cyber Tipline" can retain data. It's unclear what real-world effect these parts have.
Overall: it's unclear—perhaps by design—what effect S.3538 would have on the average Internet user. It seems likely to make major Internet platforms more cautious in their approach to anything that might be classified after the fact as CP/CSAM, but how they will actually react probably depends on how the first few test cases actually go, when/if it gets signed into law. Thinking through a few likely hypotheticals, I'm not sure what sort of new liability would be created on, say, Apple Inc., if Jill (a minor) takes a nude of herself and sends it to Jack (also a minor), who then sends it onward to a bunch of people (involuntary pornography) via iMessage (which is E2EE). It seems like Jill now has the ability to go after Apple for damages in addition to Jack, but in doing so Jill herself would seem to be admitting to a violation of 18 USC 2252, so I'm not sure how that would work.
It doesn't seem like it would necessarily ban hard crypto or shut down a site like Metafilter overnight, but it certainly seems like it could have a chilling effect on what content platforms will allow.
posted by Kadin2048 at 2:07 PM on February 11, 2022 [8 favorites]
I haven't had the chance to think about this much, but I wonder at what point this results in Fourth Amendment violations.
If FedEx (a private company) opens your package on its own initiative, there's no Fourth Amendment violation because there's no state action involved. But if the govt legally compels FedEx to open your package, that's a search under the Fourth Amendment, and the legal protections that flow from that (such as they are) now come into play.
(Oh I see now the Slate article raises this point.)
It may be that the politicians behind this are well aware of the issue but think SCOTUS will have their back on this.
posted by mikeand1 at 5:57 PM on February 11, 2022
Re the Slate piece: Anyone who thinks Fourth Amendment violations result in a "get out of jail free card" for defendants has never actually litigated a motion to suppress. Evidence is very rarely thrown out, even in cases of blatant Fourth Amendment violations. It's only on TV that judges routinely throw out incriminating evidence.
posted by mikeand1 at 7:15 PM on February 11, 2022 [1 favorite]
Thinking about the fanfiction and fanart wars around "aging up" characters, in other words writing your own story loosely based on the original material in which everyone is clearly of age and having sex with each other, and how the brigade of online harassers who use "age-up" as an excuse for their harassment would interpret this law. This is a group that already calls stories where, for instance, two characters who were originally 14-year-old shounen manga characters are now 30-year-old widowers who meet at a support group for grieving spouses "CSAM," because words don't mean anything, apparently.
It seems like this law, which doesn't care about who is actually harmed by the supposedly obscene content, would open the door to grudge-reporting all kinds of material that is technically illegal but largely ignored by the community for being harmless. I mean, think about how much sexy stuff about teens there is online, or all the lolita stuff. This would seem to create a law under which pretty much all content providers will at any time be guilty of SOMETHING. So now there is a legal route to go after whoever you like and arbitrarily get your enemies' websites removed. This is exactly what happens right now in the spaces I'm talking about.
posted by subdee at 5:47 AM on February 12, 2022 [3 favorites]
Also 100% trouble for all the websites run on volunteer labor and donations that don't have the deep pockets to shrug off the lawsuits, or the army of coders to automate some stupid keyword-based filtering system. Like how would AO3, for instance, a site run on donations and volunteer labor, even begin to comply with this law? Probably they'd just have to close down the entire site.
posted by subdee at 6:00 AM on February 12, 2022 [1 favorite]
I don't know; in general I think most websites SHOULD have more moderation and responsibility for what is posted on their site. I just don't have enough trust that this law would be written effectively, or that, especially with the long timeframe for the committee to make its recommendations, it wouldn't be taken over by a conservative agenda. And especially with the "CSAM" angle, because from where I'm standing online, CSAM almost never actually means CSAM.
posted by subdee at 6:06 AM on February 12, 2022 [1 favorite]
"EARN IT Act was introduced [February 1st]! And it’s already scheduled to get marked up, which is the first step post-introduction. Most bills never go to markup, so this means they are putting pressure to move this through. IF YOU LIVE IN THESE STATES (IL, VT, CA, RI, MN, DE, CT, HI, NJ, and GA), CONTACT YOUR SENATOR AND HOUSE MEMBERS NOW. THIS IS URGENT. This is who gets first crack, and folks in all have Senators who are on the Judiciary committee.
From this anime tumblr, which also has details on who you should contact, and how, to block this:
https://fullhalalalchemist.tumblr.com/post/675056231663190016/urgent-earn-it-act-is-back-in-the-senate
posted by subdee at 11:00 AM on February 11, 2022 [3 favorites]