“I’d rather that everyone… could just stay obscure”
October 2, 2015 10:39 AM

[T]here are immediate practical benefits to trolling. The way we’ve designed the Internet has made the old cliché “There’s no such thing as bad publicity” actually come true. It’s now possible to monetize any kind of attention, good or bad—and if your gift happens to be generating the bad kind of attention, then it’s well within reach to make trolling into a full-time career.
Arthur Chu writes about “The Big Business of Internet Bigotry” for The Daily Beast.
posted by Going To Maine (81 comments total) 18 users marked this as a favorite
 
This isn’t one-to-one related, but there’s a hint of a connection here: at the same time that this piece went up, Chu was drawing fire for suggesting that Section 230 of the Communications Decency Act should be repealed, thus making websites liable for their commenters’ postings. (The EFF considers Section 230 of the CDA “the most important law protecting Internet speech”.)
posted by Going To Maine at 10:45 AM on October 2, 2015 [12 favorites]


I have to admit, I often wonder just how easy it might be to pull in a reasonable side income simply by playing the most far-right-wing blogger I can imagine online.
posted by Thorzdad at 10:55 AM on October 2, 2015 [19 favorites]


Stephen Colbert (character) is rolling in his grave.
posted by FJT at 10:57 AM on October 2, 2015 [1 favorite]


Techdirt had a few comments recently on Chu.
posted by COD at 11:04 AM on October 2, 2015 [3 favorites]


A few years ago I would have said that argument was rock solid. Now, I just don't know. Frankly if you're intentionally providing a platform for hate under the guise of being a free speech advocate I have no time for you.
posted by Artw at 11:12 AM on October 2, 2015 [1 favorite]


So many websites believe it has to be an all-or-nothing proposition when it comes to being a platform for speech. Just look at Reddit, 4chan, etc., where the only limits are the ones that'll get the law breathing down their necks, to see how well that works out. It's incumbent upon website operators to create the community they want to see through moderation---human moderation. We don't need to overturn Section 230 to have that. We just need website operators to care.
posted by SansPoint at 11:20 AM on October 2, 2015 [1 favorite]


We don't need to overturn Section 230 to have that. We just need website operators to care.

But Section 230 says they don't have to care. Which is the whole problem.
posted by NoxAeternum at 11:22 AM on October 2, 2015 [9 favorites]


It's incumbent upon website operators to create the community they want to see through moderation---human moderation. We don't need to overturn Section 230 to have that. We just need website operators to care.

I imagine most site owners want the maximum traffic for the minimum cost, so when you're looking at adding a comments section with the hope of it being an active draw to your site, you don't want it to cost you (much) money.

Good moderators cost good money, so most sites don't so much establish communities as open up cesspools for people to dump their toxic thoughts.
posted by filthy light thief at 11:22 AM on October 2, 2015 [3 favorites]


A few years ago I would have said that argument was rock solid. Now, I just don't know. Frankly if you're intentionally providing a platform for hate under the guise of being a free speech advocate I have no time for you.

To whom do we want to give the power to distinguish between "platform for hate" and "platform for free speech", and to ban one but not the other? I would not want to serve as the arbiter of free speech for other people, nor would I want to give that power to anyone else.

The First Amendment is absolute, and while companies and organizations certainly can, and should, moderate as they please on sites owned by them, there's a place for "no-holds-barred free speech zones" as well. Subject to the usual laws, of course (i.e. no illegal content, no personal threats of death or harm, etc).

The internet is a big place - there's room for everyone.
posted by theorique at 11:24 AM on October 2, 2015 [5 favorites]


I too would like to believe that bigotry has its roots in soil as superficial and as recently laid down as the love of money.

But it just isn't true; bigotry is a sovereign evil that goes back as far as we can see, and explains greed far better than greed explains it.
posted by jamjam at 11:24 AM on October 2, 2015 [3 favorites]


One more paragraph from the article, just to emphasize its ambivalence:

And while I despise the [Joe The Plumber’s] politics I can’t blame him, exactly. I know how it feels to suddenly go viral. I know what he means when he says that John McCain turning him into a folk hero “really screwed [his] life up.” After becoming a folk hero, it gets hard to go back to work at a regular 9-to-5 job, especially one at an auto plant where you’re surrounded by blue-collar union Democrats who all know who you are.
posted by Going To Maine at 11:26 AM on October 2, 2015 [1 favorite]


For Reddit it's the money, for the Chans... Who knows what the fuck that is about.
posted by Artw at 11:28 AM on October 2, 2015


The internet is a big place - there's room for everyone.

Except for the people who feel unsafe there because of the harassment and attacks they face for being who they are.

But I guess they are the collateral damage of free speech.
posted by NoxAeternum at 11:30 AM on October 2, 2015 [29 favorites]


Interesting point:
It’s gotten so bad that everyone assumes a troll motive as soon as a story breaks. Professional trolls like Sarah Palin, Bill Maher, and Richard Dawkins—who all spend a lot of time on social media stirring the pot to get mentions and clicks—pounced on 14-year-old Ahmed Mohamed as someone whom they think antagonized authority figures for nothing but attention. In other words, they claimed him as one of their own.

Of course, the idea that any 14-year-old black Muslim kid intentionally got himself in huge trouble on the off chance the story might go viral is Alex Jones-level delusional...
I hadn't really connected the "He's only doing it for attention" claim to the fact that the people making it live their own lives that way.
posted by jaguar at 11:32 AM on October 2, 2015 [30 favorites]


I imagine most site owners want the maximum traffic for the minimum cost, so when you're looking at adding a comments section with the hope of it being an active draw to your site, you don't want it to cost you (much) money.

Yes and no. Reddit appears to be honestly trying to make money, but I'm not sure how well their business model is doing. Chan sites seem to be a way to give yourself a big hosting bill, and potential legal problems, while making minimal money relative to the headache. They seem to exist for a reason other than making someone rich.

Except for the people who feel unsafe there because of the harassment and attacks they face for being who they are.

But I guess they are the collateral damage of free speech.


Well, no - we should help them out as well. In those cases, there are pre-existing laws that prohibit personal threats. Just as you aren't allowed to walk into someone's workplace and threaten to kill them, you aren't allowed to do that online. The challenge is identifying who issued the threats, while still maintaining the right to privacy and anonymity.
posted by theorique at 11:38 AM on October 2, 2015 [2 favorites]


"Except for the people who feel unsafe there because of the harassment and attacks they face for being who they are.

But I guess they are the collateral damage of free speech."

Now they'll feel extra unsafe because their safe spaces online will now be legally liable for anything that happens on there. They'll be at the mercy of the strictest laws in the country. If a state makes giving out abortion information a crime then they'll end up being legally liable, for example.
posted by I-baLL at 11:39 AM on October 2, 2015 [7 favorites]


The First Amendment is absolute

Boy howdy, no it isn't.
posted by maxsparber at 11:40 AM on October 2, 2015 [32 favorites]


The challenge is identifying who issued the threats, while still maintaining the right to privacy and anonymity.

"You have every right to pursue people who abuse you, but we will make exercising that right functionally impossible."

And people wonder why there's so much questioning of online anonymity.
posted by NoxAeternum at 11:46 AM on October 2, 2015 [12 favorites]


The most important thing about modernity is that it is created by ideology. From the material basis of it which is its visible mode (created by the ideologies of mass production, of productivity improvement, the joint-stock corporation) to the intellectual creations it demands and requires and makes possible and makes impossible.

Among the ideologies that were chosen long ago is that the freedom of speech is paramount. The strength of this ideology has grown as modernity has grown in tandem: a hundred years ago there were systematic censorship boards in the US enacting prior restraint, but this is unimaginable today. And the censorship that goes on today, in the books in school and so on, will probably be unimaginable tomorrow.

Mass production has claimed its grand share of blood and its mountains of skulls. Productivity improvement has shriven its share of souls. Non-principal management has materially harmed a thousand million lives, enshrouded them in a life with materially less hope and joy, and the joint-stock corporation more. But we paid a lot and gained a lot. With mass production, we have the possibility of all dying, in industrial war which culminates in holocaust: without mass production, we have merely the certainty of all starving to death.

Do you think that the cessation of the freedom of speech will actually stop hate speech on the Internet? I think it more likely to lead to the certainty of it, backed up by cryptographical methods and mean brutishness and the sheer easiness of anonymous creation. Have there been attempts at state repression of speech which actually succeeded? How about societal attempts at repression of hate speech? If people who had the power of arbitrary execution - people who could say to someone, "You will now die." without another soul in the world countermanding them, and have other people kill those people - failed utterly, how is it that this will succeed? If people who had all the power of love and hope and respect and the spirit of all humankind failed utterly, how is it that this will succeed?

It is not often remembered that the quotation, "I disapprove of what you say, but I will defend to the death your right to say it", was not from Voltaire, but a biography of Voltaire, written by one Evelyn B. Hall.
posted by curuinor at 11:47 AM on October 2, 2015 [5 favorites]


Freedom of speech for who on the Internet?

Because I doubt many of these people who are abused and harassed online feel like they have free speech.
posted by NoxAeternum at 11:49 AM on October 2, 2015 [5 favorites]


"Except for the people who feel unsafe there because of the harassment and attacks they face for being who they are."

Just how much liability would metafilter face if sites were liable for comments? How much would it increase the mods' work (and their anxieties)?

You know that this kind of law is almost always used most and hardest against marginalized people. It will be used maliciously by the right and/or miscellaneous trolls against feminist sites, POC-focused sites, etc. Any time any trans person mentions the mere idea that some people out there sometimes get hormones extralegally, etc., etc.

And who will have the time and money to fight these cases? Right wing sites with big money backing them, that's who. The right will be able to get away with a lot of bad behavior while the left gets shut down - that's how it always goes, because the courts, the legislature and the mainstream media are basically right wing.

The "is this free speech" thing isn't relevant; it's "who has the power to use this act and get it enforced". Consider how anti-terrorism laws are used against people of color and activists but generally not against white power/right wing types. Consider how "stand your ground" has been interpreted when the person standing their ground is a black woman. The more there's blanket liability for something, the more that will be used against marginalized people and the places where they congregate.
posted by Frowner at 11:52 AM on October 2, 2015 [46 favorites]


Now they'll feel extra unsafe because their safe spaces online will now be legally liable for anything that happens on there. They'll be at the mercy of the strictest laws in the country. If a state makes giving out abortion information a crime then they'll end up being legally liable, for example.

Except that giving out abortion information is protected speech (and there's a lot of case law on that.)

But the root of the problem is this - Section 230 is treated as a blanket exemption.
posted by NoxAeternum at 11:53 AM on October 2, 2015 [2 favorites]


Freedom of speech for who on the Internet?

Because I doubt many of these people who are abused and harassed online feel like they have free speech.


A lot of the harassment is expressly for the purpose of silencing people, in fact.
posted by Artw at 11:53 AM on October 2, 2015 [4 favorites]


But I guess they are the collateral damage of free speech.

Yes. They are the collateral damage. No one said the cost of free speech was cheap, only that it's worth it. If all speech were inoffensive, we wouldn't need the First Amendment. That means you aren't going to feel comfortable visiting some sites, but I'm curious: is anyone comfortable visiting every site on the net? I know I'm not, yet I'd rather not visit sites which disgust me than have those sites regulated into nonexistence.

Note, I'm not talking about instances where specific individuals are targeted for harassment. Writing that you think someone is an idiot is not the same thing as posting their home address or work address. That's a distinctly different situation.
posted by Beholder at 12:01 PM on October 2, 2015 [2 favorites]


Is anyone on the SJ end (as opposed to the corporate interests/moral panic end, I guess?) actually taking the Repeal 230 thing seriously? As in, are people organizing? Even Chu admits that TC published that as outrage clickbait and I find it really, really hard to believe that there's an actual movement of people wanting to take sledgehammers to one of the primary buttresses of what we consider to be the web.
posted by griphus at 12:02 PM on October 2, 2015 [3 favorites]


A lot of the harassment is expressly for the purpose of silencing people, in fact.

God yes. For all their complaints that Anita Sarkeesian is somehow trying to censor the internet or video games, 90 percent of her respondents pretty much are just saying "Shut up," many of them literally.
posted by maxsparber at 12:02 PM on October 2, 2015 [13 favorites]


Yes. They are the collateral damage. No one said the cost of free speech was cheap, only that it's worth it. If all speech were inoffensive, we wouldn't need the First Amendment. That means you aren't going to feel comfortable visiting some sites, but I'm curious: is anyone comfortable visiting every site on the net? I know I'm not, yet I'd rather not visit sites which disgust me than have those sites regulated into nonexistence.

Because the issue is that people are discomfited.

Not that they are having nude pictures of themselves placed online nonconsensually.

Not that they are being harassed in the modern day equivalent of the public square constantly.

Not that they are being targeted by coordinated efforts to harass and abuse them in the physical world.

Let's actually be honest about what is going on.
posted by NoxAeternum at 12:06 PM on October 2, 2015 [14 favorites]


Note, I'm not talking about instances where specific individuals are targeted for harassment. Writing that you think someone is an idiot is not the same thing as posting their home address or work address. That's a distinctly different situation.

And yet to Section 230, they are the same thing.
posted by NoxAeternum at 12:08 PM on October 2, 2015 [6 favorites]


Is anyone on the SJ end (as opposed to the corporate interests/moral panic end, I guess?) actually taking the Repeal 230 thing seriously?

I do, though I don't want to repeal it completely. I just want it to stop being treated as a blanket exemption.
posted by NoxAeternum at 12:10 PM on October 2, 2015


Let's actually be honest about what is going on.

I added a second paragraph explaining that harassing specific individuals, by exposing personal information about them, should not be considered free speech. Nude photos of ex-lovers would be an example of that. Posting phone numbers or home addresses would be another.

A way to sidestep much (not all) of this problem is to implement a universal ignore list for posters, topics, and websites. Now that would improve the internet, culture, society, and politics far more than speech codes would. This is something I wish progressives would really get behind or anyone who doesn't like their buttons being pushed for page clicks.
posted by Beholder at 12:16 PM on October 2, 2015


"Except that giving out abortion information is protected speech (and there's a lot of case law on that.) "

That was just a random example, but the fact that there's case law is my point. It's been to trial before. So every time a stupid, unconstitutional law passes, sites will have to crack down on speech or risk fines and/or imprisonment.
posted by I-baLL at 12:18 PM on October 2, 2015


"I do, though I don't want to repeal it completely. I just want it to stop being treated as a blanket exemption."

It should be a blanket exemption. Go after the harassers and abusers, not after ISPs and hosting companies.
posted by I-baLL at 12:19 PM on October 2, 2015 [8 favorites]


A way to sidestep much (not all) of this problem is to implement a universal ignore list for posters, topics, and websites. Now that would improve the internet, culture, society, and politics far more than speech codes would. This is something I wish progressives would really get behind or anyone who doesn't like their buttons being pushed for page clicks.

It is literally technologically impossible without completely abolishing modern networking and redesigning the internet (from ISPs to servers) as a wholly-owned subsidiary of the government.
posted by Pope Guilty at 12:20 PM on October 2, 2015 [8 favorites]


A way to sidestep much (not all) of this problem is to implement a universal ignore list for posters, topics, and websites. Now that would improve the internet, culture, society, and politics far more than speech codes would. This is something I wish progressives would really get behind or anyone who doesn't like their buttons being pushed for page clicks.

This regularly gets brought up in discussions about Reddit. It's basically saying that as long as you don't see the toxic waste, there's nothing bad happening.

And again, from the standpoint of Section 230, all of those are the same.
posted by NoxAeternum at 12:22 PM on October 2, 2015


I do, though I don't want to repeal it completely. I just want it to stop being treated as a blanket exemption.

Can you discuss in a little more detail how you would like it to function?

Now “Don’t go on the Internet” is as ridiculous advice as “Don’t use the telephone” would’ve been in 1996, or “Don’t use the mail” would’ve been in 1916 — to sever oneself from Internet services would mean severing oneself from where most social interaction and economic activity takes place.

It's interesting that Chu uses these examples, because my understanding is that Section 230 ensures that web platforms will be treated the way telephone companies and delivery services are treated. If someone harasses you by phone, I don't think you can typically sue the phone company, and if someone sends you a death threat by FedEx, I don't think you'd get very far with a suit against FedEx.
posted by layceepee at 12:24 PM on October 2, 2015 [11 favorites]


And besides, you don't have to see somebody calling the cops on you for a SWAT team to bust down your door, or have to see somebody posting your nudes for naked pictures of you to be all over the net.

Or perhaps you're thinking that the gamergaters are going to put Anita Sarkeesian on ignore and then they won't have a reason to hassle her? Because if so then you radically misunderstand online misogyny.
posted by Pope Guilty at 12:24 PM on October 2, 2015 [5 favorites]


It should be a blanket exception. Go after the harassers and abusers, not after ISPs and hosting companies.

And what happens when the ISPs and hosting companies make it impossible to do so intentionally?

Or what about when they exert ex ante editorial control? (And before you say they would be held liable, let me point out that was what happened in Batzel - and the court ruled that Section 230 granted indemnity.)

There's a lot of bad behavior that the blanket exemption protects.
posted by NoxAeternum at 12:29 PM on October 2, 2015


No one said the cost of free speech was cheap, only that it's worth it.

It's worth it to the wealthy straight white males that the net was made for, who have the power and privilege to not be harassed and threatened.

Your argument is basically "Shut up and take it, because it works fine for me and mine."
posted by happyroach at 12:32 PM on October 2, 2015 [14 favorites]


Freedom of speech for who on the Internet?

Because I doubt many of these people who are abused and harassed online feel like they have free speech.


"Freedom of Speech" is not a guarantee that everyone will get to speak. It is a belief that giving governments power to regulate speech is usually a disaster.

If Chu's proposals were implemented, do you really think those powers would be used to protect the current targets of abuse more than they would be used by powerful people and corporations to shut down criticism?
posted by straight at 12:34 PM on October 2, 2015 [13 favorites]


Can you discuss in a little more detail how you would like it to function?

Any editorial control by the host ex ante to the posting of content should preclude Section 230 indemnity.

Intentional obfuscation of user information to preclude allowing a complainant to be able to pursue a tort against an end user should also preclude Section 230 indemnity.

That's two examples of where the blanket should be pulled back, in my opinion.
posted by NoxAeternum at 12:34 PM on October 2, 2015


In retrospect I shouldn’t have made my first comment because it’s derailed this thread to hell. Discussion about forced monetization of having gone viral isn’t the same as discussion of section 230, but here we are.
posted by Going To Maine at 12:41 PM on October 2, 2015 [2 favorites]


It is literally technologically impossible without completely abolishing modern networking and redesigning the internet (from ISPs to servers) as a wholly-owned subsidiary of the government.

How about a filter that blocks certain people or subjects? The sites that didn't allow the browsing filter would sink in popularity and the sites that did allow the filter would rise in popularity. No one would be forced to change their site, but with so much competition for readership, many would, and the profit incentive for a click-hungry troll would go down as the number of people putting them on ignore goes up.

This regularly gets brought up in discussions about Reddit. It's basically saying that as long as you don't see the toxic waste, there's nothing bad happening.

Toxic waste would still be there, but not as much of it would be produced, because the ignore list would reduce the cash incentive to troll.
posted by Beholder at 12:44 PM on October 2, 2015


If Chu's proposals were implemented, do you really think those powers would be used to protect the current targets of abuse more than they would be used by powerful people and corporations to shut down criticism?

I know this is a popular argument, and there are many examples of government overreach on the question of free speech, but, honestly, that's not an argument against attempting to regulate abusive speech. It has always been illegal to shout fire in a crowded theater, and I have yet to see the government round up lefties under this law. In the meanwhile, the freedom to assemble, to have your own political views, and to speak them did nothing to protect communists during HUAC.

Right now we have a circumstance where the web is being used for speech that, in any other environment, is criminal, and yet it is impossible to prosecute, because of section 230. We need to make a better law, and not be afraid to do so because of the possibility it will be enforced in a way we don't like.
posted by maxsparber at 12:51 PM on October 2, 2015 [3 favorites]


It's not impossible to prosecute hate speech online.

It is just difficult, partially because of site operator apathy, partially because of law enforcement apathy, and partially because of technical obstacles.
posted by SansPoint at 12:54 PM on October 2, 2015 [2 favorites]


Any editorial control by the host ex ante to the posting of content should preclude Section 230 indemnity.

Would this mean that if you moderate at all you're liable and if you don't you're not? Or if you run a site where anyone can post anything you're not liable, but if you curate content you're liable? How would this impact, e.g., metafilter? What happens when one of us goes off the rails and a post or comment that looked innocuous if weird/unpleasant at the time now seems like it should have been a tip-off that something really bad was going to happen?
posted by Frowner at 12:59 PM on October 2, 2015 [2 favorites]


It's not impossible to prosecute hate speech online.

It is just difficult


Well, that's splitting hairs. Tell me one person who has been prosecuted for threatening murder against Anita Sarkeesian on Twitter. Tell me one person who has been prosecuted for doxxing Lindy West. The only people ever to suffer for posting revenge porn are Hunter Moore, Kevin Bollaert, and Casey Meyering, all prosecuted for crimes that are incidental to posting stolen nudes.

It is difficult to the point where it might as well be impossible.
posted by maxsparber at 12:59 PM on October 2, 2015 [3 favorites]


It has always been illegal to shout fire in a crowded theater, and I have yet to see the government round up lefties under this law...Right now we have a circumstance where the web is being used for speech that, in any other environment, is criminal, and yet it is impossible to prosecute, because of section 230. We need to make a better law, and not be afraid to do so because of the possibility it will be enforced in a way we don't like.

At least in the United States, you're confusing criminal and civil law. Section 230 prevents lawsuits, not arrests. And when it comes to civil cases, we have very good reason to believe that rich people would mostly win and poor people would mostly lose.

How many lawsuits does MetaFilter have the funds to defend itself against?
posted by straight at 12:59 PM on October 2, 2015 [3 favorites]


Any editorial control by the host ex ante to the posting of content should preclude Section 230 indemnity.

Wouldn't this mean that any effort by a host to filter the most egregiously harmful comments would preclude recourse to Section 230, while making no effort whatsoever to remove offensive posts would provide protection from legal action?

Intentional obfuscation of user information to preclude allowing a complainant to be able to pursue a tort against an end user should also preclude Section 230 indemnity.

Does this mean that any website or blog that was contacted by someone who complained about a post or comment would have to share information about the identity of the author or lose the protection provided by Section 230?

It has always been illegal to shout fire in a crowded theater, and I have yet to see the government round up lefties under this law.

Per my understanding, the example of not allowing someone to shout fire in a crowded theater was coined by Oliver Wendell Holmes in a decision holding that a speech in opposition to the draft during World War I was not entitled to protection under the First Amendment. So if you haven't seen the government using it to round up lefties, it's because you weren't looking.
posted by layceepee at 1:00 PM on October 2, 2015 [7 favorites]


Any editorial control by the host ex ante to the posting of content should preclude Section 230 indemnity.

Would this mean that if you moderate at all you're liable and if you don't you're not? Or if you run a site where anyone can post anything you're not liable, but if you curate content you're liable? How would this impact, e.g., metafilter? What happens when one of us goes off the rails and a post or comment that looked innocuous if weird/unpleasant at the time now seems like it should have been a tip-off that something really bad was going to happen?


ex ante - preceding.

So no, none of that would fall under that.
posted by NoxAeternum at 1:01 PM on October 2, 2015


Intentional obfuscation of user information to preclude allowing a complainant to be able to pursue a tort against an end user should also preclude Section 230 indemnity.

I am not a lawyer, but you need at the very least a name and an address to serve process, right? Considering the unceasing parade of hacks/leaks from companies with gold-plated IT and security departments, I'm not sure that trusting literally every dinky communications platform on the web to securely store this information for every user is in anyone's interest.

The other thing is: where would the liability for verifying the information provided sit? If it isn't verified, then this is no more a useful solution than having a database full of Deez Nuts and Seymore Butts. If the answer is "in the hands of the platform owner," something like this would put the personal information of however many people at risk of doxxing from poor security. And while maybe you can prosecute the platform owner for the breach, it's closing the barn door after the horse has bolted.
posted by griphus at 1:01 PM on October 2, 2015 [3 favorites]


So any editorial control before the posting of content - by that you would mean if I wrote a post and the mods edited it, they would not be indemnified, but if they suggested ways in which I could edit it, they would be? And if someone posted a crappy comment which was allowed to sit, the site would be indemnified, but if the comment was partially edited the site would not be?
posted by Frowner at 1:07 PM on October 2, 2015 [1 favorite]


There could be some sort of safe harbor included. Like you have to have a written code of conduct/acceptable posting standards that users must agree to, have a staff member monitoring the comments 24/7 (or at all times that comments are open to submission), track IP addresses of commenters behind the scenes and maintain records of this for X days in Y manner, have a set procedure for responding to claims of harassment/abuse (adopted and actually followed), etc.

We already do this for DMCA violations, it wouldn't be that hard to say "if you want to avoid liability for third party comments on a site you host, you need to follow these procedures." It's not perfect by any means (and under DMCA is definitely used way more by powerful corporations than marginalized individuals), but it at least provides some avenue for legal recourse while still providing some level of protection for the service providers.
posted by melissasaurus at 1:11 PM on October 2, 2015 [3 favorites]


Frowner: Brianna Wu has made it clear that there is an active investigation by the FBI with regards to the threats against her. I think Zoe Quinn has made similar statements as well. Yes, no prosecutions yet, but these things take time, particularly when the people perpetrating them are hard to pin down due to the nature of the platforms.

By the way, when even Zoe Quinn is upset at the suggestions being floated around to stop cyberviolence, you're doing it wrong.
posted by SansPoint at 1:14 PM on October 2, 2015 [6 favorites]


Per my understanding, the example of not allowing someone to shout fire in a crowded theater was coined by Oliver Wendell Holmes in a decision holding that speech in opposition to the draft during World War I was not entitled to protection under the First Amendment.

Not exactly. Holmes held that passing out pamphlets specifically encouraging a crime -- refusing to participate in the draft -- was not protected by the first amendment. There were literally thousands of cases at this time that were the result of people opposing the draft, and Holmes opposed almost all of them, arguing that people have a right to a difference of opinion. But Charles Schenck and Elizabeth Baer's case was different, he argued, because it actively encouraged the violation of a law.

Holmes repeatedly stated that he believed the law should only be applied to speech that encouraged criminal harm.
posted by maxsparber at 1:15 PM on October 2, 2015


Would this mean that if you moderate at all you're liable and if you don't you're not? Or if you run a site where anyone can post anything you're not liable, but if you curate content you're liable?

That was a concern, and part of the motivation behind section 230. With some platforms, there's just too much user content for perfect moderation - Twitter? YouTube? - especially if every single comment had to be investigated to make sure it isn't slanderous, individual mods have to make judgments about what is parody and what isn't, what might be considered obscene in certain jurisdictions, etc. The hope was that by allowing providers to make reasonable efforts at moderation without subjecting themselves to liability for every oversight that slipped through, there could be richer conversation.

I'm certainly not defending online harassment and I acknowledge that truly terrible things have happened on these platforms, and the problem seems to be getting worse over time. But without section 230, the alternative may be that no operator risks having an open forum or platform anymore because the costs of legal liability are simply unbounded.

I agree that some kind of action needs to be taken, but I believe more aggressive moderation, stronger abuse policies, more responsive takedowns, and so on are a better alternative than repealing section 230 entirely and forcing anyone who wants to speak to run their own hosting platform, their own blog or video site, perfectly moderate any discussion that happens on their own site, etc. There's still a lot of good that has come from allowing the less privileged and less tech-savvy an opportunity to speak and find community on the platforms that exist now because of section 230 protections.
posted by cobra_high_tigers at 1:15 PM on October 2, 2015 [5 favorites]


ex ante - preceding.

So no, none of that would fall under that.


I thought you were proposing that once a site has exercised editorial control, it's lost Section 230 protection for any future content posted.

Are you just saying that if a site edits a post, they don't have Section 230 protection with respect to that particular post? It doesn't seem to me like that would have much positive effect. How many harassment complaints are based on comments that were edited prior to posting?
posted by layceepee at 1:18 PM on October 2, 2015


I'm not even sure what the incentive to host a communications platform would be without 230. At the end of the day, someone has to make money somehow to keep the lights on and if that money is under permanent threat by nuisance lawsuits, why would anyone even want to host a platform when there's other ways of making money that don't get you sued quite so easily?
posted by griphus at 1:20 PM on October 2, 2015 [9 favorites]


Are you just saying that if a site edits a post, they don't have Section 230 protection with respect to that particular post? It doesn't seem to me like that would have much positive effect. How many harassment complaints are based on comments that were edited prior to posting?

Well, Batzel was, for one.
posted by NoxAeternum at 1:22 PM on October 2, 2015


If we're literally talking about "if you edited it, then you're liable, but if you don't, you're not", I'm not sure how much help that would be, but it does seem like a clear-cut enough deal (as long as "edit" doesn't mean something weird like "fixed a broken link") that it seems like giving it a try wouldn't be bad.
posted by Frowner at 1:27 PM on October 2, 2015


It's worth it to the wealthy straight white males that the net was made for, who have the power and privilege to not be harassed and threatened.

Less red meat, please. Freedom of speech should not protect online stalkers, period. If that's what we're talking about, I'm all for sending that bullshit to the corn field and people who engage in it to prison. Freedom of speech does (and should) protect the most vile and rancid subjects, no matter how offensive, as long as they don't target individuals for harassment.

Controlling internet speech sounds peachy keen as long as you agree with the controllers, but what if you don't agree with them? What if, in fact, the controllers despise you? Same arguments to control speech, same arguments not to. Meet the new boss, same as the old boss. I can't believe we're still having this discussion sixty years after books and records were burned. Not to mention the annual assault on libraries.

Resistance to censorship used to be a bedrock progressive issue and still should be, because we know what the right wing thinks about it.
posted by Beholder at 1:31 PM on October 2, 2015 [10 favorites]


Resistance to censorship used to be a bedrock progressive issue and still should be, because we know what the right wing thinks about it.

Threatening rape or violence has always been a criminal act, and the left wing has never supported this under the rubric of free speech.
posted by maxsparber at 1:36 PM on October 2, 2015 [2 favorites]


We don't need to overturn Section 230 to have that. We just need website operators to care.

But Section 230 says they don't have to care. Which is the whole problem.


I see cobra_high_tigers has commented on what I was going to observe, having been a part of all this back before the days of CDA 230. At that point there was concern that you either did all or nothing - that taking any sort of editorial effort potentially meant that you were left on the hook for what you didn't excise.

I think that's pretty obviously a problem since not only do you create an incentive for people to have wild west spaces with no limits at all but you also open up people trying to do the right thing to persecution by folks with deeper pockets. The DMCA mention is a good connection there - the takedown provision allows folks to demand something get dropped on sworn request, period, no matter how shit the request is.

In theory the people who put out fraudulent requests should suffer repercussions but the reality is they never once have (unless something has changed in the last six months since I did some research, but I would be shocked if I didn't hear about it). So folks looking to suppress things have an easy out because of it. I'm personally aware of a situation where it was explicitly used in a case with questionable ownership (and an almost certain fair use exemption) to suppress an embarrassing audio recording of a very racist rant on a hot mic.

Personally I'm cool with some tuning of 230, but I am unsure there's much to do to it that doesn't end up with a situation that's no better because we don't exert effort at going after the underlying crimes. Amanda Hess wrote a piece not too long ago that went into a lot of detail about the issues she had with getting legal attention from the authorities. Not about leaked pictures or slander but outright violent threats. From identifiable people. She identified shittiness from Twitter in this article but there's nothing Twitter - or any other hosting website - could have done that was going to overcome our failure to properly address threats against women.
posted by phearlez at 1:37 PM on October 2, 2015 [3 favorites]


Well, Batzel was, for one.

Was it the case in Batzel that Ton Cremers edited Robert Smith's potentially defamatory email before posting it to the Museum Security List, or are you saying that because Cremers didn't publish all email he received to the list, posting the Smith email is an example of ex ante editorial action? I'm still confused.

Not exactly. Holmes held that passing out pamphlets specifically encouraging a crime -- refusing to participate in the draft -- were not protected by the first amendment.

Well, I'm not sure we need to accept Holmes' framing of the case, but even if we do, isn't it still an example of the government using the law to round up lefties?
posted by layceepee at 1:38 PM on October 2, 2015 [1 favorite]


Well, I'm not sure we need to accept Holmes' framing of the case, but even if we do, isn't it still an example of the government using the law to round up lefties?

Well, I guess it may be. They also rounded up extreme right-wing Bund members who supported Germany. So it wasn't specifically targeting lefties.
posted by maxsparber at 1:41 PM on October 2, 2015


The internet is a big place - there's room for everyone.
Like Chris Harper Mercer, right?

I, for one, believe the Holy First Amendment is costing us as much in lives and quality of lives as the Holy Second Amendment, but profiting off of Free Speech is something that has become much more important to the vilest of bigots and deranged assholes because THAT'S WHERE THE MONEY IS.
posted by oneswellfoop at 1:41 PM on October 2, 2015 [1 favorite]


Are there any proposals to amend 230 in a way that would allow for prosecution of harassers but not create thousands (millions?) of databases of doxxable info?
posted by griphus at 1:42 PM on October 2, 2015 [2 favorites]


Was it the case in Batzel that Ton Cremers edited Robert Smith's potentially defamatory email before posting it to the Museum Security List, or are you saying that because Cremers didn't publish all email he received to the list, posting the Smith email is an example of ex ante editorial action? I'm still confused.

The latter. Let's be honest - had that been all done with paper instead of digitally, Cremers would have at the very least been forced to defend his conduct at trial. Which is why I assert that Batzel was a poorly reasoned, tech-ignorant ruling.
posted by NoxAeternum at 1:50 PM on October 2, 2015


How about a filter that blocks certain people or subjects.

Blocking subjects is something the Chinese have been trying for a decade and it turns out that while you can repress subjects, people just come up with ever-evolving codes because people want to talk about shit. Blocking people, meanwhile, runs into the problem I mentioned upthread - it's impossible. Without literally binding a real-person government-verified identity to every packet (something which, again, would require completely reinventing networking and banning most existing networking technologies), you have no way of telling who's actually behind which account. For all you know, this entire thread is just you and me, with every other account being controlled by me. (Except griphus. I'm, uh, just not up to that level.)



The sites that didn't allow the browsing filter would sink in popularity and the sites that did allow the filter would rise in popularity.

What gives you that impression? This runs into the same problem that plagues the argument that the market will solve racism by punishing racist stores - it only works if the value being punished by the market is one society disagrees with. In the world as it actually exists, there was during segregation a huge demand for racist stores, just as there's a huge demand for asshole spaces, as evidenced by the fact that Reddit and 4chan are much, much bigger than Metafilter. We've already tried what you're proposing, and it's baldly insufficient.


No one would be forced to change their site, but with so much competition for readership, many would and the profit incentive for a click hungry troll would go down as the number of people putting them on ignore goes up.

Again, this simply doesn't match observable reality.
posted by Pope Guilty at 2:36 PM on October 2, 2015 [11 favorites]


I have to admit, I often wonder just how easy it might be to pull-in a reasonable side-income simply playing the most far-right-wing blogger I can imagine online.

The only thing that stops me from doing this is fear of sinking so far into it I can't get back out, Mother Night style.
posted by Ghostride The Whip at 3:12 PM on October 2, 2015 [2 favorites]


I have to admit, I often wonder just how easy it might be to pull-in a reasonable side-income simply playing the most far-right-wing blogger I can imagine online.

It's been a financially tumultuous few years in the [icnh] household, and I cannot tell you the number of times my spouse and I have discussed just this. And debated whether we would be doing net good by separating the bigots of the world from their cash, or whether we'd be contributing more hate. Is hate finite, so if we scammed haters, we'd be depriving other haters of their income? Or is it infinite and every little bit makes the world a worse place?

Whatever the answer, it certainly seems like a comfortable living.
posted by [insert clever name here] at 4:06 PM on October 2, 2015


Whatever the answer, it certainly seems like a comfortable living.

Contributing to a world in which people suspect that they are being gaslit is not good.
posted by Going To Maine at 4:14 PM on October 2, 2015


Be the lack of Joshua Goldbergs you want to see in the world.
posted by Artw at 4:28 PM on October 2, 2015 [6 favorites]


I think I'm contractually obliged to point the (derailed) thread to my response at EFF to Arthur's suggestion to tear down CDA 230. I'm not sure it adds much to the debate here, though, where others have raised the same points I make.

Maybe more interesting for everybody here is this extract from Sarah Jeong's excellent book, which goes into more detail about the challenges of using CDA 230 and the DMCA to combat harassment (you need to page down a bit to get to the CDA 230 bit).
posted by ntk at 5:27 PM on October 2, 2015 [8 favorites]


I have to admit, I often wonder just how easy it might be to pull-in a reasonable side-income simply playing the most far-right-wing blogger I can imagine online.

The only thing that stops me from doing this is fear of sinking so far into it I can't get back out, Mother Night style.
Battle not with monsters, lest ye become a monster, and if you gaze into the abyss, the abyss gazes also into you.
(Friedrich Nietzsche)

Like Chris Harper Mercer, right?

I, for one, believe the Holy First Amendment is costing us as much in lives and quality of lives as the Holy Second Amendment, but profiting off of Free Speech is something that has become much more important to the vilest of bigots and deranged assholes because THAT'S WHERE THE MONEY IS.


Who is actually making money from this? The chan sites where Mercer is alleged to have posted are seedy, weird backwaters that run on shoestring budgets compared to big properties like Twitter or Facebook. They are popular niches, and they consume a lot of bandwidth and hosting resources and people's time, but they don't have a very successful business model. Few companies want to place their ads beside typical *chan content.
posted by theorique at 5:59 PM on October 2, 2015 [4 favorites]


“Stop giving these terrible people attention”

My new mantra.
posted by irisclara at 6:24 PM on October 2, 2015


Fucking Capital!
posted by clavdivs at 8:15 PM on October 2, 2015 [2 favorites]


I like how Dan Savage calls this sort of thing "sweet, sweet bigot money" (see Kim Davis thread).

As usual, Chu does a damn good job of discussing things.
posted by jenfullmoon at 10:31 PM on October 2, 2015 [2 favorites]




Yes, I read that. The problem is that not all intermediaries are neutral. There are many, yes, who are not looking to be a source of abuse, and do deal with it as it comes to their attention. But there are intermediaries who set up websites with the specific intent to enable harassment and abuse (e.g. revenge porn), as well as intermediaries who knowingly turn a blind eye to abuse because it works in their favor.

That's the issue that the EFF dodges - that Section 230 indemnifies Hunter Moore just as much as Metafilter.
posted by NoxAeternum at 8:43 AM on October 6, 2015


I'm not sure it's so much a dodge as it is reflective of their philosophy. The EFF is pretty clearly of the "better that 10 guilty men go free" sort of thinking. Their linked article in the first paragraph - which states yes-this-is-a-problem - talks about the sorts of approaches they believe in. You can disagree with their priorities or where they're comfortable drawing a line but I think "dodge" is not a reasonable descriptor.
posted by phearlez at 10:33 AM on October 6, 2015


See also: Weev
posted by Artw at 10:39 AM on October 6, 2015 [1 favorite]


« Older What keeps us apart, what brings us together   |   웃 i am not here and this is not really happening. Newer »


This thread has been archived and is closed to new comments