‘I can never go back to who I was before.’
September 14, 2024 10:03 AM

How a former content moderator thinks about the job. (SLWaPo gift) Content warning for animal abuse.
posted by box (24 comments total) 6 users marked this as a favorite
 
Requires account creation.
posted by manageyourexpectations at 10:07 AM on September 14 [4 favorites]


I wish I hadn't read that. It's about as inexplicit as it could be but I wish I hadn't. Going to try to forget now.
posted by Frowner at 10:20 AM on September 14 [5 favorites]


Try this link. And yes, moderation is tough going, even when there's nothing explicit to look at.
posted by jessamyn at 10:21 AM on September 14 [6 favorites]


a population of 200,000,000 is going to have more than a few sick, sick mofos; 1% of 1% of 1% is 200 people
posted by torokunai at 10:22 AM on September 14


I'd suggest this needs a trigger warning for animal abuse (and perhaps more, I noped out at that point).
posted by carrienation at 10:33 AM on September 14 [2 favorites]


Welp, there's the thing that made me cry today.
posted by limeonaire at 10:52 AM on September 14


This fits well with the "Stop Drinking from the Toilet" article posted by chavnet. That metaphor is really apt, and was the last nail in the coffin for my prior stance that information is neutral. But at what cost do we get clean drinking water?
posted by brambleboy at 11:24 AM on September 14


I just read this horrifying piece on animal abuse and moderation. Content notice: explicit descriptions of animal torture. Totally baffling that Google and Meta want to be hosting that sort of content. Like, how could you sell ads against it?
posted by novalis_dt at 11:29 AM on September 14 [1 favorite]


I like the use of the comic medium to tell this story. Particularly showing us what it's like to encounter horrible abuse imagery in the middle of something innocuous. (Don't worry, the offensive areas are all blacked out in the comic.)

Some friends of mine have created The Trust & Safety Professional Association (TSPA) in response to problems like the ones this author describes. It's a trade group to represent the needs of content moderators and to provide a professional forum where companies and workers can set standards and grapple with the challenges of a difficult job.

There's a place for more forceful protection for content moderators too. Various groups have tried unionizing but I think maybe not in any large cross-industry way? It's complicated because so much trust & safety work is offshored to different countries.
posted by Nelson at 11:48 AM on September 14 [9 favorites]


I was reading through the piece that novalis_dt linked to until I got to this bit regarding Kiwifarms: "I’ll be eternally grateful [to them]. I don’t have an opinion on anything else they have done as I was unaware of the existence of the site previously."

And that's when I closed the browser. Fuck that and fuck them. That kind of willful moral ignorance in an article about the cruelty that humans inflict on other beings makes the author immediately suspect as a source in my opinion.
posted by hackwolf at 12:17 PM on September 14 [10 favorites]


Reading that was a gut punch, for sure. I wish the comic had continued with some reflection on just how a person is supposed to grapple with that but maybe the short answer is you can’t.
I did my share of content annotations in the late ‘00s - back then if you wanted to train a model you usually had to put in the time to gather data and do your own labeling. Most of it was adult content which became generically clownish, but a few pieces of content slipped in that still haunt me.
posted by simra at 12:18 PM on September 14


One place I'm familiar with had a lot of flaggable images, and the content moderation team had that job as well as front line customer support, usually dealing with upset and confused people.

Unlike engineers at the same place who were almost all white, male, well paid, older, and with flexible hours, they were almost all women in their 20s, more often black and brown, with only scheduled timed breaks during the day, fewer options for time off, and much lower pay.

It sucked, and I suspect they all got PTSD from work.
posted by zippy at 12:59 PM on September 14 [2 favorites]


Reading that piece from novalis_dt makes me think the Kiwifarms bit is a case where one group of abusers found another and now targets them because it is socially acceptable to do so. I’m not sure if that’s a win, but I guess if it channels the desire to target individuals toward people who support animal abuse instead of trans people, that’s maybe marginally better? But it’s not unproblematic.

The problem with having AI do moderation, as mentioned in that newsletter, is that it still takes humans to train the AI and check its work. It’s good that this lessens the raw number of times a person sees the worst things, but it doesn’t remove the human involvement or the psychological costs entirely. It’s also good that companies are adding features to their moderation systems that abstract or alter some kinds of horrifying content, so that moderators have at least some screen between the material and their nightmares. I hope further advances are made in that area so that moderators don’t suffer as much psychological damage from the work as they do now.

I think it would be good policy to bar social media companies from outsourcing moderation. People who do this work should get pay and benefits commensurate with those of the companies’ salaried employees. Most importantly, moderators shouldn’t be shoved off into hidden corners or subcontracted sites to suffer the effects of companies’ moderation philosophies and policies without the executives of those companies having to see the impact of those policies on their employees and co-workers every day. (Though companies should also be required to have moderators in every country in which they operate, because speaking the language and understanding the social context of the content posted is essential to responsible moderation.) Good moderation might be impossible to do at scale, but social media companies should be doing way better than they are now.
posted by fontgoddess at 1:07 PM on September 14 [4 favorites]


It is so dark that they've set up a system in which we pay people in poorer countries to filter our content so we don't have to watch the disturbing stuff.
posted by ssg at 1:10 PM on September 14 [10 favorites]


Outsourcing consequences is the point of much of capitalism and corporate structure. The fact that the worst jobs are low prestige with workers viewed as disposable is not new — look at how janitors and sanitation workers get treated. Look at the abuses of outsourced manufacturing, or the historical abuses of industrial manufacturing from its very inception. Social media isn’t doing something new in its callousness towards moderator wellbeing, but it’s sure dark that people in rich countries have figured out how to outsource so many different kinds of suffering.
posted by fontgoddess at 1:23 PM on September 14 [5 favorites]


a case where one group of abusers found another and now targets them because it is socially acceptable to do so.

I agree with this. The enemy of my enemy is not always my friend.

I linked to that article not to lionize Kiwifarms but to describe a phenomenon (major sites not censoring horrific animal abuse) that I was unaware of until I read the article a few days ago.

And you don't have to believe the author -- you can follow the link to a still-up YouTube video and see for yourself.
posted by novalis_dt at 1:37 PM on September 14 [1 favorite]


Mod note: One comment removed. Please avoid posting scenarios or hopes of people murdering one another, particularly in this thread.
posted by Brandon Blatcher (staff) at 1:51 PM on September 14


novalis_dt, it’s a solid and relevant report on the existence of this animal abuse content, even though the author should have tried harder to understand Kiwifarms’s place in the ecosystem of online abusers. I didn’t think you necessarily agreed with that particular piece of what the person wrote. The overall point about the wide variety and proliferation of incredibly disturbing content and the industrialization of producing it is something good to know when thinking about the impact on moderators forced to look at it professionally.
posted by fontgoddess at 1:52 PM on September 14


Thank you so much for sharing this. Over the past few years I've become connected with the community that moderates spaces like Twitch streams, where it is unfortunately a social norm that most moderators are unpaid volunteers, often teenagers, who also sometimes do the work of moderating very large (10K-plus member) community Discord servers.

The kids see becoming a mod as a fun perk at first, but inevitably they are worn down and sometimes traumatized, being exposed to all kinds of things they were not expecting to have to deal with, from "simple" nudity and porn bots to more complex issues like threats of self harm or accusations of abuse within the community.

If you're moderating a small Discord server or your friend's 10-viewer stream, that's one thing - but the fact that so many big streamers who make tens of thousands of dollars a year don't pay their mods at all frustrates me endlessly. A few do, but most don't. And these teens pay the price over time, and often don't realize what they're getting into until the damage has already been done.
posted by anastasiav at 4:32 PM on September 14 [3 favorites]


novalis_dt’s link is the most disturbing thing I have read on the internet in a long time. Words fail me.
posted by uncle harold at 3:15 AM on September 15


A big shout-out to the mods here; I sincerely hope you don’t have to deal with content that disturbing.
posted by TedW at 6:20 AM on September 15 [3 favorites]


I read one sentence too many of this article earlier today and now have something horrible in my mind that will be there for the rest of my life. I can’t imagine having to see this stuff all day. This is the one use case for AI that I think is really good.
posted by Winnie the Proust at 12:20 PM on September 15 [1 favorite]


This is not strictly a site-moderation anecdote, but I think it's related so here goes:
Someone very close to me works for the local Police Department as an office admin. She loves the work and is very proud to be able to actually help people who are in grim places navigate difficult and arcane paths w/r/t Court and Police bureaucracy.
But a big part of her job involves reviewing official police reports and removing certain categories of information (sensitive personal information, or names of Minors or other protected individuals, etc.) so they can be made public.
It's like she has to read real-life horror stories of truly awful things that really happened, every day at work. In many cases, these things happened at places we pass by often. In a few, they happened to people we know.
And she is sharply restricted in what information she can share with me, so she can't even really talk about these stories at home. Some days she just comes home with the burden of some horrific thing that happened and has to process it alone. They have resources at her office to help with this burden, but still I see the impact it has on her.
And we know her burden is small compared to that of the responders who actually saw and lived those stories, and had to write them down.
Some days it's hard to reconcile our comfortable, stable, safe lives with the knowledge of how many of our neighbors experience life. It's hard to live knowing such horrible things are happening all around us.
posted by BigLankyBastard at 7:32 AM on September 17


It's hard to live knowing such horrible things are happening all around us.

My sister works for a State Police department in a very similar role, and it's interesting: I do think that there are some personality types who have an easier time managing this type of thing than others. I am one of those "I will never be able to get those things out of my head" people, especially about gruesome sadism-type stuff. She has an easier time of it and I'm never sure why.

I sincerely hope you don’t have to deal with content that disturbing.

We're mostly lucky that we're a text-based site, though sometimes mods check out linked videos that have awful stuff in them, or people post AskMes about really difficult situations that are themselves disturbing. However, by far the worst things mods have to deal with here are either community members who are spiraling in various ways, which can sometimes involve lashing out in ways that are abusive to mods, or community members in crisis situations where what is happening in their lives needs to remain private but has also manifested in some public way on the site, and mods need to manage the private part while also handling the public aspects of it.

People interested in this topic might really like the novel We Had to Remove This Post, which is an interesting mood piece about content moderation. It's marked as horror but mostly isn't horrifying like those links were; it talks about the kinds of things that get moderated without telling you specifics.
posted by jessamyn at 9:23 AM on September 17 [1 favorite]



