[HN Gopher] Open Letter from Facebook Content Moderators ___________________________________________________________________ Open Letter from Facebook Content Moderators Author : ynac Score : 91 points Date : 2020-11-18 21:27 UTC (1 hour ago) (HTM) web link (www.foxglove.org.uk) (TXT) w3m dump (www.foxglove.org.uk) | supernova87a wrote: | Alternate version, points #1-5: _We're so tired, so very very | tired of doing this._ | anoncow wrote: | The best corporations are heartless machines (in this case this | applies primarily to the contractors; not sure FB would mind | as long as the work is getting done). | | We should be thankful for the jobs that we have today. | Corporations of the future will not need as many warm bodies. | (Sarcasm) | AnimalMuppet wrote: | I'm glad that you don't have a time machine. So you can't go back | and persuade those who worked 16-hour days in unsafe conditions | that they should be thankful for the jobs they had then. | anoncow wrote: | I was being sarcastic. | gtrhtrhtrhtr wrote: | What's interesting is that Facebook, like Google and other | similar companies, is absolutely among the best places to work, | with the way they treat their employees, the benefits, etc. So | now imagine how most companies treat their employees... | exogeny wrote: | They're not employees, they're often contractors. I think | that's the point. | metachor wrote: | What the heck kind of take is that? We should be fighting tooth | and nail today to not let that kind of future come into | existence. | SteveGerencser wrote: | Some of us have tried for decades. But people like cheap/free | stuff and don't care what happens behind the curtain for the | most part. | anoncow wrote: | Agreed. I was being sarcastic. We should be striving for | better working conditions. | cambalache wrote: | We should, but we are not doing that. We have been cleverly | manipulated to care more about what a soon-to-be-ex president | said or didn't say, or how many genders actually exist.
You can | protest against everything except those things that are | against the interests of the rich and powerful. The protests | are channeled to serve their interests, not the other way | around. Don't believe me? Just a sample: | https://trends.google.com/trends/explore?date=today%205-y&ge... | PragmaticPulp wrote: | The TL;DR is that around 200 content moderators, location | unspecified, are protesting decisions to move them back into the | office. | | It's not immediately clear from the letter that this was | Facebook's decision. In fact, it appears that Facebook | subcontracts content moderation to Accenture and CPL, both of | which are named in this letter. | | The letter also takes shots at Mark Zuckerberg's growing wealth | and attacks Facebook's attempts to augment the human content | moderators with AI pre-filters to remove the worst content. The | latter claim is strange, as the content moderators are the ones | most likely to benefit from AI pre-filtering. They try to claim | that Facebook "failed" to create a workable AI, but I seriously | doubt that Facebook has given up on the project. More likely | they're still iterating. I suspect these people feel they need to | attack the AI angle to protect their jobs. | | The letter goes on to demand 50% higher hazard pay, and demands | that the subcontracted content moderators be hired directly into | Facebook as full employees, moving them out of their contracting | jobs under consulting firms. | [deleted] | gtrhtrhtrhtr wrote: | I'm wondering if there is also an element of: you should be | surveilled while you moderate content, as otherwise you could | steal it. It's a hard problem, as moderators have access to user | data (I imagine). | occamrazor wrote: | I imagine that a locked-down laptop running just the | moderation UI provides reasonable security. If not, the | camera can ensure that the moderator doesn't use the "analog | hole" to copy video or image content.
Forced use of | headphones instead of speakers finally almost closes the gap | for audio. | Karunamon wrote: | Even company-mandated spyware is superior to hauling everyone | into an office in the middle of a pandemic. | dbt00 wrote: | > They try to claim that Facebook "failed" to create a workable | AI, but I seriously doubt that Facebook has given up on the | project. | | On the one hand, they probably haven't. On the other hand, they | provably aren't as close as they thought they were, and may be | very far away. | satya71 wrote: | Facebook should bear the brunt. In a more reasonable | arrangement (in which, the generator of risk has to bear the | liability), these people would be direct employees of Facebook. | jefftk wrote: | _> in which, the generator of risk has to bear the liability_ | | Do you think this in general, or just for content moderation? | For example, is it reasonable that I can buy car insurance | that covers me if I cause an accident? | arcticbull wrote: | > For example, is it reasonable that I can buy car | insurance that covers me if I cause an accident? | | Reasonable? I'd describe it as mandatory, and not even a | question in no-fault jurisdictions. | leetcrew wrote: | > For example, is it reasonable that I can buy car | insurance that covers me if I cause an accident? | | did you mean to phrase this differently? isn't that the | primary purpose of having car insurance? | zentiggr wrote: | If part of someone's work responsibilities involves screening | for child abuse and other direct violence, and every other | depravity a user might try to post, that worker should be given | the best possible mental health support, and be paid | substantially more than any minimum wage. | | Like anyone serving in the military and possibly subject to all | the PTSD and injury that combat can entail, these moderators | are standing as a thin line against the worst things humanity | can do to each other. 
| | They should be given respect and every possible consideration | for their sacrifice. | | Their demands are merely a simple first step toward the | recognition they actually deserve. I would go much further. | toyg wrote: | _> It's clear that they want Facebook to bear the brunt of the | bad PR_ | | That's the right thing to do, considering FB is the only entity | with some power over their employers. Avoiding this kind of | thing is exactly why brands outsource their dirty work. | monkin wrote: | > 5. Offer real healthcare and psychiatric care. Facebook | employees enjoy various benefits, including private health | insurance and visits to psychiatrists. Content moderators, who | bear the brunt of the mental health trauma associated with | Facebook's toxic content, are offered 45 minutes a week with a | 'wellness coach'. These 'coaches' are generally not psychologists | or psychiatrists and are contractually forbidden from diagnosis | or treatment. And they generally cannot build a relationship of | trust with moderators, since workers know that Facebook | management (and Accenture/CPL management) ask 'coaches' to reveal | confidential details of counselling sessions. Moderators deserve | at least as much mental and physical health support as full | Facebook staff. | | Low pay and no psychiatric help with that kind of content?! This | is just outrageous. Mark should moderate this stuff himself to | see how huge a burden it is. And this is not Rotten- or Goatse- | type shock; it's an entirely different level. | zentiggr wrote: | I'd love to see everyone in Facebook's chain of command above | the moderators required to do a week's worth of the actual | moderating. | | Then let them say in public that they're doing their best for | their employees. | grecy wrote: | I agree with you wholeheartedly, but capitalism has allowed | this kind of thing for generations. | | Mining magnates should be forced to work at the coalface.
The | Waltons should be forced to stock shelves, Bezos should be in | the distribution centres. | | The list is endless, and unfortunately this is nothing new. | The best we can do is minimum wage laws and unions to try and | "negotiate" with billionaires to give us a crumb. | yeetman21 wrote: | I don't mean to be mean, and I support the plight of the | moderators, but this is just simple supply and demand. What are | the skills required to be a moderator: eyesight and motor | functions. Who has those skills: almost everyone. What will the | companies therefore pay: almost nothing. I think the best thing | that can happen to those jobs is that they are automated out | of existence by AI, or heavily reduced. | helmholtz wrote: | You don't mean to be mean, but then follow it up with such a | tasteless and repugnant comment. God help those in your life | if this really is your attitude towards living, breathing | humans. | Judgmentality wrote: | > I think the best thing that can happen to those jobs is that | they are automated out of existence by AI | | Yes, because having your account/post/whatever flagged by | Google/Facebook/whoever and being unable to talk to a human | about why or get any type of response is "the best thing that | can happen". | | It's a shitty job but somebody has to do it (or we have to | wildly rethink what it means to have free speech), and | wouldn't it be nice if the people doing it weren't abused? | | Maybe scale fundamentally makes some experiences worse for | the consumer, and the cost burden should be shouldered by the | corporations profiting from it? | gmadsen wrote: | not everything is supply and demand. we have a "mostly" 40- | hour work week, not because that is what demand requires, | but because it is enforced by labor laws. | | I see no reason why there can't be a mandate to provide | mental health benefits to these workers. It is absolutely a | safety hazard.
| justiceforsaas wrote: | Is there a particular logical reason why Facebook required these | people to come back to the office? Is it related to productivity? | Would be curious to hear Facebook's take on this. | alkonaut wrote: | Wtf, people who work at computers, having to work in offices, | _now_? It's the very worst it's ever been! What are they | thinking? | | I'd expect almost everyone who works at a computer screen to | have been working from home since March, even if it requires | expensive technical solutions to ensure it. I can't imagine | moderators need more than an internet connection and a computer. | delroth wrote: | My understanding is that there are certain legal requirements | associated with content moderators who may have to review | illegal content, especially around child abuse / sexual | imagery. This is what I've been told in the past by friends at | other companies working on content moderation -- I don't have | any specific evidence for or against it, but it passes the | smell test. | | The whole thing in the article about Facebook suddenly trying | to automate this part of the content moderation job seems to | also point in that direction. | bredren wrote: | I also presume that this is related to the type of content | that has to be reviewed and the need to have on-premises | recording of the work as it is being done, for audit. | | Sounds like someone on the automod product failed to deliver, | or failed to deliver in a timely enough manner. ___________________________________________________________________ (page generated 2020-11-18 23:00 UTC)