[HN Gopher] Supreme Court allows Reddit mods to anonymously defe...
___________________________________________________________________
Supreme Court allows Reddit mods to anonymously defend Section 230
Author : taubek
Score : 126 points
Date : 2023-01-22 15:12 UTC (7 hours ago)
(HTM) web link (arstechnica.com)
(TXT) w3m dump (arstechnica.com)
| Bystander22 wrote:
| There were some interesting observations on r/law, particularly this one:
| https://www.reddit.com/r/law/comments/10h9vju/supreme_court_...
|
| From there:
|
| >The issue has been muddied by sites like reddit who have been keen to play up outlandish possibilities of individual users or volunteer moderators becoming liable, which has never been a likely outcome of this case. The bigger and more realistic threat to a site like reddit is the possibility that actions taken by tools like automoderators, slur filters, or recommendation algorithms (e.g., sorting by "hot") might become legally analogous to editorial decisions.
|
| >Because those tools are sometimes set up by moderators and influenced by user actions (e.g. voting/reporting), there are sort of fringe or edge-case scenarios where the lines could potentially blur between algorithmic policies and user/moderator actions. But we as users don't really need to worry too much about every conceivable edge-case legal theory, because a site like reddit would presumably be incentivized to remove or disable any tools that could create such a liability, to protect Reddit's own self-interest.
|
| >The algorithms that keep people clicking/viewing/refreshing the site are critical to the business interests of sites like Reddit and YouTube. It's really important for reddit's bottom line to have broad latitude to gamify user engagement by showing more of what will keep people on reddit longer and more frequently.
| That's a less flattering PR angle than playing up the possibility that reddit users or mods could get in legal trouble.
|
| >It's not so much that there is no possible way that any ramification of this case could ever put a user or a mod of a site like reddit in any jeopardy in any conceivable scenario...It's more like, sites like Reddit have a lot to lose if their algorithmic recommendations should become legally analogous to editorial decisions.
| photochemsyn wrote:
| [flagged]
| orra wrote:
| > noting that sped-up clinical trials for vaccines might be missing some issues related to vaccine efficacy and side effects, there's a problem.
|
| Clinical trials weren't sped up: there's your problem.
| edgyquant wrote:
| stop cherry picking, the crux of that person's comment is about censorship
| orra wrote:
| We're talking about private (not government) censorship of _inaccurate_ information.
| photochemsyn wrote:
| Not to go too far off topic, but:
|
| https://www.gao.gov/products/gao-21-319
|
| > "Operation Warp Speed was a federal effort that supported multiple COVID-19 vaccine candidates to speed up development. We analyzed the program's vaccine candidates and found that their development followed traditional practices, with some adaptations. For example, some clinical trial phases overlapped with each other and with animal studies to accelerate development."
|
| Notice this probably played a role in the fact that the clinical trials didn't reveal that the vaccines didn't have much effect in preventing transmission (although severity of symptoms was clearly reduced).
|
| This discussion would have led to a perma-ban on the main Covid subreddits, I believe.
| orra wrote:
| > For example, some clinical trial phases overlapped with each other and with animal studies to accelerate development.
|
| Right, but GP was misrepresenting this.
Having phases 1 and 2 overlap when phase 1 is clearly going well was at worst a risk to the phase 2 participants. It wasn't skimping on the length of phase 2 or 3, so there was never an increased risk of dangerous vaccines for the public.
|
| > Notice this probably played a role in the fact that the clinical trials didn't reveal that the vaccines didn't have much effect in preventing transmission (although severity of symptoms was clearly reduced).
|
| I don't think measuring reduction in transmission is a primary concern of vaccine trials? It also seems quite hard to do, without a significant proportion of the population being vaccinated.
| edgyquant wrote:
| >Having phases 1 and 2 overlap when phase 1 is clearly going well was at worst a risk to the phase 2 participants. It wasn't skimping on the length of phase 2 or 3, so there was never an increased risk of dangerous vaccines for the public.
|
| Irrelevant "actually"ing after being objectively wrong. Don't cherry pick to shut down a conversation: and if you do, don't be wrong in your attack.
| orra wrote:
| > and if you do don't be wrong in your attack.
|
| OP clearly said the trials were sped up and that compromised safety. That's nonsense.
| AlbertCory wrote:
| Clinical trials are phased for very good reasons, AFAIK:
|
| phase 1 is for safety
|
| phase 2 is for efficacy
|
| phase 3 is for dosage
|
| They're not the same. Shortening Phase 1 is automatically a compromise with safety.
| orra wrote:
| Phase 1 wasn't shortened. And all stages assess safety.
| swimfar wrote:
| Were the durations of the phases shortened? The quote makes it sound like the duration was kept the same, just that the following phase started before the end of the previous phase.
| AlbertCory wrote:
| I'm not sure which "quote" you're talking about. I carefully did not say Phase 1 was shortened.
|
| The reason for sequencing, in the abstract, would be that if Phase 2 looks like "hey, this thing really works!" then the pressure to approve it would become irresistible. Whereas if Phase 1 finds unacceptable side effects, then Phase 2 would never start.
|
| Note again that I'm not saying that's what happened.
| fzeroracer wrote:
| > A far better option is to rely entirely on a transparent algorithmic model, in which automated tools like lists of trigger keywords and contextual analysis that cause posts to be flagged for further review are clearly visible to the user audience.
|
| This is the worst idea I've read on HN and shows you've never actually dealt with users at scale.
|
| The moment your algorithm is visible, users will know how to beat it. Your list of 'trigger keywords' becomes a weapon both for harassing normal users (by potentially tricking them into writing the trigger words) and for trolls, because they know exactly how to modify a word to get around the filter.
|
| Social media is no different from a game, and when people figure out how a game works, they learn how to break it.
| xbar wrote:
| I fear a ruling where the interpretation of Section 230 puts dang at risk of liability for all of his necessary and appropriate historic moderation.
| shadowgovt wrote:
| This Court case largely isn't about that; the protections for moderation Section 230 grants are largely not in question here.
|
| The question is whether an automated algorithm is protected by 230 in the same sense that manual moderation is. To the extent this _might_ impact HN, it'd be more along the lines of "HN weights stories too heavily by (upvotes, time of post, some other metric) and as a result harm has occurred."
| dragonwriter wrote:
| Weakening Section 230 doesn't just put dang at risk, it puts at risk every HN user who takes an action that alters the visibility of content, like flagging.
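The filter-gaming dynamic fzeroracer describes can be sketched in a few lines: once a keyword list is public, trivial character substitutions defeat it. This is a minimal illustrative sketch, not any real site's filter; the keyword list and the `leetify` helper are hypothetical.

```python
# A naive, fully transparent moderation filter: because the keyword
# list is public, anyone can test a post against it before submitting.
TRIGGER_KEYWORDS = {"badword", "slur"}  # hypothetical public list

def is_flagged(post: str) -> bool:
    """Flag a post if any trigger keyword appears as a substring."""
    text = post.lower()
    return any(kw in text for kw in TRIGGER_KEYWORDS)

def leetify(word: str) -> str:
    """One of many trivial evasions: swap letters for look-alike digits."""
    return word.translate(str.maketrans("aeio", "4310"))

print(is_flagged("this contains a badword"))                 # True: caught
print(is_flagged("this contains a " + leetify("badword")))   # False: evaded
```

The same asymmetry also works in the harassment direction the comment mentions: knowing the exact list lets a troll bait others into tripping it.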
| [deleted]
| hgsgm wrote:
| That's not true. Section 230 is based on a principle that _someone_ is responsible for the content of a post, and creators and publishers don't get to both deflect responsibility to the other.
|
| Next, YC would be the relevant entity, not dang personally. (But that's a minor point because dang is part of YC.)
|
| The difference between you and YC is that YC actually collects posts and re-publishes them.
|
| Users simply tell YC if they like a post or not. They don't transmit the post content to anyone.
|
| YC decides whether to grey a post or remove it, or keep it. Showing a post higher or lower on a page doesn't mean anything related to whether the post violates some law and someone needs to be held responsible.
|
| Reddit mods are closer, since they have specific power to ban a post or poster.
| dragonwriter wrote:
| > Section 230 is based on a principle that someone is responsible for the content of a post
|
| No, it's not. Law predating Section 230 is based on the principle that _lots_ of people can be responsible for published content.
|
| Section 230 is based on the conclusion that certain of those rules making people liable are inappropriate in the online context; particularly those that would give any active moderators of content liability as publishers, which does not depend on actual knowledge of the illegality of any content. These rules _were_ being applied to both sites _and_ users other than the creator when Section 230 was adopted, which is why it explicitly protects both operators and users.
|
| Section 230 doesn't impact the liability of creators at all.
| 2OEH8eoCRo0 wrote:
| Perhaps Section 230 shouldn't extend to sites with anonymity. If somebody is harmed then somebody should be liable.
|
| An issue is you have anons causing harm to users who cannot be sued and the platform also cannot be sued. No good.
| dragonwriter wrote:
| > Perhaps Section 230 shouldn't extend to sites with anonymity. If somebody is harmed then somebody should be liable.
|
| The creator is liable even if they are anonymous.
|
| There is a difference between someone being liable and it being easy to identify who they are. (And, even if the site owner isn't liable, a John Doe suit against the anonymous user can be a framework within which to subpoena the site owner for records which help to identify the liable user.)
| 2OEH8eoCRo0 wrote:
| I don't like that. If the user cannot be held liable for whatever reason, it needs to fall on the site. I don't like that people can be harmed without recourse. There is little incentive for the site to run communities that aren't toxic.
| luckylion wrote:
| There's little incentive for good moderation, and there's little cost for any moderation, which makes sites' business models work.
|
| I agree with you that it has gaps and ugly side effects, but it also has the effect that a lot of things work because you're not responsible by default for what has been posted on your server.
| bobmaxup wrote:
| From the article, what Reddit is arguing:
|
| > "Section 230 of the Communications Decency Act famously protects Internet platforms from liability, yet what's missing from the discussion is that it crucially protects Internet users--everyday people--when they participate in moderation like removing unwanted content from their communities, or users upvoting and downvoting posts," a Reddit spokesperson told Ars.
| DanAtC wrote:
| Reddit is afraid they'll be held responsible for the actions of moderators they have no control over. Their business model is at risk, so they're spinning it as something that threatens their users.
| [deleted]
| mrkstu wrote:
| Reddit has exactly as much control over moderators as their own policies dictate, which they can change at any time.
| aobdev wrote:
| Honestly that sounds like fear-mongering; Reddit wants to protect its interests by turning the public against Section 230 reform.
| vxNsr wrote:
| It specifically doesn't put any user at any more risk than we're already at. s230 protects hn from liability, it doesn't protect you, the user, from liability.
| bioemerl wrote:
| Yeah, to my understanding you can currently be held liable for what you post online, but the platform can't be in trouble for distributing it.
| PragmaticPulp wrote:
| It is interesting to read HN comments demanding more laws and regulations and restrictions on how social media operates. Usually the commenters are unaware that HN is a social media site.
|
| HN has algorithmic ranking, it has invisible moderation, it has shadowbanning (sort of), and it has YC-sponsored ads injected into the "feed" that get special treatment relative to user submissions.
|
| Many regulation proposals seem to have carve-outs for sites and networks below a certain size, but if one passed without such exceptions then a lot of the community sites we know and love would have no choice but to shut down.
|
| I suspect a lot of the proponents of these regulations aren't really interested in seeing the sites they like subjected to these regulations. It has almost become a talking point about punishing the social media companies people don't like _others_ using.
| ip26 wrote:
| Good moderation & ranking is nearly invisible, like good email spam filtering. We forget (or never even knew) just how much we depend on it.
| MisterBastahrd wrote:
| It isn't to the people who get actions taken against their commentary that they deem restrictive, though. And while the average person will simply take the moderation action with a grain of salt, I've been in a situation before where I've had to recommend en-masse banning of people who originally had constructive comments that I largely agreed with.
| bioemerl wrote:
| If you held YC liable for its user content it would be broadly alright, because YC has very, very good moderation and for the most part bad stuff gets taken care of very quickly.
| treis wrote:
| But none of these things are what Google is accused of doing. Google is accused of recommending pro-ISIS videos to someone who ended up killing a bunch of people. Which is not really what 230 was meant to protect against.
|
| 230 is there to enable the existence of online platforms. It's not there to let Google wring every dollar out of YouTube they can, damn the societal consequences.
| MBCook wrote:
| The worry is that the decision will be far more expansive. More like using this case as an excuse to do what they wanted to.
|
| Much like Dobbs or a number of other recent cases.
| archgoon wrote:
| > it has shadowbanning (sort of)
|
| In what sense does HN have shadowbanning (sort of)?
| krapp wrote:
| In the sense that HN has shadowbanning, but it's publicly reversible, so it's only sort of shadowbanning.
| prettyStandard wrote:
| I'm pretty new here. Can you elaborate? Edit: On the "necessary and appropriate historic moderation" part.
| Aaron2222 wrote:
| dang is the moderator here.
| DoneWithAllThat wrote:
| The claim by some is that 230 protections shouldn't apply if the site at all influences what is shown to other users - essentially, moderation. There are all sorts of made-up distinctions between publisher and web site (most of them disingenuous), but it generally boils down to various political factions upset that the "wrong" sort of content isn't moderated away, or the "right" sort of content is. Which is right or wrong depends on how you lean politically.
| Retric wrote:
| The desire for protection isn't the same as saying 230 actually applies. The case made it to the Supreme Court because it isn't clear where exactly the law does and does not apply.
|
| User content and the promotion of user content are different things. If Facebook picks a specific message out of the billions posted, they can find basically any message ever said. The choice of a handful of messages to post on a TV commercial moves the message from user content to Facebook's message.
|
| Legally 230 could be limited to direct content and its moderation (removal) but not cover manual curation. Similarly, purely algorithmic feeds may be yet another meaningful distinction.
|
| It's a surprisingly complicated topic and I doubt the Supreme Court will make a broad ruling covering every case.
| kmeisthax wrote:
| Funnily enough, DMCA 512 already works this way. If you manually curate a content feed you lose your copyright safe harbor. So you're actually incentivized to remain willfully blind to certain aspects of how your site is being used. The Copyright Office has been complaining about this and arguing that we should pull _all_ recommendation systems outside of the copyright safe harbor.
|
| I kind of disagree with this. It would make both safe harbors kind of nonsensical, because we're incentivizing platforms to keep their systems broken. We understand that free speech on the Internet requires a minimal amount of censorship: i.e. we have to delete spam in order for anyone else to have a say. But one of the ways you can deal with spam _is to create a curated feed of known-good content and users_.
|
| Keep in mind too that "purely algorithmic feeds" is not a useful legal standard. Every algorithm has a bias. Even chronological timelines: they boost new posts and punish old news. And social media companies change the algorithm to get the result they want. YouTube went from watch time to engagement metrics and now uses neural networks that literally nobody understands beyond "it gives better numbers".
And how exactly do you deal with an "algorithmic" feed with easter eggs like "boost any post liked by this group of people"?
|
| The alternative would be to do what the Copyright Office wants, and take recommendation systems out of the defamation and copyright safe harbors entirely. However, if we did this, these laws would _only_ protect bare web hosts. If you had a bad experience with a company and you made a blog post that trended on Facebook or Twitter, then the company could sue Facebook or Twitter for defamation. And they would absolutely fold and ban your post. Even Google Search would be legally risky to operate fairly. Under current law, the bad-faith actor in question at least has to make a plausible through-line between copyright law and your post to get a DMCA 512 notice to stick.
| Retric wrote:
| By purely algorithmic systems I mean something like a hypothetical Twitter timeline showing the top 4 tweets of everyone you've followed in purely chronological order. Or a Reddit feed purely based on submission time and upvotes.
|
| A curated feed being something like the current HN front page, where websites from specific manually chosen domains are penalized.
|
| I am not saying there is anything inherently wrong with curation; it may simply reflect what users want. However, as soon as you start making editorial decisions it's no longer purely user generated content. That was the distinction I was going for: it's still an algorithm, just not a blind one.
|
| > Every algorithm has a bias.
|
| Using upvotes, deduplicating, or penalizing websites based on the number of times they have been on the front page in the last week definitely has bias, but it isn't a post-specific bias targeted by the website owner.
I agree the lines aren't completely clear; when you start talking about AI, the story-specific bias can easily be in how the AI was trained, but I suspect something that flags child porn would be viewed differently than something that promotes discrimination against a specific ethnic group.
| dragonwriter wrote:
| > if the site at all influences what is shown to other users - essentially, moderation
|
| Which is bizarre, given the legislative history of Section 230, whose entire point was to protect and encourage private censorship by sites and users.
| intrasight wrote:
| Section 230's entire point is to encourage online communities
| triceratops wrote:
| You can't have a healthy online community without moderation. It gets overrun by spammers, trolls, off-topic conversations, and flamewars. Moderation is the reason that all of us are here instead of Usenet or 4chan.
| intrasight wrote:
| Exactly. While in some ways moderation and censorship are synonymous, the intent is different. Moderation is necessary for healthy communities - online and offline.
| luckylion wrote:
| They didn't say anything about 'healthy' though. Reddit, Twitter and Facebook have lots of trolls, off-topic conversations and flamewars (and spammers aren't that rare either), and yet they thrive.
| hgsgm wrote:
| What the parent meant is that "the claim by some is that Section 230 is bad/unconstitutional and should be removed".
|
| > A key protection shielding social media companies from liability for hosting third-party content--Section 230 of the Communications Decency Act--is set to face its first US Supreme Court challenge.
| andsoitis wrote:
| Rules will probably be limited to communities over a certain size, larger than that of HN, I would bet.
| ecf wrote:
| > Unlike other companies that hire content moderators, the content that Reddit displays is "primarily driven by humans--not by centralized algorithms."
| | Reddit's entire business model is Astroturfing-as-a-service, so I | don't believe this for a second. | throwayyy479087 wrote: | Ding ding ding. Being forced to reveal this information is why | they'll never IPO. | DannyBee wrote: | FWIW - The supreme court generally will allow just about any | amicus brief. | | So this is not particularly special in that respect. | Sunspark wrote: | There are some great Reddit mods, but there are also some truly | awful ones and the corporation does not have a proper arbitration | process for managing things. | | Example fictional scenario, but entirely plausible as variations | of this do play out, on a computing sub you could write "I like | Windows" and the Apple Mac-loving Reddit mod could immediately | ban you. No rules were broken, you just expressed a contrary | opinion to theirs and that is it. So if you wish to continue | participating in that sub, you would need to generate and use an | alt account. | | Reddit threatens that having more than 1 account is against | policy. Ignore this. You are the product on a free platform that | generates corporate revenue through ads and selling digital | awards, etc. If you do not engage with the platform by putting up | posts and writing comments, there is less incentive for others to | come visit as well, and ad impressions will diminish, revenue | will diminish, etc. They will not actively seek out preventing | access to the site if you are not breaking any laws or upsetting | users. Do not get invested in accounts, were you to die tomorrow, | nobody at all would remember you or care about anything you wrote | there. | | It is negligent on the part of Reddit to not have a proper | arbitration process to grieve improper content moderation on the | part of mods. | | So yes, Reddit absolutely does take an active and direct hand in | promoting the visibility of anything on the platform and should | not be exempt from section 230. 
I am active in a sub that every day sees a lot of posts that I find interesting deleted by the mods. They are just curating. Sometimes things are deleted for the dumbest of reasons. Corporate interests come into play too. I remember the other month when Kanye said that Kim Kardashian and CP3 got together while both were married, the NBA Reddit mods for over 24 hours were ACTIVELY deleting every single post mentioning or linking to that. It was certainly basketball news, it was certainly salacious. Why was this happening? Good question! Was the suppression due to receiving an order from the NBA? Was it under orders from Reddit Corporate? Was it just simply a group of Reddit mods working overnight in a coffee shop deleting posts? It was far too targeted and for too long a time period to not be an active attempt at speech suppression until it was already out on too many other news sites, at which time an "approved" site like TMZ would be allowed through, where they could presumably get ad-click impressions from diminished traffic to their story about it. They should tell the Supreme Court who gave the orders to suppress the Kanye story on a sub with 6 million+ subscribed accounts. You see these news-story preferences on other subs too, where some sites seemingly often have their links given preferential treatment, and others do not get to come through. Why? Is there a kickback? I don't know, but stories coming through are worth money as traffic is directed.
|
| I love old Reddit, but I despise how it was set up to have little anonymous dictators for life seemingly entrenched forever in their little fiefdoms. No elections, no votes, no recourse other than having more than 1 account.
| | If Reddit is serious about wanting exemption from section 230, | then if they want to be a social commons with community | moderation they need to implement an arbitration process OR allow | users to hold elections on which mods they want to represent them | for fixed terms. | | Dictators-for-life from anonymous mods (who also have alt | accounts and are probably Reddit employees on the largest subs) | is not it. | | The anonymous mods giving Supreme Court testimony should state | whether or not they are now, or have ever been an employee of | Reddit or its investors or associated companies. | theknocker wrote: | [dead] | LinuxBender wrote: | RFC 2119 agrees [1] as does US law [2] _with exception of | Illinois apparently._ Seems to be ill defined and not preferred | in the UK, sometimes deemed _inappropriate_ [3]. Perhaps the UK | will interpret as per RFC-6919 [4] instead. | | _1. MUST This word, or the terms "REQUIRED" or "SHALL", mean | that the definition is an absolute requirement of the | specification._ | | [1] - https://www.rfc-editor.org/rfc/rfc2119 | | [2] - https://www.law.cornell.edu/wex/shall | | [3] - https://www.law-office.co.uk/art_shall-1.htm | | [4] - https://www.rfc-editor.org/rfc/rfc6919 | honkler wrote: | [flagged] | [deleted] | montron wrote: | [flagged] | paulpauper wrote: | The notion of Reddit mods being selfless unpaid volunteers is | misleading. They wield considerable power, and it's a desired | position. Also, mods have been known to engage in payola and | other deception for personal gain/profit. And too many arbitrary | rules, too many shadow bans/deletions, etc. | notatoad wrote: | it's crazy to me that reddit doesn't have staff doing the | moderation job for the bigger subreddits. most of the content | on the front page is from 5-10 subreddits which are all | moderated by the same super-moderators, who effectively set the | policy for what content is allowed and not allowed. 
and they do it with complete autonomy and independence.
|
| either that, or the few anonymous people who run reddit actually are reddit staff, and reddit prefers to keep the appearance of subreddits being "community-run" because unpopular mod actions can be swept away by retiring a moderator's profile.
| water-your-self wrote:
| They generally prefer to retire whole subreddits
| jeoqn wrote:
| They don't have to pay anybody... there's no shortage of people who are more than happy to wield the power of being a mod and share the political and corporate values of the Reddit staff.
|
| Not to mention Reddit doesn't have as many legal responsibilities over them if they don't pay them, I guess.
| Bystander22 wrote:
| And Reddit is paying some moderators. It's called the Community Builders Program; they are mostly paying people to moderate UK- and India-specific subs. US $20/hour; most volunteer mods know nothing about this, but it's in the open.
|
| Article: https://reddit.zendesk.com/hc/en-us/articles/4418715794324-C...
|
| Announcement for India mods: https://www.reddit.com/r/IndianMods/comments/w4k4y4/launchin...
|
| Discussion about UK mods: https://www.reddit.com/r/RoyalsGossip/comments/xfn4t2/adminr...
|
| This is an initiative of the new VP of Community: https://communityvalidated.co/community-lessons/reddit-a-gli...
| beej71 wrote:
| I don't think this changes the point, which is, "If I moderate a forum--even for free--what is my exposure to liability from things people post to that forum?"
| [deleted]
| medellin wrote:
| I believe reddit has played a part, although it's hard to say how big of one, in further dividing people. In my experience it's the worst social media at putting you inside a completely one-sided bubble, with Twitter as a very close second.
|
| Part of that being that the downvote is just used as "I disagree with you" even when you are adding a valid but different view to the conversation.
I don't use reddit anymore outside of trying to find recommendations for products, but even that is being gamed now.
| enslavedrobot wrote:
| Not to mention the obvious political bias in many subs that state they are neutral and objective.
| robswc wrote:
| That is one of my biggest issues with Reddit. One big gaslighting operation. Even if you agree with 9/10 opinions, if you disagree with one, you are made out to be an "other" and an adversary.
|
| Just look at how dysfunctional it can be, even when everyone is on board with the same basic principles (r/antiwork)
| bitlax wrote:
| Ghislaine Maxwell was a power mod.
| srj wrote:
| If you haven't already, I recommend reading the text of Section 230. It's very short and takes only a minute or two: https://www.law.cornell.edu/uscode/text/47/230
| badrabbit wrote:
| I don't think it should be repealed, but its protections should only apply to users and content providers acting in good faith and making a best effort to proactively prevent harmful content. Revenge porn, for a moderate example: not verifying the provenance of the content and the validity of the submitter by the porn site, or users who knowingly upvote or positively comment, should not be protected. There needs to be an incentive beyond the goodwill of site owners and users. Look at Twitter, with Elon changing policy to allow harmful content but reduce its reach. He is able to do that due to this law.
| wolverine876 wrote:
| I'm concerned about the US Supreme Court's ability to handle a subtle, technical issue tied to new media. They need to take into account all the stakeholders, rights, law, and the dynamics of social media, and come up with an innovative solution that meets all needs fairly and clearly.
|
| I wonder how many there even use social media, and I'm especially concerned that the Court is now oriented toward, and many members selected for, partisanship.
They are there to find partisan advantage in rulings, not to be legal geniuses with deep commitment and knowledge of justice and fairness, with deep judicial temperament - they are not there as Solomons. That puts them at a loss for complex issues, especially unfamiliar ones, though I'm sure they will find a partisan angle.
|
| Whatever your politics: The reactionary conservative movement, with its campaign to politicize everything (now working on the FBI and Department of Justice, for example), has permanently degraded the country; we won't have these institutions back for generations. People don't want to face the loss, but it's already happened and continues to worsen before our eyes.
|
| EDIT: People who support politicization (or corruption or disinformation or other damaging behavior) argue to normalize it - it's always that way, everyone does it, it's unavoidable, it's 'human nature'. I have warmongers now telling me that it's inevitable human nature. But that's not the case; we can have meaningfully less or more partisanship (especially in courts), corruption, disinformation, and warfare. I can see it with my own eyes now; I was here before 2016, and I know about other places and times and people. It's a bunch of nonsense and everybody knows it.
|
| We control our fate, through knowledge and reason, through a collective commitment to good. Our predecessors did it, without the institutions and mechanisms and knowledge they bequeathed to us. With our inheritance we couldn't have it easier; what are we bequeathing to the next generation? Despair? Corruption and war? What a shame that would be, with all we were given.
| xyzzyz wrote:
| What do you mean that it is _now_ oriented towards partisanship? This has been true for at least a century. The recent Dobbs decision is as much the result of partisan efforts to institute policy opposed by the majority of the country as the original Roe decision was.
| | The court has handled complex technical issues many times | before. This, along with partisanship, is nothing new. | wolverine876 wrote: | > The recent Dobbs decision is as much the result of partisan | efforts to institute policy opposed by the majority of the | country as the original Roe decision was. | | It's not just one decision, but would you provide support for | that claim? Dobbs was decided by conservatives put on the | court specifically to make that decision, which they executed | promptly, along with other conservative priorities. Roe was | decided by conservatives also, and they weren't put on the | court to rule on abortion. | xyzzyz wrote: | > Roe was decided by conservatives | | Roe was decided by the Burger court, which, according to | Wikipedia, "is generally considered to be the last liberal | court to date". It was heavily based on the Griswold v. | Connecticut precedent from the Warren court, generally agreed | to be the most liberal Supreme Court in US history. Both of | these verdicts were and continue to be widely criticized by | conservatives as being based on extremely dubious | reasoning. I don't know what made you think that Roe was | "decided by conservatives". | | There is a lot of historical revisionism around these | issues, with many people making blatantly false claims, | either lying or being themselves mistaken. The result is | that people who have not lived through it, or who have not | studied the history diligently, are very much misled as to | the facts, because the media, which is very good and active | at correcting lies and falsehoods spread by conservatives, | takes approximately zero effort to correct falsehoods | spread by liberals (often, in fact, it acts with the clear intent | of spreading misapprehensions, through selective reporting and | careful omission of facts).
| gcanyon wrote: | "last liberal court" -- it remains the case that 6 of the | 9 justices on the court that decided Roe were put there | by Republican Presidents. | | That's not a lock that they were in fact "conservative," | but four of them were put on the court by Nixon, and | regarding Blackmun, "The Justice Department including | future Chief Justice William Rehnquist investigated | Fortas at the behest of President Richard Nixon who saw | the idea of removing Fortas as a chance to move the Court | in a more conservative direction, and Attorney General | John N. Mitchell pressured Fortas into resigning." | https://en.wikipedia.org/wiki/Abe_Fortas | | So four were appointed by Nixon, who specifically had in | mind moving the court to a more conservative stance. The | fact that he failed miserably with Blackmun | notwithstanding, the only thing that can honestly be said | of the court at the time is that it was less conservative | than the courts that followed, not that it was liberal. | mindslight wrote: | I know it's much less convenient, but can we stop referring | to the Republican party as "conservatives"? In this case | they reversed a precedent that had been in place for two | generations, with a justification firmly rooted in | collectivism. | yladiz wrote: | How would you prefer the Republican Party be referred | to? | mindslight wrote: | "Republicans" or "Republican Party" works - we don't need | a synonym. If we really want to talk about views | independent of the party, then let's characterize each | view individually rather than as a group. | | What doesn't make sense is taking a group of positions | that were conservative in the 70's, carrying them into | the current day after society has changed significantly, | and then talking about them as if they still represent a | slowing of change rather than a radical departure. | jimbob45 wrote: | Didn't Dobbs return the decision to the states?
Isn't having | the decision decided by majority vote the least partisan | thing you can do, by definition? | | If they had truly taken a partisan stance, they would have | unilaterally decided to ban abortion based on specious | reasoning not unlike Roe's. | epakai wrote: | The decision was (more or less) up to individuals before. A | state decision is inherently more partisan. | xyzzyz wrote: | I don't think that most people would agree that | anarcholibertarianism (the system which allocates the fewest | decisions to the state) is the least "partisan" option. | That's not what people understand by partisanship. | mensetmanusman wrote: | After Dobbs the US is more like the EU. | wolverine876 wrote: | I think we all know very well the partisanship involved, | despite the theoretical questions (which might be | interesting in another context). | | Majority rule is the most partisan thing. | einpoklum wrote: | > Isn't having the decision decided by majority vote the | least partisan thing you can do by definition? | | If one party believes something to be an individual right | and another party believes it to be a matter for | collective/state decision, then no. | | (Not that the Democratic party fully sees abortion as an | individual right of the mother - after all, the Roe v. Wade | decision did not really consider it as such, nor did it | legitimize abortion throughout the pregnancy term; and the | Democratic party generally supports Roe v. Wade. It has | also not tried to put the matter into federal legislation | in the 40-odd years between Roe and Dobbs.) | jlawson wrote: | >The reactionary conservative movement, with its campaign to | politicize everything | | I'm sorry, but which movement came up with the slogan 'the | personal is political'? | | Which one has entire academic departments dedicated to | 'problematizing' everything from the skin color of LOTR orcs to | dog walking?
| golemotron wrote: | > I'm concerned about the US Supreme Court's ability to handle | a subtle, technical issue tied to new media. They need to take | into account all the stakeholders, rights, law, and the | dynamics of social media, and come up with an innovative | solution that meets all needs fairly and clearly. | | Nope. That's what legislators do. | | The Supreme Court's job is to decide whether the safe harbor in | Section 230 applies to companies when they are exercising | editorial control. | tptacek wrote: | Our court system routinely handles far more technical issues | than online publishing. Like every modern country, we regulate | everything from water reclamation to aviation. | wolverine876 wrote: | I'm talking about SCOTUS. Can you give examples there? Trial | (district) and circuit appellate courts are different - | though they also have been politicized to degrees. | | Regulation is handled in the executive branch. Just because | we regulate something, or it's tried in court, doesn't mean it's | done well. That's the issue. | yladiz wrote: | Isn't regulation (mostly) handled by the legislative branch | and enforcement handled by the respective department | (sometimes executive)? | oneoff786 wrote: | Those departments are largely executive agencies. The | executive branch is supposed to enforce but over time it | began to do both. The court is pulling this back now. | blindriver wrote: | A more important point about Reddit specifically is that when | Reddit finally goes IPO, all the mods get ZERO. The investors and | the engineers and all the workers at Reddit will become rich, but | the people who do the most important work, as outlined by the | brief, are the anonymous moderators who create the culture of | every subreddit. They get NOTHING. | | I find it amazing that this hasn't been brought up by the | moderators themselves, and that they're okay with all the income and | profits from their hard work being literally picked away by Reddit.
| They don't even share in the profits of the advertising revenue | from their subreddits! It's really incredible to me. | dmix wrote: | > but the people who do the most important work, | | Strange, I thought that was the people contributing and | curating the content, i.e. the posts and comments. They also do | it for free. Should they be paid too? | blindriver wrote: | And to reply in a single comment: | | Reddit is nothing without its moderators. The moderators create | engagement through their hard work. You can take a great topic, | and it will die in the hands of bad mods. Reddit owes its | entire existence to its mods. | | The equivalent is Twitter's or TikTok's algorithm. Reddit has | tricked thousands upon thousands of people into doing the hard work | for free. | | They SHOULD get paid by reddit. Or maybe reddit should adopt | a non-profit structure and never IPO. But the idea that all | these reddit employees will eventually become millionaires on | the backs of free labor is disgusting. | | It's funny how HN loves to shit on Lyft/Doordash/Uber for | exploiting drivers, meanwhile if reddit mods get paid it's | somehow dishonorable. For the record, Uber gave shares to some | of its best drivers upon IPO. | ncr100 wrote: | Shows the need for digital signatures on any creative output | of any individual. | | People matter too. | secondcoming wrote: | You'll either see subreddits add lots of mods, or cull them. | For example, /r/ukpolitics has 22 probably-human mods [0] | | [0] https://www.reddit.com/r/ukpolitics/about/moderators | mongodooby wrote: | What these moderators seem to crave the most is power. The | majority of them are unemployed loners with lots of time to | kill, who are unappreciated in real life and don't have much | power over their own lives. So they enjoy the small sliver of | power that Reddit gives them. | | There are also the agenda-pushers, who obtain moderator | positions in order to control the narrative on Reddit.
Just | look at how certain ideological views are essentially | uncriticizable on that site. | | If you can get access to the Discords (or the leaks thereof) | where these moderators think they are discussing things in | private, it's a fascinating insight into their culture. Some of | them occasionally acknowledge that they're doing all this work | for Reddit for free, though it's a somewhat taboo topic too. | AlbertCory wrote: | It's apparent from my limited experience there that you are | right: some subreddits are dominated by a certain demographic | group, and they Report anything they don't like as | "offensive." Then the mods dutifully remove it, since they're | only interested in pleasing their group to keep their power, | such as it is. | | something, something, "power" which Lincoln apparently didn't | say: | | [1] https://www.snopes.com/fact-check/lincoln-character- | power/ | SanjayMehta wrote: | > The majority of them are unemployed loners with lots of | time to kill, who are unappreciated in real life and don't | have much power over their own lives. So they enjoy the small | sliver of power that Reddit gives them. | | Nailed it. Reddit went completely off the rails during the | Pao reign. Never figured out what they get out of it apart | from a power trip. | Aloha wrote: | There is more to life than just money. | | Reddit is the vehicle - a mere tool - people use to help build | community around an interest - before Reddit, it was yahoo | groups, or you had to go host your own forum somewhere. | | So while yes, Reddit benefits from their work and content, | keeping reddit sustainably funded also keeps alive all of those | communities. | jmyeet wrote: | This cuts both ways. If there's more to life than money, why | not spread it around to those who create the value of the | thing you're selling? | | "There's more to life than money" is used by capital owners | to justify the exploitation of labor. 
| montagg wrote: | I imagine you're going to get a very different type of | person moderating if they have a monetary incentive to do | so. I don't disagree with your argument in the broad sense | --the rich get richer at the expense of everyone else, and | they shouldn't--but adding money to a relationship that | doesn't have it _always_ fundamentally changes the | relationship and the incentive structure, and it doesn't | guarantee better outcomes. | AlbertCory wrote: | I have an application pending to the LinkedIn group of ex-Oracle | employees. It was ignored for months and months. | | Finally, I wrote to one of the admins. He apologized and | said he had a lot of groups he was admin for, and asked, | now which group was I talking about, again? | | Does LinkedIn pay their admins? Don't know. | Aloha wrote: | I moderate communities on Telegram. I already have people | who treat me and my mod team as if we are being paid and | this is our full-time job, and that we should be more | responsive to their concerns. I wouldn't want to be | actually paid and have an expectation of quality of | service. | KennyBlanken wrote: | This is a really charitable view of things. | | Most of reddit is controlled by "powermods" who mod anywhere | from dozens to hundreds of subreddits. They collect them for | clout and don't give the slightest damn about their | communities, don't participate in them, don't have time to | even spend time reading them and getting a feel for them, | etc. They're not trained or educated in community management | for the betterment of said community. | | Nobody has the sort of free time it takes to moderate an | active community, especially unpaid - and therefore they must | be getting paid by someone other than reddit. I'm convinced | that a large number of subreddits are moderated by accounts | that are controlled, directly or indirectly, by advertising, | PR, and reputation management firms - and government | agencies, ranging from intelligence to "PR."
Either directly, | or via payoffs to promote or suppress certain subjects, | topics, and types of posts. | | I think there's a reason Ghislaine Maxwell - whose father was | an intelligence agent - was a reddit powermod. | honkler wrote: | [flagged] | Aloha wrote: | Most volunteers for any organization are neither trained nor | educated in the thing they're volunteering to do; that's why | they're volunteers and not paid labor. As soon as you start | putting requirements for compliance training or what have you | on volunteers, they start needing to be paid. | wnevets wrote: | > A more important point about Reddit specifically is that when | Reddit finally goes IPO, all the mods get ZERO. | | Some mods have definitely been making money controlling the | content that appears on the popular subs, e.g. [1] | | [1] | https://www.reddit.com/r/trees/comments/oh5o4/rtrees_nonprof... | googlryas wrote: | They get to moderate the discussion of the message board, which | is what they want to do. It isn't a job - the admins actually | get paid. | | Why would the moderators get anything for creating a message | board using a free message board service? | | If phpBB had IPOed 15 years ago, would everyone who downloaded and | hosted an instance deserve a cut of the IPO? | [deleted] | robswc wrote: | Reddit mods, for the most part, don't need money. I imagine | there are tons who would even pay money to "remain" as a mod if | they could. | | Many reddit mods are motivated by nothing more than some | semblance of power over other people. Just look at the way one | of the big subreddits operates: | | https://www.gamerevolution.com/news/931803-reddit-art-subred... | | I've personally been banned from a handful of subs all at once | (due to the mod being a mod on all of them) because I argued | with a moderator that humidity is worse than dry heat. | | No, they don't get "zero" - they get their purpose. Otherwise, | they would get a job.
Coincidentally, you almost have to be | jobless or have next to no other obligations to be a mod on | reddit. I personally created and "mod" a community that's not | very big at all but still requires cleaning up... but I could | never see myself justifying more than 15 minutes a day to that | "job." | this_user wrote: | It hasn't been brought up because it is ridiculous. Reddit is | a platform to create communities, and people operating subs are | reddit's customers. By the same logic you could argue that | everyone posting on Twitter should have participated in their | IPO; the same goes for Facebook, TikTok, and whatever platform | you can imagine. Why isn't Discord paying the server admins? HN | would be nothing without people posting links. Should OP be | paid by YCombinator for this thread? | blindriver wrote: | The mods are NOT the commenters. There's a difference. | Twitter moderates its own content via algorithms. Reddit | still has comments in the same way Twitter does. But | moderators literally moderate the content to make sure it's | good and engaging, and THAT'S what drives reddit. | pjot wrote: | The mods are also moderating voluntarily. | [deleted] | washadjeffmad wrote: | Hah. Community management, including moderation, is a job. We | know this because of the AOL Community Leader Program: | | https://en.m.wikipedia.org/wiki/AOL_Community_Leader_Program | | I'm torn between disgust and embarrassment for the people | providing free, unofficial support for trillion-dollar tech | companies via Reddit Requests. They literally get nothing: no | job placement, no recognition, no pay or benefits, and no way | to be made whole for providing services that have a real, | demonstrable impact on company perception and operation. | whatshisface wrote: | It's funny, but although there's not even the slightest | thread of a relationship as the legal system would see it, we | are in a sense "the authors of Twitter."
| World177 wrote: | Discord does have a revenue share. [1] Elon Musk also | recently stated that he planned to add creator monetization | to Twitter. [2] | | From Discord | | > Good news: It's a 90/10 split! This means you, the creator, | get to keep 90% of each monthly Server Subscription you sell, | minus some small processing fees for legal's sake. | | From Elon Musk | | > Followed by creator monetization for all forms of content | | [1] https://discord.com/creators/server-subs-101-earning-money-o... | | [2] https://twitter.com/elonmusk/status/1589010272341340160 | cactusplant7374 wrote: | They also get to make arbitrary decisions with no | accountability. Essentially, the site is their personal | fiefdom. Why would they waste their time unless they were | benefiting in some way? | raverbashing wrote: | Like one prominent example some years ago (gallowb...), and | I'm sure there are more modern ones as well | 2OEH8eoCRo0 wrote: | Why is there always, "but they're getting rich!" | | Who cares? I'm sure the mods are aware. They do it anyway | because it's beneficial having a healthy community that they | are also users of. | fnordpiglet wrote: | Probably for the same reason people waste their time playing | video games or reading science fiction. Scoundrels!! | cactusplant7374 wrote: | In those cases you aren't exercising control over someone | else's creative expression. | fnordpiglet wrote: | You're ascribing a negative motive by establishing that one | could exist. Surely there are petty tyrants on Reddit. | But the much simpler, most generous, and most likely | explanation is that they enjoy a hobby and enjoy fostering a | healthy community around their hobby, and that this | itself is a hobby of theirs. The word moderator wasn't | invented for social media, and the role is crucial in any | healthy intellectual community.
Because your creative | expression may very well be disruptive to everyone | else's, and while you're free to be creative in your | expressions in general, others are free to exclude you | from their community. The moderator gets the unpleasant | job of being the executor of that will. | | I think the most important thing is most moderators I | know _hate_ the function of moderator. But the community | is important enough they do it anyways. | | So, I guess I take it back. It's more like producing a | video game or editing science fiction. It's work we do to | be sure everyone can enjoy a hobby we love. | breck wrote: | Exactly. As soon as you get subs over 100K you get anon mods | out there accepting bribes, kickbacks, and/or on weird power | trips. | | Bravo to the mods on the tiny subs, but in my experience | (both as a mod of a top sub and as a contributor) the anon | mod behavior becomes 20% toxic when a sub gets popular. | no_wizard wrote: | Reddit has a complex ownership history. At one point Conde Nast | actually owned Reddit as a subsidiary (majority ownership) and | now their parent company does. I am sure it may IPO eventually | but I'm not entirely sure the employees will make out like | bandits | ghufran_syed wrote: | The same is true of parents: they "do the most important work" | of creating and raising their children, but after they become | adults, the parents have NO legal rights to any income from | their children. "They get NOTHING" | | Is that really true in the case of parents? If not, might the | same reasons (e.g. intrinsic motivation, non-financial rewards | such as inter-personal bonds) possibly apply here too? | | I think it _would_ be interesting though to think about what | sort of monetization structure might encourage the development | of more healthy communities | | [Warning: all analogies are "wrong", but some are useful...] 
| h2odragon wrote: | > parents have NO legal rights to any income from their | children | | not all parents believe that. | | Some people raise their kids to be their slaves. The results | get ugly. | willnonya wrote: | While I support upholding Section 230, I find irony in reddit | proclaiming its support for users and everyday citizens. The | majority of the site is overrun by zealots and trolls who | relentlessly punish any non-conformance with their chosen | orthodoxy. Their moderation rarely maintains communities but | rather restricts discussion, dissent, and freedom of expression. | | This is like the Stasi proclaiming support for privacy laws. | llanowarelves wrote: | A while back, somebody made a moderator graph that showed which | dozens and dozens of subreddits individual moderators | moderated. You could see their "web". | | It not only confirmed the leanings and behavior, but did so | undeniably. | | It is why "redditor" exists as an insulting term, while | fb/twitter/etc don't have one. Not necessarily just the users, | but the mods too. | paulpauper wrote: | This was way worse during Covid, I know, even though before | Covid it was already very bad. People were banned just for | asking questions, merely for dissenting from the narrative, | which was constantly changing, like about masks. | breck wrote: | [flagged] | [deleted] | noirbot wrote: | I was under the impression we often allowed anonymous testimony | in courts for criminal trials, often to protect the identity of | the witness and avoid the opportunity for them to be coerced or | intimidated. | | I imagine often the situation is that your identity is | verified, but just not entered into the public record? | | If anything, the point of it is that "we should not listen | to cowards" is the justification for the worst kind of | heckler's veto. It's the exact reason it was hard to get | testimony against organized crime.
If there's a good chance | you'll be physically attacked, kidnapped, or financially ruined | for trying to bring justice, you're simply encouraging more | violence and threats in order to protect those already in | power. | breck wrote: | > I was under the impression we often allowed anonymous | testimony in courts for criminal trials | | We specifically do not. The 6th Amendment: "...to be | confronted with the witnesses against him" | | https://www.law.cornell.edu/wex/right_to_confront_witness | ameister14 wrote: | I don't think it's 'often', and it's only in cases where there | is a real and credible threat of harm to the witness. | | While that might make it harder to go after organized crime, | the US believes it is important that, when you are accused of | doing something, you get to confront your accuser. | | There's a really interesting law review article about this | from 2020 which gets into the idea that, through misconduct, | a defendant can waive their right to confront the witness at | trial. Here's a link: | https://digital.sandiego.edu/sdlr/vol39/iss4/3/ | dragonwriter wrote: | > Freedom of Speech does not give you freedom of Anon Speech. | | The Supreme Court largely disagrees. | | https://www.mtsu.edu/first-amendment/article/32/anonymous-sp... | breck wrote: | There is nothing in that article about a right to anon speech | in a courtroom. I'm assuming that was your best argument, | proving my point. | [deleted] | codingdave wrote: | To be clear, from the article: "Reddit received special | permission from the Supreme Court to include anonymous comments | from Reddit mods in its brief." | | This is not saying the court decided anything - it is just saying | it is allowing commentary from mods to be submitted to the court | without knowing their identity. | lucb1e wrote: | I'm apparently out of the loop enough that I understand all the words | in the article but still have no idea what it's talking about.
| From context, I'm getting that it has something to do with | volunteering and needing anonymity and immunity for when you | accidentally allow terrorists to recruit followers? And there | already exists a law, Section 230, for this, but the lawsuit tries to | get it declared invalid? Can someone share maybe a short comment | on what actually is going on here? | jcranmer wrote: | Ostensibly, the issue presented is this: | | > Whether Section 230(c)(1) of the Communications Decency Act | immunizes interactive computer services when they make targeted | recommendations of information provided by another information | content provider, or only limits the liability of interactive | computer services when they engage in traditional editorial | functions (such as deciding whether to display or withdraw) | with regard to such information. | | The background is that the family of a victim of a terrorist | attack is suing Google for hosting an ISIS recruitment video | on Youtube, for which Google has clear and undisputed immunity | because of §230, and the plaintiffs are trying to find any | argument they can stand on to make the suit stick (they've lost | at all lower levels of the court). | | What I quoted above was the explicit question presented to | SCOTUS, but from reading some of the actual briefs, there's | almost no discussion of this actual question, with everyone | instead wanting to discuss §230 as a whole rather than its narrow | application to recommended content. | | Of the briefs I did skim, I liked the US solicitor general's | position the best: recommendations _are not_ protected by | §230, but it's not enough to say that the recommendation engine | produced objectionable content, since the content itself is | protected.
Essentially, the recommendation would have to be in | some way unreasonable, and the burden of showing that | unreasonableness presumably falls on the plaintiffs, and these plaintiffs are | clearly unable or unwilling to properly make those allegations. | i_hate_pigeons wrote: | here | https://www.reddit.com/r/reddit/comments/10h2fz7/reddits_def... ___________________________________________________________________ (page generated 2023-01-22 23:00 UTC)