[HN Gopher] Facebook executives shut down efforts to make the si...
       ___________________________________________________________________
        
       Facebook executives shut down efforts to make the site less
       divisive
        
       Author : longdefeat
       Score  : 771 points
       Date   : 2020-05-26 16:17 UTC (6 hours ago)
        
 (HTM) web link (www.wsj.com)
 (TXT) w3m dump (www.wsj.com)
        
       | daenz wrote:
       | Facebook discovers profitable strategy that news organizations
       | have been using for decades.
        
         | forgingahead wrote:
         | This is the most relevant comment to this discussion. News orgs
         | have been profiting off the same negative elements of human
         | society for decades.
        
           | tzs wrote:
           | News organizations present a limited, curated view from fact
           | checked, verified sources. The information flow is mostly one
           | way, from the news organization to me.
           | 
           | A social media news feed might present the same underlying
           | story to me, but via some opinion blog that has not fact
           | checked it or verified sources. It might also come with
            | assorted speculation by the poster, ranging from wild-ass to
           | outright insane conspiracy theories.
           | 
           | And social media is designed to get me to offer my opinion on
           | it, and to see other people's opinion, and for all of us who
           | read it to discuss it in a semi-pseudonymous free for all.
           | 
           | The news organization approach is much more effective if the
           | goal is to actually inform people about the negative event.
        
         | 082349872349872 wrote:
         | They even have the same business model, in which users are not
         | the customers. If you are Sylvester McMonkey McBean, you do not
         | want to place ad impressions in groups of star- and plain-
         | bellied sneetches who share an interest in underwater basket
         | weaving. You will happily spend to place impressions for star-
         | on machines among groups of plain-bellied sneetches, and star-
         | off among star-bellied.
        
         | Wohlf wrote:
         | Centuries actually:
         | https://en.wikipedia.org/wiki/Yellow_journalism
        
         | cynusx wrote:
          | That sounds right. Fear and conflict drive higher engagement.
          | Although it makes business sense to chase higher engagement, I
          | wonder how much of people's distrust of Facebook the brand is
          | just a reflection of how people feel when engaging with the
          | product.
        
       | alkibiades wrote:
       | the media really just wants fb to prevent division by only
       | allowing a center-left world view.
        
       | majky538 wrote:
        | I remember content from friends, then no related content while
        | Facebook was testing feed changes, and now it's mostly meme
        | pictures and group posts. I forgot that there are any
        | "friends".
        
       | codermobile wrote:
       | You
        
       | astrophysician wrote:
       | As a total outsider following this from a distance, I sort of
       | feel for Facebook and other social media platforms facing this
       | problem -- they've run up against a fundamental issue for which
       | there doesn't seem to be any satisfying solution. Misinformation
       | and propaganda are rampant on their platforms definitely, and
       | echo chambers that reinforce divisive worldviews have probably
       | deepened real societal divisions, but how do you actually
       | implement a policy to stop this? What is "propaganda"? What is
       | "misinformation"? The entire core of Facebook's existence is
        | advertising, which means user engagement and reach are the only
        | things that drive your bottom line; they _want_ to drive users to
        | Facebook and keep them there, and keep them engaged. They've
       | just happened to discover a universal human truth along the way,
       | which is that people _like_ feeling validated, and people _like_
       | being a member of a tribe. Facebook is the way it is because
        | that's what users _want_, whether they will admit to it or not.
       | 
       | Anything that Facebook does will be perceived as making a
       | political and/or moral statement, which they obviously are trying
       | very hard not to do, because as soon as you take a position you
       | alienate half of the population (at least in the US). They've
       | apparently decided to go the route of burying their heads in the
       | sand instead of _trying_ to make things less tribal and divisive,
       | which in all honesty is a pretty understandable position to take,
       | and yet _even while actively trying not to piss off
       | conservatives_ they have still landed in hot water over perceived
       | favoritism towards the left. They are damned if they do and
        | damned if they don't.
       | 
       | So honestly, what is the proposed solution here? What would you
       | do if you were in Zuckerberg's shoes? Do you campaign for
       | regulations that take this issue off of your hands but that let
       | the government call the shots somehow? Do you look at your board
       | members with a straight face and tell them you're going to tank
       | user engagement for some higher, squishy moral purpose for which
       | there is no clear payoff?
        
       | fpgaminer wrote:
       | You know what the internet needs? User agents.
       | 
       | We've got this idea stuck in our heads that only the website
       | itself is allowed to curate content. Only Facebook gets to decide
       | which Facebook posts to show us.
       | 
       | What if, instead, you had a personal AI that read every Facebook
       | post and then decided what to show you. Trained on your own
       | preferences, under your control, with whatever settings you like.
       | 
       | Instead of being tuned to line the pockets of Facebook, the AI is
       | an agent of your own choosing. Maybe you want it to actually
       | _reduce_ engagement after an hour of mindless browsing.
       | 
       | And not just for Facebook, but every website. Twitter, Instagram,
       | etc. Even websites like Reddit, which are "user moderated", are
       | still ultimately run by Reddit's algorithm and could instead be
       | curated by _your_ agent.
       | 
       | I don't know. Maybe that will just make the echo chambers worse.
       | But can it possibly make them worse than they already are? Are we
       | really saying that an agent built by us, for us, will be worse
       | than an agent built by Facebook for Facebook?
       | 
       | And isn't that how the internet used to be? Back when the scale
       | of the internet wasn't so vast, people just ... skimmed
       | everything themselves and decided what to engage with. So what
       | I'm really driving at is some way to scale that up to what the
       | internet has since become. Some way to build a tiny AI version of
       | yourself that goes out and crawls the internet in ways that you
       | personally can't, and return to you the things you would have
       | wanted to engage with had it been possible for you to read all 1
       | trillion internet comments per minute.
        
         | devonkim wrote:
          | I think transparency matters more. I liked Andrew Yang's
          | suggestion to require the recommendation algorithms of the
          | largest social networks to be open sourced, given how they can
          | shape public discourse. Advertising in all mass media is
          | regulated to prevent outright lies from being spread by major
          | institutions (although an individual certainly may do so).
        
           | anigbrowl wrote:
           | Not the recommendation engines. The graph. All the social
           | media companies (and indeed Google and others) profit by
           | putting up a wall and then allowing people to look at
           | individual leaves of a tree behind the wall, 50% of which is
           | grown with the help of people's own requests. You go to the
           | window, submit your query, and receive a small number of
           | leaves.
           | 
           | These companies do provide some value by building the
           | infrastructure and so on. But the graph itself is kept
           | proprietary, most likely because it is not copyrightable.
        
           | closeparen wrote:
           | >advertising in all mass media is regulated to prevent
           | outright lies from being spread
           | 
           |  _Advertising_ in mass media is regulated. You are very much
           | allowed to publish claims that the government would
            | characterize as outright lies, you just can't do it to sell
           | a product.
        
           | root_axis wrote:
           | Setting aside the concerns about the efficacy of the idea, it
           | also seems like an arbitrary encroachment on business
           | prerogatives. I think everyone agrees that social media
           | companies need more regulation, but mandating technical
           | business process directives based on active user totals isn't
            | workable, not least because the definition of
           | "active user" is highly subjective (especially if there is an
           | incentive to get creative about the numbers), but also
           | because something like "open source the recommendation
           | algorithm" isn't a simple request that can be made on demand,
           | especially with the inevitable enfilade of corporate
           | lawyering to establish battle lines around the bounds of
           | intellectual property that companies would still be allowed
           | to control vs that which they would be forced to abdicate to
           | the public domain.
        
           | SkyBelow wrote:
           | Does that actually work? If they create some complex AI and
           | then show us the trained model, it doesn't really give much
           | insight into the AI doing the recommendation. You could
           | potentially test certain articles to see if it is
           | recommended, but reverse engineering how the AI recommends it
           | would be far more time consuming than updating the AI. As
           | such Facebook would just need to regularly update the AI
           | faster than researchers can determine how it works to hide
           | how their code works. Older versions of the AI would
           | eventually be cracked open (as much as a large matrix of
           | numbers representing a neural network could be), but between
            | it being a trained model with a bunch of numbers and Facebook
            | having a newer version, I think they'll be able to hide behind
           | "oops there was a problem, but don't worry our training has
           | made the model much better now".
        
           | s1t5 wrote:
           | Open sourcing the algorithms (however we define it) does
           | absolutely nothing. What use is a neural network
           | architecture? Or a trained NN with some weights? Or an
           | explanation that says - we measure similar posts by this
           | metric and after you click on something we start serving you
           | similar posts? None of those things are secret. More
           | transparency wouldn't change anything because even if
           | completely different algorithms were used, the fundamental
           | problems with the platform would be exactly the same.
        
             | banads wrote:
             | It's silly to so confidently assert that opening up a
             | closed source algorithm to 3rd party analysis will "do
             | absolutely nothing". How could you possibly know there is
             | nothing unusual in the code without having audited it
             | yourself?
             | 
             | Seeing how the sausage gets made certainly can make lots of
             | people lose their taste for it.
        
         | pwdisswordfish2 wrote:
         | Not a new idea. One example was early marketing of Apache
         | Nutch.
        
         | [deleted]
        
         | ssss11 wrote:
          | I agree - wasn't the browser intended to be the user agent?
          | And as a counterpoint to some of the replies to you, surely
          | people can just pay instead of sites being ad-based; what
          | other industries operate in this absurd way? The public must
          | think there's no cost to creating software if everything's
          | always free.
        
         | grishka wrote:
         | ActivityPub and other federated networks are the answer. They
         | do exactly that: if you aren't satisfied with the rules on
         | existing servers, you host your own. The network itself is wide
         | open, and its control is distributed across many server admins.
         | The way the content is presented is of course completely up to
         | the software the user is running. Having no financial incentive
         | to make UX a dumpster fire visible from space also helps a lot.
        
           | anigbrowl wrote:
           | They're not the answer as long as they don't have loads of
           | people. The attraction of FB and the like is that almost
           | everyone has a FB account, just like almost every public
           | figure has a twitter account. The downside of things like
           | Mastodon is how do you know what server you want to connect
           | to? For a non-technical user it doesn't offer any more
           | obvious utility than a FB group.
        
             | grishka wrote:
             | There is indeed the problem of discovery that Mastodon
             | doesn't feel like addressing. Like, you pick a server, make
              | an account and _now what_? There's no way to bring your
             | existing social graph with you. Even if your friends are
             | there, you won't ever find them without asking each and
             | every one about their username@domain. But I have some
             | ideas on fixing that for my fediverse project -- like
             | making a DHT out of instance servers, thus making global
             | search possible while keeping the whole thing
             | decentralized.
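
        A consistent-hash ring is one way to sketch the "DHT out of
        instance servers" idea above: every instance can compute which
        server is responsible for a given username@domain key without any
        central index. The instance names and the single-successor scheme
        here are illustrative assumptions, not the commenter's actual
        design:

        ```python
        import hashlib
        from bisect import bisect_right

        def ring_hash(key):
            # Stable position on the ring for instances and usernames alike
            return int(hashlib.sha1(key.encode()).hexdigest(), 16)

        class InstanceRing:
            def __init__(self, instances):
                # Each instance sits at its own hash; a key belongs to the
                # first instance clockwise from the key's position
                # (wrapping around at the end of the ring)
                self.ring = sorted((ring_hash(i), i) for i in instances)
                self.positions = [h for h, _ in self.ring]

            def responsible_for(self, user_at_domain):
                idx = bisect_right(self.positions, ring_hash(user_at_domain))
                return self.ring[idx % len(self.ring)][1]
        ```

        Any node that knows the instance list resolves a lookup locally,
        which is what makes global search possible while staying
        decentralized.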
        
         | adrianmonk wrote:
         | I like the line of thinking, but who actually provides the
         | agent, and what are their incentives?
         | 
         | This is far from a perfect analogy, but compare it to the
         | problem of email spam. People first tried to fight it with
         | client-side Bayes keyword filters. It turns out it wasn't
         | nearly as simple as that, and to solve a problem that
         | complicated, you basically need people working on it full time
         | to keep pace.
         | 
         | Ranking and filtering a Facebook feed would have different
         | challenges, of course. It's not all about adversaries (though
         | there are some); it's also about modeling what you find
         | interesting or important. But that's pretty complicated too.
         | Your one friend shared a woodworking project and your other
         | friend shared travel photos. Which one(s) of those are you
         | interested in? And when someone posts political stuff, is that
         | something you find interesting, or is it something you prefer
         | to keep separate from Facebook? There are a lot of different
         | types of things people post, so the scope of figuring out
         | what's important is pretty big.
        
         | gscott wrote:
         | Facebook figured out how to bring a Usenet flame war to the
         | masses and profit from it, job well done!
        
         | meshaneian wrote:
         | I like this, and so does my friend Confirmation Bias, who is
         | pretty clear that the AI would select completely unbiased
         | content relevant to me, not limited by any of the Bias family.
         | It would be 100% better than the bias filters in place now,
         | because my thoughts and selections are always unbiased, IMHO.
         | (FYI: Obviously I'm not being serious. You clearly knew that,
         | this notice is for the other person who didn't.)
        
         | joejerryronnie wrote:
         | Why would a personal AI which curates your content be any
         | "better" than FB's AI which curates your content? Isn't the
         | current AI based on what you end up engaging with anyway? If
         | you naturally engage in a variety of content across all
          | ideological spectrums, then that's what the FB AI is going to
         | predict for you. Unfortunately, the vast majority of us engage
         | with content which reinforces our existing worldview - which is
         | exactly what would happen with a personal AI.
        
           | TeMPOraL wrote:
           | Because an algorithm under your control can be tweaked by
           | you. Could be as simple as reordering topics on a list of
           | preferences. Facebook's algorithm can't be controlled like
           | that. Also, an algorithm you own won't change itself
           | unbeknownst to you.
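
        The "reordering topics on a list of preferences" control could be
        as small as the sketch below. The post dicts and topic labels are
        hypothetical, purely to illustrate a user-owned ranking knob:

        ```python
        def rank_by_topics(posts, topic_order):
            # topic_order is the user-edited preference list; earlier
            # entries rank higher, and unknown topics sink to the bottom
            position = {t: i for i, t in enumerate(topic_order)}
            return sorted(posts,
                          key=lambda p: position.get(p["topic"],
                                                     len(topic_order)))
        ```

        Reordering the list immediately and transparently reorders the
        feed, which is exactly the property a platform-owned ranker lacks.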
        
         | munificent wrote:
         | The fundamental flaw is this:
         | 
          | The primary content no user wants to see and every user agent
         | would filter out is _ads_. Since ads are the primary way sites
         | stay in business, they are obligated to fight against user
         | agents or other intermediary systems.
         | 
         | The ultimate problem is that Facebook doesn't want to show you
         | good, enriching content from your friends and family. They want
         | to show you ads. The good content is just a necessary evil to
         | make you tolerate looking at ads. Every time you upload some
         | adorable photo of your baby for your friends to ooh and aah
         | over, you're giving Facebook free bait that they then use to
         | trap your friends into looking at ads.
        
           | naravara wrote:
           | >Since ads are the primary way sites stay in business, they
           | are obligated to fight against user agents or other
           | intermediary systems.
           | 
           | Not all users hate ads in principle, just in practice. In
           | theory, you'd be making the users select ads for relevance
           | and not being annoying. But obviously, the site wants to show
           | ads based on how much they're paying and "not being annoying"
            | only factors in if it pushes people off the site entirely.
        
           | pwdisswordfish2 wrote:
            | "The ultimate problem is that Facebook doesn't want to show
            | you good, enriching content from your friends and family."
            | 
            | Well, it is someone else's website. What do you expect?
            | Zuckerberg has his own interests in mind.
           | 
           | In 2020, it is still too difficult for everyone to set up
           | their own website, so they settle for a page on someone
           | else's.
           | 
           | If exchanging content with friends and family (not swaths of
           | the public who visit Facebook - hello advertisers) is the
            | ultimate goal, then there are more efficient ways to do
           | that without using Zuckerberg's website.
           | 
           | The challenge is to make those easier to set up.
           | 
           | For example, if each group of friends and family were on the
           | same small overlay network they set up themselves, connecting
           | to each other peer-to-peer, it would be much more difficult
           | for advertisers to reach them. Every group of friends and
           | family on a different network instead of every group of
           | friends and family all using the same third party, public
           | website on the same network, the internet.
           | 
           | Naysayers will point to the difficulty setting up such
           | networks. No one outside of salaried programmers paid to do
           | it wants to even attempt to write "user agents" today because
           | the "standard", a ridiculously large set of "features", most
           | of which benefit advertisers not users, is far too complex.
           | What happens when we simplify the "standard"? As an analogy,
            | look at how much easier it is to set up Wireguard, software
           | written more or less by one person, than it is to set up
           | OpenVPN.
        
             | gmax wrote:
              | It may not be _that_ hard to set things up yourself; I
              | have been toying with something like that for messaging:
              | https://cweb.gitlab.io/StoneAge.html. The deeper question
              | is how to sustain this kind of product and make it
              | competitive without comparable funding.
        
             | ggggtez wrote:
             | I don't think that "user-agents" are the hard part either.
             | At this point, I think any grad student would happily write
             | a NN implementation that took various posts as input, and
             | returned to you a sorted list based on your preferences
             | (with input layers like: bag of words, author, links, time,
             | etc, that the user could put more or less weight into just
             | by simple upvote/downvote).
             | 
             | The problem is that no one has the incentive to host such a
              | service for free, and users want the content to be
             | available 24/7. So it's not as simple as just setting up a
             | peer-to-peer network. Users who just use a phone as their
             | primary computer will still want to be able to publish to
             | their millions of followers, and so it wouldn't work to
             | have those millions of people connect directly to this
             | person's device. Maybe you can solve that with a bit-
             | torrent like approach, but the problem gets harder when you
             | include the ability to send messages privately.
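
        A minimal sketch of the user-side ranker described above, assuming
        hypothetical post dicts with "text", "author", and "links" fields.
        The feature schema and the perceptron-style upvote/downvote update
        are illustrative only, not anyone's actual implementation:

        ```python
        from collections import Counter

        def featurize(post):
            # Hypothetical feature schema: bag-of-words counts plus
            # author and has-link indicator features
            feats = Counter(post["text"].lower().split())
            feats["author:" + post["author"]] = 1.0
            feats["has_link"] = 1.0 if post.get("links") else 0.0
            return feats

        class LinearRanker:
            """Linear scorer whose weights the user trains with votes."""

            def __init__(self):
                self.weights = Counter()

            def score(self, post):
                return sum(self.weights[f] * v
                           for f, v in featurize(post).items())

            def feedback(self, post, upvote, lr=0.1):
                # Perceptron-style update: upvotes raise the weights of a
                # post's features, downvotes lower them
                sign = 1.0 if upvote else -1.0
                for f, v in featurize(post).items():
                    self.weights[f] += sign * lr * v

            def rank(self, posts):
                return sorted(posts, key=self.score, reverse=True)
        ```

        The simple upvote/downvote loop is the whole training interface,
        which is what puts the weighting under the user's control rather
        than the site's.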
        
               | pwdisswordfish2 wrote:
                | "Users who just use a phone as their primary computer
                | will still want to be able to publish to their millions
                | of followers, and so it wouldn't work to have those
                | millions of people connect directly to this person's
                | device."
               | 
               | You have shifted the discussion from small overlay
               | network for friends and family to large overlay network
               | for "millions of followers".
               | 
               | Those methods of sharing content with "millions of
               | followers" are already available and will no doubt
               | continue to be available.
               | 
               | A small private network is a different idea, with a
               | different purpose. The only thing it can potentially
               | "replace" is using those large public networks to share
               | content with small groups of friends and family.
               | Nevertheless, people will always have the choice of using
               | a public network.
               | 
               | There is no requirement that a service has to be "free",
               | or supported by ads. This is something else you injected
               | in the discussion. I use free software to set up
               | overlays, but I have to pay for internet service and
               | hosting. The cost of the "service" is not the setup it is
               | the internet access and hosting.
        
             | baq wrote:
             | i have a fundamental issue calling a content-curating,
             | psychological-experiment-running platform visited by
             | hundreds of millions of people daily 'someone else's
             | website'. the fact that it is privately owned doesn't
             | matter if nation states use it to wage information wars
             | against other nation states' citizens. to make matters
             | worse, the 'someone else' in question knows about it
             | perfectly well and is fine with it because it means he's
             | showing more ads.
             | 
             | this is plain and simple fucked up.
        
               | pwdisswordfish2 wrote:
               | Well, that is what it is. No one knew a single website
               | could grow so large, but it did. Even though there are
               | thousands of people working for its owner, when reading
               | articles like the OP we are reminded how much control he
               | still has over it. No doubt he still thinks of it as his
               | personal creation. Of course, "99.99999%" of the content
               | is not his. Perhaps most of the people who sign up on
               | Facebook are not employed by nation states but just
               | ordinary people who want an easy way to stay connected to
               | friends and family. Maybe these people should have a
               | better way to stay connected than using a public website.
        
             | StandardFuture wrote:
              | Excuse my ignorance, but doesn't the overlay network setup
              | problem have problems at almost every level of the
              | stack? If there are no definitive technical problems to
              | overcome, why is it not possible to create a mobile app
              | that friends and family could use as their own private
              | network?
              | 
              | Isn't the internet supposed to be every node acting as its
              | own server and client simultaneously anyway? Is the
             | problem just the inability to truly decentralize discovery,
             | registry, and identity authentication of nodes in the
             | network? Or is the problem that most ISPs don't want people
             | operating services out of their homes or off of their
             | phones?
        
           | manquer wrote:
            | That is how it is today. But does it have to be like that?
            | What is the minimum revenue per user required for a service
            | like FB to run?
           | 
            | While everyone is sceptical of whether such a service could
            | reach critical mass to make financial sense, and a brand new
            | FB replacement may not be able to do it, FB itself can
            | certainly offer that as an option without hurting its
            | revenues substantially.
           | 
            | I was sceptical of the value prop for YouTube Premium, but I
            | am constantly surprised how many people pay for it. If
            | Google can afford to lose ad money with YT Premium, I am
            | sure FB can build a financial model around a freemium
            | offering if they wanted to.
        
             | marcinzm wrote:
             | Minimum doesn't matter, the only question is if it's more
             | profitable than the current approach. Facebook makes
             | $9/user/quarter. That's every user no matter how little
             | they use the site.
             | 
             | The issue however is that the users advertisers care about
             | are the ones with disposable income. The users most likely
             | to opt out of ads are the ones with disposable income. Thus
             | the marginal cost to Facebook from such users is
             | significantly more than $9/quarter.
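
        The back-of-envelope version of this argument: $9/user/quarter is
        only $3/month on average, but if the users who would pay to opt
        out skew toward the high-value users advertisers pay a premium
        for, the break-even subscription price is a multiple of that. The
        multiplier below is a made-up assumption, used only to show the
        arithmetic:

        ```python
        def breakeven_monthly_price(arpu_quarter=9.0, value_multiplier=3.0):
            # arpu_quarter: average revenue per user per quarter (the
            # $9 figure from the comment above); value_multiplier: a
            # hypothetical factor by which opting-out users are worth
            # more than the average user to advertisers
            return arpu_quarter / 3.0 * value_multiplier
        ```

        With these illustrative numbers the ad-free tier would need to
        charge $9/month, not $3, to leave Facebook's revenue unchanged.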
        
               | ardy42 wrote:
               | >>> The ultimate problem is that Facebook doesn't want to
               | show you good, enriching content from your friends and
               | family. They want to show you ads. The good content is
               | just a necessary evil to make you tolerate looking at
               | ads.
               | 
               | >> That is how it is today. But does it have to be like
               | that ? What is the minimum revenue per user required for
               | service like FB to run.
               | 
               | > Minimum doesn't matter, the only question is if it's
               | more profitable than the current approach.
               | 
               | Only if you think strictly inside the box.
               | 
                | The real problem here is a misalignment of
                | incentives: Zuckerberg managing Facebook to maximize the
               | metric he's being evaluated on (profit and wealth), not
               | the value provided to society.
        
               | marcinzm wrote:
               | Value to society is subjective and many horrors of the
               | past (and current) were caused by trying to optimize
               | specific definitions of that term.
        
               | TeMPOraL wrote:
               | And many horrors of the past and present were and are
               | caused by optimizing for profits. So it's not like we get
               | to side-step horror-avoidance.
               | 
               | I'd start with treating users as partners, and not
               | cattle.
        
             | henriquemaia wrote:
             | > I am sure FB can build a financial model around a
             | freemium offering if they wanted to.
             | 
             | They probably could. As they could also charge you a
             | premium and then profit two times on top of you -- with
             | your fee and then by selling your data to third parties.
             | Why? Because who would know that was happening?
             | Corporations have no moral compass dictating their actions.
             | The bottom line being what's best for investors.
        
           | KorematsuFred wrote:
            | I am happy to pay $10 a month for a Facebook-like service
            | that suggests engaging, high-quality content without any
            | promotional content.
            | 
            | I am already getting such a service in the form of Android's
            | newsfeed feature on Pixel. It's Google, but it's pretty good.
        
           | monadic2 wrote:
           | > Since ads are the primary way sites stay in business
           | 
           | Flaw? It seems that the point would be to force FB to
           | transact with currency rather than a bait-and-switch tactic.
           | The site would also be more usable if they were forced to
           | change business model.
        
           | panopticon wrote:
           | I think another tangential but related issue is with how
           | these companies measure success. They measure success by
           | engagement, and things that drive the most user engagement
           | aren't usually the best for the user.
           | 
            | YouTube has been getting a lot of flak for this recently.
        
           | rockinghigh wrote:
           | > The good content is just a necessary evil to make you
           | tolerate looking at ads.
           | 
           | You could make the same argument for Google or other online
           | web sites relying on ads as the primary source of revenue.
        
             | tomaskafka wrote:
             | You can, and if you look at Google's actions long term,
             | they do.
        
           | TwoBit wrote:
           | Facebook would be perfectly happy to eliminate all ads if
           | instead you paid a small or even tiny monthly fee. But you
           | won't pay it.
        
             | TeMPOraL wrote:
             | No, they wouldn't, unless advertising gets banned. They'd
             | instead accept your payment _and_ find a way to shove ads
             | in anyway, in a covert or overt way, just as many paid
             | services do, because why leave money on the table?
        
               | rockinghigh wrote:
               | YouTube and Spotify went that subscription route and it
               | works well. There are no ads.
        
           | ngold wrote:
           | Still waiting for that ublock origin web browser.
        
           | a1369209993 wrote:
            | > The primary content no user wants to see and every user
            | agent would filter out is ads.
           | 
           | Not that you're wrong, but: _that 's the fucking point_!
           | 
           | Advertising delenda est.
        
             | arihant wrote:
             | Pay for Facebook then. 1.5% of total YouTube users
             | subscribe for YT Premium. I love how the smartest minds
             | will ignore the most primitive economics. Ads work. For
              | everyone. Except the deluded.
        
               | pessimizer wrote:
               | If everybody paid for facebook, it would have as many
               | ads, if not more. That companies would leave money on the
               | table with no incentive to do so is a bizarre self-
               | justifying myth that people who live off advertising tell
               | themselves.
               | 
               | You pay for cable. Paying customers are a better audience
               | for ads than deadbeats.
        
               | creato wrote:
               | Where did you get that number? That's actually far, far
               | higher than I would have thought and quite encouraging...
        
               | jjjensen90 wrote:
               | On Feb 4th, Google said there were 20 million Youtube
               | Premium users, and I believe the latest estimates put
               | Youtube at 2 billion users, which would be a 1%
               | subscription rate.
        
               | mikenew wrote:
               | It would be interesting to see how they count a "user"
               | too. If a good portion of those 2 billion people don't
               | use YouTube very much then the % of users who are using
               | it regularly and subscribing might be a lot higher.
        
               | [deleted]
        
               | Calamity wrote:
               | You know the worst part? I do pay for YT Premium and yet
               | Google still finds a way to throw ads at me on Youtube
               | videos through videos it suggests via Google Feeds (the
               | leftmost screen on a Pixel). I bloody pay for the service
               | and yet I am still getting ads on any youtube video I
               | play when clicking any youtube suggested video on that
               | feed. How annoying do you think that is? When you give in
               | and pay, yet you are still getting harassed.
        
               | nsriv wrote:
                | I had this issue as well; logging out (in my case,
                | switching accounts), wiping the Google app cache, and
                | I think rebooting to make Pixel Launcher refresh, then
                | logging back into the YT Premium-subscribed account
                | fixed it. Convoluted, I know, but I think the issue
                | was that it wasn't picking up some profile variable
                | denoting me as a subscriber. Hope that helps!
        
               | a1369209993 wrote:
               | I don't use Facebook, and I want it to die. Facebook is
               | one of the many, many reasons _why_ advertising must be
               | destroyed.
        
               | nabla9 wrote:
               | > Pay for Facebook
               | 
               | You can do that?
        
               | Abekkus wrote:
               | I paid for youtube for a while, but I did not get a
               | different algorithm. It was the same feed of addictive,
               | stressful content. I stopped paying once I noticed this.
        
               | monadic2 wrote:
                | Compare how much money Google can make off ads in a
                | month to $15. You're paying for way, way, way more
                | than just removing ads, and it's obvious.
        
               | trisiak wrote:
               | That's why you get other benefits that are much harder to
               | price in.
        
               | pwdisswordfish2 wrote:
                | That does not tell us much. Where can we look at
                | YouTube's balance sheet? There is likely more to
                | YouTube as a business than selling ads on YouTube.
                | For one, YouTube under Google is like AC Nielsen on
                | steroids. The combination easily rivals any "smart"
                | TV.
        
               | nicoburns wrote:
               | Paid-for Facebook would be a viable business if it wasn't
               | competing with free-facebook. It's not ignoring economics
               | to think that Facebook is causing significant negative
               | externalities that ought to be priced or regulated to
               | allow more ethical alternatives to thrive.
        
               | baq wrote:
               | free facebook should be regulated out of existence. what
               | else is free that is good for you? in big cities you have
               | to pay for clean air to breathe already.
        
               | comawhite wrote:
               | To be honest, I'd happily pay for YT Premium if Google
               | didn't use my data to personalise other results and
               | content on the internet. I personally stop using
               | products/services that dictate what content is deemed
               | "suitable" for my consumption. I'll happily be served
               | adverts so long as I'm not getting manipulated.
        
               | freeone3000 wrote:
               | I already don't get ads on youtube, so I don't see why I
               | should pay for this when I can get all of the benefit
               | with none of the expense.
        
             | ggggtez wrote:
             | Let's imagine for a moment that a decentralized social
             | network actually took off.
             | 
             | How long until those ads crop back up anyway? Instagram
             | should give us some idea on how sponsored content might
             | look in such a system. According to some random site, the
             | average price for a "sponsored" instagram post is $300. You
             | think your friends are above showing you an ad when real
             | money is on the line? Maybe they won't be making that kind
              | of money with very few followers, but when Pizza Hut asks
             | you to post an ad in exchange for a free pizza, I think
             | you'll see plenty of takers. Now, granted, at least the
             | people being paid are your friends, instead of Zuck.
        
         | GuiA wrote:
         | What you're referring to is splitting the presentation from the
         | content. The server (eg Facebook) provides you with the
         | content, and your computer/software displays it to your liking
         | (ie without ads and spam and algorithmically recommended crap).
         | 
         | There's a lot of history around that split, and the motivation
         | for HTML/CSS was about separating presentation from the content
         | in many ways. For another example, once upon a time a lot of
         | chat services ran over XMPP, and you could chat with a Facebook
         | friend from your Google Hangouts account. Of course, both
         | Google and Facebook stopped supporting it pretty quickly to
         | focus on the "experience" of their own chat software.
         | 
         | The thing is that there is very little money to be made selling
         | content, and a lot to be made controlling the presentation. So
         | everyone focuses on the latter, and that's why we live in a
         | software world of walled gardens that work very hard to not let
         | you see your own data.
         | 
         | There is some EU legislation proposal that may make things a
         | bit better (social network interop), but given the outsized
         | capital and power of internet companies i'm not holding my
         | breath.
        
           | Lammy wrote:
           | > you could chat with a Facebook friend from your Google
           | Hangouts account
           | 
           | This was never true. There was an XMPP-speaking endpoint into
           | Facebook's proprietary chat system, but it wasn't a S2S XMPP
           | implementation and never federated with anything. It was
           | useful for using FBChat in Adium or Pidgin, but not for
           | talking to GChat XMPP users.
        
             | hesk wrote:
             | I don't know about Facebook but Google Talk was federated
             | at some point [1].
             | 
             | [1] https://googletalk.blogspot.com/2006/01/xmpp-
             | federation.html
        
               | Lammy wrote:
               | Yep, Google's was. They never enabled server-to-server
               | TLS, though, so GTalk was effectively cut off from the
               | federated XMPP network after May 2014 when that became
               | mandatory: https://blog.prosody.im/mandatory-encryption-
               | on-xmpp-starts-...
        
           | divbzero wrote:
           | RSS is yet another example of separating content from
           | presentation.
        
             | drdeadringer wrote:
             | I don't see this as a bad thing. I experience this as a
             | good thing.
             | 
             | The RSS feeds I subscribe to give me plenty of
             | "presentation" or "branding". Logos, written descriptions
             | [both short- and long-form], clear names of what I am
             | subscribing to, URLs. Just the right amount for me, in
             | fact; if I wanted to go to their website(s) for their
             | particular buffet of blog posts, featured puff pieces on
             | Page Five, twitter mentions, &c I can do that ... or not.
             | I'm glad I don't have to if I don't want to, and all of
             | these folks are more than able to drop into their RSS feed
             | a "Please go here for our tour information with new stuff
             | in our online shop" mention just as you are able to go
             | straight to some website full of deep-thumping media
             | flashing into your senses as you get to where you want to
             | go instead of using RSS.
        
               | divbzero wrote:
               | Agreed completely. RSS is an example of what content-
               | presentation separation could be if we made it more
               | prevalent across the web.
               | 
               | There seems to be a steady thread of this sentiment here
               | on HN, yet over the years no one has quite cracked this
               | nut. Solutions welcome!
        
           | richardw wrote:
           | Your friends provide you with the content, not Facebook. You
           | only need Facebook now because you don't have a 24/7 agent
           | swapping content on your behalf and presenting it how you
           | like it.
        
         | liopleurodon wrote:
         | bring back rss feeds
         | 
         | then I choose what I read in my reader
        
         | globular-toast wrote:
         | > What if, instead, you had a personal AI that read every
         | Facebook post and then decided what to show you.
         | 
         | So you can read more of what you already agree with? That's
         | called living in a bubble. The mind cannot grow in a bubble.
        
         | bufferoverflow wrote:
         | > _AI that read every Facebook post_
         | 
         | I doubt FB would let you do that. It's "their" content.
        
         | ereyes01 wrote:
          | Another early assumption about the internet and computers in
          | general was that users were going to exert large amounts of
          | control over the software and systems they use. This
          | assumption has thus far apparently been invalidated, as
          | people by far prefer to be mere consumers of software that
          | is designed to make its designers money. Even OSS is largely
          | driven by companies who need to run monetized
          | infrastructure, though perhaps you don't pay for it
          | directly.
         | 
         | Given that users are generally not interested in exerting a
         | high level of sophisticated control over software they use, how
         | then is the concept of a user agent AI/filter any different at
         | a fundamental level? It probably won't be created and
         | maintained as a public benefit in any meaningful way, and users
         | will not be programming and tuning the AI as needed to deliver
          | the needed accuracy. I don't think AI has yet reached a
          | level of sophistication where content from as broad a range
          | as what's found on the internet (or even just Facebook) can
          | be curated to engage the human intellect, rather than merely
          | optimize for addictive engagement, without significant user
          | intervention.
         | 
         | Hopefully I'm wrong, as I do wish I could engage with something
         | like Facebook without having to deal with ads or with content
         | curated to get my blood boiling. Sometimes I do wonder how much
         | it is Facebook vs. human tendency under the guise of an online
         | persona, as both are clearly involved here.
        
         | gjs278 wrote:
         | uhh or just show me all the posts in chronological order
        
         | yots wrote:
         | It's really not as sophisticated, but these guys[1] created an
         | extension that in addition to their main objective of analyzing
         | Facebook's algorithm also offers a way to create your own
         | Facebook feed. If I got it right, they analyze posts their
         | users see, categorize them by topic and then let you create
         | your own RSS feed with only the topics you want to see.
         | 
         | It's not clear to me whether you may see posts collected by
         | other users or only ones from your own feed and it seems highly
         | experimental.
         | 
         | [1] https://facebook.tracking.exposed/
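          | A rough sketch of that topic-filtering step (the post texts
          | and topic names below are invented placeholders, not
          | anything from the actual extension):

```python
# Toy illustration of filtering a feed down to chosen topics.
# Posts and topics are made-up placeholders.
posts = [
    {"id": 1, "text": "New study on climate policy"},
    {"id": 2, "text": "Celebrity gossip roundup"},
    {"id": 3, "text": "Open source climate data tools"},
]

wanted_topics = {"climate", "open source"}

def matches(post):
    # A post matches if any wanted topic phrase appears in its text.
    text = post["text"].lower()
    return any(topic in text for topic in wanted_topics)

feed = [p["id"] for p in posts if matches(p)]
print(feed)  # [1, 3]
```

          | A real categorizer would obviously need more than substring
          | matching, but the shape of the pipeline is the same: fetch,
          | tag, filter, republish as a feed you control.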
        
         | 2019-nCoV wrote:
         | This already exists -- most social media is already curated.
         | You only see tweets and posts from those you follow or friend.
         | You can already block or ignore any undesirables. This works
         | fine for self-curation.
         | 
         | There is no need for holier-than-thou censorship short of legal
         | breaches. Good to see FB take this change of direction.
        
           | solarkraft wrote:
           | Except when it doesn't.
        
             | 2019-nCoV wrote:
             | Such as?
        
               | solarkraft wrote:
               | Twitter shows me garbage I don't like, as does YouTube.
               | They do this with very little regard for who you follow
               | nowadays and give you no say in what types of stuff you
               | actually want to be recommended. Sometimes they're nice
               | enough to say why they recommend something (which should
               | be standard), but most of the time it's just
               | infuriatingly stupid.
               | 
               | I'm not against machine curation at all, mind you. I want
               | the infrequent poster to have a higher weighed voice and
               | such. But I want to be able to control the parameters.
        
               | 2019-nCoV wrote:
                | For Twitter, use curated lists. This allows you to
                | avoid "the algorithm".
               | 
                | You are bemoaning YouTube's discovery process; you
                | need not watch what's "Up Next" -- that's your choice.
        
         | janekm wrote:
         | Sounds pretty similar to the concept of "software agents" which
         | was popular in the mid '90s:
         | http://www.cs.ox.ac.uk/people/michael.wooldridge/pubs/iee-re...
         | 
         | Part of the concept was that the agents would actually roam
          | onto servers on the internet on your behalf, raising
          | complicated questions around how to sandbox the agent code
          | (which came in useful for VPSs and AWS-style lambdas in the
          | end).
        
         | Swizec wrote:
         | I tried building this 10 years ago as a startup. Maybe time to
         | revisit, the zeitgeist is turning more and more towards this
         | and computing power has gotten cheap enough ...
        
         | ralston3 wrote:
         | > I don't know. Maybe that will just make the echo chambers
         | worse.
         | 
         | This.
         | 
         | Also. What incentive does a walled garden even have to allow
         | something like this? Put a different way, what incentive does a
         | walled garden have to not just block this "user agent"? Because
         | the UA would effectively be replacing the walled garden's own
         | "algo curated new feed" - except if the user builds their own
         | AI bot -- the walled garden can't make money the way they
         | currently do.
         | 
         | I think the idea is very interesting. I personally believe
         | digital UA's will have a place in the future. But in this
         | scenario I couldn't see it working.
        
           | fpgaminer wrote:
           | True, but we have ad blockers and they're effective. They're
           | effective against the largest, richest companies in the
           | world. There are various reasons for that, but at the end of
           | the day it remains true that I can use YouTube without ads if
           | I choose to. There's clearly a place in the world for pro-
           | user curation, even if that's not in FAANG's best interests.
           | I think it's antithetical to the Hacker ethos to not pursue
           | an idea just because it's bad for mega-corps.
        
             | TeMPOraL wrote:
             | Mega-corps don't stop themselves from pursuing an idea just
             | because it's bad for the hoi polloi, so why should we?
        
         | jackandrew wrote:
         | We're a small team working in stealth on this exact challenge.
         | Shoot me a note if you're interested in hearing more or getting
         | involved. itshelikos@gmail.com
        
           | mpfundstein wrote:
           | good luck
        
         | Merad wrote:
         | I think the overwhelming majority of users don't want to deal
         | with this kind of detail. IMO most people would end up using
         | some kind of preset that matched their preferred bubble.
        
         | austincheney wrote:
         | > What if, instead, you had a personal AI
         | 
         | I was in agreement with you until I read that. People don't
         | need to have content dictated to them like mindless drones
         | whether it is from social media, bloggers, AI, or whatever.
         | Many people prefer that, though, out of laziness. It's like the
         | laugh track on sitcoms because people were too stupid or tuned
         | out to catch the poorly written jokes even with pausing and
         | other unnecessarily directed focus. It's all because you are
         | still thinking in terms of content and broadcast. Anybody can
          | create content. Offloading that to AI is just more of the same
         | but worse.
         | 
         | Instead imagine an online social application experience that is
         | fully decentralized without a server in the middle, like a
         | telephone conversation. Everybody is a content provider amongst
         | their personal contacts. Provided complete decentralization and
         | end-to-end encryption imagine how much more immersive your
         | online experience can be without the most obvious concerns of
         | security and privacy with the web as it is now. You could share
         | access to the hardware, file system, copy/paste text/files,
         | stream media, and of course original content.
         | 
         | > And isn't that how the internet used to be?
         | 
         | The web is not the internet. When you are so laser focused on
         | web content I can see why they are indistinguishable.
        
           | solarkraft wrote:
           | I think your suggestion is a bit out of scope for what's
           | actually being discussed/not really a solution.
           | 
           | I'm active on the somewhat (not fully) decentralized social
           | medium Fediverse (more widely known as Mastodon, but it's
           | more than that) and I think a lack of curation is a problem:
           | Posts by people who post a lot while I'm active are very
           | likely to be seen, those by infrequent posters active while
           | I'm not very likely to go unnoticed.
           | 
            | How would your proposed system (which seems a bit utopian
            | and vague from that comment, to be honest) deal with that?
        
         | zachware wrote:
         | The risk is that it behaves like a reinforcement learning
         | algorithm which essentially rewards itself by making you more
          | predictable. I'd argue that's what curated social networks do
         | today.
         | 
         | If you're unpredictable you're a problem. Thus, it makes sense
         | to slowly push you to a pole so you conform to a group's
         | preferences and are easier to predict.
         | 
          | A hole in my own argument is that today's networks are
          | incentivized to increase engagement, where a neutral agent
          | is in most ways not.
         | 
         | So perhaps the problem isn't just the need for agents but for a
         | proper business model where the reward isn't eyeball time as it
         | is today.
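          | As a toy illustration of that self-reinforcing loop (the
          | categories, click rates, and numbers are all invented):

```python
import random
from collections import Counter

random.seed(0)  # deterministic toy run

categories = ["politics", "sports", "cooking"]
clicks = Counter({c: 1 for c in categories})  # uniform starting history

for _ in range(100):
    # Recommend whatever the user has clicked most so far.
    recommended = clicks.most_common(1)[0][0]
    # The user takes the recommendation 90% of the time,
    # otherwise wanders to a random category.
    if random.random() < 0.9:
        chosen = recommended
    else:
        chosen = random.choice(categories)
    clicks[chosen] += 1

top_category, top_count = clicks.most_common(1)[0]
share = top_count / sum(clicks.values())
print(top_category, round(share, 2))
```

          | Because the recommender never explores, whichever category
          | leads early absorbs nearly all subsequent clicks: the user's
          | behavior becomes steadily easier to predict, which is
          | exactly the outcome the loop rewards.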
        
           | scotty79 wrote:
           | > If you're unpredictable you're a problem.
           | 
           | But you are predictable, even if you think you are
           | unpredictable, you are just a bit more adventurous. Algorithm
           | can capture that as well. It will be easier for algorithm
           | that works on your behalf.
        
             | airstrike wrote:
             | Reminded me of
             | https://news.ycombinator.com/item?id=19336754
        
             | solarkraft wrote:
             | This makes me think of a talk with an AI-optimistic
             | Microsoft sales guy I had a few years ago. His argument was
              | essentially the same: "Look, it's no problem to have an AI
             | curate everything for you because the algorithm will just
             | know what you want, even if your habits are unusual!"
             | 
             | Of course this hasn't happened yet and I doubt it ever
             | will. Maybe I'm just insane, but most of the
             | recommendations from services I have fed data for hundreds
             | of hours (YouTube) are actually repulsive.
        
           | api wrote:
           | > So perhaps the problem isn't just the need for agents but
           | for a proper business model where the reward isn't eyeball
           | time as it is today.
           | 
           | I've been on this for years. Free is a lie, and the idea that
           | everything has to be "free as in beer" is a huge reason so
           | many things suck.
        
         | abhchand wrote:
         | I like this idea.
         | 
         | In response to it just creating more echo chambers:
         | 
          | - It can't be worse than now. At minimum, it's an echo
          | chamber of your own creation instead of being manipulated by
          | FB. There's value in that, ethically.
          | 
          | - Giving people choice at scale means it will at least
          | improve the situation for some people.
        
           | ketzu wrote:
            | Isn't Facebook (and Reddit, and Twitter) showing you posts
            | by people, companies, etc. that you decided to follow (and
            | some ads)?
           | 
            | I am pretty sure things can be worse than right now;
            | pretending that we are in some kind of hell state at the
            | bottom of some well where it can't possibly be worse seems
            | unrealistic to me.
        
             | notriddle wrote:
             | I've seen Twitter pull tweets from an account merely
             | because someone I follow follows them. Facebook is the
             | same.
             | 
             | I think Reddit sticks strictly to your subscriptions,
             | unless you go to /r/all.
        
         | tomc1985 wrote:
         | This misses the point. Facebook refuses to look inwardly or
         | mess with their core moneymaker, regardless of how it affects
          | people. No one is ever going to sip from the firehose, just
          | like we'll never again get a simple view of friends' posts
          | sorted by creation date.
         | 
         | I think the real problem is Facebook's need to be such a large
         | company. They brought this on themselves trying to take over
         | the world. Maybe they need a Bell-style breakup
        
       | anigbrowl wrote:
       | I've posted a lot over the years about FB being leveraged by
       | genocidal regimes and bad actors. While I don't think they
       | necessarily pursue such ends, the fact is that social media is a
       | battlespace from where real-world aggression can be launched, and
       | that renting out platform space to this end has been extremely
       | profitable.
       | 
       | Perhaps it has already been posted elsewhere in this very long
       | thread, but if not I heartily encourage more ethically minded FB
       | employees to leak the presentation in question and indeed
       | anything else they consider relevant. At some point it will be
       | too late to feel bad about not having done so when it could make
       | a difference.
        
       | adamnemecek wrote:
       | I hate FB as much as the next guy but I think that Facebook is an
       | amplifier of other trends.
       | 
       | I think that the underlying issue is the two party system. The
       | echo chambers get amplified.
        
         | dfxm12 wrote:
         | _"Our algorithms exploit the human brain's attraction to
         | divisiveness," read a slide from a 2018 presentation. "If left
         | unchecked," it warned, Facebook would feed users "more and more
         | divisive content in an effort to gain user attention & increase
         | time on the platform."_
         | 
         | According to the article, FB is not taking a passive role in
         | this; they're actively trying to exploit people.
        
         | ver_ture wrote:
         | The two party system does not affect this discussion.
         | Facebook's algos will show you more and more $x content if
         | you've liked $x or subscribed to it, and never show you $y
         | content since you'd probably not like and engage with $y.
         | Doesn't matter how many parties/topics/underlyingIssues there
         | are.
         | 
          | If FB were neutral they would show you every FB post,
          | millions per second whizzing past your screen. But they
          | can't do this; they have to curate a wall for you to slowly
          | scroll through and, for maximum revenue, like, share, or
          | comment on.
         | 
         | Therefore, to show you the most content that you will like,
         | share, or comment on, they repeat the type ($x) you've already
         | liked, creating the echo.
         | 
         | So no, it is not mostly a problem of the underlying issue of
         | the two parties, this is entirely about how FB curates your
         | wall and simply doesn't show you "the other party"/$y or
         | anything deviant/$y of your likes.
         | 
         | Edit: changed political parties to variables to illustrate
         | point.
        
         | tasty_freeze wrote:
         | It is a feedback loop. Politics has become more polarized, I
          | believe, because of the need to be "pure" to avoid the wrath of
         | the party's highly polarized base.
         | 
         | 30 years ago an R and a D could cut a deal to get things done
         | and few people would notice that they compromised by giving a
         | little to get a little.
         | 
         | Now when such deals happen the deal makers are branded as
         | traitors and RINOs (do people use DINOs too?) and must be
         | primaried.
         | 
         | FB encourages polarization because it increases engagement with
         | their advertisers, which is useful to FB. The polarized base is
         | useful to parties because it motivates them to donate,
         | proselytize, and vote. That base polarization leads to
         | polarization in candidates, and the division grows.
        
       | LordHumungous wrote:
       | > Facebook policy chief Joel Kaplan, who played a central role in
       | vetting proposed changes, argued at the time that efforts to make
       | conversations on the platform more civil were "paternalistic,"
       | said people familiar with his comments.
       | 
       | I think Joel was right.
        
         | unethical_ban wrote:
         | That isn't a bad thing. We are constantly influenced by design
         | and society. It's going to happen. And in Facebook's case, with
         | respect to Rush: "If you choose not to decide, you still have
         | made a choice". Choosing not to build a user experience that
         | disarms unnecessary conflict, or that can limit disinformation,
         | is a clear choice.
         | 
         | The idea of designing human interaction and government policy
         | with the knowledge of how humans react is not shocking or new.
         | Heck, the "Pandemic Playbook" from the CDC continuously
         | references group behavior when discussing how to communicate
         | facts to the public. For example: If you tell people to stay
         | home on day 1, the public may doubt or tune out your advice. So
         | what do you do on days 1-3 so that on Day 4, government advice
         | is heeded? Get private companies on board, ramp up voluntary
         | advice for some time, before letting the big news fall.
         | 
         | If you'd like to learn more, check out Nudge by Cass Sunstein
         | [1]. And another book by the same man, specifically covering
         | the ethics of governments using the technique. [2]
         | 
         | [1] https://www.amazon.com/gp/product/B00A5DCALY
         | 
         | [2] https://www.amazon.com/gp/product/B01JGME90E
        
         | ironman1478 wrote:
          | I don't think this makes sense. It works off of assumptions
          | that are clearly untrue: 1) that the consequences of
          | language on the internet are equal to those in person, and
          | 2) networking effects.
         | 
          | For 1. If somebody on the street comes up to you and says "hey,
          | I'm going to come beat up your family," then at a bare minimum
          | the cops are getting called and it is taken somewhat seriously. On
         | the internet though, it is a reality for many people
         | (especially women) that there are no consequences for such
         | horrible language and communication. Also, people make
         | different decisions in real life when it comes to certain types
          | of language. I don't just go around swearing in real life,
          | but people are way more offensive on the internet. There are
          | physical realities that don't map to the internet, and that
          | causes different communication patterns online.
         | 
         | For 2. When it comes to spreading disinformation through idiots
         | sharing links to each other, the effect is much more pronounced
          | than when a conspiracy theorist goes out to a street corner
          | and starts shouting ideas at people or has a million signs. It's
         | clear in the latter case they might have a few screws loose,
         | however in the former, everybody's "opinion" seems equal, but
         | we can't use our other senses to vet them and b/c communication
         | is slow/unclear on the internet, we also can't have a
         | protracted conversation to figure out what their ideas are and
         | where they come from (something you can easily do in person).
         | This then causes really bad ideas to spread because people have
         | lots of connections on facebook and there is no good way of
         | vetting people or ideas.
         | 
          | The idea of not being "paternalistic" only makes sense if you
          | think that communication on the internet is equivalent in
          | every way to in-person communication, which is fundamentally
          | untrue. The
         | only reason they don't do this is b/c they don't know how to
         | solve this problem for N countries generically and don't want
         | to be held liable for a policy that makes sense in country A,
         | but not in B and causes potential legal issues.
        
         | megamittens wrote:
         | Unless Joel is advocating allowing nudity on the platform then
         | he is just blowing smoke. Facebook is inherently paternalistic
          | and Joel Kaplan is a right-wing hack.
        
         | crocodiletears wrote:
         | Honestly, this does give me much more confidence in Facebook's
         | internal governance, even if the platform often bows to media
         | demands.
         | 
          | Much to the chagrin of many on HN, Facebook is, and has been,
          | a fairly open platform for people of all convictions,
          | backgrounds, and political stripes, even if it has been
          | unsteady-handed at times. This, as well as its sorting
          | algorithm, may well be contributing to the collapse of
          | institutional trust and the cultural balkanization of the
          | western world.
         | 
          | To a progressive liberal or political moderate who directly
          | benefited from the economic and technological booms we've
          | experienced over the last 30 years, this is upsetting, because
          | the global order (and its associated stability) from which
          | they've benefited, and which brought us to where we are, is
          | disintegrating around us.
         | 
          | To me, hand-wringing about Facebook's relatively hands-off
          | approach to the political dialogues on its platform is just
          | resentment about the loss of a prescribed cultural narrative
          | and familiar cultural coalitions, the collapse of which has
         | given every stakeholder in their nation's future an opportunity
         | to speak up for their own convictions and interests, in hopes
         | that theirs will be the dominant narrative of the new political
         | landscape.
         | 
         | I wish these dialogues and factional aggregations were
         | occurring on a more federated network. But so far as
         | centralized platforms go, I can't think of any company more fit
         | (that's not a compliment, but a lament) than Facebook to host
         | them.
        
           | skosch wrote:
           | > political dialogues
           | 
           | That sounds lovely.
           | 
           | Fake news (the actual kind), name-calling, absurd conspiracy-
           | theorizing, memes that remove all nuance from complex issues,
           | botnets that amplify anti-science/anti-intellectual nonsense
           | ... aren't political dialogue, they're the breakdown of it.
        
             | crocodiletears wrote:
             | They're indicative of a collapse of consensus on the part
              | of society. It could well be argued that prior to our
              | current political era, especially in the US, the
              | political domain was largely constrained to a discourse on
              | cultural aesthetics, wherein the Democrats and
              | Republicans argued over trivialities (in a broad
              | national, not individual, respect) such as abortion,
              | marriage, and immigration, while they operated on an
              | implicit consensus concerning foreign policy, had a
              | functional stalemate in terms of the size of the state,
              | and farmed out many of their policy decisions to
              | thinktanks, corporate donors, and well-established
              | bureaucrats within our regulatory bodies.
             | 
             | America's role as international security guarantor, its
             | trade policies, and its government's role in domestic
              | affairs were never really up for debate, and they only
              | really changed stepwise, in a stochastic manner, responding to
             | situations and incentives day-by-day with no conscious
             | consideration to the role of America or its state on a
             | broader scale.
             | 
              | What we're seeing now is large portions of the population
              | coming to realize that the existing bipartisan components
              | of the political consensus - which I believe to be a
              | legacy of the Cold War - no longer serve their cultural or
              | economic interests.
             | 
             | This process is naturally fractious, chaotic, sometimes
             | violent, and full of dirty tricks, because politics isn't
             | just about flavor of the month policies anymore. We're in
             | the process of reinventing who we collectively are, and
             | what we want to be. As a result, we're running across real,
             | fundamentally irreconcilable political and moral
             | differences that have been buried for decades, as well as
             | confronting the failures and controversies of our past.
             | 
              | Many of those fundamental disagreements settle neatly along
             | class, racial, and professional boundaries. Others, not so
             | much.
             | 
              | Science denial and anti-intellectualism are the natural
              | result, because much of science communication has become a
             | carrier mechanism for policy prescriptions predicated upon
             | society operating under a specific ideological consensus,
             | when in fact someone of a different political persuasion
             | might objectively consume the scientific data and come to a
             | different policy conclusion based on the same data.
             | 
             | For the less educated, who encounter proposals from
             | scientists they consider to be politically unworkable, and
             | which might rightfully be considered manipulatively framed,
             | it is easier to reject entire specialized fields of
             | research out of hand than to investigate further and
             | attempt to conceive of alternative proposals because they
             | lack the tools to engage with the information effectively
             | to begin with.
             | 
              | All of this is messy, but it constitutes a real political
             | dialogue on the part of society.
        
         | taurath wrote:
         | Isn't it preferable to be somewhat paternalistic when you have
          | paternal amounts of power over your userbase? It's not like
         | giving up the power is on the table.
         | 
         | There is of course the well documented problem of moderation -
         | it inevitably turns into an issue of a subset of the users vs
         | the moderators. Facebook gets by pretending to be neutral
         | "platform providers", but they actively optimize for their
         | benefit. They are about as neutral as a bathtub salesperson on
         | water heaters.
         | 
          | This whole idea that they don't have control stands only
          | because of the indifference of their users. I can
         | only hope it eventually falls and the next grand experiment in
         | mass social interaction is a lot more gentle for society.
        
         | renewiltord wrote:
         | I, too, agree. Facebook is just an extension of the open
         | society.
        
           | cryptoz wrote:
           | Facebook admits to doing large-scale emotional manipulation
            | of its users. They published a 'scientific' paper in which
            | they showed that they tried and succeeded in making one
            | group (hundreds of thousands of people) feel more depressed,
            | and another group (also hundreds of thousands) feel happier.
           | 
            | They psychologically manipulate people into depression, _on
           | purpose_.
           | 
           | Facebook is _not_ "just" an extension of open society.
           | Facebook is a specific powerful corporation that makes
           | immoral decisions to emotionally control their users.
        
             | iliekcomputers wrote:
             | Could you link said paper?
        
               | cryptoz wrote:
               | There are a lot of sources to read, including follow-up
               | papers by other teams that evaluate if Facebook had
               | "informed consent" (they did not) to emotionally
               | manipulate their users.
               | 
               | https://www.google.com/search?q=facebook+paper+emotionall
               | y+m...
        
           | tarkin2 wrote:
            | Disagree. It's different from normal society.
           | 
            | Normal society encourages civility by offering inclusion
            | into a needed, physically near social group. Digital society
            | disincentivizes civility by offering a multitude of
            | alternative groups.
        
             | robertlagrant wrote:
             | It's localness by homogeneity rather than geography.
        
           | 1propionyl wrote:
           | "The East India Trading Company is just an extension of the
           | open seas."
        
             | bpodgursky wrote:
             | Yes, a company which owns large chunks of India and has a
             | well-used private army numbering in the tens of thousands,
             | is a great analogy for a social-media company. </s>
             | 
             | The East India Company was responsible, at least in part,
             | for tens of millions of deaths in various famines, and to
             | equate the two fails both by being ridiculous (Facebook is
              | not a private empire with an army), and trivializes the
             | actual damage done by that institution.
        
               | jboog wrote:
               | Defenders of the EIC at the time surely said "yeah some
               | bad stuff happens but think about the squalor the average
               | Indian lived in prior to the Englishman coming in and
               | bringing great wealth to their country. Think of the
               | untold famine and poverty we're helping ameliorate by
               | bringing western Christian ideals and wealth to a
               | primitive people.
               | 
               | How DARE you compare some unfortunate incidents of the
               | EIC to the human misery that existed before the Brits
               | arrived, you're being ridiculous!! "
               | 
               | People have always been able to use motivated reasoning
                | to explain away the terrible externalities of their
               | choices when there's a shitload of money on the line.
               | 
               | FB has been a tool to aid genocide, they've contributed
               | to incivility in societies throughout the world while
               | they're cashing checks but don't want to appear
               | "paternalistic" of course so it's fine.
        
               | banads wrote:
               | Social media companies have played a primary role in
               | overthrowing governments and manipulating elections
               | across the world.
        
               | 1propionyl wrote:
               | In much of South/South-East Asia, for many people,
               | Facebook _is_ the internet. (And remember Facebook Zero?
                | Facebook was aware of this and tried to engender it.)
               | 
               | https://qz.com/333313/milliions-of-facebook-users-have-
               | no-id...
               | 
               | A staunch defender of the EITC would claim they were
               | "just" engaging in mercantilism and facilitating the
               | exchange of goods, and the war and deaths were just
               | unfortunate side-effects. Facebook is "just" engaging in
               | connecting people and facilitating the exchange of
               | information, and stoking violence and racial conflict are
               | just unfortunate side-effects.
        
               | badloginagain wrote:
               | While I appreciate the invocation of Godwin's law,
                | Facebook is absolutely a private empire with an army,
                | in the context of modern society.
        
               | bpodgursky wrote:
               | You're not going to convince me (or hopefully, anyone)
               | that an institution with an army that actually goes about
               | the business of conquering and killing people, has any
               | moral equivalence with a misguided (and I'm not
               | contesting, destructive) social media company.
               | 
               | We can say that things are bad, while at the same time
               | admitting that in the past, people did far worse things.
               | It's a new, different, less-bad-but-still-bad, thing.
               | It's OK.
        
           | tunesmith wrote:
           | You have to factor in how the algorithm rewards content that
            | drives engagement. _Without_ that, it's more like an open
           | society.
        
         | AgentME wrote:
         | >The high number of extremist groups was concerning, the
         | presentation says. Worse was Facebook's realization that its
         | algorithms were responsible for their growth. The 2016
         | presentation states that "64% of all extremist group joins are
         | due to our recommendation tools" and that most of the activity
         | came from the platform's "Groups You Should Join" and
         | "Discover" algorithms: "Our recommendation systems grow the
         | problem."
         | 
         | They're responsible for 64% of extremist group joins. Is trying
         | to change that number to 0% paternalistic?
         | 
         | I assume I'm currently not responsible for any extremist group
         | joins. Am I being paternalistic by not pushing people toward
         | joining extremist groups? Is it only paternalistic if you first
         | find yourself responsible for some extremist group joins, and
         | then try to lower that number?
        
         | pkilgore wrote:
         | Imagine an average level of civility in a society `C`.
         | 
          | Let's say users of your product, due to your product, operate at
         | `0.5 C`.
         | 
         | Is changing the product so they operate at a higher `0.75 C` or
         | back to `C` "paternalistic"?
         | 
         | Why?
         | 
         | I can see the argument for moving `C` to `1.5 C` as
         | paternalistic. But when you're already actively affecting `C`
         | in one way, why do we moralize about moving it the other way?
         | What makes down OK, but up BAD?
        
           | tdhoot wrote:
           | Paternalistic is thinking you are in a position to define C
           | and measure it.
        
             | unethical_ban wrote:
             | You keep on saying "paternalistic" as if it's a bad thing.
             | I left another comment in this subthread suggesting it is
             | not.
             | 
             | Yes, some people are wrong and some are right. With
             | government, there are basic freedoms that allow people to
             | be wrong, and not to be incarcerated or unduly burdened by
             | government policing thought.
             | 
              | But society? Facebook? Even government messaging a la "The
             | Ad Council"? Yes, absolutely, to hell with disinformation,
             | trolls, and toxic platforms.
        
             | deegles wrote:
             | Refusing to measure something doesn't make it not exist.
        
         | contemporary343 wrote:
         | "If two members of a Facebook group devoted to parenting fought
         | about vaccinations, the moderators could establish a temporary
         | subgroup to host the argument or limit the frequency of posting
         | on the topic to avoid a public flame war."
         | 
         | Most of the suggestions they considered were fairly modest
         | product design choices that probably would improve user
         | experience. To call these choices paternalistic is a stretch.
         | 
         | Also, the platform is already paternalistic - it polices
         | nudity, pornography and a range of other legal content.
        
         | [deleted]
        
       | seemslegit wrote:
       | How dare they - that used to be the job of traditional media and
       | entertainment.
        
       | casefields wrote:
       | Mirror: http://archive.md/YQeJY
        
         | neonate wrote:
         | Updated: https://archive.md/FyTDB
        
           | obi1kenobi wrote:
            | Wow, Cloudflare's 1.1.1.1 DNS resolver sets up a man-in-the-
            | middle (the broken cert gives it away) and serves a 403
            | Forbidden page when you click this link. I verified that
            | 8.8.8.8 works fine.
        
             | Defenestresque wrote:
             | I don't want to derail the discussion too much either, but
             | anyone curious about the reasoning can see this comment
             | from CloudFlare [0]
             | 
             | >We don't block archive.is or any other domain via 1.1.1.1.
             | Doing so, we believe, would violate the integrity of DNS
             | and the privacy and security promises we made to our users
             | when we launched the service.
             | 
             | >Archive.is's authoritative DNS servers return bad results
             | to 1.1.1.1 when we query them. I've proposed we just fix it
             | on our end but our team, quite rightly, said that too would
             | violate the integrity of DNS and the privacy and security
             | promises we made to our users when we launched the service.
             | 
             | >The archive.is owner has explained that he returns bad
             | results to us because we don't pass along the EDNS subnet
             | information. This information leaks information about a
             | requester's IP and, in turn, sacrifices the privacy of
             | users. This is especially problematic as we work to encrypt
             | more DNS traffic since the request from Resolver to
             | Authoritative DNS is typically unencrypted. We're aware of
             | real world examples where nationstate actors have monitored
             | EDNS subnet information to track individuals, which was
             | part of the motivation for the privacy and security
             | policies of 1.1.1.1.
             | 
             | > [snipped the rest]
             | 
             | [0] https://news.ycombinator.com/item?id=19828702
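For readers curious what the "EDNS subnet information" in the comment above actually carries, here is a minimal sketch (pure Python stdlib; the prefix values are hypothetical) of the EDNS Client Subnet option wire format from RFC 7871 -- this is the field 1.1.1.1 declines to attach to its upstream queries:

```python
# Sketch of the EDNS Client Subnet (ECS) option from RFC 7871.
# A resolver that supports ECS attaches the client's network prefix to
# the query it sends upstream; 1.1.1.1 omits it for privacy reasons.
import struct

def ecs_option(prefix: str, prefix_len: int) -> bytes:
    """Encode an EDNS0 option for an IPv4 client subnet (option code 8)."""
    family = 1               # 1 = IPv4 in the IANA address-family registry
    scope_len = 0            # queries always send a scope of 0
    # Only ceil(prefix_len / 8) address bytes go on the wire, so a /24
    # reveals the first three octets of the client's address.
    addr = bytes(int(o) for o in prefix.split("."))[:(prefix_len + 7) // 8]
    payload = struct.pack("!HBB", family, prefix_len, scope_len) + addr
    # OPTION-CODE (8 = edns-client-subnet), OPTION-LENGTH, then payload
    return struct.pack("!HH", 8, len(payload)) + payload

# A client in 203.0.113.0/24 (a documentation range) would be revealed as:
opt = ecs_option("203.0.113.7", 24)
```

A resolver that forwards ECS embeds an option like this in the OPT record of its upstream query, which is why an authoritative server (or anyone watching the typically unencrypted resolver-to-authoritative hop) can learn part of the requester's IP.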
        
               | eloff wrote:
               | I'm not sure if it's a separate issue, but I've noticed
                | 1.1.1.1 sometimes can't resolve my bank's site. Adding
                | 8.8.8.8 as an alternate DNS server resolves the issue for me. I
               | don't know if it's just balancing the requests or only
               | using 8.8.8.8 if the primary fails. I'd like to know the
               | answer to that.
        
             | snek wrote:
              | False: archive.is serves 1.1.1.1 and 1.0.0.1 as A records
              | back to people who try to resolve it using Cloudflare DNS.
        
             | obi1kenobi wrote:
             | Posted as a Tell HN, to avoid derailing this post's
             | discussion: https://news.ycombinator.com/item?id=23315640
        
             | jgunsch wrote:
              | I believe this is due to Archive rejecting Cloudflare,
              | not the other way around.
             | 
             | https://news.ycombinator.com/item?id=19828317
        
           | waterhouse wrote:
           | Interesting. The diff appears to be (a) they changed the
           | headline from "Facebook Knows It Encourages Division. Top
           | Executives Nixed Solutions." to "Facebook Executives Shut
           | Down Efforts to Make the Site Less Divisive", and (b) they
           | inserted a video most of the way down the article, captioned
           | "In a speech at Georgetown University, Mark Zuckerberg
           | discussed the ways Facebook has tightened controls on who can
           | run political ads while still preserving his commitment to
           | freedom of speech."
        
         | supernova87a wrote:
          | I will make a parenthetical point that the WSJ, while
          | expensive to subscribe to, is a very high-quality news source
          | and worth paying for if it's in your budget. There are
          | discounts to be
         | found on various sites. And god knows their newsroom needs all
          | the subscribers it can get (just like NYT, etc.) to stay
          | independent of the opinion-page side of the business, which
          | tends to be not so objective (the two are highly separated).
         | Luckily they have a lot of business subscribers who keep them
         | afloat, but I decided to subscribe years ago and never
         | regretted it.
        
         | pwdisswordfish2 wrote:
          | Blocked in some countries, and worldwide _from_ at least one
          | third-party DNS provider.
         | 
         | https://en.wikipedia.org/wiki/archive.is
         | 
         | http://web.archive.org/web/20200526163314/https://www.wsj.co...
         | 
         | http://web.archive.org/web/20200526201849/https://www.wsj.co...
        
           | dilyevsky wrote:
            | If you mean Cloudflare is blocking it, it's actually the
            | other way around - archive.is blocks Cloudflare's resolvers.
        
             | thepangolino wrote:
             | I'm pretty sure it's cloudflare being bitchy about not
             | receiving some arcane DNS field from archive and therefore
             | blocking their requests.
        
               | dilyevsky wrote:
               | You're wrong - https://mobile.twitter.com/archiveis/statu
               | s/1018691421182791...
        
       | cwperkins wrote:
        | I think NYT has set a decent example of how to deal with
        | internet comment sections. I like the idea of a US House of
       | Representatives type approach to comments where every person in
       | the house is given an equal amount of time to address the house
       | so you can hear all perspectives.
       | 
       | The way NYT has done this is by introducing "Featured Comments".
        | A team at NYT, presumably ideologically diverse, picks
        | insightful comments to highlight from among all comments. You
        | can still view comments sorted by number of recommendations,
        | but the page defaults to the Featured Comments.
       | 
        | The web forum that I think needs this more than any other is
        | Reddit's r/politics subreddit. Someone please let me know your
        | experience, but I don't think the comments on highly upvoted
        | content are insightful at all. A lot seek to exacerbate and
        | misrepresent, which IMO adds fuel to the flame wars.
        
       | dredmorbius wrote:
        | The Verge has an unpaywalled story:
       | https://www.theverge.com/2020/5/26/21270659/facebook-divisio...
        
       | ENOTTY wrote:
       | Here's the paragraph I found most damning. It would make me want
       | to assign liability to Facebook.
       | 
       | > The high number of extremist groups was concerning, the
       | presentation says. Worse was Facebook's realization that its
       | algorithms were responsible for their growth. The 2016
       | presentation states that "64% of all extremist group joins are
       | due to our recommendation tools" and that most of the activity
       | came from the platform's "Groups You Should Join" and "Discover"
       | algorithms: "Our recommendation systems grow the problem."
        
         | inimino wrote:
         | Of course, it's hard to assign blame without looking at how
         | "extremist groups" are defined and at whether the
         | recommendation tools do good as well as harm.
        
         | thrwn_frthr_awy wrote:
          | I thought the most interesting part was Mark asking not to be
          | bothered with these types of issues in the future. By saying
          | "do it, but cut it by 80%," he sounds like he wants to be able
          | to say he made the decision to "reduce" extremism, but without
          | really making a change.
        
         | tantalor wrote:
         | You are surprised? Here's the mission statement:
         | 
         | > Facebook's mission is to give people the power to build
         | community and bring the world closer together. People use
         | Facebook to stay connected with friends and family, to discover
         | what's going on in the world, and to share and express what
         | matters to them.
         | 
         | Encouraging group communication is the primary goal, regardless
         | of the consequences.
        
           | razzimatazz wrote:
           | It sounds like an honorable goal, doesn't it? But when you
           | build a community that becomes simply a place for shared
           | anger, you allow that anger to be amplified and seem more
           | legitimate.
        
           | ENOTTY wrote:
           | It's one thing to enable people to seek out extremist
           | communities on their own. It's quite another to build
           | recommendation systems that push people towards these
           | communities. That's putting a thumb on the scale and that's
           | entirely Facebook's doing.
           | 
            | This is one example, and quite possibly a poor one since it
            | is partisan: Reddit allows the The_Donald subreddit to
            | remain open, but it has been delisted from search, the front
            | page, and Reddit's recommendation systems.
        
       | specialist wrote:
       | From the article:
       | 
       |  _" Worse was Facebook's realization that its algorithms were
       | responsible for their growth. The 2016 presentation states that
       | "64% of all extremist group joins are due to our recommendation
       | tools" and that most of the activity came from the platform's
       | "Groups You Should Join" and "Discover" algorithms: "Our
       | recommendation systems grow the problem.""_
       | 
       | Then:
       | 
       |  _" In keeping with Facebook's commitment to neutrality, the
       | teams decided Facebook shouldn't police people's opinions, stop
       | conflict on the platform, or prevent people from forming
       | communities."_
       | 
       | Does not compute.
       | 
       | How can they claim to be neutral about the very problem they
       | themselves created?
       | 
       | There's _a lot_ of daylight between proactively accelerating
       | extremism and censorship. This is not a binary choice.
       | 
       | I'm right alongside Kara Swisher on this topic: Facebook's
       | leadership team is apparently incapable of nuance, self
       | awareness, or acknowledging culpability.
        
       | mudlus wrote:
        | Are we going to have to wait for a generation to die and for
       | millions of lives to be lost (indirectly, say, through a
       | demagogue's botched response to a pandemic needlessly leading to
       | the infection of millions) before the average person is
       | comfortable using a protocol (say, ActivityPub and RSS) instead
       | of these parasitic for-profit platforms?
       | 
        | As long as the search for truth is burdened with advertising on
        | platforms, democracy and freedom are doomed.
       | 
       | If you don't see these things are linked, then you're part of the
       | problem.
        
       | commandlinefan wrote:
       | Well, I can't read the paywalled article, but every solution I've
       | ever seen has been to closely control the narrative to match one
       | group's preferred spin. If that's what top executives nixed, then
       | good for them for having principles.
        
       | jdofaz wrote:
        | Maybe divisive content keeps other people engaged, but I stopped
        | getting enjoyment out of Facebook years ago and I avoid it now.
        
       | yumraj wrote:
       | I've always wondered how such discussions go in company meetings
        | where some product/feature has a harmful effect on
        | something/someone but is good for the business of the company.
       | 
        | I cannot believe that everyone is ethically challenged - only,
        | perhaps, the people in control. So what goes through the minds
        | of people who don't agree with such decisions? Do they keep
        | quiet, just worry about the paycheck, convince themselves that
        | what management is selling is a good argument _for_ such a
        | product/service?
       | 
        | Luckily I've never had to face such a dilemma, but I don't envy
        | those who have faced one and come out of it losing either their
        | morals or their jobs.
        
         | crazygringo wrote:
          | > _I cannot believe that everyone is ethically challenged_
         | 
         | No, but it's not always clear what the ethical choice is. In
         | philosophy, this is known as pluralism [1] -- the fact that
         | different people have irreconcilable ethical views, with no way
         | to find any "truth".
         | 
         | That might seem like a lot of justificatory mumbo-jumbo, but
         | there are genuine ethical arguments on all sides. For example,
         | did you know that in the postwar 1950's, the _lack_ of
         | polarization and divisiveness in American society was seen by
          | many as a major problem, because it didn't provide enough
         | voter choice between the two parties? [2]
         | 
         | There are also plenty of ethical arguments that giving people
         | what's "good for them", rather than what they want (click on)
         | would run counter to their personal autonomy, and therefore
         | against their freedom. This is what critics of paternalism
         | believe. [3]
         | 
         | Then there's the neoliberal argument that markets always work
         | best (absent market failure). That most of human progress over
         | the past couple of centuries has resulted from companies doing
         | what's most profitable, despite how non-intuitive that is. In
         | that sense, Facebook doing what makes the most money _is_
         | ethically right.
         | 
         | I'm not saying I agree with any of these -- in fact, I don't.
         | 
          | But I am saying that supposing there's some kind of obvious
          | right ethical answer, and implying bad faith towards people at
          | Facebook, that they're somehow making decisions they genuinely
          | believe to be wrong but make anyway, is not accurate.
         | 
         | [1]
         | https://en.wikipedia.org/wiki/Pluralism_(political_philosoph...
         | 
         | [2] https://newrepublic.com/article/157599/were-not-polarized-
         | en...
         | 
         | [3] https://en.wikipedia.org/wiki/Paternalism
        
           | specialist wrote:
           | The profit maximizing (shareholder value) argument is fairly
           | recent.
           | 
           | At many other times, the concentration of wealth, and
           | therefore power, was identified as a problem and actively
           | mitigated. For example, the founding fathers of the USA were
            | quite anti-corporate, and actions like the Boston Tea Party
            | were explicitly so.
        
             | babesh wrote:
             | Nah. The founding fathers were the richest colonists and
             | George Washington was the richest of them all. It was some
             | rich people opposing richer people overseas that they were
             | descended from.
             | 
             | They didn't want concentration of political power but they
             | had the economic power. Interestingly the political power
             | endangers them because it has the power to take away their
             | economic power. That's the real battle still going on
             | today.
        
               | specialist wrote:
               | How does one disprove the other?
        
               | babesh wrote:
               | Because it wasn't concentration of power they were
               | concerned with. They were only concerned with
               | concentration of power against them (political power
               | against their right to profit).
               | 
               | It was a selfish play not a principled one. For example,
               | slavery was written into the constitution. How the hell
               | does that happen when all men (and no women) were
               | supposedly equal?
               | 
               | Not all of them were for slavery but that was the end
               | result of the document/of the competing forces at play.
               | It institutionalized slavery in the new nation.
               | 
               | Wikipedia
               | 
               | https://en.m.wikipedia.org/wiki/United_States_Declaration
               | _of...
               | 
               | "According to those scholars who saw the root of
               | Jefferson's thought in Locke's doctrine, Jefferson
               | replaced "estate" with "the pursuit of happiness",
               | although this does not mean that Jefferson meant the
               | "pursuit of happiness" to refer primarily or exclusively
               | to property."
               | 
                | What has happened is that personhood has been gradually
                | extended to more and more entities (sometimes non-human).
        
           | dragonwriter wrote:
           | > For example, did you know that in the postwar 1950's, the
           | lack of polarization and divisiveness in American society was
           | widely seen as a major problem, because it didn't provide
           | enough voter choice between the two parties?
           | 
           | There was not a lack of polarization and divisiveness _in
           | American society_.
           | 
            | The divides in American society and politics didn't map well
            | to the _two major political parties_ because there was a
            | major political realignment in progress and the parties
            | hadn't yet aligned with the divides in society.
           | 
           | The problem was the divide between the major parties not
           | being sharp on the issues where there were, in fact, sharp,
           | polarizing divides in society, preventing members of the
           | public from effectuating their preferences on salient issues
           | by voting.
        
             | phkahler wrote:
             | So are you saying polarization makes it easier for people
             | to vote? It sounds plausible and undesirable.
        
               | dragonwriter wrote:
               | > So are you saying polarization makes it easier for
               | people to vote?
               | 
               | No, I'm saying that the description that polarization was
               | absent is wrong.
               | 
               | I'm also saying alignment of the axis of differentiation
               | between the major parties in a two-party system and the
               | salient divides in society makes it easier for people to
               | make meaningful choices, and feel they are doing so, by
               | voting.
               | 
               | When there are sharp polarizing social/political divides,
               | as there were over many issues in the 1950s, and they are
               | not reflected in the divides between the parties (as they
               | often weren't in the 1950s), then the government cannot
               | represent the people because the people cannot express
               | their preferences on important issues by voting.
        
             | not2b wrote:
             | In the 50s and 60s, there were really four parties, joined
             | into two by coalitions. On the Democratic side, there was a
             | social democratic, leftist faction, tensely allied with a
             | Southern party (the Dixiecrats). On the Republican side,
             | there was a pro-corporate but moderately liberal faction
             | (the Rockefeller Republicans) allied with a harder-line
              | conservative/libertarian faction (the Goldwater
             | Republicans).
             | 
             | Two things happened in the 60s and early 70s: the Goldwater
             | faction largely took power in the Republican Party, and
             | because the Democratic Party embraced civil rights, the
             | Dixiecrats first flirted with independence (George
             | Wallace's campaign) and then gradually switched parties, so
             | now we have the oddity that there are people who fly
             | Confederate flags but are registered members of the party
             | of Lincoln. Many people who would have been Republicans in
             | the old days are now the moderate/neoliberal faction in the
             | Democratic Party.
             | 
             | So we still have four parties, they were just reshuffled.
             | Now the tension in the Democratic Party is between the old
             | FDR/LBJ new deal supporters, and their younger socialist
             | allies, and the more pro-business neoliberals. On the
             | Republican side it's between the business side (they don't
             | care much about ideology, they just want to make money) and
             | the hard-core conservatives.
        
           | alharith wrote:
           | I am sorry to say, this seems like a thoughtful answer but
           | there is a lot of nonsense in it is as well.
           | 
           | For example, pluralism doesn't state there is no way to "find
           | truth", but that in light of multiple views, to have good
           | faith arguments, avoid extremism, and engage in dialog to
           | find common ground.
           | 
           | > but there are genuine ethical arguments on all sides.
           | 
            | These ethical arguments, however genuine they may be, are
            | not equal; otherwise you would fall victim to the false-
            | balance fallacy, commonly observed in media outlets, or the
            | "both sides" argument we have so unlovingly become aware of
            | in recent times. The false-balance fallacy essentially
            | tosses out gravity, impact, and context.
           | 
           | > That most of human progress over the past couple of
           | centuries has resulted from companies doing what's most
           | profitable, despite how non-intuitive that is.
           | 
           | Despite the over-simplicity of framing it as companies simply
           | doing what is most profitable, this is, in fact, extremely
           | intuitive, and has been studied, measured, and observed. I am
           | curious what you find unintuitive about it?
           | 
           | > But I am saying that supposing there's some kind of obvious
           | right ethical answer, and implying bad faith towards people
           | at Facebook that they're somehow making decisions they
           | genuinely believe to be wrong but making anyways, is not
           | accurate.
           | 
           | This view may be true in a vacuum, but it is irrelevant. We
           | live in American society, and there is an American ethical
           | framework in which Facebook's actions can be viewed as
           | unethical. Other countries that have this similar issue have
           | their own ethical frameworks in which to deem Facebook's
           | actions ethical/unethical.
        
             | jimmaswell wrote:
             | > American ethical framework in which Facebook's actions
             | can be viewed as unethical
             | 
             | I'm curious what you mean by this, because I'd expect the
             | American values of independence and free expression to be
              | counter to wanting Facebook to actively suppress divisive
             | discourse. (Yes, I know the first amendment only applies to
             | the government; the point is the spirit of the "American
             | ethical framework")
        
             | crazygringo wrote:
              | > _pluralism doesn't state there is no way to "find
              | truth"_
             | 
             | To the contrary, that is literally what pluralism as a
             | philosophical concept says. You can read up on Isaiah
             | Berlin's "value pluralism" [1], for example.
             | 
             | > _These ethical arguments, however genuine they may be,
             | are not equal however_
             | 
             | On what basis? Again, the entire premise of pluralism
             | provides no method for comparison.
             | 
             | > _this is, in fact, extremely intuitive_
             | 
             | Many would disagree. You might enjoy reading [2], which
             | explains just how hard it is for citizens to understand it,
             | from the point of view of an economics professor.
             | 
             | > _and there is an American ethical framework_
             | 
             | Except there isn't, that's the point. For example,
             | Republicans and Democrats obviously believe in deeply
             | divergent ethical frameworks. And there's far more
             | diversity beyond that. Plus there's no way to say that any
             | American ethical framework would even be right -- what if
             | it were wrong and needed correction?
             | 
             | [1] https://en.wikipedia.org/wiki/Value_pluralism
             | 
             | [2]
             | https://papers.ssrn.com/sol3/papers.cfm?abstract_id=999680
        
               | carapace wrote:
               | I dunno: the last sentence of the abstract of [2] is:
               | 
               | > A better understanding of voter irrationality advises
               | us to rely less on democracy and more on the market.
               | 
               | To my mind this immediately brings up the question of why
               | people who are irrational _voters_ would be expected to
               | be rational economic actors.
               | 
               | - - - -
               | 
               | ...Ah! I just looked at it again and saw the sub-heading:
               | "Cato Institute Policy Analysis Series No. 594"
               | 
               | PLONK!
        
             | dragonwriter wrote:
             | > For example, pluralism doesn't state there is no way to
             | "find truth"
             | 
             | Well, there are lots of different ideas lumped together as
             | "pluralism", but most of them not only hold that there is
             | no way to find truth on the issues to which they apply, but
             | that there is no "truth" to be found.
             | 
             | > We live in American society,
             | 
             | Some of us do, some of us don't.
             | 
             | > and there is an American ethical framework in which
             | Facebook's actions can be viewed as unethical.
             | 
              | Sure, but there are many mutually contradictory, and often
              | mutually hostile, American ethical frameworks, so that's
             | true of virtually every actor's actions, and virtually
             | every alternative to those actions.
        
         | pwned1 wrote:
         | The Banality of Evil.
        
         | NoodleIncident wrote:
         | > some product/feature has harmful effect of something/someone
         | but is good for the business of the company
         | 
         | If you start with such black-and-white assumptions, you will
         | never be able to actually empathize with those people. Nothing
         | is that simple when you're close enough to see the details.
         | 
         | Things good for the company should be and frequently are good
         | for the people using the product. The same thing can also harm
         | the same people, or a different set of people, or the company,
         | in a way that's impossible to disentangle from the good.
         | 
         | There's a whole back and forth about Facebook and political
         | divisions. It starts with someone assuming that tech companies
          | put people in bubbles and echo chambers, assuming they'll only
         | be engaged with stuff they agree with. Then you run the
         | numbers, and realize that people are far more isolated from
         | opposing opinions in real life than they are on the internet,
         | you interact with more people online, and they censor
         | themselves less. But at the same time, you can change your mind
          | about echo chambers and decide that this is a bad thing:
          | being exposed to different opinions makes you more entrenched
          | in what you actually believe.
         | 
         | It's never as simple as "this is bad for everyone except us but
         | at least we're getting rich". Everything has more nuance than
          | that when you experience it up close.
        
           | AlexandrB wrote:
           | > Things good for the company should be and frequently are
           | good for the people using the product.
           | 
           | I think there's a misalignment here. In traditional business
           | what you said may be generally true (with some striking
            | counterexamples like cigarette companies). In internet
            | advertising, things good for the company should be and
            | frequently are good for the company's _customers_. Facebook
           | 's users are _not_ its customers, and Facebook is generally
           | incentivized to keep users on the site and consuming content
           | (and advertising) by any means necessary - regardless of the
           | long-term harm it might cause the users.
        
           | clairity wrote:
           | yes, it's never simply black-and-white, but you're
           | overstating that case, especially with facebook. by now,
           | nearly everyone in tech and many adjacent industries (e.g.,
           | entertainment) has heard about and probably internalized the
           | downsides of facebook, particularly the mechanisms and
            | tactics employed to advance facebook to the detriment of
            | society at large. it's pretty clear many of those people at
            | facebook are avoiding or ignoring inconvenient truths when it
            | comes to removing those mechanisms and tactics to the benefit
            | of society at large but to the detriment of facebook.
        
           | DSingularity wrote:
           | People are more isolated in the real world? Please provide a
           | source. Aside from the fact that this is hard to measure now
           | that the underlying medium has itself been modified -- I
           | would hardly expect this to be the case. Online I am
           | connected to those whom I socialize with or am otherwise
           | professionally connected to. In the "real world" this
           | constraint is largely absent.
        
             | NoodleIncident wrote:
             | This is the hardest source I can find, but it only measures
             | what happens on Facebook. The numbers do seem higher than
             | what I'd expect for IRL conversations, though:
             | 
             | https://research.fb.com/blog/2015/05/exposure-to-diverse-
             | inf...
             | 
             | > Online I am connected to those whom I socialize with or
             | am otherwise professionally connected to. In the "real
             | world" this constraint is largely absent.
             | 
             | This seems entirely backwards to me? Maybe you talk more
             | with strangers IRL than online, but I doubt it. I only have
             | n=1 (me), but we are talking right now. Who knows where we
             | live in relation to each other?
             | 
             | So much of politics is split between urban and rural
             | environments. Those groups are defined by where they live,
             | so I expect very few conversations in person between the
             | two, especially about politics.
        
               | DSingularity wrote:
               | Thanks for the link. Reading now. Regarding my reply, I
               | was thinking more about social networking apps like
               | Facebook, Instagram, Snapchat, WhatsApp, or linkedin and
               | less about hackernews/reddit types. Mainly because I
               | think the bulk of social interactions happen there.
        
             | creddit wrote:
             | I highly doubt you and I are socially or professionally
             | connected and yet here we are.
        
               | _tulpa wrote:
               | This connection doesn't mean shit compared to someone you
               | see face to face and share experiences with. Yet this
               | watered down form of connection seems to have replaced
               | the latter, which I think is the fundamental social
               | problem of the internet.
        
               | bradlys wrote:
                | Does the quality of the connection matter? The
               | argument is about being shown different viewpoints and
               | that the internet shows you more than in person.
               | 
               | Is that hard to disagree with? I didn't even know atheism
               | was a thing until I was on the Internet. No one in my
               | community was an atheist and the media we were provided
               | didn't reference it much.
        
               | _tulpa wrote:
               | I think quality is almost the only thing that matters.
               | 
                | Personal anecdotes aside, we're mostly terrible at
                | dealing with new ideas when they conflict with stuff we
                | already know or that is close to our identity. Remove the
                | human element of the connection and we're even more
                | likely to dismiss said conflicting ideas outright as
                | stupid (I'll try to link to that research). It's not hard
                | to imagine how that might lead to strong yet poorly
                | justified social division.
        
             | thisiszilff wrote:
             | It does seem logical: your in person interactions are
             | mediated by your personal relationship with people. Online
             | you can come across anything and everything. The in person
             | equivalent would be walking by ten or twenty small protests
             | set up with megaphones loudly arguing for various things
             | you vehemently disagree with.
        
             | visarga wrote:
             | > In the "real world" this constraint is largely absent.
             | 
             | In the real world you are connected to people living and
             | travelling around you, and that is not necessarily an
             | unbiased set of people. It can be quite far from the
             | average random group. You're still in a bubble.
        
           | MaxBarraclough wrote:
           | > Things good for the company should be and frequently are
           | good for the people using the product. The same thing can
           | also harm the same people, or a different set of people, or
           | the company, in a way that's impossible to disentangle from
           | the good.
           | 
           | > It's never as simple as "this is bad for everyone except us
           | but at least we're getting rich". Everything has more nuance
           | than that when you experience it up close
           | 
           | This too needs more nuance. These points even apply to
           | outright crime. Legal prohibitions should sometimes be
           | expanded in the public interest, because sometimes it
           | essentially is the case that something is bad for everyone
           | except some small group.
           | 
           | This is reflected in the way data-protection laws now exist
           | in many countries, for instance.
        
         | [deleted]
        
         | jhowell wrote:
         | I think what happened here is a little different than how you
         | describe. For me, it seems they had a hypothesis, found support
         | for their hypothesis, then changed its definition for
         | speculative motivations with tangible harm.
        
         | threatofrain wrote:
          | Almost everyone is ethically challenged; we just need the right
          | circumstances for particular expressions to emerge. The people
         | who do right and wrong by you might be alternative persons
         | under alternative scenarios.
         | 
         | The very poor and very rich are often placed in front of
         | ethically interesting bargains, such as a trade of life for
         | money, whereas Hacker News has trouble even daring to ballpark
         | the dollar value to life -- a middle class aesthetic where one
         | has neither the resources nor the desperation to trade in
         | flesh.
        
         | ummonk wrote:
         | Is it clear that echo chambers and polarized discussion are
         | good for the bottom line? I imagine they help with user growth
         | and user retention, but would people engaged in these polarized
         | echo chambers actually spend more on advertised products?
        
         | kace91 wrote:
         | I've been there, obviously not to the level of a facebook board
         | member.
         | 
         | IMO the feeling is not really that different from making
         | choices as a consumer ("was this shirt made by child labor?",
         | "was the animal this meat comes from treated humanely?", etc).
         | People tend to turn a blind eye to those questions unless
         | something comes up that hits close to home.
         | 
         | To be clear, I'm not saying that's justifiable or a good
         | mindset to have, just what I think happens.
        
           | qzw wrote:
           | I think what an FB exec is trying to decide is more analogous
           | to "should we use child labor to make our shirts?" or "should
           | we incur higher costs to run a humane farm?"
        
           | brundolf wrote:
           | That may apply in many cases, but I don't think an engineer
           | or manager at Facebook can use that excuse. They'd have lots
           | of other options.
        
           | forgotmypw17 wrote:
           | This kind of thinking, looking behind the veil of money, has
           | convinced me to stop using currency altogether, for now, for
           | the most part. I still pay for web hosting and domains, I
           | still buy bottled water for lack of better options, but for
           | anything else like clothes, food, houseware stuff, etc., I've
           | stopped buying altogether. Everything you buy carries a huge
           | veiled cost of human health and lives, animal and plant
           | health and lives, environment damage, habitat loss, and so
           | on. I just don't want to be complicit anymore. I wear the
           | same clothes, and I pick up the clothes people leave in boxes
           | on the street or go to churches. There is a glut of
           | consumable goods and the charities are throwing tons of it
           | away everyday. Same goes for food, kitchenware, paintings,
           | decorations. I've been told my great-grandmother used to say,
           | "God gives you a day, and then food for that day." That is
           | the approach I have taken. Went for a walk yesterday, found
           | two paintings. One of them needed finishing, which I'm happy
           | to do. For 3+ years, I have not used any "external" products
           | like shampoo, lotion, cream, etc., not even soap, except
           | occasionally buying a bar of dr bronners soap (paper wrap)
           | and using that for laundry. Almost everything in that
           | department, even the "organic" or "natural" or "eco-friendly"
           | has a long ingredient list full of what I want to avoid both
           | putting on myself, as well as drinking, which is what's going
           | to happen if I put them down the drain. Also, all of it fucks
           | up the skin biome. I've not had any skin problems since I
           | unsubscribed from them. And so on. I know it's not an option
           | for everyone, but it's the only option for me, as long as I
           | have a choice, to choose this way, and keep pondering how to
           | do better every day.
        
             | carapace wrote:
             | I just wanted to say that's awesome and you're my hero. :-)
        
             | hackissimo123 wrote:
             | Where do you get free food?
        
               | cycloptic wrote:
               | It grows in the ground, or around the bones of other
               | living creatures.
        
               | forgotmypw17 wrote:
                | I live in a city, so mostly from dumpsters. Tons of
                | recoverable food is thrown out every day. Way, way more
                | than I can figure out what to do with.
               | 
               | I've also gotten more into fasting and eating less, but
               | so far, no involuntary fasting has occurred.
               | 
               | I've also become more social, so sometimes others share
               | their food with me, even in these difficult times. Yes,
               | they bought it with money, and fed the eco-shaver, but I
               | think it's still less than if I'd done it myself.
               | 
               | Occasionally, I go to restaurants towards closing time,
               | and ask if they have any leftovers they are throwing
               | away.
               | 
                | A great book I read on all this is called "The
                | Scavengers' Manifesto". I learned a lot from meeting
                | others on the street and looking through the trash.
               | 
               | I've done a bit of foraging when in wilder areas, and
               | I've seen places where people grow most of their food
               | themselves, in small communities. I think this is the
               | future.
        
           | dfxm12 wrote:
           | I disagree and think it is significantly different. Facebook
           | decision makers have way more agency in the directions their
           | company takes than a consumer has in their choice of clothes
           | to buy at Target (or wherever).
           | 
           | Shirt consumers don't have much of a choice. They can only
           | buy what's for sale (and in their price range). And then, how
           | can they be sure if a shirt was or wasn't made by child
           | labor? How would an individual consumer's behavior lead to
           | ending child labor?
           | 
           | According to the article, Facebook execs understood what the
           | product was doing, and, while they have the ability to stop
           | it, don't. Maybe I understand what you're saying if we're
           | talking engineers/middle managers, but that's a boring
           | conversation. The buck has to stop somewhere.
        
             | lol636363 wrote:
              | As a consumer, you may not be able to stop child labor,
              | but you can vote with your wallet.
             | 
             | Several of my friends buy clothes from a few vetted brands
             | because of exactly this issue.
             | 
              | Then I have another friend who was a huge cruise ship
              | fan. He encouraged me to go on my first cruise too. But
              | then there was a report about mistreatment of cruise ship
              | employees, and he is totally against cruise ships now.
              | His actions alone probably won't change anything, but if
              | enough consumers start to act like him, a change may
              | happen.
        
               | wolco wrote:
                | It will probably do two things.
                | 
                | If he spends that money locally, it helps the community.
                | 
                | The cruise line will treat employees worse to make up the
                | shortfall in cash. The cruise ship industry needs a
                | tell-all Netflix movie to change things.
        
               | jbay808 wrote:
               | I often wonder. Even if people stop buying, the feedback
               | signal to a company can be very inefficient.
               | 
               | They might not understand where they went wrong and think
               | they need to lower prices or something. Of course, that
               | just leads to more pressure on working conditions.
        
             | tinco wrote:
              | Are you seriously arguing that consumers can't spend $5
              | less on a shirt so that, instead of carrying "BALR.", it
              | was made under less shitty conditions? Consumers have
              | plenty of money for t-shirts; they just choose to spend
              | it on fashion statements instead of thinking about the
              | working conditions of people half a planet away.
             | 
             | There's plenty of choice. It's not about choice, it's about
             | what's on your mind, and what you put on your mind. If you
             | want to look cool, you put the working conditions concern
             | off of your mind. If you want to make money, you put the
             | division concern off of your mind.
             | 
             | The buck stops at every stop.
             | 
              | edit: did a quick google; the first result for a plain
              | white t-shirt that's fair trade is $25, and the first
              | result for a 'fashionable' plain white t-shirt (by BALR.
              | or Supreme) is $60...
        
               | Osiris wrote:
                | Basic economic theories require that consumers have
                | full information and make rational decisions. Neither
                | of those is a valid assumption.
               | 
               | In this case, the vast majority of people don't know if a
               | shirt was made with child labor or not. If this
               | information was clearly communicated to every consumer
               | I'm sure you'd see consumer behavior change to some
               | degree.
        
             | whathappenedto wrote:
             | I actually feel the opposite. Consumers have the ultimate
             | choice -- their choice is not beholden to anyone except
             | themselves. Then they can execute their choice
             | unilaterally.
             | 
             | A VP or even the CEO is beholden to shareholders, their
             | employees, their advertisers, their own ethics, their
             | users, various government regulations (and government
             | interests that are not laws but what they prefer). So
             | almost everything they do is a tradeoff.
        
               | dfxm12 wrote:
               | What a cop out. You can't just pass the buck forever. You
               | want to bring _shareholders_ into this? Was _exploiting
               | the human brain's attraction to divisiveness_ put to a
               | vote? What does it matter when Zuckerberg has a
               | controlling share of the company [0]? He answers to
               | himself.
               | 
                | Facebook spent almost $17MM on lobbying efforts last
                | year [1]. I wonder why governments don't exactly have
                | an eagle eye on this...
               | 
               | The rank and file employees at Facebook have no say about
               | this. Tim Bray leaving Amazon to no ill effect shows
               | this.
               | 
               | We're talking about Facebook _exploiting the human brain
               | to increase time on the platform_. The users have little
               | to say about this, and as long as the users are there,
               | advertisers have nothing to say to Facebook.
               | 
               | So that leaves Facebook answering to their own ethics.
               | Yes. _that 's_ the problem.
               | 
               | 0 - https://www.investopedia.com/articles/insights/082216
               | /top-9-...
               | 
               | 1 - https://www.opensecrets.org/federal-
               | lobbying/clients/summary...
        
               | FactolSarin wrote:
               | A corporation is a device for maximizing profit and
                | minimizing ethics. Everyone can say they're behaving
                | ethically. Consumers can say, "Well, all my friends are
               | there, I can't quit," and it's true for some people. The
               | CEO and other decision-makers can say, "Well, I have to
               | do this otherwise the shares go down and I could get
               | fired," and they may be right. Shareholders can say, "I'm
               | just investing in the most profitable companies, if they
               | were doing something bad, it should be illegal," and they
               | have a point too.
               | 
               | This is where governments come in. Companies should
               | behave ethically, but ultimately we shouldn't just leave
               | it up to them. That's why societies have laws. What we
               | really need to do is use regulation and penalties to
               | force Facebook into ethical behaviour.
               | 
               | Of course, this isn't going to happen because there's no
               | political will to do so, generally due to "free speech"
               | or "free market" objections.
        
               | whathappenedto wrote:
               | This is not passing the buck. It's acknowledging that
               | there are many stakeholders involved in a
               | company+platform, and that many decisions are about
               | making tradeoffs rather than having a "right" answer.
               | 
                | If you always go with the populist vote, like when
                | users revolted over the news feed when it was first
                | introduced,
               | https://techcrunch.com/2006/09/06/facebook-users-revolt-
               | face... then you may be sacrificing the long-term
               | viability of your company. This harms employees,
               | investors, and eventually the public. Are you saying
               | that's not even a consideration at all?
               | 
               | We're not talking about "Facebook exploiting the human
               | brain to increase time on the platform". You brought up
               | Target and shirts. So we're talking about who has more
               | agency, users or executives, in a general manner. That
               | consumers generally only need to concern themselves with
               | their own ethics, versus the complex entanglement of
               | ethics at a company, gives users more agency to make
               | choices reflecting their ethics.
        
               | [deleted]
        
               | [deleted]
        
             | wolco wrote:
              | Why couldn't you choose where to buy your shirt? Shirts
              | can be made anywhere; they should be one of the easiest
              | products to find multiple vendors for.
              | 
              | If you are saying that at Walmart or another big chain
              | they only have 4 brands in your price range, and you
              | can't tell which ones involve child labor: you could
              | research it if you cared. By not buying a brand you
              | reduce your risk by 99%.
        
         | Hokusai wrote:
          | The Boeing 737 MAX killed 346 people. So, it seems that
          | death is not a deterrent.
         | 
          | The emails from the case are good for understanding the
          | internal discussions:
         | https://www.theguardian.com/business/2020/jan/10/737-max-sca...
        
           | nickff wrote:
           | > "Boeing 737 MAX killed 346 people. So, it seems that death
           | is not a deterrent."
           | 
           | I really don't understand your point, unless you're implying
           | that there was a meeting where Boeing planned to kill those
           | people. I am not an aviation expert, but what happened with
           | the MAX seems to be a product of the certification process,
           | urgent business needs, systems engineering issues, and bad
           | internal communications at Boeing.
           | 
           | I haven't seen any evidence that someone specifically
           | predicted the chain of events which would unfold on those
           | flights, and clearly communicated the issue, then had
           | executive(s) respond that it was 'worth the money'.
           | 
           | As an aside, I have seen quotes about the 787, which were
           | similar to those in your linked article (mostly with respect
           | to production quality issues), yet the 787 has not had
           | similar accidents. One problem with working on such huge
           | projects is that the line engineers do not understand that
           | managers are constantly hearing alarmist 'warnings' which
           | don't pan out. If 1% of Boeing staff give false alarms in a
           | year, that means there are 1600 false alarms.
        
             | whatever1 wrote:
              | Wrong. Boeing engineers raised concerns that were
              | dismissed.
             | 
             | "Frankly right now all my internal warning bells are going
             | off," said the email. "And for the first time in my life,
             | I'm sorry to say that I'm hesitant about putting my family
             | on a Boeing airplane." [1]
             | 
             | [1]https://www.cnbc.com/2019/10/30/boeing-engineer-raised-
             | conce...
        
               | nickff wrote:
               | I didn't say that nobody raised concerns, I said:
               | 
               | >>"I haven't seen any evidence that someone specifically
               | predicted the chain of events which would unfold on those
               | flights, and clearly communicated the issue, then had
               | executive(s) respond that it was 'worth the money'."
               | 
               | In large projects like the MAX, there are always people
               | raising concerns.
        
               | whatshisface wrote:
               | > _In large projects like the MAX, there are always
               | people raising concerns._
               | 
               | Does that mean that people raising concerns can be
               | ignored, or does that mean that most large projects only
               | get by with luck?
        
               | nickff wrote:
               | I think that's a really interesting question, but I think
               | the answer is orthogonal to your dichotomy. In my
                | experience, very successful projects depend on great
                | managers who know who to listen to in each situation,
                | and who know how people will react in each one.
               | 
                | One of the best examples of this is Dave Lewis, who
                | led the design of the F-4 Phantom II, one of the most
               | successful fighter aircraft of all time. He directed the
               | structural design team to design for 80% of the required
               | ultimate load, because he knew that everyone was
               | conservative in their numbers; then the design was
               | tested. The structure ended up lighter than comparable
               | aircraft, and the Phantom II had phenomenal performance.
               | 
               | It also helps if the managers are good at making
               | predictions of their own; Tetlock has written two great
               | books about this, including: https://en.wikipedia.org/wik
               | i/Superforecasting:_The_Art_and_...
        
             | Hokusai wrote:
             | > I haven't seen any evidence that someone specifically
             | predicted the chain of events which would unfold on those
             | flights, and clearly communicated the issue, then had
             | executive(s) respond that it was 'worth the money'.
             | 
              | People understand the consequences of what they say. I
              | doubt that most people would say such statements out
              | loud, even when they know they are true.
             | 
             | But, people knew and money was involved.
             | 
             | * February 2018
             | 
             | "I don't know how to refer to the very very few of us on
             | the program who are interested only in truth..."
             | 
             | "Would you put your family on a MAX simulator trained
             | aircraft? I wouldn't."
             | 
             | "No."
             | 
             | * August 2015
             | 
              | "I just Jedi mind tricked these fools. I should be given
             | $1000 every time I take one of these calls. I save this
             | company a sick amount of $$$$."
        
               | nickff wrote:
               | I have read similar quotes about most modern aircraft
               | development programs, yet aviation is quite safe. The
               | fact you can find a few alarmists in a company of 160,000
               | is rather unsurprising.
               | 
               | Those quotes would be much more convincing if those
               | employees put _every_ prediction they ever made on the
               | record, not just the ones that turned out to be sort-of
               | right in hindsight.
               | 
                | From a manager's perspective, you can't listen to
                | everyone complaining about being rushed, understaffed,
                | and underfunded (because everyone looking to cover
                | their butt in a bureaucracy does all three). On the
                | other hand, you have to be on the lookout for credible
                | issues.
        
               | whatever1 wrote:
                | You cannot have it both ways. First you claim that no
                | one spoke up, and then you dismiss the ones who did as
                | alarmists.
        
               | nickff wrote:
               | If someone does not make specific and testable
               | predictions which turn out to be right, they are useless
               | alarmists. If you want to read about how to assess
               | predictors (and improve predictions), I suggest you read:
               | https://en.wikipedia.org/wiki/Superforecasting:_The_Art_a
               | nd_...
        
               | anigbrowl wrote:
               | I'm a very good forecaster, and skill in this area
               | doesn't stem from reflexive dismissal or ability to
               | deploy fallacious counter-arguments.
        
               | nickff wrote:
               | Would you please point out the fallacy?
        
             | lol636363 wrote:
             | Of course, no one planned it. But encouraging or demanding
             | to take shortcuts is what caused it.
             | 
              | I have been in the software industry for 15 years and
              | this happens all the time: being forced to release
              | unfinished features, asked to ignore security, backups,
              | etc. I would imagine the same thing happens in other
              | industries.
        
               | nickff wrote:
               | My understanding of the MAX issues is that the issues
               | were not really shortcuts, though they might look that
               | way in hindsight (because every mistake looks that way in
               | hindsight).
               | 
               | From my non-aviation perspective, it looks like they
               | basically pieced together a bunch of complex systems,
               | with each team making a number of (different) assumptions
               | about each system. The systems themselves were influenced
               | by FAA requirements to maintain the old certificate,
               | which meant that certain desirable changes were
               | impossible, so workarounds were devised. The problems
               | were due to misunderstandings about how the systems would
               | work when assembled, and these issues were not discovered
               | and/or communicated. It really seems like a systems
               | engineering problem, aggravated by a number of external
               | influences (including business reasons and
               | certification).
        
               | babesh wrote:
                | There is no FAA requirement to maintain the old
                | certificate. Boeing and its customers wanted to do
                | that for cost savings.
               | 
                | It is supposedly costly in time and money to acquire a
                | new rating, but it has obviously been done.
               | 
                | The airlines wanted a single pool of interchangeable
                | pilots flying nominally interchangeable planes (their
                | existing 737s and the 737 MAX). Supposedly one of the
                | airlines threatened to take new business to Airbus and
                | had penalties written into the contract to make the
                | 737 MAX fly under the existing certificate.
                | 
                | So it wasn't the old certificate driving these issues;
                | it was Boeing and its customers wanting to maintain
                | the old certificate that drove the issues. That is a
                | very large difference.
        
               | nickff wrote:
               | Perhaps my previous post was vague, but I meant 'FAA
               | requirements [of commonality, required to] maintain the
               | current certificate'.
               | 
               | The FAA may be in the right or in the wrong, but it has
               | made certifying new designs almost prohibitively
               | expensive and time-consuming; for evidence of this,
               | simply look at the Cessna 172 (still in production on a
               | 60-year old certificate), and what happened when
               | Bombardier tried to put a new airliner into production.
               | 
                | You're definitely right that the airlines wanted
                | interchangeable type ratings for crew, but the issue
                | is slightly more complicated than you're painting it.
               | 
                | I never argued that the old certificate forced the
                | issues; the certification system just strongly
                | incentivized 'upgrading' the 737. This was one of many
                | causes.
        
             | shadowgovt wrote:
             | It's more that there were several meetings where issues
             | were raised that would kill people _if_ they occurred, and
             | those in charge decided the risk factors were minimal
             | enough that they could execute on the plan.
             | 
             | Nobody planned to kill the astronauts on the Challenger.
             | Such a systemic failure to anticipate and manage risk
             | correctly is a team effort and heavily incentive-driven.
             | Putting incentives in place that reward risk-taking
             | increases the odds someone will die.
        
               | babesh wrote:
               | More concisely, if you won't do it, then you will be
               | replaced by someone who will.
        
               | nickff wrote:
               | I think I have a very different understanding of the root
               | cause of the o-ring failure on Challenger than you do.
               | 
               | The common understanding seems to be that the managers
               | decided to launch when the booster temperature was cold
               | (though not necessarily out of limits), and some were
               | warning that it may cause some unforeseen issues.
               | 
               | My read is that each limit in the operations manual
               | should have been backed by a test to failure, or at least
               | a simulation of what would occur if the vehicle was
               | operated outside the limits. Such a process allows the
               | operators to clearly understand what can go wrong, and
               | why the limits are set where they are. This is what they
               | did on the SSMEs, but not on the boosters (because they
               | thought the boosters were fairly simple).[0]
               | 
               | [0] https://ocw.mit.edu/courses/aeronautics-and-
               | astronautics/16-...
        
           | umvi wrote:
           | > So, it seems that death is not a deterrent.
           | 
           | Well, the tobacco industry is still alive and well, and those
           | companies literally peddle death.
        
             | jcims wrote:
             | They peddle a high risk product. So do companies that
             | manufacture motorcycles and parachutes.
        
               | AlexandrB wrote:
               | This comparison is flawed in several respects. The most
               | obvious is that cigarette companies spent decades
               | _intentionally misleading_ the public about the dangers
               | of their product. This is not the same as just selling a
               | potentially dangerous product, especially one where the
               | dangers are so viscerally obvious as with a parachute.
        
               | jonny_eh wrote:
               | > parachutes
               | 
               | Parachutes kill people? I thought they do the opposite.
               | Maybe firearms or alcohol make better examples.
        
               | jcims wrote:
               | The habitual use of any of the above will increase your
               | chances of untimely death.
        
               | jonny_eh wrote:
               | But in the case of parachutes, it's not the device, it's
               | the activity. I know it's splitting hairs, but it's
               | important, especially when it comes to assigning moral
               | responsibility to manufacturers.
        
               | dylan604 wrote:
               | If you use a parachute one time in case of emergency,
               | yes, it is a life saving device that still has a high
               | level of risk. However, I believe they were referring to
               | the people that choose to parachute for sport/recreation
               | rather than emergency situations.
        
         | bjt2n3904 wrote:
         | > I've always wondered how such discussions go in company
         | meetings where some product/feature has harmful effect of
         | something/someone but is good for the business of the company.
         | 
          | I mean, it's one thing if we're talking about something like
          | an airbag, where harm can result from normal usage because
          | of a design flaw. It's another thing to talk about the Ford
          | Pinto -- where harm could happen due to accidental misuse.
         | 
         | Does Facebook encourage division? Do ice cream ads encourage
         | obesity? Or alcohol ads encourage drunk driving? (I get that
          | Facebook's "engagement algorithms" are designed to maximize
          | profit, and have a side effect of showing you things that
          | are upsetting and frustrating... but that isn't their
          | design. I'm
         | no fan of "the algorithm", and don't think they should use it,
         | but I think they should be free to.)
         | 
         | In this instance, I don't think it's fair to say Facebook has a
         | "harmful effect". The abuse, misuse, and addiction to Facebook
         | can be harmful, for sure... but that's not Facebook's fault.
         | That's the end user's fault.
         | 
         | Should Facebook come with a warning label, like cigarettes? I
         | don't think so. (I also don't think cigarettes should be
         | mandated to come with images of people dying of lung cancer
         | when alcohol can be sold without images of people with liver
         | disease... but I digress.)
         | 
         | Everyone wants to "mitigate harm". But you need to be able to
         | separate "harm due to malfunction", "harm due to accidents",
         | and "harm due to abuse". This seems to be firmly in the third
         | category, which is the least concrete and most "squishy"
         | category.
         | 
         | Especially squishy, when "harm" is considered to be people
         | saying and/or thinking the wrong things.
        
           | visarga wrote:
           | > In this instance, I don't think it's fair to say Facebook
           | has a "harmful effect". The abuse, misuse, and addiction to
           | Facebook can be harmful, for sure... but that's not
           | Facebook's fault. That's the end user's fault.
           | 
           | Yeah, it wasn't me who posted this reply, it was the cells in
           | my body. It's their fault... I think complex systems create
           | effects that go beyond the individual parts. Facebook is
           | running and profiting from such an 'effect' on society.
           | 
           | Their right to freely express their creativity by making the
           | feed how they wish should be balanced with the large scale
           | (negative) effects that appear in the system.
        
         | jimmaswell wrote:
          | I'm ethically challenged if I think the biggest (or at least
         | up there) forum of public discourse shouldn't be micromanaged
         | like a day care, with "divisive" people sent to time out? Is it
         | unthinkable to you that some people value free expression over
         | being protected from negativity?
        
         | pacala wrote:
          | As actors in the world, we are machines that turn sensor
          | data into a linear stream of actions. To the extent the
          | decision process is not completely random, there exists a
          | metric that ends up maximized by the decision process,
          | sometimes referred
         | to as 'god' or even 'God'. The vast majority of economic
         | decision processes in the modern economy are driven by one
         | metric: money, sometimes referred to as 'Mammon'. A corporation
         | is an aggregation of human / computerized actors that work to
         | maximize the corporation metric: money earned by said
         | corporation.
         | 
         | The discussions are very simple: Course of action A makes us
         | X$, course of action B makes us XXX$. Therefore course of
         | action B is taken. There is no consideration of other effects
          | besides, perhaps, a quantification of risks. Risk of losing the
         | 'good guys' facade, counterbalanced by PR expenses, or risk of
         | being sued, counterbalanced by legal expenses.
        
         | [deleted]
        
         | DataWorker wrote:
         | Nobody thinks they are complicit but in reality we all are.
         | Some can accept this while others let the cognitive dissonance
         | drive their behavior in convoluted and hard to discern ways.
         | Redemption only comes after accepting that we're born of
         | original sin. Anybody who supports or uses non-free software
         | has worked to finance the amoral tech decision making that
         | you're decrying. Even Stallman makes compromises. Welcome to
         | modernity.
        
         | truculent wrote:
         | You don't apply, don't get hired, or don't get promoted,
         | depending on how effective their hiring processes are.
        
           | yumraj wrote:
           | That leads me to another question: are there people/companies
           | who will not hire someone who has been an employee of FB?
        
             | babesh wrote:
             | I have heard someone spouting that but it was all baloney
             | since that person supported actions just as bad.
        
             | truculent wrote:
             | That seems generally unlikely to me, but it's a big ol'
             | world out there so I am sure it has occurred someplace
        
         | gabaix wrote:
         | Facebook internal memo by Andrew Bosworth, VP June 18, 2016
         | 
         |  _The Ugly
         | 
         | We talk about the good and the bad of our work often. I want to
         | talk about the ugly.
         | 
         | We connect people.
         | 
         | That can be good if they make it positive. Maybe someone finds
         | love. Maybe it even saves the life of someone on the brink of
         | suicide.
         | 
         | So we connect more people
         | 
         | That can be bad if they make it negative. Maybe it costs a life
         | by exposing someone to bullies. Maybe someone dies in a
         | terrorist attack coordinated on our tools.
         | 
         | And still we connect people.
         | 
         | The ugly truth is that we believe in connecting people so
         | deeply that anything that allows us to connect more people more
         | often is de facto good. It is perhaps the only area where the
         | metrics do tell the true story as far as we are concerned.
         | 
         | That isn't something we are doing for ourselves. Or for our
         | stock price (ha!). It is literally just what we do. We connect
         | people. Period.
         | 
         | That's why all the work we do in growth is justified. All the
         | questionable contact importing practices. All the subtle
         | language that helps people stay searchable by friends. All of
         | the work we do to bring more communication in. The work we will
         | likely have to do in China some day. All of it.
         | 
         | The natural state of the world is not connected. It is not
         | unified. It is fragmented by borders, languages, and
         | increasingly by different products. The best products don't
         | win. The ones everyone use win.
         | 
         | I know a lot of people don't want to hear this. Most of us have
         | the luxury of working in the warm glow of building products
         | consumers love. But make no mistake, growth tactics are how we
         | got here. If you joined the company because it is doing great
         | work, that's why we get to do that great work. We do have great
         | products but we still wouldn't be half our size without pushing
         | the envelope on growth. Nothing makes Facebook as valuable as
         | having your friends on it, and no product decisions have gotten
         | as many friends on as the ones made in growth. Not photo
         | tagging. Not news feed. Not messenger. Nothing.
         | 
         | In almost all of our work, we have to answer hard questions
         | about what we believe. We have to justify the metrics and make
         | sure they aren't losing out on a bigger picture. But connecting
         | people. That's our imperative. Because that's what we do. We
         | connect people._
         | 
         | Shortly after the leak Bosworth distanced himself from the
         | post. https://www.theverge.com/2018/3/29/17178086/facebook-
         | growth-...
        
           | brlewis wrote:
           | > _Nothing makes Facebook as valuable as having your friends
           | on it, and no product decisions have gotten as many friends
           | on as the ones made in growth. Not photo tagging. Not news
           | feed. Not messenger. Nothing_
           | 
           | Is this certain? The effects of useful features on growth are
           | longer term and harder to measure than, for example, placing
           | and styling friend suggestions in a way to confuse users into
           | thinking they're friend requests.
        
           | nkozyra wrote:
            | Even that is very handwave-y. It talks about "connections"
            | and events, but not how the algorithm (in the broad,
            | commonly-used sense) encourages and incentivizes whatever
            | builds "engagement."
        
           | rainyMammoth wrote:
            | This sounds like complete bullsh*t.
           | 
           | Where does he bring up the subject of Facebook connecting
           | people to the level of addiction? With the only goal of
           | maximizing screen time (and dopamine) to sell more ads? It's
           | not "connecting people", it's "addicting people".
           | 
            | It is as if a food bank for Africa were bragging that they
            | feed the continent so well that 90% of Africa is now
            | overweight, but that's good because they continue to "feed
            | people".
        
           | robrenaud wrote:
           | I read that discussion as it was happening on the internal
           | FB@work. Oh man, there were so many true believers replying
           | about how this was so wise and inspiring. As far as I
          | remember, no one questioned him. I wish I had posted that, in
          | a biological context, something that grows without bound or
          | care for its environment is cancer. There was Boz, arguing
          | that Facebook is cancer.
        
             | jcims wrote:
             | Cancer is just a specialized case of evolution that in many
             | instances is turbocharged by genetic
             | instability...essentially the biological form of 'move fast
              | and break things'. This results in a very adaptive germline
             | that handily outcompetes everything constrained by purpose
             | while also overcoming novel threats thrown at it by the
             | greatest medical minds of our time.
             | 
             | If it didn't kill people that we love we'd marvel at its
             | capability.
             | 
             | Is Facebook a 'cancer'? I think it's more of a cultural
             | radiological device that exposes the cancer that's already
             | there.
        
           | TheRealDunkirk wrote:
           | I mean, he's not wrong. Facebook sucks because a lot of
           | people are not-great human beings, and Facebook just allows
           | you to see that. Oops. People might think that peer pressure
           | would shame people into better behavior, but the concept of
           | shame no longer exists in the post-modern world. Everyone
           | feels justified in whatever they believe, and the Covid-19
            | situation on the platform couldn't be a more perfect
            | example of the problem.
           | 
           | I say this from first-hand experience. I discovered that
           | people I called friends were racist. I now consider those
           | friends merely acquaintances, and I have since deleted my
           | account. Better to just be ignorant of people's ignorance
           | when I can't do anything about it.
        
         | throwaway894345 wrote:
          | > I cannot believe that everyone is ethically challenged, only
         | perhaps the people in control.
         | 
         | Seems likely that social media as an industry selects more
         | strongly for unethical executives, presumably because online
         | advertising is the only effective way to monetize social media
         | and it is more or less fundamentally unethical. I imagine the
         | same effect can be observed among tobacco and fossil energy
         | executives--these are industries where there is no ethical
         | monetization strategy, at least not one that is in the same
         | competitive ballpark as the unethical strategy.
        
           | spaced-out wrote:
           | >Seems likely that social media as an industry selects more
           | strongly for unethical executives
           | 
            | More so than the fossil fuel industry? Big tobacco? Or the
            | pharmaceutical industry? Wall Street? Clothing/apparel
            | manufacturers?
        
             | throwaway894345 wrote:
             | > More so than the fossil fuel industry? Big tobacco?
             | 
             | I already addressed this in my second sentence:
             | 
             | > I imagine the same effect can be observed among tobacco
             | and fossil energy executives
             | 
             | No, that wasn't meant to be an exhaustive list of unethical
             | industries.
        
           | ptudan wrote:
           | Online advertising as a concept is fundamentally unethical? I
           | think you're speaking in hyperbole here. Stealing user data
          | without consent (or with fake "here, read this 500-page
          | legalese" consent) is unethical for certain.
           | 
           | But a bike blog putting ads for bike saddles on the bottom of
           | their page to pay for their server costs and writing staff?
           | Hard to see how that's unethical unless you think selling
           | anything is unethical.
        
             | throwaway894345 wrote:
             | > Online advertising as a concept is fundamentally
             | unethical?
             | 
             | No, I meant "online advertising as an industry". It's
             | unethical to the extent that it depends on stealing user
             | data, which presumably is the overwhelming majority of the
              | industry by value (i.e., I'm assuming your privacy-
              | respecting bike-saddle ads don't account for even 1% of
             | the industry's value).
        
         | coliveira wrote:
          | The discussions in this article are never shared with
          | employees; they are matters raised in closed, high-level
          | board meetings. Companies never openly discuss negative
          | positions, and if they do it is only to dismiss them.
        
         | standardUser wrote:
         | I've been in that situation. I argued as much as I felt I could
         | get away with and made the strongest arguments I could against
         | unethical behavior. I was eventually forced out. A couple years
         | later, the company was investigated by law enforcement and
         | subsequently declared bankruptcy.
         | 
         | The people in control were the only ones pushing for the
         | unethical actions, but most others were a lot more quiet than I
         | was and several stuck around until the bitter end.
        
         | xapata wrote:
         | > how such discussions go
         | 
         | In my case, I told my manager about a system design problem
         | that would cause a daily annoyance to 100k people, forcing them
         | to input their passwords more often than necessary. He said,
         | "they'll accept it." I said, "I quit."
        
         | arkades wrote:
         | So, I work in healthcare - as a doc, and at various times, as
         | an admin in healthcare centers as well as in health insurance.
         | I don't know how much of that experience relates to FB's
         | behavior, but I have some idea of what it's like to work in a
         | field and be either called a hero or a devil, depending on the
         | day. I am neither.
         | 
         | Deep breath.
         | 
         | As an industry, we are often doing things that are perceived to
         | be evil. I've noticed the following:
         | 
         | 1. Some of that interpretation is just wrong. People from the
         | outside tend to have a poor understanding of what we do
         | (providers, centers, insurers) and draw conclusions based on
         | highly imperfect information. This is compounded by the fact
         | that journalists have a terrible comprehension of what we do
         | and an incentive to dramatize and oversimplify it - resulting
          | in people _reading_ the news and walking away misinformed and
          | wrongly feeling like they're now educated on the topic. This
         | happens a lot.
         | 
         | 2. We sometimes do things, or want to do things, that have
          | potential harms _and_ potential benefits - e.g., in health
          | insurance, I'd love to have had the ability to twist people's
         | arms into coming to get a flu shot. It would have been a huge
         | net benefit to their health. It would have been a net reduction
         | in our costs. It would have been great! If we'd had the ability
         | to ignore patient autonomy and force it, or carrot-and-stick
         | it, we probably would have. We would not have conceptualized it
         | as "ignoring patient preference," we would have conceptualized
         | it as "preventing a bunch of preventable hospitalizations and
         | deaths and, for the elderly, permanent consequences of
         | hospitalizations." And that would have been true! And would
         | have allowed us to not think about the trade-off so much. It's
         | not lying to yourself: it's looking at the grey, round-edged
         | parts of a cost-benefit analysis and subjectively leaning it in
         | your direction. My motivation there isn't even about the money
         | - the money just gets it on the radar as something my employer
         | would be willing to prioritize.
         | 
         | 3. Resource scarcity. I only have so many resources to
         | allocate. One may benefit a patient X; another may benefit them
          | 10X. If X benefits my organization and the 10X choice doesn't,
         | I'll probably choose X. By itself I'm not choosing to do harm -
         | I'm choosing a win/win. Enough decisions like that, in enough
         | contexts, probably do give rise to net harm. But the choice
         | isn't to do harm.
         | 
         | 4. Not every battle can be a "will I burn my career over this?"
         | battle. If I'd ever been faced with a choice that I thought was
         | harm > benefit to patients, I would have burnt the house down
         | over it. But I haven't. I've been faced with lots of little
         | grey questions with uncertain costs and uncertain benefits
         | where there was, in fact, benefit, and usually not just to us
         | but to the patients too. I imagine that's where most
         | organizations go awry: a thousand decisions like this, shaking
         | out under the pervasive organizational need for profit. Like a
         | million million particles of sand moved by the tide, settling
         | out into an overall pattern due to gravity. I think the badness
         | is generally an emergent pattern, not a single person choosing
         | to do evil, or choosing themselves over causing harm to many.
         | I've never been in that position, ever, so either my career is
         | highly anomalous, or that's just not how those choices present
         | themselves in real life. I suspect it's the latter. (Or, I
         | guess, my being amoral is a valid third possibility.)
        
         | Spooky23 wrote:
         | People are capable of all sorts of mental gymnastics to keep
         | things at arm's length. Bad practice X is because of group Y or
         | requirement Z.
        
         | troughway wrote:
         | Where do you draw the line?
         | 
          | If the customers are willing to pay a huge markup on a
          | product, who are you to tell them otherwise?
         | 
         | - youdontchargeenough11 (probably)
        
           | yumraj wrote:
            | This is more than just a price markup.
           | 
            | This is more like a pharma company or Monsanto knowing that
            | their product kills, but ignoring or hiding the data and
            | continuing to sell the product.
        
             | troughway wrote:
             | Division on FB started out as squabbles between friends and
             | relatives.
             | 
             | And yet, here we are.
        
         | MattGaiser wrote:
         | People tend to rationalize it as not that ethically challenging
         | or by compensating through some other societal benefit.
         | 
         | I knew someone who ran a FB group that devolved into conspiracy
         | theories and absurd levels of anger to the point that members
         | of the group were lashing out at local politicians.
         | 
          | The group owner liked the power and influence, so they
          | rationalized it as "increasing public engagement in
          | politics." This person
         | is otherwise a vegetarian who fosters animals and works in the
         | medical field.
        
         | brundolf wrote:
         | I think it's mostly denialism (which is cultivated by
         | management). This is a great article about it:
         | https://newrepublic.com/article/155212/worked-capital-one-fi...
        
         | dahart wrote:
         | > I cannot believe that everyone is ethically challenged
         | 
         | Right, so what assumptions are leading to the conclusion that
         | this situation can only be caused by everyone being ethically
         | challenged? Are ethics shared and absolute enough for the
         | answer to this question to be easy or black & white?
         | https://en.wikipedia.org/wiki/Moral_relativism
         | 
         | > Luckily I've never had to face such a dilemma
         | 
         | Are you certain about that? I realize you're talking
         | specifically about C-level execs debating something in a board
         | room, but consider the ways that we all face lesser versions of
         | the same dilemma. For example, do you ever consume and/or pay
         | money for things that are generally harmful to society?
         | Environmental concerns are easy to pick on since more or less
         | everything we buy has negative environmental effects... ever
         | bought a car? flown on an airplane? Smoked a cigarette or
         | enjoyed a backyard fire pit? Bought anything unnecessarily
         | wrapped in plastic? It's _really_ hard to make the less harmful
         | choice, and a lot of people don't care at all, so by and large
         | as a society we put up with the harm in favor of convenience.
         | As consumers, we are at least half of the equation that is
         | leading to socially harmful products existing. If we didn't
         | consume it, the company meetings wouldn't have anything to
         | debate.
        
         | toshk wrote:
         | From my experience there are very strong currents in a group
         | that are very hard to go against as an individual. Only very
         | contrarian people will go against the grain in formal meetings
         | with high level executives or other individuals with status in
          | a group. This is why big organizations are often able to
          | produce decisions that the team behind them doesn't agree with
         | and that look silly from the outside. Many people in such a
         | team will not feel personally responsible because they feel
         | like they didn't have any influence on the decision making
          | process, even if they could have said something. There are other
         | dynamics at play I think, but this is one of them. (The
         | contrarians seem to not survive long in the corporate world)
        
           | sjg007 wrote:
           | This dynamic is present in FB the website as well. You find
            | clusters or groups of folks who re-amplify a point. It's so
            | effective that you can find "Re-Open" rallies in your state
            | driven by a shady "gun-rights" nonprofit, even though
            | polling largely supports the lockdown and the actions taken
            | to curb the pandemic. You also find that outside the group
            | people are a lot more nuanced and reasonable. It's
            | fascinating. What is even more concerning is that a lot of
            | bots drive this behavior.
           | 
           | I think the issue is that in the long term it dilutes FB. I
           | know many people who don't post on FB, preferring Instagram
           | etc... I know these are still FB platforms but it's a big
           | shift. So FB will eventually become Usenet and effectively
           | non-functional.
           | 
           | There's some type of social network that's between Instagram
           | and FB that doesn't exist yet.
        
           | abawany wrote:
            | Also, IME, if you do say something, others jump down your
            | throat quickly and viciously. I still remember one former
            | cow-orker and his words: "they debate, they decide, we
            | deliver." That project ended up losing the company millions
            | and left it a has-been in ecommerce, because people chose
            | to accept and support the utter insanity that was going on
            | right in front of their faces.
        
           | rewoi wrote:
            | As a programmer I am not responsible (nor paid) for
            | management decisions. It is also not my job to fix a toxic
            | culture in a company.
        
             | hnruss wrote:
             | As management is responsible for bad management decisions,
             | so too is the programmer responsible for implementing bad
             | decisions.
        
         | analyst74 wrote:
         | Not at boardroom level, but I was in a couple meetings in past
         | jobs where this happened.
         | 
          | In one case, people had different ideas of what was more
          | ethical/user-friendly. Since we couldn't resolve those
          | disagreements with more arguing, we went with metrics, and
          | metrics have no morality.
         | 
          | In another case, everyone agreed that it was slightly shady,
          | but it was a highly competitive market and we had to do it to
          | stay alive.
         | 
          | On the bright side, if a company ventures too deep into bad
          | practices, it will eventually lose the public's trust. Which
          | is why the capitalist world hasn't descended into the
          | complete madness portrayed in dystopian sci-fi films.
        
         | zarkov99 wrote:
         | They quit. The process selects for the most sociopathic because
         | the fitness function is heavily weighted towards bringing
         | profits in the short term. Ethics are only a consideration to
         | the extent that they affect public perception (hence profits)
          | or safeguard against litigation (protecting profits).
        
         | mc32 wrote:
         | What kind of harm do you propose is the kind that should have
         | pushback?
         | 
          | Do movie executives even discuss the ramifications of their
          | movies, which glorify ills? Do they censor violence, suicide,
          | etc.?
        
           | Hokusai wrote:
           | > What kind of harm do you propose is the kind that should
           | have pushback?
           | 
           | "Some 700,000 members of the Rohingya community had recently
           | fled the country amid a military crackdown and ethnic
           | violence. In March, a United Nations investigator said
           | Facebook was used to incite violence and hatred against the
           | Muslim minority group. The platform, she said, had "turned
           | into a beast."" https://www.reuters.com/investigates/special-
           | report/myanmar-...
        
             | mc32 wrote:
             | So why facebook but not movies and TV over the air or
             | streamed via other platforms? What, because it comes from
             | studios and other sanctioned organs? Are they above
             | propaganda and above having agendas?
             | 
             | I'm not saying FB is not culpable, but I'm saying if they
             | are, then so are others.
        
               | Hokusai wrote:
               | > above having agendas?
               | 
               | Having an agenda is normal and is good. Everybody that
               | plans for the future has an agenda. What is wrong is to
               | have a "hidden agenda".
               | 
                | A "hidden agenda" is wrong because it is a form of
                | manipulation. When an organization has a "hidden
                | agenda", it means that they are lying to achieve a goal
                | that they are hiding.
               | 
                | If a movie's agenda is to "create awareness of human
                | trafficking", and it shows how human trafficking
                | impacts people's lives, that is not "hidden", and it is
                | actually an agenda that most people support.
               | 
                | So, to have an agenda is intelligent, needed, common,
                | awesome behavior. Stones have no agenda; rocks have no
                | agenda. To have a "hidden agenda" is what should be
                | criticized.
               | 
                | Why would anyone think that to have an agenda is bad?
        
         | danharaj wrote:
         | Capitalist systems sieve out people whose goals are at odds
         | with the accumulation of capital. By the time you get to a
         | boardroom, everyone has been tested hundreds of times for their
         | loyalty to profit. All deviations are unstable: over a long
         | enough period of time they will be replaced or outcompeted.
        
           | dtech wrote:
              | Yep, CEOs are sociopaths at about 4-10x the normal rate
              | [1] for this reason.
           | 
           | [1] https://www.forbes.com/sites/jackmccullough/2019/12/09/th
           | e-p...
        
             | arkades wrote:
             | By an arbitrary definition of sociopath invented by a
             | researcher that has little to do with the commonly-accepted
             | definition of sociopath, who used his broadened definition
             | to build his career on the pillar of running around making
             | surprising declarations about "sociopaths."
             | 
             | I'm really, really tired of hearing about the "sociopath
             | CEO" numbers. They're not real.
        
           | pm90 wrote:
           | Not sure why this comment is being downvoted. The people who
           | rise through the ranks are exactly the kind unburdened by
           | ethical or moral issues that get in the way of the business
           | generating revenue. In fact, such folks use their short term
            | gains from breaking such implicit expectations to propel
            | themselves ahead of their peers. As such, this kind of
           | behavior is incentivized.
           | 
            | Those with such issues either quit or work in non-
            | controversial parts of the org.
        
             | danharaj wrote:
             | When you describe capitalist processes, some people take it
             | as if you are making a morally charged argument.
        
               | asveikau wrote:
               | I think there is a crowd that kneejerk downvotes ideas
               | they interpret as anti-capitalist, without reading the
               | argument.
               | 
               | An example: I am not a Marxist. But I think the Marxist
               | question of "surplus value" as an ethical question is
               | relevant and interesting. I pointed it out on HN a few
               | times. Again, without being a Marxist, just
               | intellectually curious. Nobody ever asks me if I am
               | really a Marxist. I get downvoted pretty severely when I
               | point it out. I get an impression that they smell a whiff
               | of the opposing sports team and turn negative.
        
               | selimthegrim wrote:
                | I don't ask people if they are really Marxists, because
                | when I have in the past I get accused of "pigeonholing"
                | their idea(s).
        
               | danharaj wrote:
               | Ugh, such strong language. Please censor it as M*rx or,
               | better yet, "literally Satan".
        
               | Avicebron wrote:
               | There are very many lucky people who are now fierce
               | libertarians on HN these days.
        
             | nicoburns wrote:
             | Agreed. And it works between companies as well as between
             | people within companies. The system is set up so that only
             | those who push the boundaries and exploit externalities can
             | compete.
        
         | zelon88 wrote:
         | I've typically found my employment via companies who deal with
         | a variety of contracts, some of them for weapons or defense
         | contractors.
         | 
         | I could go down the rabbit hole of chasing down all those
         | contracts and would probably find that many of the products my
         | company makes get sold to groups and causes that I don't
         | support. But in the end; I've gotta eat.
         | 
          | Do I want to throw away my career, which is 99% unrelated to
          | the SJW cause I support, just because 5% of our products
          | _eventually_ get used against that cause? What about the 95%
          | of our products which go to worthy causes?
         | 
         | I'll say it again... I just gotta eat, man. What's good for the
         | gander is probably good for the goose too.
        
           | eropple wrote:
           | Forgive me for the bluntness, but nobody with any set of
           | technical skills "gotta eat" by supporting those kinds of
           | efforts. I've worked to practice what I preach, too; I've
           | consistently worked in do-no-harm jobs. I make rowing
           | machines today and the worst you can hang on me from the past
           | is that I had a daily-fantasy-sports site for a client for a
            | while (which I'm not proud of, but it's a pretty venial
            | sin)--and I have made more than enough money to do very well
           | for myself.
        
           | lallysingh wrote:
           | Are all your employment options equally in the moral grey
           | area? Or did you just not want to think about it?
           | 
           | Look, do what you want, it's your life. I spent a decade
           | working in defense and now I don't. Some times were
           | uncomfortable. I hope you keep your eyes open when making
           | decisions to avoid some of the discomfort I've felt in the
           | work I've done.
        
           | whatshisface wrote:
           | Products that may be sold to terrorists include canned beans
           | and Toyota trucks. Your situation might actually be less
           | morally compromising than the Facebook stuff being discussed,
            | because in their case they _are_ the "questionably motivated
           | 'freedom fighters,'" (i.e. they're directly doing the morally
           | questionable stuff) whereas you're just selling stuff to a
           | broad market that may include questionably motivated "freedom
           | fighters." It's sort of the difference between selling
           | lockpicks that may eventually be used in a burglary or might
           | also be used to get Grandma's safe open, versus breaking in
           | yourself.
        
           | peruvian wrote:
           | If you're working for defense or weapons contracts you're
           | supporting the industry that has kept us in the Middle East
           | for almost two decades.
           | 
           | I agree that it's often a moral grey zone but in this case
           | it's pretty clear. If you're an engineer there's plenty of
           | other companies to choose from.
        
             | zelon88 wrote:
             | While that is true, I've worked in manufacturing
             | environments with high tech equipment. This manufacturing
             | equipment is so sensitive it gets covered with tarp during
             | dog-and-pony shows. We are using equipment and techniques
             | in the USA that other nations could only dream of
             | implementing. Why do you think most airplane manufacturers
             | are located in the USA? Don't you think an airline would
             | buy aircraft engines from China if they could?
             | 
             | Keeping America on the forefront of technology has its
             | benefits. If we don't invest in cornering these
                | technologies, our adversaries will.
             | 
                | Unfortunately it's the same technology that has kept us
                | in the Middle East that's also been a forceful
                | deterrent which safeguards all Americans.
        
               | [deleted]
        
           | DavidVoid wrote:
            | Makes me think of a quote from an old West German anti-
            | napalm film "Nicht löschbares Feuer" (_The Inextinguishable
            | Fire_) [1].
           | 
            |  _"The students of Harvard University write that I should
            | leave the criminal Dow Chemical Company._
            | 
            |  _I'm a chemist. What should I do?_
           | 
           |  _If I develop a substance, someone can come and make
           | something out of it. It could be good for humanity, or it
           | could be bad for humanity._
           | 
           |  _Besides napalm, Dow Chemical manufactures 800 other
           | products._
           | 
           |  _The insecticides that we manufacture help mankind._
           | 
            |  _The herbicides that we manufacture scorch this harvest and
            | cause him harm."_
           | 
           | [1] https://vimeo.com/107990231
        
           | minkzilla wrote:
           | Are you actively looking for employment elsewhere so that you
           | can transition away from supporting harmful causes? Or are
           | you using the excuse that you have to eat as a reason not to
           | do hard things in your life?
           | 
           | I have used that excuse myself. I'm trying to get better at
           | not using it.
        
         | pdkl95 wrote:
          | > I cannot believe that everyone is ethically challenged
         | 
         | The difficult ethical discussion probably never happens. The
         | decisions being made in those meetings are usually seen as
         | small/inconsequential. The problems caused by those "small"
         | decisions are ignored. Eventually those problems become
         | normalized allowing another "small" decision to be made. Humans
         | seem to be very bad at recognizing how a set of "small"
         | decisions eventually add up to major - sometimes shocking[1] -
         | consequences that nobody would have approved if asked directly.
          | Most of the time, nobody realizes just how deviant their
          | situation has become.
         | 
         | For a good explanation of the mechanism underlying the
         | normalization of deviance (as an abstract model), I _strongly_
         | recommend this[2] short talk by Richard Cook.
         | 
         | [1] https://blog.aopa.org/aopa/2015/12/07/the-normalization-
         | of-d...
         | 
         | [2] https://www.youtube.com/watch?v=PGLYEDpNu60 ("Resilience In
         | Complex Adaptive Systems")
        
         | dmitrygr wrote:
         | I've been there at Google a few times and can imagine exactly
         | how this went :/. The one time I can talk about is the
         | Blogger disaster [1]. The top leadership, spearheaded by the
         | chief of legal, was basically ignoring everyone's logical
         | arguments at the meetings, the town halls, etc. We kept
         | coming to the mic and
         | telling them that their ideas of what is and isn't sexual are
         | arbitrary, as are anyone else's. They said "no, we have experts
         | and we have a clear definition" (they didn't). We explained
         | that post-facto removing content people wrote is cruel and
         | unnecessary. They claimed "nobody would care or miss it". (Of
         | course they would.) We told them that this would hurt
         | transgender people, who used to find support in blogs of others
         | going through the same life challenges and blogging about it.
         | Those blogs would be banned under the policy. They said they
         | had data that impact would be minimal. (They had no data).
         | Normal rank-and-file people at google all knew the idea was a
         | bad one. We fought hard. They scheduled an 8 a.m. town hall
         | and announced it the day before at 9 p.m.! We showed up
         | anyway, en masse! There was a line to the microphone!
         | 
         | They had microphones in the audience. I walked up and directly
         | asked for the "data" they claimed to have showing no impact
         | will be had. They claimed and I quote "we have no hardcore
         | data" (audience was laughing at the word choice given the
         | topic). I said that "well, then how can you claim to be making
         | a data-driven decision?" Drummond answered that "we know this
         | is right and we are sure." The town hall was a waste of time.
         | Nothing we said was _heard_ and all they did was recite lines
         | at us from the stage that made it look like either they did not
         | understand what we had to say, or they were trying very hard to
         | appear to not understand. Both sides were talking, but nothing
         | we said seemed to change their minds. They came there to
         | deliver a policy, not to collect feedback on it, despite
         | claiming this was a meeting to _discuss_ it. That was clear.
         | 
         | We did not give up. Google's TGIF was the next day. A number of
         | people came there early and lined up at the microphones, ready
         | to bring this up again and again. In front of the whole company
         | and the CEO as well (Larry and Sergey were not at the town hall
         | and claimed to have not heard of the policy until "the ruckus
         | started").
         | 
         | I guess they saw the large line of people and relented. Before
         | the scheduled TGIF began, they announced they would reverse the
         | policy.
         | 
         | This was a rare victory, for this sort of a situation. I am
         | willing to bet that there are _lots_ of good people at Facebook
         | who also fought as hard or harder against this. They just
         | probably lost. Having seen how this plays out internally, I am
         | not surprised, just sad.
         | 
         | To anyone at FB who fought against this, I send you my thanks!
         | 
         | [1] https://techcrunch.com/2015/02/23/google-bans-sexually-
         | expli...
        
         | miguelmota wrote:
         | It's kind of a combination of all of the above. The majority
         | of employees are working for a paycheck and don't really care
         | what goes on as long as they get paid. If the person is in an
         | executive type role then their goal is to increase revenue so
         | they convince themselves that it's good for the company.
        
           | thomasjudge wrote:
           | This seems very near the moral vacuity of the "just following
           | orders" defense
        
             | spaced-out wrote:
             | That's perhaps the greatest power of the corporation: it
             | allows people to do shitty things without any specific
             | person being at fault.
             | 
             | Executives have a "duty" to increase "shareholder value".
             | It's not that they necessarily wanted to do X, but their
             | hands were tied because the "data" clearly showed that X
             | was best for shareholders. Plus, if X was so bad, it's
             | really the government's fault for not making it explicitly
             | illegal.
             | 
             | Shareholders aren't individuals either; they're mostly
             | mutual funds, pension funds, ETFs, etc. that make
             | _algorithmic_ investment decisions. They didn't ask for X,
             | but the funds they invested in will react to not getting X.
        
             | StillBored wrote:
             | For the beta roles (because I can't help mapping wolf/pack
             | behavior to most corp meetings anymore), about all a person
             | can do is mount a weak defense, which gets ignored by upper
             | mgmt as they justify ASPD with a framework that says the
             | number one priority is the corporate profit statement.
             | 
             | What percentage of people in these meetings are so wealthy
             | they can risk everything over morally gray area decisions
             | like this? Further how many can get away with it repeatedly
             | should they choose to fight a battle like this?
        
               | eropple wrote:
               | Few people in such a meeting are "risk[ing] everything".
               | 
               | I've quit jobs rather than doing sketchy things. For
               | people in these industries, there's always a next job.
        
               | StillBored wrote:
               | I don't think this is a question of someone doing
               | "sketchy" things. It's a question of someone in the room
               | questioning a morally dubious action being implemented
               | by the organization as a whole. Quitting over it likely
               | doesn't even have an effect: someone on the team
               | required to implement it is going to follow the boss's
               | orders. This appears to have happened a few times with
               | members of the US president's cabinet over the past few
               | years.
               | 
               | So, it's more a "stay and fight" or "get rolled over and
               | threaten/quit" decision. I'm betting most people just
               | weigh the monthly mortgage payment against it: they
               | raise the issue, but it doesn't get pushed beyond the
               | discussion phase. If this goes on long enough, they
               | switch jobs, or they become the person who just keeps
               | their head down and does what they are told.
        
               | eropple wrote:
               | If you're just gonna keep doing it, you're not "staying
               | and fighting" at all.
               | 
               | You don't have to be the just-following-orders guy, is
               | what I'm saying. Somebody else might--that doesn't have
               | to be you, and shouldn't be.
        
               | jkaptur wrote:
               | Just so you know, the model of "alpha wolves" is
               | considered simplistic and outdated in the study of actual
               | wolves. Just one link:
               | https://www.nationalgeographic.org/media/wolves-fact-and-
               | fic....
               | 
               | I've found that when people use "wolf pack" (or "caveman
               | times") explanations, what they're actually doing is
               | using social models that (surprise!) reflect the culture
               | that created them: humans in the twentieth century.
        
           | thoughtstheseus wrote:
           | Part of this is a focus on short-term initiatives that are
           | easy to measure and repeat. Boiling down billions of software
           | decisions to a few KPIs seems short-sighted IMO but hey it
           | makes money.
        
         | carapace wrote:
         | > I cannot believe that everyone is ethicality (sic)
         | challenged...
         | 
         | Why not? Ockham's Razor says to accept the most parsimonious
         | explanation, and I think that's it.
         | 
         | I mean look at little kids: they're amoral monsters. If they
         | weren't so cute our species would have gone extinct ages ago.
         | 
         | Look at our methods to train ourselves to be better people:
         | religions cause wars while "The Wolf of Wall Street" is a big
         | hit ($392 million worldwide).
         | 
         | Look at our leaders.
        
           | AlexandrB wrote:
           | _The Wolf of Wall Street_ was a scathing critique of
           | capitalist excess. To think otherwise is to consider
           | glamorous a lifestyle where your wife hates you and you
           | crash your car on quaaludes because you've got nothing
           | better going on.
        
             | carapace wrote:
             | I didn't see it. All I know about it comes from Christina
             | McDowell's open letter:
             | 
             | https://www.laweekly.com/an-open-letter-to-the-makers-of-
             | the...
             | 
             | > Your film is a reckless attempt at continuing to pretend
             | that these sorts of schemes are entertaining, even as the
             | country is reeling from yet another round of Wall Street
             | scandals. We want to get lost in what? These phony
             | financiers' fun sexcapades and coke binges? Come on, we
             | know the truth. This kind of behavior brought America to
             | its knees.
             | 
             | My point is that we did find it entertaining to the tune of
             | $0.4B, and that doesn't bode well for our general level of
             | moral development.
        
             | carapace wrote:
             | FWIW I just found this _fascinating_ tangent:
             | https://melmagazine.com/en-us/story/the-perfect-irony-
             | that-t...
             | 
             | > THE PERFECT IRONY THAT 'THE WOLF OF WALL STREET' FILM WAS
             | ALSO A REAL-LIFE SCAM
             | 
             | (caps in original)
             | 
             | > How Leo got caught up in a money-laundering scheme that
             | screwed the Malaysian people out of billions.
        
         | babesh wrote:
         | I have an example. We built a feature that would be good for
         | users. However we found out that it would result in lost
         | revenue. The decision of whether to keep the feature got
         | bounced up to management. Eventually we were told to can the
         | feature, and that the decision had been made at the very top.
         | Keeping it would have affected quarterly revenues. So no go.
         | 
         | That showed me what kind of company it was. The decision went
         | directly against one of the company's supposed core values.
         | This was not a small company. Don't work there anymore.
        
         | [deleted]
        
       | cmrdporcupine wrote:
       | A rather persistent recruiter from FB contacted me recently,
       | and given the new WFH scenario there I was almost considering
       | looking into it further, despite it probably being a frying-
       | pan-to-fire thing (coming from Google).
       | 
       | But after reading this... yeah, no.
        
       | artche wrote:
       | I think most social media discussions have degraded to outrage of
       | the week.
       | 
       | I limit myself to instagram stories once a month to broadcast
       | that I'm still there to my close friends.
        
       | artemisyna wrote:
       | Can someone copy/paste the original text or post a non-paywall
       | link?
        
       | blhack wrote:
       | If Facebook offered me the option of paying $5/mo to just get API
       | access to the things my friends posted, and I could display them
       | however I want (LIKE FOR INSTANCE IN CHRONOLOGICAL ORDER!) I
       | would happily pay it.
        
       | supernova87a wrote:
       | I really didn't realize until perhaps the last 2 years that
       | Facebook fundamentally tapped some hidden human need/instinct
       | to argue with people they believe are incorrect. More
       | importantly, that instinct is combined with the human inability
       | to actively decide to _not_ pay attention when things are
       | inconsequential or not yet worth arguing about.
       | 
       | Sometimes, just shutting up about an issue and not discussing it
       | is the best thing for a group to do. _Not_ more advocacy or
       | argument. Time heals many things. No app is going to help you
       | take that approach -- and that 's not what technology is going to
       | help solve (or is incentivized to solve). Just like telling a TV
       | station that's on 24 hours to _not_ cover a small house fire when
       | there 's no other news.
       | 
       | People are not good at disengaging from something when that's the
       | right thing to calm the situation. And Facebook somehow tapped
       | into that human behavior and (inadvertently or purposefully)
       | fueled so many things that have caused our country (and others)
       | to get derailed from actual progress.
       | 
       | There is no vaccine yet for this.
       | 
       | And not to dump on the Facebook train, since others would have
       | come to do it instead. But they sure made a science and business
       | of it.
        
         | abdullahkhalids wrote:
         | > some hidden human need/instinct to argue with people who they
         | believe are incorrect
         | 
         | This is perhaps a form of "folk activism" [1]:
         | 
         | > In early human tribes, there were few enough people in each
         | social structure such that anyone could change policy. If you
         | didn't like how the buffalo meat got divvied up, you could
         | propose an alternative, build a coalition around it, and
         | actually make it happen. Success required the agreement of tens
         | of allies -- yet those same instincts now drive our actions
         | when success requires the agreement of tens of millions. When
         | we read in the evening paper that we're footing the bill for
         | another bailout, we react by complaining to our friends,
         | suggesting alternatives, and trying to build coalitions for
         | reform. This primal behavior is as good a guide for how to
         | effectively reform modern political systems as our instinctive
         | taste for sugar and fat is for how to eat nutritiously.
         | 
         | Facebook is a collection of your friends or your "tribe", so
         | repeated arguments with your tribe members are what our
         | unconscious brain pushes us towards. That, coupled with the
         | dopamine hit of validation via likes (which is common to other
         | online discussion platforms).
         | 
         | [1] https://www.cato-unbound.org/2009/04/06/patri-
         | friedman/beyon... I don't agree with a lot of what's said
         | here; I'm only linking it for the definition of folk activism.
        
         | devmunchies wrote:
         | I call this the "outrage economy". There are several companies
         | (facebook, twitter, reddit, youtube, etc) that grew based on
         | user activity of varying types. The more bickering and
         | polarization, the bigger the company gets, the more employees
         | it needs to hire and funding it raises, and that feeds into
         | more growth. There is also a secondary economy built on or
         | used by these original companies (software tooling, ad
         | software, legal, clickbait, etc.). We now have a big chunk of
         | the economy feeding pointless bickering.
        
           | [deleted]
        
         | edgarvaldes wrote:
         | >Facebook fundamentally tapped some hidden human need/instinct
         | to argue with people who they believe are incorrect.
         | 
         | Maybe FB does it better, but it is the same in every online
         | "forum" where you get notifications about comments.
        
           | goatinaboat wrote:
           | _Maybe FB do it better, but it is the same in every online
           | "forum" where you get notifications about comments._
           | 
           | Facebook influences what you see to a far greater extent
           | than a traditional forum does.
        
         | austincheney wrote:
         | > There is no vaccine yet for this.
         | 
         | If you realize it's a dumpster fire then delete your account
         | and move on with life. If that line of thinking is a challenge
         | in absolutely any way the problem is addiction.
         | 
         | https://en.m.wikipedia.org/wiki/Addiction
        
         | empath75 wrote:
         | I actually really enjoy having a good argument with random
         | people online, but I don't as much enjoy arguing with my
         | friends and family. 1) I don't like being mad at or
         | contemptuous of people I'm close to, and 2) they're usually not
         | worth the effort of arguing with because they're just cutting
         | and pasting stupid shit they found elsewhere and it's
         | _exhausting_ to continuously correct the record when they put
         | zero effort into copy and pasting it to begin with.
         | 
         | I first purged everyone who posted that stuff from my feed,
         | and then eventually quit Facebook altogether.
        
         | staysaasy wrote:
         | "Time heals many things"
         | 
         | Extremely true, also relevant for work disagreements between
         | people who have existing positive relationships. A surprising
         | number of disagreements disappear if left on their own for a
         | time.
         | 
         | I find that many people with engineering backgrounds (myself
         | included) struggle to let conflicts sit unresolved. I suspect
         | that instincts learned while debugging code get ported over
         | to interpersonal issues, as code bugs almost never disappear
         | if simply left to rest.
        
         | LordHumungous wrote:
         | Yeah. There's a lot of relief in letting go, accepting that
         | other people are outside of your power to control, and just
         | practicing acceptance no matter how wrong or annoying or stupid
         | you think people are being.
        
         | toohotatopic wrote:
         | They tapped into that human behaviour deliberately, just as
         | it's no accident that HN doesn't have an orangered envelope
         | when somebody replies to your messages. It's by design and
         | not by coincidence.
         | 
         | There are plenty of vaccines for this, but not in the sense
         | that you can apply it to people by force, like you can apply a
         | vaccine to babies. Meditation, yoga, religions, sports - there
         | are many ways to calm the mind.
        
         | Animats wrote:
         | _Sometimes, just shutting up about an issue and not discussing
         | it is the best thing for a group to do._
         | 
         | Then the terrorists win.
         | 
         | That used to be the conventional wisdom on trolls, but there
         | are now so many of them. Worse, about half are bots.[1] (Both
         | NPR and Fox News have that story, so it's probably correct.)
         | 
         | [1] https://www.huffpost.com/entry/carnegie-mellon-
         | covid-19-twit...
        
         | wfbarks wrote:
         | I wish that I could actively decide not to pay attention to
         | this comment, but I am unable.
        
         | ErikAugust wrote:
         | I think it's important to note that Facebook didn't invent any
         | of this. They just built the biggest mainstream distribution
         | channel to do so. Nothing they ever did in terms of
         | facilitating pointless arguments has been all that original
         | either.
         | 
         | People have been doing this forever, and even on the Web much,
         | much longer than Facebook has existed.
         | 
         | Now that said, they know what they have on their hands and how
         | it makes them the money. They aren't going to fix it. It is a
         | big feature of their product.
        
           | hi_im_miles wrote:
           | Part of this is that Facebook makes the opinions of people
           | you know but don't really care about highly visible, which I
           | think leads to some of the animosity you see on the platform.
           | When the person you're confronting is the uncle of someone
           | you talked to once back in high school, there's little
           | incentive to be kind.
        
           | deathgrips wrote:
           | Psychopaths often excuse their behavior by saying that if
           | they don't take advantage of others, someone else will do it
           | instead.
        
             | puranjay wrote:
             | To be fair, people will go to great lengths to argue over
             | things they think are wrong. People make alt accounts on
             | Reddit and Twitter to do it. Heck, people will even
             | navigate 4chan's awful UI and content, fill in captchas,
             | just to tell someone that they're wrong.
             | 
             | Facebook could make it harder to post content, but I doubt
             | that would make much of a difference
        
           | goatinaboat wrote:
           | _I think it's important to note that Facebook didn't invent
           | any of this_
           | 
           | I think that's literally true. They told their algorithm
           | "maximise the time people spend on Facebook" and it
           | discovered for itself that sowing strife and discord did
           | that.
           | 
           | Facebook's crime is that when this became obvious they
           | doubled down on it, because ads.
        
           | specialist wrote:
           | Facebook, and others, absolutely innovated with their
           | recommendation engines, enabled by the most detailed user
           | profiling to date coupled with machine learning.
        
           | cryptoz wrote:
           | > I think it's important to note that Facebook didn't invent
           | any of this.
           | 
           | I don't agree with that. I very strongly think that Facebook
           | did invent a lot of this.
           | 
           | > They just built the biggest mainstream distribution channel
           | to do so
           | 
           | Scale does matter though. There is a lot in life that is
           | legal or moral at small scale but illegal or immoral at large
           | scale. Doing things at scale does change the nature of what
           | you are doing. There's no 'just' to be had there.
           | 
           | > Nothing they ever did in terms of facilitating pointless
           | arguments has been all that original either.
           | 
           | I don't agree with that either. They have even published
           | scientific papers, peer-reviewed, to explain their new and
           | novel methods of creating emotionally manipulative content
           | and algorithms.
           | 
           | > People have been doing this forever, and even on the Web
           | much, much longer than Facebook has existed.
           | 
           | I also don't agree with this. Facebook has spent 10+ years
           | inventing new ways to rile people up. This stuff _is_ new.
           | Yes, I know newspapers publish things that are twisted up,
           | etc., but that's different, clearly. The readers of the
           | paper are not shouting at each other as they read it.
           | 
           | I think it's super dangerous to take this new kind of mass
           | surveillance and mass-scale manipulation and say, welp,
           | nothing new here, who cares? It opens populations to apathy
           | and lets corporations do illegal and immoral things to gain
           | unfair and illegal power.
           | 
           | Facebook should not be legally allowed to do all the things
           | they are doing. It's invasive, immoral, and novel, the way
           | they deceive and manipulate society at large.
        
             | visarga wrote:
             | > Doing things at scale does change the nature of what you
             | are doing.
             | 
             | "Quantity has a quality of its own"
        
             | stanleydrew wrote:
             | > I think it's super dangerous to take this new kind of
             | mass-surveillance and mass-scale manipulation and say,
             | welp, nothing new here, who cares?
             | 
             | If that's the outcome you're concerned about I think it's
             | legitimate. I don't think it's the parent's intention to
             | encourage apathy, though.
             | 
             | However even if unintended, it's a good reminder that we
             | should all watch out to not fall into that apathy trap.
        
         | SherlockeHolmes wrote:
         | Disclaimer: I don't agree with your conclusions regarding
         | general human needs and instincts, or that a human possesses
         | absolute, built-in, DNA-ingrained inabilities (accordingly, I
         | do believe a human can fly). I also don't agree that Facebook
         | is to share much of the blame for the chaotic human zeitgeist
         | present today. I do believe a human is highly malleable and
         | impressionable, and that these qualities have been exploited
         | historically at various scales for various reasons.
         | 
         | "There is no vaccine yet for this."
         | 
         | There may not be any vaccine, but there may be a cure:
         | changing the language used to communicate within a
         | setting/platform such as Facebook, possibly by restricting it
         | to a subset of the language previously used or by adopting a
         | more formal construct.
         | 
         | But Facebook is a virtual neighborhood, with greatly
         | increased bandwidth and range. It is difficult or impossible
         | to achieve such a change in that setting.
        
         | mistermann wrote:
         | > There is no vaccine yet for this.
         | 
         | There is a fair amount of anecdotal evidence suggesting that
         | psychedelics can have a significant impact when used correctly.
        
         | specialist wrote:
         | _" There is no vaccine yet for this."_
         | 
         | What would that look like?
         | 
         | Given our current (social) media ecosystem, converting outrage
         | into profit (per Chomsky, McLuhan, Postman, and many, many
         | others), what does a non-outrage maximizing strategy look like?
         | 
         | I currently favor a slower, calmer discourse, a la both
         | Kahneman's thinking-fast-vs-slow and McLuhan's hot-vs-cold
         | metaphors.
         | 
         | That means breaking or slowing the feedback loops, removing
         | some of the urgency and heat of convos.
         | 
         | Some possible implementation details:
         | 
         | - emphasis on manual moderation, like metafilter, and dang here
         | on HN
         | 
         | - waiting periods for replies. or continue allowing
         | submissions but delay their publication. or treat all posts
         | as drafts with a hold period. or HN-style throttling. or...?
         | 
         | - only friends can reply publicly.
         | 
         | - hide "likes"
         | 
         | - do something about bots. allow aliases, but all accounts need
         | verified real names or ownership.
         | 
         | Sorry, these are just some of the misc proposals I remember.
         | I should probably have been cataloguing them.
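         | As a very rough sketch of the "HN-style throttling" idea
         | above (purely hypothetical Python; the class name and the
         | doubling schedule are my own invention, not any platform's
         | real mechanism), each successive reply by the same user in
         | the same thread within a rolling window could be made to
         | wait longer:

```python
import time

class ReplyThrottle:
    """Hypothetical sketch of reply throttling: each successive reply
    by the same user in the same thread, within a rolling window,
    must wait twice as long as the previous one."""

    def __init__(self, base_delay=60, window=3600):
        self.base_delay = base_delay   # seconds before a 2nd reply
        self.window = window           # rolling window in seconds
        self.history = {}              # (user, thread) -> [timestamps]

    def seconds_until_allowed(self, user, thread, now=None):
        now = time.time() if now is None else now
        # keep only replies that are still inside the window
        recent = [t for t in self.history.get((user, thread), [])
                  if now - t < self.window]
        self.history[(user, thread)] = recent
        if not recent:
            return 0
        # delay doubles per recent reply: 60s, 120s, 240s, ...
        required = self.base_delay * (2 ** (len(recent) - 1))
        return max(0, recent[-1] + required - now)

    def record_reply(self, user, thread, now=None):
        now = time.time() if now is None else now
        self.history.setdefault((user, thread), []).append(now)
```

         | The point isn't the exact numbers; it's that a small, cheap
         | delay breaks the hot back-and-forth loop without banning
         | anyone.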
        
         | Causality1 wrote:
         | People do go on Facebook and argue with others, but that's not
         | the core of the divisiveness. Rather, people sort themselves
         | into opposing groups and spend most of their time talking
         | amongst themselves about how good they are and how horrible the
         | other group is.
        
         | Vysero wrote:
         | Close your eyes, hold your breath, and hope the situation
         | resolves itself: that's your solution? I don't believe in a
         | "hidden human need/instinct to argue with people". There is
         | nothing hidden about human conflict. It is as natural as any
         | conflict; as natural as space and time. In fact, without
         | conflict evolution cannot exist. Obviously, a good portion of
         | the arguments being had may bear no fruit, but I would argue
         | that just as many of them not only should but NEED to be had,
         | and are quite productive on the whole.
        
           | mark-r wrote:
           | There is definitely such a need, which is why this cartoon is
           | one of my favorites of all time: https://xkcd.com/386/
        
             | tstrimple wrote:
             | Where do you draw the line between someone on the internet
             | "being wrong", and someone on the internet spreading
             | dangerous misinformation? Sure, walking away from the first
             | is often a good course of action, but what about the
             | second?
        
               | mark-r wrote:
               | It's often very hard to tell the difference. If I ever
               | find myself in doubt, I try to default to walking away.
               | Not always successfully.
        
             | pjmorris wrote:
             | Same here. I often joke that I should keep this one posted
             | at my desk. It helps remind me to walk away.
        
             | ShellfishMeme wrote:
             | I always have to think about that one when I've just spent
             | 20 minutes trying to formulate an elaborate response to
             | someone's comment, only to sigh and close the thing
             | without posting it. Most of the time it was probably for
             | the better, too.
        
               | specialist wrote:
               | I'm self-culling a lot more too. I'd argue those
               | unposted drafts are still valuable, if only to help
               | oneself flesh out ideas.
        
               | ShellfishMeme wrote:
               | I completely agree. Often I don't proceed because the
               | point I wanted to make just isn't properly defensible
               | yet. But the ideas written down in the process don't
               | disappear, and at the same time I gain a better
               | appreciation of the other side's point of view. I'd
               | rather hold on to those ideas and evolve them further
               | than get too attached and be forced into a situation
               | where I end up defending my opinions because I've moved
               | myself into a position where they feel like deep
               | personal beliefs rather than the well-defined,
               | objective-ish arguments I tried to make.
        
         | maest wrote:
         | That's an interesting thought, for sure. I should point out
         | that this doesn't only apply to facebook, but other large
         | discussion forums as well: reddit, 4chan, tumblr, twitter etc.
        
           | gimboland wrote:
           | And Hacker News!
           | 
           | > some hidden human need/instinct to argue with people who
           | they believe are incorrect
           | 
           | I've said it before, I'll probably say it again: this place
           | is chock full of people just itching to tell you you're wrong
           | and why. Don't get me wrong: obviously there's also a hell of
           | a lot of great discussion and insightful technical knowhow
           | being shared by real experts -- but in my experience I also
           | do have to wade through quite a lot of what feels like knee-
           | jerk pedantry and point-scoring.
        
             | mywittyname wrote:
             | When people here tell you that you're wrong, they tend to
             | do it with style.
             | 
             | One time I made a comment about how dividends affect stock
             | price and someone spent like 2 hours writing a program in R
             | to do some analysis just to prove me wrong.
        
             | mac01021 wrote:
             | I, for one, wouldn't have it any other way!
             | 
             | When I'm wrong I want it explained to me why. Even when I'm
             | right about something controversial, I want to see the best
             | arguments to the contrary.
             | 
             | I don't want people to be less argumentative. I just want a
             | higher intellectual caliber than what's generally available
             | on facebook or twitter, and HN fits the bill reasonably
             | well.
        
               | downerending wrote:
               | It's tricky. One has to learn how to disagree without
               | being disagreeable.
               | 
               | Or, get your fuck-you billions and people will start
               | agreeing with you a lot. :-)
        
         | munificent wrote:
         | _> I really didn't realize until perhaps the last 2 years that
         | Facebook fundamentally tapped some hidden human need/instinct
         | to argue with people who they believe are incorrect._
         | 
         | I think everyone has a natural human need to feel that they
         | have agency in their community. The need to feel that they
         | participate in the culture that surrounds them and that they
         | can have some effect on the groups that they are members of.
         | The alternative is being a powerless pawn subject to the whims
         | of the herd.
         | 
         | In the US, I think most people _lost_ this feeling with the
         | rise of suburbia, broadcast television, and consumer culture.
         | There are almost no public spheres in the US, no real commons
         | where people come together and participate. The only groups
         | many people are "part" of are really just shows and products
         | that they consume.
         | 
         | Social media tapped into that void. It gave them a place to not
         | just hear but to speak. Or, at least, it gave them the illusion
         | of it. But, really, since everyone wants to feel they have more
         | agency, everyone is trying to change everyone else but no one
         | wants to be changed. And all of this is mostly decoupled from
         | any _real_ mechanism for actual societal change, so it's
         | become just angry shouting into the void.
        
         | HenryBemis wrote:
         | In my line of work, we need to dig deep and find the root cause
         | of anything we 'touch'. I have noticed (since day 1 in this
         | line of work) that elaborate, complex truths tire the audience;
         | they want something snappy and 'sexy'. I remember a French
         | C-suite telling me "make it sexy, you will lose them".
         | 
         | Facebook managed to get this just right: lightweight, sexy (in
         | the sense of attractive), easy to believe, easy to understand,
         | easy to spread. The word "true" is completely absent from the
         | above statement. That generates clicks. That keeps users logged
         | in more. That increases "engagement". That increases ad
         | revenue. Game over.
         | 
         | The masterminds/brilliant communications minds could never get
         | so many eyeballs and ears tuned in at such a low cost before.
         | 
         | I've mentioned before that FB = cancer. It gives 1 (ability to
         | communicate) and it takes 100.
        
         | maxerickson wrote:
         | How does your postulation map to, say, anti-vax?
        
           | bentcorner wrote:
           | I don't personally think it's productive _for me_ to engage
           | with these kind of people but I will definitely support and
           | cheer on others doing so:
           | https://www.youtube.com/watch?v=Q65aYK0AoMc (NSFW content)
           | 
           | (Personally I get too wound up in internet arguments and it's
           | just not a healthy space for my head to be in)
        
             | maxerickson wrote:
             | Sure, I don't spend much time yelling at idiots.
             | 
             | The point of my question is that activists tend to talk
             | about things, so "shutting up about an issue and not
             | discussing it is the best thing for a group to do" won't
             | ever actually happen.
        
         | zachware wrote:
         | This is quite possibly the most elegant summary of what's wrong
         | with how we engage with information through the media/social
         | landscape.
         | 
         | We didn't evolve as a species to process this much information,
         | or as Yuval Noah Harari calls it in Sapiens, gossip.
        
         | derg wrote:
         | In general, and not necessarily related to just Facebook: one
         | of the best things I've come to learn about myself and the
         | world around me is that sometimes the absolute _best_ thing you
         | can do for yourself is to just shut up and walk away, even if
         | you know in your heart of hearts that you are correct.
        
           | jeffdavis wrote:
           | And the corollary: just because the other party walks away
           | from the argument doesn't mean that you are right.
        
           | seek3r00 wrote:
           | I agree, as long as you can find time to research and
           | understand if you're really correct. This way you avoid
           | conflicts but you still learn if you were wrong.
        
             | derg wrote:
             | 100%. Some of my biggest learning experiences were when I
             | walked away when I thought I was completely, utterly
             | correct only to find out a little later that I was actually
             | completely wrong!
             | 
             | I say that in the past tense just because it makes sense in
             | terms of what I'm trying to convey, but it still routinely
             | happens too, and will continue to happen.
        
           | fossuser wrote:
           | I think this is generally helpful to keep in mind.
           | 
           | I also think there's an art to deescalation and discussing
           | ideas or persuading someone you disagree with to see an
           | alternative view (and then giving them space to change their
           | mind).
           | 
           | Productive discussion isn't possible with everyone or even
           | one individual depending on where they are in their life, but
           | I've generally found it works better than expected when you
           | can remove your own identity and feelings from it.
           | 
           | It's rarely in the spotlight though because it doesn't get
           | retweeted or shared as much as combative arguing that's more
           | a performance from each side (with likes and cheering on the
           | sidelines).
        
             | pathseeker wrote:
             | >I also think there's an art to deescalation and discussing
             | ideas or persuading someone you disagree with to see an
             | alternative view
             | 
             | This doesn't really work with core views like politics.
             | Some of the most polarizing topics are not because the
             | opposite side can't see your view, it's because you
             | fundamentally disagree on priorities.
             | 
             | Examples:
             | 
             | Anti-abortion people aren't going to be persuaded by yet
             | another view on women's rights. They think you're arguing
             | to murder babies. The plight of a woman in poverty is not
             | going to suddenly make them go, "oh, well then I guess _a
             | little_ murder is okay."
             | 
             | A libertarian isn't going to be swayed to suddenly think a
             | planned economy is a better approach even when presented
             | with spectacular market failures. They completely
             | understand the failures suck and understand the alternative
             | views just fine. Another anecdote is not realistically
             | going to alter a belief that central planning is worse
             | overall.
        
               | kelnos wrote:
               | It's unfortunate you're being downvoted, because I think
               | in many ways you're right. People -- especially people
               | who hold strong views on divisive issues -- usually will
               | not change their minds when presented with new
               | evidence[0]. They're swayed by emotional appeals that get
               | them to change how they _feel_ about an issue.
               | 
               | Sure, there are exceptions, and some people can be
               | dispassionate enough to weigh evidence and change their
               | minds, but that is definitely not the norm.
               | 
               | [0] I read a fantastic article on this a year or two ago,
               | but can't find it now; will update with an edit if I find
               | it before the edit window expires.
        
               | fossuser wrote:
               | I disagree on this - I have a more optimistic view of the
               | ability for people to change how they think.
               | 
               | You're right that a core belief tied into someone's
               | identity is not going to be changed by new evidence,
               | unless you can get people to value trying to figure out
               | what's true and updating on evidence itself (rather than
               | having an 'answer' already and just using motivated
               | reasoning to come up with arguments that support their
               | 'answer'). This is hard.
               | 
               | I know I've personally gone from someone who made these
               | kind of bad reasoning mistakes - the smarter you are the
               | more insidious they can be because you're better at being
               | a clever arguer and coming up with plausible sounding
               | reasons while ignoring or rationalizing contradicting
               | evidence. I've worked hard to get better at it (and I
               | still am, it's an ongoing process). Yes, this is only a
               | sample size of one, but I think it's possible.
               | 
               | I have an optimistic view of the capacity for a person to
               | learn how to think better, while simultaneously having a
               | pessimistic view of the general public's current ability
               | to think rationally. This may seem like a conflict, but
               | really it just means that I think it's possible for us to
               | be a lot better than we are, while recognizing it's a
               | bigger project than just stating the specific evidence
               | available for any specific argument.
               | 
               | People have to be willing to consider why they believe
               | what they believe, and be honest about the potential to
               | change their mind based on new information that
               | contradicts what they believe to be true.
               | 
               | I think that's the goal we have to work toward first.
        
               | pdonis wrote:
               | _> unless you can get people to value trying to figure
               | out what's true and updating on evidence itself_
               | 
               | What evidence could you give to disprove the belief that
               | abortion is murder? The belief is not a claim about
               | evidence; it's a claim about priorities, as the GP said.
               | Or, if you like, about what actions count as belonging to
               | what categories.
        
               | TheOtherHobbes wrote:
               | A more interesting question is why is abortion
               | consistently used as a tribal issue in US politics.
               | 
               |  _Of course_ there is an underlying difference of
               | opinion, and of course it matters to those on both sides.
               | 
               | But it matters because the media have done an
               | exceptionally good job of herding people into different
               | camps - by focusing on a small and standardised
               | collection of divisive issues and amplifying the rhetoric
               | around them.
               | 
               | Does someone benefit from these divisions, and from the
               | loss of civility and civic cohesion they create, and
               | perhaps also from the implied promotion of violent
               | oppositional defiant subjectivity over rational argument
               | that powers them?
        
               | pdonis wrote:
               | _> A more interesting question is why is abortion
               | consistently used as a tribal issue in US politics._
               | 
               | I think a factor here is that the US pushes the
               | boundaries of what it really takes to have a free country
               | with a diverse population more than other countries do.
               | 
               | Other countries--or at least other developed countries--
               | have a more homogeneous population than the US has, and
               | also do not have the same tradition of skepticism about
               | and distrust of government that the US has. Also other
               | countries do not have quite the same Constitutional
               | provision for the free exercise of religion that the US
               | has.
               | 
               | A less homogeneous population means there is a wider
               | range of traditions that people are brought up with. That
               | creates a lack of common ground about a lot of things.
               | For example, I'm not aware of any other developed country
               | that has a significant population of young earth
               | creationists.
               | 
               | A tradition of skepticism about and distrust of
               | government means that people are less willing to accept a
               | legal rule that conflicts with their personal
               | convictions, and more willing to complain about it
               | publicly (or indeed to take even more drastic action).
               | Note that this applies to both sides of the abortion
               | debate: to extreme pro-lifers who feel that any abortion
               | at all is wrong, and to extreme pro-choicers who feel
               | that any restriction on abortion at all is wrong. Current
               | US law and jurisprudence is actually not close to either
               | of those extremes, so both extremes have plenty of
               | reason, in their view, to complain.
               | 
               | The Constitutional protection of free exercise of
               | religion means that "personal convictions", if they are
               | backed by a religious tradition, carry a lot more weight.
               | This is most obvious in the US on the anti-abortion side
               | of the debate.
               | 
               |  _> Does someone benefit from these divisions_
               | 
               | I think someone taking political advantage of divisions
               | within the population can happen in any country, but it
               | might well be true that the US, for the reasons I
               | described above, presents more opportunities for it to
               | happen.
        
               | fossuser wrote:
               | See my other comment for how I think about/tackle
               | abortion specifically:
               | https://news.ycombinator.com/item?id=23315552
        
               | pdonis wrote:
               | I see it, I'll respond further there.
        
               | evolve2k wrote:
               | Damn. I ended up caught up in people's responses to the
               | abortion debate question and totally forgot my disgust of
               | Facebook, which was top of mind as I started reading the
               | comments. Ironically it confirms the article and how
               | Facebook can keep distracting from focus on itself by
               | having platform users head down rabbit hole after rabbit
               | hole in an attempt to satiate their flawed human desire
               | to be right all the time.
               | 
               | Woah!
        
               | darkengine wrote:
               | Sadly, I think you are right. All reasoning starts with
               | postulates that you cannot prove. For ethics and
               | politics, these axioms are our emotions and values. Two
               | people who have a different set of values can't have a
               | logical argument because they're using entirely different
               | systems of reason.
        
               | visarga wrote:
               | That's a learned behaviour - to ignore opposing
               | arguments. They train themselves to counter arguments
               | they don't like with their own prefabricated arguments,
               | learned from the mass media.
               | 
               | Like, for example: politician X is corrupt, he was caught
               | taking bribes. Counter: everyone is stealing, at least
               | his party gives our group more benefits than the other
               | party.
        
               | majewsky wrote:
               | "Everyone is stealing" is actually a rational counter
               | when grounded in sufficient truth. It's depressing, sure,
               | but it _can_ be rational.
        
               | cataphract wrote:
               | Studies on persuading people out of prejudice do exist
               | (see e.g. https://www.ocf.berkeley.edu/~broockma/kalla_broockman_reduc...).
               | The problem is that the approaches are not as emotionally
               | satisfying as just affirming your moral superiority and
               | calling people bigots, so not many people do it (though
               | some do, search for "deep canvassing").
               | 
               | Abortion is tricky because once you recognize the life of
               | the embryo as a value to protect it's difficult to have
               | it come as less compelling than a right to bodily
               | autonomy. Still, most anti-abortion people would carve
               | out a lot of exceptions (rape, genetic problems, danger
               | to the mother's life and so on). Most people recognize
               | that even the right to life is not absolute (another
               | example: they would agree it would be unlawful to refuse
               | to obey an order in time of war that would almost
               | certainly result in a soldier's death).
        
               | fossuser wrote:
               | I think abortion is a lot easier when you frame the
               | argument around suffering.
               | 
               | I think part of the problem with abortion is the left
               | argues that "it is not a life" which is generally a weak
               | argument. It's better to accept/concede that you are
               | ending life, but doing so without suffering before
               | there's a neural net that can recognize anything - I
               | think that's the important bit (and why third trimester
               | abortions are banned anyway).
               | 
               | The push back then tends to be that life itself is sacred
               | and can never be ended (suffering is not relevant), but
               | this is generally not truly believed by the people making
               | the argument so it's easy to point out their
               | contradictory support for the death penalty. They then
               | usually say there's a difference between innocent life
               | and people who've committed crimes at which point you're
               | back to negotiating conditions and suffering seems like a
               | pretty good condition to use.
               | 
               | [Edit] It's also a messier issue because I think a
               | component of the debate is shaming women for sex. That
               | they should be forced to have their baby as some sort of
               | penance for having sex. Obviously this is largely unsaid
               | in favor of more palatable arguments, but if it's the
               | true driver then it's hard to even start because you're
               | not addressing the true motivation (which may not even be
               | fully realized by the person arguing).
        
               | pdonis wrote:
               | _> It's better to accept/concede that you are ending
               | life, but doing so without suffering before there's a
               | neural net that can recognize anything_
               | 
               | First, this assumes that such a "neural net" is required
               | for suffering. I personally don't have a problem with
               | that, but making an ironclad scientific case for it is
               | going to be very difficult, since we don't understand
               | _how_ "neural nets" actually produce suffering even in
               | the case of humans with fully developed brains.
               | 
               | Second, by this criterion, it's not just third trimester
               | abortions that should be banned, but abortions at any
               | time after the "neural net" develops. That's a lot
               | earlier than our current jurisprudence draws the line
               | (neural activity can be detected in the brain of a fetus
               | at about six weeks, vs. viability at roughly 24 weeks as
               | more or less the current jurisprudence line), which means
               | that our current jurisprudence is allowing a lot of
               | suffering by this criterion.
               | 
               | So I'm not sure this framing actually makes the argument
               | any easier.
        
               | fossuser wrote:
               | Thanks - I think that's a fair criticism and I don't know
               | enough to really comment further.
        
               | neutronicus wrote:
               | > I think part of the problem with abortion is the left
               | argues that "it is not a life"
               | 
               | I don't think this is accurate. Getting a typical pro-
               | choice person to discuss the fetus at all, much less
               | whether it can be called alive, takes some serious
               | cornering (I am pro-choice, to be clear).
        
               | fossuser wrote:
               | Sure, but diverting the question from the topic where
               | your point is weakest is just misdirection and isn't very
               | persuasive.
               | 
               | There are a lot of good reasons other than this one to
               | support pro-choice, but those reasons will be irrelevant
               | to someone who views 'abortion as murder'. You have to
               | put yourself in their position and reason about it like
               | they would, then think about what is the best argument
               | from their position.
               | 
               | Basically steel-manning their side and then tackling the
               | best argument head on.
               | 
               | I think this is where really interesting discussions
               | happen and where minds can change, otherwise you end up
               | just discussing the same tired points without making any
               | progress.
        
               | mrmonkeyman wrote:
               | You guys are doing it right now.
        
               | pjc50 wrote:
               | Suffering matters - of the mother. The death in hospital
               | of a woman who was denied an abortion was the catalyst
               | for the successful campaign in Ireland to get the
               | constitution changed to permit abortion.
               | 
               | #repealthe8th
        
               | throwaway441 wrote:
               | > I think abortion is a lot easier when you frame the
               | argument around suffering.
               | 
               | Really? I think it becomes much more difficult. It
               | invites arguments for infanticide (see the 2013 Giubilini
               | paper on after-birth abortion for a famous example of
               | this). The same arguments concerning a woman who is not
               | able to take care of a child apply equally well after
               | birth if suffering is the only consideration, because
               | it's entirely possible to end the life of the baby in a
               | painless manner. As someone who is pro-life, I've
               | generally found the suffering angle to be the least
               | compelling of the pro-choice counterarguments.
        
               | fossuser wrote:
               | I do think you're right that there's an extra element
               | beyond just suffering (otherwise you can argue that
               | killing infants instantly is okay if they don't notice
               | and they're not yet self-aware).
               | 
               | I think it's a mixture of suffering and having a neural
               | network formed enough for ...something? I have an
               | intuitive feeling that it's wrong to kill infants before
               | they're self-aware even if 'done painlessly', but I don't
               | feel that way about a blastocyst or a fetus without a
               | sufficiently formed neural network that can suffer.
               | 
               | I recognize this isn't perfectly consistent though and I
               | don't have a great answer for why.
        
               | pdonis wrote:
               | _> Studies on persuading people out of prejudice_
               | 
               | The GP is not talking about prejudice; he's talking about
               | a genuine difference in priorities. Calling that
               | "prejudice" implies that one of those choices of
               | priorities is simply wrong; it ignores the possibility
               | that there is no one "right" choice of priorities.
        
               | afthonos wrote:
               | I think the argument doesn't suffer if you replace
               | "prejudice" with "strongly held beliefs". The human mind
               | doesn't have a secret truth-o-meter, so from the inside,
               | prejudice and strongly held beliefs are
               | indistinguishable. The fact that some people hold a
               | belief strongly is itself proof that that belief _can_ be
               | held, and therefore that people can be convinced to hold
               | it.
               | 
               | Basically, a technique that works to convince people away
               | from prejudice over and above what presenting them with
               | truth does should be applicable to any belief.
        
               | pdonis wrote:
               | _> I think the argument doesn't suffer if you replace
               | "prejudice" with "strongly held beliefs"._
               | 
               | Yes, it does, because the post I originally responded to
               | said "persuading people out of" these beliefs. How is
               | that justified if the beliefs are not known to be wrong?
               | "Prejudice" implies that the beliefs _are_ known to be
               | wrong, so it's justified to try to persuade people out
               | of them. "Strongly held beliefs" does not carry the same
               | implication.
        
               | BurningFrog wrote:
               | > _The fact that some people hold a belief strongly is
               | itself proof that that belief can be held, and therefore
               | that people can be convinced to hold it._
               | 
               | The fact that I'm tall is proof that people can be tall.
               | But not that you can become tall.
               | 
               | In general people have the opinions they need to have to
               | feel good about themselves. That's hard to change.
        
               | kelnos wrote:
               | I think "priorities" is a pretty good way of framing it,
               | at least when considering the abortion debate. I'm pro-
               | choice, but I don't consider my position to be any kind
               | of moral right; I just believe that in this situation,
               | the priority should go to the mother and her wishes, not
               | the fetus. I don't think that giving priority to the
               | fetus is inherently illogical or wrong, it's just not the
               | choice I'd make.
               | 
               | The problem that I have, though, is that I don't believe
               | that many pro-life advocates look at it that way; instead
               | of thinking about what's best for the people around the
               | potential baby, they resort to religious or strictly
               | emotional arguments in support of their views[0], which I
               | will never consider persuasive.
               | 
               | The cut-off point is entirely up to society's consensus.
               | You could go to the extreme and say that vasectomies (or
               | even male masturbation) and tubal ligations are murder,
               | because they destroy germ cells that could turn into
               | children eventually. Some religions prohibit birth
               | control of any kind. Many people aren't comfortable with
               | the morning-after pill. Some people are fine with an
               | abortion up to N weeks, but not after.
               | 
               | And that's what I find sad about arguments on this topic:
               | people have drawn their line in the sand, and they
               | believe that they are right, any other option is wrong,
               | and that they must impose their rightness on everyone
               | else, regardless of any disagreement in beliefs.
               | 
               | As a result, I just tend to not get into arguments about
               | this, as I don't think it's worth the blood-pressure
               | increase to engage a pro-life advocate in discussion.
               | 
               | [0] I may be wrong about this; I frankly do not have many
               | (any?) pro-choice friends, so I only know what I read,
               | and that may be a case of me just hearing the loudest
               | voices, not the most representative ones.
        
               | pdonis wrote:
               | _> I frankly do not have many (any?) pro-choice friends_
               | 
               | Did you mean to say "pro-life" here?
        
               | paublyrne wrote:
                | You're right on an individual level, but when you
                | extrapolate a less argumentative approach and
                | encourage curiosity rather than belligerence, I do
                | believe society overall is amenable to change. On a
                | macro scale it works. I'm loath to point to abortion
                | specifically, as it's a divisive issue, but Ireland,
                | which voted to allow abortion two years ago in a
                | landslide referendum, is an example of a society
                | changing its views as a whole, even while some
                | individuals in that society remain immovable.
        
               | wuliwong wrote:
               | From my own experience, I've had many many many
               | conversations regarding topics like this that haven't
               | changed my views in any substantial way. BUT, I have had
                | a handful that did, and those are so valuable that I
                | think it IS worth banging our heads into each other's
                | walls most of the time for these rare moments.
        
               | seneca wrote:
               | Yep, I think you're dead on. There are competing and
               | diametrically opposed values out there, and in many cases
               | both can be fairly argued in favor of. For example
               | fairness vs freedom. No amount of shouting about the
               | details will convince someone who primarily values
               | fairness that freedom is more important, and the
               | arguments are largely pointless unless the participants
               | are genuinely seeking to examine the ideas, which is
               | rarely ever the case online.
               | 
               | I think people understand that at a very base level, and
               | that's why online arguments are often really more of a
               | performance to score points with your side, or to take
               | shots at the other side. Rarely is anyone actually
               | attempting to convince or learn, they're just playing out
               | some weird tribal warfare ritual and dressing it up as
               | debate.
               | 
               | As the larger thread here suggests, the best move in this
               | game is simply not to play it.
        
               | mrbungie wrote:
               | Well, your reply is being 'deescalated' via downvotes.
               | That shows how open people are to thoughts that defy
               | their values and/or world model.
        
               | goatinaboat wrote:
               | It's being downvoted because it's an example of exactly
               | the behaviour it's purportedly opposing.
        
             | wdb wrote:
              | Personally, I think most persuasion-related research is
              | done not through public channels (universities, public
              | funding) but through corporate or military (?) research.
              | I can even imagine that the knowledge obtained is being
              | used to help persuade the public for political gain,
              | while remaining unshared with, and invisible to, the
              | wider public and the scientific community.
              | 
              | I guess it sounds a bit like a tinfoil-hat theory, but I
              | can imagine the above is happening at the moment.
        
               | dillonmckay wrote:
               | The 'other' NLP - neurolinguistic programming.
               | 
               | https://en.m.wikipedia.org/wiki/Neuro-
               | linguistic_programming
        
               | intended wrote:
                | To be honest, I pay a lot of attention to the research
                | coming out, and there's not been much that is
                | counterintuitive. All of it builds off of stuff we've
                | seen since the days of eternal September.
                | 
                | Generalist subs/topics collect junk. Directed subs
                | have more focus and are healthier at fending off crap.
        
               | artificial wrote:
               | Unfortunately social media sock puppets were so hot last
               | decade:
               | https://www.theguardian.com/technology/2011/mar/17/us-
               | spy-op...
        
           | [deleted]
        
           | 12xo wrote:
           | A big reason why old people speak less...
        
           | datalus wrote:
           | I learned this lesson recently, and I am sure I will continue
           | to learn this lesson in the future as I've already learned it
           | in the past. Just in many different contexts and ways!
        
           | brewdad wrote:
           | I've gotten better, over time, at typing my heated, emotional
           | response to someone online and then hitting Cancel or the
           | back button and not posting.
           | 
           | Sometimes I do hit the Reply button though. I still have room
           | for self-improvement. :-)
        
             | accountinhn wrote:
              | I do this a lot, especially on Reddit. A trick I learned
              | is to type the reply in my notepad, then wait an hour
              | and post it. This has helped with a few things:
             | 
             | 1. Slowly improving in checking for typos
             | 
              | 2. Reading something after a break helps me frame my
              | point better and removes the heated emotion from the text
             | 
             | 3. I also don't save the comment, so I have to spend time
             | searching for it. This further helps in filtering the
             | topics that I don't care about
             | 
              | 4. Using services while not logged in, basically being a
              | lurker
        
             | ok_coo wrote:
             | This is the point I got to when I realized I should no
             | longer have an account and I just went ahead and deleted
             | it.
             | 
             | For about 6 months after, I thought I would reactivate and
             | jump back in but it's been years now and I'm 100% sure it
             | was a life improvement.
        
             | specialist wrote:
              | Not gonna lie. u/dang has had to scold me a few times.
             | 
             | Improvement takes time.
             | 
             | We certainly need more and better role models.
        
             | dghughes wrote:
              | I've been doing that more recently too. I ask myself: do
              | I really want to do this? Why am I getting involved in
              | this? Especially on Twitter, where you can't remove
              | yourself from a conversation.
             | 
             | My new mantra is the saying, "not my circus, not my
             | monkeys".
        
             | SNosTrAnDbLe wrote:
              | This! It always helps me to take a time-out, paste what
              | I have into a different notepad, and come back to it.
              | Only hit send if it still makes sense.
        
         | [deleted]
        
         | razzimatazz wrote:
         | You are on to something. I interpret it as it is fantastic
         | fun/addictive/dopamine-short-term-win to argue or discuss with
         | someone. Especially if you can afford the hangover that outrage
         | might lead to.
         | 
         | Face to face with people I know or at least recognize as human,
         | not a bot, educated or at least not cartoon-hick-personality -
         | Arguments can be great, because of the ability to see when to
         | pull back and stop something from escalating. We are all human
         | after all.
         | 
          | In internet-powered discussion, where the number of people
          | observing can be huge, and every username can feel inhuman,
          | or maybe even just like a troll trying to start a stupid
          | argument - that argument gets painful. But the dopamine hit
          | is still there...
        
         | Barrin92 wrote:
         | > And Facebook somehow tapped into that human behavior and
         | (inadvertently or purposefully)
         | 
         | It's not just that they tapped into it, it's the entire mission
         | statement in a sense. 'to connect the world' if you want to
          | treat it like a sort of network science, basically means
          | lowering the distance between individuals so much that
          | you've reduced the whole world to a small-world network.
          | There's no
         | inhibition in this system, it's like an organism on every
         | stimulant you can imagine.
         | 
         | Everything spreads too fast and there's no authority to shut
         | anything down that breaks, so the result is pretty much
         | unmitigated chaos.
         | 
         | The vaccine is the thing people complain about all the time,
         | the much maligned 'filter bubbles', which is really just to say
         | splitting these networks into groups that can actually work
         | productively together and keeping them away from others that
         | make them want to bash their heads in.
        
         | ckastner wrote:
          | > _I really didn't realize until perhaps the last 2 years that
         | Facebook fundamentally tapped some hidden human need/instinct
         | to argue with people who they believe are incorrect._
         | 
         | Hidden human need/instinct to argue, period. These arguments
          | aren't intellectual debates; it's people getting pissed off
          | at something and venting their rage towards the other side.
         | 
         | It's odd how addictive rage can be. But that's not a new
         | phenomenon. Tabloids have been exploiting this for decades
         | before Facebook.
        
         | tbabb wrote:
         | I really don't like the "it can't be helped" attitude about
         | what Facebook has become.
         | 
         | They made a _choice_ to throw gasoline on the flames of these
         | aspects of human behavior. Few people seem to realize that
         | Facebook _could_ have been a force for good, if they had made
         | different choices or had more integrity when it comes to the
         | design and vision of their platform.
         | 
          | The way that things happened is not the only possible way
          | they _could_ have happened, and resigning ourselves to the
          | current state as "inevitable", to me, reeks of an incredible
          | lack of imagination.
        
           | A4ET8a8uTh0 wrote:
           | I am not sure I can agree. Facebook did not change in any
           | significant way. It still serves as a platform to boost your
           | message. It is, at best, simply a reflection of the human
           | condition. The previous example was the internet and some of
           | the revelations it brought about us as a species. FB just
           | focused it as much as it could.
           | 
            | A force for good. I do not want to sound like this, but
            | how, in your vision, would that look? This is a real
            | question.
        
             | tbabb wrote:
             | It could have been a platform that enlightens, informs, and
             | uplifts people instead of exploiting attention and anger,
             | and profiting from misinformation.
             | 
             | You can make money by making people feel good instead of
             | bad. You can be rotten.com, or you can be Pixar. You have a
             | choice. An organization with integrity will look at not
             | just how much money they're immediately making, but whether
             | they're pushing the world towards better or worse. A hands-
             | off attitude of "it's not my problem that people like this
             | sh*t" is not integrity, it's a rationalization for greed.
             | 
             | You can make choices that result in making less than the
             | absolute maximum amount of cash you can get your hands on,
             | in service of building a product/experience/brand with
             | value and goodwill of its own. There are countless examples
             | of this in other places-- just look at any company that
             | builds its reputation based on quality. Each of these
             | brands could make their products for cheaper and lower
             | quality, and make more immediate profit, at the (much
             | larger) long-term cost of destroying the brand, the
             | customer goodwill, and the market advantage. Defending
             | against such short-term greediness is uphill work, but it's
             | both the enlightened and profitable thing to do.
             | 
             | Instead of actively amplifying memes and misinformation,
             | they could have chosen to build features supporting
             | community and/or expert moderation. They built an algorithm
             | that optimizes purely for attention, but they could have
             | made something that accounts for quality; paying attention
             | to patterns in good and bad sources of information, and
             | reliable/unreliable discriminating tastes in the community.
             | The emphasis on quality and reliability of content was the
             | pitch for Quora, for example, and they did much better at
             | that task than Facebook. Which is not surprising, because
             | Facebook seems to clearly not be trying to optimize for
             | this at all. Wikipedia and StackOverflow are also two huge
             | success stories of community/expert moderation. It works if
             | you actually prioritize it.
             | 
             | They could have chosen to hire journalists, editors, and
             | artists to produce and vet high-quality content to drive
             | people to the platform, and step responsibly and
             | effectively into the media void that was created when
             | newspapers began to collapse. An analogy for this would be
             | the way that Netflix, Amazon, HBO, and friends have created
             | a new boom and golden age of content creation to fill the
             | void left by the dying medium of broadcast TV. There could
             | have been something like this for print, and Facebook was
             | well positioned for it.
             | 
             | [Jaron Lanier](https://www.ted.com/talks/jaron_lanier_how_w
             | e_need_to_remake...) has lots of ideas about how to make an
             | internet that isn't hostile toward its own users. One of
             | his revolutionary ideas: Charge people money for services
             | instead of siphoning their data and their attention in ways
             | that hurt them.
             | 
             | There are a zillion directions they could have gone.
        
             | [deleted]
        
         | CarelessExpert wrote:
         | > I really didn't realize until perhaps the last 2 years that
         | Facebook fundamentally tapped some hidden human need/instinct
         | to argue with people who they believe are incorrect.
         | 
         | Funny, years ago, around the Aurora shooting in Colorado, it
         | was Facebook that made me recognize this behaviour in myself.
         | 
         | It's why I left the platform.
         | 
         | Also, obligatory XKCD: https://xkcd.com/386/
        
         | danlugo92 wrote:
         | It's always alien when I read stuff like this.
         | 
         | Most of my facebook feed is just memes and selfies.
         | 
         | (I'm venezuelan)
         | 
            | When facebook was new/trending years ago there were some
            | political discussions, but people quickly figured out it
            | was worthless. How come USAians haven't?
        
           | vharuck wrote:
           | Being angry over politics is a pastime in the USA. Not sure
           | why this happens in some countries and less so in others.
        
             | razzimatazz wrote:
             | In NZ, we follow along behind the USA in most things, so
             | getting angry over politics is growing and growing. I see
             | it as a hobby, growing in popularity. But also a hobby you
             | can discourage those you know from getting into, and if the
             | social momentum pulls us away from it then the hobby
             | doesn't take hold.
        
           | mrmonkeyman wrote:
            | Perhaps you guys should be more engaged with politics...
        
         | Kaze404 wrote:
         | I had the same realization recently, and deleted my Twitter
         | account in favor of a new one where I only follow people I know
         | in real life.
         | 
         | That worked great for a couple of weeks, but now I log on
         | Twitter and half of my feed is tweets of people I don't know or
         | follow, with the worst, most infuriatingly stupid hot takes. No
         | wonder they have literally hundreds of thousands of likes. The
         | platform is built around this "content".
        
       | jbay808 wrote:
       | I'm no fan of Facebook. But for what it's worth, back when I was
       | still using it in ~2010, it helped me learn a lot about the
       | worldviews of people on the opposite end of the political
       | spectrum who I rarely if ever interacted with in person. The
       | mechanism for this was Facebook Groups - I'd hang out in climate
       | change denial groups talking to denialists and asking them
       | questions. And although it didn't change my mind and I didn't
       | change theirs, I (and my, err, opponents) both actually learned a
       | lot and came to see the other side as more honest and less
       | irrational/evil than we once thought.
       | 
       | I don't know if Facebook still serves this purpose today.
        
         | crocodiletears wrote:
          | It's less like that now. Group raids involving post
         | reporting became a huge issue a while back, so most political
         | pages use membership application questions requiring you to
         | positively affirm or signal in-group association before
         | joining. Nothing prevents you from lying to get into a group,
         | but it's oddly effective as a mechanism for preventing partisan
         | opponents from engaging in any dialogue.
        
           | jbay808 wrote:
           | Ah, good to know.
           | 
           | This is why we can't have nice things.
        
           | karmelapple wrote:
           | I think a big part of the shift of interactions over the last
           | 5 - 10 years is the communication platform (Facebook, in this
           | case) bringing in new users who had zero experience debating
           | in a text-only format. It's probably inevitable, unless the
           | platform tries to educate and heavily police new users on
           | what proper behavior is.
           | 
           | Facebook was incentivized to grow as fast as possible.
            | Comments and discussion were one of many vectors for
            | growth; photos, news, and silly images were just as
            | important. The quality of all that wasn't as important as
            | the content coming from people you know and trust.
           | 
           | Contrast that with a community like HN, where quality of
           | comments and content is much more important, since you have
           | little to no trust for almost all people submitting content.
        
       | stickfigure wrote:
       | Can't read the article, but I've seen a lot of my friends
       | unfriend other people that have political opinions that differ
        | from theirs. And the ever-so-popular post: "If you disagree
        | with _thing xyz_, let me know so I can unfriend you!"
       | 
       | This isn't Facebook's doing. People self-select monocultures.
        
       | hadtodoit wrote:
       | Why does facebook need to do anything about this? People have
       | been disagreeing with each other violently or otherwise for as
       | long as humans have existed. Do they think they can do anything
       | about this?
        
         | kgin wrote:
         | There has never been a mechanism whereby everyone can be
         | against everyone else about everything.
         | 
          | When my high school English teacher and my aunt are arguing
         | about politics and they've never met each other, it's clear
         | this is a new development in human conflict.
        
       | knzhou wrote:
       | The degree to which "damned if you do, damned if you don't" is in
       | effect here is remarkable. If Facebook literally removes
       | anything, then HN is outraged because it's censorship,
       | paternalism, all that. But if Facebook _does not adopt an
       | actively paternalistic attitude where it shows people content
       | that they deem is "good for them"_ , then that's outrageous too.
       | Both complaints predictably rocket to the top of HN.
       | 
       | Which is it, guys? How can you simultaneously be outraged that
       | Facebook is imposing any restrictions on speech at all, and
       | horrified that it _isn't_ actively molding user behavior on a
       | massive scale?
       | 
       | There's an amusing comment from a Facebook employee downthread
       | asking: if division is caused by showing people opposing
       | political opinions, should we try to stop that to reduce
       | division, or should we do nothing, to avoid forming filter
       | bubbles? Predictably, every single reply condemns him as evil for
       | not realizing one of the options is obviously right, but they're
       | split exactly 50/50 on what that right course of action is.
        
         | closeparen wrote:
         | If Facebook followed some deterministic algorithm like "show
         | all content from friends, in chronological order" then I don't
         | think there would be such loud voices calling for it to also
         | solve $social_problem.
         | 
         | But Facebook _does_ exercise editorial control, in the service
          | of engagement. It's fair to ask that this curation consider
         | other objectives as well, or at least counterbalance the side-
         | effects it's known to have (divisive content is more engaging
         | and so is amplified; at least correct it back down to neutral).
        
           | ccktlmazeltov wrote:
           | If facebook was following a deterministic algorithm, people
           | would complain too.
        
         | DenisM wrote:
         | > How can you simultaneously be outraged that Facebook is
         | imposing any restrictions on speech at all, and horrified that
         | it isn't actively molding user behavior on a massive scale?
         | 
         | There is a simple answer to that - HN is not a homogeneous set
         | of people; different users have different opinions and express
         | them at different times with different intensity.
        
           | knzhou wrote:
           | True, but I would have expected at least a _little_ visible
           | disagreement. You never see anybody saying "wait a second, I
           | think making the filter bubble effect a bit worse is actually
            | worth it!" Each individual submission's comments are just
            | full-throated, unanimous condemnation --- even when
            | adjacent comments are directly contradictory. They just
            | don't engage with the possibility that deciding what to
            | do might actually be hard.
        
           | throwlaplace wrote:
           | If this were true you'd have the average opinion filter out -
           | naysayers would downvote supporters and vice versa. The truth
            | is in fact that hn is hypocritical and becomes outraged
            | for the sake of outrage, just like every other opinionated
            | group (where the raison d'être is to express an opinion
            | rather than to engage in discourse).
           | 
            | Edit: think hn isn't just about getting attention? Then
            | explain to me why responses are ranked. Or even ranked
            | without requiring a response. Even Amazon reviews in
            | principle require leaving ratings only in good faith (i.e.
            | having engaged with the product). It's obvious that dang
            | and whoever else could make that a prerequisite of voting,
            | but they wouldn't in a million years. That they haven't
            | proves my point.
        
             | pessimizer wrote:
             | The Law of Averages isn't actually a law, and everything
             | doesn't balance out in the end.
        
               | throwlaplace wrote:
               | Lol the law of large numbers (the real name for the law
               | of averages) is 100% a law
               | 
               | https://en.wikipedia.org/wiki/Law_of_large_numbers
        
         | pessimizer wrote:
         | All of HN other than you isn't actually one person arguing with
         | themselves. There are a variety of people here with a range of
         | opinions, each with different degrees of internal consistency.
         | This smacks of the "everybody says all kinds of things, so you
         | should just ignore everybody" or the "they're yelling at me
         | from the left and from the right, so that's proof I'm doing
         | everything correctly" defenses. The second is the moderation
         | fallacy; the first might be called the "Argument From
         | Sociopathy."
        
         | ccktlmazeltov wrote:
         | This 100%, this reminds me of PG's haters post[1].
         | 
         | [1]http://www.paulgraham.com/fh.html
        
         | neilk wrote:
         | Facebook is sometimes in a difficult position. But, with
         | respect, I don't think it's correct to say that it's the fault
         | of their critics for putting them in a dilemma.
         | 
         | Facebook does not face outrage for "literally remov[ing]
         | anything". In the real world, Facebook is removing thousands of
         | items every day, maybe tens of thousands. Most of them are
         | totally justified. Is HN outraged about that?
         | 
         | Also, keep vs. remove is a false dilemma. Facebook has many
         | more options than keeping or removing items. Most of the time,
         | it's about what they choose to boost or reward in other ways.
         | https://twitter.com/BKCHarvard/status/1263891198068039680
         | 
         | That said, I think you're right that being an effective arbiter
         | on planetary speech is a difficult place to be, even for people
         | with the best intentions, and FB didn't get where they are
         | today by having the best intentions all the time.
         | 
         | But personally, I think that calls into question whether
         | Facebook or anything like them should even exist.
        
         | throwlaplace wrote:
         | you know what i don't understand? how many people chafe at the
         | idea that violent video games create violent people? when i was
         | a kid that was the hot-button tech issue. no one puts any stock
         | in that right? because it's an inversion of causality - violent
         | people gravitate towards violent video games. and before video
         | games we had parental advisory on rap albums. no one put any
         | stock in that either. how is fb different? why isn't the onus
         | on the consumer?
        
         | starpilot wrote:
         | You people want less government, but you also want more. Which
         | is it, guys?
         | 
         | You could say this about any divisive issue if you lump all
         | sides together.
        
           | icelancer wrote:
           | I want more government when it helps me and less when it
           | doesn't. What's so hard about that? Everyone agrees with me,
           | I'm sure.
        
         | StanislavPetrov wrote:
         | Personally I don't think Facebook should censor anything except
         | spam and posts that break the law like direct threats of
         | violence and child abuse. However, I do think they have a
         | responsibility when it comes to paid content and the algorithms
          | they use to push content on people. It's one thing to have an
         | uncensored forum where people might be exposed to things they
         | seek out, and entirely another thing for Facebook to choose and
         | collate what people are seeing. Once they do that, they share
         | responsibility for the content people see, rather than just
         | being a platform.
        
         | cityzen wrote:
          | The right course of action is to stop using Facebook. That's
          | it, very simple. People here being polarized is exactly what
          | Facebook wants. Current Facebook users are akin to opioid
          | addicts... they probably found some relief from whatever
          | social pain they were feeling, and now they have accepted
          | selling their privacy for their fix.
          | 
          | I would say I can't wait for the day that Facebook is gone,
          | but it will be a long time, considering the number of
          | insecure and unintelligent people that need a platform like
          | that to avoid ever being challenged in the real world.
        
           | dctoedt wrote:
           | > _The right course of action is to stop using Facebook._
           | 
           | That'd give rise to something akin to a Gresham's Law problem
           | [0]. I think we have a civic duty to engage patiently -- and
           | politely -- with our friends who hold views we disagree with,
           | because (A) they get to vote; (B) angry invective isn't
           | persuasive; and (C) social proof is a thing, and sometimes
           | people _can_ be persuaded to come around to their friends '
           | point of view, eventually. It's a long shot, but worth a
           | shot.
           | 
           | [0] https://en.wikipedia.org/wiki/Gresham%27s_law
        
             | cityzen wrote:
             | So stay on Facebook. Keep in mind that while you're trying
             | to patiently and politely engage, Facebook continues to
             | build a profile about who you are and how to manipulate
              | you. The more you feed it, the more it learns about how to
              | use you against yourself.
             | 
             | In terms of civic duty, this definition of American civic
             | duty made me laugh:
             | 
             | Citizenship connects Americans in a nation bound by shared
             | values of liberty, equality, and freedom. Being a citizen
             | comes with both rights and responsibilities. Civic duty
             | embodies these responsibilities. Such civic duties help
             | uphold the democratic values of the United States.
             | 
             | The problem with civic duty is that I feel mine is to tell
             | people to stop using Facebook... but others... well, they
              | are going to show up at a state capitol carrying military-
              | grade weapons because they don't want to wear facemasks in
             | a pandemic. Would you like to patiently and politely engage
             | with them? I sure as hell wouldn't because they focus more
             | on their perception of "rights" while ignoring their
             | responsibilities.
             | 
             | My point is that civic duty in America is a pretty
             | romanticized concept that is certainly not based in any
             | reality in 2020.
             | 
             | When I did use Facebook briefly back before 2012, I found
             | it disappointing that my "christian" extended family
             | members would send me incredibly hateful memes about how
             | Obama was a muslim terrorist or satan in a tan suit. If
             | that is civic discourse, no thanks. The one upside is that
             | I haven't spoken to them in almost a decade and never plan
             | to again.
        
               | dctoedt wrote:
               | > _others... well, they are going to show up at a state
                | capitol carrying military-grade weapons because they
                | don't want to wear facemasks in a pandemic. Would you like
               | to patiently and politely engage with them?_
               | 
               | I have a few friends like that on Facebook, and yes I do
               | patiently and politely engage with them -- but on
               | occasion I've had to remind them that they aren't the
               | only ones who own guns and know how to use them ....
        
           | GordonS wrote:
           | I agree, but also from a different angle.
           | 
           | The other day a friend told me about a takeaway place I
           | should try. I couldn't find the menu on their website, and my
            | friend replied that it was on their Facebook page, and I
           | should stop being a luddite.
           | 
           | No, _no_ , dammit! We're literally giving the world wide web
           | to a single corporation, and worse, normalising it - to a lot
           | of people, Facebook basically _is_ "the internet".
           | 
            | How can this possibly be in the best interest of users? It
           | almost feels like the balance has swung too far, and it's
           | perilously close to the point of no return...
        
             | cityzen wrote:
             | Sometimes you just have to miss out on a good meal.
        
           | ccktlmazeltov wrote:
           | Why? It's a great app to keep track of what your friends are
           | doing
        
       | lmilcin wrote:
        | The problem really is platforms that give people content to
        | please them. An algorithm selects content that you are likely to
        | agree with or in which you have shown previous interest. This
        | only reinforces people's existing beliefs, and that leads to
        | polarization.
       | 
        | For example, when I browse videos on YouTube I will only get
        | Democratic content (even though I am from Poland). It seems that
        | as soon as you click on a couple of entries you get classified,
        | and from then on you will only be shown videos that are
        | agreeable to you. That means lots of Stephen Colbert and no Fox
        | News.
       | 
        | My friend is deeply Republican and she will not see any
        | Democratic content when she gets suggestions.
       | 
        | The problem runs so deep that it is difficult to find new things
        | even if I want to. I maintain another browser where I am logged
        | out, to get a more varied selection and not just the couple of
        | topics I have been interested in recently.
       | 
        | My point of view on this: it is a disaster of gigantic
        | proportions. People need to be exposed to conflicting views to
        | be able to make their own decisions.
        
         | Bedon292 wrote:
         | I think that is not quite right, but the distinction is subtle.
         | The algorithm selects the content that you are most likely to
          | be engaged with. For most people that is likely the filter
          | bubble: seeing only what they agree with. But some folks
          | actively like to have debates (or troll one another) and see
          | more content they will not agree with, because what they don't
          | agree with gets more engagement. The intent is to keep you
          | engaged and active as long as possible on the site, and to
          | feed whatever drives that behavior.
        
         | foofoo4u wrote:
         | This is the exact same behavior I have noticed from YouTube as
         | well. I miss the "old" YouTube around 2011, when it was a
         | terrific place to discover new and interesting videos. If I
         | watched a video on mountain biking, let's say, then the list of
         | suggested videos all revolved around that topic. But in today's
         | YouTube, the suggested content for the same mountain biking
         | video is all unrelated, often extremely polarizing, political
         | content. I actually can NO LONGER discover new interesting
         | content on YouTube. Like you say, it automatically categorizes
         | you based on the very first few videos and that's all you see
         | from there on out. That is why I have now configured my browser
         | to block all cookies from YouTube. I'm annoyed that I can no
         | longer enjoy YouTube logged in, but at least now I feel like
         | I've gotten back that "old" YouTube of what it once was. It's a
         | whole lot less polarizing now, I feel much better as a result
         | of it, and the suggestions are significantly improved.
        
           | lmilcin wrote:
            | Exactly. I remember clicking on the homepage to get a
            | selection of new, interesting videos. Now I just get exactly
            | the same thing every time I click. Useless. I would like to
            | discover new topics, not get a rehash of the same ones.
        
         | empath75 wrote:
         | > The problem really is platforms that give people content to
         | please them.
         | 
         | But it doesn't please them -- study after study shows a high
         | correlation between depression and anxiety and social media
         | use.
        
           | robertlagrant wrote:
           | Plenty of things don't cure depression. Doesn't make them
           | bad.
        
         | sixothree wrote:
         | Same goes for non-political content. I often have to log out of
         | youtube to find something new and interesting (even though I
         | have hundreds of subscriptions).
        
         | dang wrote:
         | Sorry for the self-reference outside of a moderation context,
         | but I wrote what turned into an entire essay about this last
         | night: https://news.ycombinator.com/item?id=23308098. It's
         | about how this plays out specifically on HN. Short version:
         | it's precisely because this place _is_ less divisive that it
         | _feels_ more divisive. In fact, HN is probably the least
         | divisive community of its size and scope on the internet (if
          | there are others, I'd like to know which they are), and
         | precisely because of this, many people feel that it's among the
         | most divisive. The solution to this paradox is that HN is the
         | rare case of a large(ish) community that keeps itself in one
         | piece instead of breaking into shards or silos. If that's true,
         | then although we haven't yet realized it, the HN community is
         | actually on the leading edge of the opportunity to learn to be
         | different with one another, at least on the internet.
        
           | lmilcin wrote:
           | First of all, less divisive environment means you interact
           | with people of different opinions which means that few
           | interactions will be with exactly like-minded people.
           | 
            | Environments where all people tend to think exactly the same
            | are typically extremist in some way, resulting from some kind
            | of polarization process that eliminates people who don't
            | express opinions at the extreme of the spectrum. They are
            | either removed forcibly or remove themselves when they get
            | dissatisfied.
           | 
           | One way HN stays away from this polarization process is
           | because of the discussion topics and the kind of person that
           | typically enjoys these discussions. Staying away from
           | mainstream politics, religion, etc. and focusing mainly on
           | technological trivia means people of very different opinions
           | can stay civilized discussing non-divisive topics.
           | 
           | Also it helps that extremist and uncivilized opinions tend to
           | be quickly suppressed by the community thanks to vote-
            | supported tradition. I have been reading HN from very close
            | to the start (even though I created my account much later).
            | I think the first users were much more VC/development
            | oriented, and as new users came they tended to observe and
            | conform to the tradition.
           | 
            | (I read your piece. I think I figured it out. The users
            | actually do select themselves on HN, though in a different
            | way. People who can't cope with a diverse community can't
            | find a place for themselves, because there is no way to
            | block diverse opinions, and so they remove themselves from
            | here; this is what allows HN to survive. The initial
            | conditions were people who actually invited diverse
            | opinions, which allowed this equilibrium.)
        
           | anigbrowl wrote:
            | The thing is that HN is essentially run like Singapore - a
           | benign-seeming authoritarian dictatorship that shuts down
           | conflicts early and is also relatively small and self-
           | contained. One thing that doesn't get measured in this
           | analysis is the number of people who leave because they find
           | that this gives rise to a somewhat toxic environment, as
           | malign actors can make hurtful remarks but complaints about
           | them are often suppressed. Of course, it tends to average out
           | over time and people of opposite political persuasions may
           | both feel their views are somewhat suppressed, but this
            | largely reactive approach is easily gamed as long as it's
           | patiently.
        
           | jeffdavis wrote:
           | One of my theories about the success of HN is that we are
           | grouped together based on one set of topics (on which we
           | largely agree), but we discuss other topics over which we are
           | just as divided as the general public.
           | 
           | I believe there is an anchoring effect -- if you are just in
           | a discussion where someone helps you understand the RISC-V
           | memory model, it feels wrong to go into another thread on the
           | same site and unload a string of epithets on someone who
           | feels differently than you do about how doctors should get
           | paid.
        
           | tfandango wrote:
           | This is why I like HN. I am always challenged with different
           | points of view on here, and in a non-argumentative way. It's
           | just a rational discussion. Often I will see something on FB
           | or Twitter that is outrageous to me (by design), but when I
           | look it up on HN and find some discussion on the details,
           | truth is often more sane than it seems...
        
         | bcrosby95 wrote:
          | This isn't necessarily bad all the time. But when content is
          | used to form opinions on real-world things that actually
          | matter, it definitely becomes a problem.
         | 
         | In other words, Steam, please filter games by my engagement in
         | previous games I've played. News organizations, please don't
         | filter news by my engagement in previous news.
         | 
         | Facebook's problem is it acts in two worlds: keeping up with
         | your friends, and learning important information. If all you
         | did was keep up with your friends' lives, filtering content by
         | engagement is kind of meh.
         | 
         | Same with youtube. I mostly spend all my time on there watching
         | technical talks and video game related stuff. It's pure
         | entertainment. So filtering content is fine. But if I also used
         | it to get my news, you start to run into problems.
        
         | MrZander wrote:
         | That is a really annoying issue I have with YouTube.
         | 
         | I occasionally watch some of the Joe Rogan podcast videos when
         | he has a guest I'm interested in. I swear, as soon as I watch
         | one JRE video, I am suddenly inundated with suggestions for
         | videos with really click-baity and highly politicized topics.
         | 
         | I've actually gotten to the point where I actively avoid videos
         | that I want to watch because I know what kind of a response
         | YouTube will have. Either that or I open them in incognito
         | mode. It's a shame. I wish I could just explicitly define my
         | interests rather than YT trying to guess what I want to watch.
        
         | lostmyoldone wrote:
          | In the case of Facebook they absolutely do not try to please
          | me. They quite literally try to do the exact opposite of
          | everything I would like from my feed.
          | 
          | Chronological ordering, with the ability to easily filter who
          | I see and who I post to. On each point, capabilities have
          | either been removed, hidden, or made worse in some other
          | creative way.
         | 
         | Adding insult to injury, having to periodically figure out
         | where they've now hidden the save button for events, or some
         | other feature they don't want me to use is always a 'fun'
         | exercise.
        
           | bittercynic wrote:
           | It doesn't address all of those, but if you visit
           | https://www.youtube.com/feed/subscriptions it looks like it's
           | still just a reverse chronological list of videos from your
           | subscriptions.
        
         | germinalphrase wrote:
         | Cynically, very few platforms want you to make your own
         | decisions.
        
         | alittletooraph wrote:
         | I agree with you but this is an incredibly hard problem to
         | solve. How are you going to get your friend to engage with
         | videos that are in direct opposition to her world views?
         | Recommendations are based on what she actually clicks on, how
         | long she actually watches the videos, etc.
         | 
         | And from the business perspective, they're trying to reduce the
         | likelihood that your friend abandons their platform and goes to
         | another one that she feels is more "built for her".
        
           | lmilcin wrote:
            | A start would be to recognize that businesses should not be
            | allowed to exploit this aspect of human nature, because the
            | harm is too great to justify the business opportunity.
        
             | asdff wrote:
             | We need to regulate this industry very badly
        
           | jimbob45 wrote:
           | It's easy to solve. FB gets to either be a platform for
           | content or a curator for content. They can't be both because
           | that would be a conflict of interest.
        
             | sbarre wrote:
             | Then what's the business model? Who pays for all of it?
             | 
             | I'm not defending a specific approach or solution, but just
             | pointing out that at this point, FB is a huge entrenched
             | business that makes a lot of money on the status quo, and
             | so convincing them to change "for the better" is barking up
             | the wrong tree until "for the better" means "more
             | profitable".
             | 
             | Splitting the platform and curation means the platform
             | needs a revenue stream. If the curator pays the platform,
             | then all you're doing is shifting the conflict up a notch,
             | not solving it.
        
         | CivBase wrote:
         | What really scares me is how many people I know who acknowledge
         | that platforms like Facebook and YouTube are designed to create
         | echo chambers which tend to distort people's opinions and
         | perceptions towards extremes... but still actively engage with
         | them without taking any precautions. They know it's bad for
         | them, but they keep going back for more.
        
           | scollet wrote:
            | Having awareness probably means they can engage in a
            | meaningful way. Some degree of maturity and critical thought
            | is required to dam up low-value media. It's something akin
            | to junk food: junk media.
        
       | LordFast wrote:
       | Social media is an addictive substance and should be controlled.
       | End of story.
        
       | [deleted]
        
       | vtail wrote:
       | Disclaimer: I started working at FB recently.
       | 
        | Consider the following model scenario. You are a PM at a
        | discussion board startup in Elbonia. There are too many
        | discussions at any given time, so you personalize the list for
        | each user, showing only discussions she is more likely to
        | interact with (it's a crude indication of user interest, but
        | interest is tough to measure accurately).
       | 
        | One day, your brilliant data scientist trains a model that
        | predicts which of the two Elbonian parties a user most likely
        | supports, as well as whether a comment/article discusses a
        | political topic or not. Then a user researcher makes a striking
       | discovery: supporters of party A interact more strongly with
       | posts about party B, and vice versa. A proposal is made to
       | artificially reduce the prevalence of opposing party posts in
       | someone's feed.
       | 
       | Would you support this proposal as a PM? Why or why not?
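        | 
        | The proposal could be sketched as a small re-ranking step.
        | (Purely hypothetical: the function, inputs, and the 0.5 penalty
        | factor are invented for illustration, not anything Facebook
        | actually does.)

```python
def rank_feed(posts, viewer_party, p_political, p_party, penalty=0.5):
    """Rank posts by engagement score, down-weighting political posts
    predicted to support the party opposing the viewer's.

    posts        -- list of (post_id, engagement_score)
    viewer_party -- 'A' or 'B', the viewer's predicted affiliation
    p_political  -- post_id -> probability the post is political
    p_party      -- post_id -> party the post is predicted to support
    penalty      -- multiplier applied to a fully political opposing post
    """
    def adjusted(post):
        post_id, score = post
        if p_party[post_id] != viewer_party:
            # Interpolate the penalty by how political the post looks.
            score *= 1.0 - (1.0 - penalty) * p_political[post_id]
        return score

    return sorted(posts, key=adjusted, reverse=True)
```

        | Even in this toy form the ethical question is visible: the
        | ranking silently halves the reach of opposing-party content.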
        
         | freeAgent wrote:
         | That's beside the point, though. The point here is that
         | Facebook executives were told by their own employees that the
         | algorithms they designed were recommending more and more
         | partisan content and de-prioritizing less partisan content
         | because it wasn't as engaging. They were also told that this
         | was potentially causing social issues. In response, Kaplan/FB
         | executives said that changing the algorithm would be too
         | paternalistic (ignoring, apparently, that an algorithm that
         | silently filters without user knowledge or consent is already
         | fundamentally "paternalistic"). Given that Facebook's objective
         | is to "bring the world closer together", choosing to support an
         | algorithm that drives engagement that actually causes division
         | seems a betrayal of its stated goals.
        
         | carapace wrote:
         | You voluntarily put yourself in this position with no good way
         | of fixing it. No one's _forcing_ Facebook to do what they (and
         | now you) do, eh?
         | 
         | My perception of reality is that you and your brilliant data
         | scientist are (at best naive and unsuspecting) patronizing
         | arrogant jerks who have no business making these decisions for
         | your users.
         | 
         | You captured these peasants' minds, now you've got a tiger by
         | the tail. The obvious thing to do is let go of the tiger and
         | run like hell.
        
         | ForrestN wrote:
         | I would take a step back and question the criteria we are using
         | to make decisions. "Engagement" in this context is euphemistic.
         | This startup is talking about applying engineering to influence
         | human behavior in order to make people use their product more,
         | presumably because their monetization strategy sells that
         | attention or the data generated by it.
         | 
         | If I were the PM I'd suggest a change in business model to
         | something that aligns the best interests of users with the best
         | interests of the company.
         | 
         | I'd stop measuring "engagement" or algorithmically favoring
         | posts that people interact with more. I'd have a conversation
         | with my users about what they want to get out of the platform
         | that lasts longer than the split second decision to click one
         | thing and not another. And I'd prepare to spend massive
         | resources on moderation to ensure that my users aren't being
         | manipulated by others now that my company has stopped
         | manipulating them.
         | 
         | I think the issues of showing content from one side of a
         | political divide or the other is much less important than
         | showing material from trustworthy sources. The deeper issue,
         | which is a very hard problem to solve, is dealing with the
         | fundamental asymmetries that come up in political discourse. In
         | the US, if you were to block misinformation and propaganda
         | you'd disproportionately be blocking right wing material. How
         | do you convince users to value truth and integrity even if
         | their political leaders don't, and how do you as a platform
         | value them even if that means some audiences will reject you?
         | 
         | I don't know how to answer those questions but they do start to
         | imply that maybe "news + commenting as a place to spend lots of
         | time" isn't the best place to expend energy if you're trying to
         | make things better?
        
         | pacala wrote:
         | Predicting political preferences is a huge can of worms. Don't
         | do it. Have the data scientist delete their model.
        
           | [deleted]
        
           | vtail wrote:
            | They did as you say (you are a PM, after all!), and the next
            | week they rolled out a "likelihood of engagement" model. An
            | independent analysis by another team member, familiar with
            | the old model, confirmed that it was still mostly driven by
            | politics (there is not much going on in Elbonia besides
            | politics), but politics was neither the direct objective nor
            | an explicit factor in the model.
           | 
           | The observed behavior is the same: using the new model, most
           | people are still shown highly polarized posts, as indicated
           | by subjective assessment of user research professionals.
           | 
           | What should you do now?
        
             | lostmyoldone wrote:
             | In regards to a predictive model and privacy/ethics/etc,
             | regardless of your objective function and explicit
             | parameters a model can only be judged on what it actually
             | predicts, thus it is enough to answer the prior question to
             | be able to answer this.
             | 
             | This is because of the fact that machine learning models
             | are prone to learn quite different things than the
             | objective function intended, hence the introduction of
             | different intent or structure of the model must be
             | disregarded when analysing the results.
             | 
             | To any degree the models predict similarly, they must be
             | regarded as similar, but perhaps in a roundabout way.
        
             | pacala wrote:
             | I come to grips with:
             | 
              | * The 'engagement' metric leads to toxic outcomes no
              | matter what.
             | 
             | * The upper management / board is single mindedly obsessed
             | with 'engagement', as a proxy for making money.
             | 
              | * I cannot function in an environment where my personal
              | ethics are in direct conflict with the company's focus.
             | 
             | Therefore I quit. YMMV.
        
             | dd36 wrote:
             | We used newsgroups and message boards long before Facebook.
             | They weren't as toxic, I'm assuming due to active
             | moderation. The automated or passive or slow moderation is
             | perhaps the issue.
        
               | asdff wrote:
               | Newsgroups and message boards users of old probably
               | aren't very representative of the populations using
               | Facebook today.
        
               | colinmhayes wrote:
               | I think they weren't as toxic because content creators
               | didn't realize divisive content drives much more
               | engagement. It's not about moderation, it's a paradigm
               | shift in the way content is created.
        
           | dkn775 wrote:
           | Agreed, as a general rule I shy away from predicting things I
           | wouldn't claim expertise in otherwise. This is why consulting
            | with subject matter experts is important. Things as innocuous
            | as traffic crashes and speeding tickets are a huge world
            | unknown to the casual analyst (the field of "Traffic
            | Records").
        
         | drchopchop wrote:
         | No. Why should the only desirable metric be user engagement?
         | 
         | Is the goal of FB engagement/virality/time-on-site/revenue
         | above all else? What does society have to gain, long term, by
         | ranking a news feed by items most likely to provoke the
         | strongest reaction? How does Facebook's long-term health look,
         | 10 years from now, if it hastens the polarization and anti-
         | intellectualism of society?
        
           | maest wrote:
           | > How does Facebook's long-term health look, 10 years from
           | now, if it hastens the polarization and anti-intellectualism
           | of society?
           | 
            | Arguably, the PM doesn't care, since they have short-term
            | targets they want to hit and they might not even be with the
            | company in a few years' time.
        
           | cj wrote:
           | > Is the goal of FB engagement/virality/time-on-site/revenue
           | above all else?
           | 
            | Strictly speaking, Facebook is a public company that exists
            | only to serve its shareholders' interests. The goal of
            | Facebook (as a public company) is to increase its stock
            | price. That often, if not always, means prioritizing revenue
            | over all else.
           | 
           | That's the dilemma.
           | 
           | Then again, I believe Mark has control of the board, right?
           | (And therefore couldn't be ousted for prioritizing ethical
           | business practices over revenue - I could be wrong about
           | this)
        
             | freeflight wrote:
             | _> Strictly speaking, Facebook is a public company that
              | exists only to serve its shareholders' interests._
             | 
             | That's a very US-centric interpretation, which fits because
             | Facebook is a US company.
             | 
             | But it's still reductive to the issue considering how
             | Facebook's reach is also far and wide outside the US.
             | 
              | In that context, it's not really that much of an unsolvable
              | dilemma; it only appears as such when the notion of
              | "shareholder gains above all else" is considered some kind
              | of holy grail thou shalt never challenge.
        
         | jointpdf wrote:
         | No. To me, all recommendation engines should be:
         | 
         | - _User-configurable and interpretable_ : Enable tuning or re-
         | ranking of results, ideally based on the ability to reweight
         | model internals in a "fuzzy" way. As an example, see the last
         | comment in my history about using convolutional filters on song
         | spectrograms to distill hundreds of latent auditory features
         | (e.g. Chinese, vocal triads, deep-housey). Imagine being able
         | to directly recombine these features, generating a new set of
         | recommendations dynamically. Almost all recommendation engines
         | fail in this regard--the model feeds the user exactly what the
         | model (designer) wants, no more and no less.
         | 
          | - _Encourage serendipity_ : i.e. purposefully select and
          | recommend items that the model "thinks" are outside the user's
          | wheelhouse (wheelhouse = whatever naturally emerging cluster(s)
         | in the data that the user hangs out in, so pluck out examples
         | from both nearby and distant clusters). This not only helps
         | users break out of local minima, but is healthy for the data
         | feedback loop.
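          | 
          | A minimal sketch of the serendipity idea, under invented
          | assumptions (the cluster assignments and 80/20 split are made
          | up; a real system would sample from learned embedding
          | clusters):

```python
import random

def serendipitous_recs(items_by_cluster, home_cluster, n,
                       explore_frac=0.2, rng=None):
    """Recommend n items: mostly from the user's home cluster, plus a
    deliberate slice sampled from distant clusters."""
    rng = rng or random.Random()
    n_explore = int(n * explore_frac)   # items pulled from distant clusters
    n_home = n - n_explore              # items from the user's own cluster
    home = rng.sample(items_by_cluster[home_cluster], n_home)
    distant_pool = [item
                    for cluster, items in items_by_cluster.items()
                    if cluster != home_cluster
                    for item in items]
    explore = rng.sample(distant_pool, n_explore)
    return home + explore
```

          | The point is structural: exploration is an explicit, tunable
          | budget rather than whatever the engagement signal happens to
          | surface.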
        
         | nitwit005 wrote:
         | I'd just suggest the data scientist was optimizing the wrong
          | metrics. People might behave that way, but having frequent
          | political arguments is a reason people stop using Facebook
          | entirely. It's definitely one of the more common reasons
          | people unfollow friends.
         | 
          | Very high levels of engagement seem to be a negative indicator
          | for social sites. You don't want your users staying up until
          | 2AM having arguments on your platform.
        
         | laughinghan wrote:
         | If you restrict yourself to 2 bad choices, then you can only
         | make bad choices. It doesn't help to label one of them
         | "artificial" and imply the other choice isn't artificial.
         | 
         | It is, in fact, not just crude but actually quite _artificial_
         | to measure likelihood to interact as a single number, and
         | personalize the list of discussions solely or primarily based
         | on that single number.
         | 
         | Since your chosen crude and artificial indication turned out to
          | be harmful, why double down on it? Why not seek something
         | better? Off the top of my head, potential avenues of
         | exploration:
         | 
         | * different kinds of interaction are weighted differently. Some
         | could be weighted negatively (e.g. angry reacts)
         | 
         | * [More Like This] / [Fewer Like This] buttons that aren't
         | hidden in the [?] menu
         | 
         | * instead of emoji reactions, reactions with explicit editorial
         | meaning, e.g. [Agree] [Heartwarming] [Funny] [Adds to
         | discussion] [Disagree] [Abusive] [Inaccurate] [Doesn't
         | contribute] (this is actually pretty much what Ars Technica's
         | comment system does, but it's an optional second step after up-
         | or down-voting. What if one of these were the only way to up-
         | or down-vote?)
         | 
         | * instead of trying to auto-detect party affiliation, use
         | sentiment analysis to try to detect the tone and toxicity of
         | the conversation. These could be used to adjust the weights on
         | different kinds of interactions; maybe some people share
         | divisive things privately but share pleasant things publicly.
         | (This seems a little paternalistic, but no more so than
         | "artificially" penalizing opposing party affiliation)
         | 
         | * certain kinds of shares could require or encourage
         | editorializing reactions ([Funny] [Thoughtful] [Look at this
         | idiot])
         | 
         | * Facebook conducted surveys that determined that Upworthy-
         | style clickbait sucked, in spite of high engagement, right?
         | Surveys like that could be a regular mechanism to determine
         | weights on interaction types and content classifiers and
         | sentiment analysis. This wouldn't be paternalistic, you
         | wouldn't be deciding for people, they'd be deciding for
         | themselves
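The first suggestion, weighting interaction types differently and letting some (like angry reacts) count negatively, reduces to a dot product over interaction counts. A minimal sketch; the weight values and interaction names here are illustrative assumptions, not anything any platform actually uses:

```python
# Editorially chosen weights per interaction type. Negative weights
# demote content that provokes anger or explicit "fewer like this"
# signals. All values are made up for illustration.
INTERACTION_WEIGHTS = {
    "like": 1.0,
    "thoughtful_comment": 2.0,   # longer, on-topic replies
    "share": 1.5,
    "angry_react": -2.0,         # negatively weighted, per the bullet above
    "hide_post": -5.0,           # an explicit [Fewer Like This] click
}

def post_score(interactions):
    """Score a post from a dict of interaction type -> count.

    Unknown interaction types contribute nothing rather than raising.
    """
    return sum(INTERACTION_WEIGHTS.get(kind, 0.0) * count
               for kind, count in interactions.items())
```

Under this scheme a post with 10 likes but 8 angry reacts scores negatively and gets demoted, whereas a pure engagement metric would rank it as 18 interactions and promote it.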
        
         | Osiris wrote:
         | I miss the days when my feed wasn't curated at all and just
         | sorted by most recent post.
         | 
         | The whole point of having friends and being able to (un)follow
         | people is so I can curate my own feed.
         | 
         | I don't use Facebook anymore except for hobby-related groups
         | like my motorcycling group.
        
           | freeAgent wrote:
           | Same. I miss the days of the chronological feed. Facebook's
           | algorithms seem to choose a handful of people and groups I'm
           | connected to and constantly show me their content and nothing
           | else. It's always illuminating when I look someone up after
           | wondering what happened to them only to see that they've been
           | keeping up with Facebook, but I just don't see any of their
           | posts.
        
             | tfandango wrote:
             | Yesterday, in fact, I saw a post from a family member that
             | I really wanted to read; I started but was interrupted.
             | When I had a chance to focus again, I re-opened the FB app
             | and the post was nowhere to be seen, scrolled up, scrolled
             | down, it was gone. I had to search for my family member to
             | find it again. Super frustrating, and makes you wonder what
             | FB decided you didn't need to see (which I guess is the
             | point of this whole thread)...
        
         | ggggtez wrote:
         | This is a false choice. The real problem stems from the fact
         | that the model rewards engagement at the cost of everything
         | else.
         | 
         | Just tweaking one knob doesn't solve the problem. A real
         | solution is required, that would likely change the core
         | business model, and so no single PM would have the authority to
         | actually fix it.
         | 
         | Fake news and polarization are two sides of the same coin.
        
         | seek3r00 wrote:
         | No, it would not be my job to decide which discussions a user
         | should be driven to.
         | 
         | If a user is driven to political discussions, so be it.
         | 
         | Sure, this is good for the company because it means the user
         | will spend more time on the platform, but it is a side effect
         | really.
        
           | phkahler wrote:
           | >> No, it would not be my job to decide which discussions a
           | user should be driven to.
           | 
           | But Facebook feels it's their job to drive certain things to
           | users. That's the whole point, as far as they can tell. I
           | disagree too.
        
           | fach wrote:
           | I would think engagement would be a core metric you would be
           | measured against in this example. And if that's the case,
           | this certainly isn't a side effect.
        
         | LeifCarrotson wrote:
         | As a PM, I'd support it as an A/B test. Show some percentage of
         | your users an increased level of posts from the opposite party,
         | some others an increased level of posts from their own party,
         | and leave the remaining 90% alone. After running that for a
         | month or two, see which of those groups is doing better.
         | 
         | They've clearly got something interesting and possibly
         | important, but 'interaction strength' is not intrinsically good
         | or bad. I would instead ask the researcher to pivot from a
         | metric of "interaction strength" to something more closely
         | aligned to the value the user derives from their use of your
         | product. (Side note: Hopefully, use of your product adds value
         | for your users. If your users are better off the less they use
         | your platform, that's a serious problem).
         | 
         | Do people interacting with posts from the opposite party come
         | away more empathetic and enlightened? If they are predominantly
         | shown posts from their own party, does an echo chamber develop
         | where they become increasingly radicalized? Does frequent
         | exposure to viewpoints they disagree with make people
         | depressed? They'll eventually become aware outside of the
         | discussion board of what the opposite party is doing, does
         | early exposure to those posts make them more accepting, or does
         | it make them angry and surprised? Perhaps people become
         | fatigued after writing a couple angry diatribes (or the
         | original poster becomes depressed after reading that angry
         | diatribe) and people quit your platform.
         | 
         | Unfortunately, checking interaction strength through comment
         | word counts is easy, while sentiment analysis is really hard.
         | Whether doing in-person psych evals or broadly analyzing the
         | users' activity feed for life successes or for depression,
         | you'll have tons of noise, because very little of those effects
         | will come from your discussion board. Fortunately, your
         | brilliant data scientist is brilliant, and after your A/B test,
         | has tons of data to work with.
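The bucketing for an experiment like the one described above (a small treatment arm per condition, ~90% left alone) is usually done with a deterministic hash so each user stays in the same arm for the whole run. A minimal sketch; the bucket names, percentages, and salt are assumptions for illustration:

```python
import hashlib

def assign_bucket(user_id, salt="feed-diversity-exp-1"):
    """Deterministically assign a user to an experiment arm.

    ~5% see more opposite-party posts, ~5% see more own-party posts,
    and the remaining ~90% are the untouched control. Hashing the
    salted user id gives a stable, roughly uniform point in [0, 100).
    """
    digest = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
    point = int(digest, 16) % 100
    if point < 5:
        return "more_opposite_party"
    if point < 10:
        return "more_own_party"
    return "control"
```

Because assignment depends only on the user id and the experiment salt, a user who returns a month later lands in the same arm, and changing the salt reshuffles everyone for the next experiment.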
        
         | smhinsey wrote:
         | This is why the liberal arts are important, because you need
         | someone in the room with enough knowledge of the world's
         | history to be able to look at this and suggest that, given the
         | terrible history of pseudo-scientifically sorting people into
         | political categories, maybe you should not pursue this tactic
         | simply to make a buck off of it.
        
           | Barrin92 wrote:
           | You don't need liberal arts majors in the boardroom, you need
           | a military general in charge at the FTC and FCC.
           | 
           | Can we dispense with the idea that someone employed by
           | facebook regardless of their number of history degrees has
           | any damn influence on the structural issue here, which is
           | that Facebook is a private company whose purpose is to
           | mindlessly make as much money for their owners as they can?
           | 
           | The solution here isn't grabbing Mark and sitting him down in
           | counselling, it's to have the sovereign, which is the US
           | government, exercise the authority it has apparently
           | forgotten how to use and rein these companies in.
        
           | majewsky wrote:
           | "Thank you for consulting us so extensively. After long
           | deliberation, we've decided to move ahead with the
           | implementation to meet Q4 targets."
        
           | 1propionyl wrote:
           | Agreed. Engineers have an ethical duty to the public. When
           | working on software systems that touch on so many facets of
           | people's lives, a thorough education in history, philosophy,
           | and culture is necessary to make ethical engineering
           | decisions. Or, failing that, the willingness to defer to
           | those who do have that breadth of knowledge and expertise.
           | 
           | I'm reminded of this article:
           | 
           | https://www.theatlantic.com/technology/archive/2015/11/progr.
           | ..
           | 
           | "The term is probably a shortening of "software engineer,"
           | but its use betrays a secret: "Engineer" is an aspirational
           | title in software development. Traditional engineers are
           | regulated, certified, and subject to apprenticeship and
           | continuing education. Engineering claims an explicit
           | responsibility to public safety and reliability, even if it
           | doesn't always deliver.
           | 
           | The title "engineer" is cheapened by the tech industry."
           | 
           | "Engineers bear a burden to the public, and their specific
           | expertise as designers and builders of bridges or buildings--
           | or software--emanates from that responsibility. Only after
           | answering this calling does an engineer build anything,
           | whether bridges or buildings or software."
        
       | dredmorbius wrote:
       | Related, earlier this week on the New Books Network
       | 
       | Cailin O'Connor, "The Misinformation Age: How False Beliefs
       | Spread" (Yale UP, 2018)
       | 
       | (New Books in Journalism) Duration: 40:00
       | 
       | Published: Wed, 20 May 2020 08:00:00 -0000
       | 
       | Media: https://traffic.megaphone.fm/LIT1956686397.mp3 (audio)
       | 
       | Podcast: https://www.podcastrepublic.net/podcast/425693571
       | 
       | Why should we care about having true beliefs? And why do
       | demonstrably false beliefs persist and spread despite bad, even
       | fatal, consequences for the people who hold them?
       | 
       | Author page: http://cailinoconnor.com/the-misinformation-age/
       | 
       | Editor's book site:
       | https://yalebooks.yale.edu/book/9780300234015/misinformation...
       | 
       | Worldcat: https://www.worldcat.org/title/misinformation-age-how-
       | false-...
        
       | save_ferris wrote:
       | Zuckerberg's invincibility as CEO is nothing short of one of the
       | greatest failures of modern capitalism. It's simply astounding
       | that such a terrible leader has retained control of what is
       | clearly a company out of control. And the market accepts all of
       | it while individuals constantly criticize his and Facebook's
       | actions.
       | 
       | People always throw around "well stop using Facebook" but that
       | clearly isn't a reasonable solution from a scalability
       | standpoint. What percentage of those people also hold Facebook
       | stock, either directly or through a hedge fund, ETF, etc.? It
       | could be more than we think.
       | 
       | At the end of the day, profits don't care about people, and this
       | is the consequence we all have to live with.
        
       | [deleted]
        
       | 12xo wrote:
       | Attention is the currency of media. Sensationalism is the fuel.
        
       | visarga wrote:
       | Two very nice phrases ... I can imagine the rest...
        
       | m12k wrote:
       | One man's division is another man's engagement
        
       | ccktlmazeltov wrote:
       | flagged because it's behind a paywall
        
       | Fiveplus wrote:
       | This entire thread, discussion and the article in focus make me
       | so relieved. I'm so proud of my decision to quit facebook,
       | twitter and reddit altogether. There is soooo much less noise
       | in my life. I'm
       | finally reading books, enjoying my hobbies while still getting
       | what 'I' like from the internet - RSS feeds to give me the latest
       | and most popular developments in news without any user generated
       | comments. 1-on-1 messaging services to help me stay connected
       | with my loved and dear ones and an occasional tour of websites
       | like HN and my favorite blogs from the bookmark folder. I do not
       | want the reader to assume my model is perfect, it's subjective.
       | But that's the point - it is what I make out to be the perfect
       | browsing model and intended use-case of internet to me. Another
       | minor point, ever since I moved away from reading what 'people'
       | have to say in comments, it decluttered my mind.
       | 
       | The internet is what you make of it. I let it direct how I used
       | it, and getting myself away from that grip and 'sucked into'
       | environment is a blessing.
        
       | dghughes wrote:
       | I wonder what would happen if Facebook and Twitter were shut down
       | for 30 days.
        
       | alzaeem wrote:
       | The outrage towards Facebook causing divisiveness is a red
       | herring. You want to see divisive content, go to foxnews vs cnn.
       | Pretty much the entire media is partisan and biased towards their
       | constituents' points of view. For Facebook, it would be nice if
       | they stuck to showing whatever is posted by a user's friends or
       | organizations they like/follow without much curation, but my view
       | is that their impact on divisiveness overall is minuscule.
        
       | DanielBMarkham wrote:
       | Everybody who uses Facebook should spend about ten minutes on it.
       | Catch up with the important things friends are doing and leave.
       | 
       | Unfortunately, this behavior is not in Facebook's best interest.
       | For them, it's Facebook now, Facebook later, Facebook as far as
       | the eye can see. Everything is Facebook.
       | 
       | There is a premise to this article that needs to be called out
       | and expunged. I have come to the sad conclusion that Facebook is
       | a company that should not exist. It's laying waste to huge
       | sections of the economy that used to provide valuable,
       | informative content, it's in a battle to suck your entire day
       | away from you with streaming and other services, and its premise
       | is in direct contradiction to how we know societies evolve. You
       | can't start with "how do we fix it" and end up anywhere good.
       | 
       | They're not dummies. There might be a lot of happy-talk, echo
       | chamber discussions happening inside the company, but they know
       | the score. That's why they're picking political winners and
       | losers. I imagine there's a ton of money heading out to both
       | parties to provide cover over the next few election cycles.
       | 
       | I think looking back, if we manage to navigate our way through
       | this period, it's going to be viewed as a very sad and dark time,
       | much like the dark ages. I sincerely hope I am completely wrong
       | about all of this.
        
       | munificent wrote:
       | A few years back, there was a documentary called "The
       | Brainwashing of My Dad" about how Fox News and conservative radio
       | turned a relatively non-political Democrat into an angry, active
       | Republican.
       | 
       | In the past couple of years, I witnessed the same thing happen to
       | my mother, except driven almost entirely by Facebook and its non-
       | stop parade of right-wing pro-Trump racist memes.
        
       | renewiltord wrote:
       | People always blame Facebook when the existence of Internet
       | forums has always led to radicalization of individuals.
       | Facebook's crime is making forums accessible to all.
       | 
       | These are just your fellow people. This is how they are in the
       | situation that they're in. So be it. Let them speak to others
       | like them.
       | 
       | The cost of that is many angry people. The benefit of that is
       | that folks like me can find my people. That benefit outweighs the
       | cost.
       | 
       | This is just the price of the open society.
        
         | chongli wrote:
         | _People always blame Facebook when the existence of Internet
         | forums has always led to radicalization of individuals.
         | Facebook's crime is making forums accessible to all._
         | 
         | If it were only that, I would have a hard time assigning blame
         | to Facebook. However, it is not only that. Facebook exercises
         | editorial control through its recommendation engine. Users
         | don't see all posts in chronological order. They see posts
         | ranked by Facebook based on invisible and inscrutable
         | algorithms that are optimized for engagement.
         | 
         | It just so happens that making people angry is an effective way
         | to keep them engaged with your platform. Thus it's not fair to
         | call Facebook a neutral party if they're actively foregrounding
         | divisive content in order to increase engagement.
        
           | renewiltord wrote:
           | I'm sympathetic to this position. I've heard people say the
           | same about YouTube and I don't have a concrete position on
           | this.
           | 
           | On one hand, if someone were to tell me "The Mexicans are
           | ruining America" and I were to say "Damned right! Who else do
           | you know who says these great and grand truths about
           | America?" I would expect that person to introduce me to more
           | people like them and my radicalization and engagement would
           | increase out of my own desire to have more of this thing.
           | That aspect of Facebook's recommendation engine just seems
           | like a simulation of a request for more like what I want in a
           | very obedient manner. That is, the tool is actually
           | fulfilling what I am expressing I desire.
           | 
           | On the other hand, the inputs are inscrutable and not clearly
           | editable. For instance, suppose I look at myself and say "God
           | damn it, some of these things I'm saying are really bigoted.
           | I don't want to be like this", I cannot actually self-modify
           | because there is no mechanism on Facebook to modify the
           | inputs. It'll select for me the content I have these auto-
           | preferences for but not the ones I have higher order
           | preferences for.
           | 
           | Essentially it's a fridge that always has cake even though I
           | want to lose weight.
           | 
           | So, yeah, I'm sympathetic that I cannot alter the weights on
           | my recommendation and say "I want to clear your understanding
           | of the person I want to be. Stop reinforcing the one I am
           | now."
           | 
           | Certainly the recommendation engine is a flaw. I do _like_
           | recommendations, though, and that's my favourite way of
           | browsing YouTube in the background. It's pretty good at music
           | discovery. So, perhaps it needs to be only opt-in. Imposed by
           | choice rather than by default. It still has to be possible to
           | turn it off.
           | 
           | Even then, I'm not sure. This is an ethical question I've
           | been thinking about for ages: Is it ethical to allow someone
           | to make a choice that could be detrimental and that they
           | cannot recover from? What are the parameters around when it
           | is ethical? Opting in to recommendations could be a one way
           | trap.
        
         | ergl wrote:
         | > The cost of that is many angry people. The benefit of that is
         | that folks like me can find my people. That benefit outweighs
         | the cost.
         | 
         | It outweighs the cost for you. It certainly doesn't for society
         | at large.
        
         | asdff wrote:
         | The difference is that facebook is unlike a forum. It's not
         | actively moderated, and content is bumped according to
         | engagement/marketing potential rather than chronologically by
         | genuine user interest alone.
        
           | jimmaswell wrote:
           | Pages are moderated by their creators and you can unfriend
           | someone you don't want to see posts from.
        
             | AlexandrB wrote:
             | Can I get Facebook to show me _all_ posts from someone or
             | is that still not possible?
        
         | AlexandrB wrote:
         | > This is just the price of the open society.
         | 
         | I don't think an open society can be built on top of an
         | advertising platform. Facebook is not a neutral party here -
         | they control who sees what content at what time with little
         | accountability or transparency.
        
       | bookmarkable wrote:
       | Perhaps important journalism, but it is behind a paywall, so
       | apparently WSJ is satisfied that only their subscribers know this
       | information about Facebook.
       | 
       | Meanwhile, Facebook is not behind a paywall, so they can monitor
       | the conversations of billions of people despite monthly stories
       | that circulate illustrating gross misconduct.
        
       | donohoe wrote:
       | Do you work at Facebook? This is reprehensible.
       | In essence, Facebook is under fire for making the world more
       | divided. Many of its own experts appeared to agree--and to
       | believe Facebook could mitigate many of the problems. The
       | company chose not to.
       | 
       | Unless you are actively pushing to change it from the inside, you
       | should leave now. Take a reasonable amount of time to find a new
       | job and leave.
       | 
       | Otherwise you're complicit.
        
       | [deleted]
        
       | neycoda wrote:
       | The sad fact is that people choose flavorful news over verifiable
       | facts.
        
       | shaan1 wrote:
       | There are more than 1.5 billion users on Facebook. If they are
       | not worried, and want to be misused, why the hell are others so
       | hell bent on bringing down Facebook lol.
       | 
       | If the users really cared, we wouldn't be having this talk.
       | 
       | Also this is the media wanting to bring down the enemy.
        
       | JoeAltmaier wrote:
       | It's a new societal urge: the addiction to feeling righteous
       | indignation. Endorphin rush, available to anybody with a keyboard.
       | Gonna be hard to put that genie back into the bottle.
        
       | tunesmith wrote:
       | Facebook and other similar systems reward engagement. Engagement
       | happens when people are surprised. Surprise happens when people
       | come across new apparent "information". New information is most
       | easily propagated through the use of lies.
       | 
       | It follows pretty clearly. If they don't want divisiveness, they
       | have to either step away from rewarding engagement, or they have
       | to stop people from lying. They're in a bind, except it's society
       | that is bearing the cost.
        
       | sneak wrote:
       | Do we ask or expect the same of the phone or cable companies?
       | 
       | Why the agenda to (further) censor Facebook and similar?
        
         | kordlessagain wrote:
         | Note the voting on your questions as opposed to the engagement
         | of the discourse you've started. There are a percentage of
         | users who don't like you asking these questions and a
         | percentage of them who want to understand what these questions
         | mean.
         | 
         | Phone and cable companies do not create polarization because
         | they carry ALL data (usually). Services like Facebook, Twitter
         | and HN all provide the _ability_ to modify the content, in
         | place. This is done with automation (code) and we can expect
         | that automation to become more aware moving forward (AI).
         | 
         | This ability to modify content in place by the companies
         | produces revenue at the same time it creates the _ability_ for
         | some types of divisiveness to form. Humans are divisive, under
         | certain conditions, and there isn't much that can be done
         | about it other than education about how to stop being divisive.
         | 
         | Education becomes impossible when the entities controlling the
         | channels do so in a way that prevents users from changing what type
         | of content they see (such as education about how to avoid
         | divisiveness), maybe due to the fact it kills revenue.
         | 
         | Worse, the more choice you give users (free, decentralized
         | internet anyone?), the more some users will choose to introduce
         | behaviors that give way to divisiveness in a given group.
         | Trolls using imagery to build propaganda filled stories.
         | 
         | Trolls have taken over the Republican party, if nobody has
         | figured this out by now. Note how they use strong imagery to
         | glue their never-ending stories together.
         | 
         | It's a no-win situation. The best thing to do is simply walk
         | away from it or maybe build a personal search engine AI crawler
         | thing that works for just you and only you.
         | 
         | "Here's your content, Boss." "Thanks, Sidekick!"
        
         | smacktoward wrote:
         | Phone and cable companies are _extensively_ regulated.
        
         | bitcurious wrote:
         | If you mean that Facebook should be regulated as a utility, by
         | all means make that argument - I think you'll find broad
         | support.
         | 
         | As it is, Facebook is constantly making editorial decisions in
         | terms of what content is shown (which posts, in what order,
         | with what presentation). Their own research had found that some
         | of those editorial decisions have externalities in the form of
         | increasing social conflict. Rather than take steps to address
         | it, or even research this question more, they washed their hands
         | of it.
        
           | dorkinspace wrote:
           | > If you mean that Facebook should be regulated as a utility,
           | by all means make that argument - I think you'll find broad
           | support.
           | 
           | In this case, would Facebook become compulsory for American
           | citizens?
        
         | [deleted]
        
         | rb808 wrote:
         | Agreed, if you look at cable news channels these day its all
         | about division and fighting.
        
         | AlexandrB wrote:
         | Phone companies don't set up incentive structures that
         | encourage a certain kind of content. Facebook has an
         | "algorithmic" feed, likes, and "engagement" metrics that
         | reward certain behaviours and punish others. They are rightly
         | being pilloried when these incentives encourage and promote
         | constant outrage, conspiracies, and completely fact-free fear
         | mongering.
        
           | dleslie wrote:
           | This is a thought-provoking answer, and it shows how Facebook
           | (and others) are straddling the line between publisher and
           | platform.
           | 
           | IMHO, because they do perform curation, both algorithmic and
           | manual, they should be considered publishers.
        
             | sneak wrote:
             | Seems to me that their argument against censorship should
             | be even stronger, then, as editorializing is protected
             | expression.
        
           | catalogia wrote:
           | > _Phone companies don't set up incentive structures that
           | encourage a certain kind of content._
           | 
           | I'm not convinced of that. Through technical and billing
           | means, phones encourage one-on-one conversations while
           | discouraging conversations with multiple participants. By
           | disincentivizing certain kinds of conversations, they
           | disincentivize certain kinds of content. It's hard to say
           | exactly what sort of impact this may have on society, but I
           | doubt it doesn't have any.
           | 
           | This may be a far cry from Facebook's deliberate algorithmic
           | tweaking to manipulate the emotions of their users, but I
           | think it's interesting to consider in its own right.
        
         | tobib wrote:
         | My guess is it's because phone or cable companies don't apply
         | comparable measures, at least not that I know of. Targeted
         | content for example.
        
       | TheAdamAndChe wrote:
       | Anyone know a good paywall workaround for wsj?
        
         | kordlessagain wrote:
         | Full page screen capture plugin on Chrome plus a community that
         | posts to an IPFS node and updates some decentralized search
         | thing to be able to find it?
        
       | redorb wrote:
       | I've asked the question - what if FB went for bartender rules? No
       | politics, no religion... Sometimes I feel like those are 65% of
       | the content.
        
       | contemporary343 wrote:
       | Every platform ultimately makes choices in how users engage with
       | it, whether that goal is to drive up engagement, ad revenues or
       | whatever metric is relevant to them. My general read is that
       | Facebook tries to message that they're "neutral" arbiters and
       | passive observers of whatever happens on their platform. But they
       | aren't, certainly not in effect, and possibly in intent either.
       | To preserve existing algorithms is not by definition fair and
       | neutral!
       | 
       | And in this instance, choosing not to respond to what its
       | internal researchers found is, ultimately, a choice they've made.
       | In theory, it's on us as users and consumers to vote with our
       | attention and time spent. But given the society-wide effects of a
       | platform that a large chunk of humanity uses, it's not clear to
       | me that these are merely private choices; these private choices
       | by FB executives affect the commonweal.
        
         | AlexandrB wrote:
         | It's pretty laughable for Facebook to claim they're neutral
         | when they performed and published[1] research about how
         | tweaking their algorithm can affect the mood of their users.
         | 
         | [1]
         | https://www.theatlantic.com/technology/archive/2014/06/every...
        
           | 1propionyl wrote:
           | Even if they hadn't done that, it would still be a laughable
           | claim prima facie.
           | 
           | There's something of an analogue to the observer effect: that
           | the mere observation of a phenomenon changes the phenomenon.
           | 
           | Facebook can be viewed as an instrument for observing the
           | world around us. But it is one that, through being used by
           | millions of people and
           | personalizing/ranking/filtering/aggregating, effects
           | change in the world.
           | 
           | Or to be a little more precise, it structures the way that
           | its users affect the world. Which is something of a
           | distinction without much difference, consequentially.
        
         | pm90 wrote:
         | If the private platform is de facto the primary source of news
         | for the majority of the population, this affects the public in
         | incredible ways. I don't understand how the US Congress does
         | not recognize and regulate this.
        
           | 1propionyl wrote:
           | "It is difficult to get a man to understand something, when
           | his [campaign fundraising] depends on his not understanding
           | it." - Upton Sinclair (lightly adapted)
        
       | kingkawn wrote:
       | The implications of being correct are in for a few tweaks
        
       | ggggtez wrote:
       | >Another concern, they and others said, was that some proposed
       | changes would have disproportionately affected conservative users
       | and publishers, at a time when the company faced accusations from
       | the right of political bias.
       | 
       | This is the same thing they were worried about in the lead-up to
       | the 2016 election when they fired their newsroom for not
       | promoting pizzagate and other conspiracies that would be deemed
       | as "biased" against conservatives. And they clearly still haven't
       | learned anything about why letting engagement algorithms run wild
       | is bad for society.
        
       | [deleted]
        
       | beepboopbeep wrote:
       | This is why twitter and facebook don't have dislike buttons. By
       | removing a quick and easy way of voicing dissent to a point,
       | people take to the comments to verbally punish others. For a site
       | that is dependent on user engagement,
       | anger/outrage/frustration/negativity in general is a gold mine. I
       | remember when reddit tried removing the down vote button the
       | comments got NASTY. They backpedaled very quickly from that
       | decision.
        
       | kisna72 wrote:
       | I don't understand why this is a surprise to anyone lol
        
         | kgin wrote:
         | There's a difference between suspicion and confirmation
        
       | dafty4 wrote:
       | If there is an effort to broker civil and constructive debate,
       | isn't division fine?
        
       | noizejoy wrote:
       | As Billie Eilish once said: "duh"!
       | 
       | More seriously: Arms dealers are not exactly benefitting from
       | facilitating peace-making efforts either, so economically this
       | makes all the sense in the world to me.
        
       | MattGaiser wrote:
       | Reddit has the same issues of division and does not do anything
       | as a company to sort people. It all comes down to the individuals
       | themselves.
       | 
       | Is division really all that new or can we just see it more now?
        
         | thrwn_frthr_awy wrote:
         | > Is division really all that new or can we just see it more
         | now?
         | 
         | In the article, Facebook themselves say they measured the
         | increase and knew their algorithms caused it.
        
         | newacct583 wrote:
         | It's a little different. Reddit doesn't choose the content
         | presented to users; it lets the community self-sort into
         | community-managed subreddits with their own cultures,
         | preferences, and voting behavior. In fact Reddit only barely
         | exerts any control over the selection of subreddit
         | moderators (mostly stepping in only to resolve things in
         | extremis).
         | 
         | Facebook's algorithms decide on _everything_ in your feed.
         | If you aren't interested in politics on Reddit you might
         | never see it at all. If Facebook thinks you might be a
         | Republican (and often that's just a demographic thing
         | coupled with a few past clicks on political stories), they
         | will _literally fill your screen_ with paid advertising
         | designed to drive your political preferences.
         | 
         | The point is that division is visible on Reddit (and
         | everywhere), but _driven and encouraged_ by Facebook. And that
         | these are different phenomena. I'm not completely sure I
         | agree, but the point isn't as simple as "division exists".
        
       | trekrich wrote:
       | they don't want the narrative changing from extreme left to
       | center.
        
       | kolbe wrote:
       | Funny. I'm sure the Wall Street Journal knows the same thing, but
       | reaps profits from it as well.
        
       | im3w1l wrote:
       | What can you realistically do? The alternatives as I see it
       | are:
       | 
       |   Riling people up (showing different opinions)
       |   Echo-chambers (showing same opinion)
       |   Sweeping issues under the rug (showing neither)
        
       | arbuge wrote:
       | "It is difficult to get a man to understand something when his
       | salary depends upon his not understanding it." - Upton Sinclair
        
       | engineer_22 wrote:
       | Nationalize Facebook.
        
         | tenebrisalietum wrote:
         | People are reflexively downvoting you, but actual discussion
         | would be nice.
         | 
         | After all, the postal service is in the constitution, so this
         | country started with an essential communication service
         | nationalized from the outset.
         | 
         | What if Facebook was government funded and supported in the
         | same manner, instead of privately advertiser-funded?
        
         | wtfno009887466 wrote:
         | ROFL, who's the worse privacy violator, Facebook or the NSA?
        
           | ouid wrote:
           | Facebook. With absolute certainty.
        
             | crocodiletears wrote:
             | NSA's a black box whose sole purpose is the aggregation and
             | analysis of any information with potential relevance to US
             | national security. It has the capacity to compel or
             | infiltrate companies like Facebook to make them cooperate
             | with its goals, and it has data-sharing agreements with
             | multiple nations. Privacy violation isn't a side-effect
             | of its business model; it's its raison d'etre.
             | 
             | I wouldn't dismiss NSA so offhandedly along this metric,
             | even if it's ostensibly more constrained along legal
             | boundaries.
        
           | engineer_22 wrote:
           | why pretend, just roll it all into one
        
       | jimmaswell wrote:
       | They decided against paternalistic meddling and let discourse
       | happen naturally? That sounds best to me. I don't want Facebook
       | to be a school teacher hovering over a lunch table to make sure
       | nobody swears. People posting "divisive" content is far
       | preferable to the alternative.
        
         | [deleted]
        
         | lostmyoldone wrote:
         | It's not people posting divisive content that is the big
         | problem; the big problem is divisive content getting all the
         | eyeballs, causing people (due to completely normal human
         | psychology) to believe everyone is either completely against
         | them, or completely with them, and nothing in between.
         | 
         | Even disregarding anything but mental and physical health, the
         | consequences are significant and quite real.
         | 
         | No, they don't need to become the gatekeeper of all "bad
         | things"(tm) the same way they protect us from accidentally
         | gazing at a terrifying nipple, that would be preposterous, but
         | they could probably try a little harder to not act completely
         | opposite to their users' best interests as often as they do.
         | 
         | Especially when that happens to be a significant fraction of
         | all the people on earth, that's probably not too big of an ask?
        
         | banads wrote:
         | If FB wanted to "let discourse happen naturally" and not be
         | paternalistic, they wouldn't use an opaque, non-chronological
         | algorithm to control who gets to see what in such a way that
         | primarily benefits FB's bottom line.
         | 
         | What is this alternative you speak of?
        
           | quotemstr wrote:
           | Optimizing for engagement does not favor any particular
           | viewpoint. The authors of this article are incensed that
           | Facebook doesn't engage in more _viewpoint-based_ adjustment
           | of the conversation. Favoring or disfavoring a post based on
           | the viewpoint it expresses is very different from optimizing
           | an algorithm to give a user more of what he wants, whatever
           | that is.
        
             | skosch wrote:
             | That's a misunderstanding of the problem.
             | 
             | Optimizing for engagement tends to favour _extreme,
             | simplistic,_ and _highly emotional_ viewpoints. In other
             | words, it caters to human nature. This tendency is harmful
             | to rational discourse, regardless of whether or not you
             | happen to agree with any given viewpoint.
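
[Editor's illustration] The point above can be sketched with a toy
ranker. All posts and engagement scores below are invented for
illustration: the ranking function never inspects viewpoint at all,
yet if emotionally charged posts are predicted to draw more clicks,
they still float to the top of the feed.

```python
# Toy feed ranker (hypothetical data): sorting purely by predicted
# engagement is viewpoint-blind, but it systematically surfaces
# whatever content people react to most strongly.

posts = [
    {"text": "Nuanced 2,000-word policy analysis", "predicted_engagement": 0.02},
    {"text": "Cute photo of a neighbor's dog",     "predicted_engagement": 0.15},
    {"text": "THEY are DESTROYING everything!!!",  "predicted_engagement": 0.40},
]

def rank_feed(posts):
    """Order the feed by predicted engagement alone; content is never
    examined, only the engagement score."""
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

feed = rank_feed(posts)
for p in feed:
    print(f'{p["predicted_engagement"]:.2f}  {p["text"]}')
```

The outrage post ends up first without any "favoring" of its viewpoint,
which is the tension between the two comments above.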
        
       | JackFr wrote:
       | I simply cannot understand the motivation of people who seemingly
       | want to be made angry.
       | 
       | I've had friends tell me I'm just burying my head in the sand, but
       | I don't think I am. I'm trying my best not to be manipulated into
       | a worse emotional state. I don't go on Facebook anymore because I
       | realize that objectively time spent on Facebook made me less
       | happy.
        
       | mwfunk wrote:
       | It just feels like weaponized Usenet from the mid-'90s, or almost
       | every popular online forum since then. Multiplayer game
       | communities even. They're like tinderboxes for negativity. Very
       | small numbers of bad faith actors (griefers, trolls, scammers,
       | spammers, or just plain assholes) can trivially derail entire
       | communities. Even without people trying to screw everything up,
       | plain old human nature, and the nature of electronic
       | communications, can make it happen as well. It just takes a
       | little longer.
       | 
       | Put another way, each flame begets one or more flames, whereas
       | each good comment might get responses but maybe it stands on its
       | own. Over time the signal to noise ratio of any forum tends to
       | degrade to nothing as the forum becomes more popular because of
       | this. Moderation, scoring systems, etc. can ameliorate this but
       | in general the less specialized the forum, the worse it is. It's
       | like entropy in that it only goes in one direction, it's just a
       | matter of time and how much you can push back on it. Bad comments
       | beget more bad comments, but good comments don't necessarily
       | beget more good comments. And at some point, the ratio of bad
       | comments to good comments drives away any potential good
       | commenters and the event horizon is crossed and the forum dies.
       | Or it lives on as a cesspool for whatever.
       | 
       | The difference between Facebook and Twitter in 2020 vs
       | comp.os.linux (or whatever) in 1995 is that it's not specialists
       | screaming at each other about which distro or programming
       | language or OSS license is best (or worst). It's a much wider net
       | of far less informed or rational people, encouraged to argue
       | about infinitely dumber and less knowable or debatable stuff.
       | It's like scammy clickbait, but for arguments rather than clicks.
       | The other difference between Facebook and Twitter in 2020 vs
       | online communities of the past is that Facebook and Twitter make
       | money off of it. All this BS fuels "engagement" and keeps larger
       | volumes of people posting and therefore revealing themselves to
       | trackers and creating a stream of ad views for the platform
       | owners. At some point I do think the toxicity of the platforms
       | will start costing them users, but that doesn't seem to be
       | happening anytime soon.
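
[Editor's illustration] The "each flame begets one or more flames"
claim above can be sketched as a branching process. The reproduction
numbers here are assumptions chosen only to illustrate the shape of
the argument: if a flame provokes more than one reply on average while
a good comment provokes fewer than one, flames compound and the
signal-to-noise ratio decays over generations.

```python
# Branching-process sketch (invented parameters): expected population
# after n generations when each item spawns `reproduction_rate`
# follow-ups on average.

def expected_count(initial, reproduction_rate, generations):
    """Expected descendants after the given number of generations."""
    return initial * reproduction_rate ** generations

R_FLAME, R_GOOD = 1.5, 0.8   # assumed averages, not measured values

for gen in range(0, 9, 2):
    flames = expected_count(10, R_FLAME, gen)
    good = expected_count(10, R_GOOD, gen)
    print(f"gen {gen}: flames ~{flames:6.1f}  good ~{good:5.1f}  "
          f"signal/noise ~{good / flames:.3f}")
```

With these assumed rates the forum starts at parity, but within a few
generations flames dominate, matching the "entropy" intuition above.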
        
       | geori wrote:
       | So Neal Stephenson wrote a book about this - at least the good
       | half of the book -
       | https://en.wikipedia.org/wiki/Fall;_or,_Dodge_in_Hell
       | 
       | It's set 20 years in the future, where Facebook and similar
       | services are much, much worse. Wealthy people pay for editors
       | to remove misinformation from their feeds. The country gets
       | bifurcated, with coastal elites having access to editors while
       | flyover country ("Ameristan") turns into a conspiracy-plagued
       | wasteland.
        
       | LogicRiver wrote:
       | Facebook thrives on being able to create divides and bias among
       | its users, good or bad.
        
       | rdxm wrote:
       | said it before, will say it again. FB is a cancer on our society
       | and species (moreover social media is the same in this
       | dimension). Will end up being the cigarettes/smoking of these
       | generations.
       | 
       | I feel sorry for my children growing up with this disastrous
       | influence in their lives.
        
       | iamspoilt wrote:
       | Link without paywall: http://archive.vn/YQeJY
        
       | alpineidyll3 wrote:
       | Regulators need to stop giving tech giants a pass on common
       | carrier liability. It would solve a lot of problems overnight.
        
       | aantix wrote:
       | Sheryl Sandberg wants males to "lean in" and assume a more
       | cooperative role.
       | 
       | But then she actively supports the most divisive platform in
       | history. A perfect dismount in her mental gymnastic routine.
        
       ___________________________________________________________________
       (page generated 2020-05-26 23:00 UTC)