[HN Gopher] Users post more falsehoods after others correct them...
       ___________________________________________________________________
        
       Users post more falsehoods after others correct them: study
        
       Author : rbanffy
       Score  : 54 points
       Date   : 2021-05-25 11:11 UTC (11 hours ago)
        
 (HTM) web link (news.mit.edu)
 (TXT) w3m dump (news.mit.edu)
        
       | avivo wrote:
       | Note that the full story is far more complicated than this
       | headline suggests. Some of the same authors show that "gently
       | nudging users to think about accuracy increases quality of news
       | shared".
       | 
       | Paper: https://www.nature.com/articles/s41586-021-03344-2
       | 
       | Tweet summary from PI of both studies:
       | https://twitter.com/DG_Rand/status/1372217700626411527?s=20
        
       | eplanit wrote:
       | "they retweeted news that was significantly lower in quality and
       | higher in partisan slant, and their retweets contained more toxic
       | language."
       | 
       | Change the word "retweet" to "publish/broadcast", and that
       | describes most "mainstream" journalism.
       | 
       | I stopped when they referenced Snopes as a "fact checker"....the
       | same Snopes that "fact-checked" a satirical article in the
       | Babylon Bee about whether Brett Kavanaugh should prove via DNA
       | that he's not the son of Hitler. [1]
       | 
       | The authors should write an article on the more fundamental issue
       | of 'truthiness' -- it was once a joke by Stephen Colbert, but now
       | people actually talk in terms of "my truth" and "her truth" --
       | not _the_ truth.
       | 
       | If gender is a state of mind, then why not race, or even species?
        | With objectivity lost, they're surely unlikely to find its
       | vestiges on Twitter.
       | 
       | [1] https://www.snopes.com/fact-check/democrats-demand-
       | kavanaugh...
        
         | ccn0p wrote:
         | 2+2=5
        
         | Mordisquitos wrote:
         | > If gender is a state of mind, then why not race, or even
         | species?
         | 
         | Race is absolutely a state of mind, more so than gender. All
         | else being equal, a person who would identify as "POC" in the
         | USA would simply be seen as "white" in many other Western
         | countries.
        
           | pjc50 wrote:
           | > Race is absolutely a state of mind, more so than gender.
           | 
           | Yes ...
           | 
           | > All else being equal, a person who would identify as "POC"
           | in the USA would simply be seen as "white" in many other
           | Western countries.
           | 
           | I'm struggling to think of what you mean here? The 20th
           | century saw the transition of various southern european
           | ethnicities into being "white" where they might previously
           | not have been. And also a brutal war in the ruins of
           | Yugoslavia among ethnicities that outsiders would find hard
           | to distinguish and label together as "white".
           | 
           | Race is a state of mind - both in oneself _and in the eyes of
           | other people_.
        
             | Mordisquitos wrote:
             | > I'm struggling to think of what you mean here?
             | 
              | An example would be how a broadly non-American audience
              | commented on a video entitled _"People Guess Who is White
              | In a Group of Strangers"_ (though, to be fair, some
              | American commenters were equally shocked) [0].
             | 
             | You are absolutely right about the huge variety of ethnic
             | identities and conflicts in different cultures, and how
             | they cannot be understood under the mores of a different
             | culture. That is the spirit of my comment, offering a
             | counterpoint to the implication that the idea of "race" is
             | somehow objective.
             | 
             | [0] https://old.reddit.com/r/ShitAmericansSay/comments/7d4a
             | ff/sa...
        
         | jb775 wrote:
         | People on the left don't realize that the "fact checkers" are
         | an extension of the propaganda arm of the left.
         | 
         | For anyone who disagrees, take a look at the "fact-checker"
         | Twitter profiles here and tell me with a straight face they
         | aren't literal mouth puppets of the extreme left and/or the
         | communist party: https://www.truthorfiction.com/our-team/
        
         | tzs wrote:
         | > I stopped when they referenced Snopes as a "fact
         | checker"....the same Snopes that "fact-checked" a satirical
         | article in the Babylon Bee about whether Brett Kavanaugh should
         | prove via DNA that he's not the son of Hitler.
         | 
         | Are you suggesting that when something that is false is being
         | circulated as true, fact checkers should ignore it if it
         | originated on a satire site?
        
           | eplanit wrote:
           | It's amazing how many people think that the Babylon Bee isn't
           | satire, while there's little confusion over the Onion. Both
           | state clearly that they're satire. Does Snopes fact-check the
           | Onion? Do you think they should?
        
             | tzs wrote:
             | If an Onion article gets widely circulated on social media
             | in such a way that people aren't realizing it is satire and
             | people ask Snopes about it, then yes, they should do a fact
             | check article on it stating that it was satire that
             | originated at The Onion.
             | 
             | > It's amazing how many people think that the Babylon Bee
             | isn't satire, while there's little confusion over the
             | Onion.
             | 
             | I think there are a few things that contribute to this.
             | 
             | 1. The way sharing works on many social media sites is that
             | when you share a link to a story at some other site the
             | social media site embeds the story in your post. It links
             | back to the original site, but many people who see your
             | post will just read the embedded story rather than click to
             | go read it on the original site. They see the satirical
             | article by itself rather than in the context of a whole
             | site full of satirical articles, making it easier to not
             | realize it is satire.
             | 
             | 2. The Onion is more well known.
             | 
             | 3. Due to things like Q, the Bee's satire is making claims
             | that are often _less_ wild than things that people already
             | believe, reducing the chances that they will realize it is
             | satire.
        
         | stirfish wrote:
         | >If gender is a state of mind, then why not race, or even
         | species?
         | 
         | Why do you need gender and race to be objective?
         | 
         | As for species, we've gotten that wrong too
         | https://www.amazon.com/gp/aw/d/019501958X
        
           | [deleted]
        
       | at_a_remove wrote:
       | I have seen quite a lot of "corrections" that are agonizingly
       | smug technicalities, flaming strawmen, and the like. Those
       | attempts to "debunk" whatever usually make me question what
       | _else_ this group has been playing word-games about.
       | 
       | Often, it comes with some intensity, then a completely
       | hypocritical stance on something _else_ and again I wonder about
       | the integrity.
       | 
       | It can be done, but it must be done cleanly, unimpeachably, and
       | with ground given when "your side" is wrong.
        
         | nimih wrote:
         | It's worth pointing out that this research was conducted by
         | arguing with people on twitter via a bunch of automated bots.
         | It's not really that surprising that people will [be slightly
         | more likely to] double down if they notice that the person
         | trying to talk them out of their beliefs is part of a legion of
         | literal robots.
        
           | ronsor wrote:
           | Especially when those same groups already complain about bots
           | invading their discussions.
        
         | MaxBarraclough wrote:
         | > I have seen quite a lot of "corrections" that are agonizingly
         | smug technicalities, flaming strawmen, and the like.
         | 
         | I'm reminded of Paul Graham's _hierarchy of disagreement_.
         | [0][1]
         | 
         | [0]
         | https://en.wikipedia.org/wiki/Paul_Graham_(programmer)#Graha...
         | 
         | [1] http://www.paulgraham.com/disagree.html
        
       | RcouF1uZ4gsC wrote:
       | The results remind me of the following passage from Dale
       | Carnegie's How to Win Friends and Influence People:
       | 
       | Shortly after the close of World War I, I learned an invaluable
       | lesson one night in London. I was manager at the time for Sir
       | Ross Smith. During the war, Sir Ross had been the Australian ace
       | out in Palestine; and shortly after peace was declared, he
       | astonished the world by flying halfway around it in thirty days.
       | No such feat had ever been attempted before. It created a
       | tremendous sensation. The Australian government awarded him fifty
       | thousand dollars; the King of England knighted him; and, for a
       | while, he was the most talked-about man under the Union Jack. I
       | was attending a banquet one night given in Sir Ross's honor; and
       | during the dinner, the man sitting next to me told a humorous
       | story which hinged on the quotation "There's a divinity that
       | shapes our ends, rough-hew them how we will." The raconteur
       | mentioned that the quotation was from the Bible. He was wrong. I
       | knew that. I knew it positively. There couldn't be the slightest
       | doubt about it. And so, to get a feeling of importance and
       | display my superiority, I appointed myself as an unsolicited and
       | unwelcome committee of one to correct him. He stuck to his guns.
       | What? From Shakespeare? Impossible! Absurd! That quotation was
       | from the Bible. And he knew it. The storyteller was sitting on my
       | right; and Frank Gammond, an old friend of mine, was seated at my
       | left. Mr. Gammond had devoted years to the study of Shakespeare.
       | So the storyteller and I agreed to submit the question to Mr.
       | Gammond. Mr. Gammond listened, kicked me under the table, and
       | then said: "Dale, you are wrong. The gentleman is right. It is
       | from the Bible." On our way home that night, I said to Mr.
       | Gammond: "Frank, you knew that quotation was from Shakespeare."
       | "Yes, of course," he replied, "Hamlet, Act Five, Scene Two. But
       | we were guests at a festive occasion, my dear Dale. Why prove to
       | a man he is wrong? Is that going to make him like you? Why not
       | let him save his face? He didn't ask for your opinion. He didn't
       | want it. Why argue with him? Always avoid the acute angle." The
       | man who said that taught me a lesson I'll never forget. I not
       | only had made the storyteller uncomfortable, but had put my
       | friend in an embarrassing situation. How much better it would
       | have been had I not become argumentative.
        
         | jb775 wrote:
          | Apples to oranges. Your providing this as an equivalent example
         | to the current state of politics-based censorship is a more
         | telling embodiment of the situation.
         | 
         | The example here is demonstrably provable by pulling out a copy
         | of Hamlet and flipping to Act Five, Scene Two. When Twitter
         | shadow-bans accounts and hides hashtags of discussion they
         | don't agree with politically, they aren't able to do this.
          | Their thought process is clouded by the _illusion_ that
          | they're _positively_ not-wrong... but when questioned or
          | presented with counter-evidence, they defer to Hitchens's
          | razor (a flawed methodology) and put wax in their ears.
        
         | malloryerik wrote:
        | An interesting and apropos passage, though it might be improved
        | by a caveat: while the source of the quote in the Sir Ross
        | anecdote was an unimportant detail, falsehoods on Twitter are
        | often both the primary content of a tweet and highly relevant
        | to important social and political issues. Many tweets are
        | outright slanderous.
        
         | exo-pla-net wrote:
         | You will win more friends with a "never correct others" policy.
         | However, what if belief in the misinformation has deadly
        | consequences? Steve Jobs died after seeking "alternative"
        | cancer treatment. Had someone convincing spoken up when
        | acupuncture was being discussed around him, someone able to
        | change his mind and get him to recognize that acupuncture is a
        | sham, Jobs might still be alive.
        
           | muffinman26 wrote:
           | How To Win Friends and Influence People doesn't actually
           | advocate never trying to convince people of an opposing
           | viewpoint. Carnegie argues that you'll never be able to
           | convince people of something by correcting them directly, but
           | that you can persuade people if you express genuine interest
           | in their opinion, listen well, ask questions, and allow the
           | other person to save face by presenting the correction as
           | something they came to on their own. Whereas, if you correct
           | someone directly, no matter how solid your facts, they are
           | more likely to feel attacked and double down.
        
           | sigstoat wrote:
           | > You will win more friends with a "never correct others"
           | policy. However, what if belief in the misinformation has
           | deadly consequences?
           | 
           | where do you think the line should be drawn? dying is pretty
           | clearly terrible.
           | 
           | what if someone is going to invest all of their money in a
           | ponzi scheme? or put 50% of their savings into GME? or invest
           | in a high expense ratio index fund?
           | 
           | we've got to ignore some mistakes and errors on the part of
           | others.
        
             | exo-pla-net wrote:
             | I'm glad you've decided to let me arbitrate this matter.
             | Here is my official misinformation heuristic, three
             | questions to ask oneself before acting:
             | 
             | 1) Is this misinformation likely to convince and harm a
             | great number of people?
             | 
              | 2) Could belief in this misinformation result in severe
              | health or emotional damage, even for a single person who
              | believes it?
             | 
             | 3) If yes to either, are you an expert on the issue,
             | relative to the person or group making the claim?
             | 
             | If you're indeed an expert, try to save the person or
             | people from their downfall.
             | 
             | If you're not an expert, read up on the issue before
             | meddling. God forbid it's _you_ who is wrong.
             | 
             | And, of course, if no significant harm is likely, ignore
             | it.
        
         | raincom wrote:
          | Great example. This is also an issue for many doctors treating
          | terminally ill patients. Of course, it is true that the patient
         | will die in a month or two. What should a doctor do? Tell the
         | truth to the patient to screw him emotionally? Or keep silent?
         | Or lie to the patient?
        
       | [deleted]
        
       | ffggvv wrote:
       | > The study was centered around a Twitter field experiment in
       | which a research team offered polite corrections, complete with
       | links to solid evidence, in replies to flagrantly false tweets
       | about politics.
       | 
        | yeah i don't really trust or believe these "researchers". case
        | in point from "fact checkers":
       | 
       | https://twitter.com/GlennKesslerWP/status/125626793122004992...
       | 
       | https://twitter.com/GlennKesslerWP/status/139716616659076711...
       | 
       | and snopes in particular is very biased. i remember when they
       | labeled a claim by aoc as "factually untrue but morally correct"
        
       | jjk166 wrote:
       | I think we need to take a look at how we deal with the same issue
       | in face to face interactions. Calling out someone who is
       | incorrect in a group setting often goes poorly - you seem
       | obnoxious and they need to double down on their rhetoric to save
       | face. Even doing so "politely" just makes you seem more like a
       | dick. While that may sound irrational, think about how you'd
       | react at a conference where one person in the audience shouted
       | "that's not true! this arcane reference which no one in the
        | audience has evaluated claims that's just a myth!" In general,
        | you'd think they were at best misguided, if not trying to be
        | deliberately provocative. What are the odds that you would go
       | home that night and look up the reference the heckler cited?
       | 
       | Far more effective is to take someone aside and tell them in
       | private that they are incorrect, especially when combined with
       | acknowledgement of what they got right. In that context you are
       | not attacking their social position and thus the stakes are
       | lower. While it is hard to simulate such a position of privacy
       | and trust in an online setting, private direct messages on most
       | platforms are a good start. Anecdotally, people seem much more
       | receptive to a respectful direct message and conversation seems
       | much more cordial and focused on pursuit of knowledge. While I
       | have no evidence that this leads to long term positive changes in
       | posting behavior, I'd be willing to wager it has a better long
       | term outcome.
        
       | [deleted]
        
       | slibhb wrote:
       | Often "correcting falsehoods" is like trying to talk someone out
       | of believing in God by explaining evolution. Or trying to argue
       | with someone who believes in "systemic white supremacy" by
       | pointing out that Asian-Americans earn more money than white
       | Americans.
       | 
       | I don't know what to do in these situations. You can't talk
       | people out of their values but it is better when people express
       | their values in a moderate way that aligns with empirical facts.
       | Probably there's some polite way to express your disagreement
       | without saying "actually here's why you're wrong".
        
         | enriquto wrote:
         | The Socratic method is good for that. Instead of saying "you
         | are wrong", you ask a series of questions that induce a
          | contradiction in the other person. This is often deemed
          | trolling or sealioning when the discussion is unwelcome (which
          | it often is), but when people are open-minded it is a respectful
         | way to argue.
        
           | deathanatos wrote:
           | This is way harder than it sounds. You can ask the followup
           | questions, but too often, they'll get answered.
           | Contradictions, even pointed out, aren't contradictions _to
           | them_. Hitting a contradiction in the first place requires
            | that the person being questioned apply a consistent model
            | to the questions, and most of the time, they don't. And
           | getting over that requires admitting that they're wrong.
           | 
           | When I am relating such conversations to my SO, it's often
           | with the (modified) phrase "you can drown a horse in water,
           | but you can't force it to drink."
        
           | jfengel wrote:
           | I'm one of those who call Socrates a troll. The problem isn't
           | just that it's unwelcome but that it's unproductive. It
           | proves only that the person isn't capable of supporting their
           | own premise, not that the premise is wrong. It doesn't lead
           | to truths on its own, and doesn't point in the direction of
           | improved hypotheses.
           | 
           | Socrates then makes the assertion that he knows nothing, and
           | is therefore immune to such treatment (and is thus superior).
           | He's not putting himself on the line -- exactly the kind of
           | thing that trolls do.
           | 
           | If Socrates asks you what "virtue" is, what can you say
           | except, "I dunno. Why are you asking? What is it you actually
           | want to know?"
           | 
           | Modern Socratic method isn't really all that similar to what
           | Socrates actually did. It's intended to be cooperative,
           | rather than adversarial. It's nominally based on the dialogue
           | in Meno, which is really more about epistemology than about
           | pedagogy (and which draws Socrates to some weird conclusions
           | about past lives).
           | 
           | Even so, it's not really meant to be argumentation. It's not
           | between equals. The teacher leads the student to "discover"
           | the truth that the teacher already knows. Not just knows, but
           | knows so thoroughly that they can guide the student around
           | all of the possible mis-steps.
           | 
           | I'm all for respectful dialogue, but that's not really what
            | either Socrates or the modern pedagogues who take
           | inspiration from him are doing. I'll be honest that I've got
           | disagreements with the notion of respectful dialogue as well,
           | but they're off-topic here.
        
             | enriquto wrote:
             | > It proves only that the person isn't capable of
             | supporting their own premise, not that the premise is
             | wrong. It doesn't lead to truths on its own,
             | 
              | But that's the only thing we can aspire to! Any statement exists
             | because somebody is stating it. You cannot really "have" a
             | truth that is not held by anybody; that means that you
             | still have to find it. The Socratic method thus serves to
             | find a person that is able to hold a certain premise, by
             | sieving away all the people who are not. Notice that this
             | does not yet mean than the premise is true, but it is a
             | necessary condition.
             | 
             | > and doesn't point in the direction of improved
             | hypotheses.
             | 
             | I do not know of any systematic method that does that. Do
             | you? It seems to be a purely creative, not inductive,
             | process.
             | 
             | Regarding the "trollishness" character of Socrates I agree
             | with you. If Socrates was born again today, we (the
             | society) would kill him again.
        
         | stirfish wrote:
         | > Or trying to argue with someone who believes in "systemic
         | white supremacy" by pointing out that Asian-Americans earn more
         | money than white Americans.
         | 
         | White Supremacy is the belief that white people are inherently
         | superior to everyone else and should dominate, regardless of
         | how much money anyone makes. There are plenty of poor white
         | supremacists.
         | 
         | Did you mean systemic social privilege?
         | https://en.wikipedia.org/wiki/Social_privilege
         | 
         | >I don't know what to do in these situations.
         | 
         | Empathy and understanding.
        
           | slibhb wrote:
           | This is what I meant: https://en.wikipedia.org/wiki/White_sup
           | remacy#Academic_use_o...
        
             | stirfish wrote:
             | > The term white supremacy is used in some academic studies
             | of racial power to denote a system of structural or
             | societal racism which privileges white people over others,
             | regardless of the presence or the absence of racial hatred.
             | According to this definition, white racial advantages occur
             | at both a collective and an individual level (ceteris
             | paribus, i. e., when individuals are compared that do not
             | relevantly differ except in ethnicity). Legal scholar
             | Frances Lee Ansley explains this definition as follows:
             | 
             | >By "white supremacy" I do not mean to allude only to the
             | self-conscious racism of white supremacist hate groups. I
             | refer instead to a political, economic and cultural system
             | in which whites overwhelmingly control power and material
             | resources, conscious and unconscious ideas of white
             | superiority and entitlement are widespread, and relations
             | of white dominance and non-white subordination are daily
             | reenacted across a broad array of institutions and social
             | settings.
             | 
             | This makes sense to me, and doesn't seem related to how
             | much money Asian Americans make.
             | 
             | Have you heard of redlining? It might make "systemic white
             | supremacy" make more sense in an American context. If
             | you've heard of Martin Luther King Jr, this is why he was
             | assassinated.
             | 
             | https://en.m.wikipedia.org/wiki/Redlining
        
       | 3np wrote:
       | What are the odds that the Twitter accounts that the researchers
       | base their findings on were also bots? It's an interesting
       | thought to entertain with intertwined mutual feedback loops of
       | assisted automated systems all assuming each other to be human.
        
         | [deleted]
        
       | JohnWhigham wrote:
       | Massive global communities were a mistake. We're not cut out for
       | them.
        
       | paulpauper wrote:
       | Whenever Bernie Sanders or some other highly polarizing
        | politician makes some sweeping claim, I have found that comments
        | can be effective, such as on Reddit, for at least setting the
        | record straight or putting it in the correct context or showing
       | counterexamples.
        
       | jb775 wrote:
       | The actual issue is that 90% of "falsehoods" aren't proven as
       | false by the WOKE gatekeepers of modern discussion...but with
       | certainty, they label and dismiss them as "falsehoods" when the
       | cognitive dissonance of their personal politics kicks in.
       | 
       | Then they say "the burden of proof lies on you", but considering
       | that digging up proof on these types of things (in the short
       | term) often relies on further discussion and research, it's kind
       | of a catch-22 to continue the debate.
       | 
       |  _Latest case in point:_ Covid was a super-virus created in a lab
       | with funding personally approved by Fauci (who is now enriching
        | himself from the debacle)... and made its way out of the lab
       | and into the general population.
       | 
       |  _Next case in point:_ The 2020 election machines actually were
       | hacked, and results were tampered with. Gasp! Censor that wrong-
       | think!
        
       | BugsJustFindMe wrote:
       | > _the researchers observed that the accuracy of news sources the
       | Twitter users retweeted promptly declined by roughly 1 percent in
       | the next 24 hours after being corrected_
       | 
       | How do you measure accuracy in single digit percents? That seems
       | impossibly precise.
        
         | playpause wrote:
         | Maybe it's aggregated. If someone retweets seven stories, and
         | four are deemed 'accurate', that's about 57%.
        
         | zksmk wrote:
         | Before: 100 users, 100 accurate retweets.
         | 
         | After: 100 users, 99 accurate retweets.
         | 
          | The study followed 2,000 users, and I can only assume they all
          | retweeted multiple times, say 10 each, so you get 20,000 retweets.
         | Seems plausible to me.
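          | 
          | As a minimal sketch (Python, with made-up numbers), here is how
          | a roughly 1 percent decline can show up in an aggregate of per-
          | retweet source-quality scores rather than in any single tweet:
          | 
          |   # Hypothetical per-retweet quality scores in [0, 1]; the real
          |   # study averages the scores of the linked news domains.
          |   before = [1.0] * 99 + [0.5]       # before the correction
          |   after  = [1.0] * 97 + [0.5] * 3   # after the correction
          | 
          |   avg = lambda xs: sum(xs) / len(xs)
          |   decline = (avg(before) - avg(after)) / avg(before)
          |   print(f"relative decline: {decline:.1%}")   # ~1% here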
        
       | floxy wrote:
       | This is essentially a textbook example from
       | 
       | Chapter 5 of Probability Theory: The Logic of Science
       | 
       | http://www.med.mcgill.ca/epidemiology/hanley/bios601/Gaussia...
       | 
       | "...The new information D is: 'Mr. N has gone on TV with a
       | sensational claim that a commonly used drug is unsafe', and three
       | viewers, Mr. A, Mr. B, and Mr. C, see this. Their prior
       | probabilities P(S|I) that the drug is safe are (0.9, 0.1, 0.9),
       | respectively; i.e. initially, Mr. A and Mr. C were believers in
       | the safety of the drug, Mr. B a disbeliever. But they interpret
       | the information D very differently, because they have different
       | views about the reliability of Mr. N. They all agree that, if the
       | drug had really been proved unsafe, Mr. N would be right there
        | shouting it: that is, their probabilities P(D|~SI) are (1, 1, 1);
       | but Mr. A trusts his honesty while Mr. C does not. Their
       | probabilities P(D|SI) that, if the drug is safe, Mr. N would say
       | that it is unsafe, are (0.01, 0.3, 0.99), respectively.
       | 
       | ...
       | 
       | Put verbally, they have reasoned as follows:
       | 
       | A) - Mr. N is a fine fellow, doing a notable public service. I
       | had thought the drug to be safe from other evidence, but he would
       | not knowingly misrepresent the facts; therefore hearing his
       | report leads me to change my mind and think that the drug is
       | unsafe after all. My belief in safety is lowered by 20.0 db, so I
       | will not buy any more.
       | 
       | B) - Mr. N is an erratic fellow, inclined to accept adverse
       | evidence too quickly. I was already convinced that the drug is
       | unsafe; but even if it is safe he might be carried away into
        | saying otherwise. So, hearing his claim does strengthen my
       | opinion, but only by 5.3 db. I would never under any
       | circumstances use the drug.
       | 
       | C) - Mr. N is an unscrupulous rascal, who does everything in his
       | power to stir up trouble by sensational publicity. The drug is
       | probably safe, but he would almost certainly claim it is unsafe
       | whatever the facts. So hearing his claim has practically no
       | effect (only 0.005 db) on my confidence that the drug is safe. I
       | will continue to buy it and use it."
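        | 
        | For concreteness, here is a minimal Python sketch of the update
        | Jaynes describes, using the numbers as quoted above (the
        | evidence shift is 10*log10 of the likelihood ratio, expressed in
        | decibels):
        | 
        |   import math
        | 
        |   priors     = {"A": 0.9,  "B": 0.1, "C": 0.9}   # P(S|I)
        |   p_d_unsafe = {"A": 1.0,  "B": 1.0, "C": 1.0}   # P(D|~S,I)
        |   p_d_safe   = {"A": 0.01, "B": 0.3, "C": 0.99}  # P(D|S,I)
        | 
        |   for v, prior in priors.items():
        |       prior_odds = prior / (1 - prior)          # odds on "safe"
        |       lr = p_d_safe[v] / p_d_unsafe[v]          # likelihood ratio
        |       post_odds = prior_odds * lr
        |       posterior = post_odds / (1 + post_odds)   # P(S|D,I)
        |       shift_db = 10 * math.log10(lr)            # evidence shift
        |       print(f"{v}: P(safe|D) = {posterior:.3f}, {shift_db:+.2f} dB")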
        
       | OrvalWintermute wrote:
       | I'd question if the abruptly clipped nature of the conversational
       | medium (twitter) leads automatically to more defensive responses.
       | Some of these communication tools lack completely the nuances of
       | human to human conversation.
       | 
       | For example, Pre-COVID, you are having a nice cup of Joe with a
       | chocolate cake at a local coffeshop, sitting in your favorite
        | chair. You happen to engage in a conversation with a stranger,
       | but someone with whom the conversation was intriguing, albeit,
       | completely out of your regular circle.
       | 
       | If the stranger, with an attentive face and no negative emotions,
       | in a stable voice, provides some completely contrary information
       | to something you think that you know how might you take it?
       | 
       | Now, as a thought exercise, instead imagine the same as a simple
       | blurb on twitter. No context, bereft of humanity, no
       | communication that this person is a friend, enemy, or merely
        | reposting talking points from whichever biased news media they
       | follow.
       | 
       | Would you think that the latter would be more likely to result in
       | a negative response, only from the medium difference (and what it
       | lacks)?
        
       | goalieca wrote:
       | > Not only is misinformation increasing online
       | 
       | I am really starting to despise this word because it is so very
        | imprecise. Yes, there are cases where people make up facts to
        | support a position and influence others, but very seldom do even
        | the experts make statements with the scientific precision and
        | nuance that some topics deserve. This word is often used to
        | politically dismiss any opinion you don't agree with.
        
       | hashkb wrote:
       | Across the board, correcting and confronting people, no matter
       | how you do it, leads them to double down, in most cases. Source:
       | reality.
        
       | gadders wrote:
       | Here's the paper:
       | https://dl.acm.org/doi/pdf/10.1145/3411764.3445642
       | 
       | Here's the Snopes "debunking" of one of the claims:
       | https://www.snopes.com/fact-check/ukraine-clinton-foundation...
       | 
       | I wouldn't call that unequivocal. Snopes re-wrote the original
       | claim to say it was referring to the Ukrainian Government, and
       | then said it was false. (The tweet did make some other claims,
       | but they were discussed in a separate article).
       | 
       | I think if you're going to convince people, maybe use a fact-
       | source that matches their political leanings.
        
         | [deleted]
        
       | MaxBarraclough wrote:
       | Surprised to see that the article makes no mention of the closely
       | related _backfire effect_ in psychology. From [0]:
       | 
       | > _given evidence against their beliefs, people can reject the
       | evidence and believe even more strongly._
       | 
       | [0]
       | https://en.wikipedia.org/wiki/Confirmation_bias#backfire_eff...
        
       | jaredwiener wrote:
       | No one wants to be fact checked.
       | 
       | https://blog.nillium.com/fighting-misinformation-online/
        
       | jsight wrote:
       | > Among other findings, the researchers observed that the
       | accuracy of news sources the Twitter users retweeted promptly
       | declined by roughly 1 percent in the next 24 hours after being
       | corrected.
       | 
       | 1 percent? How do you measure a 1 percent decline in news source
       | quality? This sentence throws doubt at the whole idea, IMO.
        
         | nicklecompte wrote:
         | The paper is online:
         | https://dl.acm.org/doi/pdf/10.1145/3411764.3445642
         | 
          | > To measure users' subsequent behavior after receiving the
          | correction, we focused on three main outcome variables. Most
          | importantly, we considered the quality of news content shared
          | by the users. We quantified the quality of news content at the
          | source level using trustworthiness scores of news domains
          | shared by the users based on a list of 60 news domains rated by
          | professional fact-checkers (this list contains 20 fake news, 20
          | hyperpartisan, and 20 mainstream news outlets where each domain
          | has a quality score between 0 and 1) [39]. A link-containing
          | (re)tweet's quality score was defined as the quality of the
          | domain that was linked to. (Quality scores could not be
          | assigned to tweets without links to any of the 60 sites.)
         | 
         | Citation [39] is Pennycook,G. & Rand,D.G. "Fighting
         | misinformation on social media using crowdsourced judgments of
         | news source quality." I am not sure what "crowdsourced" means
         | here because I didn't read all the citations.
         | 
         | But it sounds like they have a plausible metric. It's always
         | very dumb to criticize science based on a press release.
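          | 
          | As a rough illustration of that scoring scheme (hypothetical
          | domains and scores, not the paper's actual list of 60):
          | 
          |   from urllib.parse import urlparse
          | 
          |   # Hypothetical domain ratings in [0, 1].
          |   DOMAIN_QUALITY = {"mainstream-example.com": 0.9,
          |                     "hyperpartisan-example.com": 0.4,
          |                     "fakenews-example.com": 0.1}
          | 
          |   def tweet_quality(linked_urls):
          |       # Score a (re)tweet by the first rated domain it links
          |       # to; tweets with no rated link get no score (None).
          |       for url in linked_urls:
          |           domain = urlparse(url).netloc.lower()
          |           if domain.startswith("www."):
          |               domain = domain[4:]
          |           if domain in DOMAIN_QUALITY:
          |               return DOMAIN_QUALITY[domain]
          |       return None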
        
           | jsight wrote:
           | I'm not sure what you are arguing here. Do you think 1% is a
           | significant finding? What was the margin of error?
        
             | nicklecompte wrote:
             | I don't really know if it's a significant finding. My point
             | was that you asked "how you measure a 1 percent decline in
             | news source quality?" and suggested that your personal
             | confusion about such a thing throws the entire paper into
             | question. Instead of having such strong judgments, why not
             | just read the paper?
             | 
             | I am personally not sure how astronomers measure
             | gravitational redshift with such apparent accuracy. But my
             | ignorance does not mean astronomers are a bunch of frauds.
             | It means I haven't done the required reading.
        
               | jsight wrote:
               | Doubt is not a strong judgement. I was hoping someone had
               | the details and would clarify. I appreciate that you did
               | that as you've turned my doubt into certainty. They did
               | not precisely measure a 1% change in news source quality.
        
       | PeterisP wrote:
       | In essence, don't feed the trolls, ignore them.
        
       | wyldfire wrote:
       | > The study was centered around a Twitter field experiment in
       | which a research team offered polite corrections, complete with
       | links to solid evidence, in replies to flagrantly false tweets
       | about politics.
       | 
       | I wonder if it would be different if it had come from someone
       | they knew in real life. I guess I shouldn't be at this point, but
       | I'm always surprised that the people posting the misinformation
       | aren't terribly embarrassed about it when it's revealed.
       | 
       | > "We might have expected that being corrected would shift one's
       | attention to accuracy. But instead, it seems that getting
       | publicly corrected by another user shifted people's attention
       | away from accuracy -- perhaps to other social factors such as
       | embarrassment."
       | 
       | > "Future work should explore how to word corrections in order to
       | maximize their impact, and how the source of the correction
       | affects its impact,"
       | 
       | I think this is important work, but I'm pessimistic about
       | anything that will really be effective.
        
         | jerf wrote:
         | "We might have expected that being corrected would shift one's
         | attention to accuracy."
         | 
         | Really? Who would think that? Science requires an open mind,
         | sure, but it doesn't require you to be some sort of idiot when
         | formulating your hypotheses. To a first approximation, nobody
         | responds to being corrected with a polite thank you and a shift
         | to focus on accuracy, and everybody knows that.
         | 
         | Whatever model of humanity they're operating with is less
            | realistic than _homo economicus_. Are they using _homo vulcanus_?
        
           | jsight wrote:
           | Exactly... I find that a lot of the folks posting the most
           | misinformation have basically zero interest in whether it
           | actually is true.
        
             | jerf wrote:
             | That's excessively specific. Most people, no further
             | qualifiers, have zero interest in whether what they are
             | posting is actually true. Most people are just socially
             | signalling their preferred in group by what news they
             | propagate. Labeling things "misinformation" is just another
              | dodge to avoid engaging with whether or not something is
             | true, or has a grain of truth in it that you might not
             | like.
        
         | bnralt wrote:
         | > I guess I shouldn't be at this point, but I'm always
         | surprised that the people posting the misinformation aren't
         | terribly embarrassed about it when it's revealed.
         | 
         | When things become hyperpartisan, people can't see this
         | revelation. When people read articles like this, I wouldn't be
         | surprised if the reaction is mostly "Yes, I can't believe those
         | people who disagree with me do this," and not a reflection on
         | their own behavior.
         | 
         | You can see this if you follow any political argument online.
          | No one believes that someone on an opposing side could have
         | a valid argument, or could correctly point out their mistake.
         | If a third-party observer tries to point out a mistake, people
         | will usually cast them as the enemy and accuse them of
         | spreading falsehoods.
         | 
         | Walter Lippmann's excellent Public Opinion points out how it's
         | difficult for people to be both interested in a topic and
         | neutral. This excerpt touches on the issue:
         | 
         | > "It has been said" writes Walter Bagehot, [Footnote: On the
          | Emotion of Conviction, Literary Studies, Vol. III, p. 172.]
         | "that if you can only get a middleclass Englishman to think
         | whether there are 'snails in Sirius,' he will soon have an
         | opinion on it. It will be difficult to make him think, but if
         | he does think, he cannot rest in a negative, he will come to
         | some decision. And on any ordinary topic, of course, it is so.
         | A grocer has a full creed as to foreign policy, a young lady a
         | complete theory of the sacraments, as to which neither has any
         | doubt whatever."
        
           | Nursie wrote:
           | It's very tempting, as in your insightful quote there, to
           | form opinions on everything. However it would be nice if more
           | of us said "I don't have enough information to have an
           | informed opinion here" more often.
           | 
           | And I include myself in that.
           | 
           | (Edit - yes I am a middle class Englishman!)
        
           | jfengel wrote:
           | I'd point out that Bagehot wrote that in 1871. People don't
           | change, though communications media sure do.
        
         | Mordisquitos wrote:
         | > I wonder if it would be different if it had come from someone
         | they knew in real life. I guess I shouldn't be at this point,
         | but I'm always surprised that the people posting the
         | misinformation aren't terribly embarrassed about it when it's
         | revealed.
         | 
         | That's an interesting possibility, and I have a small anecdote
         | in the context of the COVID-19 pandemic that is slightly
         | related.
         | 
         | My mother, at the older end of the lately maligned "boomer"
         | generation, would often rant with me on the phone about how
         | obviously fake or ridiculous hoaxes were being spread on
         | WhatsApp groups that she participated in. Even though she
         | doesn't have a scientific background, she has a good instinct
         | for the general ideas and how science works, and a very well-
         | honed bullshit detector. After a while, maybe out of lockdown
         | boredom, she started refuting the hoaxes when she could find
         | good sources or science-based arguments to do so... and
         | eventually, some of her friends and acquaintances who _did_
         | tend to fall for false information, now ask her beforehand if
          | something they have just been sent makes sense or not!
         | 
         | Note however that this is not in the US, so at least she
         | doesn't have to contend with overarching partisan identity
         | lines regarding belief in this or that.
        
           | bombcar wrote:
           | The US political lines make this horribly hard to do unless
           | you ACTUALLY know the participants well - just take a look at
           | the HN thread about the electric F150 to see acres of people
           | unwilling to even consider that someone could own a pickup
           | truck without being "in the wrong group".
        
         | yorwba wrote:
         | > I'm always surprised that the people posting the
         | misinformation aren't terribly embarrassed about it when it's
         | revealed.
         | 
         | Alternatively, the _people_ posting misinformation were
         | terribly embarrassed, posting less on Twitter as a result,
          | while the _bots_ kept posting on their original schedule, thus
          | decreasing average quality.
         | 
         | The paper seems to be lacking information about absolute tweet
         | counts, so it's hard to tell what the change in their relative
         | measures means in practice.
        
           | joering2 wrote:
           | My impression from being on twitter daily, is that some
           | people (plenty) love to play stupid on Twitter, never mind
           | their wisdom. I mean I find pilots, neurosurgeons, C-level
            | execs, Harvard grads, etc., acting ridiculous, mostly related
            | to politics but not only. I think it's as pure trollism as it
            | can get. And they never learn - I see someone laying down
            | facts, using reason, logic, etc., and then they are back spewing
           | lies the next day. If you wanna turn the nicest person into a
           | monster, just leave them on twitter for a week or two.
        
         | MattGaiser wrote:
         | > I wonder if it would be different if it had come from someone
         | they knew in real life. I guess I shouldn't be at this point,
         | but I'm always surprised that the people posting the
         | misinformation aren't terribly embarrassed about it when it's
         | revealed.
         | 
         | Anecdotally, they don't see it as truth being revealed, but me
         | being "brainwashed", a "linear thinker" or "slave to
         | orthodoxy."
        
       | bjt2n3904 wrote:
        | I think this is an extremely important topic. But I don't think it's
       | so much THAT users are corrected, as it is HOW users are
       | corrected.
       | 
       | > To conduct the experiment, the researchers first identified
       | 2,000 Twitter users, with a mix of political persuasions, who had
       | tweeted out any one of 11 frequently repeated false news
       | articles. All of those articles had been debunked by the website
       | Snopes.com.
       | 
       | And here lies the issue. We don't "correct" the issue with a
       | discussion out of genuine curiosity, we "correct" the issue by
       | making an appeal to authority. Like the XKCD #386 comic, we're an
        | obsessive dog licking at its wounds -- we can't go to sleep.
       | Someone is wrong.
       | 
       | I haven't encountered a single "Flat Earther". I have encountered
        | one genuine "anti-vaxxer". But the way the discussion goes on
       | the internet, I'd expect they're behind every tree.
       | 
       | When people rush over with a link on "Snopes", and then smugly
       | sit back thinking, "Checkmate" -- it worsens the issue. The
       | reason it makes matters worse is because we're acting like Dwight
       | Schrute: "False. CNN has not purchased an industrial washing
       | machine to put a spin on stories. News stories cannot be placed
       | inside a washing machine."
        
       | DSingularity wrote:
        | Isn't this obvious? We live in a hyper-connected world where
       | sentiments can take a dive in a minute when something starts
       | trending. We also live in a world where every well-funded
       | organization has a propaganda arm.
       | 
       | Take Israeli propagandists as an example. What do you think
       | happens when a hashtag like #SheikhJarrah starts getting
       | attention and activity measured in the thousands per hour? You
       | think they will just ignore these threads? Of course not. Troll
       | networks get notified. Volunteers in all kinds of pro-Israeli
       | organizations are mobilized and directed to specific
       | conversations. Suddenly, it seems like every active thread is
       | getting attacked by personalities each responding with very
       | similar claims and very similar approaches.
       | 
       | I believe this happens across the board and not just with
       | political/ideological issues. You are not going to have an online
       | platform allowing people to share information widely without
       | attracting the interests of propagandists eager to
       | improve/maintain the images of their clients.
        
       | marcodiego wrote:
       | Any form of refutation can be taken by conspiracists as evidence
       | of the conspiracy.
        
       | stephc_int13 wrote:
       | Lecturing people doesn't work because there is no trust.
       | 
       | We should not assume that authority derived from credentials or
        | the use of factually supported logic is sufficient to convince;
        | without trust it is merely anecdotal.
        
       | godshatter wrote:
       | Maybe the world has changed but when someone says something
       | stupid and someone else calls them on it, backing it up with
       | facts, and they double down on the stupid... shouldn't there be a
       | population of people out there that see this as a desperation
       | move on the part of the original poster? Or a sign of closed-
       | mindedness?
       | 
       | Correcting someone in a public venue isn't for the audience of
       | one that is spouting stupid things, it's for the rest of the
       | world that is reading the thread. So a fact-based correction that
        | causes more stupid to be posted is doing its job, really.
       | 
       | If the response happens to reference the facts that were posted,
       | well, now you've got a conversation going.
       | 
       | I would also like to point out that stupid is in the eye of the
        | beholder sometimes. Something can sound stupid at first blush yet
       | turn out to be true.
        
         | pjc50 wrote:
         | > Maybe the world has changed but when someone says something
         | stupid and someone else calls them on it, backing it up with
         | facts, and they double down on the stupid... shouldn't there be
         | a population of people out there that see this as a desperation
         | move on the part of the original poster?
         | 
         | Did you miss the entire Trump presidency? People _love_ that
         | sort of thing. Facts are difficult, inscrutable things that
         | have to be dug out of observations and carefully safeguarded.
         | They often tend to be disappointing. Whereas lies and
         | fantasies? Those are _theatre_.
         | 
         | Oh, and algorithmic timelines make this worse: correcting
         | someone is promoting their original views to other people.
        
           | AnimalMuppet wrote:
           | The problem is that there is too much information. For
           | (almost) any position, there are some facts that support that
           | position. There are other facts that oppose the position. The
           | _evidence_ (the sum of all the facts) often leans one way or
           | the other - either supporting or opposing the position. But
           | someone arguing in bad faith can often find enough facts to
           | look somewhat convincing, which lets them persuade at least
           | some others that their position is correct.
        
         | HPsquared wrote:
         | See the 1% rule:
         | 
         | https://en.m.wikipedia.org/wiki/1%25_rule_(Internet_culture)
         | 
         | The vast majority of people are lurkers and only read the
         | content.
        
         | Nursie wrote:
          | It's certainly a sign of closed-mindedness.
         | 
         | AFAICT most discussion on the internet is pretty closed-minded
         | though, people have their preconceptions, and they argue them.
         | Rarely is anyone enlightened, or are minds changed. It is
         | common to see such a person be corrected on a point of fact but
         | then go on to repeat the same falsehood in another place or on
         | another day. This is likely because to them the narrative is
         | important, the emotions, rather than correctness.
         | 
         | > Correcting someone in a public venue isn't for the audience
         | of one that is spouting stupid things, it's for the rest of the
         | world that is reading the thread.
         | 
         | I hope so, but I'm not sure it works like that either, as the
         | legions of spouters seem to grow by the day :/
        
         | everdrive wrote:
         | I think a lot of people can't see through the noise. Find a
         | video of two pundits arguing, and I guarantee that you can find
         | videos which alternately claim that each side "destroyed" the
         | other. It's not just that people can't integrate all the
         | different information out there. (this is part of the problem,
         | too, of course) Frankly, I think we have underestimated the
         | degree (or at least I underestimated it quite a bit) to which
         | many people never really understand the validity of an
         | argument, but simply adopt it when it becomes "mainstream." Of
         | course the problem now is that there is not really a
         | "mainstream." There are many small competing and conflicting
         | "mainstreams." And so people adopt many of these ideas.
        
           | Verdex wrote:
            | It's even worse than that. I went to an evolution vs
           | creationism debate once when I was in college (it was put on
           | for the students there). Everyone in that auditorium was an
           | atheist and everyone was a young earth creationist. The
           | difference was in which speaker had just made a really good
           | _sounding_ point.
           | 
           | The people on the extremes each believe that their side won
           | but as far as I could tell all the people in the middle
           | believed in both as they crossed some "sufficiently cool"
           | threshold.
        
           | brutal_chaos_ wrote:
           | Reasoning takes effort and people are lazy. So why not listen
           | to my favorite <insert individual of note in society>. /s
           | 
           | IMO, it sounds like a lack of critical thinking skills.
           | Instead of pondering validity, as one does to "think
           | critically," as you stated, some people adopt a line of
           | thought from their authority (be it a pastor at church,
           | Tucker Carlson/Rachel Maddow/etc on TV, POTUS, news outlets,
           | etc).
        
         | mgh2 wrote:
         | "Truth is so obscure these times, and falsehood so established,
         | that unless we love the truth, we cannot know it." - Blaise
         | Pascal
         | 
         | Unfortunately, the odds are against us. Lying is inherent in
         | human nature. Even if you point out the facts, most people will
         | be deluded again because the majority drowns everything,
         | especially in an anonymous environment such as the internet.
        
           | 1cvmask wrote:
           | But what are the facts? WMDs in Iraq and the myriad of other
           | lies our governments throw at us. Fool me once shame on you.
           | Fool me ten thousand times and shame on everyone.
        
         | orev wrote:
         | Something missing in this type of analysis is the future
         | actions of the person being corrected. One might say something
         | wrong, be corrected, then argue (because being corrected makes
         | you feel bad and angry). But the _next_ time they say
         | something, they might consider it more, or maybe not be as
         | extreme, as they want to avoid the bad feelings caused by
         | negative feedback.
         | 
         | It's easy to see this in action on this very site -- nobody
         | likes to be downvoted, so the conversations stay somewhat more
         | civil than other sites.
        
         | throwaway803453 wrote:
         | Often the words "I, you, my, your" are as bad as swear words
         | when correcting someone. One sure way to anger someone and have
          | them double down is to hurt their ego by using one of those
         | words (e.g., your code has a bug, vs the code has a bug). Avoid
         | those words and disagreements become a lot more productive.
        
           | [deleted]
        
       | tryonenow wrote:
       | >Yes, in some ways. A new study shows Twitter users post even
       | more misinformation after other users correct them.
       | 
       | I can't help but feel like the academics studying this "problem"
       | are blinded by hubris. Even the byline is exemplary - what's
       | being described is a _discussion_.
       | 
       | When you gatekeep science in the public square with "fact
       | checking" you inevitably end up with a politicized orthodoxy. The
       | opinions and majority consensus of our academic institutions are
       | not beyond reproach, and there have repeatedly been instances
       | where the messaging was misleading or false - look no further
       | than the discourse surrounding covid starting early last year.
        | Latest example being the lab origin hypothesis - a kooky, right
       | wing, xenophobic conspiracy theory, until it wasn't. Fortunately
       | media outlets are finally backtracking on their politicized "fact
       | checking" in this case: see the editor's note here [0] for
       | example.
       | 
       | 0. https://www.vox.com/2020/3/4/21156607/how-did-the-
       | coronaviru...
        
       ___________________________________________________________________
       (page generated 2021-05-25 23:01 UTC)