[HN Gopher] Something weird is happening on Facebook
       ___________________________________________________________________
        
       Something weird is happening on Facebook
        
       Author : incomplete
       Score  : 412 points
       Date   : 2021-09-27 18:19 UTC (4 hours ago)
        
 (HTM) web link (www.politicalorphans.com)
 (TXT) w3m dump (www.politicalorphans.com)
        
       | micromacrofoot wrote:
       | It doesn't really matter if these posts are intentionally being
        | used to gather data, _someone's_ gathering it.
       | 
       | Cross-reference some of the security question posts like "first
       | car" with large data leaks and you're bound to get some matches
       | eventually. Scammers in some countries can live for years off of
       | one good bank account hit, and many of them make a living just
       | reselling the data before it even gets that far.
       | 
       | We need some serious public education about freely sharing
       | information on the internet. Like "this is your brain on drugs"
       | nationwide PSA campaigns.
        
       | giansegato wrote:
        | I don't get it. What would these people be doing with such
        | data? OK, they know that a certain Facebook account named John
        | Doe is likely to be a male, between 40 and 50 years old, who
        | voted for Trump both in '16 and '20. So what? It's not like you can
       | retarget said account through ads. I fail to see the purpose.
        
         | spoonjim wrote:
         | The Cambridge Analytica scandal shows exactly what you can do
         | with this data.
         | 
         | Targeted advertising lets you run a campaign that never could
         | be run before... one where you appear to be something different
         | to different people. If you were trying to seize control of
         | three different warring groups, you could advertise to A,
         | "We'll kill B and C!" , to B, "We'll kill A and C!" and to C,
         | "We'll kill A and B!" which you couldn't do in a stump speech
         | without people figuring out the ruse.
         | 
         | By building a detailed psychological profile of individuals,
         | you can build a model that allows you to tie their responses to
         | these questions with the political messages they're susceptible
         | to. Cambridge Analytica paid a few hundred thousand people to
         | do a quiz where they shared their Facebook likes and answered
         | questions about their personality. CA then used that to build a
         | model that showed "People who live in Slidell, Louisiana and
         | like Dodge Ram trucks will be most receptive to messages about
         | illegal immigration and are generally supportive of state
         | violence". Then they can run that ad to everyone in Slidell who
         | likes Dodge Ram pickups.
        
           | rscoots wrote:
           | Why is targeted political messaging inherently immoral
           | though?
           | 
           | It seems the crux of the issue here is that people are being
            | fooled into supplying data about themselves in a non-
            | consensual manner.
           | 
            | The former has been happening in politics forever, and IMO the
            | latter has essentially been every tech company's MO for the
            | last decade.
        
         | bink wrote:
         | In 2016 it was used to target these specific types of people
         | with outrage. Get them riled up against their opponents and
         | make them more likely to vote for your candidate or issues and
         | more likely to spread your propaganda for you.
         | 
         | AFAIK you absolutely can target (or could target) groups of
         | people based on very specific criteria.
        
         | akersten wrote:
         | There is _extreme_ value in knowing who your most-prospective
         | marks are. If you had a population of 1,000 people, and _had_
         | to sell something (read: convince to vote a certain way) to 400
          | of them, wouldn't you like to know the subset of those people
         | who are already predisposed to your position and just need a
          | little more nudging with a narrowly-tailored meme, instead of
         | making a dartboard attempt against all 1,000?
         | 
          | That was the entire value-add of Cambridge Analytica, whose
         | Facebook-API data-gathering loophole has now been replaced by
         | just engaging suckers via the platform itself and a tiny bit of
         | NLP/sentiment analysis.
        
           | obelos wrote:
           | I suspect there's also value in avoiding showing some forms
           | of persuasive/propagandistic content to those who are
           | unlikely to be amenable to it. This allows the content to
           | circulate with less suppressive feedback from a target's
           | peers.
        
             | AlexAndScripts wrote:
             | You can also avoid targeting people who already support
             | you. Or, in the case of Cambridge Analytica's brexit
             | manipulation, identify those who have never voted before
             | (using hundreds of indicators) and introduce them to
             | politics... Their first information being propaganda from
             | the leave campaign.
        
         | kube-system wrote:
         | There's lots of stuff you can do with the data.
         | 
         | The most obvious one: with the security question type stuff,
         | you can take over other people's accounts.
         | 
         | But collecting data about people is also useful if you're
         | trying to spread an agenda. You can determine what types of
         | messages resonate with an audience. You can group those people
         | and target them separately -- not through ads, but through
         | special interest accounts/groups. You can recruit people to
         | amplify your message. You can even get people to act in real
         | life, i.e.
         | https://en.wikipedia.org/wiki/Internet_Research_Agency#Ralli...
         | 
          | If you can identify, categorize, and influence the loudest
         | voices, you can influence public discussion and opinion.
        
         | x0x0 wrote:
         | You can pull the data off of fb.
         | 
         | FB's targeting tools go to some lengths to only allow you to
          | target by demographics on FB and associated properties. These
         | questions, plus profile views, allow you to extract the
          | information for external use, e.g. selling a list of <fb id,
         | first car, birth year, favorite color, pet name, etc> tuples.
        
       | m0d0nne11 wrote:
       | Is there some way to delete/filter/downvote click-bait header
       | lines on HN like this one?
        
       | throwawaymanbot wrote:
       | Zuckerberg should be in an orange jumpsuit.
        
       | tyingq wrote:
       | My favorite is questions that appear to be roundabout ways to
       | gather your password reset questions.
       | 
       | Like _" Your stripper name is: Your Favorite Color + Name of Your
       | First Dog!"_. Never fails to get tons of responses.
        
       | okareaman wrote:
       | My FB feed is lousy with these sorts of things. My family &
       | friends seem to enjoy answering questions like "What was the
       | first concert you attended?" I don't think it's for advertising
       | purposes because I have not noticed any improvement in the
       | accuracy of targeted ads. I don't think it's for political
       | purposes because that is much easier to figure out without these
       | oblique questions.
       | 
       | One group who would benefit from detailed life style profiles are
       | life insurance companies. More detail is better for setting
       | accurate premiums while remaining competitive with other life
       | insurance companies.
       | 
       | Edit: I almost forgot to mention a really popular one I've seen a
        | lot of lately: "Have you ever had a DUI? I'll wait." It's
        | unbelievable to me that people would answer this question, but
        | it's definitely something insurance companies would like to know,
        | because their records don't go back that far into the paper age.
       | A lot of people answer something like "No, but I should have."
        
         | void_mint wrote:
         | > My FB feed is lousy with these sorts of things. My family &
         | friends seem to enjoy answering questions like "What was the
         | first concert you attended?" I don't think it's for advertising
         | purposes because I have not noticed any improvement in the
         | accuracy of targeted ads. I don't think it's for political
         | purposes because that is much easier to figure out without
         | these oblique questions.
         | 
         | Hyper dystopian take: Gathering data to be able to create real-
         | seeming narratives for fictional profiles to push political
         | agendas.
        
         | htrp wrote:
         | > "What was the first concert you attended?"
         | 
         | That's literally a security question for a bank password reset.
        
           | okareaman wrote:
           | That's not a good example, which is why I added the DUI
           | question. I've noticed most of my friends have stopped
           | answering security type questions as they've been warned.
        
         | matsemann wrote:
          | I've long wished for a way to filter my feed so that I only see
          | what people post themselves. But I guess FB forces these things
          | down my throat because no one actually posts anything anymore,
          | except big life events like weddings or announcing a child.
          | Otherwise people would realize FB is dead when it comes to
          | keeping in touch with friends and family.
        
           | aembleton wrote:
           | Train the algorithm! Just exit Facebook as soon as you
           | encounter one of these.
        
           | okareaman wrote:
           | Facebook is dead as far as I'm concerned because I tested it.
           | I quit all groups and unliked all pages. My feed just became
           | ads and posts from friends. I have a couple of friends who
           | like to post, but the vast majority don't post anything. It's
           | a ghost town.
        
             | silexia wrote:
              | I shut down my Facebook recently after my interactions with
             | it became unhealthy.
        
             | emerged wrote:
             | It's a pretty weird place these days. There are a few
             | friends who get like 7 trillion likes if they post that
             | their baby farted, but many many other friend posts have
             | literally 0 or very few likes.
             | 
             | At some point I stopped commenting or Liking any post which
             | already has more than ~10 Likes or comments. In some sense
             | it feels really strange to me that people bother to engage
             | with content where their engagement is essentially
             | invisible within the crowd.
        
       | pytlicek wrote:
        | Something weird has been happening on Facebook for a long time :/
        
       | Buvaz wrote:
        | It's just me... collecting data to work out what will get my mom
        | off her phone.
        
       | whyenot wrote:
       | It's becoming increasingly clear that social media needs to be
       | more strictly regulated. How to do that in a free society is a
       | difficult question. OTOH, if we take too long to figure this out,
       | it may be too late. In fact, it may already be too late.
        
         | jensensbutton wrote:
         | What regulation would help here?
        
       | pkamb wrote:
       | > Yes, a question-post invites more engagement than a simple
       | comment, but there's something else at work here.
       | 
       | Is there?
       | 
       | I've noticed wannabe influencers on Instagram including questions
       | and polls with every one of their Stories. They're doing it to
       | "juice the algorithm" by getting responses. That in turn
       | theoretically gets them featured on the Explore Page or whatever.
       | YouTubers do the same things, ending each video with a CTA
       | question you should answer in the comments.
       | 
       | The Facebook question pages that boomers answer seem to just be
       | doing the same thing, attracting comments and interactions and
       | thus boosting the page.
       | 
       | The bigger question I have is why Facebook thinks _I_ would be
        | interested in seeing in my timeline that my 68-year-old aunt has
        | answered "Freddie Mercury" to some question about the best
        | musical act she's seen live.
        
       | PerkinWarwick wrote:
       | Is it possible to set up Facebook to only give you people you
       | know or people you follow on your feed? I'd consider an account
       | if that were true, with the appropriate personal filters on of
       | course given the spying the company does.
       | 
       | It's a totally sleazy company, but it actually provides a
       | valuable service at the same time.
        
       | marcus_holmes wrote:
       | I'm curious if the data harvested from this is skewed to older
       | people, and more "naive" people. Most of my techie friends have
       | uninstalled FB (like me), or rarely interact with it. And my
       | smarter friends just don't interact with that kind of clickbait-y
       | post.
       | 
       | I wonder if marketing folks will even notice. Like Google
       | Analytics, which is disproportionately blocked by smarter and
       | more technical people. Marketers cheerfully ignore that, though.
       | Will they even know that they're missing our data?
       | 
       | Is the Venn diagram of FB enthusiastic data-donaters and people
       | who don't block GA just a circle? If so, are public policies and
       | corporate marketing strategies going to be designed to cater to
       | them and not us?
        
       | brap wrote:
       | Speaking of weird things happening on Facebook... I see a lot of
        | official artist pages (usually artists who were pretty successful
        | a few years ago but no longer are, mostly rappers for
       | some reason) that post A LOT of random "memes" (mostly just stuff
       | stolen from Reddit) that are completely unrelated to their work,
       | usually with very clickbaity captions (tag a friend etc). When
       | you go on their page it doesn't show up. Most comments seem to be
       | from 2nd world countries (like my own). What's that about?
        
       | md_ wrote:
       | I'm confused. The premise of this post seems to be that this is
       | some malicious attempt to deduce basic geographic data on
       | Facebook users. But doesn't Facebook let you target ads based on
       | such data already?
       | 
       | Why would I not just pay Facebook directly for such targeting?
        
         | deckar01 wrote:
         | I believe the goal is to build a model that correlates public
         | attributes that FB provides with hidden attributes learned from
         | these probes. When the attacker is ready to target users
         | matching specific hidden attributes, they can reverse the
         | correlation into ranges of public attributes that FB will
         | accept for targeting ad campaigns.
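          | 
          | A minimal sketch of what such a model might look like (my own
          | illustration, not from the article; it assumes scikit-learn,
          | a binary hidden attribute, and entirely made-up column names):
          | 
          |     import pandas as pd
          |     from sklearn.linear_model import LogisticRegression
          | 
          |     # Public attributes FB accepts for ad targeting (as numeric
          |     # codes), plus a 0/1 hidden attribute learned from probes.
          |     df = pd.read_csv("scraped_responses.csv")  # hypothetical
          |     X = df[["age_band", "region_code", "likes_trucks"]]
          |     y = df["hidden_attribute"]
          | 
          |     model = LogisticRegression().fit(X, y)
          | 
          |     # "Reverse the correlation": score every public-attribute
          |     # combination and keep the ones most likely to contain the
          |     # hidden-attribute audience, then feed those ranges to the
          |     # ad-targeting UI.
          |     combos = X.drop_duplicates().copy()
          |     combos["p"] = model.predict_proba(combos)[:, 1]
          |     segments = combos.sort_values("p", ascending=False).head(10)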
        
         | mikey_p wrote:
         | The implication is that comments on public posts could be
         | scraped by a third party to build an off-FB psychographic
         | profile, similar to how Cambridge Analytica tried to generate
         | profiles for users by querying data through a third party
         | Facebook app.
        
       | bob229 wrote:
        | Who gives a hoot. Only an absolute moron uses social media.
        
       | cpr wrote:
       | I stopped reading when the article repeated the thoroughly tired
       | and debunked talking points about Russians stealing DNC server
       | materials...
        
         | jessaustin wrote:
         | I was torn on this one. I do agree with you, in that, if we
         | can't trust them on stuff that was obvious in June of 2017, how
         | can we trust the "facts" we can't check? However, this is at
         | least an _interesting_ hypothesis. The particulars are probably
         | wrong, but all sorts of different parties might try
         | "campaigns" like this for all sorts of different reasons.
         | (Although GOP might not be my first suspect for dastardly
         | clever schemes...)
         | 
          | The most likely possibility seems to be that we have algorithms
         | fooling algorithms with no humans in the loop. Sure, there
         | might not be enough "real" (i.e. a real human purchasing a real
         | product) revenue sloshing around here to make the whole effort
         | worthwhile. However, there might be a poorly configured
          | dashboard somewhere that makes it _appear_ as if that's the
         | case... Meanwhile FB laughs all the way to the bank.
        
       | twic wrote:
       | First comment:
       | 
       | > Generally speaking, people should become more and more wary of
       | memes.
       | 
       | I suppose memes which explicitly attack other memes had to emerge
       | at some point.
        
       | pjdemers wrote:
        | How many people choosing random answers would it take to muddy
        | the data to the point where it is useless?
        
       | greenyoda wrote:
       | > "Sure, that first post won't accurately predict your birth
       | year..."
       | 
        | Actually it would have, in 2019: 66 + 1953 = 2019, so 2019 minus
        | your age gives your year of birth.
        
       | Johnny555 wrote:
        | I've been wondering what those posts are about. I keep seeing
        | them in my feed, and they're always answered by the same 60+
        | year-old relatives.
        
         | ogn3rd wrote:
         | Many of those questions are used as 2FA as well.
        
           | Johnny555 wrote:
           | Those I understand, but I'm wondering about the "What's your
           | perfect fall day" or "How old were you when you got your
           | first job" questions that don't seem like they'd yield
           | answers to security questions.
        
             | mikey_p wrote:
             | It's not about security questions, the implication that the
             | author doesn't make explicit is that "how old were you when
             | you got your first job" probably says more about your
             | economic well being and possibly your political leanings.
        
             | rubicon33 wrote:
             | Those are thrown in there to make the general asking of
             | these types of questions considered normal.
        
               | edoceo wrote:
                | Magicians call that misdirection.
        
             | MikeTheGreat wrote:
             | "What's your perfect fall day" does not strike me as a good
             | security question
             | 
             | "How old were you when you got your first job" actually
             | does seem like a good security question for many people.
             | You're unlikely to forget it, after a while not many other
             | people will know it, and it's a single number (whole
             | number, most likely) so it's easy for a computer to parse
             | (and hard for you to mess up by leaving out / adding too
             | much detail).
             | 
             | Depending on what you respond to on social media, and
             | depending on if you've got any accounts that use this as a
             | security question, you might want to go back and force the
             | account to use a new question ;)
        
               | NineStarPoint wrote:
               | The main issue I see with the "age of first job" question
               | is that there aren't actually statistically that many
               | likely answers to the question, especially if you have
               | basic information on a person. Compare to make and model
               | of first car, which has a lot of possible options and
               | only slightly correlates to life situation.
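                | 
                | A rough way to quantify that (my own sketch with
                | made-up answer distributions, not data from the
                | article): the Shannon entropy of the answer
                | distribution tells you roughly how many guesses
                | an attacker needs.
                | 
                |     import math
                | 
                |     def entropy_bits(dist):
                |         # dist: answer -> probability
                |         return -sum(p * math.log2(p)
                |                     for p in dist.values() if p > 0)
                | 
                |     first_job_age = {"14": 0.10, "15": 0.20,
                |                      "16": 0.30, "17": 0.20,
                |                      "18": 0.15, "19+": 0.05}
                |     first_car = {f"model_{i}": 1 / 200
                |                  for i in range(200)}
                | 
                |     print(entropy_bits(first_job_age))  # ~2.4 bits
                |     print(entropy_bits(first_car))      # ~7.6 bits
                | 
                | A handful of likely job ages vs. hundreds of
                | plausible first cars, so the car answer is much
                | harder to brute-force as a security question.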
        
               | edoceo wrote:
                | Make and model of cars are used in credit verification in
                | the USA - like historical addresses, they're likely linked
                | to some previous credit activity.
        
               | mikey_p wrote:
                | The author implies, but doesn't make explicit, that they
               | think things like "age when you got your first job" are
               | more about inferring socioeconomic indicators about the
               | individual than fishing for password clues.
               | 
               | And honestly I bet age of first job probably is a decent
               | indicator of certain economic factors.
        
             | TameAntelope wrote:
             | According to the article, anything related to your age or
             | where you live is a pretty good indicator of what your
             | voting preferences are. The article cites "90% accuracy" if
             | your answers can be used to reasonably guess those facts.
             | 
              | Additionally, if your profile is public (still the default,
              | I believe), it's made available when you comment, so I'd
              | guess there's follow-up scraping going on to collect more
              | details that are then used in conjunction with whatever
              | your response was.
        
             | mooxie wrote:
             | Well the article notes that some amount of user demographic
             | info is revealed simply by interacting with the post,
             | regardless of the information provided by the respondent.
             | 
             | Secondly, this is almost like the email phishing paradox:
             | to an educated user it seems like the number of people who
             | respond with relevant information would be extremely low,
             | but if the attempt costs you basically nothing and you get
             | something useful 1% of the time, you're still winning.
             | 
             | "My perfect fall day is my memory of Aunt June when we
             | lived in Connecticut in the 70s, before she passed away."
             | In itself something like that doesn't seem useful, but
             | there's a good amount of information in there if someone
             | can correlate it with other details about your life.
        
         | tomcam wrote:
         | 60 year-old here. Thanks, broseph!
        
       | CosmicShadow wrote:
       | Was just talking about this the other night with my wife and how
       | we should just start our own group sharing these baiting
       | questions to see how quickly and largely we can grow it.
        
       | ___luigi wrote:
        | Social platforms amplify all of our human qualities and our
        | interaction habits. Since ancient times, people have striven to
        | seek attention and show off their work [1] [2]. After reading the
        | book Indistractable [3], I started to reflect on how our
        | educational systems are not designed to prepare students to live
        | in this digital age. Yesterday it was FB, today it's TikTok, and
        | tomorrow there will be something else. FB is just one pawn in
        | this game.
       | 
        | I know that siding with FB is one of those topics that are very
        | controversial on HN, but I am not making excuses for the
        | practices of these companies. My point is that our kids will live
        | in a different age than the one we lived in; educational systems
        | should keep up with these challenges and find innovative ways to
        | prepare people to manage it efficiently.
       | 
       | [1]: https://en.wikipedia.org/wiki/Mu%27allaqat [2]:
       | https://en.wikipedia.org/wiki/Culture_of_ancient_Rome [3]:
       | https://www.nirandfar.com/indistractable/
        
       | jsnell wrote:
       | With those comment counts, something dodgy is obviously
       | happening.
       | 
       | The interesting question here is whether Facebook is somehow
       | accidentally amplifying it. Certainly it is not in Facebook's
       | interest to allow this kind of data harvesting. If it hurts you
       | to think that Facebook somehow isn't maximally evil, at least
       | consider that this is data that could be only Facebook's.
       | Allowing somebody else to harvest it is money straight out of
       | Facebook's pocket.
       | 
        | So, given that FB should not be complicit, what mistake could
        | they be making that allows the system to be gamed like this? The
        | obvious guess is that they have a feedback loop in the ranking
        | algorithm: it values comments very highly as a signal of good
        | engagement, but they weren't prepared for "content" that is this
        | good at eliciting low-effort comments and has such wide
        | demographic appeal. As long as one of these reaches critical mass,
       | it'll be shown to tens or hundreds of millions of people just by
       | any engagement feeding even more engagement.
       | 
       | Is there anything less obvious?
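        | 
        | To make the feedback-loop guess concrete, here's a toy model
        | (entirely made-up numbers, just to illustrate the dynamic I'm
        | describing, not FB's actual ranking):
        | 
        |     # Reach is proportional to accumulated comments; a small
        |     # fraction of viewers leave a low-effort comment.
        |     comments = 50           # seed engagement from followers
        |     reach_per_comment = 30  # made-up ranking weight
        |     comment_rate = 0.02     # 2% of viewers reply "1967 Mustang"
        | 
        |     for day in range(10):
        |         impressions = comments * reach_per_comment
        |         comments += int(impressions * comment_rate)
        |         print(day, comments)
        | 
        | With those numbers the comment count compounds by ~1.6x per
        | round until the audience is exhausted -- engagement feeding
        | even more engagement.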
        
         | NelsonMinar wrote:
         | It's absolutely in Facebook's interest to allow this kind of
          | viral garbage. It feels like engagement and gives people a way
          | to engage socially with seemingly harmless questions. (I'm not at
         | all convinced these are some dastardly data gathering scheme -
         | many of the viral questions don't have meaningful answers.)
         | 
         | It would be so, so simple to stop these. Just de-prioritize
         | posts with too many replies, or replies from people you don't
         | know, or.. anything. The virality of these things sticks out
         | like a sore thumb. Facebook is choosing to not stop them.
         | 
         | Then again as we learned recently Facebook is choosing not to
         | stop all sorts of things on their platform.
         | https://news.ycombinator.com/item?id=28512121
        
         | jaywalk wrote:
         | It has been this way for years, so it's definitely no accident.
         | Posts that elicit more comments absolutely end up getting
         | ranked higher by The Algorithm(tm).
        
         | flatline wrote:
         | They recently started showing friends' response comments first,
         | so you can easily see those of your friends out of the 116,000
         | replies. The rise of these in my feed seemed to correspond with
         | this feature.
         | 
         | Disclaimer, am infrequent FB user and this may have been around
         | for longer than I realize.
        
         | kodah wrote:
         | > The interesting question here is whether Facebook is somehow
         | accidentally amplifying it.
         | 
         | > Certainly it is not in Facebook's interest to allow this kind
         | of data harvesting
         | 
          | Leaked documents quote Mark Zuckerberg as a strong proponent of
          | "engagement" at nearly all costs. I don't think granting
          | Facebook the word "accidental" is appropriate anymore. Their
          | desire for engagement trumps the health of their network. I'll
          | dig through my favorited submissions for the WSJ article.
         | 
         | Edit: That was easy: https://archive.md/GQFLq
         | 
          | People are rarely motivated by evil, but they are motivated by
          | opportunity, and the outcome can be perceived as pure evil by
          | the people it affects most.
        
           | taurath wrote:
           | Engagement is a really dirty word to me nowadays - the
           | attention economy comes with all sorts of really bad side
           | effects. We've turned almost all conversations into an ad in
           | order to sell more ads. It's you vs 100 people with a
           | doctorate in psychology at any given time.
        
             | termau wrote:
              | Agreed. I've actively taken steps to combat it: I only use
              | my phone in black-and-white mode and have removed all apps.
              | I stick to my PC for general browsing, and my phone use has
              | gone down to 30 minutes a day (mostly calls, messages with
              | the wife, and email).
        
         | captainmuon wrote:
         | I always thought that FB is somewhat complicit (not out of
         | evilness necessarily).
         | 
         | I see a lot of "viral" posts - some like those mentioned in the
         | article, but also a ton of odd woodworking, cooking, and "resin
         | art" videos. The videos are quite repetitive and not really
         | interesting so I wonder if they are maybe hidden ads, but they
         | are not marked as such, and it is not clear what they are
         | selling. (Well maybe they are trying to sell resin, which is
         | really expensive.)
         | 
          | Anyway, it seems like there are different kinds of posts on FB.
          | Some stay close to their point of origin, and only rarely get
          | shown to people beyond those who have liked the page or are
          | friends themselves. And there are other posts which, if
          | somebody comments on or interacts with them in any way, get
          | shown to that person's friends and friends-of-friends.
         | 
         | After running a charitable cause / political FB page for a
         | while, I'm convinced that internally there are actually
         | different categories of posts - ones that are shown to
         | followers, and ones that are allowed to float or go viral. I
         | really wonder what the mechanism is to get into the floating
         | category. It doesn't seem to be based on quality, nor on money
         | spent. Maybe it is some interaction metric that somebody
         | learned to game?
        
           | xg15 wrote:
           | > _I see a lot of "viral" posts - some like those mentioned
           | in the article, but also a ton of odd woodworking, cooking,
           | and "resin art" videos. The videos are quite repetitive and
           | not really interesting so I wonder if they are maybe hidden
           | ads, but they are not marked as such, and it is not clear
           | what they are selling._
           | 
           | As someone who got caught up in some of those videos when I
           | was in complete "mindlessly browse facebook" mode, my guess
           | would be they are optimized for "engagement", nothing more,
           | nothing less. They are just interesting enough that you want
           | to know how the end result looks while harmless enough to
           | appeal to a maximally broad audience.
        
             | jrochkind1 wrote:
             | How are the people making the videos making money though,
             | or why are they doing it if not?
        
               | machinerychorus wrote:
               | It's still possible to share things just for the sake of
               | sharing
        
               | jrochkind1 wrote:
               | Have you seen these videos? I agree they look like they
               | have been "optimized for engagement"... I guess someone
               | could be doing that just for fun, that's your theory?
        
           | taurath wrote:
           | Engagement is just second order advertising, because it makes
           | people spend more time on the ad platform.
        
         | NineStarPoint wrote:
         | A quick point I'd make is that it may not be a mistake (from
          | Facebook's perspective) to allow others to exploit a given
         | system as long as they're gaining enough value from it to
         | outweigh that. If whatever is being exploited doubles how much
         | they can charge for ads, they might accept some of their data
         | being stolen until they could find a way to have their cake and
         | eat it too.
        
         | cratermoon wrote:
         | > The interesting question here is whether Facebook is somehow
         | accidentally amplifying it.
         | 
         | Someone somewhere found a way to exploit what FB's engagement
         | metrics do. Is it 'accidental' that FB amplifies things if
         | their system is designed to do exactly what it does when gamed?
        
         | jensensbutton wrote:
         | Isn't the obvious answer to stop scraping (or, at least try)?
         | The author states that the value here is collecting data (not
         | money from ads or something) and Facebook's APIs don't allow
          | for the kind of analysis they'd need to build profiles. The
          | article specifically talks about using Python to scrape
          | profiles (presumably using a logged-in account).
        
       | Gollapalli wrote:
       | Cambridge Analytica style targeted political messaging, and
       | retailoring of political formulae to personality/moral-
       | foundations/IQ profiles, in order to create coalitions is the
       | future of political messaging and activism. In many ways, now
       | that the gameboard has changed so drastically, it's unavoidable.
       | 
       | This is how politics works now. It's not (just) Russia, or China,
       | it's every political activism group or lobby that wants to
       | achieve anything. Welcome to the new age.
        
         | engineer_22 wrote:
         | no different than it used to be, just more sophisticated.
        
           | Gollapalli wrote:
           | I think there is a qualitative difference.
           | 
           | Previous means of influencing politics involved NGO's and
           | political parties actively working different demographics in
           | order to get them to vote in the organization's interest.
           | These organizations may loosely be considered managerial
           | bureaucracies, whether they are labor unions, or activist
           | NGO's, or political parties. Even large scale media campaigns
           | conducted via mass psychology are essentially managerial or
           | bureaucratic in nature, using mass organizations at a large
           | scale.
           | 
           | The new means of manufacturing consent are not in this
           | character. Rather than acting directly on mass groups using
           | mass organizations, they operate by directly targeting
           | individuals and niche groups leveraging algorithms and
            | digital means. It's a different paradigm: mass vs niche, mass
           | media vs targeted media, mass psychology vs individual
           | psychological profiling, large bureaucratic organizations vs
           | smaller technologically enabled teams, the management of
           | people vs the management of algorithms.
           | 
           | It's two different approaches to power, and hence, two
           | different elite groups. And when you have two different elite
           | groups, you have conflict. It's a new world, a revolution in
           | the making.
        
       | jollybean wrote:
       | I wonder if it's even worth pondering the various kinds of
       | dumpster fires that happen there?
       | 
       | It's like we're caught watching a tornado hit a garbage pile ...
       | while the 'exit' sign is clear for all of us to follow if we
       | want.
       | 
       | I think the answer to all questions Facebook-related is 'delete'
       | / 'exit' / 'log off' and then to go ahead to Spotify and listen
        | to some Ahmad Jamal from 40 years ago to put it in context.
        
         | JasonFruit wrote:
         | Off topic: I hadn't listened to Ahmad Jamal in years until a
         | couple months ago, and he was a genius. So much creativity,
         | with no sacrifice of lyricism, and an amazing way with silence.
        
       | throwawaywindev wrote:
       | Those look like phishing for answers to account security
       | questions.
        
       | ergot_vacation wrote:
       | "Oh no the Skinner box we built to exploit and manipulate people
       | is being used to exploit and manipulate people!"
       | 
       | "But why is that-"
       | 
       | "Because OTHER people are doing it!"
       | 
       | "Oh no!"
        
         | sk2020 wrote:
         | That the author is peddling the absurd "Russians hacked muh
         | servers" popular myth makes the comment all the more poignant.
         | 
         | Social networks tolerate fake traffic because it increases
         | their perceived value. The real crime is the fictive usage and
         | engagement metrics they use to set ad pricing.
        
         | Kiro wrote:
         | Facebook is not ohnoing anything here. In fact, I wouldn't be
         | surprised if their algorithm boosts posts encouraging this kind
         | of low-effort engagement.
        
           | jjoonathan wrote:
           | Yeah. Other parties get data, FB gets cold hard cash. I bet
           | they're quite happy with those terms.
        
       | xenihn wrote:
       | Not sure if AsianHustleNetwork (AHN) really falls into what's
       | being described in this article, but it's one of the more
       | insidious FB communities I've run across, in terms of members
       | being milked for affiliate revenue.
       | 
       | The content is overall wholesome and useful, but I'm assuming
        | most members (both contributors and passive viewers/clickers)
       | don't realize that they're lining the owners' pockets with their
       | clickthroughs, along with whatever personal data is being
       | collected through Facebook.
        
       | madrox wrote:
       | This could be more benign than the article makes it out to be.
       | Growing a content business in 2021 requires you to understand
       | Facebook's algorithms and what will get your post amplified.
       | Instagram famously only shows your posts to 10% of your followers
       | by default [1]. The trigger points for your post to reach wider
       | require certain engagement quotas, and if you're designing your
       | post to get more comments then it'll hit them faster. Often
       | accounts do these kinds of posts because their profile is about
        | selling a single thing (like a book they wrote) or because they're
       | dropshipping and focused on the marketing side. The real thing
       | they want to get in front of users is the link in their bio.
       | 
       | I think the real issue here is that it's impossible to tell the
       | benign from the malignant. Is that cute mom blog going to start
       | hawking ivermectin? What is my comment revealing about me that I
       | don't authorize? There's no Better Business Bureau for Facebook
       | pages. Maybe there should be.
       | 
       | 1. https://www.thelovelyescapist.com/2018-instagram-algorithm/
        
       | vidanay wrote:
       | Facebook...Nuke it from orbit.
        
       | aasasd wrote:
       | 1.4 million comments on a single post? Holy crap! I was
       | previously wondering on Reddit, what kind of vapid self-
       | importance compels people to comment in threads that already have
       | over 100-200 comments--when new ones go straight to the bottom
        | and no one sees them afterwards. But this is on the scale of some
       | mental illness, unless I seriously misunderstand something about
       | Facebook comments.
        
         | the_arun wrote:
         | It is mindblowing to see so many people socially active!
        
           | hoten wrote:
           | I wouldn't consider engaging in FB comment threads to be
           | social nor active.
        
         | pkamb wrote:
         | Don't worry - their comment response to the meme page is
         | annoyingly broadcast into the timeline of every one of their
         | friends.
        
         | TrackerFF wrote:
         | From what I see, people tend to tag their friends in these
         | massive threads.
        
         | SPascareli13 wrote:
          | Your friends will see your comment on FB, so there is an
          | incentive to comment even when there are already millions of
          | other comments.
        
         | Jorengarenar wrote:
         | >I was previously wondering on Reddit, what kind of vapid self-
         | importance compels people to comment in threads that already
         | have over 100-200 comments--when new ones go straight to the
         | bottom and no one sees them afterwards.
         | 
          | It works in a similar way here. This post has 145 comments
          | (right now) and yet I'm adding another one.
        
         | strulovich wrote:
         | Facebook shows comments from your friends highlighted in your
         | feed.
         | 
         | So if one of your friends comments with the one millionth
         | comment, you can end up seeing the post and your friend's
          | comment in your feed. So while no one can read all comments -
         | your friends are likely to see yours.
        
           | cmg wrote:
           | I've seen a lot of this recently, both on the types of posts
           | in the article and on the posts of controversial / extremist
           | right-wing politicians like MTG.
        
           | HaloZero wrote:
           | And then y'all can communicate in thread too. So it's a
           | little microcosm.
        
         | jjoonathan wrote:
         | My guess: on FB, it steers your post to your friends, so you
         | aren't talking to the 1.4M so much as chatting with your
         | friends about a meme that 1.4M people are also chatting with
         | their friends about.
         | 
         | Just a guess, though. I don't actually FB.
        
         | dbtc wrote:
         | https://en.wikipedia.org/wiki/Placing_notes_in_the_Western_W...
        
       | CosmicShadow wrote:
        | The same type of people answer these as the people who get an
        | email from Amazon or Home Depot asking a question about a product
        | they bought, like "What are the dimensions?", and answer "I
        | don't know".
        | 
        | And for all time, everyone else is like: WTF, why did you even
        | answer the question if you don't know? It's not like your friend
        | asked you in person. And that is the story of 80% of Q&As on
        | every product. *SIDE RANT OVER*
        
         | ComputerGuru wrote:
         | The emails Amazon sends (or used to send) to randomly selected
         | prior purchasers of a product when there's a new unanswered
         | question have a subject line along the lines of "David is
         | asking you if xxx, can you help him?"
         | 
         | They're deliberately made to look like personal appeals to the
         | individual specifically, and I don't blame people for not
         | understanding that it's disgusting growth/engagement hacking.
        
       | bondarchuk wrote:
       | > _This multi-billion dollar industry has to be getting revenue
       | somewhere else._
       | 
       | Wait, how do they know this is a multi-billion dollar industry in
       | the first place?
        
         | Lammy wrote:
         | https://www.statista.com/statistics/693438/affiliate-marketi...
        
       | badkitty99 wrote:
        | The '90s called and want their chainmail back
        
       | th0ma5 wrote:
       | Probably mapping spread vectors.
        
         | uptownfunk wrote:
         | does that mean trying to understand how "virality" spreads?
        
       | arbuge wrote:
       | Just a note for those of you who were confused like I was upon
       | reading this article: the author seems to be using the term
       | "affiliate networks" in an unusual way - they're calling Facebook
       | pages with some kind of commercial relationship between them
       | "nodes" in an "affiliate network".
       | 
       | The commonly accepted usage is:
       | 
       | https://en.wikipedia.org/wiki/Affiliate_network
        
         | cratermoon wrote:
          | I think the author's description of affiliate networks is
          | completely accurate. The fact that the commercial
         | relationship results in "likes" or "influence" for the
         | affiliates rather than direct monetary compensation from the
         | merchants is just the nature of social media.
        
       | LAC-Tech wrote:
       | 7 Tacos is a lot. Don't let someone you care about eat 7 Tacos,
       | that's just enabling.
        
       | bovermyer wrote:
       | The real question is, how do we discourage interaction with these
       | bait posts on a scale that matters?
        
         | bluGill wrote:
          | Block all the originators of them. I've been doing that for a
          | week or so now, and Facebook is slowly getting better. (First I
          | left all groups myself - Facebook is a terrible way to keep up
          | with your hobbies, as there is no good way to see everything.)
          | Facebook is becoming more and more pictures of real events in
          | my friends' lives - actual social media - and less and less
          | politics, memes, and pictures that were funny the first time 10
          | years ago...
        
         | dqv wrote:
         | Well you can take the approach that one TikToker's mom took:
         | comment on the post and say it's datamining.
        
           | Nition wrote:
           | I once saw an official Facebook blog post about some change
           | to the T&C, with 80,000+ comments on it, most of them one of
           | those meaningless copy-pastes about how they don't consent to
           | Facebook using their data. More comments were appearing every
           | second.
           | 
           | Every 20th post or so, someone would be saying something like
           | "Stop it you idiots, this stupid copy-paste doesn't do
           | anything, you can't declare your rights like this." Then a
           | bunch more copy-paste comments would appear before the next
           | person telling the idiots to stop.
        
           | wutbrodo wrote:
           | Would the segment of users who respond to content-free posts
           | like these have much of a reaction to a comment saying that?
           | 
           | I don't mean that as a snarky dismissal, but a sincere
           | question. I know that plenty of people, especially among
           | digital-natives, have instant negative reactions to being
           | reminded of data collection. But the type of user answering
           | "what was your first car" or "what do you call your
           | grandchildren" do not strike me as having much overlap with
           | the groups that are cynical about social media platforms.
        
             | dqv wrote:
             | I don't really think it's a legitimate or scalable way to
             | inform people. On a smaller scale it could help people in
             | the local social network to be aware of what those posts
             | are doing.
             | 
             | Another thing to do is to tell people "they're trying to
             | use this information to get into your bank account". That
             | will make them stop really fast. But as far as scalability
             | goes, I don't think there's a way.
        
         | Kiro wrote:
         | It's a really interesting question. What makes people so eager
         | to respond? I keep seeing friends and family (especially
         | family, older generation) answering these obvious spam
         | questions all the time.
        
       | jcims wrote:
       | Some other interesting questions they should pose:
       | 
       | - What's your mother's maiden name?
       | 
       | - What street were you born on?
       | 
       | - What was your first car?
       | 
        | - What's your childhood best friend's name?
        
         | ceejayoz wrote:
         | Oh, I've absolutely seen these come up.
         | 
         | "Tag your mother if you love her", "tag your childhood best
         | friend", etc.
        
         | ctvo wrote:
         | "Being an American is a privilege few have. Let's share our
         | American issued social security numbers!" <Over photo of a bald
         | eagle>
        
           | mywittyname wrote:
           | My understanding is that a portion of the SSN is based on the
           | ZIP of your birthplace. I'm sure some clever person could
           | come up with a way to use birth location + a checksum that
           | would reveal an SSN without actually typing it out.
           | 
           | I'm thinking something like, "Add up all the individual
           | numbers of your SSN and figure out what Founding Father you
           | are!" Use some statistics to ensure that lots of people get
           | good ones.
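            | 
            | Purely hypothetical sketch of how such a gimmick could leak
            | a constraint on the SSN (the name list and formula are made
            | up for illustration, not from any real quiz):
            | 
            |     FOUNDERS = ["Washington", "Adams", "Jefferson",
            |                 "Madison", "Hamilton", "Franklin", "Jay"]
            | 
            |     def founding_father(ssn: str) -> str:
            |         digit_sum = sum(int(c) for c in ssn if c.isdigit())
            |         return FOUNDERS[digit_sum % len(FOUNDERS)]
            | 
            |     # Someone commenting "I'm a Hamilton!" has just revealed
            |     # that the digits of their SSN sum to 4 mod 7 -- one more
            |     # constraint to combine with birth place and date.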
        
             | jrwr wrote:
              | Used to, but anyone under the age of 20 will have a random
              | SSN [1]
             | 
             | 1: https://www.ssa.gov/kc/SSAFactSheet--IssuingSSNs.pdf
        
           | robocat wrote:
           | I presume there are a lot of non-Americans with SSNs: I have
           | one from working in the US on a working visa.
        
         | madrox wrote:
         | These do get shared, but usually in meme form where you turn
         | your birthday into your "Werewolf name" and are encouraged to
         | share in the comments. Because you're sharing in an altered,
         | amusing form, you don't stop to consider someone can reverse
         | your birthday from it.
        
           | jrochkind1 wrote:
           | My birthday is literally already on my Facebook "about" page
           | though, and shown to all my friends whenever it comes around,
           | so they can post on my timeline.
        
       | gavin_gee wrote:
        | Isn't this already known as social engineering - data collection
        | by hackers for use in later attacks?
        
         | mikey_p wrote:
          | That's not what the article is about. It points out that
          | passwords for random folks aren't really worth much, and are
          | probably out there on the dark web for purchase anyway, but it
          | raises the idea that psychographic profiles could be built by a
          | third party scraping these comments off public posts.
        
       | captainmuon wrote:
       | I think this is just a conspiracy theory. What's happening is
       | that capitalism sets ridiculous incentives, so people are
       | compelled to set up all these blogs and create these memes to
       | maybe get fractions of cents per interaction. The real scandal is
       | not that the Russians or the Chinese are attacking democracy this
        | way, it is that we are wasting so much productivity on this
       | (both users, and people working on such campaigns).
       | 
       | PS: If the described tactic really works, I gotta try it out in
       | order to take over the world.
        
       | tarkin2 wrote:
       | So, there's a network of popular accounts that are posting
       | questions and harvesting the comments to psychologically analyse
       | Facebook users and later politically target them and their social
       | networks with disinformation that's tailored to their
        | psychological grouping? That's what I gleaned.
        
       | yabones wrote:
       | My speculation: The exact same thing that happens with reddit
       | accounts used for astroturfing... People will 'build up' a
       | profile over the course of several months, reposting popular
        | posts from months/years ago, etc. They're actually quite easy to
        | spot: an account with high post karma and low comment karma is
        | usually one of these. Then, when it's nice and ripe, they will
       | sell it to a troll farm which uses it to push a particular
       | agenda. We saw this happen quite a bit in 2015-2016, and again
       | starting in early 2020 (though it never really stopped).
       | 
       | But, in this case, the product is the 'network' rather than
       | individual accounts. Something that appears this 'organic' and
       | 'homegrown' is a very valuable tool for a widespread disinfo
       | campaign.
       | 
       | Or, it could simply be the magician gang that makes viral posts
       | of gross food. https://www.huffpost.com/entry/creators-
       | countertop-spaghetti...
        
         | [deleted]
        
         | bink wrote:
         | I've never understood this logic. Very few subs limit posts by
         | karma and those that do certainly don't require the insane
         | amounts of karma that these bots are farming.
         | 
         | I know I don't go back through someone's post history before
         | voting on their comments and I don't really care about their
         | aggregate karma values.
        
           | MattGaiser wrote:
           | It just takes one person to do that though and say "you made
           | a new account for this you troll?" before everyone gets it.
        
             | josefx wrote:
             | That would work if the moderators of various subs weren't
             | part of it, quite a few subs will hand out temp. bans when
             | you "accuse someone of shilling". Even if the account was
             | created the day a story went public, didn't post anything
             | until weeks later when the story hit reddit and never
             | backed up its attempts to discredit one side of the
             | conflict.
             | 
             | Some fun oriented subs will however happily ban spam bots
             | that just automate imgur reposts if you point one out early
             | enough.
        
           | cogman10 wrote:
           | Before the advent of the current crop of social media, online
           | forums would post your post count and sign up dates. Posts
           | from someone that has a brand new account and low post count
           | were HIGHLY scrutinized. In some cases, needing mod approval
           | before becoming visible.
           | 
           | My assumption is troll farms are buying accounts with karma
           | to try and do an end around such a system. It wouldn't be
           | hard on reddit/hn/or other vote based social media locations
           | to pay extra scrutiny, even automated, against brand new
           | accounts. By using established accounts, it makes astroturf
           | detection harder to do. Now every account is potentially an
           | astroturfer.
        
           | yabones wrote:
           | Indeed, there's more to it than that. I haven't gone in-
           | depth, so this is my very pedestrian understanding...
           | 
           | The very secretive spam filter has cut-outs for 'high value'
           | accounts - this isn't really documented formally, but it's
           | pretty obvious that posting limits are essentially
            | nonexistent for 1M+ users, either by design or because
           | they're well known to the mods...
           | 
           | The value of the high-karma accounts is that they're much
           | more likely to be accepted for moderator applications. Get
           | enough mods on a default sub, and you basically control the
           | universe. That's very difficult, so the much easier way is to
           | create legit-looking fringe subreddits with names like
           | "newstoday" or "politicallyuncensorednews". Get enough of
           | your smurf accounts to upvote those, and you can get to
           | rising for /all. Get enough real bites and you might even get
           | it to the front page.
           | 
           | I haven't really looked into this stuff for a few years,
           | because it's frankly depressing. So my understanding will be
           | a little off what the most recent networks are doing.
        
         | spoonjim wrote:
         | Is there a difference in your post's reach on Reddit if you
         | have 1,000 karma vs. 1,000,000?
        
           | oconnor663 wrote:
           | I wouldn't be surprised if some of their shadowban logic was
           | more lenient on accounts that had a lot of karma, to avoid
           | high-profile embarrassing mistakes? Just a guess.
        
             | lpcvoid wrote:
             | I don't see how banning a high karma account would be
             | embarrassing? I didn't even know that people cared about
             | karma on Reddit.
        
               | oconnor663 wrote:
               | Just that high karma accounts are marginally likelier to
               | be subreddit mods or whatever? Again just a baseless
               | guess on my part.
        
             | Lammy wrote:
             | Luckily for Reddit it can only be highly embarrassing if
             | people actually notice and raise a stink, which
             | shadowbanning is designed to avoid. The shadowbanned user
             | may just naturally get sick of the lack of interaction and
             | leave the site, like this top /r/worldnews moderator with
             | 14mil Karma who has been shadowbanned since July last year:
             | https://old.reddit.com/user/maxwellhill
        
               | mywittyname wrote:
               | It would be interesting if their shadow-ban logic
               | actually gave users upvotes at random for posts.
        
         | Nition wrote:
         | Questions like this one from the article are also very common
         | on /r/AskReddit:
         | 
         | > Without naming the state you are from, what is it famous for?
         | 
         | Hard to tell if that's intentional data gathering or just
         | someone innocently copying a common data-gathering question
         | format from Facebook though.
        
           | 0x4d464d48 wrote:
           | Fuck zodiac signs.
           | 
           | Tell me what kind of car you plan to buy next!
        
             | platz wrote:
             | A hovercraft
        
           | akersten wrote:
            | That kind of thing is a gold mine for data harvesting. It can
            | be automated: for each commenter whose reply reveals their
            | state, go through their commenting history and map their
            | opinions on whatever you care to search for, state by state.
            | That's way cheaper and faster than phone surveys, and it also
            | reaches a far more valuable slice of the population than
            | those who would pick up the phone to answer a survey. And
            | it's probably the least nefarious thing someone could do with
            | that data.
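            | 
            | A rough sketch of that loop, assuming the thread and each
            | author's history have already been scraped into plain Python
            | structures; the hint table, keywords, and names are all
            | made-up illustrations, not anyone's actual pipeline:
            | 
            |   # thread_comments: (author, text) pairs from the ask-thread
            |   # history: author -> list of that author's other comments
            |   STATE_HINTS = {          # "famous for" answer -> state
            |       "mardi gras": "Louisiana",
            |       "space needle": "Washington",
            |       "the alamo": "Texas",
            |   }
            |   TOPIC_KEYWORDS = ("immigration", "gun control", "taxes")
            | 
            |   def guess_state(text):
            |       # The question bans naming the state, but the "famous
            |       # for" answers map back to states almost one-to-one.
            |       lowered = text.lower()
            |       for hint, state in STATE_HINTS.items():
            |           if hint in lowered:
            |               return state
            |       return None
            | 
            |   def opinions_by_state(thread_comments, history):
            |       tally = {}
            |       for author, text in thread_comments:
            |           state = guess_state(text)
            |           if state is None:
            |               continue
            |           counts = tally.setdefault(
            |               state, {k: 0 for k in TOPIC_KEYWORDS})
            |           for old in history.get(author, []):
            |               low = old.lower()
            |               for kw in TOPIC_KEYWORDS:
            |                   if kw in low:
            |                       counts[kw] += 1
            |       return tally
            | 
            | Swap the keyword counter for a real classifier and you get
            | the cheap state-by-state survey described above.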
        
             | giantrobot wrote:
             | Or just look them up on SnoopSnoo, it'll pull in those
             | things and give you a decent estimate of their personal
             | details.
        
           | cratermoon wrote:
            | The generic "without directly revealing PII, provide some
            | detail which ML algorithms can use to determine the PII with
            | a high degree of accuracy" is pretty clearly a product of
            | intentional data gathering. The fact that it might get copied
            | and spread organically is just a bonus.
           | 
           | This also goes for "the first letter of your last name plus
           | the date of your mother's birthday are your <pop culture
           | tag>".
           | 
           | Oh and all the cutesy little image processing tools like
           | "what would you look like older/younger/as a different
           | gender/if you were a cartoon" are there to train facial
           | recognition algorithms.
           | 
           | Yet even supposedly sophisticated people fall for these.
        
             | cinntaile wrote:
             | Do you have any actual proof that this is used for data
             | mining and not because it's an engaging, fun question?
        
               | decremental wrote:
               | I think this misses the point. Even if what he said has
               | never happened, your mind should be trained to never
               | engage with things like that to begin with. The very
               | first thing that should occur to you is "I'm about to
               | upload a picture of my face on the internet. That's not a
               | good idea. I won't do that." Same with entering personal
               | information.
        
               | _jal wrote:
               | There are multiple "the point"s.
               | 
               | Yes, teach your kid to be careful with their data.
               | 
                | But also, how many outfits are actually doing this, or
                | is it currently a theoretical concern? If people are
                | doing this, what are their goals? Are they meeting them?
               | 
               | Are there less obvious examples of the same thing?
               | 
               | I'm sure you can keep going with more points.
        
               | decremental wrote:
                | If you just never engage, then whether or not the threat
                | is merely theoretical, you'll never have to worry either
                | way. This isn't a "citation needed" kind of issue. "Do
                | you have a source for it not being a good idea to
                | provide personal information to strangers?" Bizarre.
        
               | [deleted]
        
               | handoflixue wrote:
               | Why should I care, though?
               | 
               | "What state are you from" seems like pretty innocuous
               | information. I'll readily mention that I'm from Seattle
               | if it's relevant to the conversation or asked directly,
               | and I tend to think of myself as being on the more
               | paranoid / pro-privacy side of things.
               | 
               | All the big players already have my address because I
               | gave it to them, and a stalker should be able to work
               | that information out pretty easily because I post in the
               | Seattle sub-reddit and otherwise engage a lot with
               | Seattle topics.
        
               | edoceo wrote:
               | > seems like pretty innocuous
               | 
               | That's how they get you. It's a trap!
        
               | handoflixue wrote:
               | Yeah, but... what actually IS the trap? What bad thing
               | happens because of this?
        
               | AlexAndScripts wrote:
                | You'll happily mention it, and a human can find it by
                | looking through your profile, but that's much harder for
                | ML than going through the comments on a post where
                | everyone answers the same question in a similar format,
                | each saying something about their state.
        
               | handoflixue wrote:
               | Yeah, but... what's bad about that? Why should I be
               | opposed to ML knowing I'm from Seattle?
        
               | 1123581321 wrote:
               | The most commonly known proof that such tactics are used
               | is that Cambridge Analytica used social quizzes and games
               | to gather data.
               | https://www.politico.eu/article/cambridge-analytica-
               | facebook...
        
               | cinntaile wrote:
                | I would argue that's quite different from some
                | influencer asking what your state is famous for without
                | you naming it. Quizzes and games deliver data in a much
                | easier format than this. It's just a trick to engage
                | people: people like to laugh at stereotypes, so they
                | happily oblige. Analyzing all that unlabeled text to
                | figure out what state someone is from seems hardly worth
                | the effort; it's not as if that's valuable information.
        
               | 1123581321 wrote:
               | That may be. I'd expect these tactics to evolve, but like
               | most people I'll only know for sure if it blows up in a
               | big scandal.
        
       | GoodJokes wrote:
       | Delete Facebook. By god. How many articles do we need to write
       | to communicate the same sentiment?
        
         | spoonjim wrote:
         | "Delete Facebook" is like saying "Delete drugs." The whole
         | reason Facebook is dangerous is because its draw on the human
         | psyche is stronger than that of the average person to resist,
         | and that the people who do engage with it can harm the people
         | who don't. There needs to be a national social media policy the
         | way there is a national drug policy. We need to have the
         | conversation about where we want to be as a country from "full
         | ban" to "fully legal," just like people recognize that a
         | country's drug policy can be somewhere between Singapore's and
         | Portugal's.
        
           | boppo wrote:
           | >national drug policy
           | 
           | I'm not sure that's less harmful to society than no policy.
           | It may be causing more people to use.
        
         | [deleted]
        
         | [deleted]
        
         | parksy wrote:
         | Already done. My parents are still on there, as it's the best
         | way to connect with people they went to school with 50 years
         | ago. As far as I can tell, Facebook is burning through the
         | younger age groups and is going to end up a digital retirement
         | home; it's already well on the way.
        
         | wutbrodo wrote:
         | This article is more than just "Facebook is bad". It proposes
         | a specific phenomenon the author thinks they've noticed, and
         | digs into what may be behind it.
        
         | macintux wrote:
         | My democracy is being destroyed by Facebook. What good did
         | dropping it do me?
        
         | jcims wrote:
         | I just signed up for an account last year. I unfollowed
         | everyone about six months later. Now I pop on every few days
         | and either see something I'm tagged in or it says 'you're all
         | caught up'. Sometimes it throws an error which is rewarding.
         | 
         | This seems to avoid most of the garbage on Facebook, but I can
         | still use it to contact people or hit the marketplace.
        
       | TheSockStealer wrote:
       | Some of these questions are similar to the ones you would see in
       | identity verification challenges: what was your first car, your
       | pet's name, the city you were born in. I am not saying this is
       | the "only" answer, but it could be one of them.
        
         | wodenokoto wrote:
         | The article directly addresses this point and claims that the
         | password angle is worthless, while somehow also claiming that
         | the data they can scrape about you after you interact with
         | them is where the money is made.
        
       ___________________________________________________________________
       (page generated 2021-09-27 23:00 UTC)