[HN Gopher] FTC Fines Twitter $150M for Using 2FA Phone Numbers ...
       ___________________________________________________________________
        
       FTC Fines Twitter $150M for Using 2FA Phone Numbers for Ad
       Targeting
        
       Author : averysmallbird
       Score  : 438 points
       Date   : 2022-05-25 21:32 UTC (1 hour ago)
        
 (HTM) web link (www.ftc.gov)
 (TXT) w3m dump (www.ftc.gov)
        
       | [deleted]
        
       | octagons wrote:
       | I don't have the most optimistic outlook for this having any
       | impact, but I really hope this sets a precedent for limiting the
       | use of dark patterns with which companies try to tie your
       | identity to a phone number. I think the total sum for this fine
       | is rather myopic: it ignores the long tail of possible future
       | data leaks and the impact it might have on the people behind the
       | affected accounts.
       | 
       | I created my current Twitter account a few years ago and it
       | remained dormant for a while. It was flagged as "in violation of
       | our policies" despite having not made any tweets or using a
       | handle or nickname that would cause offense to anyone. In order
       | to resolve this, I had to enter my phone number to "secure" my
       | account. I don't know what process triggered this review, but
       | I'll be damned if it didn't smell like an easy way to associate
       | an existing marketing profile with my Twitter account. Of course,
       | it's vitally important to profile a service I used to keep up
       | with industry news and post about Goban puzzles.
       | 
       | I've also run into similar patterns on Discord and similar
       | platforms; "Oops! Something suspicious is happening with the
       | account [you literally just created]. Please add a phone number
       | to your profile to proceed."
       | 
       | Although I follow a reasonable set of practices around
       | identity/password management, I usually architect my risk profile
       | with a "I don't care if I lose this account" approach. If that
       | statement isn't true, then I will happily apply all of the
       | security measures available. However, it seems like the idea of
       | creating "I don't care" accounts is becoming increasingly
       | difficult as we continue to invest in user marketing analytics
       | and lower the barrier of entry to these types of technologies
       | that do not have the consumer's best interests in mind.
        
         | pixl97 wrote:
         | >I created my current Twitter account a few years ago and it
         | remained dormant for a while. It was flagged as "in violation
         | of our policies"
         | 
         | Same here, linked it to PSN to get images off my PS4 and it was
         | flagged before I could do anything.
         | 
         | Never did add my number, and shortly after that they had a
         | leak where any hacker could figure out your number.
        
       | dbg31415 wrote:
       | I hate all the different ways companies target people.
       | 
       | I recently booked a flight on American Airlines for my 80+
       | year-old father. I requested the golf cart to take him between
       | gates.
       | 
       | Immediately I got a call from "American Airlines Health Alert."
       | 
       | They made it sound like there was an issue with the booking...
       | "An important health alert related to your flight." And there was
       | a "Press 1, if you're over 50" option.
       | 
       | Anyway long story short it was some shady marketing company
       | selling me a panic button in case of falls.
       | 
       | The lady was like, "these are very expensive devices"... "we'll
       | give you the device... but you pay a small fee for monitoring
       | every month."
       | 
       | Clearly she'd given the pitch 1,000 times. Didn't give me any
       | time to talk. Finally, I was like, "Hey is there a problem with
       | my Dad's flight, or are you just trying to sell me something?"
       | And she hung up on me.
       | 
       | Fuck American Airlines. Fuck all the airlines really, but it
       | should be illegal to target the elderly just because they asked
       | for help with connecting flights.
        
       | mrkramer wrote:
       | I remember I got scared this might happen when Epic introduced
       | 2FA for claiming free games[0]. FTC, check Epic Games too.
       | 
       | [0] https://www.pcgamer.com/uk/for-a-while-epic-games-store-
       | will...
        
       | jazzythom wrote:
        
       | linuxhansl wrote:
       | Some weeks ago I wanted to deactivate my Twitter account. I
       | hadn't used it for a while, and it claimed that my account was
       | locked. Nothing was sent from it in many months, so it wasn't
       | clear why/how it would be locked now.
       | 
       | For some reason you cannot deactivate your account when it is
       | locked.
       | 
       | So I contacted Twitter stating that, as an EU citizen (which is
       | true), I hereby demand all data about me that Twitter or its
       | subsidiaries might have, including account data, be deleted
       | under the GDPR... or alternatively that my account be unlocked
       | so that I could deactivate it.
       | 
       | They were actually pretty responsive. My account was unlocked 30
       | minutes later and I was able to deactivate it.
        
       | heavyset_go wrote:
       | Guarantee they're doing the same thing with phone numbers used to
       | verify accounts, as well. I'm not talking about the blue check
       | mark verification, but the verification they impose upon new
       | accounts to prove that you're "real" and not a bot.
        
       | brailsafe wrote:
       | I appreciate the security of 2FA, but I don't like the liability
       | and I don't like being required to have my phone at all times.
       | Just one of my gripes with the world.
        
         | sedatk wrote:
         | I propose multiple YubiKeys for this. Unlike TOTP, it's not
         | susceptible to phishing, and you can keep Nano keys inserted in
         | your USB ports that you regularly use. You don't need your
         | phone or anything most of the time.
        
       | lucb1e wrote:
       | > [Twitter] agreed to an order that became final in 2011 that
       | would impose substantial financial penalties if it further
       | misrepresented "the extent to which [Twitter] maintains and
       | protects the security, privacy, confidentiality, or integrity of
       | any nonpublic consumer information."
       | 
       | They violated that order and that's what the fine is for.
       | 
       | I was wondering what kind of authority the FTC has to impose
       | fines based on what as a European I'd consider a GDPR violation
       | (in the USA, this california privacy act thing sounds like it
       | would be the nearest thing, but that's not federal so that
       | couldn't be it). But what was this order about? Clicking the
       | reference in the article:
       | 
       | > The FTC's complaint against Twitter charges that serious lapses
       | in the company's data security allowed hackers to obtain
       | unauthorized administrative control of Twitter, including access
       | to non-public user information, tweets that consumers had
       | designated private, and the ability to send out phony tweets from
       | any account including those belonging to then-President-elect
       | Barack Obama and Fox News, among others.
       | 
       | > Under the terms of the settlement, Twitter will be barred for
       | 20 years from misleading consumers about the extent to which it
       | protects the security, privacy, and confidentiality of nonpublic
       | consumer information
       | 
       | So this wasn't about privacy initially, the FTC's attention came
       | from allowing some public figures' accounts to be hacked, after
       | which it imposed some broad set of requirements, which are broad
       | enough to now include this privacy issue. Not a bad outcome, but
       | interesting turn of events to get the FTC to act as data
       | protection authority.
        
       | staunch wrote:
       | At first I thought the fine sounded excessive, but after thinking
       | about it, it seems far too low. I'd like to know who was
       | specifically responsible for this scam.
       | 
       | Did Jack Dorsey implement and endorse this scam?
        
         | autoexec wrote:
         | There are a lot of details I don't see about this, even in the
         | order itself. How did the FTC know Twitter was abusing this
         | data? Was there a whistleblower who notified them, or did they
         | break down the doors and start scanning Twitter's internal
         | documents? Were they authorized to dig into Twitter's internal
         | processes as part of the initial security investigation?
        
       | MiddleEndian wrote:
       | Good, but it should be 10x that amount.
        
         | lelandfe wrote:
         | The FTC really ought to take a leaf out of GDPR's book, and
         | start fining truly punitive amounts:
         | https://www.tessian.com/blog/biggest-gdpr-fines-2020/#:~:tex...
         | 
         | $150M for a repeat offense affecting millions of users is
         | paltry.
        
           | transcriptase wrote:
           | Just don't take the leaf titled "rarely if ever enforce
           | anything".
        
       | xbar wrote:
       | Clownish. If I were the CEO, some folks would have already been
       | fired.
        
         | rsstack wrote:
         | Several people _were_ fired recently. We don't know why, so
         | maybe. But probably not.
        
         | neighbour wrote:
         | Personally I think the CEO would have known about this
         | happening and turned a blind eye until it became an issue. I
         | have nothing to back this up though. Just a pessimistic take on
         | corporate culture.
        
         | annoyingnoob wrote:
         | Would you fire yourself?
        
           | mrkramer wrote:
           | That's called resignation. Yeah, you would do it if you
           | respected your company and your users. Call in someone more
           | mature.
        
             | zaroth wrote:
             | Clearly anyone with the ethical chops to consider resigning
             | over this would be a net-loss to Twitter if in fact they
             | resigned. Is this some sort of named paradox?
        
               | bogwog wrote:
               | > would be a net-loss to Twitter
               | 
               | Would it? It seems to me like unethical behavior is
               | always more profitable.
        
             | [deleted]
        
       | gareth_untether wrote:
       | I really can't believe companies are still doing this with
       | people's data. Insane that this is still a thing companies abuse.
        
       | oblio wrote:
       | We need to turn data into a liability.
       | 
       | There's a reason many places work on a "need to know" basis.
        
         | ProAm wrote:
         | data is a liability regardless
        
       | techsupporter wrote:
       | This is an interesting part to me: "[T]he new order[0] adds more
       | provisions to protect consumers in the future: ... Twitter must
       | provide multi-factor authentication options that don't require
       | people to provide a phone number."
       | 
       | I would like to see this be a more broad-based rule. No, I am not
       | moved by "SMS is easy" or "getting a number that can receive SMS
       | is harder for scammers to do in bulk." If you must, give users
       | the choice but not the _obligation_ to hand over a mobile number.
       | 
       | 0 - https://www.ftc.gov/legal-library/browse/cases-
       | proceedings/2...
        
         | gabereiser wrote:
         | To further expand on this. 2FA should not rely on SMS at all.
         | It should be an option but not the default one. An
         | Authenticator app should be the default. I know we assume
         | everyone has a cell phone but that's not the case.
        
           | ChrisMarshallNY wrote:
           | I just ordered something online, last night, and it had two
           | required fields:
           | 
           | 1) Mobile Phone (landline is not allowed)
           | 
           | 2) The phone number/address needs to be the same as for the
           | card.
           | 
           | I don't use a mobile phone for the card. I use my landline,
           | so I entered that.
        
           | fron wrote:
           | WebAuthn should be the default
        
           | autoexec wrote:
           | Authenticator apps aren't much better. Look at their privacy
           | policies. Installing Microsoft Authenticator means giving
           | them your location data 24/7 and allows them to collect even
           | more data on you than giving Twitter your phone number did.
           | Do you really think they aren't going to use that data for
           | anything else? I don't believe that any more than I believed
           | Twitter.
           | 
           | Personally, I'd rather deal with the hassle of carrying
           | around multiple hardware tokens than give companies a
           | continuous stream of data about my personal life to use
           | against me.
        
             | koolba wrote:
             | Why does an Authenticator app even have location access?
             | Geoblocking?
        
               | vel0city wrote:
               | Exactly. IIRC you can do policies related to locations.
               | It's an optional feature; you don't need to enable it,
               | and the app will overall work just fine.
        
             | __turbobrew__ wrote:
             | FreeOTP works just fine for me
        
             | C4K3 wrote:
             | You don't have to use microsoft authenticator. TOTP is a
             | big step up from SMS and most/good apps won't violate your
             | privacy.
        
             | viraptor wrote:
             | I believe GP meant authenticator app like authy, duo, or
             | any other TOTP. You're not giving anyone your location by
             | using that.
        
             | hoppyhoppy2 wrote:
             | There are free, open-source, and privacy-respecting options
             | for TOTP 2FA that don't require a mobile phone plan.
             | 
             | You can use something like KeepassXC (desktop) or something
             | like KeepassDX or Aegis (on F-Droid on Android) for your
             | OTP authentication app to manage 2FA for Google, Amazon,
             | eBay, Dropbox, etc. and there are other options as well.
        
             | bogwog wrote:
             | Afaik, TOTP is standardized, so you should be able to use
             | any authenticator app for 2FA. Idk about Microsoft, but I
             | haven't encountered any service that doesn't allow you to
             | bring your own TOTP app.
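As noted above, TOTP (RFC 6238) is an open standard, so any compliant app produces the same codes from the same shared secret. A minimal sketch using only the Python standard library (the secret below is the RFC's own test value, not anything real):

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, period=30, digits=6, now=None):
    """Minimal RFC 6238 TOTP (HMAC-SHA1), the scheme authenticator apps share."""
    pad = "=" * (-len(secret_b32) % 8)
    key = base64.b32decode(secret_b32.upper() + pad)
    # The moving factor is the number of `period`-second steps since the epoch.
    counter = int((time.time() if now is None else now) // period)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


# RFC 6238 test vector: ASCII secret "12345678901234567890" at T=59
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", now=59))  # -> 287082
```

Because both sides only need the shared secret and a clock, any app implementing this loop interoperates, which is why services generally can't tell (or restrict) which authenticator you use.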
        
               | yurishimo wrote:
               | I have. I worked for an enterprise that used OneLogin
               | and could only use the OneLogin Protect app for 2FA. I
               | thought 1Password was broken, but I tried a different
               | app with my phone camera and it said the QR code was
               | invalid.
        
             | andrewmackrodt wrote:
             | Are you using Microsoft Authenticator in a corporate
             | environment/profile? I just checked my personal install
             | (Android) and it does not require any permissions (location
             | is denied).
        
             | WillPostForFood wrote:
             | So don't use Microsoft Authenticator. There are many
             | options without the privacy problems with the MS App
             | (which, IMO are overblown, but whatever). Go run your own
             | if you want to be absolutely private. I'm happy with
             | 1Password for managing it.
             | 
             | http://www.nongnu.org/oath-toolkit/oathtool.1.html
        
         | gigel82 wrote:
         | I was overseas and my provider (Cricket) doesn't have roaming
         | so I usually pick up a cheap prepaid SIM locally.
         | 
         | I didn't enable 2FA on Uber but it insisted on sending me a
         | code via SMS (of course, to my inaccessible US number). That
         | was incredibly stupid and shortsighted. Meanwhile, all services
         | that were set up for Authenticator MFA worked just fine over
         | the European carrier's LTE.
        
         | li2uR3ce wrote:
         | Also, SMS is not nearly reliable enough. You should have
         | alternatives for that reason alone. My cell carrier was
         | blocking many SMS verification messages for a good two months.
         | It caused me all kinds of problems when my credit union merged
         | with another and I had to change account numbers all over the
         | place. Many had the option of using an email address, but there
         | were quite a few that it was SMS or play find the human on the
         | 800 number.
        
           | prirun wrote:
           | Unrelated to Twitter, but your post reminded me that
           | consumers should also have an independent "account number,"
           | for lack of a better word, that belongs to the user, like a
           | telephone number. Electronic payments would come out of this
           | personal
           | account number and be forwarded to whatever institution(s)
           | the person wants. Then changing banks would be as easy as
           | changing phone carriers.
        
         | latchkey wrote:
         | They already provide these options today.
        
           | falcolas wrote:
           | It's a precedent. This isn't just about Twitter; there are
           | many who do not offer such options.
        
             | zaroth wrote:
             | I don't think it has any applicability to anyone beyond
             | Twitter.
             | 
             | Maybe it's a precedent that the FTC will tell you to add
             | non-SMS 2-factor if you are misusing the SMS factor for
             | advertising, but that's a pretty limited precedent!
        
       | karatinversion wrote:
       | For context, Twitter's revenue in 2021 was $5 billion, on which
       | they made a loss of $220 million.
        
         | tpmx wrote:
         | Which is bizarre, in itself.
        
         | JohnJamesRambo wrote:
         | What a great buy...
         | 
         | I saw a tweet the other day that said they can't think of a
         | worse purchase since Bank of America bought Countrywide for $40
         | billion.
         | 
         | TWTR has traded flat since its inception in one of the greatest
         | bull markets of all time.
        
           | missedthecue wrote:
           | HP bought Autonomy which turned out to be a total fraud.
        
         | sitkack wrote:
         | When I think twitter, I think of a service that costs 5.2B to
         | run.
        
           | l33t2328 wrote:
           | There are some tongue-in-cheek answers, but actually how does
           | it cost that much to run?
           | 
           | That's a ton of money for a website that is very text heavy
           | with short/low quality videos and a largely fixed feature
           | set.
        
             | viraptor wrote:
             | People. Salaries are expensive, especially SV salaries.
             | Then you need well paid management for the extra heads, and
             | group manager for them. IIRC (feel free to correct me, I'm
             | not going to dig it out again) they spent >$1B on R&D
             | itself... which is pretty much just a couple hundred
             | engineers who (judging from the service changes recently)
             | mostly did nothing.
        
           | tpmx wrote:
           | I can't even fathom how it's possible to use $5B/yr to run
           | Twitter.
           | 
           | So, I co-architected the Opera Mini infrastructure. It peaked
           | at a similar number of users (250-300M active users). Sure,
           | Twitter is _much_ more DB-intensive, but transcoding web
           | pages is pretty CPU intensive too, and typically we
           | transcoded every single web page for them.
           | 
           | Twitter is spending $5B/300M =~ $17/user per year
           | 
           | I believe that from public sources, it's now possible to
           | deduce that we spent less than 1/100th of that per
           | user/year.
           | 
           | Since we didn't have crazy money, we optimized things at
           | _every_ step.
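As a sanity check, the back-of-envelope arithmetic above works out like this (the figures are the comment's rough estimates, not audited numbers):

```python
# Rough per-user cost estimate using the figures quoted above.
annual_spend = 5_000_000_000   # ~$5B/yr total Twitter spend
active_users = 300_000_000     # upper end of the 250-300M estimate
per_user = annual_spend / active_users
print(round(per_user, 2))        # ~16.67, i.e. roughly $17/user/year

# "less than 1/100th of that" would put the claimed Opera Mini
# figure somewhere under:
print(round(per_user / 100, 2))  # ~0.17, i.e. under $0.17/user/year
```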
        
             | nikanj wrote:
             | People forget the key word "premature" in the infamous
             | Donald Knuth quote, and think all optimization is evil
        
               | pessimizer wrote:
               | It's always premature to optimize until your company is
               | failing, because if you aren't failing yet it means it's
               | worked so far. You should always wait until your company
               | is falling apart to do a full rewrite of your core
               | product.
        
               | rmbyrro wrote:
               | Exactly. When the business is failing is when there's
               | lots of time and resources available to make your core
               | product more efficient.
        
               | StillBored wrote:
               | Is not doing stupid s%$t an optimization? (lol)
               | 
               | I'm reminded of the reddit articles a few years back when
               | they were talking about moving to AWS and having to batch
               | database requests to maintain their database perf.
               | Apparently, at the time they were literally sending
               | tens/hundreds of thousands of database queries for each
               | page load request for a logged in user because each
               | comment was collecting all its up/down votes and adding
               | them up rather than just having a per comment karma
               | attached to the comment.
               | 
               | This is what happens when you hire a whole bunch of
               | recent grads that frankly have no idea how to write code,
               | and think they are the smartest people on the planet when
               | it comes to distributed systems.
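The fix described above is classic denormalization: maintain a per-comment karma counter on the write path instead of summing every vote row on each page load. A hypothetical sketch with SQLite (the schema and names are invented for illustration):

```python
import sqlite3

# Hypothetical schema: a denormalized karma counter lives on the comment row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE comments (id INTEGER PRIMARY KEY, karma INTEGER NOT NULL DEFAULT 0);
CREATE TABLE votes (comment_id INTEGER, delta INTEGER);  -- delta is +1 or -1
""")


def cast_vote(comment_id, delta):
    # The write path does slightly more work: log the vote AND bump the counter.
    conn.execute("INSERT INTO votes VALUES (?, ?)", (comment_id, delta))
    conn.execute("UPDATE comments SET karma = karma + ? WHERE id = ?",
                 (delta, comment_id))


conn.execute("INSERT INTO comments (id) VALUES (1)")
cast_vote(1, +1)
cast_vote(1, +1)
cast_vote(1, -1)

# The read path is now a single indexed lookup, not a SUM over every vote row.
karma, = conn.execute("SELECT karma FROM comments WHERE id = 1").fetchone()
print(karma)  # -> 1
```

The trade-off is the usual one: writes get a bit more expensive and the counter can drift if updates are missed, but reads (the hot path for a page render) drop from O(votes) to O(1).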
        
             | hguant wrote:
             | AWS instances don't pay for themselves
             | 
             | /s...mostly
        
             | LegitShady wrote:
             | According to their financial disclosures for 2021: On
             | income of ~$5.08B, they spent
             | 
             | - $1.8B as "cost of revenue" (costs incurred to run their
             | business),
             | 
             | - $1.25B on "research and development",
             | 
             | - $1.2B on "Sales and Marketing", and
             | 
             | - ~$1.3B together on "general and administrative"
             | (overhead) and "litigation settlement".
             | 
             | Then there are a bunch of small monies related to interest
             | expense and income, etc etc.
             | 
             | They're spending huge amounts of money and could be
             | profitable if they really wanted. I can't imagine what
             | Twitter is doing with 1.25B in research. Elon could make
             | Twitter profitable simply by cutting their research
             | department.
        
               | williamsmj wrote:
               | "Research and development" is engineering.
        
               | tlrobinson wrote:
               | > I can't imagine what Twitter is doing with 1.25B in
               | research.
               | 
               | ...and development. I assume "R&D" includes most of the
               | engineers?
               | 
               | Still, it could be a lot more lean for sure.
        
               | StillBored wrote:
               | Well presumably that 1.25B is mostly engineering salaries
               | since its "research and development" (for tax purposes
               | where they probably get huge write-offs).
               | 
               | Anyway, they need that org to get their 1.8B in "cost of
               | revenue" down, which is presumably mostly the cost of
               | massive server farms to store what are mostly text
               | messages. Although these days, with all the machine
               | learning/etc. to sell ads, they probably "need" all that
               | hardware to run their models and can't just optimize it
               | down to a higher perf system painting web pages.
        
       | AdvertisingMan wrote:
        
       | pessimizer wrote:
       | This is surprisingly reasonable. I would like to see a
       | decisionmaker do some time for fraud, though. They locked people
       | out of their accounts and demanded phone numbers for
       | "safeguarding," then used them for targeting in direct
       | contravention of a previously negotiated agreement with the FTC.
       | If that doesn't rise to criminality, the fraud statutes need to
       | be updated.
       | 
       | edit: they should also be required to dump the phone numbers
       | (even to be recollected later, without the deception), but I
       | didn't see that in the article. Are they being allowed to keep
       | the proceeds of a crime?
        
         | blamestross wrote:
         | One fundamental purpose of a "company" is to be an abstraction
         | that obfuscates moral responsibility for individuals' actions.
        
         | rmbyrro wrote:
         | It says they cannot use the data commercially, only for the
         | stated purposes (security, recovery).
         | 
         | In practice, it'll be hard to enforce, though.
        
           | piva00 wrote:
           | Why not increase the punishment by having random audits, like
           | the government does for drug checks? And make the company
           | pay; it would be an even bigger deterrent if it's not just a
           | fine...
        
         | itsoktocry wrote:
         | > _They locked people out of their accounts and demanded phone
         | numbers for "safeguarding," then used them for targeting in
         | direct contravention of a previously negotiated agreement with
         | the FTC. If that doesn't rise to criminality, the fraud
         | statutes need to be updated._
         | 
         | It's bad, I agree.
         | 
         | But _jail_? That should be reserved for the most heinous crimes
         | and criminals.
        
           | stormbrew wrote:
           | I'm basically a prison abolitionist but i don't really see
           | corporate fines as any real kind of justice at all either,
           | for big or small things. This is just putting a price tag on
           | the behaviour.
        
         | colechristensen wrote:
         | First you have to establish who goes to jail, corporations are
         | able to avoid this by having vague structures of shifting blame
         | so a jury can't decide if any particular individual is actually
         | at fault.
         | 
         | There probably should be laws establishing ultimately
         | responsible people with the unenviable duty of being
         | responsible for illegal things corporations do (sort of like an
         | engineer signing off on the design of a bridge), but doubtful
         | such a thing will happen.
         | 
         | We're left then with personal responsibility being limited to
         | people stupid enough to leave pretty explicit records of
         | nefarious intent to commit crimes.
        
           | xanaxagoras wrote:
           | Pick a C-suite exec or VP at random then. "Nobody, it's too
           | hard to unravel the organizational structure" isn't really
           | cutting it.
        
             | verisimilidude wrote:
             | That sounds kinda like the Mafia or Yakuza. Take the fall,
             | do time, protect the organization, get respect, get
             | promoted. Many people would gladly do a few years in
             | minimum security prison in exchange for million dollar
             | salaries, etc.
             | 
             | While I do think that would be better than nothing, it
             | could create its own set of bad incentives.
        
           | Beltalowda wrote:
           | I don't think it has to be that hard; you just need to
           | require that communication is preserved for companies over a
           | certain size, e.g. meeting minutes, emails, etc. This is
           | already the case for financial records and some employment
           | records, and the case with politics ("but her emails!").
           | 
           | This way, a record can be subpoenaed if needed.
           | 
           | Don't keep records or don't have records of this particular
           | decision? The person responsible for making sure the records
           | are kept for that department will be in trouble.
           | 
           | There is some administrative "red tape" here, but it's not
           | that bad, and much of these records already exist (or
           | existed).
           | 
           | The problem is the political will to enact such a law; I
           | agree that's not likely to happen.
        
             | colechristensen wrote:
             | "You have to keep a record of what happened during all
             | employee interactions so that we can prosecute you some
             | day" isn't exactly a likely-to-succeed plan. It's already
             | prevalent to coach employees not to leave records of
             | certain legally contentious topics.
        
               | oceanplexian wrote:
               | I worked for a certain rainforest company and they
               | specifically coached us on not leaving records or
               | discussing certain subjects in any form of written
               | communication.
        
               | sodality2 wrote:
               | > isn't exactly a likely-to-succeed plan
               | 
               | It doesn't have to gain popularity, be well-liked, or be
               | agreed with by the companies. If it is law, they must
               | follow it.
        
           | adamc wrote:
           | All decision-makers should go to jail in such cases. Then
           | they would work harder on making blame clear.
        
             | colechristensen wrote:
             | "All" is a vague term.
             | 
             | So jail everyone at Twitter who isn't an individual
             | contributor and is even tangentially involved?
             | 
             | This is how you escape consequences as an organization:
             | obfuscation. Make what you've done complex enough that it's
             | too difficult for a jury to decide who is responsible for
             | what, prosecutors won't be convinced they have a case and
             | will decline to pursue the matter.
        
           | RcouF1uZ4gsC wrote:
           | I think the CEO should personally be on the hook for
           | widespread organizational fraud, and in some cases should be
           | held criminally responsible.
        
             | StillBored wrote:
             | Yeah, some personal risk might justify some of the salary
             | package. Sure, you're going to make $50M a year, but you
             | might have to sit in jail for 20 years. Sounds fair.
        
           | dylan604 wrote:
           | Meh, that's why the CEO makes the big monies. "The buck stops
           | here" kind of thing. Charge the CEO. Make it the CEO's
           | problem to prove they were not responsible only by giving up
           | the person that was. I do believe sometimes CEOs are not
           | fully aware of what happens below them in the org tree, but
           | they are accountable for their people. If the CEO can't
           | handle that, then they shouldn't be accepting the roles.
           | Clearly, this has to be understood as part of the job
            | description.
        
             | J-Kuhn wrote:
             | Here is a fictional timeline of events:
             | 
             | * Problem: Spammers automate creation of accounts.
             | Solution: Reuse the MFA infrastructure as some kind of
             | "CAPTCHA". The phone number is not stored.
             | 
             | * Problem: Spammers use a single phone number to unlock
             | 1000's of accounts. Solution: Store the phone number - so
              | that kind of misuse can be detected.
             | 
             | * Problem: Ads-Team wants to sell more targeted ads.
             | Solution: There is possibly a phone number stored in the
             | user profile, use that.
             | 
             | Who is to blame here? The Ads team that didn't check if the
             | number can be used?
        
               | pornel wrote:
                | Yes! There are now privacy laws that explicitly
                | require you to check whether the user has given
                | consent for that use of the data.
        
               | lkschubert8 wrote:
               | How about the lowest common leader?
        
               | jeffparsons wrote:
               | Solution #2 modified to be safer:
               | 
               | > Solution: Store A HASH OF the phone number - so those
               | kinds of misuse can be detected.
               | 
               | If you don't need to store PII verbatim, don't store it
               | verbatim.
               | 
               | > Who is to blame here? The Ads team that didn't check if
               | the number can be used?
               | 
               | Yes. 100% yes. It's insane that we've normalized the idea
               | that if you can physically get your hands on some data
               | then that means you're allowed to do whatever you want
               | with it. Anyone even remotely responsible working in
               | advertising should be tracking provenance of the data
               | they're using. I've heard all sorts of excuses about why
               | this isn't practical, but with each year that passes I
               | find them less convincing, and I've finally reached the
               | point where I reject those excuses outright. If you don't
               | _know_ you're allowed to use some PII for marketing, then
               | you _can not_ use it for marketing. It's that simple.
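
       A note on the "store a hash" fix above: phone numbers have so
       little entropy that a plain hash can be brute-forced over the
       entire number space, so the usual mitigation is a keyed hash (an
       HMAC with a server-side secret, a "pepper"). A minimal Python
       sketch; the function name, pepper value, and sample numbers are
       illustrative assumptions, not anything Twitter actually does:

```python
import hashlib
import hmac

def phone_fingerprint(phone: str, pepper: bytes) -> str:
    """Keyed hash of a normalized phone number.

    No raw PII is stored, but the same number always maps to the same
    fingerprint, so reuse across thousands of accounts is still
    detectable. The secret pepper is what stops offline brute force:
    without it, hashing every possible 10-digit number is trivial.
    """
    normalized = "".join(ch for ch in phone if ch.isdigit())
    return hmac.new(pepper, normalized.encode(), hashlib.sha256).hexdigest()

# Hypothetical pepper; in practice this would live in a secrets
# manager, never next to the user database.
PEPPER = b"server-side-secret"

# Formatting differences disappear after normalization:
assert phone_fingerprint("+1 (555) 010-0001", PEPPER) == \
       phone_fingerprint("1 555 010 0001", PEPPER)
```

       In the fictional timeline above, the ads team could never have
       repurposed such a column, because a fingerprint is useless for
       targeting.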
        
           | pessimizer wrote:
            | Somehow they don't have to figure that out with felony
            | murder: everyone who participates and is aware is liable
            | to the same punishment. Why not in crimes of bureaucracy?
           | Why make sure people who are just following orders are free
           | from punishment?
        
             | colechristensen wrote:
             | Because killing someone is usually pretty explicit in the
             | obviousness of a crime being committed.
             | 
             | Filling out forms, designing product features, and
             | implementing them can have each individual contributor
             | mostly ignorant that anything could possibly be wrong with
             | the request and the few people who do have some idea only
             | have a small one which is plausibly deniable. The person
             | who does get caught in those circumstances is usually just
             | a scapegoat anyway.
        
               | pessimizer wrote:
               | That's why you investigate. But awareness should be
               | enough. And if we start to have trouble proving awareness
               | (maybe employees aren't aware of a settlement), just
               | require in settlements that employees _are_ informed.
               | 
               | For felony murder, you don't have to know that the person
               | you're with is armed, intends to kill, or if you're
               | driving the getaway car you don't even have to know that
                | they _have killed anyone at all._ You're participating
               | in something that you're expected to know is wrong, and
               | you're punished for anything that results from the entire
               | event. If this were like felony murder, the engineers
               | that implemented it (assuming awareness) would be as
               | liable for the $150 million as anyone else involved.
        
               | thaumasiotes wrote:
               | > Because killing someone is usually pretty explicit in
               | the obviousness of a crime being committed.
               | 
               | > Filling out forms, designing product features, and
               | implementing them can have each individual contributor
               | mostly ignorant that anything could possibly be wrong
               | with the request
               | 
               | That's got nothing to do with felony murder. Felony
               | murder occurs when you participate in a crime with
               | someone else and they accidentally kill someone while you
               | can't see them.
        
           | strangattractor wrote:
            | Hope they keep an eye on Truth Social, which requires your
            | phone number even to use it.
        
           | [deleted]
        
       | wanderr wrote:
       | A fine is a cost. It's quite possible that Twitter made more than
       | $150m in doing this.
        
         | missedthecue wrote:
         | I don't think Twitter makes money at all.
        
           | zaroth wrote:
           | They make ~$5B a year. They just spend all of it and then
           | some.
        
             | [deleted]
        
           | nicce wrote:
           | The loss would have been bigger tho.
        
         | bpodgursky wrote:
         | I truly doubt this was a calculated tradeoff.
         | 
         | It was almost certainly a fuckup where the phone # was
         | mistakenly stored in a shared schema, and someone on the ads
         | side saw it and decided to use it for targeting, knowing
         | nothing about 2FA or how it got there. This probably only
         | affects a tiny fraction of their users.
        
           | ziddoap wrote:
           | > _I truly doubt this was a calculated tradeoff._
           | 
           | Potentially, sure.
           | 
           | > _It was almost certainly a fuckup where the phone # was
           | mistakenly stored in a shared schema, and someone on the ads
           | side saw it and decided to use it_
           | 
           | How is this an "almost certainly"? Do you have additional
           | information you'd care to share on why you think so? If this
           | were the case, it would point to _insanely_ sloppy policies,
           | procedures, and implementations.
           | 
           | > _This probably only affects a tiny fraction of their
           | users._
           | 
           | Why?
        
             | bpodgursky wrote:
             | > This probably only affects a tiny fraction of their
             | users.
             | 
             | Because most users provided their phone number when signing
             | up, not just when setting up 2FA. Twitter has always been
             | phone-centric (the first app was literally just sending SMS
             | messages).
        
       | ntoskrnl wrote:
       | Sigh. Yep. Don't ever give a company your phone number for 2FA.
        | It's insecure anyway due to SIM swapping. Stick to FIDO (e.g.
        | a YubiKey) or TOTP (e.g. Google Authenticator).
        
         | davesque wrote:
          | Yep, it's almost more accurate to describe using phone
          | numbers for 2FA as being _anti_-secure, not just
          | _in_secure. That's because it's effectively no better than
          | having no 2FA, and it's possibly even _harder_ to detect
          | when your account has been compromised by a SIM swap. And
          | many companies that use phone numbers for 2FA also allow
          | resetting one's password via that phone number. It's really
          | just a tragedy that companies do this, rather like when
          | login screens prevent copy/paste.
         | 
         | If you're ever prompted to add a phone number to your account
         | on some web service for "extra security", just click "remind me
         | later" or "skip" as many times as possible.
        
         | minsc_and_boo wrote:
         | Or just get a phone and service provider who doesn't allow SIM
         | swapping (e.g. Google Fi, etc.), since many more services only
         | do 2FA with SMS than allow hardware authentication.
        
         | skybrian wrote:
         | You probably want both, as well as printing out some backup
         | codes, to avoid the risk of getting locked out when something
         | breaks.
        
           | sedatk wrote:
            | Unlike FIDO/U2F, TOTP is susceptible to phishing. Getting
            | locked out is a serious problem, and should probably be
            | addressed with printed recovery codes.
        
         | RamRodification wrote:
         | Any clear reason to go for Google for TOTP? As opposed to Authy
         | or something else.
        
           | jeromegv wrote:
           | No, it's all the same.
        
           | ntoskrnl wrote:
           | Those were examples I thought people were likely to
           | recognize, not vendor recommendations. I edited for clarity.
        
           | encryptluks2 wrote:
            | You do realize that TOTP is a standard that doesn't
            | require you to use either, and that you can use the same
            | secret with any TOTP app, right?
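
       That interchangeability is easy to demonstrate: RFC 6238 TOTP is
       just an HMAC-SHA1 over a 30-second counter derived from Unix
       time, so any compliant app fed the same base32 secret produces
       the same codes. A minimal Python sketch of the algorithm (not
       any particular app's implementation):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, at=None, step=30, digits=6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 of the current time-step counter."""
    key = base64.b32decode(secret_b32.upper())
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: the ASCII secret
# "12345678901234567890" (base32 below) at T=59 yields 287082.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59))  # -> 287082
```

       Because the shared secret is all that matters, you can scan the
       same QR code into two different apps and they will agree on
       every code, forever.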
        
           | regecks wrote:
           | There is a clear reason not to use Authy, which is that
           | making your data portable is extremely annoying. No export
           | function. I ended up writing a 3rd party Authy client just to
           | get my TOTP keys out.
           | 
           | For iOS users, I cannot say enough good things about
           | https://apps.apple.com/us/app/otp-auth/id659877384. Author is
           | responsive, encrypted backups, portable data format.
        
             | nicce wrote:
              | Thanks. I have been looking for a replacement for Authy
              | for quite some time because of the missing export
              | function.
        
             | Macha wrote:
              | For Android users, Aegis provides many of these
              | benefits (as does andOTP). Both are open source; I like
              | Aegis a bit better.
        
           | davis wrote:
            | There's actually a very good reason not to use Google
            | Authenticator.
           | 
           | They don't offer any backups (at least on iOS) and as a
           | result, if you lose your phone, you are hosed. Google
            | Authenticator also doesn't back up its data to iCloud
            | like other apps do. I also just assume at this point no
            | one owns
           | that app and that it'll never get backups because that's how
           | Google operates.
           | 
           | I've seen multiple people lose their TOTP codes this way and
           | have been locked out of their accounts. Or even the more
           | simple case, they buy a new phone, restore from backup and
            | just assume everything is peachy, then send their old
            | phone back, and don't realize the problem until they open
            | the app for the first time.
           | 
           | Use something with cloud backups for your safety.
        
             | bogwog wrote:
             | I got scared as hell a few years ago when an update bricked
             | the app, so launching it caused it to immediately crash.
             | Fortunately, reinstalling the app fixed it without losing
             | any data.
             | 
             | But since then I started actually backing up my recovery
             | codes, and whenever I create a new account somewhere, I set
             | up 2FA on three separate apps on my phone _just in case_.
        
             | nicce wrote:
              | Authy does not let you make a local backup (export), and
              | it is fully closed source, so not really transparent. I
              | wish there were better alternatives.
        
               | PausGreat wrote:
                | Raivo OTP on iOS
        
             | SoftTalker wrote:
             | They offer one-time backup codes that can be used if a
             | device is lost. I'm not sure if this is Google or the site,
             | but for every login where I've set up Google Authenticator
             | I have copied the backup codes to my password manager for
             | that account. I'd agree that a lot of people might not do
             | that however.
        
               | Dylan16807 wrote:
               | Google the site has backup codes for logging in to
               | Google. Completely unrelated to the data in Google
               | Authenticator.
               | 
               | Google Authenticator used to have no way to get the data
               | out, but does now have an export. It still has no normal
               | backups.
        
             | jve wrote:
              | Luckily, services make it clear that you should print
              | backup codes in case you lose your 2FA device.
              | 
              | As for the Android Google Authenticator: there is an
              | export function that generates a QR code for all
              | tokens. You can't screenshot it, but you can take a
              | photo with a different device and print that.
        
             | 8organicbits wrote:
             | I suppose you're talking about automatic backups, but at
             | least on Android you can manually "transfer accounts" to
             | export to another device over QR codes.
        
               | saurik wrote:
               | On iOS, I do see this feature, but it claims it will
                | _move_ the account, as opposed to "copy" it, and so, if
               | it is a backup mechanism, they are explicitly pretending
               | it isn't one.
        
               | MarioMan wrote:
               | Although it is primarily designed and labeled as an
               | export mechanism, I can verify that it does work as a
               | backup mechanism. I regularly use it to sync up new 2FA
               | accounts to a backup phone. Simply choose to keep the
               | accounts after exporting.
        
         | jazzythom wrote:
            | Actually, it's more secure to use an NFC YubiKey with
            | their app for TOTP than Google Authenticator, because the
            | key is in the YubiKey's enclave rather than the phone's.
        
           | drivers99 wrote:
           | I just started using that and would recommend it. When you
           | set it up you add each key you own from the same QR code.
        
           | cheeze wrote:
           | The tradeoff is usability though. I can have a TOTP code
           | stored on two separate phones in two separate locations
           | versus needing a yubikey always present.
           | 
           | To me, I'm too forgetful and dumb to not lose a yubikey, but
           | I manage to not lose my phone.
        
         | rvz wrote:
         | Exactly. I did tell them many times before [0], [1].
         | 
          | They just won't listen. So give them a fine instead; that
          | will make them listen.
          | 
          | The second _'wake up call'_ after the last one I've seen
          | today: [2]
         | 
         | [0] https://news.ycombinator.com/item?id=29264937
         | 
         | [1] https://news.ycombinator.com/item?id=30010434
         | 
         | [2] https://news.ycombinator.com/item?id=31510868
        
         | iotku wrote:
          | Yeah, I'm extra upset about this because I would have
          | chosen TOTP if I had been given the option, but only SMS
          | authentication was available for 2FA for the longest time
          | (until it became such a big issue with account takeovers,
          | including Jack's I believe, that they had no choice but to
          | change that).
        
       | pinewurst wrote:
        | Didn't Facebook do something similar without any apparent
        | consequences?
        
       ___________________________________________________________________
       (page generated 2022-05-25 23:00 UTC)