[HN Gopher] Hacking Grindr Accounts with Copy and Paste
___________________________________________________________________
 
Hacking Grindr Accounts with Copy and Paste
 
Author : snowwolf
Score  : 111 points
Date   : 2020-10-02 21:30 UTC (1 hour ago)
 
(HTM) web link (www.troyhunt.com)
(TXT) w3m dump (www.troyhunt.com)
 
  | vmception wrote:
  | fuck _"responsible disclosure"_
  |
  | the outcome of this runaround was that grindr stated they will
  | create a bug bounty program
  |
  | proving once again that the "market based bug bounty program"
  | has better aligned incentives and gets results: vulnerabilities
  | that should have been fixed to begin with were fixed.
 
  | yolomusk wrote:
  | Gives new meaning to backdooring on Grindr.
 
  | sebmellen wrote:
  | Wow, password reset tokens returned directly in-browser; that's
  | hard to believe. I wonder how long this had been going on?
 
  | trhway wrote:
  | someone designed and implemented it. Would be interesting to
  | know the rationale and their train of thought leading to that.
 
  | djaque wrote:
  | "It's friday and I want to go home" :)
  |
  | But seriously, I'm sure it was made by people who just didn't
  | stop to think about security: "it works, so we're done here".
  | Then, as a business, you're not going to try and fix it if the
  | software already works. That would be pure cost.
 
  | rickyc091 wrote:
  | If I had to guess, the developer used that to debug the reset
  | token, to QA whether the flow worked; then it was forgotten and
  | slipped past code review because the team just left an LGTM
  | without actually looking at the code, or there were too many
  | changes in the PR.
 
  | hn_throwaway_99 wrote:
  | I mean, I don't think it's that hard to surmise how something
  | like this could have happened. Yes, the bug is egregiously bad,
  | but I don't think it's likely the developer purposely designed
  | it to work like that.
  | Some simple possibilities: (a) perhaps the page was originally
  | intended only to be accessible from a user hitting it via a
  | private link sent to their email address (i.e. how normal
  | password resets work), or (b) the API in use was designed to
  | only be accessed server-side, but it was inadvertently proxied
  | through to a client-side call.
  |
  | Again, yes, the bug is very bad. Software is complex, and
  | humans are humans, and it's not difficult to imagine how these
  | bugs occur.
 
  | trhway wrote:
  | >it's not difficult to imagine how these bugs occur
  |
  | try to imagine this - a security hole easily providing the
  | ability to run any executable, with the POC naturally using
  | notepad.exe, was deemed not a problem and downgraded to the
  | bottom of the priority list - maybe some future release, and
  | definitely not a stopper for the current one - by the product
  | team consisting of the PMs, architects, senior engineers, etc.,
  | on the grounds that notepad.exe can't do much damage, if any.
 
  | kbenson wrote:
  | I don't think the developer designed it to work that way
  | either, but something like this only happens when the person
  | creating it, or the people touching it after, either don't know
  | how important this interaction flow is or don't take it
  | seriously enough.
  |
  | Whatever API this is using, there's _zero_ reason to show any
  | information about the request other than "we didn't die, so it
  | should have succeeded". Beyond even showing it, there's zero
  | reason for a password reset API to respond to the request with
  | the secret at all. If it needs to return anything identifying
  | about the request, it should be some identifier that is NOT the
  | secret, which can be used to pull up general info about the
  | request later if needed (time generated, whether it was used,
  | is it expired, etc). Extra points if the access credentials to
  | any back-end API the request page uses _can't_ even request the
  | secret key for a given request.
  | A sane API makes it very hard for something like this to
  | happen. Often that takes an inversion of thinking: instead of
  | making an API as useful as possible and returning as much data
  | as efficiently as possible, you make it as secure as possible,
  | which means returning as little data as possible to satisfy the
  | specific needs of the use case, with different locked-down
  | credentials for specific use cases.
 
  | hiharryhere wrote:
  | That's why doing pen tests on the regular is necessary. We
  | shouldn't rely on humans getting it right every time, or on
  | strangers on the internet reporting it.
 
  | Nextgrid wrote:
  | From experience I've noticed that a lot of developers don't
  | look at the big picture and don't have a full understanding of
  | how the system works, what the rationale is behind how the
  | feature achieves its objective, and how it might be abused by a
  | malicious user. The #1 thing that I think about when I'm
  | looking at some code or feature (and recommend others do the
  | same) is how malformed or intentionally malicious input would
  | break it, but it seems like their developers clearly didn't do
  | so.
  |
  | This is also compounded by the drive to artificially complicate
  | software stacks (microservices, etc) and "silo" developers into
  | their own little bubbles where they only work on a small aspect
  | of the system and never have a need (nor the mental capacity -
  | due to intentionally complicated stacks with dozens of
  | microservices in various languages) to look at the big picture.
 
  | djaque wrote:
  | I guess the good news is that it requires knowledge of the
  | user's email address to execute. You can't just run it on
  | random people (emails aren't disclosed) and even if you know
  | someone on the app in real life, chances are good that they use
  | a personal address that you won't have.
  |
  | Still a pretty bad vulnerability, and pretty awful that Grindr
  | was ignoring it.
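The reset flow kbenson describes above - email the secret, return only an opaque identifier - can be sketched in a few lines. This is a minimal illustration, not Grindr's actual code; the names (`RESET_REQUESTS`, `OUTBOX`, the URL) are all assumptions:

```python
import hashlib
import secrets

# In-memory stand-ins for a database and an outgoing-email service
# (hypothetical; a real system would use persistent storage).
RESET_REQUESTS = {}  # request_id -> sha256 hash of the reset token
OUTBOX = []          # (recipient, reset link) pairs we "sent"

def request_password_reset(email: str) -> dict:
    """Handle a reset request without ever returning the secret.

    The token travels only via email; the API response carries an
    opaque request id that cannot be used to reset the password.
    """
    token = secrets.token_urlsafe(32)
    request_id = secrets.token_urlsafe(16)
    # Store only a hash of the token, so even a leaked database (or
    # a read-only back-end credential) yields no usable reset links.
    RESET_REQUESTS[request_id] = hashlib.sha256(token.encode()).hexdigest()
    OUTBOX.append((email, f"https://example.com/reset?token={token}"))
    # The response is identical whether or not the account exists,
    # and contains nothing secret - only an id for later lookups
    # (time generated, whether it was used, is it expired, etc).
    return {"request_id": request_id,
            "message": "If that email exists, we've sent a reset link."}
```

The key property is that the JSON the browser sees never contains the token, so a Grindr-style leak is structurally impossible rather than merely avoided.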
  | jdminhbg wrote:
  | > even if you know someone on the app, chances are good that
  | they use a personal address that you won't have
  |
  | I doubt that; I bet most users use whatever Gmail/etc personal
  | address they use for other non-work accounts.
 
  | perardi wrote:
  | Extremely anecdotally: it's [person_name]@gmail.com
  |
  | I know of very few friends who go through the process of
  | creating a burner email account to sign up for Grindr. Now,
  | maybe that's different in other countries, but at least in the
  | States, I would bet good money you can guess their Gmail
  | address.
 
  | johnday wrote:
  | In the case of Gmail accounts you could simply append +grindr
  | (or any other tag) to the user part of the email address to get
  | something (relatively) unguessable.
 
  | sebmellen wrote:
  | Imagine someone running their contact list through this. You
  | could find everyone you know on Grindr right away, and snoop on
  | their conversations and read their personal info...
  |
  | Not only that, but emails are very easy to find these days with
  | tools like apollo.io.
 
  | nickff wrote:
  | It would be very easy to target a large group of individuals at
  | a given organization.
 
  | djaque wrote:
  | Good point; even just being able to use it as a tool to play
  | "gay or not" has some pretty awful implications for people who
  | aren't openly gay.
 
  | staplor wrote:
  | Yes, but even if the referenced security risk is patched you
  | would still be able to find out whether someone has an account,
  | since a password reset page will tell you if it has
  | successfully sent an email to an account.
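The plus-addressing trick mentioned above works because Gmail routes tagged variants to the same inbox, so `user+grindr@gmail.com` receives mail normally but won't match an attacker guessing `user@gmail.com`. A small sketch of the normalization (covering only Gmail's documented rules; the function name is ours):

```python
def gmail_canonical(address: str) -> str:
    """Collapse a Gmail address to the inbox that actually receives it.

    Gmail delivers user+anything@gmail.com to user@gmail.com and
    ignores dots in the local part; other providers are left alone.
    """
    local, domain = address.lower().rsplit("@", 1)
    if domain in ("gmail.com", "googlemail.com"):
        # Drop any "+tag" suffix and remove dots from the local part.
        local = local.split("+", 1)[0].replace(".", "")
    return f"{local}@{domain}"
```

Note the flip side: a service that normalized addresses this way would defeat the trick, so the protection relies on sites storing the tagged address verbatim.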
  | Nextgrid wrote:
  | A good password reset page would not disclose such a fact (it
  | would return a successful response with a message like "if this
  | email exists, we'll email you" regardless of whether it
  | actually exists). However, attempting to create an account
  | would disclose that fact by rejecting an account creation
  | attempt with an existing email, unless they use emails purely
  | as communication channels and accounts are uniquely identified
  | by username/account number instead.
 
  | offtop5 wrote:
  | Considering Egypt is using apps like this to persecute LGBT
  | people, this is absolutely horrifying.
  |
  | I'm so glad I've gone social media free; all the big players in
  | this space have shown repeatedly that they don't care about the
  | safety of their users. Grindr was already caught sharing HIV
  | status information with 3rd parties. Eventually these horrible
  | companies will be regulated, but tons of people are going to be
  | harmed before that happens.
 
  | ve55 wrote:
  | It would be nice if this level of negligence and incompetence
  | were somehow punished so that it stopped happening so often
 
  | brundolf wrote:
  | This will continue to happen as long as companies aren't given
  | any reason to care. The incentives simply don't work out, and I
  | highly doubt the market will ever change that at this point.
 
  | 8bitsrule wrote:
  | That 'bug' is _so_ stupid and elementary that I'm disinclined
  | to think it's a bug. If they had _any_ security people, it'd
  | never have existed. So... they just don't _give a shit_.
  | Surprise?
 
  | ojosilva wrote:
  | > Hey, do you have a Grindr account?
  |
  | > Lol
  |
  | I can understand this is most probably a private lol by a
  | surprised friend. But how about we at least stop making these
  | "are you gay? Lol!" exchanges a public moment worth
  | screenshotting?
  |
  | An Ashley Madison data leak is a national embarrassment whereas
  | a Grindr one, a "national security threat" [1].
  | Being on AM is just a vaudevillian indiscretion; being on
  | Grindr is "bro lol" that feeds hate and wrecks lives.
  |
  | [1]
  | https://www.theverge.com/interface/2019/3/28/18285274/grindr...
 
  | adatavizguy wrote:
  | In places they are using Grindr and other apps to target and
  | arrest people. [0] Worse than what the phrase 'wrecks lives'
  | connotes.
  |
  | [0] https://www.independent.co.uk/news/world/middle-
  | east/egypt-l...
 
  | Kye wrote:
  | Grindr, honey, this is not what I had in mind when I said I was
  | looking to get boned.
 
  | perardi wrote:
  | OK, I know it's easy to say "well of course it's not safe,
  | don't send nudes and don't go on sketchy hookups". But, to
  | paraphrase Drag Race: men are rotted gila monsters. _(I'm a gay
  | male, I can say that. Also I speak from experience. I've seen
  | things you people wouldn't believe.)_
  |
  | So, as a thought exercise, how do you make an app like this
  | more secure? Harm reduction is the name of the game. What are
  | the best practices for this? Is it 2FA? Is it encryption keys
  | linked to one device? Is it copying principles from Signal? Is
  | it just having competent developers?
 
  | sebmellen wrote:
  | Uh... One part of it is not returning password reset tokens in
  | the browser. If you know remotely anything about web security,
  | this is the most glaring security flaw you could ever
  | encounter.
  |
  | Other steps are nice to think about, but ensuring basic
  | security measures would preempt 99% of data breaches and
  | "hacks".
 
  | brundolf wrote:
  | The most appalling part is that this was a dedicated endpoint,
  | named "password-reset". This wasn't some negligent leak, some
  | misconfigured logger. It was done this way _on purpose_.
  | Somebody thought this was a _good idea_. And nobody else saw it
  | and thought to question it! It reveals gross institutional
  | incompetence that probably should have been filtered out at the
  | hiring stage.
 
  | perardi wrote:
  | Yeah, in this particular case, they were just glaringly stupid.
  | Just gaming out ideas in my head. I have friends from rather
  | more repressive countries, namely China, where being gay is
  | still a grey area in terms of legality and acceptance, and I'm
  | just thinking of better ways to structure a system.
 
  | ghostbrainalpha wrote:
  | Could you explain why that's bad for someone who knows nothing
  | about security?
  |
  | Where should the password reset token be?
 
  | dewey wrote:
  | In an email sent to the address linked to the account.
 
  | Jtsummers wrote:
  | It should've been sent via email to the registered email
  | address. That lets the account owner reject it ("I didn't
  | request a password reset!") or use it.
 
  | sebmellen wrote:
  | The token should only be accessible to the user requesting the
  | password reset, meaning that it would be sent via email (this
  | is the standard password reset flow).
  |
  | The flaw here is that anyone, even if they did not control the
  | email of the user, could reset the password, because the reset
  | token was returned in the browser, where anyone could see it.
  | Essentially, just by knowing someone's email (not having
  | control over it), you could reset their password.
 
  | darepublic wrote:
  | A startup I worked for had this exact same security issue. I
  | brought it up to the tech lead/CEO but they were in denial
  | about it. Handrolled password reset by dummies, basically.
 
  | 3pt14159 wrote:
  | Didn't ytcracker work for Grindr?
  |
  | It's a hard thing to Google, but I follow him on Twitter and I
  | thought that was the case. If so, this is a hilarious event for
  | some other rapper to dunk on.
 
  | Nextgrid wrote:
  | > we believe we addressed the issue before it was exploited by
  | any malicious parties
  |
  | I wonder how they are sure of this.
  | In their logs, there would be no difference between a
  | legitimate password reset and a malicious one. Even a
  | legitimate flow would result in an initial request from some IP
  | address; then, when the user receives the email with the reset
  | link, they will most likely click on it from the same computer,
  | so the same IP address shows up in the logs. In the case of a
  | malicious attempt, the same pattern would be seen - there is no
  | way for them to know whether the user obtained the reset token
  | from the email (as they should) or directly from the password
  | reset endpoint itself.
 
  | godelski wrote:
  | Not saying they did, but couldn't you make an estimate by
  | looking at the frequency of resets by single accounts? If
  | someone took over an active account, presumably that person
  | would reset the password to get back in (and have a weird
  | email). ASSUMING Grindr logs the person out of the app when the
  | password is reset.
  |
  | You might also have a few emails from users...
 
  | Nextgrid wrote:
  | This would detect a large-scale attack, but wouldn't detect
  | small-scale, targeted attacks, as they would just get lost in
  | the noise of legitimate password resets.
  |
  | Furthermore, for dormant accounts (where the user is no longer
  | using the app - potentially because they are now in a
  | relationship) the user will not notice anything either, and the
  | notification email is likely to get lost in the endless
  | newsletter spam the non-technical majority has in their inbox.
 
  | godelski wrote:
  | I think this is a good point. I'll admit that I'm naive about
  | web and security (not my area). Are multiple password resets
  | within a small time frame common? I would not expect this to be
  | common, but user behavior has often defied my expectations. If
  | it is uncommon, I think you could create a correlation and get
  | an estimate; if it is common, then I completely agree that it
  | would be lost in the noise.
  | And yeah, I agree that this type of analysis wouldn't help with
  | dormant accounts, and it also requires them to log the user out
  | on their phone (otherwise why issue another reset?). But both
  | of these could be captured. This is probably _way_ too much
  | analysis for such an attack and over-engineering the issue, but
  | hey, that's what we all do, right? haha
 
  | Nextgrid wrote:
  | An increased volume of password resets would indeed suggest an
  | attack, though it can also be explained by benign reasons
  | (redesign of the app, a marketing campaign prompting previous
  | users to log back in, news exposure, the pandemic increasing
  | loneliness and making more people use dating apps, etc).
  |
  | However, the biggest risk here is that small, targeted attacks
  | distributed over time (where a single attacker only targets a
  | handful of accounts) wouldn't stand out in the overall
  | statistics.
  |
  | In the case of this incident, small-scale attacks (where a
  | single person targets a single account of someone they don't
  | like) are actually more likely, which is why them saying they
  | do not believe this was exploited - while being completely
  | unable to detect these attacks - is so misleading and lures
  | people into a false sense of security.
 
  | empiko wrote:
  | They didn't say they are sure of it. They said they believe
  | it :)
 
  | sebmellen wrote:
  | $50 says they're not. This is something every organization has
  | to say for PR reasons, but saying "we believe" is very fishy
  | wording. It could well be that this bug had been around for
  | months before it was discovered, and used by many
  | black/grey-hat hackers.
 
  | grayfaced wrote:
  | I was disturbed by that statement as well. It's pure PR spin
  | based on turning a blind eye.
  |
  | They could detect mass malicious activity if a single IP was
  | resetting thousands of accounts. But I'm skeptical they even
  | checked, based on the horrible initial flaw and specious
  | response.
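The bulk-detection idea discussed above (a single IP resetting thousands of accounts) is straightforward to sketch; the limitation the thread points out is equally visible in it. The log schema here is hypothetical, not Grindr's:

```python
from collections import Counter
from datetime import datetime, timedelta, timezone

def flag_reset_spikes(reset_log, window=timedelta(hours=24), threshold=20):
    """Flag source IPs with an unusually high number of password-reset
    requests inside a recent window.

    reset_log is assumed to be an iterable of (timestamp, source_ip)
    tuples. This only surfaces bulk abuse: a one-off targeted reset
    produces the same log pattern as a legitimate one and will never
    cross the threshold.
    """
    cutoff = datetime.now(timezone.utc) - window
    # Count recent requests per source IP and keep only the outliers.
    counts = Counter(ip for ts, ip in reset_log if ts >= cutoff)
    return {ip: n for ip, n in counts.items() if n >= threshold}
```

Tuning `threshold` against the benign spikes mentioned above (marketing campaigns, news exposure) is the hard part; the detector itself is trivial.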
  | erichurkman wrote:
  | Back when I reported a Grindr security flaw (2016), I couldn't
  | find them on any of the bounty sites, security@grindr.com
  | bounced, and support failed to route it correctly.
  |
  | Reaching out to their CTO, whom I found on LinkedIn, at
  | firstname.lastname@grindr.com got a reply in 8 minutes.
  |
  | Sad to see they still haven't upped their security game.
 
  | hiharryhere wrote:
  | That's appalling.
  |
  | Bug bounties are all well and good, but a basic pen test would
  | have picked that up. They aren't that expensive, and for a
  | business trading in data that can get you killed in some parts
  | of the world, they should be mandatory.
___________________________________________________________________
(page generated 2020-10-02 23:00 UTC)