[HN Gopher] iViewed your API keys
       ___________________________________________________________________
        
       iViewed your API keys
        
       Author : spellsaidwrong
       Score  : 148 points
       Date   : 2022-04-14 12:10 UTC (10 hours ago)
        
 (HTM) web link (wale.id.au)
 (TXT) w3m dump (wale.id.au)
        
       | Grimburger wrote:
       | I'd be careful about posting stuff like this as a young person in
       | Australia. The modern situation is incredibly hostile towards
       | this sort of disclosure. Especially regarding a government
       | entity.
       | 
       | It's not that you've done anything in the slightest bit wrong.
       | It's that others with power can easily make it become wrong with
       | little to no backlash in the current Australian climate.
       | 
       | I understand the desire for recognition, but certainly think
       | twice and at the very least wait until after election season is
       | over in June so tech-illiterate political opportunists don't
       | pounce to martyr you for their own gain.
       | 
        | Would suggest looking into the fall of someone widely recognised
        | like Dr Vanessa Teague, a professor who pointed out government
        | failures in e-voting and in claimed health anonymization
        | measures, to make up your own mind. One government department
        | finally had enough and she was out; I'm sure that was a lot
        | cheaper than actually fixing the problems raised.
       | 
       | Behind the laid back "beers and beaches" mirage, Australia is an
        | authoritarian country with huge public support for iron fists;
        | this is clear to many.
        
         | asojfdowgh wrote:
         | Any sources to these claims, other than "it feels that way so
         | it is"?
        
         | mimentum wrote:
          | Nah, you'd be alright - we're not authoritarian here
        
         | greggsy wrote:
         | They've been reasonable by waiting four months from initial
         | contact, but in vulnerability disclosures it's polite to add a
         | better timeline of events. There's still some detail that
         | hasn't been fully resolved, but it's not clear what the
         | residual impact is.
         | 
         | This particular post doesn't really seem to go into too much
         | depth about what these keys are used for, or the damage that
         | could be done, but I'm erring on the side of 'meh' until proven
         | otherwise. It's freely viewable content if you're in Australia.
          | They've obviously stuffed up on multiple fronts, and my money
          | is on these issues having been introduced by an integrator
          | rather than an ABC employee.
         | 
         | Lastly, the ABC is a corporate entity that is fully owned by
          | the commonwealth (and beloved by most Australians) - the
          | article describes it as 'state media', which carries the
          | sinister propaganda connotations of broadcasters in some other
          | countries.
        
           | aspenmayer wrote:
           | > Lastly, the ABC is a corporate entity that is fully owned
            | by the commonwealth (and beloved by most Australians) - the
            | article describes it as 'state media', which carries the
            | sinister propaganda connotations of broadcasters in some
            | other countries.
           | 
           | I'd compare ABC to PBS in America. They both are state media.
           | I think more liberal usage of accurate terms in this way is
           | needed. Folks ought to know who funded and produced the
           | reporting that they consume, especially when it's those same
            | folks doing the funding. It seems like you're saying that
            | calling it "state media" is something like spreading FUD, but
            | why object to it being said in neutral terms? I don't see who
            | benefits from leaving that info out either.
           | 
            | I think the larger issue is these terms being used as dog
           | whistles for propaganda in the first place. Those who have
           | the power to label others as propaganda are not themselves
           | subject to such labeling. Curious.
        
             | greggsy wrote:
             | Both PBS and ABC are independent of the 'state' though.
              | They're public broadcasters funded with public money, which
              | is administered by a federal agency (Dept of Industry in
              | Australia).
        
         | derekbaker783 wrote:
         | The same applies in the US, unfortunately.
        
           | reuben364 wrote:
           | See: this ad from Gov Parson in response to a disclosure
           | about a state website leaking social security numbers of
           | teachers
           | 
           | https://m.youtube.com/watch?v=9IBPeRa7U8E
        
             | ascar wrote:
             | "the hacker[/journalist] decoded the HTML source code" and
             | must be prosecuted.
             | 
             | I can't believe what I just watched.
        
             | dhimes wrote:
             | For others: The ad claims that the news outlet that
             | published the fact that the state website had teacher SSNs
             | was part of the "fake news media" exploiting privacy for
             | political gains.
             | 
              | Apparently the SSNs were embedded in the pages' HTML. The ad
             | makes it sound like a huge reverse-engineering job.
        
             | greggsy wrote:
              | The two situations aren't really comparable. The ABC, or
              | the minister responsible for selecting its CEO (it's owned
              | and funded by the gov as a corporate entity, but not _run_
              | by it), would be laughed out of the room if they suggested
              | that the author should be charged.
        
         | femto wrote:
         | The current Prime Minister would be torn between "law and
         | order" and "sticking it to the ABC". His head would explode.
        
         | mardifoufs wrote:
         | But why is the Australian government so "police state" minded?
         | 
         | Is that really what the Australian people want? I'd guess they
         | just don't care either way, but in that case why would the
         | Australian politicians push for that? Canada has a pretty
         | similar apathy towards politics but even then we don't see the
         | government forcing Canadian citizens to implement backdoors or
         | raiding the offices of a broadcaster. (Yes the recent C-36 and
         | C-10 bills are pretty disastrous in that regard but they are
         | very recent and face quite a bit of opposition)
         | 
         | It's such a peaceful country too, so the entire "hard on crime"
         | policy does not make sense to me. Is it a partisan issue in
         | Australia or is it something both parties agree on?
        
           | jandrese wrote:
           | Propaganda works, Rupert Murdoch has known it for decades.
           | Keep the people scared and they won't question what you are
           | doing.
        
           | throwaheyy wrote:
           | In simple terms, Australia is a relatively young country that
           | formed its own government in 1901. It was also isolated from
           | the rest of the world and has a harsh environment with a lot
           | of things that can kill you. This produced an overall culture
           | of helping each other when you can (what gets called
           | "mateship"), and trust in the government to help when it is
            | needed. Australians generally like an orderly society, and
            | that translates into a publicly approved police state.
        
             | YPPH wrote:
             | Slight nitpick: Australian colonial (State) governments
             | existed well before the Constitution of the Commonwealth in
             | 1901, at least since 1788.
             | 
             | I'm not so sure about the relevance of the so-called "harsh
             | environment" to political culture. Although, it may be said
             | Australians have a more deferential view of certain aspects
             | of politics than other Western countries.
        
               | throwaheyy wrote:
               | I'm well aware of the predecessor colonial governments
               | pre-Federation. My point is that those were symbols and
               | apparatus of British imposed rule, and it is relatively
               | recent that the notion of an "Australian government", as
               | its own national sovereignty, came about. Specifically
               | since the change in national identity had a direct impact
               | on the government no longer being looked at as "the
               | ruling class".
        
           | dylan604 wrote:
           | Seeing as Australia was used as a prison colony, I'd have
           | thought they'd be much more likely to be against a strong
           | ruling class.
        
             | cookie_monsta wrote:
             | In the same way that you would expect most people in the US
             | to adhere to puritan Calvinist beliefs?
        
           | cookie_monsta wrote:
           | > Is it a partisan issue in Australia or is it something both
           | parties agree on?
           | 
           | They've been boiling the frog since 9/11. All anybody has to
           | say is "national security" and the opposition party pretty
           | much folds. Imagine the US without the Bill of Rights -
           | that's the landscape.
        
         | AdamJacobMuller wrote:
         | That's really sad.
         | 
         | I've never been there but definitely romanticized the beers and
         | beaches and kangaroos motif. Oh well.
        
         | Oberbaumbrucke wrote:
         | Completely agree. The AU Gov would probably call this hacking
        
           | nomilk wrote:
           | The "ctrl + alt + u" attack vector.
        
           | greggsy wrote:
           | No, the Australian Cyber Security Centre would consider this
           | responsible disclosure and advise the ABC to uplift their
           | review processes.
           | 
           | On the other hand, Sky News (the current government's right
           | wing mouthpiece) would use this as an opportunity to
           | discredit the ABC.
        
           | michaelt wrote:
           | Quite a few countries have laws from the 1980s that basically
           | say "gaining unauthorised access to computer systems is a
           | crime"
           | 
           | Which is of course a very expansive definition. Think you've
           | found a leaked database credential and you test it before
           | reporting, so as not to create a false alarm? That's illegal
           | hacking. Almost any persistent XSS? That's illegal hacking.
           | Access an admin panel by entering a default password? You
           | guessed it, illegal hacking.
           | 
            | We might get the _impression_ these laws don't exist,
            | because they aren't enforced when the hacker is overseas or
            | can't be identified - so black-hat hacking, cryptolockers,
           | tech support scams, giant data breaches and suchlike go
           | completely unpunished. But a white-hat hacker who _identifies
           | themselves_ in hopes of getting their security report taken
           | seriously might well get a visit from the cops.
        
             | gonzo41 wrote:
              | In Australia the go-to for dropping a legal hammer on a
             | digital crime is "misuse of a carriage service" which is
             | just a big lasso that puts crimes like fraud that happen on
             | the internet into a simple basket so they can attach
             | sentences as they see fit.
        
             | nerdawson wrote:
             | Both the first and third example you gave would strike me
             | as crossing the line.
             | 
             | Without permission to test the security of a system, you
             | shouldn't be trying credentials you've stumbled upon or
             | defaults.
             | 
             | If you randomly try my front door and find that it's
             | unlocked, don't expect me to be thanking you.
        
               | drdaeman wrote:
               | > If you randomly try my front door and find that it's
               | unlocked, don't expect me to be thanking you.
               | 
                | Why? If someone tries my front door, doesn't go in but
                | confirms that it is unlocked by opening it an inch
                | (=verifies the DB credentials but doesn't run any
                | queries) without really peering into my private spaces,
                | then privately reaches out with "hey, hey, your door is
                | not locked - I haven't gone in but I know it's unlocked,
                | you may wanna look into this", then I imagine that while
                | it could be an odd situation (e.g. depending on whether
                | one has a front lawn), I would be grateful and not in
                | the least bit offended.
               | 
               | Surely, I wouldn't be happy if I'd get an alarm that my
               | door is suddenly open (IDS alert) and would react
               | accordingly. But if my door is not locked and I'm not
               | aware and someone responsibly discloses this - I don't
               | see how that'd be an issue.
        
               | throwaway744678 wrote:
               | A friend or a nice neighbor: why not. But a random
               | stranger? I'd certainly be unhappy! Why would they even
               | try to open the door in the first place?
        
               | sodality2 wrote:
               | Better one who would let me know, than someone who would
               | steal everything and sell it, no?
        
         | spellsaidwrong wrote:
         | Thankfully, I already disclosed the issue to iView's engineer
         | team back in December 2021, and a lot of the original data has
          | since been removed from the site. I did try to be careful by
          | redacting a lot of information about the security issue in the
          | write-up, but I'm not sure if that is enough.
        
           | Grimburger wrote:
           | I certainly understand, hopefully you see my point though
           | that being right sadly isn't enough for many/most in our
           | country anymore :/
           | 
           | Good luck with things, it was just a warning and hopefully
           | not a dissuasion from continuing on.
           | 
            | https://www.theguardian.com/australia-news/2020/mar/08/melbo...
        
           | mimentum wrote:
           | It has been censored enough, given the timeframe and previous
           | disclosure to ABC. Good article though.
        
         | jacobsenscott wrote:
         | Also in the US, where you can be sentenced to 41 months in
          | prison for browsing a public URL at AT&T, and where the
          | Governor of Missouri wants to make it illegal to view the HTML
          | source of a web page (because some state website leaked all
          | the SSNs of their teachers in some hidden HTML or something).
         | 
         | If I found something like this on a site I don't think I would
         | notify anyone. Too risky. Maybe over TOR if they have a contact
         | page or something. But it is hard to be anonymous these days.
        
           | legalcorrection wrote:
           | weev didn't just "browse a public url at AT&T". That is
           | dishonestly reductionist. He noticed the bug and then used it
           | to retrieve and make public the private data of over a
           | hundred thousand people.
        
             | aspenmayer wrote:
             | The data was already publicly available. Didn't he just
             | publicize the url?
        
               | [deleted]
        
         | OJFord wrote:
         | I booked a hotel stay (in Canada, not Australia) and got an
         | error page at some point that dumped out all env vars including
         | database credentials.
         | 
         | Tried my best to report (not publicly disclose) it, including
         | asking the front desk for contact information for IT; no
         | response.
         | 
         | I think we're (on HN) often in quite a bubble of being (or
         | striving to be) hot on this sort of thing, or frankly far
         | trickier to exploit sorts of things, when really the bar for a
         | lot of ('IT is a cost centre') stuff out there is extremely
         | low.
         | 
          | I don't think this sort of leak or vulnerability is anywhere
          | near as rare as it seems (and it doesn't even seem _that_
          | rare) - I think an awful lot must just get quietly exploited
          | or go unnoticed. We're only hearing about this one because
          | someone thought it was 'lolz'. I didn't publicize the one I
          | noticed (in my normal user behaviour of just trying to book a
          | room!) nor did I see if I could connect to the database and
          | book myself in for free or something. And I only noticed it
          | because the site a) experienced an error and b) dumped env
          | vars when it did - i.e. I didn't have to look for it. How many
          | other sites have I used since with similar problems that just
          | didn't happen to serve it up on a silver platter for me?
        
           | mkl95 wrote:
            | Most C-level people couldn't care less about security. I
            | have yet to work for a SaaS that implements 2FA, and at some
            | point all of them have had passwords that a script kiddie
            | could brute-force within an hour. The only time I've seen
            | security become a top-level priority was when some customer
            | demanded some kind of checkbox compliance like SOC/ISO 27001.
        
           | someotherperson wrote:
           | Leaks are everywhere. I went to a certain country and needed
           | to register my phone, somehow ended up in a workflow that
           | allowed me to enter any national registration number (similar
           | to a social security number) and it would output the person's
           | name, phone number, address and other details for me to
           | confirm that that was me :)
           | 
           | No rate limiting on the endpoint, doesn't require auth,
           | didn't block my VPN, doesn't even set cookies (very privacy
           | conscious devs apparently). I could have mined the entire
           | country's data. Insane.
        
             | cortesoft wrote:
             | How do you know it wasn't rate limited?
        
               | AdamJacobMuller wrote:
                |   for ((i=0; i<=99999999999; i++)); do
                |     curl "https://site.com/info.php?id=$i" > "$i.json"
                |   done
        
               | duskwuff wrote:
               | Hardly matters. If an endpoint is leaking anything as
               | sensitive as national ID + name + address, a determined
               | attacker will have no problem with scraping it slowly or
               | using a network of proxies to avoid rate limits.
        
           | dylan604 wrote:
           | My favorite is just having the console open while visiting
            | the web. It's amazing how much information devs "forget" to
            | stop sending to the console in production.
           | A lot of console vomit is from JS frameworks. I don't know if
           | there's a switch that can tell them to shut up in production
           | or not, but it's one thing I look out for on anything I work
           | on.
        
             | baobabKoodaa wrote:
              |   const log = function(thingToLog) {
              |     if (DEV_ENVIRONMENT) console.log(thingToLog);
              |   }
        
       | account-5 wrote:
        | I'm not a web developer, but this stuff, from the outside
        | looking in, is reason enough for me to stay away. Obviously I'm
        | no expert, but this sort of thing happens often enough that I
        | wouldn't even know where to start to learn it properly.
        
         | zach_garwood wrote:
         | The disappointing thing is that preventing this sort of thing
         | doesn't take an expert. If you wouldn't post a secret to Reddit
         | or Twitter, don't send that secret to a browser client. That's
         | like webdev 101 stuff.
        
       | harg wrote:
       | I'm not sure how bad this actually is. I haven't examined all the
       | env variables exposed, but it's fairly common to expose public-
       | facing api keys for services that require client-side
       | communication with a 3rd party API. E.g. for client-side bug
       | tracking, search etc.
        
         | blenderdt wrote:
          | If it is a paid service, others can now use the service while
          | you pay the bill.
         | 
         | And the API might also expose data you don't want to expose to
         | the public.
         | 
         | That's why you never put these on the client side. There are
         | better options, for example a proxy that injects tokens into
         | the header.
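          | 
          | A minimal sketch of that proxy idea, assuming a Node 18+
          | backend with Express and the built-in fetch (the endpoint and
          | variable names are hypothetical):
          | 
          |   const express = require("express");
          |   const app = express();
          | 
          |   // The browser calls /search on this backend; only the
          |   // server ever sees THIRD_PARTY_API_KEY (a made-up name),
          |   // so nothing secret reaches the client.
          |   app.get("/search", async (req, res) => {
          |     const key = process.env.THIRD_PARTY_API_KEY;
          |     const q = encodeURIComponent(req.query.q || "");
          |     const upstream = await fetch(
          |       "https://api.example.com/search?q=" + q,
          |       { headers: { Authorization: "Bearer " + key } }
          |     );
          |     res.status(upstream.status).json(await upstream.json());
          |   });
          | 
          |   app.listen(3000);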
        
           | spicybright wrote:
           | Handing anyone your API key to use as they want is just
           | asking for trouble. I'm shocked some people think that's an
           | ok pattern to do...
        
             | harg wrote:
             | Usually the public facing keys are restricted in a number
             | of ways to help prevent abuse. E.g. they'll have strict
              | rate limiting, fine-tunable scopes, domain restrictions, or
              | can only be used in conjunction with a server-side secret.
             | 
             | E.g. Stripe has a publishable key and a secret key. The
             | publishable key links the checkout session to a particular
             | Stripe account, but you can't actually initiate a checkout
              | session without a session ID created on the server (which
              | requires the secret key). If the two keys don't belong to
              | the same account then the checkout session will fail
              | (rough sketch below).
             | 
             | Yes, with some services you can proxy requests via your own
             | service. But how is this more secure? If anything you've
             | just increased the potential attack surface.
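              | 
              | Roughly, that split could look like this (a sketch using
              | the stripe Node library; the price ID and URLs are
              | placeholders):
              | 
              |   // Server side only: the sk_... secret key creates
              |   // the Checkout Session.
              |   const stripe = require("stripe")(
              |     process.env.STRIPE_SECRET_KEY
              |   );
              | 
              |   async function createCheckout() {
              |     const session = await stripe.checkout.sessions.create({
              |       mode: "payment",
              |       line_items: [{ price: "price_xxx", quantity: 1 }],
              |       success_url: "https://example.com/ok",
              |       cancel_url: "https://example.com/cancel",
              |     });
              |     // Only session.id and the pk_... publishable key go
              |     // to the browser; the secret key never leaves here.
              |     return session.id;
              |   }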
        
             | kall wrote:
             | And yet it's incredibly common. I would guess 95% of users
             | of services like Firebase, Supabase, Algolia, Sentry,
             | Segment, Cloudinary, Auth0... do this, because it's the
             | point and officially endorsed. They are intended to be
             | "frontend safe" to an extent. Not against service
             | abuse/scraping maybe, but against actual RCE, unauthorized
             | actions or unauthorized data access.
             | 
              | I guess you can proxy all that, but then what do you do that
              | the third party couldn't be doing for you? Can the user not
             | accomplish the same thing through your proxy that they
             | could through the key? It'll be easier to drop some
             | requests than it is to revoke an API key I guess. You could
             | use client-certificates, stuff along the lines of
             | Cloudflare's API Shield etc but I would guess that only the
             | top 5% of applications do this.
             | 
             | I've certainly done it/am doing it right now, which is
             | likely why I'm writing such a defensive comment.
             | 
             | Honestly if an enterprising user/developer wants to do
             | something like dump all data that is already accessible to
             | them, more power to them.
        
             | chatmasta wrote:
             | If your visitors are making requests to SaaS APIs on your
              | behalf, how can the SaaS identify that the visitors belong
              | to you without a key?
             | 
             | In general if a SaaS has a client-side SDK, they've
             | designed around this and give you an API key just for the
             | client bundle. It has only the permissions required for the
             | client SDK, which - yes, could give a client the ability to
             | run up your bill. But you could say the same about any
             | usage based service. It's up to you and the service to
             | mitigate against that.
             | 
             | I'm not familiar with every variable in the screenshot from
             | this blog post. Of those I'm familiar with, I don't see any
             | secrets in there.
        
             | ehnto wrote:
             | Part of the selling point of Algolia is that they handle
             | the misuse mitigation for search, and running queries
             | directly through their API using public facing keys is the
             | enabling feature for that.
             | 
             | The way to hide your keys would be to bounce the search
             | through a proxy server that then does the API call on
             | behalf of the user, but you're really not gaining anything
             | by doing that, and now you're on the hook for traffic
             | misuse mitigation.
             | 
             | Most APIs designed to be used this way also whitelist your
             | token to your site domain, so they can't just steal your
             | key and use it on their own site. Regardless, Algolia and
             | services like it fully expect to be exposed to the full
             | force of the internet at all times, so there's really
             | nothing your API key can do that they're not already
             | covering their bases for.
        
         | jeroenhd wrote:
          | Currently, most of the strange state keys seem to have been
          | removed. When you check the web archive
          | (http://web.archive.org/web/20211201000716/https://iview.abc....)
          | though, you can see variables like "USER": "www-data",
          | "HOME": "/var/www" and "PATH":
          | "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin".
          | There's also something called "DRM_WEB_SECRET" which isn't in
          | the HTML anymore; I'm guessing they shouldn't have shared that.
        
           | harg wrote:
           | Yes, that's not ideal. Thankfully not too sensitive by
           | itself, but clearly something hasn't been done right.
        
         | notamy wrote:
         | If you look at the env vars in there, you can see things like
         | $PATH and $HOME leaked.
        
       | nojs wrote:
       | *.id.au is an interesting domain that I haven't seen before.
       | Apparently you can get an id.au iff you're an Australian citizen,
       | and it must approximately match your real name.
        
         | Grimburger wrote:
         | From my experience it's very common with infosec students in
         | Australia, the general public barely know it exists and
         | probably consider it a _weird_ second level domain.
        
       | raxxorraxor wrote:
       | To be fair, I think a lot of developers begin with that. There is
       | a logistical problem in providing secrets to a process without
       | getting the secret exposed. Environment variables are an often
        | chosen approach. Of course, when the software is tested and
        | ready to be deployed, the step of moving the credentials into a
        | secure container is often neglected, as was probably the case
        | here. This isn't necessarily sloppy programming; it is just
        | skipping an essential step.
       | 
       | How do you provide your secrets to your apps? Using an external
       | service? That would still require another set of credentials.
        | Using environment variables? A file only the user running the app
        | has access to? Another way?
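        | 
        | For the "file only that user can read" option, a minimal sketch
        | (the path follows the Docker secrets convention and is only an
        | assumption):
        | 
        |   const fs = require("fs");
        | 
        |   // Hypothetical path; the file is owned by the service user
        |   // with mode 0600, so other local users can't read it and it
        |   // never ends up in the environment or the client bundle.
        |   const dbPassword = fs
        |     .readFileSync("/run/secrets/db_password", "utf8")
        |     .trim();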
        
         | maccard wrote:
         | > How do you provide your secrets to your apps? Using an
         | external service? That would still require another set of
         | credentials. Using environment variables? A file only the user
         | running the app has access too? Another way?
         | 
         | It sucks for a small team or for anyone who is trying to run a
         | free tier, but terraform plus aws secrets manager or vault
         | works really well. Using a db password as an example, for our
         | app we generate a random password, store it in secrets manager,
         | and our containers on fargate run with an iam role that allows
         | access to that secret. Our state is stored in S3, and the infra
         | is applied on commit to main with a terraform plan run on the
         | merge request to main.
         | 
         | Our biggest security vector is always going to be someone using
         | elevated credentials to access something, but this way there is
         | no state on a developers machine at any point for any of our
         | production infra.
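          | 
          | The container-side half of a setup like that might look
          | roughly like this (a sketch with the AWS SDK v3 Secrets
          | Manager client; the secret name is a placeholder):
          | 
          |   const {
          |     SecretsManagerClient,
          |     GetSecretValueCommand,
          |   } = require("@aws-sdk/client-secrets-manager");
          | 
          |   async function loadDbPassword() {
          |     // Credentials come from the Fargate task's IAM role, not
          |     // from anything baked into the image or a dev machine.
          |     const client = new SecretsManagerClient({});
          |     const out = await client.send(
          |       new GetSecretValueCommand({ SecretId: "app/db-password" })
          |     );
          |     return out.SecretString;
          |   }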
        
         | omegalulw wrote:
         | > How do you provide your secrets to your apps? Using an
         | external service? That would still require another set of
         | credentials. Using environment variables? A file only the user
         | running the app has access too? Another way?
         | 
         | A credential/key storage service, either on device/server or as
         | a separate device, with IAM to control whether the user
         | executing that process can use that secret or not. The user in
         | this case for prod services should be a prod (non-human) user.
         | 
         | When all your services are like this, no person should have
          | direct access. You generate keys/secrets depending on which
          | services you want to allow to communicate with each other, and
         | the keys live in the key store. For secrets for external
         | services, e.g. API keys, someone would have to enter them once,
         | yes.
         | 
         | You also should try not to rely on secrets, rather invest in
         | proper authentication/authorization logic.
        
           | Macha wrote:
            | There's a chicken-and-egg problem here. If you move your
            | secrets to a secret management service, how do you provide
            | the credentials to unlock that? Whether it's on disk, in
            | the environment or on an internal endpoint like IAM host
            | roles, there are ways for this to be exposed in the event of
            | bugs or security vulnerabilities in your application.
        
             | SomeCallMeTim wrote:
             | > in the event of bugs or security vulnerabilities in your
             | application
             | 
             | The article talks about keys being published as part of the
             | web page configuration.
             | 
             | That's far worse than "they could hack a server and gain
             | its credentials!"
        
         | danenania wrote:
         | Shameless plug, but we built EnvKey[1] for exactly this
         | purpose.
         | 
         | Instead of providing secrets directly to a server, you generate
         | an ENVKEY token which is used to fetch and decrypt the app's
         | secrets/config and supply them as environment variables to the
         | process.
         | 
         | The ENVKEY still needs to be protected, but you can limit which
         | IPs can access it _and_ if access is cut off, the process will
         | be killed and secrets cleared immediately, so you do get some
         | additional protection. All access attempts are also logged.
         | 
         | 1 - https://envkey.com
        
         | alias_neo wrote:
         | The step they seem to be missing is _the entire development
         | process_.
         | 
         | If you're using API keys to access stuff, you do it on your
         | backend, there's no excuse for that stuff to make it to the
         | frontend.
         | 
         | If your "client" needs access to sensitive API keys, you need
         | to rethink your architecture.
         | 
         | As a (senior) backend software engineer, this reeks of a
         | person/team who doesn't know how to architect and/or implement
         | web applications/software.
        
           | raxxorraxor wrote:
           | Yeah, it shouldn't reach the client in any case. But
           | providing secrets to applications isn't really a well solved
           | problem in my opinion. Even if it is just an environment
           | variable for the server process it could get exposed.
           | 
            | If a client needs an API key I would think to route the
            | requests through the server and add the key information at
            | that point, but I am not a web developer and not sure if
            | that always scales for every use case.
        
             | SomeCallMeTim wrote:
             | It's simply software engineering _malpractice_ to have
             | _ever_ sent any of those keys to the client.
             | 
             | There is no excuse.
             | 
              | It _is_ a well-solved problem to handle secrets; there are
              | better and worse solutions. An environment variable for a
              | server can get exposed _if the server is hacked_; a secret
              | sent to a client _is exposed the second the server goes
              | live_. One of these is much worse than the other.
              | 
              | There are also better solutions than environment variables.
              | A competent team would be aware of many options. Whoever
              | coded this is _not competent_, full stop. It's not that
              | they didn't finish; these services should never have been
              | accessed from the client _at all_.
        
           | _fat_santa wrote:
           | According to the article, they were keeping their environment
            | variables in React's local state. To anyone who works with
            | React professionally, or even on the side, it is baffling
            | that a team would do this.
           | 
           | I'm honestly wondering who they hired for the job. Because
           | this is one of the most fundamental failings in security I've
           | ever seen.
        
             | ushakov wrote:
             | > I'm honestly wondering who they hired for the job
             | 
             | let me guess: bootcamp graduates? whoever was the cheapest?
        
               | SomeCallMeTim wrote:
               | Or outsourced to the lowest bidder.
        
           | ratww wrote:
            | It is a bit more nuanced than that. This web client needs
           | access to non-secret keys that are passed via environment
           | variables. This is absolutely commonplace.
           | 
           | However there are two _real_ issues here: first, some bug in
           | the code is causing _all_ environment variables to be dumped
           | into the JS bundle. You can see that in inane keys like PATH,
           | HOME, PORT.
           | 
           | This issue wouldn't be such a huge problem by itself. The
           | second problem is that during the build process for the
           | frontend, there are environment variables that shouldn't be
            | there, such as the secret ones. The CI or build machine
            | should have been isolated well enough to prevent this problem
            | from happening.
           | 
           | This is a problem in the CI coupled with a bad build process.
        
             | greggsy wrote:
              | Great explanation - client-side apps often seem to be a
              | bit of a catch-22 in cases like this.
        
               | danenania wrote:
               | It's crucial to always use an allow-list approach to
               | passing config through to a client.
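                | 
                | A minimal sketch of that allow-list idea (the variable
                | names are made up):
                | 
                |   // Only names on this list may be serialized into
                |   // the client bundle / window.__INITIAL_STATE__.
                |   const CLIENT_SAFE = [
                |     "API_BASE_URL",
                |     "ALGOLIA_SEARCH_ONLY_KEY",
                |   ];
                | 
                |   function clientConfig(env) {
                |     return Object.fromEntries(
                |       CLIENT_SAFE.filter((k) => k in env)
                |         .map((k) => [k, env[k]])
                |     );
                |   }
                | 
                |   // clientConfig(process.env) never exposes anything
                |   // that isn't explicitly listed.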
        
               | ratww wrote:
               | Yep. Maybe just forbidding enumerating environment
               | variables in the runtime would be enough for this case.
               | 
               | However the CI shouldn't have backend-only private
               | variables available to frontend builds... some separation
               | here would be safer regardless of developer mistakes.
        
       | darepublic wrote:
       | Agile did not save their asses. Sprint planning, grooming, modern
       | frameworks, all the trappings of modernity but crucially no one
       | present to say "secret key in frontend is a no-no"
        
       | ehnto wrote:
       | The Algolia side is required and expected, no? I know you can
        | hide said details to be even safer, but it's expected to have the
        | public API tokens available to the client so it can use the API
        | from your site. The keys shouldn't work on other sites since the
       | API will whitelist your application URL, so stealing them is
       | pointless.
        
         | saimiam wrote:
          | wouldn't `curl <algolia url> -H 'X-Algolia-API-Key: <api key>'
          | -H 'Origin: <whitelisted url>'` let you use Algolia as if you
          | were ABC if all Algolia does is URL whitelisting?
        
           | simonw wrote:
           | Yes, and that can't be avoided. Best you can hope for is that
           | Algolia have their own rate limiting in place that will kick
           | in if someone starts scraping their API with our API key.
        
           | anamexis wrote:
            | Yes, which is fine. Same as if you snagged a Google Maps API
           | key off of any old site and used it with curl.
        
           | chatmasta wrote:
           | You can also use your browser to go to the site and query
           | Algolia. What's the difference?
        
       | gitgud wrote:
       | You can easily see 5 environment variables ending in "_KEY"
       | stored in "window.__INITIAL_STATE__"... crazy this got into
       | production...
       | 
       | view-source here -> https://iview.abc.net.au/
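        | 
        | A quick devtools-console sketch for spotting them (the exact
        | nesting of the iView state isn't assumed here):
        | 
        |   // Lists every property name ending in "_KEY" anywhere
        |   // inside the serialized state object.
        |   function findKeyNames(obj, path = "") {
        |     if (obj === null || typeof obj !== "object") return [];
        |     return Object.entries(obj).flatMap(([k, v]) => [
        |       ...(k.endsWith("_KEY") ? [path + k] : []),
        |       ...findKeyNames(v, path + k + "."),
        |     ]);
        |   }
        |   console.log(findKeyNames(window.__INITIAL_STATE__ || {}));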
        
       | rhacker wrote:
        | I wouldn't be surprised if there's a significant number of small
        | or large deployments out there that use a Node.js build tool
        | such as webpack that has pulled in some kind of JSON
        | configuration file, leaving prod keys hidden in those huge
        | bundles.
        
       | intunderflow wrote:
        | Given it's Australia, how long until the ABC claims to have been
        | "the victim of a sophisticated hack" and gets the author
        | arrested?
        
       | louissan wrote:
        
         | speedgoose wrote:
         | Not really?
        
           | zach_garwood wrote:
           | Kinda really. This entire class of security lapse can be
           | avoided by not building a js app on the client.
        
             | gcommer wrote:
             | Plenty of server-rendered apps have been caught putting
             | private data in responses. In this very same comment
             | section people have mentioned the story of a state website
              | that included teacher SSNs in hidden fields[1], and OJFord
             | shared a story of a server that included a full env var
             | dump in error messages[2].
             | 
             | This sort of thing happens all the time to all sorts of
             | services. Rather than just blaming JS, it's far more
             | productive to think of technical controls that could catch
              | this. For example Taint Checking[3] or scanning server
              | responses for API keys (rough sketch below).
             | 
             | [1]: https://news.ycombinator.com/item?id=31026374
             | 
             | [2]: https://news.ycombinator.com/item?id=31026415
             | 
             | [3]: https://en.wikipedia.org/wiki/Taint_checking
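              | 
              | That response-scanning control could be sketched roughly
              | like this (the patterns and URL handling are illustrative,
              | not exhaustive):
              | 
              |   // Fetch a page and fail if the response contains
              |   // anything shaped like a well-known secret format.
              |   const SECRET_PATTERNS = [
              |     /AKIA[0-9A-Z]{16}/,                   // AWS key ID
              |     /sk_live_[0-9a-zA-Z]{10,}/,           // Stripe secret
              |     /-----BEGIN [A-Z ]*PRIVATE KEY-----/, // PEM key
              |   ];
              | 
              |   async function scanPage(url) {
              |     const body = await (await fetch(url)).text();
              |     const hit = SECRET_PATTERNS.find((re) => re.test(body));
              |     if (hit) {
              |       throw new Error("possible secret in " + url);
              |     }
              |   }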
        
             | [deleted]
        
             | speedgoose wrote:
             | Yes I guess if you don't build apps you have fewer security
             | issues.
        
           | louissan wrote:
           | The reason I react (no pun intended haha) this way is because
           | I have seen with my own eyes blue chip (US -- is that the
           | right expression for "company name known quite literally the
            | world over"?) companies doing exactly this sort of thing.
           | 
           | I have seen the plastering of webpacked-gulpified-obfuscated-
           | minified-hoistpropified-younamedifiedit mind-blowing amounts
           | of Javascript to create the latest "cool" app.
           | 
           | I have seen web pages best described as "this webpage is so
           | heavy not even light can escape it". The (not full) cast, not
           | in order of appearance:
           | 
           | . 5MB+ of Javascript (not kidding) -- _after_ the above
           | 
           | . gazillions of HTTP requests
           | 
           | . internal data bled through to the outside world (not always
           | problematic from a security point of view, but one may as
           | well: 1) save on the bytes 2) not give malicious ideas to
           | you-know-who)
           | 
            | . code coverage is inversely proportional, with the grand
            | score hovering around 3-5%...
           | 
           | End result:
           | 
           | . from inside: a bloated over-engineered chaos (see entry:
           | "Big Ball of Mud")
           | 
           | . from outside: catastrophic load times of around 35 seconds
           | w/o a primed cache.
           | 
           | Talk about the latest shiny new "single page
           | application".....
           | 
           | Granted, it's not always like that! :)
        
         | ovyerus wrote:
         | No it's just bad code/devops
        
           | zach_garwood wrote:
           | theyre-the-same-picture.jpg
        
       ___________________________________________________________________
       (page generated 2022-04-14 23:01 UTC)