[HN Gopher] Does Apple really log every app you run? A technical...
       ___________________________________________________________________
        
       Does Apple really log every app you run? A technical look
        
       Author : jacopoj
       Score  : 125 points
       Date   : 2020-11-14 20:39 UTC (2 hours ago)
        
 (HTM) web link (blog.jacopo.io)
 (TXT) w3m dump (blog.jacopo.io)
        
       | m463 wrote:
       | > As you probably have already learned during Apple's OCSP
       | responder outage, you can block OCSP requests in several ways,
       | the most popular ones being Little Snitch
       | 
        | Uninformed advice - Apple prevents Little Snitch from blocking
        | this traffic in Big Sur.
        
         | miles wrote:
         | >> you can block OCSP requests in several ways, the most
         | popular ones being Little Snitch
         | 
          | > Uninformed advice - Apple prevents Little Snitch from
          | blocking this traffic in Big Sur.
         | 
         | You can prevent Apple from preventing Little Snitch from
         | blocking that traffic:
         | https://tinyapps.org/blog/202010210700_whose_computer_is_it....
        
         | istjohn wrote:
         | Keep reading.
         | 
         | >If you use macOS Big Sur, blocking OCSP might not be as
         | trivial.
        
       | olliej wrote:
       | I get that a dev cert isn't the same as identifying the software
       | itself... but that only applies for developers that have multiple
       | apps, and I suspect most do not.
       | 
       | Then unencrypted requests are also a Bad Thing, because anyone
       | has access to the same info - it may require a lot of work to get
       | general knowledge of what apps someone is using, but if you were
       | looking for a specific one then I don't see any real difficulty
       | identifying that.
       | 
        | e.g. if I wanted to know if someone was using Signal, I'd just
        | look for the Signal cert being queried. That's a much easier
        | problem, and can be dangerous to the end user.
        
       | Nightshaxx wrote:
       | The idea that sending information about the cert is somehow not
       | exposing the app is crazy. An attacker could easily download apps
       | and sniff the network traffic to correlate cert info with an app.
       | 
        | Also I don't get the argument for using HTTP. Aren't these two
       | separate systems?
        
       | tedd4u wrote:
        | Seems like the solution is just to put a short timeout on the
        | OCSP call and fail open? Nets the same behavior as when you're
        | offline.
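        | 
        | For illustration, a minimal sketch of that fail-open pattern in
        | Python; the responder query below is just a stand-in, not
        | Apple's actual client:
        | 
        |   import socket
        |   from urllib.error import URLError
        |   from urllib.request import urlopen
        | 
        |   def check_with_responder(url, timeout):
        |       # Stand-in for a real OCSP query (which would POST
        |       # a DER-encoded request to the responder).
        |       with urlopen(url, timeout=timeout) as r:
        |           return b"revoked" in r.read()
        | 
        |   def allow_launch(url, timeout=2.0):
        |       # Fail open: errors and slow answers are treated
        |       # just like being offline, never blocking launch.
        |       try:
        |           return not check_with_responder(url, timeout)
        |       except (URLError, socket.timeout, TimeoutError):
        |           return True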
        
         | saagarjha wrote:
         | That's what they do.
        
       | supernova87a wrote:
       | I think it would help if someone could quote or reference Apple's
       | official position / explanation on this (if there is one).
       | 
       | You know, before declaring the end of the world, is there any
       | information from the source (Apple)? Discussions here seem to
       | have had several thousand comments without obtaining this basic
       | info. It would be good to know, I would think?
        
         | saagarjha wrote:
         | Apple rarely posts their official positions on things.
        
       | intricatedetail wrote:
       | Apple should change their name to "Peeping Tim".
        
         | chpmrc wrote:
         | Did you even read the article? It clearly shows that all that's
         | being sent to Apple is some opaque info about the dev
         | certificate used for the app(s).
        
           | justinclift wrote:
           | It clearly shows that Apple is getting fed the dev
           | certificate info for each application being launched.
           | 
            | For developers with multiple applications, sure, that's
           | not going to be as clear as individually identifying the
           | application.
           | 
           | But there are plenty of developers around with just one
           | popular application. Sending the dev certificate for them is
           | effectively the same as sending the application hash itself.
        
             | oneplane wrote:
             | They already know they exist (they sign them) and most of
             | those are downloaded via the AppStore (they run that) and
             | people tend to log in using iCloud (which they own).
             | 
             | I get it, we're all supposed to trust nobody and have 7
             | billion independent islands where you don't have to trust
             | anyone or work with anyone.
             | 
             | I have not seen any solution, just people piling on. Having
             | PKI and signatures using a central authority is the least-
             | worst solution we have right now, and until something
             | better is created we don't really have a lot of places to
              | go (unless we accept downgrading the common user's security and
             | usability).
        
               | justinclift wrote:
               | I'm not sure what you're getting at. ;)
               | 
               | "They already know they exist ..." doesn't really seem to
               | match up? Like, of course they do.
               | 
               | Anyway, I was just pointing out that the communication
               | still seems pretty close to sending Apple the list of
               | applications being run. At least, for applications
                | created by devs with only one major program for their
               | certificate.
        
           | Deathmax wrote:
           | Which is still sufficient information to narrow down to the
           | set of applications developed by a single entity. And because
           | this is being done over HTTP, anyone along the network chain
           | has visibility as well.
        
           | randyrand wrote:
           | So opaque that this journalist figured out what it is and
           | what it stands for in a few hours.
        
         | coldtea wrote:
         | A low effort comment.
         | 
          | Doubly bad, because the actual post is about how Apple doesn't
          | actually do the "tracks your every use of an app" peeping that
          | the original post that made all the fuss claims they do.
        
           | intricatedetail wrote:
           | Learn about Big Sur(veillance). You can't block telemetry and
           | it bypasses any VPN.
        
             | oneplane wrote:
             | Seems to work fine here, nothing bypasses my VPN nor does
             | it bypass my firewall. Perhaps that's because my VPN and my
             | firewall aren't running inside the computer but external to
              | it, as they should be.
        
               | iso947 wrote:
               | The direction we are going is built in cell comms for all
               | devices, good luck firewalling that.
        
             | Wowfunhappy wrote:
              | Well, you _can_ block it. You need to disable SIP and edit a
             | plist.
        
       | paultopia wrote:
        | Has anyone used a pi-hole to block Apple's privileged servers, like
       | the OCSP one, while running Big Sur? I'm thinking of setting one
       | up---not necessarily to block OCSP, because the points in this
       | post about actually wanting to know when a certificate has been
       | revoked are sensible---but to at least have the option in case of
       | another disaster...
       | 
       | Relatedly, does anyone know if Big Sur allows one to use a custom
       | DNS server on the device level with those privileged
       | destinations? (He says, mulling the complexities of getting a pi-
       | hole working with his mesh system.)
        
         | cybralx wrote:
          | Custom DNS servers are available on Big Sur. My home network
          | uses pfSense as a gateway for the LAN. This gives more options
          | for blocking outbound connections or routing them through a
          | VPN based on certain conditions.
         | 
         | https://www.pfsense.org
        
       | Technically wrote:
       | Oh look, another asshole wading into journalism with zero
       | training. If you can't interpret sources don't comment on them!
       | 
       | It's not clear which standards are enforced by the mods here. It
       | seems like the "don't start trouble" standard....
        
       | banachtarski wrote:
       | There will be a day when all apps on a mac will only be
       | installable from the app store. Developers will be forced to buy
       | macs and subscribe to Apple's developer program to support it.
       | Customers will be trained to not care. And HN Apple fanboys and
       | fangirls will try to justify why this is a Good Thing(TM).
        
         | john_alan wrote:
         | Tell me more about the future?
        
         | macintux wrote:
         | We've been hearing that for years, yet it hasn't happened.
          | Apple seems to recognize the value of the Mac as a general
         | computing platform.
        
       | ravenstine wrote:
       | > macOS does actually send out some opaque information about the
       | developer certificate of those apps, and that's quite an
       | important difference on a privacy perspective.
       | 
       | Yes, and no. If you're using software that the state deems to be
       | subversive or "dangerous", a developer certificate would make the
        | nature of the software you are running pretty clear. They don't
        | have to know exactly which program you're running; they just
        | need enough information to put you on a list.
       | 
       | > You shouldn't probably block ocsp.apple.com with Little Snitch
       | or in your hosts file.
       | 
       | I never asked them to do that in the first place, so I'll be
       | blocking it from now on.
        
         | mkskm wrote:
         | Privacy concerns aren't the only reason to block it. It also
         | makes software way more responsive. I was experiencing daily
         | freezes that would disconnect my keyboard and mouse
         | (particularly when waking the computer or connecting to an
         | external display) on my 2020 MacBook Air before adding the
          | entry to my hosts file, which fixed the issue entirely. The
          | problem was so pronounced, and Apple support technicians so
          | unable to fix it, that I nearly ended up getting rid of the
          | computer.
        
         | josephcsible wrote:
         | > I never asked them to do that in the first place, so I'll be
         | blocking it from now on.
         | 
         | Apple's working on making sure you can't block it. They already
         | keep you from blocking their own traffic with Little Snitch and
         | similar tools: https://news.ycombinator.com/item?id=24838816
        
           | m463 wrote:
            | It's worth noting that on iOS you can never block anything -
            | you just have to put up with it.
        
             | sesuximo wrote:
             | ... and that apple wants to merge its operating systems
        
             | delvinj wrote:
             | You can still block access by host by using an HTTP proxy
             | like Fiddler or Charles.
             | 
              | Settings > Wi-Fi > Proxy
        
           | IfOnlyYouKnew wrote:
           | You also didn't ask them to put a clock in the top right
           | corner. I hope that gets the same level of righteous
           | exasperation.
           | 
           | (Just discovered MacOS includes at least two fonts and a
           | printer driver I never asked for. How dare they?)
        
             | saagarjha wrote:
             | The clock is where Notification Center lives, so you can't
             | get rid of it. (Of course that makes sense.)
        
             | ravenstine wrote:
             | I mean, if macOS didn't come with a clock, I would say
             | "why's there no clock?" I don't think 99.9% of people are
             | asking the same thing about their OS sending potentially
             | sensitive data to a centralized server owned by a Silicon
              | Valley tech giant without ever asking them, while forcing
             | every other app to jump through permissions hoops.
        
           | api wrote:
           | The hosts file works for now. Use 127.0.0.1 and ::1 on two
           | separate lines. Used tcpdump to verify.
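            | 
            | For reference, a minimal sketch of that setup, assuming the
            | stock /etc/hosts and en0 as the active interface:
            | 
            |   # /etc/hosts -- point the OCSP responder at loopback
            |   127.0.0.1 ocsp.apple.com
            |   ::1       ocsp.apple.com
            | 
            |   # watch for plain-HTTP traffic still leaving the machine
            |   sudo tcpdump -n -i en0 dst port 80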
        
           | ravenstine wrote:
           | Isn't that just with Big Sur? Also, I'm using the hosts file
           | method.
        
             | claudeganon wrote:
             | I don't think it's actually just in Big Sur. At the bottom
             | of this post describing how to stop them from hiding
             | traffic, they mention someone did a test on Catalina and
             | ran into an issue with the Messages app:
             | 
             | https://tinyapps.org/blog/202010210700_whose_computer_is_it
             | ....
        
             | anamexis wrote:
             | The OP is about Big Sur.
        
           | cute_boi wrote:
            | If they keep doing things like this, I will block their
            | entire ASN.
        
             | iso947 wrote:
              | Until they front it via Cloudflare or AWS. I got hit by AWS
              | blocking when setting up a network in Russia for the 2018
              | World Cup - my UniFi controller was on an EC2 instance that
              | was blocked due to the Telegram shenanigans. I worked around
              | the problem, but it shows that blocking an AS can lead to an
              | unusable computer.
        
       | [deleted]
        
       | sz4kerto wrote:
        | Can someone explain to me why this is significantly less
        | problematic than sending out app hashes? If we accept that most
        | developers don't have many similarly popular apps, then isn't
        | this enough to infer what apps users are running?
       | 
       | In the example from the article: if Mozilla's certificate is
       | sent, then it's very likely that the app that has been opened is
       | Firefox, as the a priori likelihood of using Firefox is way
        | higher than e.g. using Thunderbird.
       | 
       | If the developer is Telegram LLC, then ... and so on.
        
         | readams wrote:
         | It is only very very slightly less concerning than sending the
         | app hashes. Coming to the conclusion that this is all great and
         | fine is really absurd.
        
       | api wrote:
       | There are two issues here. One is the privacy problem which I
       | agree is not quite as bad as some think. The second is the stupid
       | fact that if some server goes down you can't launch apps. That is
       | just awful.
        
       | the_duke wrote:
       | While other posts on this topic are too alarmist, this one is way
       | too Apple apologetic for my taste.
       | 
       | * There is no information on how often the validation happens.
       | All this investigation concludes is that it doesn't happen when
       | closing and immediately re-opening an app. Is it every week?
        | Every reboot? Every hour? If the interval is short enough, that's
        | essentially the same as doing it on every launch.
       | 
       | * There is no justification for sending this information in
       | cleartext. I don't follow the "browsers and loops" argument. This
       | is a system-service that only has to trust a special Apple
       | certificate, which can be distributed via other side-channels.
       | 
       | * Many developers only have a single app, so it still is a
       | significant information leak. It's really not much different from
        | sending an app-specific hash. Think: remote therapy/healthcare
       | apps, pornographic games, or Tor - which alone could get you into
       | big trouble or on a watchlist in certain regions.
       | 
       | I assume they will push a fix with better timeouts and
       | availability detection.
       | 
        | But Apple simply has to find a more privacy-aware system design
        | for this problem, one which does not leak this kind of data
        | without an opt-in and also does not impact application startup
        | times. (Revocation lists?)
       | 
       | But I reckon this data might just be too attractive not to have.
       | Such a "lazy" design is hard to imagine coming out of Apple
       | otherwise.
        
         | olliej wrote:
         | I feel Apple has done privacy well in so many cases, that the
         | way this works is really disappointing :-/
        
           | smnrchrds wrote:
           | Apple has done a fantastic PR job regarding privacy. I am
           | more skeptical about the status of actual privacy given their
           | iMessage situation and now this.
        
             | nojito wrote:
             | Calling Apple's privacy stance PR is extremely misleading.
             | 
              | It's been ingrained in them since the '80s, and with the
              | growth of Google it became fun to vilify Apple because of
              | it.
        
             | olliej wrote:
             | Their iMessage situation?
        
               | smnrchrds wrote:
                | They back up the private key to iCloud unless you manually
               | disable backups. So even though iMessage is advertised as
               | E2E encrypted, for the vast majority of users, Apple can
               | read each and every message.
               | 
               | (And even if you disable backups, Apple can still read
                | most if not all of your messages, because the people on
                | the other side of your conversations have not disabled
                | backups.)
        
               | tandav wrote:
               | https://news.ycombinator.com/item?id=25078317
        
           | ghayes wrote:
           | A better privacy solution would be to sync revocation lists
           | every so often (and, if you must, right before opening a new
           | app). Is there any privacy-preserving reason to not go this
           | direction? How often would you expect certificates to be
            | rescinded? You could also use a Bloom filter to keep the
            | synced list small, at the cost of a small false-positive
            | rate.
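            | 
            | A rough sketch of that idea in Python (the filter size, the
            | sync source, and the serials are made up for illustration):
            | 
            |   import hashlib
            | 
            |   class BloomFilter:
            |       def __init__(self, m_bits=1 << 20, k=4):
            |           self.m, self.k = m_bits, k
            |           self.bits = bytearray(m_bits // 8)
            | 
            |       def _positions(self, item):
            |           for i in range(self.k):
            |               d = hashlib.sha256(bytes([i]) + item).digest()
            |               yield int.from_bytes(d[:8], "big") % self.m
            | 
            |       def add(self, item):
            |           for p in self._positions(item):
            |               self.bits[p // 8] |= 1 << (p % 8)
            | 
            |       def __contains__(self, item):
            |           return all(self.bits[p // 8] >> (p % 8) & 1
            |                      for p in self._positions(item))
            | 
            |   # Synced periodically; built from revoked cert serials.
            |   revoked = BloomFilter()
            |   revoked.add(b"serial-of-a-revoked-developer-cert")
            | 
            |   # Misses are definitive: no revocation, no network call.
            |   # Rare hits could be confirmed with a single online query.
            |   needs_online_check = b"some-cert-serial" in revoked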
        
             | olliej wrote:
             | CRLs are how we dealt with OCSP in the browser, and I feel
             | like those must surely have more insanity than the
              | Developer ID certs.
        
         | throwawayg123 wrote:
         | Wait. Is it not common knowledge that Android and iOS log every
         | application you open down to the exact millisecond you open and
         | close them?
         | 
         | Is it not common knowledge how telemetry works for the
         | operating systems? They generally batch up a bunch of logs like
         | this, encrypt them, compress them, and then send them to the
         | mothership (hopefully when you're on WiFi).
        
           | viraptor wrote:
           | Logging and telemetry are completely separate use cases. And
           | no, it's not widely known or documented - there is no good
           | description of what telemetry exists or contains on iOS that
           | I know of.
        
           | randyrand wrote:
           | don't you need to enable analytics?
        
           | [deleted]
        
           | kenniskrag wrote:
            | First compressed and then encrypted: good encryption output
            | is indistinguishable from random data, so compressing it
            | afterwards wouldn't help.
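            | 
            | A quick stdlib illustration of why that order matters
            | (random bytes stand in for well-encrypted data):
            | 
            |   import os, zlib
            | 
            |   logs = b'{"event":"app_launch","app":"X"}' * 100
            |   ciphertext_like = os.urandom(len(logs))
            | 
            |   print(len(zlib.compress(logs)))             # small
            |   print(len(zlib.compress(ciphertext_like)))  # ~no gain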
        
         | lapcatsoftware wrote:
         | > There is no information on how often the validation happens.
         | 
         | I wrote a blog post about this. My analysis indicates that
         | Developer ID OCSP responses were previously cached for 5
         | minutes, but Apple changed it to half a day after Thursday's
         | outage, probably to reduce traffic:
         | 
         | https://lapcatsoftware.com/articles/ocsp.html
        
           | saagarjha wrote:
           | 5 minutes is an absurdly short cache time...
        
             | lapcatsoftware wrote:
             | Pure speculation from me, but my guess is that the
              | intention is to check an app on every launch, and the 5
             | minutes is there just to lower the chances of DoS from an
             | app getting repeatedly launched for some reason.
        
         | fiddlerwoaroof wrote:
         | Isn't OCSP an open standard for handling certificate
         | revocations? The standard specifies plaintext, because the
         | standard can't assume that the client has a way to form an
         | encrypted connection to the revocation list.
        
       | dmitriid wrote:
       | > Maybe the hash is computed only once (e.g. the first time you
       | run the app) and it is stored somewhere.
       | 
       | This would explain why some games take minutes to launch the
        | first time you run them. I've experienced this many times with
        | Steam. You install a game, you launch it, and nothing happens for
        | up to several minutes, and then the game runs. No delays in
        | launching after that.
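        | 
        | If that guess is right, the behaviour would look roughly like
        | this hash-once-then-cache pattern (the cache path and format
        | here are made up for illustration):
        | 
        |   import hashlib, json, os
        | 
        |   CACHE = os.path.expanduser("~/.app_hash_cache.json")
        | 
        |   def bundle_hash(path):
        |       cache = (json.load(open(CACHE))
        |                if os.path.exists(CACHE) else {})
        |       if path in cache:
        |           return cache[path]      # later launches: instant
        |       h = hashlib.sha256()        # first launch: hash it all
        |       for root, _, files in os.walk(path):
        |           for name in sorted(files):
        |               fp = os.path.join(root, name)
        |               with open(fp, "rb") as fh:
        |                   h.update(fh.read())
        |       cache[path] = h.hexdigest()
        |       with open(CACHE, "w") as fh:
        |           json.dump(cache, fh)
        |       return cache[path]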
        
         | jeffbee wrote:
         | This behavior drives me crazy. The only way to figure out
         | what's going on is to open the Activity Monitor. On my 2015
         | iMac (top-of-the-line, at the time) initial launch of some
         | large games has taken tens of minutes, and it happens whenever
         | the game is updated, not just after it is initially installed.
        
       | pubkraal wrote:
        | If anyone is concerned about OCSP activity and verifications
        | being requested all over the web, then oh boy, stay away from
        | HTTPS.
       | 
       | OCSP is a good thing, and the web - and your signed applications
       | - are better off with it.
        
         | ctz wrote:
         | OCSP which fails open combines pointlessness with terrible
         | privacy. It's why Mozilla is moving to CRLite for privacy-
         | friendly revocation.
        
         | cblconfederate wrote:
         | Everybody knows that when they request a website their action
          | can be logged. The opposite is true of desktop apps.
        
         | feanaro wrote:
         | I guess you haven't heard of OCSP stapling?
         | https://en.wikipedia.org/wiki/OCSP_stapling
         | 
         | Active OCSP is far from being considered a good thing
         | universally.
        
         | brendoelfrendo wrote:
         | Yeah, I feel like I'm taking crazy pills; did everyone just not
         | know about OCSP until Apple did it?
         | 
         | Spoiler alert, you've probably already used OCSP on the web.
        
           | saagarjha wrote:
           | Most of the people affected by the issue have no idea what
           | OCSP is.
        
           | iso947 wrote:
            | Most browsers are dropping OCSP because of the privacy
            | issues and how trivial it is to block. Did Chrome ever do it?
           | 
           | That's why CT came around.
           | 
           | Some background for those unfamiliar.
           | 
           | https://scotthelme.co.uk/revocation-is-broken/
        
             | judge2020 wrote:
             | Chrome uses its own CRL, which pulls from OCSP
             | 
             | https://medium.com/@alexeysamoshkin/how-ssl-certificate-
             | revo...
        
       | cute_boi wrote:
        | Clearly this article doesn't reveal every truth. Certificate
        | authorities should have been decentralized, but is that
        | happening?
        | 
        | And just by looking at the IP address, app usage, and the other
        | data they receive, they can connect the data and identify that
        | it's me. And what security has Apple provided till now?
        | 
        | "You shouldn't probably block ocsp.apple.com with Little Snitch
        | or in your hosts file."
        | 
        | That's far better than a frozen computer that doesn't work and
        | doesn't run any apps. If I don't need Apple's mercy and
        | protection, please don't force it on me.
        | 
        | I've already installed Linux, and it's a start.
        
       | vmateixeira wrote:
        | So not only Apple, but pretty much everyone, can eavesdrop on
        | the HTTP request and find out which developers' apps I'm
        | running?
        
       | ThePhysicist wrote:
        | Not sure whether the non-privacy-related aspect of OCSP is any
        | less worrying. Officially Apple does this to protect innocent
        | users from malware, but as we've seen it also allows them to
        | remotely disable any developer's software. Not really something
       | that I'd want on my machine.
        
         | cute_boi wrote:
          | Are there any statistics on how many innocent users have
          | become victims? Clearly Apple just wants control. Like the old
          | saying goes: more truth, less trust is needed.
        
           | ThePhysicist wrote:
           | Software certificates make sense in general I think, but
           | there shouldn't be just a single party that can grant and
           | validate them.
        
         | judge2020 wrote:
         | OCSP also allows CAs to revoke random websites' certificates,
         | yet nobody is making a big fuss about that (presumably because
         | no OCSP server has encountered what Apple's did and prevented
         | websites from opening).
        
           | ThePhysicist wrote:
           | Yeah but the thing is that there are many CAs. The main
           | problem is (IMHO) when you have a single party with
           | conflicting commercial interests that controls all
           | certificates for a given platform.
        
       | jgilias wrote:
       | So the takeaways are:
       | 
       | * Your Mac periodically sends plain text information about the
       | developer of all apps you open, which in most cases makes it
       | trivial for anyone able to listen to your traffic to figure out
       | what apps you open. Better not use a Mac if you're a journalist
       | working out of an oppressive country.
       | 
       | * Because of this Macs can be sluggish opening random
       | applications.
       | 
       | * A Mac is not a general purpose computing device anymore. It's a
       | device meant for running Apple sanctioned applications, much like
        | a smartphone. Which may be fine, depending on the use case.
       | 
       | Yeah... No Mac for me anytime soon then.
        
       | musicale wrote:
       | > You should be aware that macOS might transmit some opaque
       | information about the developer certificate of the apps you run.
       | This information is sent out in clear text on your network.
       | 
       | Wow, that is bad from a privacy perspective!
       | 
       | Since certificate revocation is rare, it makes more sense to
       | simply periodically update a list of revoked certificates instead
       | of repeatedly checking each certificate. That would solve the
       | privacy issue while still allowing certificates to be revoked.
       | 
       | OCSP seems like a bad idea for web browsing for similar reasons.
        
         | dabeeeenster wrote:
         | I don't quite understand why anyone would send data in clear
         | text anymore, let alone Apple.
        
       | neolog wrote:
       | So Apple sends an app-developer identifier in clear text each
       | time you open an app? That sounds really bad.
        
       | theodric wrote:
       | By default, Android logs every app you use. You have to disable -
       | bafflingly - features including saving locations in Google Maps
       | and fully-functional voice recognition to (supposedly) disable
       | that behavior. What I'm saying is: don't look so surprised.
        
       | lifeisgood99 wrote:
       | Being able to identify the developer of any app I run on my own
       | machine is already too far. You have to assume all these requests
       | are logged and available for state actors on legal demand.
       | 
       | I wonder how big a local revocation list would be. I would
        | support an on-by-default local check.
        
       | jrockway wrote:
       | OCSP doesn't seem like the right protocol for this. Apple should
       | probably just ship you a list of hashes of revoked certificates
       | once a day, and should do the check locally. (Obviously, the
       | global certificate database is too big to send to every user, but
       | Apple should be able to determine the subset of certificates they
       | trust, and the even smaller subset of those that are revoked or
       | compromised.)
       | 
       | To me, it sounds like they decided to take the quick-and-easy
       | path of reusing an existing protocol for the use case of stopping
       | malware, but it doesn't really fit. The latency, privacy, and
       | availability guarantees of OCSP just don't match with the
       | requirements for "run a local application".
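        | 
        | Something along those lines, roughly sketched (the file name
        | and format here are hypothetical):
        | 
        |   import hashlib
        | 
        |   def load_revoked(path="revoked_dev_certs.txt"):
        |       # Daily-synced file: one hex SHA-256 of a revoked
        |       # certificate per line.
        |       with open(path) as fh:
        |           return {ln.strip() for ln in fh if ln.strip()}
        | 
        |   def is_revoked(cert_der, revoked):
        |       # Entirely local: launching an app never waits on
        |       # the network.
        |       return hashlib.sha256(cert_der).hexdigest() in revoked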
        
         | m463 wrote:
          | There might also be usage data they can conveniently collect.
        
         | Nightshaxx wrote:
         | I agree, how is sending a list of revoked certs not the best
         | idea?
        
         | ben509 wrote:
         | Going back to a CRL (certificate revocation list) for code-
         | signing certs makes more sense. And, really, there shouldn't be
         | a huge number of developer certs being revoked.
         | 
         | If that's happening, they need to put more work up front into
         | certifying them in the first place.
        
       ___________________________________________________________________
       (page generated 2020-11-14 23:00 UTC)