[HN Gopher] macOS has checked app signatures online for over 2 y...
       ___________________________________________________________________
        
       macOS has checked app signatures online for over 2 years
        
       Author : giuliomagnifico
       Score  : 469 points
       Date   : 2020-11-25 12:15 UTC (10 hours ago)
        
 (HTM) web link (eclecticlight.co)
 (TXT) w3m dump (eclecticlight.co)
        
       | stefan_ wrote:
       | The irony of arguing that the rapid rate of certificate
       | revocations is proof of the system being necessary and secure.
       | No, it's proof that the system is useless. Code signing is a dead
        | end, and we have known that at least since Stuxnet.
        
         | viktorcode wrote:
          | With the system checking for certificate validity, Stuxnet
          | would have stopped shortly after its certificate was revoked.
          | 
          | Regardless, the Stuxnet example is way off the mark. It was
          | designed to work in an air-gapped network and to defeat a
          | particular set of obstacles.
        
         | jezfromfuture wrote:
          | A nation-state virus that stole a cert is not a good argument
          | against this; Windows only started checking certs very
          | recently, so please get your head out of the sand. It's worked
          | great for many years for us Mac users.
        
       | lxgr wrote:
       | Is there any UI indication that OCSP checks have been
       | consistently failing for some period of time?
       | 
       | My concern is less local malware (if something malicious has
       | gained the privileges to filter OCSP, it's probably already game
       | over) but rather networks filtering ocsp.apple.com (for whatever
       | reason).
        
         | gruez wrote:
          | Does it matter? Would anyone (but the most paranoid) care? My
          | guess is that OCSP is supposed to be part of a
          | defense-in-depth strategy, so it doesn't have to be 100%
          | airtight to achieve its goals.
        
           | raxxorrax wrote:
           | I care that Apple or Microsoft don't get to know what
           | executables I run. That is information highly relevant to
           | security.
           | 
            | And what do you mean by paranoid? I think computing is in
            | danger of getting locked down, and that would be such a
            | large overall detriment for everybody. It is a matter of
            | having a broad perspective.
            | 
            | It is also political, because you create information
            | asymmetry. To be honest, to dismiss it as paranoia might not
            | be a really valuable assessment. Subjective, but I find it
            | fairly lacking.
        
             | gruez wrote:
             | > I care that Apple or Microsoft don't get to know what
             | executables I run. That is information highly relevant to
             | security.
             | 
              | I think you missed the context of this comment thread. The
              | "Does it matter" and "Would anyone care?" questions were
              | directed at the question of what happens (presumably from
              | a security point of view) if OCSP requests were being
              | blocked. It's not related to the discussion about whether
              | OCSP queries violate privacy or not.
        
       | [deleted]
        
       | xalava wrote:
        | Interesting, but reading the conclusion I'm fascinated by how,
        | in this affair, technically knowledgeable people lose common
        | sense to defend their favorite brand:
        | 
        | - Per-launch verification is terrible for privacy, vis-a-vis
        | Apple and the whole network, when it happens in plain text.
        | 
        | - "They should also explain how, having enjoyed their benefits
        | for a couple of years, they've suddenly decided they were such a
        | bad idea after all" misses another key issue: user information,
        | consent and control.
        | 
        | - Additionally, the public was only made aware because it
        | malfunctioned, which is also a security issue.
        | 
        | - Considering the current corporate culture, there are
        | legitimate concerns about what those choices might lead towards.
        
         | ogre_codes wrote:
          | > I'm fascinated by how, in this affair, technically
          | knowledgeable people lose common sense to defend their
          | favorite brand
          | 
          | What I find weird is that regardless of what the discussion
          | involving Apple is, someone needs to pop in with one of these
          | theories about Apple tribalism.
          | 
          | Very, very few people are in fact "defending" Apple here. Even
          | among those few, the sentiment is largely that this is bad and
          | Apple is fixing it.
        
         | rootusrootus wrote:
          | > how technically knowledgeable people lose common sense to
          | defend their favorite brand
         | 
         | Someone who says this usually holds an opposing position and
         | simply has their own tribal allegiance. Perhaps assume good
         | faith on the part of those who do not make the same choices you
         | do.
        
         | [deleted]
        
         | netsharc wrote:
          | Yeah, the article's last paragraph irks me... to reformulate
          | it in the context of domestic spying, it'd be like saying "the
          | NSA's communication monitoring has kept you safe for years;
          | now that you've heard of it, you decide it's a bad idea?".
          | 
          | Most people probably never noticed this phone-home feature
          | existed, just like they never knew the NSA was recording
          | everything. (Obviously anyone who bothered to look under the
          | hood could've seen it, but hey, how many people do that?)
        
           | Ar-Curunir wrote:
            | The Apple thing was not designed for explicit mass
            | surveillance; the NSA's programs are. That's kind of a big
            | difference.
        
         | valuearb wrote:
         | How is certificate checking a terrible idea?
         | 
         | It doesn't leak the application name or any personal
         | information, and Apple doesn't store it permanently.
        
           | ogre_codes wrote:
            | It initially logged the IP address and the associated
            | developer ID, which was a genuinely bad idea. They've
            | stopped logging IP addresses now.
           | 
           | The concept here is fine, they just screwed the pooch a bit
           | on implementation. And as usual, HN blew it out of
           | proportion.
        
           | saagarjha wrote:
           | Developer ID is an extremely good proxy for application name.
        
         | thinkingemote wrote:
          | No Logo by Naomi Klein outlined how brands work.
          | 
          | One factor in the irrational defence could be a kind of
          | psychological protection of investment. Apple isn't just
          | another company, it's an entire lifestyle ecosystem. Those
          | invested in Apple have the watch, TV, laptop, iTunes, etc. And
          | together they really do "just work" - the user experience is
          | great!
          | 
          | So to admit that Apple is flawed, that their investment was a
          | bad idea, is to admit they were wrong and that their time and
          | money were wasted. No one wants to be a sucker. Far better,
          | therefore, to protect your investment. Apple really are
          | geniuses to pull this off. Apple is part of people's identity.
        
           | nbzso wrote:
           | "One factor in the irrational defence could be a kind of
           | psychological protection of investment." The Author of this
           | publication is clearly invested deeply in Apple ecosystem as
           | a developer. One of the reasons that I consider using Mac OS
           | behind hardware firewall in the future is clear realisation
           | of this process. This telemetry malpractice clearly must be
           | prevented by legislative measures. Trust is earned by
           | transparency, not by some kind of Security slogans.
        
           | blub wrote:
            | But they're not wrong at all, and their time and money were
            | well invested.
            | 
            | The ecosystem does just work, and the user experience is
            | great compared to the alternatives. It's also possible to
            | use only specific products and switch off various cloud or
            | telemetry options. There's still a long way to go to reach a
            | private OS, but Apple has by far the best privacy stance
            | compared to Google or Microsoft, and there is _nobody_ who
            | offers such an OS right now. The best one can hope for is to
            | build their own Linux-based distro or use BSD, and then one
            | has to be prepared to invest a significant amount of time.
            | 
            | Apple hasn't been challenged under the GDPR yet because
            | everyone is still busy with Google and Facebook. But if you
            | or anyone else would like to lodge a complaint, maybe this
            | will be decided for the customers (I think it's borderline)
            | and we'll get an option to switch it off.
        
           | krrrh wrote:
            | The article points out that while there are drawbacks to
            | checking app signatures, there have also been documented
            | benefits in terms of uncovering vulnerabilities and making
            | systems more secure, which _also has direct privacy
            | benefits_ for the users whose systems don't become
            | compromised by malware.
            | 
            | The balancing act between freedom and security is always
            | going to be a debate. Engaging in it in good faith, as the
            | linked article does, is a reasonable approach (one you don't
            | usually see represented in Klein's oeuvre): consider
            | tradeoffs, counterarguments, and historical context from
            | different perspectives. Apple is flawed, sure, because all
            | complex solutions are inherently flawed. They have a
            | responsibility to be more open and transparent, and I'd
            | prefer to see more details and updates to their otherwise
            | laudable security whitepaper [1], and clearer, more
            | accessible user-definable toggles. But your or my preferred
            | solution probably isn't the ideal default for most users, or
            | for the ecosystem as a whole.
           | 
           | [1] https://manuals.info.apple.com/MANUALS/1000/MA1902/en_US/
           | app...
        
         | mhh__ wrote:
          | I think there's an element of people desperately wanting to
          | believe that Apple is their tribe, rather than just another
          | company.
          | 
          | I can believe Apple does care about privacy, but ultimately
          | they're just another company. For example, I'm sure Apple
          | would love the Epic lawsuit to be decided based on a poll of
          | HN users - "I would rather not have the freedom to run
          | whatever I want, because [insert bizarre anecdote]".
          | 
          | Don't project your own beliefs onto Apple; vote with your
          | wallet if they annoy you - it's just a trackpad.
        
           | blub wrote:
            | Could the people that vote with their wallet please also
            | stop caring about Apple so much, to the extent that they
            | _have_ to rescue those that are still Apple customers and
            | nitpick anything Apple-related to bits? Take a clean break,
            | it's healthier that way.
        
         | athms wrote:
          | > I'm fascinated by how, in this affair, technically
          | knowledgeable people lose common sense to defend their
          | favorite brand... when it happens in plain text
          | 
          | I'm fascinated by how technically knowledgeable people don't
          | understand OCSP.
          | 
          | Checking the revocation status of certificates is why OCSP was
          | created. It happens via HTTP. Why? Because you cannot check
          | the certificate used for an HTTPS connection over that same
          | HTTPS connection. Apple leveraged OCSP for Gatekeeper since it
          | does the same thing, checking certificates - in this case a
          | developer certificate. That is all it does.
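          | 
          | For the curious, a minimal sketch of what such a check looks
          | like on the wire, using Python's pyca/cryptography library
          | (the file names and responder URL are placeholders, not
          | Apple's actual endpoints):
          | 
          |     import requests
          |     from cryptography import x509
          |     from cryptography.hazmat.primitives import hashes, serialization
          |     from cryptography.x509 import ocsp
          | 
          |     # Leaf certificate and its issuing CA (placeholder files).
          |     cert = x509.load_pem_x509_certificate(open("cert.pem", "rb").read())
          |     issuer = x509.load_pem_x509_certificate(open("issuer.pem", "rb").read())
          | 
          |     # The request identifies the certificate, not the app.
          |     req = (ocsp.OCSPRequestBuilder()
          |            .add_certificate(cert, issuer, hashes.SHA1())
          |            .build())
          | 
          |     # Plain-HTTP POST, per RFC 6960 -- which is why the traffic
          |     # is visible to anyone on the network path.
          |     resp = requests.post(
          |         "http://ocsp.example.com",  # placeholder responder
          |         data=req.public_bytes(serialization.Encoding.DER),
          |         headers={"Content-Type": "application/ocsp-request"},
          |     )
          |     print(ocsp.load_der_ocsp_response(resp.content).certificate_status)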
        
           | moduspol wrote:
           | It's also easy to imagine what the blog posts would look like
           | if they did the same thing except over TLS--in a way that the
           | harmlessness / purpose of the request was not immediately
           | apparent.
           | 
           | I agree with you, though--it seems like they solved a valid
           | problem with the most obvious, commonly-used solution. The
           | real debate is probably just over whether or not the problem
           | is a sufficiently large threat to justify the downsides.
        
       | sneak wrote:
       | Apple also makes apps themselves, and, just like AmazonBasics,
       | has a tendency to clone popular apps (such a tendency that it has
       | a name: to be "sherlocked").
       | 
       | To ignore the fact that Apple receives global platform app
       | popularity data, which provides them a competitive advantage over
       | every other app developer, is somewhat foolish.
       | 
        | Gatekeeper has security benefits, yeah. But this data, just like
        | 3P sales stats for Amazon, has commercial value, even
        | anonymized/aggregated, and gives them an advantage over everyone
        | else they are competing with in the app market.
       | 
       | Let's not conflate things that are good for users with things
       | that are good for Apple, and be careful when assigning decisions
       | exclusively to one bucket or the other.
       | 
       | I spoke yesterday on the societal dangers of giving Apple such a
       | pass on privacy violations:
       | 
       | https://www.youtube.com/watch?v=iG-7FpHvv-8
        
         | [deleted]
        
       | daniellarusso wrote:
       | So, can you use macOS offline for extended periods of time?
        
         | mikkelam wrote:
         | Yes, it only checks when you're online
        
       | lapcatsoftware wrote:
       | I give up on Hacker News. Go ahead and wallow in your ignorance,
       | downvoting experts.
        
         | zepto wrote:
         | It has been widely known that these checks were happening.
         | 
         | Not only that, this isn't the first server problem that
         | impacted launch performance. It's just the most severe.
         | 
         | The main difference is that this time around there are people
         | who are claiming that Apple is using the OCSP checks for some
         | kind of nefarious tracking purposes.
         | 
         |  _These people have no evidence._
        
           | Sigmoid wrote:
           | "The main difference is that this time around there are
           | people who are claiming that Apple is using the OCSP checks
           | for some kind of nefarious tracking purposes."
           | 
              | What proof is there that we should trust Apple? They could
              | be tracking for nefarious purposes for all you know.
              | That's the problem.
        
             | zepto wrote:
             | That's true of every single organization and every single
             | individual.
             | 
             | You can always justify a conspiracy theory on the basis
             | that you can't prove a negative like this.
             | 
             | Let's consider another conspiracy theory:
             | 
             | "A state actor wants to install spyware, and Apple's OCSP
             | is a barrier to their goal. They are running an influence
             | campaign to get users to opt out of security protections."
             | 
             | There is no evidence for this theory.
             | 
             | But "for all you know" certain people posting here have
             | been paid to spread disinformation as part of this
             | conspiracy.
             | 
             | (Just to be clear - there is no evidence for this, and I
             | don't think it is likely)
             | 
             | In the absence of evidence, it is not rational to
             | _completely dismiss_ either or both possibilities (that
             | Apple has a hidden agenda or that there is a conspiracy to
             | weaken Apple's security).
             | 
             | What is irrational is _to use the absence of evidence to
             | the contrary to convince yourself that something is
             | obviously true_.
             | 
             | However on the broader point - I agree that _we should not
             | be reliant on trusting Apple_ for our privacy and security,
             | and cannot afford to be as we move into the future.
             | 
             | We need a public domain infrastructure that produces
             | similar or better security and privacy outcomes to the ones
             | Apple is claiming to provide.
        
               | Sigmoid wrote:
               | > You can always justify a conspiracy theory on the basis
               | that you can't prove a negative like this.
               | 
                | It's not about definitively claiming they are being
                | nefarious; it's that they CAN be, and Apple isn't
                | transparent enough for us to know if they're not. So
                | it's about risk. People can use Apple products, I don't
                | really care, but they risk their privacy when they do,
                | and that's not a risk people should have to take when
                | using an OS.
        
               | zepto wrote:
               | Any software vendor CAN be nefarious.
               | 
               | It is just innuendo to claim it about a particular one
               | without evidence.
               | 
               | People don't risk their privacy by trusting Apple any
               | more than they do by trusting anyone else. Almost
               | certainly less so than by trusting a company that makes
               | money out of personal information.
               | 
               | Singling Apple out without evidence is misleading
               | innuendo.
               | 
               | If we want people to have the option not to trust private
               | corporations, we need to create infrastructure that
               | currently doesn't exist.
        
               | [deleted]
        
           | [deleted]
        
       | dcow wrote:
       | Security and privacy are not parallel concerns, they're
       | orthogonal. Strong security absolutely does not imply utmost
       | privacy. I find this to be the most dangerous misconception of
       | the late privacy trend. You can't just turn security and privacy
       | dials to 11. They're actually two ends of the same dial, or
       | opposing poles of the same sphere. To increase privacy you must
       | move away from perfect security.
       | 
       | Why? Because security is all about who you trust (and who you
       | don't). Privacy is about concealing things from people you trust
       | (and especially from those you don't). Security is best served
       | with strong identity and periodic integrity checks/monitoring.
        | Privacy is best served via anonymity and opacity. If something
        | is private, it by definition lacks the transparency needed to
        | audit its integrity.
       | 
       | So what I lament is not that a company is trying to achieve both,
       | but rather that as a consumer I'm not educated on the topic and
       | able to make a choice as to where I want to set the dial.
       | 
        | You will continue to see "headlines" like this so long as,
        | socially, we're obsessed with trying to implement both security
        | and privacy, and fall subject to marketing suggesting some
        | service provides the maximum of both.
       | 
       | If you trust Apple to verify the integrity of apps on your
       | devices and secure your system from unwanted software, then you
       | trust Apple to maintain the privacy around the data needed to
       | achieve such. That's the whole value prop of their platform and
       | ecosystem. It's a walled garden with abundant privacy koolaid
       | fountains.
       | 
       | The only reason this is news is because people don't understand
       | the privacy vs security dichotomy. And because Apple does not
       | provide a way for consumers to choose just how much security
       | they're comfortable with.
       | 
       | If you don't trust Apple then stop pretending you do by using
       | their hardware/ecosystem.
        
         | WhyNotHugo wrote:
         | While security and privacy are not parallel, they are not
         | orthogonal either.
         | 
         | You can have both. Integrity checks can be done anonymously
         | (e.g.: you could have a p2p network of devices sharing a signed
         | database of certificate revocations).
         | 
         | Encrypting something gives me privacy. Signing something gives
         | me security. Encrypting a signed package gives me both.
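          | 
          | A toy sketch of that composition in Python (pyca/cryptography;
          | keys generated inline just for illustration):
          | 
          |     from cryptography.fernet import Fernet
          |     from cryptography.hazmat.primitives.asymmetric.ed25519 import (
          |         Ed25519PrivateKey,
          |     )
          | 
          |     signing_key = Ed25519PrivateKey.generate()
          |     package = b"app binary bytes"
          | 
          |     sig = signing_key.sign(package)       # security: tamper-evident
          |     key = Fernet.generate_key()
          |     sealed = Fernet(key).encrypt(sig + package)  # privacy: opaque in transit
          | 
          |     # Receiver decrypts, then verifies (Ed25519 sigs are 64 bytes);
          |     # verify() raises InvalidSignature on tampering.
          |     blob = Fernet(key).decrypt(sealed)
          |     signing_key.public_key().verify(blob[:64], blob[64:])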
        
           | dcow wrote:
            | They _are_ orthogonal; you're just saying that you can have
            | some of both, which is exactly my point about them existing
            | on a spectrum.
            | 
            | And it's not as simple as encrypting data. You have to trust
            | somebody to determine what good integrity looks like and
            | then to verify that the integrity information is fresh. The
            | same privacy concern exists if you run OCSP against
            | ciphertext as against plaintext. You still have a stream of
            | all the things people do. Bad for privacy.
            | 
            | Running a decentralized system means you have to trust all
            | the nodes not to store data or collude. Same problem in a
            | different way. You simply cannot achieve integrity
            | verification if you don't trust anyone to do it. And this is
            | why they are orthogonal. Trust is not compatible with doubt.
            | 
            | The issue here is that Apple took off-the-shelf OCSP and
            | applied it in a way it was not designed for. So there _are_
            | actual problems with their implementation. They should be
            | fixed. And personally I think OCSP is kinda dumb because it
            | mechanically defeats the advantage of certs (you don't need
            | a cert if you're going to phone home for every invocation,
            | just check a hash), but meh.
        
       | jchw wrote:
       | > What has been puzzling me ever since is that these OCSP checks
       | have been well-known for a couple of years,
       | 
        | This is where you're wrong. It was known, but not well-known.
        | Users do not expect an HTTP request to block their application
        | launches.
        | 
        | Personally, I don't see why a CRL is not sufficient. Yes, I want
        | malicious signatures blacklisted, but can't I just get a list
        | instead? Some of the reasons CRLs are no longer used do not
        | apply to code signatures.
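        | 
        | Roughly what the CRL approach looks like (a sketch with
        | pyca/cryptography; the CRL URL and certificate file are
        | placeholders): fetch the list once, then every later check is
        | local and leaks nothing.
        | 
        |     import requests
        |     from cryptography import x509
        | 
        |     # One download, refreshed on whatever schedule you like.
        |     crl = x509.load_der_x509_crl(
        |         requests.get("http://crl.example.com/devid.crl").content)
        | 
        |     cert = x509.load_pem_x509_certificate(open("cert.pem", "rb").read())
        |     hit = crl.get_revoked_certificate_by_serial_number(cert.serial_number)
        |     print("revoked" if hit is not None else "not on this CRL")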
        
       | mensetmanusman wrote:
        | The only charitable understanding of this program is that Apple
        | has no actual table connecting software to hashes, but that they
        | could use the information to spot outbreaks of botnets/spyware,
        | which they could then help ISPs/global law enforcement stop.
        | 
        | Is this even reasonable?
        
         | valuearb wrote:
         | OCSP requests don't identify software, they identify
         | certificates. Each certificate can be used for dozens of
         | different apps.
        
         | lawnchair_larry wrote:
         | That is not correct. It does a live check when presented with a
         | certificate, to make sure that certificate has not been revoked
         | for signing malware. It doesn't store anything. Apple are not
         | saving information. It's just an online blacklist check. That's
         | how OCSP works everywhere, it isn't an Apple thing. They are
         | using the standard protocol as documented in the RFC.
        
           | zeusflight wrote:
            | There is nothing in plain OCSP that prevents the responder
            | server from logging the request along with the originating
            | IP. Any claim that a particular server doesn't do so is
            | either just an assumption or based on trust alone. This is
            | why OCSP stapling is preferred over plain OCSP in browsers,
            | and also why plain OCSP can be disabled. In this particular
            | case, trustd and other system daemons are known to skip VPN
            | and firewall blocks - so it's a mandatory information leak.
        
         | user-the-name wrote:
         | The requests contain only app hashes. They do not contain the
         | unique hardware identifier that Apple computers have. They do
         | not contain your Apple ID, identifying you as a user.
         | 
         | Why would you not interpret this charitably as them not
         | actually trying to spy on you? If they wanted to spy on you,
         | why on Earth would they not send the actual valuable
         | information?
        
           | mensetmanusman wrote:
           | Your IP address can be pretty accurately matched to an
           | identity due to PRISM.
           | 
           | Has a third party verified they don't keep track of IP
           | addresses?
        
             | user-the-name wrote:
              | An Apple ID _is_ tied to an identity right now. They don't
              | use it. Why?
        
           | beervirus wrote:
           | Is it even app hashes, or developer certificate hashes?
        
             | jtbayly wrote:
                | The larger the program, the longer it takes to verify
                | the first time you run it. I can only interpret that as
                | meaning they are hashing the whole program.
                | 
                | Source: me, downloading and running various apps
        
               | jtbayly wrote:
               | Apparently the initial launch delay is Gatekeeper, which
               | is something different.
        
           | m463 wrote:
           | I believe you are trying to think about this rationally, but
           | this is not the only thing going on.
           | 
            | When a Mac boots up or changes network location, a long list
            | of processes on your machine (like AppleIdAuthAgent,
            | identityservicesd, and maybe 10 or 20 more) connect to
            | various Apple servers, associating your _actual identity_
            | with the IP address. It will continue to do these kinds of
            | things while you are online. And all this is interleaved
            | with OCSP requests.
        
             | cdubzzz wrote:
             | Do you have sources for this? I don't doubt you, I'd just
             | be interested to read some details about what exactly those
             | services are doing/sending.
        
             | user-the-name wrote:
             | It is not possible to accurately connect an identity with
             | an IP address. Many computers share IP addresses, and many
             | others jump IP addresses frequently.
        
               | danShumway wrote:
               | Shoot, apparently those Tor devs have been completely
               | wasting their time.
               | 
               | Somebody in the security industry should let them know
               | that their work on the network level is useless and
               | unnecessary because leaking IP addresses isn't a real
               | privacy threat.
        
               | user-the-name wrote:
               | Accurately. The key word in there was "accurately".
        
               | danShumway wrote:
               | If IP addresses couldn't in some cases accurately track
               | users, then it wouldn't be a priority to build a network
               | that obscured them.
        
               | user-the-name wrote:
                | If they can only do it "in some cases", then they can't
                | do it accurately - that is the entire point. They can do
                | it SOMETIMES.
               | 
               | Apple has more accurate information. They are not sending
               | it. Why?
        
               | danShumway wrote:
               | I think you might be confusing accuracy and reliability.
                | But I'm not here to argue about semantics; you can use
                | whatever definition of accuracy you want. IP addresses
                | are a large attack vector for deanonymization/tracking,
                | and
               | people should be thinking more about how IP addresses get
               | leaked and in what contexts. Whatever definition of
               | "accuracy" you want to use, I don't think that changes
               | the overall point that IP addresses matter.
               | 
               | To your earlier comment:
               | 
               | > Why would you not interpret this charitably as them not
               | actually trying to spy on you?
               | 
               | I fully agree with this. I think Apple's intention is not
               | to spy on users, it's to A) stop malware, and B) exert
               | more control over their ecosystem in general.
               | 
               | I have a problem with point B, but that's a separate
               | conversation.
               | 
               | To your later point:
               | 
               | > It is not possible to accurately connect an identity
               | with an IP address.
               | 
               | This is just plain wrong; it is possible to accurately
               | connect identities to IP addresses, people do it all the
               | time. Reliability is a separate conversation. If you're
               | working off of a different definition of "accurate",
               | then, whatever, I don't care. But I stand by the point
               | that IP addresses are a privacy risk and that they can be
               | used to track/identify users in the real world, I don't
               | think that's a disputable fact.
               | 
               | The privacy worry here is twofold. First that these
               | requests are (currently) sent in plaintext. To their
               | credit, I think Apple is fixing that issue. Which is
               | good, because plaintext payloads allow adversaries on the
               | same network to potentially sniff which applications
               | you're using.
               | 
               | The second privacy worry is that regardless of whether or
               | not Apple is trying to spy on users, they still might be
               | storing that data, and we only have their word to go on
               | that it'll be stored in a protected way or deleted
               | regularly. If a court order comes down asking Apple to
               | reveal that information, or if it gets hacked, that data
               | increases the threat model for Apple users.
               | 
               | An adversary with access to Apple's data (whether that
               | adversary is a hacker or a government) could
               | theoretically tie app usage to real-world identities if
               | the IP addresses are logged, especially if they can match
               | IP addresses with logging of other data being sent to
               | Apple's servers. That's the point m463 is making in his
               | original comment -- the data here doesn't just exist in
               | isolation, it's IP addresses that an attacker could
               | correlate with other real-world identifying information.
        
               | Bjartr wrote:
               | The real key word is "legally". IP addresses + other
               | metadata (browser fingerprinting and the like) can be
               | enough to sufficiently identify an individual, or at
               | least a household, for some purposes, such as making a
               | more effective advertising profile.
        
               | valuearb wrote:
               | Coulda woulda shoulda isn't evidence.
        
               | Bjartr wrote:
               | I'm having trouble figuring out your meaning in this
               | context. Care to explain?
        
               | valuearb wrote:
               | There is no evidence Apple is storing any information
               | collected via OCSP for advertising, or collecting
               | metadata along with it. Effectively all they know is an
               | IP address and a hash identifying a developer certificate
               | that can correspond to any of dozens of apps.
        
               | user-the-name wrote:
               | Collecting the data based on IP would basically have the
               | exact same legal status as doing it based on other
               | identifiers. No browser is involved here, so no
               | fingerprinting. And finally, Apple is not in the
               | advertising business and has no need for advertising
               | profiles.
        
               | bogwog wrote:
               | Let's say you're one out of 10,000 users in a large
               | network sharing a single public IP address.
               | 
               | Anyone trying to identify you just needs to narrow that
               | down from 10,000 to one. This can be done many different
               | ways by combining data sources. You could automate it
               | with algorithms and maybe some machine learning, but it'd
               | also be pretty trivial for a dedicated human to do it.
               | 
                | Browser fingerprint, MAC address, software versions,
                | browsing behavior (HTTPS doesn't hide hostnames), access
                | times, _and application hashes_ will all help narrow
                | down that search.
               | 
               | Maybe your university requires you to install some
               | special software (like one of those locked down browsers
               | for exams), and the attacker knows an approximate range
               | of time when you installed it based on your exam
               | schedule. They could narrow their search by intercepting
               | OCSP requests and filtering for application hashes that
               | match the specific version of the software you most
               | likely downloaded, based on the time you most likely
               | downloaded it.
        
               | user-the-name wrote:
               | But why? Why go to these ridiculous lengths to try to
               | extract information from an unreliable source?
               | 
                | Apple already has the device identifier and Apple ID.
                | These are reliable. They do not require combining
                | different data sources, algorithms, and machine
                | learning. They are the accurate data already. If they
                | wanted it, they could just send it.
               | 
               | They don't. Why not? If they wanted this information, why
               | on god's green earth would they not just send it?
        
               | bogwog wrote:
               | I didn't mention Apple. Anyone intercepting the OCSP
               | traffic (which isn't encrypted) could use it to track
               | you.
        
               | sudosysgen wrote:
               | It's quite obvious. That way you can't get nailed for
               | breaching privacy.
               | 
               | It's exactly the same concept as the NSA saying that they
               | are only collecting metadata and not doing any spying.
        
               | user-the-name wrote:
               | It would do absolutely nothing whatsoever for the
               | legality of any collection they might want to do.
        
               | sudosysgen wrote:
                | Of course it would. IPs are data that must be gathered
                | anyway. They can be correlated with real identities
                | almost all the time using other requests, but under many
                | data regulation laws they wouldn't be allowed to send
                | personally identifying information that isn't necessary.
        
               | makeworld wrote:
                | Many computers don't share IP addresses, either. And
                | there are times when there is only one macOS machine
                | behind an IP. This is not something to just wave away.
        
               | user-the-name wrote:
               | The point is, this information is UNRELIABLE.
               | 
               | Apple has access to information that is ACTUALLY
               | RELIABLE.
               | 
               | However, they choose to not use the reliable information.
               | 
               | Why would you see this, and assume that they, on purpose,
               | decided to NOT use the reliable information, but instead
               | use unreliable information to spy on you?
               | 
               | Why would they do something so bone-headedly stupid?
        
               | LocalH wrote:
               | Do you reject the possibility that they _combine_ the
               | reliable information with the unreliable information to
               | strengthen the latter? I mean, if you have logs stating
               | that a specific Apple ID connected from a specific IP
               | address at a specific time, and they know the identity
               | tied to that Apple ID, it seems reasonable that it would
                | strengthen the claim of "the user at this IP address is
                | most likely this person" quite a bit, especially if the
                | same IP shows up regularly with that Apple ID? I'm not
                | saying they _do_ this currently, but I'm just drawing
                | attention to the fine line between privacy and security.
               | Vigilance isn't necessarily a bad thing, as long as it's
               | grounded in the full reality. It's always possible that
               | one day, Apple's focus in this area may change. When it
               | does, it helps if part of that discussion is already
               | being had.
        
               | user-the-name wrote:
                | I reject that, yes, because they could just collect the
                | reliable information. The unreliable information is of
                | no value.
        
           | athms wrote:
           | >The requests contain only app hashes.
           | 
           | No, application hashes are never sent. People are confusing
           | OCSP and notarization. The requests sent via OCSP check the
           | developer certificate. Notarization checks are done over
           | HTTPS.
        
           | whatever1 wrote:
            | What does that even mean? Of course they can identify you;
            | you are knocking on their door from the same IP as your
            | iCloud account. Maybe the file you are giving them does not
            | have your UID, but as long as you have connected your Mac to
            | your Apple account you are uniquely identified.
        
             | user-the-name wrote:
             | Plenty of computers share a single public IP address.
             | Plenty of computers jump IP addresses constantly. It is not
             | at all reliable trying to tie IP addresses together that
             | way.
             | 
             | If they wanted the information, they would need to send it.
        
               | oauea wrote:
               | Maybe you have a funny network situation, but my network
               | situation is solid, and my IP address rarely changes. IP
               | addresses are absolutely identifiable information.
        
               | user-the-name wrote:
               | They are SOMETIMES identifiable information. They are not
               | RELIABLE for identification.
               | 
               | If you are going to be collecting information, you are
               | not going to choose unreliable information when you have
               | the option to collect reliable information. That would
               | not make sense.
        
               | whatever1 wrote:
              | I can right now open my VPN, change my IP address, go to
              | my Apple account, and it will report that my Mac is
              | "Online", meaning that Apple knows that my particular
              | machine is on with a given IP address. It would be
              | completely useless (redundant at the very least) to also
              | include the UID in the signature file.
        
               | user-the-name wrote:
              | There might be ten other, a hundred other people with the
              | same public IP visible to Apple, if your ISP is running
              | you through a NAT.
              | 
              | The fact that you have connected from an IP recently in no
              | way guarantees that the next connection from that IP will
              | be coming from you. That is exactly why you need to send
              | an identifier with any data being collected.
        
               | whatever1 wrote:
              | And Apple updates your IP continuously on their servers,
              | as evidenced by the "Find My" service.
        
               | m463 wrote:
              | Every time you change network locations, all of these
              | daemons contact Apple again. There are plenty of push
              | services on your machine, and Apple keeps track of where
              | you are.
              | 
              | Also, IPv6.
        
           | lovelyviking wrote:
           | >They do not contain the unique hardware identifier that
           | Apple computers have
           | 
            | A different part of macOS could send it, and you would have
            | no idea until it malfunctions, like in this case.
        
             | user-the-name wrote:
             | They would not be accurately tied together, though, since
             | these requests are not sent with any kind of identifier.
        
               | lovelyviking wrote:
                | Who knows, maybe they can start to send an ID too, by a
                | special request from the server, or send those hashes
                | through other channels in addition to this one, but much
                | later. And the server can make such a request based on
                | some heuristics or on some 'black list' of hashes. How
                | can you know for sure? We can only guess to a certain
                | degree without looking into the sources.
        
           | oauea wrote:
            | The request also contains your IP address. A charitable
            | design would be distributing a Bloom filter and checking
            | matches locally.
        
             | user-the-name wrote:
             | Your IP address is a very weak and unreliable way to
             | identify you. If they actually wanted to identify you, they
             | have much better ways to do it, and those ways are not
             | used.
        
       | JackC wrote:
       | > Those who consider that Apple's current online certificate
       | checks are unnecessary, invasive or controlling should
       | familiarise themselves with how they have come about, and their
       | importance to macOS security. They should also explain how,
       | having enjoyed their benefits for a couple of years, they've
       | suddenly decided they were such a bad idea after all, and what
       | should replace them.
       | 
       | I agree that anyone critiquing Apple's OCSP design should
       | understand it, and the critique should be more nuanced than "just
       | turn that feature off." Computers are now skeleton keys to our
       | lives and we have to go forward rather than back in figuring out
       | how to design them so they can safely do everything we need them
       | to do.
       | 
        | But it's not hard to justify the sudden criticism here -- it
        | happened after Apple's bad design of the OCSP feature broke
        | local applications, drawing a lot more attention to how it
        | worked. It's reasonable to then ask whether other parts of the
        | design were also poor, as Apple itself obviously is doing,
        | judging from the changes it's already announced.
       | 
       | To take the author up on what should replace OCSP checks -- how
       | about using something like bloom filters for offline checks, and
       | something like haveibeenpwned's k-anonymity for online checks, to
       | remove the possibility that either Apple or a third party could
       | use OCSP for surveillance?
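        | 
        | To make the k-anonymity half concrete, a hedged sketch (the
        | 5-hex-digit prefix length is borrowed from haveibeenpwned, and
        | fetch_bucket stands in for the server call): the client reveals
        | only a hash prefix, and the final match happens locally.
        | 
        |     import hashlib
        | 
        |     def is_revoked(cert_der: bytes, fetch_bucket) -> bool:
        |         digest = hashlib.sha256(cert_der).hexdigest()
        |         prefix, suffix = digest[:5], digest[5:]
        |         # Server sees only the prefix; it returns every known
        |         # revoked-cert hash suffix in that bucket.
        |         return suffix in fetch_bucket(prefix)
        | 
        | The bloom filter half would ship a compact bit array of revoked
        | cert hashes with OS updates; membership tests are purely local,
        | false negatives are impossible, and the rare false positive
        | could fall back to an online (k-anonymized) check.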
        
         | nmadden wrote:
         | The browser vendors have been looking at this problem for a
         | long time. See
         | https://blog.mozilla.org/security/2020/01/09/crlite-part-1-a...
         | for example (bloom filters included).
        
         | bitL wrote:
          | Why can't Apple download all the footprints of bad apps
          | locally instead of monitoring every single invocation of apps?
          | Is a second execution of an app the same security risk as the
          | first one? That's the design flaw.
        
           | Someone1234 wrote:
           | You mean bad certificates rather than applications.
           | 
           | OCSP can be locally cached, and Apple's implementation does
           | exactly that. But eventually you'll have to refresh the cache
           | and then the implementation needs to be fault tolerant
           | (Apple's wasn't).
           | 
            | OCSP leaks which vendors your installed applications come
            | from. The list of revoked certificates changes daily, so any
            | good implementation is going to check again at least several
            | times a week. If you download the entire database, you're
            | just consuming hundreds of megabytes of bandwidth/storage
            | but aren't removing the need to refresh/expire the cache.
           | 
            | I'd argue the two biggest flaws Apple's system has are bad
            | fault tolerance and no user-accessible opt-out (even if just
            | for emergencies).
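            | 
            | A sketch of what "fault tolerant" could mean here, with a
            | TTL cache and a soft-fail default (query_responder is a
            | placeholder for the actual network call):
            | 
            |     import time
            | 
            |     TTL = 12 * 60 * 60  # seconds; Apple reportedly now caches ~12h
            |     _cache = {}         # cert_hash -> (verdict, expiry)
            | 
            |     def check(cert_hash, query_responder):
            |         hit = _cache.get(cert_hash)
            |         if hit and hit[1] > time.time():
            |             return hit[0]
            |         try:
            |             verdict = query_responder(cert_hash)  # "good"/"revoked"
            |         except OSError:
            |             return "good"  # soft-fail: never block launches on outage
            |         _cache[cert_hash] = (verdict, time.time() + TTL)
            |         return verdict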
        
             | aftbit wrote:
             | I doubt the full list of hashes of all revoked certs is
             | 100s of MB, and even if it is, the daily update file surely
             | isn't that big.
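              | 
              | Back-of-envelope (assumed numbers, for scale): even a
              | million revoked certs at 32 bytes per SHA-256 hash is only
              | ~32 MB, and a daily delta would be far smaller.
              | 
              |     revoked, hash_bytes = 1_000_000, 32
              |     print(revoked * hash_bytes / 1e6, "MB")  # 32.0 MB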
        
             | kstrauser wrote:
             | I wonder how hard it'd be to serve this data via DNS, like
             | "dig -t TXT 0xdeadbeef.ocsp.apple.com". Then you get a
             | nice, distributed architecture with lots of built-in cache
             | handling, and since the data is currently served via HTTP,
             | it wouldn't expose any more data to your ISP than already
             | is today. It would also mean that if you have 100 people in
             | the office and a local DNS cache, then each OCSP query
             | would be made exactly once and then its answer shared among
             | everyone else in the office.
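              | 
              | With dnspython that lookup is a one-liner (the record is
              | hypothetical -- Apple doesn't publish OCSP data over DNS):
              | 
              |     import dns.resolver  # pip install dnspython
              | 
              |     # Intermediate resolvers cache the answer per the TTL,
              |     # so repeat queries never leave the office network.
              |     for rr in dns.resolver.resolve("0xdeadbeef.ocsp.apple.com", "TXT"):
              |         print(rr.to_text())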
        
               | saagarjha wrote:
               | That still tells people what you're running, though.
        
               | kstrauser wrote:
               | Imagine you have a shared office DNS resolver (which is
               | pretty common). That resolver would aggregate all of the
                | requests into one shared, cached stream. Then the
                | question becomes "hey Apple, one of the however many
                | thousand people behind me would like to know if Adobe's
                | certificate is still valid". That's reasonably
                | anonymized, I think.
        
             | pydry wrote:
              | Differential downloads are a solved problem:
              | https://docs.microsoft.com/en-us/windows/deployment/update/p...
        
             | loeg wrote:
             | > OCSP can be locally cached, and Apple's implementation
             | does exactly that.
             | 
             | In the earlier HN thread when the server was offline, it
             | was said that Apple only cached OCSP results for 5 minutes.
             | Is that not true? If it is true, I don't think that's what
             | GP is asking for as far as local caching.
        
               | rbrtl wrote:
               | It was changed after the event and is now 12 hours. Which
               | is probably more appropriate considering that most people
               | don't restart their applications every few minutes.
        
             | ehsankia wrote:
              | Just so that I understand this correctly: by caching do
              | you mean caching the results of a specific check, or, as
              | the above was implying, downloading the list of all bad
              | signatures and doing the check 100% locally?
              | 
              | The issue, from my understanding, was half the breakage
              | and half the fact that Apple was sending back telemetry
              | about what apps you launched.
        
           | [deleted]
        
           | epistasis wrote:
            | The blacklist of malware is called XProtect and dates back
            | to 2009. This check for revoked certificates is a different
            | security layer.
            | 
            | The second check of an app is necessary to check for
            | revocation: for a developer who decides that they've been
            | compromised and wants to stop execution of their software.
            | The alternative would be to use certificate revocation lists
            | instead of OCSP. CRLs can get long, so OCSP is often
            | preferred.
        
             | als0 wrote:
              | I've been wondering why a CRL couldn't be used as a
              | fallback if the OCSP server goes down or no reply is
              | received. That way you get the benefits of both. Any
              | reason why this would be a bad idea?
              | 
              | Also, are CRLs really that bad in practice? I know it
              | would be a bad idea on a smartphone, but is it really an
              | issue on a laptop?
        
             | benhurmarcel wrote:
             | Probably it's not necessary to check for revocation that
             | often though. It could even be argued that it's only needed
             | when the binary is updated.
             | 
             | And it's absolutely not normal to fail if that revocation
             | check doesn't succeed anyway.
        
           | [deleted]
        
           | rbrtl wrote:
            | In answer to the first: because the information is
            | unreliable without _more_ invasive technologies ensuring
            | that the local file is up to date. To the second: perhaps
            | not, but if the information on bad actors (app distributors
            | in this instance) is out of date, you'll continue running a
            | compromised app.
           | 
           | Are you familiar with OCSP conceptually? I have done a
           | reasonable amount of work with signatures and certificates,
           | including OCSP. All my experience is in a commercial,
           | enterprise context but I think these technologies need to
           | start filtering down to the consumer before the capability
           | for security evaporates.
           | 
           | I think it's a consumer-positive direction for Apple to
           | provide this service. I would be interested to hear from
           | someone who holds the view that this is not a service, or
           | disagrees in other ways, but I think this is the right
           | direction for consumers. The alternative, as I see it, is
           | that every person installing an app needs to start searching
           | for CVE notices and headlines in trade papers declaring a
           | compromise.
           | 
            | Apple have applied enterprise middleware to their
           | infrastructure. I think perhaps they could have been more
           | transparent in the delivery. A lot of the outrage now is
           | driven by people only finding out about the underlying
           | process for the first time. I stand by the right of these
           | companies to choose their business model to disallow (or
           | restrict) execution of apps they believe to be compromised. I
           | also firmly believe in a varied and free market for software,
           | hardware, and infrastructure.
           | 
           | In essence: You can choose to use Apple and do it the Apple
           | way. Equally you can choose to build your computer from
           | components sourced from anywhere, install any free OS, and
           | any apps. Personally I do choose to do it the Apple way, and
           | I am inconvenienced by that from time to time. I curse my
           | computer and its creators on a daily basis. It's part of the
           | relationship we all build with our tools.
           | 
           | got a bit off track towards the end...
        
         | flemhans wrote:
          | How many revocations do they do? How about just downloading
          | the whole list to the clients, which they can check offline?
        
           | JackC wrote:
           | It sounds like that's how it used to work, and then it
           | changed for some reason, maybe to do with the size of the
           | list and a need for faster updates.
           | 
           | Actually that unknown exposes the problem with the original
           | article's demand that critics explain what they want to
           | replace Apple's system. No one outside of Apple is in a
           | position to design a system that addresses all of the design
           | constraints -- we don't even know what they all are. But we
           | _are_ in a position to assert some additional design
           | constraints, such as requiring that the system not leak
           | developer certs to eavesdroppers every time an application is
           | run, and expect Apple to figure out a solution that takes
           | them into account.
        
       | musicale wrote:
       | How long they've done it has no bearing on whether it is an
       | outrageously bad idea. It was bad in Catalina and it's bad in Big
       | Sur.
       | 
        | I found it particularly annoying in Catalina to have these slow,
        | unnecessary, and intrusive online checks every time you run a
        | Python script, and it drove me up the wall until I figured out
        | you could turn off the idiotic behavior in preferences.
        
       | intricatedetail wrote:
        | Any opt-out protection racket should be illegal. Even if it is
        | "anonymous", someone could be identified from their behavioural
        | patterns, which are unique to each human. I hope that Apple gets
        | at least a hundred billion in fines for such a brazen violation
        | of privacy so they will learn their lesson, and they should be
        | ordered to delete all personal data they don't have a legitimate
        | business need for.
        
         | user-the-name wrote:
          | Why would they try to identify you from your "behavioural
          | patterns" when they already have the device identifier and
          | your Apple ID, which identify you and your computer uniquely?
          | 
          | Which, to be very specific, they _do not send_. They COULD
          | identify you extremely easily, and they specifically chose not
          | to do that.
        
           | yyyk wrote:
           | It's sent unencrypted. Anyone else on the network could use
           | those patterns to track and identify you, and with good
           | reason, because they do not have the ID.
        
           | lovelyviking wrote:
           | Maybe they have other channels which could, or already do,
           | send the device identifier and Apple ID. How do you know what
           | _else_ they do?
           | 
           | Who knew they were sending hashes until the recent
           | malfunction? What _else_ do we not know now?
        
             | user-the-name wrote:
             | If they wanted that information tied together, they would
             | send it together. Otherwise they have to keep guessing
             | about what goes together with what. That would make no
             | sense.
             | 
             | And it was known this data was being sent.
        
               | lovelyviking wrote:
               | >That would make no sense.
               | 
               | Hiding such a feature, with no option to turn it off,
               | would also make no sense to me, but they did it.
               | 
               | >If they wanted that information tied together, they
               | would send it together.
               | 
               | Sure, unless they wanted to hide the fact that they
               | wanted the information tied together. In that case they
               | can always fall back on the very argument you are making
               | once they are caught, and say "Oh, if we wanted to spy we
               | would do it openly and brutally. There is no reason for
               | us to make it complicated." But something tells me that
               | someone who wished to spy would do it in some
               | sophisticated manner. The bottom line is that we do not
               | know, and we _can_not_ know, since we have not seen _all_
               | the sources.
               | 
               | You also cannot know what they wanted, so your guess is
               | as good as any other. Or do you somehow know something we
               | don't?
               | 
               | The bottom line here is the same: who knows.
               | 
               | But I think if they wanted to make it open and secure,
               | they would simply have put an option in the GUI, with
               | clear text explaining what it does and how, and the
               | ability to turn the thing off. They didn't do it that
               | way. To me that could mean they are hiding something.
               | What else they hide, and why, we simply do not know.
        
       | nullc wrote:
       | A common refrain in arguments that we don't need laws to protect
       | privacy is that the market will take care of it. The market can't
       | act against what it can't see. Privacy loss is often
       | irreversible.
       | 
       | A common refrain in arguments that we don't need to reject closed
       | source software to protect privacy is that being closed source
       | doesn't hide the behaviour, and people will still notice
       | backdoors and privacy leaks. Sometimes they do, sometimes they
       | don't.
       | 
       | Parts of the US Government's unlawful mass domestic
       | surveillance apparatus were described in the open in IETF
       | drafts and patent documents for years. But people largely
       | didn't notice, and regarded reports as conspiracy theories for
       | a long time.
       | 
       | Information being available doesn't necessarily make any
       | difference, and making a difference is what matters.
        
         | chrisweekly wrote:
         | Great points.
         | 
         | Related tangent: for those interested in this topic of
         | "unlawful massive domestic surveillance", I heartily recommend
         | Cory Doctorow's "Little Brother" novels -- esp the most recent,
         | "Attack Surface". They get the technical details right while
         | remaining accessible and engaging regardless of the reader's
         | geek acumen.
        
         | ClumsyPilot wrote:
         | Lying to the customer about what your product does, or having
         | secret functionality, should be a criminal offence in the same
         | way as breaking and entering or stalking are.
         | 
         | Then, we would find out very quickly what people value.
         | 
         | I firmly believe this ecosystem (as in the privacy-violating
         | ad and data-selling business model) is only dominant because
         | companies are able to mislead with impunity, so it's basically
         | a form of fraud.
        
           | valuearb wrote:
           | Yes because there are never unwanted side effects from more
           | laws.
        
             | oblio wrote:
             | That's human nature. As soon as something beneficial to a
             | few and detrimental to others is banned, those who benefit
             | seek other ways to continue benefitting, again to the
             | detriment of others.
             | 
             | This doesn't mean we shouldn't continue trying to stop
             | them.
             | 
             | And we stop them through laws.
             | 
             | Common sense is not that common and human decency doesn't
             | scale.
        
               | valuearb wrote:
               | In the US, the typical citizen commits an average of a
               | felony a day. The legal code and associated regulations
               | are so lengthy no one can read all of them. The tax code
               | alone is 2,600 pages and associated rulings 70,000 pages.
               | 
               | When you have so many laws, they can be applied
               | selectively depending on your political status, or to
               | benefit the regulators or their friends. We just caught
               | the sheriff of Santa Clara extorting citizens for tens of
               | thousands of dollars to get concealed carry permits,
               | which is why many people carry illegally, just like
               | criminals.
               | 
               | Creating a law where non-disclosure of the smallest
               | feature opens you up to harassment by government
               | regulators is another avenue for graft and corruption,
               | like when the EU selectively prosecutes US companies, or
               | US regulators selectively harass companies not in lock
               | step with the current administration.
               | 
               | I think if you really need a nanny state to protect you
               | from your own decisions you should be required to give up
               | all your decisions to the state.
        
               | jamesgeck0 wrote:
               | > In the US, the typical citizen commits an average of a
               | felony a day
               | 
               | This is surprising to me. Could you provide examples of
               | such common felonies US citizens commit in ignorance?
        
               | oblio wrote:
               | I think a felony is the more serious one? A misdemeanor
               | being stuff like a parking violation? I'd definitely want
               | to see some examples of felonies, too :-)
        
               | oblio wrote:
               | Laws can always be applied selectively. They have always
               | been applied selectively.
               | 
               | The point of laws is statistics: you can't discourage
               | everything, you need to discourage enough to have order.
               | 
               | And there is a middle ground between no laws and giving
               | everything up.
               | 
               | Also, I give up my liberties and obey laws in exchange
               | for protection from many nasty things people do when
               | there are no laws.
               | 
               | Based on your examples and vocabulary ("nanny state" is a
               | clear giveaway), you're American. Go live for 5-10 years
               | in a country with lax or non-existent laws and law
               | enforcement. We call those countries bad names for a
               | solid reason.
        
               | ClumsyPilot wrote:
               | For your argument to make sense you have to demonstrate
               | that this amounts to a nanny state.
               | 
               | I don't think it does; I think it is fraud. For it to be
               | a consequence of my choices, what choice do I make to
               | select a mobile phone carrier that does not sell my
               | location data? Such a choice does not exist.
               | 
               | You can't 'non-disclose' some 'small feature' of a
               | mortgage contract, of a loan, etc. Personal data deserves
               | similar respect.
               | 
               | Lastly, we can and do have different laws for individuals
               | and multi-billion-dollar corporations - you can't use
               | this as an argument when we are discussing securities
               | fraud and banking regulations.
        
               | IfOnlyYouKnew wrote:
               | > In the US, the typical citizen commits an average of a
               | felony
               | 
               | Sorry, but that's just obvious BS. If it were true, you'd
               | include examples, and far more people a certain world
               | leader doesn't like would be "locked up".
        
           | ethbr0 wrote:
           | The act of breaching privacy is technically difficult to
           | prohibit in a way many of us would find palatable.
           | 
           | What should be targeted is the _product_ of said breaches.
           | Something like the blood diamond approach.
           | 
           | If your company has PII, then you by law must be able to
           | produce a consented attestation chain all the way back to the
           | source.
           | 
           | If you do not, then you're charged a fine for every piece of
           | unattested PII on every individual.
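           | 
           | As a toy sketch of what such a chain could look like (all
           | field names are hypothetical, not a real standard), each
           | holder records whose PII it is, the consented purpose, and
           | which hop it came from, so an auditor can walk back to the
           | data subject:
           | 
           |     from dataclasses import dataclass
           |     from typing import Optional
           | 
           |     @dataclass
           |     class Attestation:
           |         subject: str    # whose PII this is
           |         purpose: str    # what the subject consented to
           |         holder: str     # who holds the data at this hop
           |         received_from: Optional["Attestation"]  # None at source
           | 
           |     def chain_is_complete(att: Attestation) -> bool:
           |         # Auditor's check: the chain must bottom out at
           |         # the data subject themselves.
           |         while att.received_from is not None:
           |             att = att.received_from
           |         return att.holder == att.subject
           | 
           |     source = Attestation("alice", "billing", "alice", None)
           |     broker = Attestation("alice", "billing", "acme", source)
           |     print(chain_is_complete(broker))  # True: attested
           | 
           | Anything a company holds without such a chain would be the
           | fineable "unattested PII".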
        
             | _jal wrote:
             | - Can you demonstrate the provenance of every phone number,
             | email address and other contact mode on your phone? Note
             | people's birthdays? Sure, you only want to target
             | companies. Make sure you choose your acquaintances wisely,
             | I guess, or make sure you record their grant of permission
             | to email them, because people abuse laws like this to
             | harass each other every single day.
             | 
             | - This also punishes possession, not use. If you think
             | about that for a minute, it should become clear both how
             | this doesn't attack the right problem, and how companies
             | would evade it.
             | 
             | - Finally... how are you going to audit Ford or Geico?
             | Honest question. Who pays for the audit of "every piece of
             | unattested PII on every individual"? How often, what is the
             | dispute mechanism, and who administers that? Seriously -
             | this sounds like a job for a new agency combining
             | significant portions of the IRS, the FBI and PwC.
        
               | ethbr0 wrote:
               | > _Sure, you only want to target companies_
               | 
               | Yes. There are many laws (e.g. accounting) that only
               | apply to companies, when it's scale that amplifies harm.
               | 
               | > _possession, not use_
               | 
               | How are they different, in this context? The latter
               | requires the former, and the former is unprofitable
               | without the latter.
               | 
               | > _how are you going to audit Ford or Geico?_
               | 
               | As you note, similarly to how we audit now, albeit
               | hopefully more proactively. If the law requires a third-
               | party PII audit to be signed off, and holds the auditor
               | legally liable for that sign-off... I expect the problem
               | would (mostly) take care of itself.
               | 
               | PII is always going to be a game of edge cases, but we've
               | managed to make it work with PCI and PHI in similarly
               | messy domains.
               | 
               | Right now, companies have the GDPR & CCPA nudging their
               | data architecture. National laws would just further that.
               | I can attest to major companies retooling how they handle
               | and track consumer data just due to the CCPA.
        
             | ClumsyPilot wrote:
             | I think we are both arguing that clear consent must be
             | present, and that the customer must have clearly agreed to
             | whatever you are doing with the data - that appears similar
             | to the GDPR.
             | 
             | However, how do you prove John Doe actually agreed to this?
             | What if John says he did not click the accept button? Do we
             | require digital signatures with certificates, given that
             | most people don't have them or know how to use them?
             | 
             | I think the problem is more tractable for physical products
             | running firmware - there you have real proof of purchase,
             | and, at present, firmware that does whatever it wants.
        
               | IfOnlyYouKnew wrote:
               | This is civil law. You find out by asking employees in
               | court. They aren't going to risk perjury, a criminal
               | offense, to spare their employer.
        
               | ethbr0 wrote:
               | It's analogous to the credit card fraud problem, no? E.g.
               | disputing charges and chargebacks?
               | 
               | I don't work in that space, but my understanding is that
               | the card processors essentially serve as dispute
               | mediators in those instances.
               | 
               | So it would seem unavoidable (although not great) to have
               | some sort of trusted, third-party middle person between
               | collectors and end users, who can handle disputes and
               | vouch for consent.
               | 
               | Blockchain doesn't seem like a solution, given that the
               | problem is precisely in the digital-physical gap. E.g. I
               | have proof of consent (digital) but no way to tie it to a
               | (disputed) act of consent (physical).
        
             | zepto wrote:
             | By the time someone is charged with a crime, the damage is
             | already done.
             | 
             | And, there will be many scammers who are simply out of
             | reach of meaningful legal remedies.
        
               | ben509 wrote:
               | That's true of virtually all criminal laws, though. If
               | you're beat up and your assailant is charged with
               | assault, you've already been beat up.
        
               | zepto wrote:
               | Yes, which is why you need both laws, _and_ prevention
               | methods.
        
           | samatman wrote:
           | I agree with this, to some degree.
           | 
           | But I've also known that macOS verifies signatures for as
           | long as it's been doing it. This was no secret, it was
           | _advertised as a feature_.
           | 
           | I assumed it wasn't being done in plaintext, because who
           | would be so foolish as to code it that way? And I'm still
           | plenty mad about that. Anyone could have checked this at any
           | time, and presumably people did; the only reason it became a
           | story is that the server got really slow and we noticed.
           | 
           | Apple says there will be a fix next year, which... eh, better
           | than nothing, but not even 10% as good as shipping the
           | feature correctly to begin with.
           | 
           | But of the many things about this episode which are worthy of
           | criticism, Apple being deceitful is nowhere among them. Never
           | happened.
        
             | saagarjha wrote:
             | It was mentioned in a developer presentation with very few
             | details as to how it worked. Apple did not go into
             | specifics; the information presented here was mostly
             | reverse-engineered by app developers and security
             | engineers.
        
             | loa_in_ wrote:
             | > This was no secret, it was advertised as a feature.
             | 
             | I wish you could prove this.
        
               | egsmi wrote:
               | https://developer.apple.com/videos/play/wwdc2019/703/ The
               | second half of the talk is about the "hardened runtime"
               | 
               | And in the wider tech press
               | 
               | https://appleinsider.com/articles/19/06/03/apples-macos-
               | cata...
               | 
               | "Mac apps, installer packages, and kernel extensions that
               | are signed with Developer ID must also be notarized by
               | Apple in order to run on macOS Catalina"
               | 
               | Even on hacker news
               | https://news.ycombinator.com/item?id=21179970
        
         | ogre_codes wrote:
         | > A common refrain in arguments that we don't need laws to
         | protect privacy is that the market will take care of it.
         | 
         | Stronger privacy laws hurt Google, Facebook, and Amazon far
         | more than Apple. Most of Apple's privacy gaffes are just
         | bonehead moves like this one, which shouldn't happen but also
         | don't drive revenue.
        
           | tonyedgecombe wrote:
           | I've always thought Apple's focus on privacy (putting aside
           | the current incident) is rather clever as Google have no way
           | to respond. Any improvements to privacy undermine their
           | business model.
        
             | refulgentis wrote:
             | You'd think that's how it would work out, but funnily
             | enough, it's a net negative even when Google actively tries
             | to enhance privacy! Using Apple's privacy-sensitive ad-
             | blocker standard made people protest the limitations of a
             | fixed count of sites, and competitors spun it into a way
             | for Google to advantage itself (the details on that are
             | somewhat murky).
             | 
             | We have a long way to go as an industry on keeping users
             | _well_ informed about their privacy, and I'm afraid Apple
             | has set us back years by morphing it into an advertising
             | one-liner instead of actually helping users along.
        
         | internet_user wrote:
         | Which IETF drafts and patents? Can you point me to some links?
         | 
         | tia
        
           | nullc wrote:
           | Sure, lemme give you an example:
           | 
           | https://tools.ietf.org/html/draft-cavuto-dtcp-00
           | 
           | This protocol was created so that monitoring infrastructure
           | could reprogram asic-based packet filters on collection
           | routers (optical taps feed routers with half-duplex-mode
           | interfaces), which grab sampled netflow plus specific targets
           | selected by downstream analysis in realtime. It has to be
           | extremely fast so that it can race TCP handshakes.
           | 
           | I don't think it's much of an exaggeration to say that the
           | technical components of almost all the mass surveillance
           | infrastructure are described in open sources. Yes, they don't
           | put "THIS IS FOR SPYING ON ALL THE PEOPLE" on it, but they
           | also don't even bother reliably scrubbing sigint terms like
           | "tasking". Sometimes the functionality is described under the
           | color of "lawful intercept", though not always.
           | 
           | One of the arguments that people made against the existence
           | of widescale internet surveillance -- back before it was
           | proved to exist -- was that it would require so much
           | technology that it would be impossible to keep secret: the
           | conspiracy would have to be too big. But it wasn't kept
           | secret, not really -- we just weren't paying attention to the
           | evidence around us.
           | 
           | For a related patent example:
           | https://patents.google.com/patent/US8031715B1 which has
           | fairly explicit language on the applications:
           | 
           | > The techniques are described herein by way of example to
           | dynamic flow capture (DFC) service cards that can monitor and
           | distribute targeted network communications to content
           | destinations under high traffic rates, even core traffic
           | rates of the Internet, including OC-3, OC-12, OC-48, OC-192,
           | and higher rates. Moreover, the techniques described herein
           | allow control sources (such as Internet service providers,
           | customers, or law enforcement agencies) to tap new or current
           | packet flows within an extremely small period of time after
           | specifying flow capture information, e.g., within 50
           | milliseconds, even under high-volume networks.
           | 
           | > Further, the techniques can readily be applied in large
           | networks that may have one or more million of concurrent
           | packet flows, and where control sources may define hundreds
           | of thousands of filter criteria entries in order to target
           | specific communications.
        
         | nerbert wrote:
         | One argument to make is that the market only cares about the
         | majority of its customers, not all of them. If 0.5% of a
         | company's customers have their privacy busted by a backdoor, or
         | 0.01% of Google users have their accounts arbitrarily deleted,
         | that percentage of users is screwed like no one else, and the
         | company doesn't suffer any damage.
        
         | jaredklewis wrote:
         | Yes, unfortunately there is no safe harbor.
         | 
         | Companies can make mistakes or add backdoors and we won't know.
         | See this clusterfuck.
         | 
         | Open source, likewise, can make mistakes and (much more rarely)
         | add backdoors, and we could know, but few have the resources to
         | do so. See Heartbleed.
        
         | simias wrote:
         | > The market can't act against what it can't see. Privacy loss
         | is often irreversible.
         | 
         | You're not wrong, but on the other hand has "the market" shown
         | any serious signal that it cares about privacy? From what I can
         | see people seem more than glad to trade privacy and personal
         | information for free services and cheaper hardware. Take
         | Samsung putting ads on their "smart" TV's UI and screenshotting
         | what people are watching for profiling, that's been known for a
         | while now. The market seems fine with it.
         | 
         | And I mean, at this point I could just gesture broadly at all
         | of Facebook.
        
           | andrepd wrote:
           | >that's been known for a while now
           | 
           | Ask a representative sample and I wager only a very small
           | percentage of people are actually aware of (1) the breaches
           | of privacy that are happening (e.g. your TV sending mic dumps
           | and screenshots of what you're watching), and (2) the hard
           | consequences of those invasions (that is, beyond the
           | immediate fact that you're being snooped upon), like higher
           | auto and health insurance premiums, being targeted for your
           | opinions, etc.
        
             | mehrdadn wrote:
             | To be honest I would expect if you told people "your TV
             | will report what you're watching" they will think "wait,
             | doesn't my cable company _already_ know what I 'm
             | watching???"
             | 
             | If you tell them it's the manufacturer this time in
             | addition to the cable company, I'm not sure how many would
             | freak out over the extra entity.
        
               | moduspol wrote:
               | Very few.
               | 
               | I'm not sure how you'd measure it, but there seems to be
               | a huge disconnect between techie privacy advocates and
               | the rest of the world. The former keeps claiming the
               | latter just doesn't understand, or needs to be informed,
               | but I just don't think that's realistic.
               | 
               | I think it's pretty common knowledge that these companies
               | are harvesting all imaginable data to serve users more /
               | better advertisements and to keep them on the site, yet
               | usage continues to grow despite all the scandals. I think
               | advocates need to make more convincing claims to everyone
               | else that they're being harmed.
        
               | andrepd wrote:
               | In my anecdotal experience, that's not remotely true. Yes
               | there is a disconnect between techies and regular people,
               | in that for us it's obvious that these companies are
               | harvesting all this data, processing it in all these
               | ways, and sharing it with all these people, but for the
               | majority of people it's not. Even for you: do you think
               | you fully understand how your data is being utilised and
               | what the consequences are?
        
               | moduspol wrote:
               | I think we've been repeatedly shown that people don't
               | care. From Cambridge Analytica, to what Snowden shared,
               | to voice-based assistants (like Amazon Alexa), every
               | instance is met with feigned surprise and then a
               | collective shoulder shrug.
               | 
               | > Even for you, do you think you fully understand how
               | your data is being utilised and what the consequences
               | are?
               | 
               | I don't think anyone can say with certainty, but I read
               | these threads so I'm quite aware they're collecting and
               | monetizing every imaginable thing they can. It's
               | difficult to articulate any measurable / real negative
               | consequences to me, personally.
        
           | eternalban wrote:
           | > You're not wrong, but on the other hand has "the market"
           | shown any serious signal that it cares about privacy?
           | 
           | Is there a pro-privacy Google out there whose products
           | languished while Google's succeeded?
           | 
           | The Silicon Valley VC network did not fund or support
           | companies that promoted privacy. I cannot think of a single
           | example.
           | 
           |  _Not a single major venture from the SV VC network even
           | attempted to innovate on the "business model"._ We can make
           | machines reason now but, alas, a business model that does not
           | depend on eradicating privacy is beyond the reach of the
           | geniuses involved. Is it "impossible"? I think "undesirable"
           | is more likely. No one is even seriously trying. Point: the
           | money behind the SV tech giants is not motivated at all to
           | fund the anti-panopticon.
           | 
           | The salient, sobering fact is that all these companies sit on
           | an SV foundation that was and remains solidly "national
           | security", "military", and "intelligence". The euphemism used
           | is to mention SV's "old boy network".
           | 
           | https://steveblank.com/secret-history/
        
           | ardy42 wrote:
           | > You're not wrong, but on the other hand has "the market"
           | shown any serious signal that it cares about privacy? From
           | what I can see people seem more than glad to trade privacy
           | and personal information for free services and cheaper
           | hardware. Take Samsung putting ads on their "smart" TV's UI
           | and screenshotting what people are watching for profiling,
           | that's been known for a while now. The market seems fine with
           | it.
           | 
           | I'd guess that most of that's due to information asymmetry.
           | Privacy losses aren't advertised (and are often hidden) and
           | are more difficult to understand, but price is advertised and
           | understood by everyone.
           | 
           | Take that Samsung example: it's not like they had a bullet
           | point on the feature list saying "Our Smart TVs will let us
           | spy on what you're watching, so we can monetize that
           | information."
        
           | oarsinsync wrote:
           | > has "the market" shown any serious signal that it cares
           | about privacy?
           | 
           | Depends on what you consider a serious signal of care. If
           | 'voting with your wallet' is the measure, increasing levels
           | of income inequality, stagnant wages, weakening employee
           | rights through the gig economy, etc. are effectively taking
           | away that choice, as most market participants cannot _afford_
           | to make it.
           | 
           | Also, what is the paid alternative to Apple or Google photos
           | that allows me to have the same end user experience, without
           | giving up my privacy? "the market" doesn't even have such an
           | offer that I can see. The closest I can find (and that's
           | through here) is photostructure.com, and even that's lacking
           | all the local-ML-foo that makes Apple/Google photos a
           | compelling option over anything else.
           | 
           | > Take Samsung putting ads on their "smart" TV's UI and
           | screenshotting what people are watching for profiling, that's
           | been known for a while now.
           | 
           | Known by whom? I'd wager a year's salary that >50% of Samsung
           | TV owners (and bump that number up to >75% of Samsung TV
           | _users_) do not know this is happening.
        
             | [deleted]
        
             | zepto wrote:
             | Income inequality and stagnant wages etc are not the
             | consequences of Apple and Google.
        
               | [deleted]
        
               | sudosysgen wrote:
               | No one suggested they are. But they still make it much
               | less likely for the average Joe to put money and effort
               | into protecting their privacy, and this is known to those
               | who make major pricing decisions.
        
               | kortilla wrote:
               | We're talking about Apple here. People who can afford a
               | Mac over a cheap PC or Chromebook already have disposable
               | income and are making trade-offs with it.
        
               | sudosysgen wrote:
               | That is not really true. Many people of lower income will
               | buy a Mac after their cheap laptops break, because they
               | perceive Macs, correctly or not, to be the most reliable
               | and thus cheapest in the long run. Sometimes they even
               | buy second hand.
               | 
               | Also, Chromebooks are not viable alternatives for many
               | people, especially those of lower income. Having to rely
               | on being always online is an issue when you can't always
               | guarantee having internet. Been there, done that.
        
               | kortilla wrote:
               | > That is not really true. Many people of lower income
               | will buy a Mac after their cheap laptops break because
               | they perceive them, correctly or not, to be the most
               | reliable and thus less expensive on the long term
               | laptops.
               | 
               | This is not true. The tiny user-base of OSX compared to
               | Windows is already evidence that most people are not
               | buying MacBooks.
        
           | ethbr0 wrote:
           | > _From what I can see people seem more than glad to trade
           | privacy and personal information for free services and
           | cheaper hardware_
           | 
           | I'd argue that's more of what parent was talking about with
           | regards to visibility.
           | 
           | "Privacy" isn't something anyone can see. What you see are
           | the effects of a lack of privacy.
           | 
           | Given the dark market around personal data (albeit less in
           | the EU), how are consumers to attribute effects to specific
           | privacy breaches?
           | 
           | If Apple sells my app history privately to a credit score
           | bureau, and I'm denied a credit card despite having a stellar
           | FICO, how am I supposed to connect those dots?
        
           | tinkertamper wrote:
           | I hear this argument a lot, but I think it is exactly OP's
           | point when he says, "The market can't act against what it
           | can't see".
           | 
           | Your average consumer doesn't know the extent of what they're
           | trading. Take Facebook, even with high profile stories and
           | documentaries it's reasonable for your average consumer to
           | assume that what Facebook tracks about them is what they
           | actively give to Facebook themselves.
           | 
           | I've had conversations with people that say, "I rarely even
           | post on Facebook" and "If they want to monitor pictures of my
           | food/dog/etc whatever who cares", without any solid
           | understanding of what even having the app installed alone is
           | giving Facebook.
        
             | [deleted]
        
             | oblio wrote:
             | It's ok, once they're educated they will care. Just like
             | they care now about not using single-use plastics, buying
             | the biggest and most gas-guzzling SUV, or flying on
             | holidays across the globe.
             | 
             | They won't care even when they know. And they might not
             | ever know.
        
               | gpanders wrote:
               | Yes exactly. I don't think an abstract understanding of
               | the costs is enough. If the cost isn't physically or
               | viscerally felt, it just doesn't factor into people's
               | decision making.
               | 
               | This is where the pricing system really proves its worth.
               | People buy and drive fewer SUVs when gas is more
               | expensive. If we want people to buy fewer SUVs, increase
               | the fuel tax.
               | 
               | Education is not enough, and in fact might not even be
               | necessary at all. Just introduce real costs to capture
               | the "abstract" costs (externalities) and the problem will
               | likely correct itself.
        
         | ardy42 wrote:
         | > A common refrain in arguments that we don't need to reject
         | closed source software to protect privacy is that being closed
         | source doesn't hide the behaviour, and people will still notice
         | backdoors and privacy leaks. Sometimes they do, sometimes they
         | don't.
         | 
         | And there are cases where it's not practical for "people to
         | notice." For instance: a privacy leak that only uses the cell
         | network connection of a phone, which would avoid easily-sniffed
         | connections.
        
           | iso1631 wrote:
           | Even with traffic monitoring, how do you distinguish bad TLS
           | traffic (from a machine you don't control to a Cloudflare IP)
           | from good TLS traffic?
        
         | pibechorro wrote:
         | Ya, no thanks. Top-down regulation will just make startups less
         | likely to enter new disruptive tech. The solution is choice:
         | stop using Apple products, and all their shadiness stops being
         | an issue.
        
           | lawnchair_larry wrote:
           | Nothing shady about this. It isn't logged and it protects
           | customers from malware. That was the purpose. Most customers
           | want that.
        
           | eganist wrote:
           | > Ya, no thanks. Top-down regulation will just make startups
           | less likely to enter new disruptive tech. The solution is
           | choice: stop using Apple products, and all their shadiness
           | stops being an issue.
           | 
           | And since people have demonstrated that they're not
           | appropriately incentivized to stop, that's where the
           | regulations snap in, which brings us back to where we
           | started: there's a need for it.
           | 
           | A solution could be to regulate based on a tightly managed
           | definition of company age: give younger companies a bit more
           | flexibility in determining their business model, with
           | controls slowly snapping in as the company ages. There are
           | some pretty clear loopholes that immediately come to mind
           | (e.g. re-chartering the company every few years and
           | transferring assets) that'll need to somehow be managed, but
           | it should be enough to give companies runway to figure out
           | how to disrupt and monetize while coming into compliance with
           | consumer protections.
        
           | diffeomorphism wrote:
           | > The solution is choice
           | 
           | Correct. Among Apple, MS, and Google you have no choice. That
           | is why regulation is necessary.
        
           | oblio wrote:
           | How do you provide choice? How do you commoditize an entire
           | hardware and software ecosystem, vertically integrated?
           | 
           | Just like Standard Oil or AT&T were displaced, right? By
           | customers going to their competitors? I'm being sarcastic,
           | obviously :-)
        
         | polote wrote:
         | > A common refrain in arguments that we don't need laws to
         | protect privacy is that the market will take care of it
         | 
         | Seriously, who has ever been successful at defending that
         | idea?
        
           | lucideer wrote:
           | Many lobbyists and lawmakers, unfortunately.
        
         | api wrote:
         | > The market can't act against what it can't see. Privacy loss
         | is often irreversible.
         | 
         | I would add that the market can't act to prevent risks that are
         | outside the market and not taken into account by the market.
         | 
         | The big risks from widespread privacy loss are the exploitation
         | of private data by criminals, foreign unconventional warfare by
         | terrorists or hostile states, and the rise of a totalitarian
         | government here in the USA.
         | 
         | Criminal action can to some extent be priced into a market, but
         | the other two really can't be.
        
         | valuearb wrote:
         | Probably no one cares because Apple's OCSP checks don't reduce
         | your privacy.
        
           | grupthink wrote:
           | They should care. The checks are sent unencrypted over HTTP
           | to Apple's OCSP responder.
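           | 
           | As a concrete sketch of the exposure (Python, using the
           | third-party `cryptography` package; the capture file is
           | hypothetical), anyone on the path can decode a request
           | grabbed off the wire:
           | 
           |     from cryptography.x509 import ocsp
           | 
           |     # DER bytes extracted from a packet capture of the
           |     # plain-HTTP OCSP traffic (hypothetical file).
           |     der = open("ocsp-request.der", "rb").read()
           | 
           |     req = ocsp.load_der_ocsp_request(der)
           |     # The issuer hashes and serial number are in the clear,
           |     # which is enough to identify the developer certificate.
           |     print(req.issuer_name_hash.hex())
           |     print(req.issuer_key_hash.hex())
           |     print(req.serial_number)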
        
             | valuearb wrote:
             | They don't identify specific apps you use, so what's your
             | point?
        
               | throwaway525142 wrote:
               | As far as I understand it, most vendors ship a single-
               | digit number of apps. If you start the Tor Browser,
               | everyone on your network will know. If you start Firefox,
               | everyone on your network will know you started a Mozilla
               | product, most likely Firefox. If you start the Zoom
               | client, everyone on your network knows you started the
               | Zoom client.
               | 
               | I don't think the "it's only the vendor" defense of Apple
               | is any good.
        
               | athms wrote:
               | On macOS, developer certificate checks are NOT done for
               | every application launch. Responses are cached for a
               | period of time before a new check is done.
               | 
               | FYI -- both Firefox and Safari use OCSP to check server
               | certificates, so anybody sniffing your network could
               | figure out which websites you visit. Chrome uses its own
               | aggregated CRLSets instead, trading precision for
               | performance.
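               | 
               | A minimal sketch of that caching behaviour (Python;
               | the TTL and names are assumptions, not Apple's
               | actual values):
               | 
               |     import time
               | 
               |     TTL = 300  # assumed 5-minute window
               |     _cache: dict[str, tuple[float, bool]] = {}
               | 
               |     def check_cert(cert_id: str) -> bool:
               |         # Reuse a fresh cached verdict; otherwise
               |         # hit the network (stubbed out here).
               |         now = time.time()
               |         if cert_id in _cache:
               |             ts, ok = _cache[cert_id]
               |             if now - ts < TTL:
               |                 return ok
               |         ok = True  # stand-in for the OCSP round trip
               |         _cache[cert_id] = (now, ok)
               |         return ok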
        
               | pseudalopex wrote:
               | That period of time was 5 minutes.
        
             | athms wrote:
             | HTTP is specified in the RFC, partly because OCSP responses
             | are signed in their own right, and fetching them over HTTPS
             | would just trigger another revocation check for the OCSP
             | server's certificate. Only the developer certificate is
             | checked. OCSP is also used by web browsers to check the
             | revocation status of certificates used for HTTPS
             | connections. Apple leveraged OCSP for its Gatekeeper
             | functionality. This is not the same thing as notarization,
             | which is checked over HTTPS.
             | 
             | https://blog.jacopo.io/en/post/apple-ocsp/
             | 
             | Perhaps you should learn about OCSP before complaining
             | about its use of HTTP.
        
               | samatman wrote:
               | Vendors MAY use TLS, and Apple didn't (though they say
               | they'll start).
               | 
               | You might want to read the RFC, rather than a blog post
               | about it, before making such confident pronouncements.
        
         | amelius wrote:
         | So, we should take away the market's incentive to infringe upon
         | our privacy: by making user-tracking illegal.
        
         | absolutelyrad wrote:
         | The market only acts fairly when the product is a commodity.
         | The time for the market to react to a product with the
         | complexity of a Mac is decades.
         | 
         | As the ecosystem grows, the cost of switching increases, so the
         | market starts acting more and more inefficiently.
         | 
         | This is why countries have state intervention in such cases,
         | and why antitrust law exists.
         | 
         | If the option was a Mac with privacy vs. a Mac without privacy
         | but $10 cheaper, I'd think the market would pick the Mac with
         | privacy. No such choice within the budget exists, and if a
         | company has a monopoly on the ecosystem, then the market cannot
         | react when held hostage. The cost to transition to a different
         | ecosystem is thousands of dollars when you've been using a Mac
         | your entire life. It's not like you want to relearn many things
         | as you get older; call it the sunk cost fallacy, as many would.
         | 
         | It's a bundled deal: you can't pick the parts you like and
         | throw away the ones you don't, like you could in an efficient
         | market.
         | 
         | If even someone like me cannot be bothered to move to more
         | privacy friendly platforms, then I don't have hope for other
         | people. It's just not worth it. Until the majority face the
         | consequences of their apathy, any action you take is fruitless.
         | Let them taste the pain, then offer the solution.
         | 
         | I know I can be tracked. So what? I can't verify the silicon in
         | my device, can I? Gotta trust someone, maybe blockchain will
         | solve the problem of trust among humans, in a verifiable way.
        
           | _puk wrote:
           | Whilst I agree with the sentiment, it does occur to me just
           | how many Kindles I see with ads.
           | 
           | Is there any data released on the ads vs. no-ads versions?
           | 
           | That's the closest comparator I can think of.
        
             | michaelmrose wrote:
             | They sold a lot of the ad-enabled tablets one Black Friday
             | for like $80, which seemed like a good deal at the time,
             | despite them not running Google apps (which most people
             | want), having ads on the lock screen, and having the
             | world's shittiest home screen app for Android. I bought one
             | for my wife. If you are lazy or not inclined, you can
             | actually pay after the fact to remove the ads.
             | 
             | Alternatively, you can deliberately break the ad
             | functionality, install Google apps, and change your home
             | screen, which Android allows.
             | 
             | Steps 1 and 2 worked, but they broke the ability to set
             | your own home screen, so the fix is a hacky app that
             | hijacks the home button to show the home screen of your
             | choosing. This worked for over a year; then they repeatedly
             | blacklisted such apps; then it worked but with a 2-second
             | delay, which is basically horrible, and extensions started
             | crashing the Gmail app.
             | 
             | Worst piece of shit ever.
        
             | oarsinsync wrote:
             | > Whilst I agree with the sentiment, it does occur to me
             | just how many kindles I see with ads.
             | 
             | > Is there any data released on ads Vs no ads versions?
             | 
             | Do they offer a tracking vs no tracking option too? The
             | absence of adverts does not mean the absence of tracking.
        
               | [deleted]
        
               | [deleted]
        
               | bcrosby95 wrote:
               | For the non-tablet Kindle, I'm not sure if the
               | differentiation between tracking and ads makes a lot of
               | sense. You (well, I at least) already use Amazon to buy
               | the books that are on my Kindle. I guess they could track
               | how fast I read them. But that seems trivial compared to
               | already knowing what I'm reading.
        
               | derefr wrote:
               | The tracking is somewhat inherent to the software --
               | syncing what page you've read up to in a book between
               | devices (a feature many people find crucial!) cannot
               | really be divorced from having the raw data to create
               | server-side metrics about people's reading habits.
               | 
               | Even if you E2E-encrypt each user's data for cloud
               | storage and have devices join a P2P keybag, a la
               | iMessage, consider the ad-tech department of your same
               | company as if they were an external adversary for a
               | moment. What would an external adversary do in that
               | situation? Traffic analysis of updates to the cloud-side
               | E2E-encrypted bundle. That alone would _still_ be enough
               | to create a useful advertising profile of the customer's
               | reading habits, since your app is single-purpose -- the
               | only reason for that encrypted bundle to be updated is if
               | the user is flipping pages!
               | 
               | And, together with the fact that your ad-tech department
               | _also_ knows what books the customer is reading (because
               | your device can only be used to read books _you sell_,
               | and thus books you have transaction records for), this
               | department can probably guess what the user is reading
               | anyway, no matter how much your hardware-product
               | department tries to hide it.
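               | 
               | A toy illustration of that traffic-analysis point
               | (synthetic timestamps, nothing vendor-specific): even
               | when every sync blob is opaque, upload timing alone
               | reconstructs reading sessions.
               | 
               |     from datetime import datetime, timedelta
               | 
               |     # When the encrypted bundle was uploaded; the
               |     # observer never sees its contents.
               |     t0 = datetime(2020, 11, 25, 21, 0)
               |     uploads = [t0 + timedelta(seconds=s)
               |                for s in (0, 40, 75, 130, 7200, 7260)]
               | 
               |     # Uploads < 10 minutes apart form one session.
               |     sessions, cur = [], [uploads[0]]
               |     for t in uploads[1:]:
               |         if t - cur[-1] < timedelta(minutes=10):
               |             cur.append(t)
               |         else:
               |             sessions.append(cur)
               |             cur = [t]
               |     sessions.append(cur)
               | 
               |     for s in sessions:
               |         # One sync per page flip, so count ~ pages.
               |         print(f"{s[0]:%H:%M}: ~{len(s)} page flips")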
        
               | simpss wrote:
               | Of course it can be divorced: keep the data client-side,
               | as the Kindle does?
        
               | zrm wrote:
               | > syncing what page you've read up to in a book (a
               | feature many people find crucial!) cannot really be
               | divorced from having the raw data to create server-side
               | metrics about people's reading habits.
               | 
               | Sure it can. You encrypt the data so the client can read
               | it and the server can't. When you add a new device, you
               | e.g. scan a QR code on your old device so that it can add
               | the decryption key to the new device, and the server
               | never knows what it is.
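               | 
               | A minimal sketch of that design (Python, with the
               | third-party `cryptography` package's Fernet as a
               | stand-in cipher): the key lives only on the devices,
               | so the server just relays blobs it cannot read.
               | 
               |     from cryptography.fernet import Fernet
               | 
               |     # Created on the first device, handed to new
               |     # devices out of band (e.g. via the QR code);
               |     # the server never sees it.
               |     key = Fernet.generate_key()
               | 
               |     def encrypt_pos(book: str, page: int) -> bytes:
               |         # The opaque blob that gets uploaded.
               |         msg = f"{book}:{page}".encode()
               |         return Fernet(key).encrypt(msg)
               | 
               |     def decrypt_pos(blob: bytes) -> tuple[str, int]:
               |         book, page = (Fernet(key).decrypt(blob)
               |                       .decode().rsplit(":", 1))
               |         return book, int(page)
               | 
               |     blob = encrypt_pos("moby-dick", 217)  # device A
               |     print(decrypt_pos(blob))              # device B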
        
               | dcow wrote:
               | That scenario is addressed in the next few paragraphs.
        
               | zrm wrote:
               | The next few paragraphs were not originally in the post.
               | And you obviously get less information from knowing the
               | user has updated some encrypted data than having the
               | specific page of the specific book. It is also not
               | inherently necessary for the server to have even that
               | information; it could be sent directly from one device to
               | the other(s) or via an independent third party relay
               | (e.g. Tor).
        
               | catlifeonmars wrote:
               | This. Distributed consensus is easy and cheap to
               | implement when all participants can be trusted (as
               | opposed to the Byzantine problem that, e.g., blockchain
               | tries to solve). It's even easier if you're ok with
               | approximate eventual consistency and a reasonably high
               | probability of convergence. Page state seems like it fits
               | into this category.
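               | 
               | For page state, a last-writer-wins merge between
               | trusted devices is enough; something like this
               | sketch (field names are made up):
               | 
               |     from dataclasses import dataclass
               | 
               |     @dataclass
               |     class PageState:
               |         book: str
               |         page: int
               |         ts: float  # device clock is fine here
               | 
               |     def merge(a: PageState,
               |               b: PageState) -> PageState:
               |         # Last writer wins; every replica that
               |         # sees both states converges on the same
               |         # result. No Byzantine machinery needed.
               |         return a if a.ts >= b.ts else b
               | 
               |     phone = PageState("moby-dick", 217, 200.0)
               |     ereader = PageState("moby-dick", 212, 100.0)
               |     print(merge(phone, ereader).page)  # 217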
        
             | lambda_obrien wrote:
             | I wish I could pay for no ads on everything. If there are
             | only like three ad networks for 99 percent of the internet,
             | and they can track me easily, and they know how much I'm
             | worth to them, can't they just email me every month and
             | say, "if you want a no-Doubleclick ad experience next
             | month, it will cost you $4.65, click here to pay." Not like
             | they aren't already doing the tracking, and this would
             | increase the value to their ad customers since they
             | wouldn't pay for views from people who don't want to see
             | ads.
        
             | paublyrne wrote:
             | > Whilst I agree with the sentiment, it does occur to me
             | just how many kindles I see with ads.
             | 
             | True, although the price difference for the Kindle is about
             | 20%. If the discount on a Macbook Air was similar, I'm sure
             | it would be well subscribed.
        
               | Spooky23 wrote:
               | The discount for what? There are no ads on the Mac.
        
               | paublyrne wrote:
               | > If the option was a mac with privacy vs a mac without
               | privacy but $10 cheaper
               | 
               | OP was comparing the Kindle with ads discount to an
               | imaginary Mac without privacy discount.
        
               | Spooky23 wrote:
               | It sounds like one of the weirder "freedom" arguments.
               | Who is going to pay extra for a Mac without key security
               | features?
        
             | ClumsyPilot wrote:
             | I think it's worthwhile trying to test the hypothesis, but
             | I don't think anyone takes privacy on a book reader with
             | the same passion as they do on a mobile phone.
        
             | buckminster wrote:
             | I bought a Kindle. It didn't have ads. I did a factory
             | reset. Now it does. This is the first time I've heard there
             | was a choice.
        
             | tzs wrote:
             | I've never had a non-eInk Kindle, so maybe it is different
             | for them, but I've had eInk Kindles both with and without
             | ads. My first was without ads. Since you can add the
             | "without ads" option to a "with ads" Kindle later by paying
             | the difference, I bought my second with ads to see how bad
             | it was.
             | 
             | Here are the only differences I've noticed:
             | 
             | * With ads, the sleep screen displays some artwork from a
             | book Amazon is selling [1] and some text suggesting you
             | come to the store and buy books,
             | 
             | * The home screen has an ad at the bottom.
             | 
             | Since when I'm not using the Kindle the sleep screen is
             | hidden behind the cover, and when I am using it I'm
             | somewhere other than the home screen 99.9% of the time, I
             | never saw any reason to add the "without ads" option, and
             | when it was time to replace that Kindle I again went with
             | ads.
             | 
             | I suspect that the vast majority of people that buy the "no
             | ads" option up front do so because when they see "with ads"
             | they are envisioning a web-like experience where the ads
             | are all over the place and intrusive and animated and
             | distracting.
             | 
             | [1] Usually a romance novel for me, even though I've never
             | bought any romance novels from Amazon, nor anything even
             | remotely like a romance novel. In fact, I don't think any
             | book I've bought from them even had any romantic relations
             | between any of its characters.
        
               | heelix wrote:
               | Heh. With ads, this was always one of the ways I could
               | embarrass my Bride. You are reading '50 Shades of Grey'
               | or whatever trashy novel shows up on the cover of her
               | gadget. Always makes her blush.
        
           | wkrsz wrote:
           | I imagine it more as a tragedy-of-the-commons situation. I
           | don't care enough about my privacy right now to sacrifice a
           | lot of convenience. However, if enough consumers rejected
           | products that encroach on privacy, we'd get privacy and keep
           | most of the convenience.
        
           | erikpukinskis wrote:
           | > The market only acts fairly when the product is a
           | commodity.
           | 
           | The market only acts fairly in the window after a product is
           | commoditized and before regulatory capture happens.
           | 
           | And in some cases that window is closed before it opens.
        
           | specialist wrote:
           | _" The market only acts fairly when the product is a
           | commodity."_
           | 
           | Based on context (rest of your comment), did you mean
           | "substitute good" instead of "commodity"?
        
           | 0xWTF wrote:
           | > The time for the market to react for a product with the
           | complexity of a mac is decades.
           | 
           | This makes me think of the 2 slit experiment as applied to
           | basketballs. There is a period of time that decisions need to
           | be properly considered, presumably simple decisions need
           | little time, and complex decisions need more time. There is
           | also a period of time that is required to make a decision,
           | and a level at which the decision is made.
           | 
           | There are lowly software engineers making decisions like
           | this, some of which may not even percolate to daily stand-up,
           | yet will require the Supreme Court to unpack 10 years from
           | now. And no one really knows.
           | 
           | It's like trying to pass a basketball through a slit. Yes,
           | there's a theoretical interference pattern, and you can
           | calculate it, but you can't do the experiment because you
           | can't pass the basketball through a slit that small (in the
           | case of basketballs, if I recall, the slit would have to be
           | angstroms wide, if not smaller).
           | 
           | So in software development, you've got huge uncertainty in
           | the societal implications of some decisions about what
           | information you're going to pass over the network, but you
           | can't even get them all through daily stand-up, let alone to
           | Congress or the Supreme Court. Somewhere along the way, some
           | of them end up on Hacker News with people mis-quoting Eric
           | Hoffer, "Every great security decision starts off as a
           | movement, turns into a business, and ends up as a racket."
        
             | tsimionescu wrote:
             | > It's like trying to pass a basketball through a
             | slit. Yes, there's a theoretical interference pattern,
             | and you can calculate it, but you can't do the
             | experiment because you can't pass the basketball
             | through a slit that small (in the case of basketballs,
             | if I recall, the slit would have to be angstroms wide,
             | if not smaller).
             | 
             | I know this doesn't have too much to do with the core of
             | your post, but I want to mention it nevertheless: QM does
             | not imply that there is an interference pattern for
             | basketballs. There might or might not exist one, but we
             | would need an answer to the question of measurement to
             | predict one way or the other.
        
           | triangleman wrote:
           | >maybe blockchain will solve the problem of trust among
           | humans
           | 
           | Absolutely not.
           | 
           | https://www.schneier.com/blog/archives/2019/02/blockchain_an.
           | ..
        
             | hanniabu wrote:
             | There's so much wrong with this post I'm not even sure
             | where to start. Literally almost every paragraph starts
             | with something untrue. The whole article is written
             | from a false understanding.
        
               | kstrauser wrote:
               | If you're going to claim Schneier is wrong on crypto
               | stuff, you'll want to bring a suitcase of evidence along
               | if you want people to take your claim seriously.
        
               | h_anna_h wrote:
               | Well, he does think that one is unable to establish the
               | integrity and authenticity of a message by using DKIM,
               | so...
               | 
               | He does seem to have a weird cult of personality around
               | him but I can't really understand why. He has been
               | irrelevant for quite a while now.
        
               | alwillis wrote:
               | _If you're going to claim Schneier is wrong on crypto
               | stuff, you'll want to bring a suitcase of evidence
               | along..._
               | 
               | How about the $348 billion that says he's wrong about
               | his take on Bitcoin?
               | 
               | Look, I get it. I respect Schneier's knowledge on
               | encryption but he is wrong about blockchains and about
               | Bitcoin in particular.
               | 
               | But he wouldn't be the first establishment
               | technologist/economist/politician to be wrong about
               | Bitcoin.
               | 
               | As I write this, the market cap of Bitcoin is a
               | little over $348 billion [1]; there's no way it gets
               | to this valuation if its distributed trust model
               | didn't work.
               | 
               | He's making _social_ and _process_ arguments, not
               | technical ones.
               | 
               | [1]: https://bitbo.io
        
               | jonnytran wrote:
               | Can you please explain what those untrue statements are
               | and what his false understanding is?
        
               | colejohnson66 wrote:
               | The "false understanding" is most likely how he starts at
               | the conclusion of "Bitcoin == bad" and works from that
               | instead. But both types of essays are ok. The former is
               | just an "argumentative" or "persuasive" essay, while the
               | latter is akin to a "compare and contrast" one.
        
               | colejohnson66 wrote:
               | I'm actually curious what's wrong about it? I read it
               | from an outsider perspective and it's full of very
               | convincing arguments against "blockchains." You're
               | absolutely correct that he's writing from his
               | understanding, but Schneier's been in the field for
               | decades (longer than many Bitcoin proponents have
               | been _alive_), so I'm more inclined to believe he
               | knows what he's talking about than some other random
               | person on the internet.
        
               | wongarsu wrote:
               | I think the main reason for the dissonance is that
               | Schneier talks about the trust that happens (and maybe
               | has to happen in real-world scenarios) while the bitcoin
               | community likes to talk about the minimum amount of trust
               | necessary.
               | 
               | You don't _have to_ trust the software, you can
               | verify it or implement your own. You don't _have to_
               | trust your internet uplink, the protocol would work
               | over carrier pigeons or with dead drops. You don't
               | _have to_ trust exchanges, just exchange bitcoin for
               | local currency with your neighbor. And even if you
               | use an exchange you shouldn't store money there
               | anyway. The minimum required trust is tiny (basically
               | you yourself), but of course, as Schneier points out,
               | the amount of trust involved in practice isn't nearly
               | as low, and for many people the failure cases are
               | much worse.
        
               | spurgu wrote:
               | This is pretty much how I interpret it as well, with
               | regard to trust.
               | 
               | One detail Schneier misses though is:
               | 
               | > Honestly, cryptocurrencies are useless. They're only
               | used by speculators looking for quick riches, people who
               | don't like government-backed currencies, and criminals
               | who want a black-market way to exchange money.
               | 
               | The second statement contradicts the first. People who
               | don't like/trust government-backed currencies aren't
               | fanatics. The way the big banks handle money is quite
               | reckless and dangerous, as we've seen time and time
               | again. And buying drugs for your own personal use should
               | be legal (IMO) but is not. Cryptocurrencies _are_ useful.
               | Maybe not to most people, or to Bruce Schneier, but that
               | (blanket) statement is false.
        
               | [deleted]
        
               | [deleted]
        
         | Spooky23 wrote:
         | This isn't that. I've been aware of this for some time, pretty
         | sure it was in the security white paper and talked about as a
         | feature.
         | 
         | People forget about CRLs because browsers mostly ignore them.
         | 
         | People just go crazy for any Apple story because it attracts
         | attention. People have been paying to send all sorts of app
         | launch analytics to AV companies since the '90s, for
         | example.
        
         | blub wrote:
         | The laws are already there. If you care about this and have
         | some free time you might try filing a complaint with the
         | Irish Data Protection Commission.
         | 
         | I'm not sure if it's infringing though. If Apple says that they
         | do not collect personal data and that the information is thrown
         | away _and_ this is done for a legitimate business purpose or
         | for the customers (i.e. protecting customers), it may well be
         | fine according to the GDPR.
        
       | keyle wrote:
       | I know this because if my internet drops out - the router is
       | still up, but packets are going nowhere - opening an app,
       | especially one of mine, takes north of 10 seconds, which is
       | the time for this thing to give up on talking to Apple.
        
       | rednum wrote:
       | Another fun fact about this system: something changed in how
       | binaries are evaluated, and one of the VST plugins I
       | downloaded months ago was marked as malware. The plugin is
       | quite popular in the community, so I think it's unlikely it
       | contains actual malicious code (in fact, I've contacted the
       | developer and he said he has made some fixes for Apple's
       | security policies recently). Imagine my shock when I opened
       | an old project in Ableton and suddenly some sounds just
       | didn't work. This really sucks; I don't want to worry about
       | whether my music will work five or ten years from now (I can
       | imagine wanting to remix some old piece). I suppose I can err
       | on the side of safety and export all tracks to wav.
       | 
       | However, it's not an isolated problem. It feels like every
       | other week something happens that undermines my confidence in
       | the MacBook as a good device for making music.
        
         | skuthus wrote:
         | >I don't want to worry about whether my music will work five or
         | ten years from now
         | 
         | This is exactly what Apple has already done to the iTunes
         | world: music you had a decade ago is suddenly inaccessible.
        
           | lawnchair_larry wrote:
           | What is this referring to? Mine seems to work fine.
        
           | acdha wrote:
           | Do you have an example of this?
        
           | tpush wrote:
           | > [...] music you had a decade ago is suddenly inaccessible
           | 
            | Music bought via iTunes hasn't had any DRM since 2009.
        
             | santraginean wrote:
             | But even without DRM, if you go with the default settings
             | and don't download your music locally, you lose access to
             | it if they decide to remove it from their catalog. Same
             | goes for movies/TV shows; I've had both disappear from my
             | iTunes library at various points.
        
         | pacifika wrote:
         | It's been good practice for a long time to "freeze" or "render"
         | the tracks out after the song is finished so that the song can
         | be loaded without the plugins.
        
           | wintermutestwin wrote:
           | True, but this shouldn't be necessary in response to anti-
           | consumer behavior.
        
             | dkonofalski wrote:
             | This is not anti-consumer behavior. Consumers are, overall,
             | protected when they can verify the source of an application
             | or extension on their computers. Their freedom may be
             | limited but it's not a black-and-white "this is anti-
             | consumer".
        
               | cma wrote:
                | Some signatures are invalidated due to business
                | disputes on entirely different platforms (in the
                | Epic dispute on iOS, signatures on OS X were
                | invalidated, or threatened to be before a court
                | order prevented it, for no security reason).
        
               | dkonofalski wrote:
               | Epic violated the Terms of Use for their developer
               | agreement which applies to _all_ platforms. They knew
                | that and they violated it willingly. The court order
                | only prevented it temporarily, to reduce the damages
                | that might be incurred, until a determination was
                | made in the initial case.
               | 
               | That is not anti-consumer.
        
               | cma wrote:
               | Apple promised it would only be used for security related
               | stuff on desktop. I wouldn't want my desktop audio
               | project to break because the VSTs were unsigned due to an
               | iOS app business dispute. That's anti-consumer.
        
               | dkonofalski wrote:
               | I agree that it's anti-consumer. I disagree that it's
               | Apple that's being anti-consumer. The company developing
               | the app would be anti-consumer for knowingly violating
               | the Terms of Use to try and pull a PR stunt.
        
               | exoji2e wrote:
               | Well even if Epic are the bad guys by violating the ToU
               | willingly, it still impacts the user. As a user I don't
               | want my apps (which I depend on) to stop working, because
               | of a business disagreement.
               | 
               | Revoking signatures and disabling the apps on user
               | devices to protect your business model is definitely
               | anti-consumer in my book.
               | 
               | You could easily see Apple revoking signatures because of
               | DMCA claims. Even faulty ones, like the claim RIAA made
               | against youtube-dl on GitHub.
        
               | dkonofalski wrote:
               | Of course it impacts the user... And if Epic was found
               | doing something illegal and was shut down or bankrupted,
               | that would also impact the user. Your over-simplification
               | that it's a "business disagreement" is disingenuous and
               | incomplete. The signature revocation system you're
               | claiming is simply to "protect their business model" is
               | the same system that allows Apple to immediately shut
               | down any malware that makes its way into the App Store
               | inadvertently. It's the same system that's been used in
               | the past to _protect_ users from private key leaks.
               | 
               | The only anti-consumer behavior in your situation came
               | from Epic who knowingly violated the rules as a PR stunt.
        
         | blub wrote:
         | I assume you upgraded the OS, in which case it's annoying
         | but not unusual that plugins stop working.
         | 
         | A machine that's used for making professional music should not
         | be upgraded or connected to the internet. If it's for a
         | hobby... I think we will have to live with the compromise if we
         | want to have the latest security fixes and connect to the
         | internet.
        
       | [deleted]
        
       | jrochkind1 wrote:
       | Honest question, I'm not an expert: the initial comments in
       | this thread are painting it as a severe privacy violation.
       | (The actual OP article author does not necessarily share this
       | perspective.) How is what is being done with OCSP different
       | in a more concerning way for privacy (if it is) from Firefox
       | or Chrome's use of OCSP?
        
         | pfortuny wrote:
         | Because when you browse the Internet you know you are
         | browsing it. When you run software, you do not expect
         | "unexpected" Internet use.
         | 
         | When you go for a walk, you carry an umbrella or dress for
         | the weather. When you are at home, you do not expect it to
         | "rain" or for someone to "watch you".
        
           | cdubzzz wrote:
           | That seems incredibly naive. I can't think of a single
           | program off the top of my head that doesn't use the Internet
           | to some extent while running. Even many CLI tools I use for
           | development do update checks (and sometimes analytics) in the
           | background.
        
             | danShumway wrote:
             | > I can't think of a single program off the top of my head
             | that doesn't use the Internet to some extent while running.
             | 
             | What OS are you using? Purely off the top of my head, Linux
             | programs I don't expect to be regularly connecting to the
             | overall Internet in the background (at least unless I set
             | an explicit opt-in setting or use a user-initiated action):
             | 
             | - Keepassxc
             | 
             | - Krita
             | 
             | - Blender
             | 
             | - Nearly all of my CLI tools
             | 
             | - digiKam
             | 
             | - VLC
             | 
             | - Audacity
             | 
             | - etc...
             | 
             | Programs I have that do check for updates or otherwise send
             | activity to the Internet on boot:
             | 
             | - Firefox
             | 
             | - Calibre
             | 
             | - Emacs (Spacemacs)
             | 
             | - (Probably) something somewhere else that I'm forgetting?
             | 
             | Admittedly, I'm kind of cheating by using Linux instead of
             | Windows/Mac. Those are OSes that don't have good update
             | managers, so many apps manage updates themselves.
             | 
             | But even with those apps, there's a difference between an
             | app itself initiating an Internet connection
             | opportunistically in the background, and an app initiating
             | an Internet connection that actually interrupts the app
              | from launching. Very few of my native apps (_side-eyes
              | Emacs irritably_) will stop working or freeze on boot if
             | their update server is down or responding slowly.
             | 
             | Remember that the reason people found out about this in the
             | first place is that they went to start launching apps on
             | their Macs and found out none of them would start. That's
             | the kind of behavior I would expect from a web browser if
             | some global server was getting hammered (even if I would
             | still be irritated to see it), but that I basically never
             | expect for a native app.
        
               | cdubzzz wrote:
               | > Admittedly, I'm kind of cheating by using Linux instead
               | of Windows/Mac.
               | 
               | I don't think that is cheating. I was primarily thinking
               | of my own work development environment in macOS but
               | comparing that to Linux is perfectly valid. Again though,
               | I didn't mean to say, "I bet you can't name a single
               | program that doesn't use the Internet!" I just meant to
               | point out that programs using the Internet are probably a
               | fairly considerable majority for most regular users.
               | 
               | I use KeepassXC on macOS and it definitely does check for
               | updates. Does it not do that on Linux?
        
               | danShumway wrote:
               | Almost none of my programs on Linux do that by default,
               | because they're handled by my package manager.
               | 
               | Programs like Spacemacs (updating ELPA repos on boot,
               | which I actually kind of think is a mistake) and Calibre
               | (just kind of doing its own thing) are the exception to
               | that rule, but they're pretty rare in my personal
               | experience. Even Firefox doesn't update itself on my
               | Linux box.
               | 
               | That's kind of why I was thinking of Linux as cheating on
                | some level. Windows/Mac programs basically _can't_ do
               | the same thing, since they don't have the same
               | infrastructure.
               | 
               | > I just meant to point out that programs using the
               | Internet are probably a fairly considerable majority for
               | most regular users.
               | 
               | I would push back a tiny bit on this -- I don't think
               | regular users would be surprised by a native program
               | contacting the Internet, but I do think they would be
                | surprised if that REST request failing meant that the
               | program couldn't launch.
        
               | cdubzzz wrote:
                | > I do think they would be surprised if that REST
                | request failing meant that the program couldn't
                | launch.
               | 
               | I fully agree. The only point I was aiming to make with
               | my original comment was that the mere act of a program
               | connecting to the Internet "unexpectedly" is by no means
               | abnormal.
               | 
               | > That's kind of why I was thinking of Linux as cheating
               | on some level. Windows/Mac programs basically can't do
               | the same thing, since they don't have the same
               | infrastructure.
               | 
               | MacOS, at least, is definitely headed in that direction
               | with its App Store and the move to Apple silicon.
        
             | pfortuny wrote:
              | Your choice, not mine.
              | 
              | Like: Houdini, Emacs, LaTeX, a file manager, Audacity, a
              | terminal... Off the top of my head, all used almost
              | daily.
        
               | cdubzzz wrote:
               | I didn't mean to imply that one could not name programs
               | that don't communicate. Of course they exist. I'm not
               | arguing for this type of behavior, just pointing out that
               | it is pretty much the norm. On macOS I use Little Snitch
                | to find and shut down most of this extra traffic.
        
               | gspr wrote:
               | > I can't think of a single program off the top of my
               | head that doesn't use the Internet to some extent while
               | running
               | 
               | > I didn't mean to imply that one could not name programs
               | that don't communicate.
               | 
               | Can you understand how people would read your first
               | comment that way, though?
        
               | cdubzzz wrote:
               | Yes, definitely! But in my defense, it is simply not what
               | I tapped out (:
               | 
               | I was trying to think of a regular program I use on my
               | macOS (work) laptop that doesn't connect to the Internet
               | "unexpectedly" and I came up blank.
        
             | UweSchmidt wrote:
              | All of that is of course bad. Ask around whether people
              | know their CLI tools are phoning home; most aren't
              | even aware. Let me control whether I want to update
              | something. Let me control what information goes out,
              | and when.
              | 
              | There is just no way to defend an underhanded tactic
              | that you didn't know about. If it was so necessary and
              | so good and so pure, why did it have to be revealed
              | like this?
        
             | mrlala wrote:
             | >I can't think of a single program off the top of my head
             | that doesn't use the Internet to some extent while running.
             | 
              | Ok, but will those programs magically break if there is
              | no internet? I would argue nearly all applications work
              | offline just fine. It's the exception that a program
              | REQUIRES an internet connection.
        
               | cdubzzz wrote:
               | Sure. I totally agree. That statement was in response to
               | this from OP though:
               | 
               | > When you run software, you do not expect
               | "unexpected" Internet use.
               | 
               | When I run software, I _do_ expect unanticipated Internet
               | use. That's why I use and love Little Snitch on macOS.
        
             | krzyk wrote:
             | > That seems incredibly naive. I can't think of a single
             | program off the top of my head that doesn't use the
             | Internet to some extent while running.
             | 
             | It is not.
             | 
              | Imagine that I don't have great wifi coverage in all
              | places around the house, but I take my laptop there.
              | 
              | You know what happens when you have a poor wifi
              | connection and you wake up the laptop? Even the
              | keyboard+mouse are unresponsive. I was wondering why on
              | earth my keyboard stopped working on a brand new
              | laptop.
              | 
              | Everything suddenly started to work when I moved to a
              | different room.
              | 
              | I have no issue with notarization if and only if it
              | behaves normally when the internet connection is spotty
              | - it is a laptop for god's sake, not a desktop with
              | ethernet.
        
               | jrochkind1 wrote:
                | The macOS OCSP feature has been documented to be non-
                | functional when there is no internet connection; it
                | does not interrupt or slow down anything in that case.
        
               | cdubzzz wrote:
               | Slow Internet and no Internet are different things
               | though. I have experienced this issue as well --
               | sometimes after boot my regular apps will just bounce and
               | bounce (in the macOS dock) and never start. Then when I
               | plug in to ethernet and shut off my wifi everything all
               | of a sudden fires up and starts working.
        
               | jrochkind1 wrote:
               | If there isn't a reasonable timeout set, that does sound
                | like a bug. More than 2 seconds sounds pretty
                | unreasonable to me (possibly it should be even less)
                | for a service that is willing to no-op and give up
                | when there is no network. Someone would have to do
                | some reverse engineering/debugging, maybe by
                | observing/manipulating network traffic, to be sure
                | what is going on there, unless Apple wants to tell
                | us - but I suspect the suspicious wouldn't believe
                | them.
               | 
                | A missing or too-high timeout should be fixed, but I
                | don't think that'd be enough to satisfy critics in
                | this thread. Would it satisfy you?
               | 
               | [Not setting a timeout on a network request is a common
               | bug in, say, web development. It does make me lose some
               | confidence in Apple's technical abilities if they make
               | that bug in a place with such high consequences. But
               | that's different than ill-intent or a privacy violation]
               | 
               | People seem to object to the basic idea of OCSP, which I
               | think means objecting to the basic idea of app signing.
               | 
                | App signing seems reasonable to me (although it is
                | important to me that there be a way for users to
                | choose to launch unsigned apps; there still is in
                | macOS). And OCSP seems like an important part of an
                | app signing implementation. Improvements to the
                | particular OCSP implementation for both privacy and
                | performance may be advisable, though.
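                | 
                | To make that concrete, here's a minimal sketch of
                | that fail-open pattern with a hard deadline (the
                | responder URL and the 2-second timeout are
                | illustrative assumptions, not Apple's actual values):
                | 
                |   import urllib.request
                | 
                |   def check_revocation(request_der, timeout_s=2.0):
                |       # Fail open: any network problem returns None
                |       # and the launch proceeds as if offline.
                |       req = urllib.request.Request(
                |           "http://ocsp.example.com",  # hypothetical
                |           data=request_der,
                |           headers={"Content-Type":
                |                    "application/ocsp-request"},
                |       )
                |       try:
                |           with urllib.request.urlopen(
                |                   req, timeout=timeout_s) as resp:
                |               return resp.read()
                |       except OSError:
                |           return None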
        
               | MiddleEndian wrote:
               | >People seem to object to the basic idea of OCSP, which I
               | think means objecting to the basic idea of app signing.
               | 
               | I am. It's one of the reasons I ditched OS X when 10.7
               | came out despite using Mac OS since 7.6. It's nobody
               | else's business what I run on my machine.
        
               | krzyk wrote:
               | > Missing or too-high timeout should be fixed, but I
               | don't think that'd be enough to to satisfy critics in
               | this thread? Would it you?
               | 
               | A fix in /etc/hosts is all I needed, but if there was a
               | timeout of 2 seconds I wouldn't even notice the problem
               | -> so I wouldn't block notarization.
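                | 
                | For reference, the fix in question is the widely
                | reported hosts-file workaround, which blocks trustd's
                | revocation checks entirely (with the security
                | trade-offs that implies):
                | 
                |   # /etc/hosts
                |   0.0.0.0 ocsp.apple.com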
        
               | krzyk wrote:
               | Just like the sibling poster replied, it is not the issue
               | of "no Internet".
               | 
                | You have three cases here:
                | 
                | 1. Working internet connection
                | 
                | 2. Slow internet connection
                | 
                | 3. No internet connection
                | 
                | Everything works fine in 1 and 3, but the OS breaks
                | in 2 (and it is very common; just walk further from
                | your router and you'll notice it, e.g. when waking
                | the Mac from sleep).
               | 
                | What is strange to me is that I had an old MacBook
                | Air with the latest pre-Big Sur macOS, bought a new
                | MacBook Air (early 2020) with the same OS, and I saw
                | the problem only on the new one - at first I thought
                | it was broken.
        
               | jrochkind1 wrote:
               | That does sound like a bug, and if that's what's going on
               | it makes me question Apple's technical excellence if they
               | forget to put a reasonable timeout on a network call in
               | such a high-impact place. This bug is not really about
               | privacy though.
        
               | cdubzzz wrote:
               | I absolutely agree with that. And I have experienced a
               | similar issue with my laptop from time to time. It is
               | pretty dumb and seems like poor implementation. I don't
                | think that is relevant to OP's argument though.
        
         | sneak wrote:
         | The browsers are moving away from OCSP and using other
         | protocols specifically because it's terrible for privacy.
         | Also, as someone else pointed out, there's the whole
         | online/offline thing: it is a violation of the principle of
         | least surprise for launching a local app to make a network
         | request.
        
         | m463 wrote:
         | [X] Query OCSP responder servers to confirm the current
         | validity of certificates
         | 
          | You can uncheck this box in Firefox.
          | 
          | You cannot uncheck anything in macOS.
          | 
          | Arguably Firefox won't let you opt out of automatic
          | updates and a bunch of other annoying stuff, but Apple is
          | significantly worse.
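          | 
          | For what it's worth, that checkbox corresponds to the
          | security.OCSP.enabled pref, so the same switch can be set
          | from a user.js file:
          | 
          |   // 0 disables OCSP fetching (the default is 1)
          |   user_pref("security.OCSP.enabled", 0);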
        
           | athms wrote:
            | > Arguably Firefox won't let you opt out of automatic updates
           | 
            | Yes it will, but you need to create a policy and be
            | using version 60 or later, which includes the Enterprise
            | Policy Engine.
           | 
           | https://support.mozilla.org/en-US/products/firefox-
           | enterpris...
           | 
           | The Enterprise Policy Generator add-on will help create the
           | policy file.
        
       | jgilias wrote:
       | It seems that many tend to overlook the main issue with this.
       | 
       | Because of this feature, there are people on this lovely planet
       | of ours who may be in actual physical danger _at this very
       | moment_.
        
         | valuearb wrote:
         | Citation needed.
        
           | jgilias wrote:
           | Citation to what?
           | 
           | The fact that if you're a journalist in Belarus with a
           | Macbook right now, the kind of apps you open can point you
           | out to authorities controlling the local internet
           | infrastructure in no time?
           | 
            | Or do you expect repressive governments to put out press
            | releases explaining in detail how they came to round up
            | someone?
        
             | valuearb wrote:
             | OCSP doesn't send app signatures.
        
               | jgilias wrote:
                | Developer certificates are enough to figure out that
                | you're opening, say, Tor or Telegram.
               | 
               | See this thread for people discussing this:
               | https://news.ycombinator.com/item?id=25095438
               | 
               | When they implement a toggle for switching this off,
               | along with data encryption, then this particular concern
               | becomes a non-issue.
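                | 
                | For anyone wondering what such a check actually
                | carries: an OCSP request identifies the certificate
                | being validated (issuer hashes plus serial number),
                | not the binary itself - which is exactly why the
                | developer certificate is the identifying signal. A
                | minimal sketch using the Python cryptography package
                | (the file names are placeholders):
                | 
                |   from cryptography import x509
                |   from cryptography.hazmat.primitives import hashes
                |   from cryptography.x509 import ocsp
                | 
                |   with open("developer_cert.pem", "rb") as f:
                |       cert = x509.load_pem_x509_certificate(f.read())
                |   with open("issuer_ca.pem", "rb") as f:
                |       issuer = x509.load_pem_x509_certificate(f.read())
                | 
                |   # The request body holds only these three fields:
                |   req = (ocsp.OCSPRequestBuilder()
                |          .add_certificate(cert, issuer, hashes.SHA1())
                |          .build())
                |   print(req.serial_number)
                |   print(req.issuer_name_hash, req.issuer_key_hash)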
        
       | matthewmacleod wrote:
       | It kind of feels like there's a bit too much noise around this
       | topic.
       | 
       | I'm getting the same feeling I did years ago when it was
       | discovered that the iPhone had a historical database of all the
       | locations you'd been to. There were rather a lot of articles
       | about how Apple were "tracking you everywhere you went" and so
       | on.
       | 
       | The reason it's similar - they are both dumb, technically bad,
       | and privacy-compromising decisions, and in both cases much of the
       | public discussion about it has been a little hysterical and off-
       | base.
       | 
       | Apple should 100% be criticised for this particular failure. It's
       | _obviously_ a bad implementation from a technical and usability
        | point of view; the privacy implications are bad, and this
        | feature should not have been able to make it out as-is.
       | 
       | But I've legitimately seen people describe this as "Apple's
       | telemetry" which is just _obvious nonsense_ and distracts from
       | the actual problem - how did such a bad implementation of a
       | useful feature end up in a major commercial product, and how are
       | they going to make sure it doesn't happen again?
        
         | dblohm7 wrote:
         | I'm with you on this. If Apple genuinely views privacy as a
         | human right (and FWIW I believe that the right people at Apple
         | do), then they need to learn about privacy by design.
        
         | lawnchair_larry wrote:
         | This is a basic by-the-RFC implementation. The developer who
         | was assigned this just used existing libraries and followed the
         | protocol. This was a rational move on their part,
         | especially since mucking with X.509 has historically been
         | fraught with vulnerabilities.
         | 
         | OCSP has since been improved to increase privacy and security,
         | but the extensions to enable that only considered OCSP in the
         | context of TLS.
        
           | dfox wrote:
           | Just to correct a slightly incorrect perception: there is
           | nothing inherently insecure or vulnerable about
           | X.500/ASN.1/BER/DER parsing; in fact, it is probably a
           | saner format to parse than JSON. The perception that it
           | is somehow fraught with parser vulnerabilities comes from
           | various implementations that tried to build a BER/DER
           | parser by transforming something more or less equivalent
           | to the ASN.1 grammar into actual parser code by means of
           | C preprocessor macros, which is an obviously wrong
           | approach to the problem, at least in a security context.
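           | 
           | As an illustration, DER parsing with a modern library is
           | entirely routine (a minimal sketch using the Python
           | cryptography package; cert.der is a placeholder path):
           | 
           |   from cryptography import x509
           | 
           |   with open("cert.der", "rb") as f:
           |       cert = x509.load_der_x509_certificate(f.read())
           | 
           |   print(cert.subject.rfc4514_string())
           |   print(cert.serial_number)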
        
         | zepto wrote:
         | It's actually not at all obvious how a local list of
         | locations used to power suggestions in Maps or Siri is in
         | _any way_ a compromise of privacy or technically bad.
         | 
         | The only thing that made it sound bad were people saying things
         | like "Apple stores your location history", knowing that it
         | would create the false impression that Apple was uploading
         | location data to their servers.
         | 
         | This situation is similar in that there are people posting
         | misleading innuendo about Apple having some hidden agenda,
         | but the difference is that there _do seem to be real
         | design problems with the mechanism this time_.
        
       | raxxorrax wrote:
       | How does Windows check executables? I hope they don't do the
       | same. Does it come with a master list of public keys from
       | manufacturers to check the signature against?
       | 
       | How does that work for new vendors?
        
         | viro wrote:
         | ...Do people not know AVs randomly upload files to the
         | cloud to be analyzed?
        
         | satysin wrote:
         | Microsoft SmartScreen is very similar to Apple's approach,
         | although I believe you can still disable SmartScreen on
         | Windows, which, afaik, you cannot do on macOS without
         | resorting to "hacks" such as editing the hosts file to
         | loopback the Apple OCSP server.
         | 
         | https://en.wikipedia.org/wiki/Microsoft_SmartScreen
        
           | minot wrote:
           | Good thing too because it is easy to screw up signing.
           | 
           | For example, even the dotnet team has trouble with it
           | https://github.com/dotnet/core/issues/5202
        
             | ameyab wrote:
             | Exactly. The policy applies to everyone. Unsigned new
             | binaries (executables) on the internet are not to be
             | trusted by the general public.
        
           | raxxorrax wrote:
           | Unfortunately Microsoft is so predictable that it is safe
           | to say they are already thinking about removing the
           | option to disable it. Another reason not to use an MS
           | account. I think this is a plain data leak, and arguably
           | worse than what a trojan can do to your system in the
           | context of modern banking security measures. Also, I
           | doubt most users are aware that MS gets info on every app
           | they run.
        
         | layoutIfNeeded wrote:
         | >Does it come with a master list of public keys from
         | manufacturers to check the signature against?
         | 
         | That won't handle revocations.
        
           | beervirus wrote:
           | It will if you update it frequently.
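           | 
           | That is essentially the CRL model: ship a revocation
           | list, refresh it on a schedule, and check serials
           | locally, with no per-launch network query. A rough sketch
           | (the local CRL path is a placeholder):
           | 
           |   from cryptography import x509
           | 
           |   with open("issuer.crl", "rb") as f:
           |       crl = x509.load_der_x509_crl(f.read())
           | 
           |   def is_revoked(serial):
           |       entry = crl.get_revoked_certificate_by_serial_number(
           |           serial)
           |       return entry is not None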
        
         | [deleted]
        
       | DavideNL wrote:
       | > "Those who consider that Apple's current online certificate
       | checks are unnecessary, invasive or controlling should
       | familiarise themselves with how they have come about, and their
       | importance to macOS security. They should also explain how,
       | having enjoyed their benefits for a couple of years, they've
       | suddenly decided they were such a bad idea after all, and what
       | should replace them."
       | 
       | A simple opt-out toggle, for privacy reasons, would be a
       | good start... people should stay in control of their own
       | data and be able to choose for themselves whether or not
       | they are willing to trade their privacy (for security, in
       | this case).
        
         | dilap wrote:
         | Apple has said they plan to do this, and also encrypt the
         | checking payload. Sounds good to me, though definitely a
         | privacy failure that they didn't do this in the first place.
         | 
         | The other thing I'd like to see is the app open immediately, w/
         | the check happening asynchronously in the background. (This
         | seems like super-basic good engineering to me.) No idea if
         | they're planning to fix that or not.
        
           | ascagnel_ wrote:
           | > The other thing I'd like to see is the app open
           | immediately, w/ the check happening asynchronously in the
           | background. (This seems like super-basic good engineering to
           | me.) No idea if they're planning to fix that or not.
           | 
           | How would this work? The point of the check is to block
           | malware from running, and opening without the check would, by
           | definition, negate the entire system. If malware authors get
           | wise to the async scheme, they can write programs that
           | deliver their payload in the opening milliseconds of an app's
           | execution, while the network call is running (even the
           | fastest pings would leave 1 or 2 ms worth of window).
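           | 
           | In rough Python, the race looks like this
           | (signature_still_valid is a hypothetical stand-in for the
           | real OCSP round-trip):
           | 
           |   import subprocess, threading
           | 
           |   def signature_still_valid(path):
           |       # hypothetical stand-in for a slow network check
           |       return True
           | 
           |   def launch_with_async_check(path):
           |       proc = subprocess.Popen([path])  # runs immediately
           |       def verdict():
           |           if not signature_still_valid(path):
           |               proc.kill()  # killed only after the check
           |       threading.Thread(target=verdict,
           |                        daemon=True).start()
           |       # a malicious payload has the whole round-trip
           |       # window in which to run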
        
         | originalvichy wrote:
         | This is turning out to be a case similar to the iPhone
         | battery degradation performance throttling issue. Instead
         | of clearly communicating what they were doing to your
         | phone, they did things behind the scenes because they knew
         | better, and decided not to give the user the choice to run
         | the phone at full performance.
        
           | Bud wrote:
           | You clearly need to familiarize yourself with the actual
           | facts of this case. Also, I'm pretty sure that for 99% of
           | users, "full performance" is not characterized by "randomly
           | crashing due to lack of intelligent power management".
        
           | zepto wrote:
           | The throttled performance _is_ full performance. Without
           | throttling the phone would simply crash.
        
         | darksaints wrote:
         | But that would defeat the purpose. Apple can be thought of
         | as the equivalent of the NSA: they "care" about your
         | privacy in the sense that they don't want anybody but
         | themselves to have access to your data.
         | 
         | Unfortunately we don't have an Apple competitor that cares
         | enough about your privacy to not want anybody, _including
         | themselves_, to have access to it.
        
           | zepto wrote:
           | And yet, this claim about Apple's intent isn't made with a
           | shred of evidence.
        
             | darksaints wrote:
             | Oh, you mean the article up at the top of the fucking page?
             | Yeah, I didn't think it was necessary.
        
               | zepto wrote:
               | That article contains no evidence that Apple has a hidden
               | agenda.
        
               | darksaints wrote:
               | I'm sure it probably doesn't to an Apple Cultist with
               | their head in the sand. But for those people, gesturing
               | broadly at all of the evidence available doesn't seem to
               | work either.
        
               | zepto wrote:
               | Gesturing broadly doesn't work because _it's not
               | evidence_. It is only innuendo.
               | 
               | If you had evidence you'd be able to be specific.
        
               | darksaints wrote:
               | There are mountains of evidence available in all corners
               | of the internet. You should know because as one of
               | Apple's elite True Believers, the vast majority of your
               | HN activity is trying vigorously to handwave away any and
               | all evidence of Apple's malfeasance.
               | 
               | The real problem is that for someone like you, no amount
               | of evidence is sufficient. Someone could post a video of
               | Steve Jobs masturbating furiously to Kim Kardashian's
               | iCloud photos folder, and you'd still find some way to
               | spin it as him having an erotic fetish for Apple's data
               | security and privacy practices.
        
               | hu3 wrote:
               | "Apple dropped plan for encrypting backups after FBI
               | complained" doesn't sound privacy oriented to me.
               | 
               | https://www.reuters.com/article/us-apple-fbi-icloud-
               | exclusiv...
        
       | nickpp wrote:
       | I sometimes wonder if the mods won't end up banning "political"
       | talk on HN. Because these days everything becomes political, even
       | if it really is a technical issue.
       | 
       | Case in point: the online signature check was a technical
       | decision, to fight malware. It was implemented similarly by
       | other OS vendors (Microsoft) and it's been this way for years.
       | 
       | Now we discover that it has the unfortunate side-effect that it
       | lessens privacy. Apple (and probably other OS vendors) are
       | working to improve that in the future. Also, a technical issue.
       | 
       | It was never about privacy. It was never a political issue. Can
       | we please just discuss it from a technological standpoint?
        
         | solinent wrote:
         | It's possible that the technical discussion about online
         | signature checking was subsumed by the political discussion--
         | the two are interrelated and if we don't get into these
         | discussions here I don't really see another place for them to
         | happen.
         | 
         | It's easy to shrug it off as simply a technical issue, and it's
         | very convenient for the PR department as well.
        
         | submeta wrote:
         | When a giant like Apple makes a decision to disallow
         | installing apps that were not downloaded from the macOS App
         | Store (and by preventing me from opening the app, what
         | Apple does is effectively disallowing the app; my retired
         | father does not know how to circumvent this), then this is
         | a political decision that affects the lives of thousands of
         | Mac users.
         | 
         | Computers have become a gateway to the digital world, which
         | makes up a large part of our lives. And Apple is a huge player
         | that has the power to shape the future of computing. Apple's
         | vision of computing is on a trajectory where the end game is
         | clear: Users have no control over their devices and will only
         | be doing what Apple allows them to do. This is the case on iOS
         | already, and macOS will be there in a few years.
         | 
         | Some say: Decide with your purse. If you don't like Apple's
         | vision, then don't buy their products. But this argument is
         | like saying "This is how we handle things in country X, if you
         | don't like it, move somewhere else." Some of us have invested
         | lots in tools and software in Apple land, so changing platforms
         | will take some time. But even if we can change, Apple is
         | shaping the future of this industry, in a way that it restricts
         | freedom and limits options. And this is not a future we should
         | accept. Instead we should be opposing and fighting it.
        
         | sneak wrote:
         | > _Case to the point: online signature check was a technical
         | decision, to fight malware._
         | 
         | This is an oversimplification. It also helps protect Apple's
         | business model: you must pay Apple a fee for services (and show
         | ID) to be able to sign your apps for distribution on this
         | platform. Imagine if you had to show ID to get a TLS
         | certificate for your website.
         | 
         | Don't conflate the issue - this is _also_ a move to protect
         | certain streams of Apple services revenue, in addition to
         | protecting users from malware, and it always has been.
        
           | valuearb wrote:
           | Without facts, your thoughts are merely conspiracy theories.
           | There is no difference between them and claims the US
           | presidential election was stolen.
        
             | sneak wrote:
             | I've not posted any theories, only widely-accepted and
             | recognized facts.
        
               | valuearb wrote:
               | " this is also a move to protect certain streams of Apple
               | services revenue"
               | 
               | Citation needed please, Mr. Giuliani.
        
               | crazygringo wrote:
               | > _this is also a move to protect certain streams of
               | Apple services revenue_
               | 
               | That's a theory, not a fact, nor is it widely recognized.
               | 
               | By any reasonable accounting of costs, the $99 Apple
               | Developer Program fee isn't meant to be a profit center
               | that Apple is trying to "protect". It mainly helps
               | prevent spam accounts and helps offset the cost of
               | reviewing and distributing free apps in the Mac and iOS
               | app stores.
        
           | shadowfiend wrote:
           | I _see you_ asserting this over and over.
           | 
           | What I don't see is you providing _any real evidence_ that
           | this is a core part of the decision-making process.
           | 
           | Apple isn't particularly incentivized to find a
           | _different_ way that avoids the tools they _already
           | have_, tools that make it harder and costlier for parties
           | to get around their security mechanisms. That is _not the
           | same_ as making decisions _because_ they protect the
           | business model.
           | 
           | Which is to say, it appears that you're the one
           | oversimplifying and conflating (btw, I do not think that word
           | means what you think it means) some very different
           | motivations.
        
             | shawnz wrote:
             | I agree that there is no direct evidence that this
             | consideration was part of their formal decision-making
             | process. But there
             | is still something to be said for designing systems where
             | it's not possible for those negative incentives to exist,
             | whether or not there is any current intention of taking
             | advantage of them.
             | 
             | Of the tens of thousands of people who had a hand in
             | shaping macOS today, it's impossible to say what their
             | collective intentions were in all the decisions they made.
             | So I think it's useless to talk only about the intentions
             | you can prove just by looking at their formal decision-
             | making process. That is why we need to be working to
             | protect privacy at every level with a "defense in depth"
             | approach.
             | 
             | And of course it goes without saying that all major vendors
             | have issues like this and could be working harder to make
             | sure that these incentives don't get created.
        
               | shadowfiend wrote:
               | > designing systems where it's not possible for those
               | negative incentives to exist
               | 
               | No doubt, but even in simple systems this is considerably
               | more difficult than it sounds. Incentive systems are not
               | easy, and any incentive system is often twisted into a
               | game that produces unexpected poor behaviors. Try to
               | achieve that at a 100+k employee company and you're
               | guaranteed to end up with misaligned or counterintuitive
               | incentives.
               | 
               | The reason something stronger than simple assertion
               | matters here is because Apple actually _has_ added an
               | incentive system: trumpeting privacy as a core feature
               | means they tie their brand to their ability to deliver on
               | privacy. That means that being called out for privacy
               | issues has greater potential harm for the company, and
               | thus its bottom line, in a way that's far more direct
               | and monetarily impactful than $100 annual developer fees
               | ever will be.
               | 
               | Even cynically, you can see the same mechanic at work
               | with the recent reduction of app store percentages for
               | small businesses. Apple has made it a core part of its
               | developer outreach that the app store is a good thing for
               | developers: they've made it part of their brand. If
               | they're called out for something that makes many of those
               | developers disagree, or even for something that makes
               | _users_ perceive that part of the brand as incorrect,
               | it has implications for the whole company's bottom
               | line (not just the developer id sliver of it).
               | 
               | > it's useless to talk only about the intentions you can
               | prove just by looking at their formal decision-making
               | process
               | 
               | For what it's worth, my point isn't tied to formal
               | decision-making, it's tied to the informal parts as well.
               | What I'm saying is that this can be a completely
               | technical decision that relies on the existing business
               | structure without ever taking into account whether it
               | will raise revenue from developer ids. The developer id's
               | existence is a fact. The "DoS resistance"
               | characteristics, if you will, of having the developer ids
               | cost money is a fact. As a system architect, leveraging
               | those facts for system security seems perfectly
               | reasonable. Yes, absolutely they should have taken
               | privacy into account as well. "We haven't used it" and
               | "we won't use it" aren't the same as "we can't use it".
               | 
               | But here's the thing: to me, the strongest indicator of
               | whether a company is committed to an approach is whether
               | they react positively when they are called out, or
               | whether they double down on their mistakes. Apple was
               | called out here, and they've committed to doing just
               | about all of the things they should be doing. You could
               | use this same framing to say that Apple isn't as
               | committed to developers thriving on their platform as
               | they are to living up to their privacy commitment, of
               | course.
               | 
               | Tl;dr: customers are part of the incentive system;
               | tying your brand to a commitment as aggressively as
               | Apple has tied theirs to privacy has an impact on
               | customers, and this is a great way to introduce an
               | additional external forcing function for your
               | internal teams.
        
             | sneak wrote:
             | Apple charges 10x the market rate for credit card
             | processing on the purchase of mobile apps on iOS. Why do
             | you think this is possible?
             | 
             | Take it from dhh if you don't believe me:
             | 
             | https://mobile.twitter.com/dhh/status/1328339591389175808
        
               | shadowfiend wrote:
               | Sorry, you seem to have deviated into an unrelated axe
               | you're grinding. Try again, this time with the axe you
               | were originally grinding, which I'll help with:
               | 
               | > [Online signature check] is also a move to protect
               | certain streams of Apple services revenue, in addition to
               | protecting users from malware, and it always has been.
               | 
               | To restate and avoid drifting into another non sequitur,
               | this ascribes _intent_; that is, it suggests that part
               | of the reason online signature check was _added_, and
               | part of how it has been _evolved_, is to protect certain
               | streams of Apple services revenue. That would be your
               | argument, which has no evidence to support it, but you
               | suggest is backed by "facts"[1], which appear to be
               | nowhere to be found.
               | 
               | Are these facts somewhere to be found? Or are you stating
               | hypotheses as facts?
               | 
               | [1] https://news.ycombinator.com/item?id=25210475
        
               | sneak wrote:
               | I have no axe to grind with Apple. I'm a happy Apple
               | customer and have been for the better part of 30 years.
               | 
               | The same code that keeps malware from running on a Mac
               | (or iPhone) keeps non-App-Store apps from running on an
               | iPhone, or prompts you to move non-notarized apps to
               | the trash on a Mac.
               | 
               | It's not some separate thing: the exact same code path
               | that protects the consumer store revenue and developer
               | notarization service revenue also protects users against
               | malware.
               | 
               | EDIT, for clarity: I am speaking of Apple-developed,
               | Apple-owned platform security code, where root keys are
               | not held by anyone other than Apple, not generic crypto
               | primitives or the concept of code signing in general
               | (where we have a P-as-in-public PKI).
        
               | Spivak wrote:
               | Which is the same code that keeps unsigned bootloaders
               | from running on PCs which is the same code that keeps
               | unsigned packages from being installed on Linux systems
               | which is the same code that keeps unsigned browser
               | extensions from running on Firefox which is the same code
               | that shows the scary warning on Windows.
               | 
               | Everyone seems to like code signing.
        
               | novok wrote:
               | Lol, you have never had to deal with Apple's
               | overcomplicated code signing as a developer.
               | 
               | It throws a lot of wrenches in the works when you're
               | just trying to do basic stuff like codesigning and
               | pushing test builds onto a USB-connected device from a
               | bash script, and it is flaky and undocumented as fuck.
               | 
               | I am honestly jealous of my Android counterparts with
               | their far simpler system and first-class command line
               | support via adb.
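               | 
               | To make the contrast concrete, here's a rough sketch of
               | the kind of wrapper you end up writing. It's a minimal
               | sketch: the bundle name, signing identity, and retry
               | count are made up, and only basic, documented codesign
               | and adb flags are used.
               | 
               |   import subprocess, time
               |   
               |   IDENTITY = "Apple Development: Example"  # placeholder
               |   APP = "MyApp.app"                        # placeholder
               |   
               |   def run(cmd, retries=3):
               |       # Retry because signing and device pushes can
               |       # fail intermittently.
               |       for _ in range(retries):
               |           if subprocess.run(cmd).returncode == 0:
               |               return
               |           time.sleep(5)
               |       raise RuntimeError(f"failed: {cmd}")
               |   
               |   # macOS: sign the bundle, then verify the signature.
               |   run(["codesign", "--force", "--sign", IDENTITY, APP])
               |   run(["codesign", "--verify", "--verbose=2", APP])
               |   
               |   # Android: one documented command installs a debug
               |   # build on a USB-connected device.
               |   run(["adb", "install", "-r", "app-debug.apk"])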
        
         | Y-bar wrote:
         | > everything becomes political, even if it really is a
         | technical issue
         | 
         | I'm reminded of an old song by Tom Lehrer:
         | [https://www.youtube.com/watch?v=QEJ9HrZq7Ro]
        
         | raxxorrax wrote:
         | I think informing the vendor of an OS about every program
         | you run necessarily has a political component and is
         | relevant to any discussion of security. Defining the threat
         | model is the first step, and there certainly are additional
         | threats this exposes you to. This data is highly valuable
         | to anyone who might want to infiltrate computer systems.
         | 
         | The technical discussion of signatures is well documented.
        
         | driverdan wrote:
         | > It was never about privacy. It was never a political issue.
         | Can we please just discuss it from a technological standpoint?
         | 
         | No. Unintended consequences are important. There are privacy
         | issues, therefore we need to discuss privacy.
        
         | shawnz wrote:
         | The technical issue is "can we provide these features without
         | weakening privacy?"
         | 
         | The political issue is "if we can't provide these features
         | without weakening privacy, should we still provide them?"
         | 
         | Aren't they both important points to discuss?
        
           | nickpp wrote:
           | They are, but the difference is that we can fix technical
           | issues, or at least improve them. We can (mostly) agree about
           | what's right and wrong and what's better or worse.
           | 
           | Political issues, on the other hand, just end up
           | antagonizing us ever more. We argue endlessly, go off on
           | countless tangents, and nobody agrees on anything, because
           | we see the very same issues in different lights, shaped by
           | different experiences, values and cultures.
           | 
           | I am tired of politics. I just want to get stuff done.
           | Hopefully good stuff, but I'd settle for slightly better.
        
             | lwouis wrote:
             | Politics is about where we go. Tech is about how we arrive
             | there. If you disagree on the direction we are headed,
             | discussing different ways to arrive there seems pointless.
             | 
             | You may agree with the status-quo politics, but other
             | people don't. For them it's not about the tech, it's about
             | the overall direction. For those people the important
             | discussion to have is political.
             | 
             | For instance, I personally am against the current
             | direction macOS and Windows are headed. I have no
             | problem with this kind of security measure _as long as
             | there is a button to opt out_. Currently, Apple decides
             | for everyone, and doesn't provide ways for power users
             | to opt out. I dislike this, and I feel like discussing
             | better cryptography, security protocols, etc., doesn't
             | address my priorities.
        
             | shawnz wrote:
             | I think this is a terrible attitude. We can improve
             | political issues and we do it by defending our opinions in
             | the public sphere, which has the power to change others'
             | opinions. If you don't work to present your ideas to the
             | world in the best possible light then you will just allow
             | other, weaker ideas to become more convincing in
             | comparison, thus doing a disservice to anyone who could
             | have been convinced otherwise.
             | 
             | I am sorry that politics is tiring but I think that is just
             | a reality of politics being a manifestation of natural
             | forces. The natural world is competitive, it's competitive
             | on the cellular level, the food chain is competitive, and
             | human societies are competitive too. Looking the other way
             | doesn't change the reality, it just guarantees that the
             | opinions of others will become more significant to the
             | world than your own. Just like how our cells compete to
             | form the best possible body, I think we are obligated to
             | compete with our ideas to form the best possible society.
        
               | mattalbie wrote:
               | People tire of politics when (they feel) the
               | conversation becomes unconvincing, pedantic, or
               | monotonous. That doesn't mean those people have the
               | wrong attitude; it means the argument is not
               | compelling.
        
       | viro wrote:
       | So you guys know AVs randomly upload your files to the
       | cloud, right?
        
       | KingOfCoders wrote:
       | "Privacy is not a feature".
        
         | woahAcademia wrote:
         | Directly contradicting current Apple Marketing.
         | 
         | I loathe Apple (and other unethical companies) for lying
         | in their ads.
         | 
         | Any benefits of macOS are instantly gone because you cannot
         | trust Apple to tell the truth. It's as unreliable as Google
         | keeping a service around.
        
           | zepto wrote:
           | You don't have any evidence to support the claim that they
           | are lying.
        
             | woahAcademia wrote:
             | First time on HN? There are like 2 new Apple security or
             | privacy issues every week.
             | 
               | You know about PRISM/Edward Snowden?
             | 
             | You can verify all of this with almost no effort. Any links
             | I post you won't believe. It's up to you.
        
               | acdha wrote:
               | You must be new yourself not to have seen how quickly
               | those claims are debunked or shown to be far less
               | exciting than claimed. For example, you're repeating
               | Greenwald's misunderstanding of the PRISM report,
               | seemingly unaware of the last decade of discussion.
        
               | zepto wrote:
               | Not one of those security or privacy issues substantiates
               | that _Apple has a hidden agenda to collect data on you_.
               | 
               | They do substantiate that Apple has a long way to go in
               | terms of technically solving privacy problems.
               | 
               | Yes, the NSA has an agenda to track you.
        
               | woahAcademia wrote:
               | Ahh, my book doesn't actually make you a millionaire
               | in a day, but that's just because I failed
               | technically. Anyway:
               | 
               | "Buy my book, make 1 million dollars in a day."
               | 
               | It's not a hidden agenda, I just suck.
        
               | zepto wrote:
               | Apple _is_ actually checking certificates. That is the
               | agenda.
               | 
               | Claiming they are doing more requires evidence.
        
           | m463 wrote:
           | I first started getting an idea that something was amiss
           | with Apple years and years ago.
           | 
           | "Apple machines don't get viruses."
           | 
           | Everyone "knew" that. Everyone knew that Apple machines
           | were secure.
           | 
           | Turns out there were plenty of things going on with Apple,
           | just like at other companies. The difference was that
           | Apple would actively persecute (even prosecute) people who
           | explored or discovered these things. Basically, Apple
           | marketing was able to keep a lot of vulnerabilities out of
           | the public eye. Probably Marketing 101: protect your
           | brand.
        
             | Bud wrote:
             | Please cite evidence for these extraordinary claims. Who,
             | precisely, did Apple "prosecute" for "exploring" security
             | issues?
             | 
             | Please don't post made-up things in public.
             | 
             | As an aside, no, everyone did not "know" that Macs don't
             | get viruses. But yes, most informed people were aware that
             | Macs were somewhat less susceptible to malware for a while.
             | Please don't exaggerate wildly.
        
           | grishka wrote:
           | Apple's marketing basically boils down to "please trust us
           | that we respect your privacy because we tell you so". You can
           | at least reverse engineer the hardware you own and the
           | software on it, but what about the cloud services they're
           | pushing so hard? You don't, and can't, know what happens
           | to your data
           | in someone else's infrastructure.
        
             | hu3 wrote:
             | Well, we do know they dropped iCloud end-to-end
             | encryption after the FBI complained.
             | 
             | https://www.reuters.com/article/us-apple-fbi-icloud-
             | exclusiv...
             | 
             | We also know that Apple handed all Chinese cloud data
             | to the government by moving hosting to a state-owned
             | firm.
             | 
             | https://www.theverge.com/2018/2/28/17055088/apple-chinese-
             | ic...
        
         | intricatedetail wrote:
         | Privacy is a currency you pay hidden charges with.
        
           | anticensor wrote:
           | Privacy is a thing you only get to choose once per line
           | of events. Once you commit to the non-private option, you
           | are out of luck. No amount of "postcautions" will fix
           | that.
        
       | GekkePrutser wrote:
       | I don't agree with the article's statement that this is
       | necessary.
       | 
       | I'm sure it serves a purpose. But it should be more transparent
       | to the user what's going on, and it should be possible to switch
       | it off if the user decides they don't want this.
       | 
       | And really, the article also mentions that Apple used to do
       | this with a local cache but stopped doing so in Catalina.
       | The question is why. A local cache arguably offers better
       | protection, as it keeps working even without a network
       | connection, whereas the OCSP check has no alternative other
       | than failing open or stopping the system from working.
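       | 
       | To illustrate the difference in Python (a rough sketch, not
       | Apple's implementation; the responder stub and serial numbers
       | are invented): the two designs differ only in what happens
       | when the revocation data can't be reached.
       | 
       |   import socket
       |   
       |   REVOKED = {"serial-123"}  # locally cached list
       |   
       |   def query_ocsp(serial, timeout=5.0):
       |       # Stand-in for a real RFC 6960 OCSP query; here it
       |       # simulates an unreachable responder.
       |       raise socket.timeout
       |   
       |   def check_local(serial):
       |       # Works offline; only as fresh as the last update.
       |       return serial not in REVOKED
       |   
       |   def check_online(serial):
       |       try:
       |           return query_ocsp(serial) == "good"
       |       except OSError:
       |           # "Fail open": allow the launch. Failing closed
       |           # would block every app whenever the responder
       |           # is slow or unreachable.
       |           return True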
        
         | emteycz wrote:
         | Most things serve a purpose, even if it's a convoluted, dumb,
         | wrong or evil one.
        
         | musicale wrote:
         | > And really, the article also mentions Apple used to do this
         | with a local cache but stopped doing this in Catalina
         | 
         | This exactly. Local cache works fine, certificate revocation is
         | rare, and a marginal to nonexistent improvement in security is
         | not worth the slowdown, denial of service, and privacy
         | invasion.
         | 
         | Chrome uses a certificate revocation list for basically the
         | entire internet; certainly macOS can (and indeed should) go
         | back to using such a list for developer certificates, as it
         | did in Mojave.
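         | 
         | For illustration, a minimal sketch of such a periodically
         | refreshed local list; the URL and JSON format here are
         | hypothetical, loosely in the spirit of Chrome's CRLSet
         | rather than its actual format.
         | 
         |   import json, time, urllib.request
         |   
         |   URL = "https://example.com/dev-crl.json"  # hypothetical
         |   MAX_AGE = 86400  # refresh at most daily
         |   _cache = {"serials": set(), "at": 0.0}
         |   
         |   def _refresh():
         |       if time.time() - _cache["at"] < MAX_AGE:
         |           return
         |       try:
         |           with urllib.request.urlopen(URL, timeout=10) as r:
         |               _cache["serials"] = set(json.load(r))
         |               _cache["at"] = time.time()
         |       except OSError:
         |           # Keep serving the stale list; a launch is
         |           # never blocked on the network.
         |           pass
         |   
         |   def is_revoked(serial):
         |       _refresh()
         |       return serial in _cache["serials"]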
        
       | lo_fye wrote:
       | Yes, with no personally identifying details included. And it
       | was publicly documented.
        
       | ben509 wrote:
       | > What has been puzzling me ever since is that these OCSP checks
       | have been well-known for a couple of years, and only now have
       | attracted attention.
       | 
       | It's not much of a puzzle. Everyone's Mac essentially froze
       | simultaneously, and that's what drew all the initial attention.
       | Then people were digging into why, and the cause of it got a lot
       | of play. Privacy advocates took the opportunity to advocate
       | privacy and had a large and annoyed audience.
        
         | acdha wrote:
         | Don't forget to account for selection bias: some people had
         | apps delay launching and started telling everyone about it
         | as if it were something affecting all apps and users. Most
         | of us had no idea there was a problem and thus didn't feel
         | obligated to start talking about it on social media.
        
       ___________________________________________________________________
       (page generated 2020-11-25 23:00 UTC)