[HN Gopher] After criticism, Apple to only seek abuse images fla...
       ___________________________________________________________________
        
       After criticism, Apple to only seek abuse images flagged in
       multiple nations
        
       Author : ldayley
       Score  : 107 points
        Date   : 2021-08-13 21:49 UTC (1 hour ago)
        
 (HTM) web link (mobile.reuters.com)
 (TXT) w3m dump (mobile.reuters.com)
        
       | tehnub wrote:
        | They put out a new paper [0] describing the security threat model
       | they were working with, and this paragraph on page 9 stood out to
       | me:
       | 
       | The perceptual CSAM hash database is included, in an encrypted
       | form, as part of the signed operating system. It is never
       | downloaded or updated separately over the Internet or through any
       | other mechanism. This claim is subject to code inspection by
       | security researchers like all other iOS device-side security
       | claims.
       | 
       | Could someone tell me how that inspection works? Are there
       | researchers who are given the source code?
       | 
       | [0]: https://www.apple.com/child-
       | safety/pdf/Security_Threat_Model...
        
       | whytaka wrote:
       | Conveniently, the other participating nations are all members of
       | Five Eyes. Probably.
        
       | metiscus wrote:
       | This article misses the point completely. It was never about what
       | they were looking for, it was that they were looking at all. The
       | quiet part is, once they get acceptance of the snooping using the
       | bogeyman of the day, it will eventually encompass other behavior
       | up to and including political dissent etc.
        
         | okamiueru wrote:
         | There is also the part where plausible deniability is lost.
          | Closed-door arbitration of heinous crimes is the perfect tool
          | for putting away dissidents. A very convenient excuse to avoid
         | proper checks and balances. "No, you cannot verify, because you
         | aren't allowed to look at the illegal material that proves
         | their guilt"
        
           | hahajk wrote:
            | Is this actually true? If I were arrested for an image on my
           | phone, would no one in the judge/jury team be allowed to see
           | the image that was flagged? I understand it's allegedly
           | contraband, but if I were a judge I don't think I'd take an
           | algorithm's word for it.
        
             | heavyset_go wrote:
             | > _it's allegedly contraband, but if I were a judge I don't
             | think I'd take an algorithm's word for it._
             | 
              | The prosecution can hire more expert witnesses, with more
              | impressive credentials, than you can, and they will explain
              | how there's only a one-in-a-trillion possibility that the
              | system is wrong, and that the defendant is assuredly a
              | monster.
             | 
             | Juries eat that up when it comes to bogus DNA, bite mark,
             | fingerprint, or other forensic evidence claims. Most people
             | think computers can't be wrong or biased, and people's
             | perceptions of what can be deemed reasonable doubts or not
             | seem to shift when computers are involved, or when smart,
             | credentialed people tell them their reasonable doubts
             | aren't reasonable at all because of that one in a trillion
             | chance of the computers being wrong.
        
         | jsjohnst wrote:
          | I agree. Further, is there any doubt that these "CSAM"[2] hash
          | databases are already shared between countries? If there were
          | any doubt, there shouldn't be, because agencies do share across
          | borders.
          | 
          | [2] - the scare quotes are because I have first-hand knowledge
          | of non-CSAM content being in NCMEC's database (most likely via
          | data entry errors, but I can't be sure).
        
           | belorn wrote:
           | Images of bonsai trees?
        
             | jsjohnst wrote:
             | > Images of bonsai trees?
             | 
             | Ha! Come to think of it, I think that was one example. The
              | main examples that came to mind (it's been almost 8 years
             | since I was involved) when discussing this last week were
             | essentially extremely common photos (like the stock Windows
             | XP background, among others).
        
         | deugo wrote:
         | I think this is (quietly) already happening:
         | https://www.reuters.com/technology/exclusive-facebook-tech-g...
         | 
          | Instead of hashes used to look for sexual abuse images, these
          | are hashes of white supremacist content. And instead of such
          | content being reported to the government, it is shared with all
          | the major tech companies, including Airbnb and PayPal.
         | 
         | So, if Facebook deems you to be a political outcast, based on
         | your conversations on their platform, you may find yourself
         | without access to hotels or online payment platforms. The
         | bogeyman of the day is the white supremacist, and coordinated
         | snooping by big tech is the quiet part.
         | 
         | > Over the next few months, the group will add attacker
         | manifestos - often shared by sympathizers after white
         | supremacist violence - and other publications and links flagged
         | by U.N. initiative Tech Against Terrorism. It will use lists
         | from intelligence-sharing group Five Eyes, adding URLs and PDFs
         | from more groups, including the Proud Boys, the Three
         | Percenters and neo-Nazis.
         | 
         | > The firms, which include Twitter (TWTR.N) and Alphabet Inc's
         | (GOOGL.O) YouTube, share "hashes," unique numerical
         | representations of original pieces of content that have been
         | removed from their services. Other platforms use these to
         | identify the same content on their own sites in order to review
         | or remove it.
         | 
          | To become part of the Global Internet Forum to Counter
          | Terrorism, a company has to pledge its support for expanding the
          | scope from terrorism to all online extremism. If you can't
          | accept that, they will still send a speaker to educate your
          | company.
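          | 
          | To make the mechanics concrete, here's a toy sketch (Python) of
          | that kind of shared-hash matching. The digests and names below
          | are placeholders I made up; the real GIFCT database uses its own
          | hash formats, so this is purely illustrative:
          | 
          |     import hashlib
          | 
          |     # Hypothetical shared database: digests of content that some
          |     # participating platform already removed. Real consortium
          |     # databases use their own (sometimes perceptual) hash
          |     # formats; plain SHA-256 is used here only for illustration.
          |     shared_hashes = {
          |         hashlib.sha256(b"removed manifesto").hexdigest(),
          |         hashlib.sha256(b"removed video").hexdigest(),
          |     }
          | 
          |     def is_flagged(content: bytes) -> bool:
          |         """True if this exact content was already flagged
          |         elsewhere."""
          |         digest = hashlib.sha256(content).hexdigest()
          |         return digest in shared_hashes
          | 
          |     print(is_flagged(b"removed manifesto"))    # True
          |     print(is_flagged(b"an unrelated upload"))  # False
          | 
          | Exact hashes like these only catch byte-identical files;
          | perceptual hashes try to also catch near-duplicates, which is
          | where the collision concerns elsewhere in this thread come in.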
        
         | spoonjim wrote:
         | And, just introducing this idea can make the concept real, even
         | if Apple cancels this plan tomorrow. Country X could introduce
         | a law saying "all companies that sell phones here must scan and
         | report for these hashes" and based on recent trends, the
         | companies will just roll over and say Uncle.
        
       | 2OEH8eoCRo0 wrote:
       | Wrong answer, Apple.
        
       | pembrook wrote:
        | This is by far the biggest misstep of Tim Cook's tenure at
        | Apple.
       | 
       | If they don't kill this program soon, it's going to overshadow
       | the entire upcoming iPhone event, and will follow Apple around
       | like a dark cloud for _years._
       | 
       | I can see the headlines now: "New iPhone launches amid massive
       | new privacy concerns."
       | 
       | Anytime someone praises Apple for privacy, anywhere on the
       | internet, there will be a tidal wave of people bringing up this
       | program in rebuttal. From people who would have previously
       | defended Apple to the grave!
       | 
       | I cannot fathom how on earth anybody thought this was a good
        | idea. It's like taking decades and billions of dollars' worth of
        | hard-won reputation for privacy and throwing it in the garbage at
       | the worst possible moment.
        
         | n8cpdx wrote:
         | The folks who choose iPhone in part because of a perception of
         | relative privacy and respect for the user (like myself) will
         | think twice.
         | 
         | My two concerns are that Android as an ecosystem is almost
         | certainly still worse, and that the vast majority of users will
         | not care.
         | 
         | I'm tempted to jailbreak my devices going forward, although I
         | guess the folks at Apple would say that makes me a pedophile.
        
       | amq wrote:
       | A backdoor is a backdoor.
        
       | cletus wrote:
       | And... Apple misses the point of the criticism completely.
       | 
        | The problem is the capability, not what it's used for. Any such
        | capability will be abused for new use cases, be it terrorism, drug
        | trafficking, human trafficking or whatever. Plus there will
        | inevitably be unauthorized access.
       | 
       | The only way to prevent all this is for the system not to exist.
       | 
       | I don't buy into theories that Apple is being pressured or
       | coerced on any of this. I believe it's far more likely this is
       | just tone-deaf but well-intentioned incompetence. It's classic
       | "why won't anyone think of the children?" and we've seen it time
       | and time again with encryption backdoors and similar.
       | 
       | The big question for me is how and why Tim Cook and Apple's board
       | signed off on a plan to do huge damage to Apple's reputation and
       | user trust. If they didn't know, it's a problem. If they knew and
       | didn't realize the inevitable backlash, well that's a different
       | problem.
        
         | lucasyvas wrote:
         | You are completely correct and I find this turn of events
         | hilarious. They are onto the bargaining stage of grief.
        
         | k2enemy wrote:
          | > The problem is the capability
         | 
         | > The only way to prevent all this is for the system not to
         | exist.
         | 
         | I've been having trouble following this argument over the last
         | week. Isn't it clear that the capability already exists?
         | Whether or not Apple goes through with its CSAM plan, the
         | capability is evidently there.
         | 
         | In other words, since Apple is a closed system, the capability
         | was there before they announced the CSAM plans. Their
         | announcement has changed nothing about capability other than
         | reminding people that Apple has privileged access to the data
         | on iPhones.
         | 
         | I guess the question is, if Apple does the CSAM program does
         | that make them more likely to cave to government pressure to
         | search for other things? And to do so without telling users?
        
           | n8cpdx wrote:
           | There shouldn't be a single line of code, or single binary
            | blob, on my device that can compute these PhotoDNA hashes.
           | 
           | Apple could always go ahead and add that functionality in an
           | update, but then there would be a big backlash and the
           | opportunity to not update or switch providers.
        
             | k2enemy wrote:
             | Wouldn't users have the ability to not update or switch
             | providers if Apple expands the scope of its search beyond
             | CSAM?
             | 
             | I guess I don't see the huge difference between the
             | surveillance code existing on the phone but not used for
             | objectionable purposes versus the line of code sitting in a
             | different branch in git and not deployed on actual phones
             | (yet).
             | 
             | I'm completely against this move by the way -- not trying
             | to defend it. But I want to be able to argue effectively
             | against it.
        
           | ipv6ipv4 wrote:
           | There is a qualitative difference between tweaking an
           | existing code base and coercing a company to dedicate teams
           | of tens to hundreds of engineers over years to create a brand
           | new code base from scratch.
        
         | heavyset_go wrote:
          | > _The problem is the capability, not what it's used for. Any
          | such capability will be abused for new use cases, be it
          | terrorism, drug trafficking, human trafficking or whatever.
          | Plus there will inevitably be unauthorized access._
         | 
         | Apple even says it themselves[1]:
         | 
         | > _This program is ambitious, and protecting children is an
         | important responsibility. These efforts will evolve and expand
         | over time._
         | 
         | [1] https://www.apple.com/child-safety/
        
           | goodells wrote:
           | I can't wait until /child-safety 404s or redirects to /safety
           | and there's a wall of marketing blurb (possibly only in
           | Chinese at first) that explains how 'national security'
           | concerns are reported to the CCP.
           | 
           | This has totally pushed me over the edge, though I'll admit I
           | was oblivious to begin with. My plan is to replace the
           | MacBooks with a Thinkpad P15 gen 2 running Ubuntu and replace
           | the iPhone with something running Ubuntu Touch (Volla Phone,
           | Fairphone, OnePlus One). Screw not having control.
        
             | merpnderp wrote:
             | That's a smart plan. Apple didn't misunderstand the
             | criticism, they just have ulterior motives. It's the only
             | rational explanation.
        
             | hypothesis wrote:
              | I also note that the /child-safety page is only(?) reachable
              | via outside search or a direct link. There is no
              | corresponding press article. This thing is just floating in
              | the air somehow...
        
             | hef19898 wrote:
              | Same here, except I'll be getting rid of MS and Google.
              | That means CalyxOS on my Pixel 2 and some version of Linux
              | on my private ThinkPad. Once I find the time, that is,
              | which to be honest can take a while. I think the last time
              | I used my private laptop was over a month ago. The phone is
              | different, but again, time constraints. It will happen,
              | though.
        
         | nimbius wrote:
          | I just have to wonder who green-lit this idea to begin with.
          | Every single other company is riding a post-COVID tidal wave of
          | sales. As WFH becomes a reality for so many workers, Apple
          | products are positioned to really take center stage for a
          | massive segment of consumers...
          | 
          | And then this. Apple intentionally injects uncertainty and
          | controversy? What were they thinking?? Sure, it's just phones
          | right now, but I'm sure Mac users are wondering about
          | workstations and laptops. The damage control being spun right
          | now is absolutely overwhelming.
          | 
          | I'm also surprised to see no other players like MS or Google
          | rushing to take advantage of the outrage. Even players like
          | Purism seem to be ignoring the event.
        
           | lethologica wrote:
            | How much of an impact do you think this will have on the
            | non-tech-savvy user though? Most people already don't care about
           | ad tracking, location tracking, etc. I think your standard
           | user will just shrug this off and go "eh, I have nothing to
           | hide and it's good for the kids"
        
         | laserlight wrote:
         | > The only way to prevent all this is for the system not to
         | exist.
         | 
         | I'll go one step further. Apple should have implemented a
          | system that makes these kinds of backdoors impossible. I don't
         | know how or if such a system is possible, but given Apple's
         | track record, they are in a position to attempt it.
        
         | finolex1 wrote:
         | Unfortunately, I don't think this has been anything more than a
          | small blip to Apple's reputation. If I were not on Hacker News
          | (like 99+% of Apple users), I would either not have heard of
          | this move or not have given it a second thought.
        
           | musha68k wrote:
           | It's on Reuters and many other news outlets. The press are
           | still doing their job for the moment at least.
        
             | busymom0 wrote:
              | I think it dominating Reddit for the last week might be
             | bringing a lot of attention too.
             | 
             | Kind of surprising how tone deaf Apple's response has been
             | despite this.
        
           | AnthonyMouse wrote:
           | Bad press on tech sites is the _worst_ thing for a tech
            | company's reputation. They're the people other people go to
           | when deciding what to buy.
           | 
           | When Uncle Bob asks the family computer engineer about the
           | pros and cons of a given platform and hears that one of them
           | scans your device and a false positive could get you arrested
           | for child pornography, Uncle Bob may develop an aversion to
           | that platform.
        
         | kevin_thibedeau wrote:
         | > The big question for me is how and why Tim Cook and Apple's
         | board signed off on a plan
         | 
         | "Do us a favor or the AG launches an investigation into your
         | business practices."
        
         | norov wrote:
         | The only explanation that makes sense to me is that they are
         | doing huge damage to their reputation and trust for bigger
         | reasons. Perhaps there are international or national security
         | issues they are told to comply with that are not quite in the
         | public view yet.
        
           | heavyset_go wrote:
           | Occam's razor says they aren't playing 11th dimensional chess
           | and that they just fucked up. It happens.
        
             | frickinLasers wrote:
             | "Build this. Don't explain why" is not chess. It would be a
             | perfectly logical order from the power structures Snowden
             | exposed.
        
       | babesh wrote:
       | I wonder what the other countries will be and whether they will
       | be disclosed?
       | 
       | My bet is on Canada, UK, Australia, and New Zealand.
       | 
       | https://en.wikipedia.org/wiki/Five_Eyes
        
       | codezero wrote:
       | I expect any powerful nation could "share" with a desperate ally
       | or otherwise leverage them into adopting the same database,
       | creating a false overlap.
        
         | bellyfullofbac wrote:
         | Yeah. "Dear debtor countries, the attached file contain hashes
         | of new files deemed illegal for the harmonious well-being of
         | peoples for this month, please enter them into your databases."
        
         | [deleted]
        
       | paulie4542 wrote:
       | Trust lost last week. This doesn't matter.
        
         | tgsovlerkhgsel wrote:
         | I wouldn't say "this doesn't matter", I'd say "this makes it
         | worse" because they're reaffirming that they want to push
         | forward with this despite the criticism.
        
         | newsbinator wrote:
          | Of course it matters - if Apple does a 180° reversal I'll
         | applaud and buy more Apple products. I _want_ companies to be
         | able to acknowledge mistakes unequivocally and fix them.
         | 
         | If Apple says "we heard the backlash, we're sorry, we'll never
         | do it again", I'll be more of a supporter than I had been
         | before.
         | 
         | Until they do, trust lost.
        
       | firebaze wrote:
        | I'm not sure this addresses the main point. Apple promised
        | privacy, as long as you're not violating laws. The most recent
        | move changed this: now you're a suspect by default, your photos
        | will be scanned even on your local device, and you won't know
        | what the criteria for flagging you are, since they're hidden from
        | you and you can't defend yourself against criteria you can't see.
       | 
       | So what's gotten better by this? In the kindest interpretation,
       | you'll be safe from Fascististan's hashes. But what if
       | Fascististan has been bribed by China or the US? And so on.
       | 
       | A line which should never have been crossed was crossed.
        
       | cblconfederate wrote:
       | lol, so first it's "at least 30 images", then it's "multiple
       | nations" (how many? which pairs are allowed?)
       | 
       | Soon your pictures will be subject to a 3-week review period to
        | decide whether they're fit to be scanned
        
       | heavyset_go wrote:
        | This does not address the issue of perceptual hash collisions and
        | false positives at all, nor does it address the issue of
        | on-device scanning, nor Apple's own statement that[1]:
       | 
       | > _This program is ambitious, and protecting children is an
       | important responsibility. These efforts will evolve and expand
       | over time._
       | 
       | People don't want their property spying on them and reporting
       | them to the police. They don't want people looking at their
       | photos or thumbnails. It's patronizing, invasive and embarrassing
       | to have your privacy violated like that.
       | 
       | [1] https://www.apple.com/child-safety/
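        | 
        | To illustrate the collision concern, here's a toy example (Python)
        | using a simple 8x8 average hash. This is not Apple's NeuralHash,
        | and the images are synthetic arrays; it only shows the general
        | class of problem, namely that visibly different inputs can map to
        | the same perceptual hash:
        | 
        |     import numpy as np
        | 
        |     def average_hash(img: np.ndarray) -> int:
        |         """Toy perceptual hash: one bit per pixel of an 8x8
        |         grayscale image, set when the pixel is brighter than
        |         the image's mean."""
        |         bits = (img > img.mean()).flatten()
        |         return int("".join("1" if b else "0" for b in bits), 2)
        | 
        |     rng = np.random.default_rng(0)
        |     original = rng.integers(60, 200, size=(8, 8))
        | 
        |     # A clearly different image (bright regions brightened, dark
        |     # regions darkened) still yields the exact same hash value.
        |     altered = np.where(original > original.mean(),
        |                        original + 40, original - 40)
        | 
        |     assert average_hash(original) == average_hash(altered)
        |     print(hex(average_hash(original)))
        | 
        | Real perceptual hashes are far more robust than this toy, but the
        | same trade-off applies: the more tolerant the hash is to
        | re-encoding and cropping, the easier it is for unrelated images to
        | land on, or be crafted to land on, the same value.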
        
         | concinds wrote:
         | You may be interested in this:
         | 
         | Security Threat Model Review of the Apple Child Safety Features
         | [pdf] (apple.com) https://news.ycombinator.com/item?id=28173134
        
       | swiley wrote:
        | They're willing to kill the platform over this? What's going on
        | over there on the west coast?
        
         | ud_0 wrote:
          | They know they can essentially wait out the public outcry, plus
          | they can even buffer things a little bit by calling their
          | critics "confused" and introducing measures that don't address
          | anything but sound reasonable to laypersons, in an effort to
          | distort the conversation.
         | 
         | The platform is not in danger. But if it were, then yes, I
         | think they'd still risk it. It's either that or getting shut
         | out of markets gradually by law enforcement and governments.
        
           | swiley wrote:
           | > It's either that or getting shut out of markets gradually
           | by law enforcement and governments.
           | 
           | How come that's not happening to desktop Linux then?
        
       | concinds wrote:
        | These stories have been top of HN for more than a week,
        | indicating fascination. Yet non-technical mainstream articles
        | make up the bulk of it, and Apple's just-released, detailed (and
        | convincing) threat model PDF is getting buried. An unfortunate
        | recipe that'll solidify misunderstandings and keep the
        | conversation much more emotional than it should be.
       | 
       | Security Threat Model Review of the Apple Child Safety Features
       | [pdf] (apple.com) https://news.ycombinator.com/item?id=28173134
        
         | tgsovlerkhgsel wrote:
         | > solidify misunderstandings
         | 
         | The key issue here is the very concept of "Apple turns your
         | iPhone into a snitch", and I haven't seen any misunderstanding
         | around that.
        
       | kook_throwaway wrote:
       | This perfectly illustrates the problem: the content being scanned
       | for can change arbitrarily and on a whim. Gun, meet foot.
        
       | adrr wrote:
        | How does that stop a FISA subpoena with a gag order? The system
        | now exists, and any secret government order can exploit it.
        
         | ls612 wrote:
         | A FISA subpoena or an NSL can only compel Apple to give certain
         | types of information that Apple already has. They can't compel
         | Apple to gather information they don't already have.
        
           | heavyset_go wrote:
           | Check this out[1].
           | 
           | [1] https://en.wikipedia.org/wiki/PRISM_(surveillance_program
           | )#/...
        
           | kook_throwaway wrote:
           | You have far more faith in a system of secret courts and gag
           | orders than I do.
        
             | busymom0 wrote:
              | Yep. FBI lawyer Kevin Clinesmith gets to lie to the FISA
              | court and fabricate emails to get 2-hop warrants, and then
              | gets zero days in prison and, as of this month, is allowed
              | to practice law again. It's an absolute joke.
        
           | captainredbeard wrote:
           | Hahaha an NSL can do more than that.
        
           | adrr wrote:
           | That can't be true because the old pen/trap orders required
           | installation of a device to record the calls.
        
       | thepasswordis wrote:
       | We could probably increase the vaccination rate if we snooped on
        | people's phones to find out if they are participating in vaccine-
       | hesitant behavior, and then rate-limited traffic to websites and
       | apps that peddle this dangerous misinformation.
       | 
       | How many people have to lose their lives in the service of some
       | pedantic idea of "privacy"? It's a computer looking at it, it's
       | not even a human person.
       | 
        | I think tech companies need a Hippocratic oath similar to "first
        | do no harm". Apple should not be engaging in misinformation
        | trafficking, and should at the very least be working to minimize
        | harm by preventing people from falling victim to dangerous,
        | unsubstantiated, and un-fact-checked information. This is
       | especially important when our elected officials use the
       | considerable power that has been gifted to them by the people to
        | put people's lives in danger by spreading dangerous
       | misinformation.
       | 
        | What role did Apple's inaction on this have in the pandemic? In
        | the January 6th insurrectionists' attempt to overturn a
       | validated, secure, and duly certified democratic election? What
       | role did Apple's inaction play in the attempted kidnapping of the
       | governor of Michigan?
       | 
       | By refusing to help, they are partially responsible for these
       | things. It's time for us to demand that they do their fair share
       | of helping. Inaction is itself an action.
       | 
       | --
       | 
       | This is sarcasm, of course. For how long?
        
       | PierceJoy wrote:
       | It was not that long ago that Apple wrote this letter
       | https://www.apple.com/customer-letter/.
       | 
       | > Building a version of iOS that bypasses security in this way
       | would undeniably create a backdoor. And while the government may
       | argue that its use would be limited to this case, there is no way
       | to guarantee such control.
        
       | post_break wrote:
        | Apple these days is the type of company to make a mistake, think
        | they are never wrong, blame anyone but themselves, and then change
        | a little bit, only for it to go over like a lead balloon. It
        | reminds me of the Apple Silicon dev kit: no refunds; OK, refunds,
        | but you have to spend them within a certain time; OK, fine, you
        | can use them for a longer period and for anything.
       | 
       | They are probably completely flabbergasted that people are upset
       | about this so they make this change after a week, completely
       | missing the point.
        
         | adventured wrote:
          | They've been like that since the early days of the company;
          | it's core to their culture, which was built by Steve Jobs.
          | Apple has always had an elitist snobbery at its center, such
          | that if you don't get a thing, then you're wrong. That's what
          | Think Different is all about: a hyper-arrogant proclamation
          | that could only be spewed out of an elitist, never-wrong
          | orifice; it's borderline assholic in nature. Part of the cult
          | of Apple has always been that attitude. The Apple fans eat it
          | up; they love the idea that buying Apple products made them
          | feel different (or it used to, before Apple became mainstream),
          | made them feel separate from the masses. Steve Jobs points that
          | out in the 2007 D5 interview with Gates: Apple needs the
          | inferior taste of the masses to prime its superiority off of
          | (which is another way of saying: they're wrong, we're right).
         | 
         | Go back to their response over the iPhone 4 reception problems
         | ("you're holding it wrong"), same asshole culture then as now,
         | and that culture has been in place for decades:
         | 
         | https://www.engadget.com/2010-06-24-apple-responds-over-ipho...
        
       | istingray wrote:
       | What happened to it being USA only?
       | 
       | This sounds like one of those errors where they double down on it
       | and call it a "solution".
       | 
       | Sounds like they're misunderstanding people's concerns pretty
       | badly. Sorry if we miscommunicated. Kill this "feature"
       | yesterday. Thanks Tim!
        
         | dannyw wrote:
          | This is just business. Governments around the world are
          | thinking of breaking up the App Store, and Apple is saying
          | they'll help governments surveil their citizens inside the
          | walled garden. It would be a shame if users could escape the
          | surveillance by choosing alternatives outside of the walled
          | garden, wouldn't it?
         | 
         | Expect an App Store rule soon - all photos apps must scan for
         | CSAM.
         | 
         | In a year, expect Signal to be banned from all app stores, just
         | like how it's already banned from some countries.
        
         | treesprite82 wrote:
          | USA-only referred to which accounts would be scanned. By
          | contrast, requiring the CSAM hashes to be verified by
          | organisations in multiple countries (rather than a single US
          | government-tied organisation that dismissed privacy concerns as
          | "the screeching voices of the minority") is, in theory, a good
          | thing.
         | 
          | I'm hoping for something along the lines of _"iOS will reject
          | hashes that haven't been signed by independent organisations in
          | the US, Russia, China, and India"_ to make it very difficult to
          | push through anything except actual CSAM. It won't be much of a
          | guarantee if it's just Apple saying _"we promise we're only
          | using hashes that have been checked by Australia and the US"_.
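          | 
          | Apple's PDF describes this as an intersection requirement: a
          | hash only ships if it appears in the databases of at least two
          | child safety organisations in separate sovereign jurisdictions.
          | As a rough sketch of that idea (Python; the organisation names
          | and hash strings are placeholders, and real entries would be
          | blinded NeuralHash values):
          | 
          |     from collections import Counter
          | 
          |     # Hypothetical per-organisation hash sets.
          |     databases = {
          |         "child_safety_org_us":   {"h1", "h2", "h3"},
          |         "child_safety_org_eu":   {"h2", "h3", "h4"},
          |         "child_safety_org_asia": {"h3", "h5"},
          |     }
          | 
          |     # Count how many organisations vouch for each hash.
          |     votes = Counter(
          |         h for hashes in databases.values() for h in hashes)
          | 
          |     # Only hashes vouched for by at least two separate
          |     # jurisdictions are eligible for the on-device database; a
          |     # hash pushed by a single government alone never ships.
          |     shipped = {h for h, n in votes.items() if n >= 2}
          |     print(sorted(shipped))  # ['h2', 'h3']
          | 
          | Of course, that only helps if the participating organisations
          | are actually independent of one another.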
        
       | sergiomattei wrote:
       | We don't want it pushed back, we want it dead in the water.
        
       | SnowProblem wrote:
       | I have a 4" iPhone SE. Love it, but Apple has continued to push
       | me away these last four years. I no longer use any other Apple
       | products except for this 4-year-old phone. What alternatives are
       | there for small phone fans? Sony used to make an Xperia Compact
       | that was nice, but I think they discontinued it.
        
         | swiley wrote:
         | I think there's an out of tree kernel for the SE. You could
         | give PostmarketOS a shot if you have a _lot_ of patience.
        
       | Invictus0 wrote:
       | It should be obvious by now that Apple has been coerced into
       | building this scanning system, and the CSAM scanning is just the
        | front for a much broader "national security" backdoor. Apple's
       | PR team has surely been shitting bricks for the entire week--this
       | is the worst PR debacle they've had since the iPhone 4
       | Antennagate in 2010.
        
         | istingray wrote:
         | Completely agree. A friend of mine asked what was happening,
         | and I just said "it's a big deal, biggest thing in a long
         | time". Antennagate is a good comparison -- though this seems
          | bigger. The long-term impact of Antennagate seemed minimal.
        
           | klipklop wrote:
            | Sadly this will likely have little to no impact on Apple
            | long term either. I too agree this is coming from the
            | government.
           | They have found their loophole of using "private industry" to
           | do all the dirty stuff they cannot directly do.
        
             | tgsovlerkhgsel wrote:
              | Given that privacy has been what Apple was using as their
             | key selling point in their ad campaign, and now they've
             | completely subverted that and mainstream media is picking
             | it up and people are noticing, I suspect this will impact
             | them far more than they expected.
             | 
             | While Apple deserves all the fallout they're getting and
             | more, I'm disappointed that the NCMEC that pushed for this
             | isn't also receiving more scrutiny and criticism. Multiple
             | people have now pointed out that their database contains
             | false positives, which is absolutely terrifying. Completely
             | legal, harmless, no-nudity-no-humans pictures can get your
             | life ruined. The truth coming out later doesn't matter when
             | your home gets raided and it slips out that you were caught
             | sharing multiple images that matched hashes from the NCMEC
             | child porn database.
        
           | musha68k wrote:
           | This is way bigger than any RF engineering mishap.
           | 
           | I'm going to actively evangelize alternatives to Apple
           | devices and that's coming from someone who has been doing
            | Apple evangelizing since the OS X Panther days. Tens of
            | thousands of dollars spent on devices and services, not
            | counting all the people I convinced to make the switch over
            | the years.
           | 
           | I'm sure I'm not the only one in this regard.
        
       | akomtu wrote:
       | Translating from corporatese: "Apple hasn't given up the idea and
       | is looking for a better excuse to push their surveillance
       | solution."
        
       | schappim wrote:
       | With this much bad PR (and Apple hates bad PR), why is Apple
       | pretty much holding its position?
        
         | tgsovlerkhgsel wrote:
         | They put themselves into a position where backing out is also
         | extremely costly.
         | 
         | a) it means they have to publicly admit that what they did was
         | a bad idea, which will fuel another news cycle
         | 
         | b) backing out of it will get them criticized for "protecting
         | pedophiles", maybe even the NCMEC (who privately praised them
         | in the "screeching voices of the minority" memo) will now
         | publicly criticize and shame them to get what it wants
         | 
         | c) now that they've put this idea on the table, there will be
         | even more government pressure to mandate/implement it. Even if
          | they back out of the implementation, just by bringing this
          | proposal up, Apple may have destroyed not just the privacy of
          | their users, but that of everyone.
        
         | klipklop wrote:
         | Because this is likely not about making money, it's about
         | expanding surveillance. Once this becomes accepted they can
          | ratchet it up bit by bit. Each time, the outrage will be less
          | and less.
         | 
          | For example, imagine creating a new tax of 1% on purchases.
          | People would be outraged. Now imagine instead raising the sales
          | tax by 1% (e.g., from 7% to 8%). Sure, some people would be
          | upset, but it
         | would be a smaller number than the "new" tax. The reason is
         | because people are already used to paying a sales tax. What is
         | "normal" is more easily accepted.
         | 
         | If Apple can get past the initial outrage (like when sales tax
         | was implemented in many countries in history) they can increase
         | the surveillance once scanning local files becomes the "new
         | normal".
        
           | bpodgursky wrote:
           | The reason Apple actually cares about the feature though is
           | probably to maintain sales in China (which will require this
           | kind of surveillance soon).
        
           | adventured wrote:
            | You can't have a massive new fake war - the War on Domestic
            | Terrorism - without, ideally, new ways to track and surveil
            | anyone (everyone) who is going against the machine
           | politically. They need new capabilities and they're
           | absolutely going to put them into place, asap. Did you see
           | their reaction to the populist Trump circus? The globalists
           | viewed it as a mortal threat. Everything they built since WW2
           | was in jeopardy as far as they were concerned. Their ability
           | to launch more forever wars, endless foreign adventurism,
           | obsessive (psychotic) superpower global meddling, all of it
           | under threat from rising populism (represented by both Trump
           | and Sanders) and a growing rejection of the entirely failed
           | decades-long approach by the globalists that has led to the
           | US being buried under nearly $30 trillion in debt and
           | numerous catastrophic wars. Just look at what these monsters
           | are saying publicly on MSNBC, CNN, et al. They're telling
           | everybody what they're planning to do, they're putting it
           | right out there. Look at their framing: we're going to target
           | the domestic population; they're de facto saying that. They
           | couldn't be more open about the authoritarian nightmare they
           | have in mind. They perceive _their_ system as being under
           | risk from populist revolt against them and their agenda, and
            | they're launching a counter offensive (the empire is
           | striking back), it's under way right now. Bet on it.
        
         | CommieBobDole wrote:
         | The thing that strikes me as most suspicious is their
         | insistence on keeping the client-side scanning feature while
         | wrapping it in all sorts of supposed restrictions so it's
         | essentially server-side scanning, just implemented on the
         | client device for some reason.
         | 
         | If the problem is "We need to scan stuff on iCloud for CSAM"
         | then "Let's build a client-side scanning agent and distribute
         | it to hundreds of millions of devices and figure out a way to
         | protect the hashes from the end user and then figure out a way
         | to lock it down so it can't be used to scan other things even
         | if we wanted to or somebody ordered us to" is a singularly (and
         | suspiciously) inelegant solution to the problem.
         | 
         | Just scan the files on your own servers, like everybody else
         | does. Anything else is rightly going to make people suspicious
         | that you've got some ulterior motive.
        
         | cf499 wrote:
         | Maybe they're telling you to run.
        
         | chrischen wrote:
         | Potentially something like a national security letter and they
         | can't reveal the real reason behind it. I agree this feature is
         | off-brand for Apple and came out of the blue.
        
           | underseacables wrote:
            | What would happen if Apple revealed the national security
           | letter? What's the government going to do, arrest Tim Cook?
        
           | silasdavis wrote:
           | Is that consistent with the amount of time this feature took
           | to implement?
        
           | vmception wrote:
           | Maybe this is the warrant canary
           | 
            | Let's push that idea so allies in the government can go
           | looking and void it
        
         | Overton-Window wrote:
         | Their handlers have them on a tight leash.
        
           | asteroidbelt wrote:
           | This sounds like a conspiracy theory unless you provide more
           | explanations with some evidence. For example, who are these
           | handlers? What are their motives?
           | 
           | The simplest explanation is usually the correct one. The
            | simplest explanation would be that some director at Apple
            | wants to get promoted to VP, so he or she started this
            | project, pitched it, and very few people opposed it, because
            | who wouldn't want to prevent harm to children?
        
         | floatingatoll wrote:
          | They're convinced that the benefit to abused children is worth
          | letting the tech community vent its fears about the world at
          | them for a while. I don't really blame them. My views don't
         | align with the mob here, so it would be a waste of my time to
         | discuss them while y'all are upset and angry about this; best
         | just to hold my position, remain silent or reserved, and wait
         | for the pitchfork mob to find another target or exhaust itself.
         | I'm honestly surprised Apple even commented at all.
        
           | tgsovlerkhgsel wrote:
           | > I'm honestly surprised Apple even commented at all.
           | 
           | I think they commented because they expected a brief storm of
            | outrage that would quickly die down, but instead it has become a
           | growing wave that is spilling into mainstream media, with a
           | mostly negative reaction, and threatening to completely
           | destroy their "Privacy. That's iPhone" mantra that has been
           | the core of their marketing campaign this year.
        
       | grishka wrote:
       | It's a policy decision, while the technical one is still there,
       | unchanged.
        
       | havelhovel wrote:
       | We've known Apple would use sources besides NCMEC since day one.
       | This is neither news nor a solution to any of the issues that
       | have been raised. From their website on August 5th: "...the
       | system performs on-device matching using a database of known CSAM
       | image hashes provided by NCMEC and other child safety
       | organizations."
       | (https://web.archive.org/web/20210805194603/https://www.apple...)
        
       | underseacables wrote:
       | Still not good enough. That's like an invading army saying they
       | have a "kinder, gentler machine gun hand."
        
       | mikeiz404 wrote:
       | Apple's published paper from the article talks generally about
       | the CSAM scanning threat protections starting on page 5 here --
       | https://www.apple.com/child-safety/pdf/Security_Threat_Model...
       | 
        | Side note: these protections rely on the client (phone) services
        | and their subsequent updates being implemented correctly and as
        | stated in order for these claims to hold. So trust in Apple is
        | inherently required, much like trust is required in the developer
        | of any closed-source client which is supposed to enforce security
        | guarantees (not that open source guarantees security, but it does
        | potentially add evidence in its favor). They do state that
        | security audits are done.
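        | 
        | IIRC the same PDF also says Apple will publish a root hash of the
        | encrypted database shipped with each OS version, so the copy on a
        | device can be compared against it. A minimal sketch of that kind
        | of check (the file path, the published digest, and the use of a
        | plain SHA-256 are placeholders, not Apple's actual format):
        | 
        |     import hashlib
        | 
        |     def root_hash(db_path: str) -> str:
        |         """SHA-256 digest of the raw (encrypted) database blob,
        |         read in 1 MiB chunks."""
        |         digest = hashlib.sha256()
        |         with open(db_path, "rb") as f:
        |             for chunk in iter(lambda: f.read(1 << 20), b""):
        |                 digest.update(chunk)
        |         return digest.hexdigest()
        | 
        |     # Both values are placeholders for illustration only.
        |     published_root_hash = "0123abcd..."
        |     on_device_copy = "/path/to/encrypted_csam_db.bin"
        | 
        |     print(root_hash(on_device_copy) == published_root_hash)
        | 
        | That still only proves the database blob matches what Apple
        | published, not what's in it, so the trust point above stands.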
        
       | fennecfoxen wrote:
       | In before China starts diplomatically leaning on "multiple
       | nations" to flag dissident imagery for review.
        
       | dkersten wrote:
        | So they're still retaining the ability to look on users' devices?
       | I have nothing against them looking at the abuse material, I have
       | a problem with them having power over my device, which could be
       | piggy-backed on by state actors (or others) for their own
       | purposes later.
       | 
       | Nobody is against them trying to prevent child sexual abuse,
       | pretty sure we all agree that fighting that is important, but
       | doing so by creating what is essentially a back door of sorts
        | into my devices isn't the right approach.
       | 
       | It sounds to me that this is still allowing them that access, so
       | this changes absolutely nothing.
        
         | dwighttk wrote:
          | Don't use iCloud Photos
        
           | thoughtstheseus wrote:
           | Don't use Apple products. The scanning is on device.
        
             | uninformedhn wrote:
             | ...if you use iCloud Photos.
             | 
              | How is it day 47 of HN arguing about this and you all still
              | can't get basic facts right? And oh boy, you sure showed me,
              | downvoting me twice in 60 seconds for pointing out a lack of
              | fundamental comprehension of what you're actually upset
              | about.
             | 
             | This community is obnoxious
        
           | wilkystyle wrote:
           | This is as unhelpful as saying "don't have CSAM on your
           | phone." As has been reiterated many times, the problem is not
           | the capability it is claimed to start off with, it is how the
           | capability will be evolved, abused, and exploited as time
           | goes on.
        
           | justinzollars wrote:
           | You need to read what they are doing. It's on-device
           | scanning. You have no privacy or choice. May as well have a
           | Facebook Phone.
        
       ___________________________________________________________________
       (page generated 2021-08-13 23:00 UTC)