[HN Gopher] Apple's Plan to "Think Different" About Encryption O...
       ___________________________________________________________________
        
       Apple's Plan to "Think Different" About Encryption Opens a Backdoor
       to Your Life
        
       Author : bbatsell
       Score  : 734 points
       Date   : 2021-08-05 20:20 UTC (2 hours ago)
        
 (HTM) web link (www.eff.org)
 (TXT) w3m dump (www.eff.org)
        
       | mcone wrote:
        | I wish there were a privacytools.io for hardware. I've been an
       | iPhone user since the beginning but now I'm interested in
       | alternatives. Last I checked, PinePhone was still being actively
       | developed. Are there any decent phones that strike a balance
       | between privacy and usability?
        
         | teddyh wrote:
          | The Librem 5 [1] is both more powerful than the PinePhone, and
          | is slated [2] to get RYF certification [3] from the FSF.
         | 
         | 1. https://puri.sm/products/librem-5/
         | 
         | 2. https://puri.sm/posts/librem-5-update-shipping-estimates-
         | and...
         | 
         | 3. https://ryf.fsf.org/
        
         | Knighttime wrote:
         | There are tons of devices compatible with LineageOS. I suggest
         | taking a look there. https://lineageos.org/
        
           | kivlad wrote:
           | I'd go a step further and recommend https://grapheneos.org/
           | with a Pixel phone.
        
             | Knighttime wrote:
              | That too! It's restricted to Pixel devices though, and (I'm
              | not 100% sure on this, but it at least doesn't ship with
              | it) doesn't support things like microG, which is a must for
              | getting some apps that rely on Play Services to work
              | correctly. I really think Graphene is only good for
              | hardcore privacy and security enthusiasts, or for
              | situations that actually require the security. I guess it
              | just depends on how much convenience you want to sacrifice.
        
             | josh_today wrote:
             | Serious question- how can anyone know these operating
             | systems are truly secure? Is there a way to test the source
             | code? From a code perspective could Google have placed a
             | back door in Android to access these forks?
        
       | nicetryguy wrote:
        | I'm looking forward to this platform being expanded to facially
        | ID people against more databases, such as criminals, political
        | dissidents, or anyone with an undesirable opinion, so that SWAT
        | teams can barge into the homes of false-positive matches to
        | murder them and their dogs.
        
       | babesh wrote:
       | Apple is part of the power structure of the US. That means that
       | it has a hand in shaping the agenda for the US but with that
       | power comes the responsibility to carry out the agenda.
       | 
       | This also means that it is shielded from attack by the power
       | structure. That is the bargain that the tech industry has struck.
       | 
       | The agenda is always towards increasing power for the power
       | structure. One form of power is information. That means that
       | Apple is inexorably drawn towards increasing surveillance. Also,
       | Apple's massive customer base both domestic and overseas is a
       | juicy surveillance target.
        
         | babesh wrote:
         | And if you don't believe me, ask yourself who holds the keys to
         | iCloud data for both foreign and domestic customers. Ask Apple
         | if it has ever provided data for a foreign customer to the US
         | government. What do you think GDPR is for?
         | 
         | Hint: it isn't end to end encrypted, Apple doesn't need your
         | password to read the information, and you will never know
         | 
         | Who the frack would design a system that way and why?
        
         | babesh wrote:
          | The die was cast with the 2020 elections when Apple decided to
          | get into the fray. Much of tech also got into the fray. Once
          | they openly decided to use their power, they couldn't get back
          | out.
        
       | strictnein wrote:
       | This is an excellent example of how far off the rails the EFF has
       | gone. This is completely false:
       | 
       | > "Apple is planning to build a backdoor into its data storage
       | system and its messaging system"
        
         | Kaytaro wrote:
         | How so? That's literally what it is.
        
           | shuckles wrote:
           | None of the announcements describe an iMessage back door,
           | even if you're being extremely generous about what back door
           | means.
        
       | thedream wrote:
       | The Cult Of The Apple hawks its slimy surveillance Snake Oil to a
       | gluttonous throng of thralls.
       | 
       | So where's the news?
        
       | everyone wrote:
        | When you upload any build to the App Store, before you can have
        | it in TestFlight or submit it for release, you have to fill out
        | this questionnaire asking "does your app use encryption?" If you
        | say yes, you're basically fucked, good luck releasing it... You
        | have to say no as far as I'm aware.
        
       | arihant wrote:
        | I'm very concerned that a bunch of false positives will send
        | people's nudes to Apple for manual review. I don't trust Apple's
        | on-device ML for something this sensitive. I also can't imagine
        | that Apple won't now be forced to implement government-mandated
        | filtering and reporting on iMessage. And this will likely affect
        | others like WhatsApp, because now governments know that there is
        | a way to do this on E2E systems.
       | 
       | What are some other fully encrypted photo options out there?
        
       | [deleted]
        
       | hncurious wrote:
       | Apple employees successfully pressured their employer to fire a
       | new hire and are petitioning to keep WFH.
       | 
       | https://www.vox.com/recode/2021/5/13/22435266/apple-employee...
       | 
       | https://www.vox.com/recode/22583549/apple-employees-petition...
       | 
       | Will they apply that energy and leverage to push back on this?
       | 
       | How else can this be stopped before it goes too far? Telling
       | people to drop Apple is even less effective than telling people
       | to delete Facebook.
        
         | lijogdfljk wrote:
         | I doubt this will be as clean. A large swath of people will
         | defend this "for the children".
        
       | mccorrinall wrote:
       | They are putting their own users under surveillance. Didn't
       | expect that from Apple.
        
       | triska wrote:
       | I remember an Apple conference where Tim Cook personally assured
       | us that Apple is fully committed to privacy, that everything is
       | so secure because the iPhone is so powerful that all necessary
       | calculations can happen on the device itself, and that we are
       | "not the product". I think the Apple CEO said some of this in the
       | specific context of speech processing, yet it seemed a specific
       | case of a general principle upheld by Apple.
       | 
       | I bought an iPhone because the CEO seemed to be sincere in his
       | commitment to privacy.
       | 
       | What Apple has announced here seems to be a complete reversal
       | from what I understood the CEO saying at the conference only a
       | few years ago.
        
         | avnigo wrote:
          | I'm still waiting on the iCloud backup encryption they
          | promised a while back. There were reports that they scrapped
          | those plans because the FBI told them to, but nothing official
          | has been announced since 2019.
        
           | minsc__and__boo wrote:
            | Yet Apple gave the Chinese government access to all Chinese
            | users' iCloud data, including messages, emails, pictures,
            | etc.
            | 
            | NYT's The Daily had an episode where they talked about how
            | the CCP is getting Apple to bend its commitment to privacy:
           | 
           | https://www.nytimes.com/2021/06/14/podcasts/the-
           | daily/apple-...
        
         | Klonoar wrote:
         | I think the EFF is probably doing good by calling attention to
         | the issue, but let's... actually look at the feature before
         | passing judgement, e.g:
         | 
         | https://twitter.com/josephfcox/status/1423382200880439298/ph...
         | 
          | - It's run for Messages in cases where a child is potentially
          | viewing sexually explicit material.
         | 
         | - It's run _before upload to iCloud Photos_ - where it would've
         | already been scanned anyway, as they've done for years (and as
         | all other major companies do).
         | 
         | To me this really doesn't seem that bad. Feels like a way to
         | actually reach encrypted data all around while still meeting
         | the expectations of lawmakers/regulators. Expansion of the tech
         | would be something I'd be more concerned about, but considering
         | the transparency of it I feel like there's some safety.
         | 
         | https://www.apple.com/child-safety/ more info here as well.
        
           | aaomidi wrote:
           | > - It's run _before upload to iCloud Photos_ - where it
           | would've already been scanned anyway, as they've done for
           | years (and as all other major companies do).
           | 
           | Then why build this functionality at all? Why not wait until
           | it's uploaded and check it on their servers and not run any
           | client side code? This is how literally every other non-
           | encrypted cloud service operates.
        
             | Klonoar wrote:
             | I assume (and this is my opinion, to be ultra-clear) that
             | it's a blocker for E2E encryption. As we've seen before,
              | they wanted to do it but backed off after government
             | pressure. It wouldn't surprise me if this removes a
             | blocker.
             | 
             | Apple has shown that they prefer pushing things to be done
             | on-device, and in general I think they've shown it to be a
             | better approach.
        
               | aaomidi wrote:
               | That really makes little to no sense - it's not E2EE if
               | you're going to be monitoring files that enter the
                | encrypted storage. That's snake oil encryption at that
               | point.
               | 
               | I sincerely doubt Apple is planning to do E2EE with
               | iCloud storage considering that really breaks a lot of
               | account recovery situations & is generally a bad UX for
               | non-technical users.
               | 
               | They're also already scanning for information on the
               | cloud anyway.
        
               | Klonoar wrote:
               | Eh, I disagree - your definition feels like moving the
               | goalposts.
               | 
               | Apple is under no obligation to host offending content.
               | Check it before it goes in (akin to a security checkpoint
               | in real life, I guess) and then let me move on with my
               | life, knowing it couldn't be arbitrarily vended out to x
               | party.
        
               | philistine wrote:
               | Going on with your life in this situation means police
               | officers have been given copies of the photos that
               | triggered the checkpoint. Do you want that?
        
               | pseudalopex wrote:
               | Apple's paper talks about decrypting suspect images. It
               | isn't end to end.[1]
               | 
               | [1] https://www.apple.com/child-
               | safety/pdf/CSAM_Detection_Techni...
        
               | Klonoar wrote:
               | Feel free to correct me if I'm wrong, but this is a
               | method for decrypting _if it's matching an already known
               | or flagged item_. It's not enabling decrypting arbitrary
               | payloads.
               | 
               | From your link:
               | 
               | >In particular, the server learns the associated payload
               | data for matching images, but learns nothing for non-
               | matching images.
               | 
               | Past this point I'll defer to actual cryptographers (who
               | I'm sure will dissect and write about it), but to me this
               | feels like a decently smart way to go about this.
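                | 
                | To make that property concrete, here's a toy sketch in
                | Python of "the server learns payloads only for matching
                | hashes". This is emphatically NOT Apple's actual
                | protocol (the real one uses elliptic-curve blinded PSI
                | plus threshold secret sharing); every name below is
                | made up:
                | 
                |     import hashlib
                | 
                |     def key_for(h: bytes) -> bytes:
                |         # Toy KDF: the voucher key is derived
                |         # from the image's own hash.
                |         return hashlib.sha256(b"k:" + h).digest()
                | 
                |     def stream(key: bytes, n: int) -> bytes:
                |         # Toy keystream; not production crypto.
                |         out, i = b"", 0
                |         while len(out) < n:
                |             out += hashlib.sha256(
                |                 key + bytes([i])).digest()
                |             i += 1
                |         return out[:n]
                | 
                |     def seal(key: bytes, msg: bytes) -> bytes:
                |         m = b"OK" + msg  # marker proves right key
                |         return bytes(a ^ b for a, b in
                |                      zip(m, stream(key, len(m))))
                | 
                |     def unseal(key: bytes, box: bytes):
                |         m = bytes(a ^ b for a, b in
                |                   zip(box, stream(key, len(box))))
                |         return m[2:] if m.startswith(b"OK") else None
                | 
                |     # Client: the voucher is sealed under a key
                |     # derivable only from the photo's own hash.
                |     photo = hashlib.sha256(b"a photo").digest()[:16]
                |     voucher = seal(key_for(photo), b"low-res copy")
                | 
                |     # Server: it can open vouchers only for hashes
                |     # already in its database; the rest stay opaque.
                |     known = {photo,
                |              hashlib.sha256(b"other").digest()[:16]}
                |     for h in known:
                |         got = unseal(key_for(h), voucher)
                |         if got is not None:
                |             print("match:", got)
                | 
                | (In the real design a single match still isn't enough:
                | a threshold of matches has to accumulate before
                | anything can be decrypted.)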
        
               | pseudalopex wrote:
               | Matching means suspect. It doesn't have to be a true
               | match.
               | 
               | It could be worse. But end to end means the middle has no
               | access. Not some access.
        
               | aaomidi wrote:
               | And remember the E2EE is pure speculation at this point.
        
               | aaomidi wrote:
               | Then don't offer "E2EE"
        
           | xienze wrote:
           | > Expansion of the tech would be something I'd be more
           | concerned about
           | 
           | Yeah, and that's precisely what will happen. It always starts
           | with child porn, then they move on to "extremist content", of
           | which the term expands to capture more things on a daily
           | basis. Hope you didn't save that "sad Pepe" meme on your
           | phone.
        
           | kps wrote:
           | > _considering the transparency of it_
           | 
           | What transparency? Apple doesn't publish iOS source.
        
           | mapgrep wrote:
           | > It's run _before upload to iCloud Photos_ - where it
           | would've already been scanned anyway
           | 
           | Right, so ask yourself, why is it on the device? Why not just
           | scan on the server?
           | 
           | To me (agreeing with much of the commentary I've seen) the
            | likeliest answer is that they are confining the scan to
            | pre-upload photos for now, not for any technical reason but
            | to make the rollout palatable to the public. Then they're
            | one update away
           | from quietly changing the rules. There's absolutely no reason
           | to do the scan on your private device if they plan to only
           | confine this to stuff they could scan away from your device.
        
           | karaterobot wrote:
           | Since nobody would ever object to it, protecting against
           | child abuse gets used as a wedge. As the article points out,
           | the way this story ends is with this very backdoor getting
           | used for other things besides preventing child abuse:
           | anything the government asks Apple to give them. It's an
           | almost inevitable consequence of creating a backdoor in the
           | first place, which is why you have to have a zero-tolerance
           | policy against it.
        
           | [deleted]
        
           | randcraw wrote:
           | So your argument is, if you've done nothing wrong, you have
           | nothing to worry about. Really? Will you feel the same when
           | Apple later decides to include dozens more crimes that they
           | will screen for, surreptitiously? All of which are searches
           | without warrants or legal oversight?
           | 
           | Let me introduce you to someone you should know better. His
           | name is Edward Snowden. Or Louis Brandeis, who is spinning in
           | his grave right about now.
           | 
           | The US Fourth Amendment exists for a damned good reason.
        
             | fredgrott wrote:
              | Hmm, seems to me that since most smart criminals understand
              | not to leave a digital footprint, what Apple will catch is
              | people who are idiots and make an honest mistake, and
              | people who are dumb enough to put their illegality online.
              | 
              | So I would ask US lawmakers: why can't the phone companies
              | make the same commitments? The reasoning seems to be that
              | we have bad people committing crimes using digital
              | communication devices.
              | 
              | Last time I checked, the digital pipeline, i.e. phone
              | lines, is still under FCC rules, is it not?
              | 
              | If they answer that it's too hard technically, then why
              | can't Apple make the same argument to lawmakers?
        
             | Klonoar wrote:
             | You do realize you could get this message across without
             | the needlessly arrogant tone, yeah? All it does is make me
             | roll my eyes.
             | 
             | Anyway, that wasn't my stated position. I simply pointed
             | out that this is done for a subset of users (where there's
             | already existing reasons to do so, sub-13 and all) and that
             | on syncing to iCloud this _already happens anyway_.
             | 
             | I would gladly take this if it removes a barrier to making
             | iCloud E2E encrypted; they are likely bound to do this type
             | of detection, but doing it client-side before syncing feels
             | like a sane way to do it.
        
               | kickopotomus wrote:
               | > I would gladly take this if it removes a barrier to
               | making iCloud E2E encrypted; they are likely bound to do
               | this type of detection, but doing it client-side before
               | syncing feels like a sane way to do it.
               | 
               | But there is an issue there. Now there is a process on
               | your phone capable of processing unencrypted data on your
               | phone and communicating with the outside world. That is
               | spyware which will almost certainly be abused in some
               | way.
        
               | xondono wrote:
               | > Now there is a process on your phone capable of
               | processing unencrypted data on your phone and
               | communicating with the outside world.
               | 
               | What? That's what all apps _by definition_ do. My retinas
               | can't do decryption yet!
        
               | JackGreyhat wrote:
                | Actually, I don't think it will remove a barrier for
                | iCloud E2E encryption at all. On the contrary. All it
                | will remove is the barrier for what we find acceptable
                | for companies like Apple to implement. I think Apple made
               | a very intrusive move, one that we will come to accept
               | over time. After that, a next move follows...and so on.
               | That's the barrier being moved. A point will be reached
               | when E2E encryption is nothing more than a hoax, a non-
               | feature with no added value. A mirage of what it is
               | supposed to be. All of these things are implemented under
               | the Child Protection flag. Sure, we need child
               | protection, we need it badly, but the collateral is huge
               | and quite handy too for most 3 letter agencies. I don't
               | have the solution. The other day my 3 year old son had a
               | rash, I took pictures of it over the course of a few
               | days. A nude little boy, pictures from multiple angles. I
               | showed my dermatologist. What will happen in the future?
               | Will my iPhone "flag" me as a potential child predator?
               | Can I tell it I'm a worried dad? Do I even have to be
               | thinking about these things?
        
           | thesimon wrote:
           | "Feels like a way to actually reach encrypted data all around
           | while still meeting the expectations of lawmakers/regulators"
           | 
           | And isn't that a problem? Encrypted data should be secure,
           | even if lawmakers don't want math to exist.
        
             | Klonoar wrote:
             | Your data should be encrypted on Apple's servers and
              | unreadable by them; at least, that's what I want from
              | Apple.
             | They are likely bound to scan and detect for this kind of
             | abusive content.
             | 
             | This handles that client-side instead of server side, and
             | if you don't use iCloud photos, it doesn't even affect you.
             | If syncing? Sure, decrypt it on device and check it before
             | uploading - it's going to their servers after all.
             | 
              | Don't want to even go near this? Don't use Messages or
             | iCloud, I guess. Very possible to use iOS/iDevices in a
             | contained manner.
        
           | wayneftw wrote:
           | It runs on _my device_ and uses my CPU, battery time and my
            | network bandwidth (to download/upload the hashes and other
           | necessary artifacts).
           | 
           | I'd be fine with them scanning stuff I uploaded to them with
            | their own computers, because I don't have any real
           | expectation of privacy from huge corporations.
        
             | kps wrote:
             | Apple already uses 'your' CPU, battery time and network
             | bandwidth for its Find My / AirTag product.
        
               | babesh wrote:
               | You can turn it off.
        
             | Klonoar wrote:
             | I feel like this argument really doesn't add much to the
             | discussion.
             | 
             | It runs only on a subset of situations, as previously noted
             | - and I would be _shocked_ if this used more battery than
             | half the crap running on devices today.
             | 
             | Do you complain that Apple runs code to find moments in
             | photos to present to you periodically...?
        
               | aaomidi wrote:
               | What is the point of running this on device? The issue
               | here is now Apple has built and is shipping what is
               | essentially home-phoning malware that can EASILY be
                | required with a court order to do something entirely
                | different from what it is designed to do.
               | 
               | They're opening themselves to being forced by 3 letter
               | agencies around the world to do some really fucked up
               | shit to their users.
               | 
               | Apple should never have designed something that allows
               | for fingerprinting of files & users for stuff stored on
               | their own device.
        
               | Klonoar wrote:
               | Your entire argument could be applied to iOS itself. ;P
        
               | aaomidi wrote:
               | Not really, iOS didn't really have the capability of
               | scanning and reporting files based on a database received
               | by the FBI/other agencies.
               | 
               | There is a big difference when this has been implemented
               | & deployed to devices. Fighting questionable subpoenas
               | and stuff becomes easier when you don't have the
               | capability.
        
               | wayneftw wrote:
               | > I feel like this argument really doesn't add much to
               | the discussion.
               | 
               | Oh, I guess I should have just regurgitated the Apple
               | press release like the gp?
               | 
               | > It runs only on a subset of situations...
               | 
               | For now. But how does that fix the problem of them using
               | my device and my network bandwidth?
               | 
               | > I would be _shocked_ if this used more battery than
               | half the crap running on devices today.
               | 
               | You think you'll be able to see how much it uses?
               | 
               | > Do you complain that Apple runs code to find moments in
               | photos to present to you periodically...?
               | 
               | Yes. I hate that feature, it's a waste of my resources.
               | I'll reminisce when I choose to, I don't need some
                | garbage bot to trawl my stuff for memories. I probably
               | already have it disabled, or at least the notifications
               | of it.
        
           | lights0123 wrote:
           | My big issue is what it opens up. As the EFF points out, it's
           | really not a big leap for oppressive governments to ask Apple
           | to use the same tech (as demoed by using MS's tech to scan
           | for "terrorist" content) to remove content they don't like
           | from their citizens' devices.
        
             | acdha wrote:
             | That's my concern: what happens the first time a government
             | insists that they flag a political dissident or symbol? The
             | entire system is opaque by necessity for its original
             | purpose but that seems to suggest it would be easy to do
              | things like serve custom fingerprints to particular users
             | without anyone being any the wiser.
        
               | philistine wrote:
               | My heart goes to the queer community of Russia, whose
               | government will pounce on this technology in a heartbeat
               | and force Apple to scan for queer content.
        
               | acdha wrote:
               | They'd have many other countries keeping them company,
               | too.
               | 
               | One big mess: how many places would care about false
               | positives if that gave them a pretext to arrest people? I
               | do not want to see what would happen if this
               | infrastructure had been available to the Bush
               | administration after 9/11 and all of the usual ML failure
               | modes played out in an environment where everyone was
               | primed to assume the worst.
        
           | vimy wrote:
           | Teens are also children. Apple has no business checking if
            | they send or receive nude pics, let alone telling their
            | parents. This is very creepy behavior from Apple.
           | 
           | Edit: I'm talking about this https://pbs.twimg.com/media/E8DY
           | v9hWUAksPO8?format=jpg&name=...
        
             | xondono wrote:
             | Call me crazy, but if your 13yo is sending nudes, I think
             | that as a parent you want to know that.
             | 
             | Current society is pushing a lot of adult behavior into
             | kids, and they don't always understand the consequences of
             | their actions.
             | 
             | Parents can't inform their kids if they aren't aware.
        
             | hb0ss wrote:
              | The age at which Apple defines someone as a child differs
              | per legal region; for the US it is set to 13 years or
              | younger. Also, your parents
             | need to have added your account to the iCloud family for
             | the feature to work.
        
             | Closi wrote:
             | That's not what this solution is doing, it's checking a
             | hash of the photo against a hash of known offending
             | content.
             | 
             | If someone sends nude pics there is still no way to tell
             | that it's a nude pic.
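              | 
              | (Strictly it's a fuzzy/perceptual hash compared within a
              | distance threshold, not an exact hash. A toy sketch of
              | that kind of comparison, with made-up 64-bit hashes and a
              | made-up threshold; real systems like PhotoDNA or
              | NeuralHash differ in the details:)
              | 
              |     def hamming(a: int, b: int) -> int:
              |         # Differing bits between two 64-bit hashes.
              |         return bin(a ^ b).count("1")
              | 
              |     KNOWN = {0x9F3A5C7E12B4D688, 0x0123456789ABCDEF}
              |     LIMIT = 4  # max differing bits for a "match"
              | 
              |     def is_match(h: int) -> bool:
              |         return any(hamming(h, k) <= LIMIT
              |                    for k in KNOWN)
              | 
              |     print(is_match(0x9F3A5C7E12B4D680))  # True: 1 bit off
              |     print(is_match(0x7777777777777777))  # False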
        
               | [deleted]
        
               | randcraw wrote:
               | Nude pic ID is routine online. Facebook developed this
               | capability over 5 years ago and employs it liberally
               | today, as do many other net service providers.
        
               | mrits wrote:
               | Not true. We don't know how the fuzzy hash is working.
                | It's very likely a lot of nudes would fall within the
                | threshold Apple has set.
        
               | krrrh wrote:
               | That's only the first part of what was announced and
               | addressed in the article.
               | 
                | The other part uses on-device machine learning to scan
                | for nude pics a child is intending to send, securely
                | notifying the child and then the parents within the
                | family account. The alert that the kids get by itself
               | will probably be enough to stop a lot of them from
               | sending the pic in the first place.
        
               | slownews45 wrote:
               | I'm a parent. It's weird seeing HN push against this.
               | 
               | This sounds like a feature I'd like
        
               | [deleted]
        
               | philistine wrote:
               | I agree with you in principle, but I also know that kids
               | will soon find methods of sharing that defeat any scans.
               | Other apps and ephemeral websites can be used to escape
               | Apple's squeaky-clean version of the world.
        
               | slownews45 wrote:
               | Sure - that's fine.
               | 
               | But if I'm picking a phone for my kid, and my choice is
               | this (even if imperfect) and the HN freedomFone - it's
               | going to be Apple. We'll see what other parents decide.
        
               | Bud wrote:
               | False, and this shows that you didn't read the entire
               | article. You should go and do that.
        
               | [deleted]
        
               | artimaeis wrote:
               | You're conflating the CSAM detection of photos uploaded
               | to iCloud with the explicit detection for child devices.
               | The latter is loosely described here:
               | https://www.apple.com/child-safety/.
               | 
               | > Messages uses on-device machine learning to analyze
               | image attachments and determine if a photo is sexually
               | explicit. The feature is designed so that Apple does not
               | get access to the messages.
        
             | Spooky23 wrote:
             | It would be if that were what they were doing. They are
             | not.
        
               | Bud wrote:
               | Yes, it is, and you need to read the entire article.
        
               | spiderice wrote:
               | I think you're both partially right.
               | 
               | > In these new processes, if an account held by a child
               | under 13 wishes to send an image that the on-device
               | machine learning classifier determines is a sexually
               | explicit image, a notification will pop up, telling the
               | under-13 child that their parent will be notified of this
               | content. If the under-13 child still chooses to send the
               | content, they have to accept that the "parent" will be
               | notified, and the image will be irrevocably saved to the
               | parental controls section of their phone for the parent
               | to view later. For users between the ages of 13 and 17, a
               | similar warning notification will pop up, though without
               | the parental notification.
               | 
               | This specifically says that it will not notify the
               | parents of teens, as GGP claims. So GP is right that
               | Apple isn't doing what GGP claimed. However I still think
               | you might be right that GP didn't read the full article
               | and just got lucky. Lol.
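                | 
                | As pseudocode, the quoted policy is roughly the
                | following (a sketch only; the classifier and every name
                | here are hypothetical stand-ins, not Apple's API):
                | 
                |     def looks_explicit(img: bytes) -> bool:
                |         # Stand-in for the on-device classifier.
                |         return b"explicit" in img
                | 
                |     def send_flow(age, img, confirms):
                |         if not looks_explicit(img):
                |             return ["send"]
                |         if age < 13:
                |             out = ["warn: parent will be notified"]
                |             if confirms:
                |                 out += ["send", "notify parent",
                |                         "save for parent to view"]
                |         elif age <= 17:
                |             out = ["warn only"]  # no parent notice
                |             if confirms:
                |                 out.append("send")
                |         else:
                |             out = ["send"]  # adults: no checks
                |         return out
                | 
                |     print(send_flow(12, b"explicit pic", True))
                |     print(send_flow(16, b"explicit pic", True))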
        
             | krrrh wrote:
             | Parents do have a legal and moral responsibility to check
             | on their children's behaviour, and that includes teens.
             | It's somewhat analogous to a teacher telling parents about
             | similar behaviour taking place at school.
             | 
             | I suspect a lot of how people feel about this will come
             | down to whether they have kids or not.
        
               | anthk wrote:
               | I don't know about your country but in mine in Europe
               | teens have privacy rights OVER their parents' paranoia.
               | 
                | That includes secrecy in private communications and, of
                | course, privacy of their own data on smartphones.
        
             | dhosek wrote:
             | The fact that teens are children means that if, say a 16-yo
             | sends a nude selfie to their s.o., they've just committed a
             | felony (distributing child pornography) that can have
             | lifelong consequences (thanks to hysterical laws about sex
             | offender registries, both kids could end up having to
             | register as sex offenders for the rest of their life and
             | will be identified as having committed a crime that
             | involved a minor. Few if any of the registries would say
             | more than this and anyone who looks in the registry will be
             | led to believe that they molested a child and not shared a
             | selfie or had one shared with them). The laws may not be
             | just or correct, but they are the current state of the
             | world. Parents need to talk to their kids about this sort
              | of thing, and this seems one of the less intrusive ways for
             | them to discover that there's an issue. If it were
             | automatically shared with law enforcement? That would be a
             | big problem (and a guarantee that my kids don't get access
              | to a device until they're 18), but I'm not ready [1] to
              | be up in arms about this yet.
             | 
             | 1. I reserve the right to change my mind as things are
             | revealed/developed.
        
               | jfjsjcjdjejcosi wrote:
               | > if, say a 16-yo sends a nude selfie to their s.o.,
               | they've just committed a felony ... The laws may not be
               | just or correct, but they are the current state of the
               | world.
               | 
               | Hence, strong E2E encryption designed to prevent unjust
               | government oppression, _without_ backdoors.
               | 
               | Parents should talk to their teenagers about sex
                | regardless of whether they get a notification on their
                | phone
               | telling them they missed the boat.
        
               | SahAssar wrote:
               | I get your points, but the end result is that the client
               | in an E2EE system can no longer be fully trusted to act
                | on the client's behalf. That seems alarming to me.
        
               | philistine wrote:
               | I'd argue that the problem of minors declared sex
               | offenders for nude pictures has reached a critical mass
               | that scares me. At this point, sex offenders of truly
               | vile things can hide by saying that they are on a sex
               | offender registry because of underage selfies. And I
               | think most people will believe them.
        
               | wccrawford wrote:
               | I worked with someone that claimed this, years ago. And
               | they were still young enough that I believed them.
        
               | anthk wrote:
               | > The laws may not be just or correct, but they are the
               | current state of the world
               | 
               | America, not Europe. Or Japan.
        
               | bambax wrote:
               | > _they 've just committed a felony (distributing child
               | pornography)_
               | 
               | In the US, maybe (not sure if this is even true in all
               | states), but not in most other countries in the world,
               | where a 16-year-old is not a child, nudity is not a
               | problem, and "sex offender registries" don't exist.
               | 
               | The US is entitled to make its own (crazy, ridiculous,
               | stupid) laws, but we shouldn't let them impose those on
               | the rest of us.
        
             | nerdponx wrote:
             | > Apple has no business checking if they send or receive
             | nude pics. Let alone tell their parents.
             | 
             | Some people might disagree with you.
             | 
             | There are people out there who are revolted by the
             | "obviously okay" case of 2 fully-consenting teenagers
             | sending each other nude pics, without any coercion, social
             | pressure, etc.
             | 
             | Not to mention all the gray areas and "obviously not okay"
             | combinations of ages, circumstances, number of people
             | involved, etc.
        
               | slownews45 wrote:
               | It will be the parents who are deciding this -
               | particularly if they are buying these phones.
               | 
               | If parents don't like this feature, they can buy a
               | lineage OS type phone. If parents do they will buy this
               | type of phone for their kids.
        
               | philistine wrote:
                | Big correction: it will be the other kid's parents who
                | will decide for your kid. Apple will give your child's
                | picture to the other kid's parents.
               | 
               | That's terrifying.
        
               | slownews45 wrote:
               | What a lie - I'm really noticing an overlap between folks
               | fighting this type of stuff (which as a parent I want)
               | and folks just lying horribly.
               | 
               | "if a child attempts to send an explicit photo, they'll
               | be warned before the photo is sent. Parents can also
               | receive a message if the child chooses to send the photo
               | anyway." - https://techcrunch.com/2021/08/05/new-apple-
               | technology-will-...
               | 
               | So this gives kids a heads up that they shouldn't send
               | it, and that if they do, their parent will be notified.
               | So that's me in case you are not reading this clearly.
               | 
                | Now yes, if someone is SENDING porn to my child from a
                | non-child account, I as the parent will be notified.
                | Great.
               | 
               | If this is terrifying - that's a bit scary! HN is going a
               | bit off the rails these days.
               | 
               | Allow me to make a prediction - users are going to like
                | this - it will INCREASE their trust in Apple in terms of
               | a company trying to keep them and their family safe.
               | 
               | I just looked it up - Apple is literally the #1 brand
               | globally supposedly in 2021. So they are doing the right
               | thing in customers minds so far.
        
               | thomastjeffery wrote:
               | That's entirely GP's point: preferring to cater to those
               | people affects the rest of us in a way we find
               | detrimental.
        
             | Klonoar wrote:
             | You describe it as if Apple's got people in some room
             | checking each photo. It's some code that notifies their
             | parents in certain situations. ;P
             | 
             | I know several parents in just my extended circle alone
             | that would welcome the feature, so... I just don't think I
             | agree with this statement. These parents already resort to
             | other methods to try and monitor their kids but it's
             | increasingly (or already) impossible to do so.
             | 
             | I suppose we should also take issue with Apple letting
             | parents watch their kids location...?
        
             | strictnein wrote:
             | This is not how the system works at all.
        
               | vimy wrote:
               | https://pbs.twimg.com/media/E8DYv9hWUAksPO8?format=jpg&na
               | me=... What am I reading wrong?
        
               | strictnein wrote:
               | Already answered here:
               | https://news.ycombinator.com/item?id=28079919
        
               | Bud wrote:
               | Answered incorrectly. You need to read the rest of the
               | article.
        
           | dabbledash wrote:
           | The point of encrypted data is not to be "reached."
        
           | ElFitz wrote:
           | True. But, first, it also means anyone, anywhere, as long as
           | they use iOS, is vulnerable to what the _US_ considers to be
           | proper. Which, I will agree, likely won't be an issue in the
           | case of child pornography. But there's no way to predict how
            | that will evolve (see Facebook's ever-expanding imposition
            | of American cultural norms and puritanism).
           | 
           | Next, it also means they _can_ do it. And if it can be done
           | for child pornography, why not terrorism? And if it can be
            | done for the US' definition of terrorism, why not China's,
           | Russia's or Saudi Arabia's? And if terrorism and child
            | pornography, why not drug consumption? Tax evasion? Social
           | security fraud? Unknowingly talking with the wrong person?
           | 
           | Third, there _apparently_ is transparency on it today. But
            | who is to say its possible expansion won't be forcibly
            | silenced in the same way PRISM's requests were?
           | 
            | Fourth, but that's only because I am slightly a maniac: how
           | can anyone unilaterally decide to waste the computing power,
           | battery life and data plan of a device I paid for without my
           | say so? (probably one of my main gripes with ads)
           | 
           | All in all, it means I am incorporating into my everyday life
           | a device that can and will actively snoop on me and
           | potentially snitch on me. Now, while I am not worried _today_
           | , it definitely paves the way for many other things. And I
           | don't see why I should trust anyone involved to stop here or
           | let me know when they don't.
        
             | adventured wrote:
             | The US is very openly, publicly moving down the road called
             | The War on Domestic Terrorism, which is where the US
             | military begins targeting, focusing in on the domestic
             | population. The politicians in control right now are very
             | openly stating what their plans are. It's particularly
             | obvious what's about to happen, although it was obvious at
             | least as far back as the Patriot Act. The War on Drugs is
             | coming to an end, so they're inventing a new fake war to
             | replace it, to further their power. The new fake war will
             | result in vast persecution just as the last one did.
             | 
             | You can be certain what Apple's scanning is going to be
             | used for is going to widen over time. That's one of the few
             | obvious certainties with this. These things are a Nixonian
             | wet dream. The next Trump type might not be so politically
             | ineffectual; more likely that person will be part of the
             | system and understand how to abuse & leverage it to their
             | advantage by complying with it rather than threatening its
             | power as an outsider. Trump had that opportunity, to give
              | the system what it wanted; he was too obtuse and rigid to
             | understand he had to adapt or the machine would grind him
              | up (once he started removing the military apparatus that was
             | surrounding him, like Kelly and Mattis, it was obvious he
             | would never be allowed to win a second term; you can't keep
             | that office while being set against all of the military
             | industrial complex including the intelligence community,
             | it'll trip you up on purpose at every step).
             | 
             | The US keeps getting more authoritarian over time. As the
             | government gets larger and more invasive, reaching ever
             | deeper into our lives, that trend will continue. One of the
             | great, foolish mistakes that people make about the US is
             | thinking it can be soft and cuddly like Finland. Nations
             | and their governments are a product of their culture. So
             | that's not what you're going to get if you make the
             | government in the US omnipotent. You're going to get either
             | violent Latin American Socialism (left becomes dominant) or
             | violent European Fascism (right becomes dominant). There's
             | some kind of absurd thinking that Trump was right-wing, as
             | in anti-government or libertarian; Trump is a proponent of
             | big government, just as Bush was, that's why they had no
             | qualms about spending like crazy (look at the vast
             | expansion of the government under Bush); what they are is
             | the forerunners to fascism (which is part of what their
             | corporatism is), they're right wingers that love big
             | government, a super dangerous cocktail. It facilitates a
             | chain of enabling over decades; they open up pandora boxes
             | and hand power to the next authoritarian. Keep doing that
             | and eventually you're going to get a really bad outcome
             | (Erdogan, Chavez, Putin, etc) and that new leadership will
             | have extraordinary tools of suppression.
             | 
             | Supposed political extremists are more likely to be the
             | real target of what Apple is doing. Just as is the case
             | with social media targeting & censoring those people. The
             | entrenched power base has zero interest in change, you can
             | see that in their reaction to both Trump and Sanders. Their
             | interest is in maintaining their power, what they've built
             | up in the post WW2 era. Trump and Sanders, in their own
             | ways, both threatened what they constructed. Trump's chaos
             | threatened their built-up system, so the globalists in DC
             | are fighting back, they're going to target what they
             | perceive as domestic threats to their system, via their new
             | War on Domestic Terrorism (which will actually be a
             | domestic war on anyone that threatens their agenda). Their
             | goal is to put systems in place to ensure another outsider,
             | anyone outside of their system, can never win the
             | Presidency (they don't care about left/right, that's a
             | delusion for the voting class to concern themselves about;
             | the people that run DC across decades only care if the
             | left/right winner complies with their agenda; that's why
             | the Obamas and Clintons are able to be so friendly with the
             | Bushes (what Bush did during his Presidency, such as Iraq,
             | is dramatically worse than anything Trump did, and yet Bush
             | wasn't impeached, wasn't pursued like Trump was, the people
             | in power - on both sides - widely supported his move on
             | Iraq), they're all part of the same system so they
              | recognize that in each other, and reject a Trump or Sanders
             | outsider like an immune system rejecting a foreign object).
             | 
             | The persistent operators in DC - those that continue to
             | exist and push agenda regardless of administration hand-
             | offs - don't care about the floated reason for what Apple
             | is doing. They care about their power and nothing else.
             | That's why they always go to the Do It For The Kids
             | reasoning, they're always lying. They use whatever is most
             | likely to get their agenda through. The goal is to always
             | be expanding the amount of power they have (and that
             | includes domestically and globally, it's about them, not
             | the well-being of nations).
             | 
             | We're entering the era where all of these tools of
              | surveillance they've spent the past few decades putting
             | into place, will start to be put into action against
              | domestic targets en masse, where surveillance tilts over to
             | being used for aggressive suppression. That's what Big Tech
             | is giddily assisting with the past few years, the beginning
             | of that switch over process. The domestic population
             | doesn't want the forever war machine (big reasons Trump &
             | Sanders are so popular, is that both ran on platforms
             | opposed to the endless foreign wars); the people that run
             | DC want the forever war machine, it's their machine, they
             | built it. Something is going to give, it's obvious what
             | that's going to be (human liberty at home - so the forever
             | wars, foreign adventurism can continue unopposed).
        
             | chrsstrm wrote:
             | > is vulnerable to what the US considers to be proper
             | 
             | This stirs up all sorts of questions about location and the
             | prevailing standards in the jurisdiction you're in. Does
             | the set of hashes used to scan change if you cross an
             | international border? Is the set locked to whichever
             | country you activate the phone in? This could be a travel
             | nightmare.
        
               | judge2020 wrote:
                | As this isn't a list of things the U.S. finds prudish,
                | but actual images of children being abused, it doesn't
                | look like there are borders
               | [yet].
               | 
               | If the situation OP suggests happens in the form of
               | FBI/other orgs submitting arguably non-CSAM content, then
               | Apple wouldn't be complicit or any wiser to such an
               | occurrence unless it was after-the-fact. If it happens in
               | a way where Apple decides to do this on their own dime
               | without affecting other ESPs, I imagine they wouldn't
               | upset CCP by applying US guidance to Chinese citizen's
               | phones.
        
             | Klonoar wrote:
             | I think your points are mostly accurate, and that's why I
             | led with the bit about the EFF calling attention to it.
             | Something like this shouldn't happen without scrutiny.
             | 
             | The only thing I'm going to respond to otherwise is this:
             | 
             | >Fourth, but that's only because I slightly am a maniac,
             | how can anyone unilaterally decide to waste the computing
             | power, battery life and data plan of a device I paid for
             | without my say so? (probably one of my main gripes with
             | ads)
             | 
             | This is how iOS and apps in general work - you don't really
             | control the amount of data you're using, and you never did.
             | Downloading a changeset of a hash database is not a big
             | deal; I'd wager you get more push notifications with data
             | payloads in a day than this would be.
             | 
             | Battery life... I've never found Apple's on-device
             | approaches to be the culprit of battery issues for my
             | devices.
             | 
             | I think I'd add to your list of points: what happens when
             | Google inevitably copies this in six months? There really
             | is no competing platform that comes close.
        
               | falcolas wrote:
               | > what happens when Google inevitably copies this in six
               | months? There really is no competing platform that comes
               | close.
               | 
               | Then you have to make a decision about what matters more.
               | Convenience and features, or privacy and security.
               | 
               | I've made that decision myself. I'll spend a bit more
               | time working with less-than-perfect OSS software and
               | hardware to maintain my privacy and security.
        
             | pseudalopex wrote:
             | What transparency? The algorithm is secret. The reporting
             | threshold is secret. The database of forbidden content is
             | secret.
        
         | mtgx wrote:
         | It's all been downhill since we heard that they stopped
         | developing the e2e encrypted iCloud solution because it might
         | upset the FBI even more.
        
         | nerdponx wrote:
         | The cynical take is that Apple was _never_ committed to privacy
          | in and of itself, but they are committed to privacy as long as
         | it improves their competitive advantage, whether by marketing
         | or by making sure that only Apple can extract value from its
          | customers' data.
         | 
         | Hanlon's razor does not apply to megacorporations that have
         | enormous piles of cash and employ a large number of very smart
         | people, who are either entirely unscrupulous or for whom
         | scruples are worth less than their salaries. We probably aren't
         | cynical _enough_.
         | 
         | I am not arguing that we should always assume every change is
         | always malicious towards users. But our index of suspicion
         | should be high.
        
           | withinboredom wrote:
           | I'd say you're spot on, but I can't say why.
        
           | hpen wrote:
           | I've always been convinced that Apple cared about privacy as
            | a source of competitive advantage. I don't need them to be
           | committed morally or ethically, I just need them to be
           | serious about it because I will give them my money if they
           | are.
        
             | philistine wrote:
             | Tim Cook looks like he believes in money, first and
             | foremost. Anything goes second.
        
         | robertoandred wrote:
         | Except the hashing and hash comparison are happening on the
         | device itself.
        
           | zionic wrote:
           | That's even worse
        
         | dylan604 wrote:
         | It is secure, as long as you have nothing to hide. If you have
         | no offending photos, then the data won't be uploaded! See, it's
         | not nefarious at all! /s
        
         | JohnFen wrote:
         | > because the CEO seemed to be sincere in his commitment to
         | privacy.
         | 
         | The sincerity of a company officer, even the CEO, should not
         | factor into your assessment. Officers change over time (and
         | individuals can change their stance over time), after all.
        
       | unstatusthequo wrote:
       | 4th Amendment. Plaintiff lawyers gear up.
        
       | skee_0x4459 wrote:
       | wow. in the middle of reading that, i realized that this is a
       | watershed moment. why would apple go back on their painstakingly
       | crafted image and reputation of being staunchly pro-privacy? it's
       | not for the sake of the children lol. no, something happened that
       | has changed the equation for apple. some kind of decisive shift
       | has occurred. maybe apple has finally caved in to the chinese
       | market, like everyone else in the US, and is now making their
       | devices compatible with chinese surveillance. or maybe the US
       | government has finally managed to force apple to crack open its
       | shell of encryption in the name of a western-flavored
       | surveillance. but either way, i think it is a watershed moment
       | because securing privacy will from this moment onward be a fringe
       | occupation in the west. unless a competitor rises up, but that's
       | impossible because there aren't enough people who care about
       | privacy to sustain a privacy company. that's the real reason why
       | privacy has died today.
       | 
       | and also i think it's interesting how kids will adjust to this. i
       | think a lot of kids won't hear about this and will find themselves
       | caught up in a child porn case with foamy-mouthed parents of the
       | recipient sapping any reserves of rational motive that the police
       | might have left. because if i'm not mistaken you can be convicted
       | of child porn distribution even if it was images of your own body
       | taken by yourself being distributed.
        
       | cblconfederate wrote:
       | Makes you rally for NAMBLA
        
       | dukeofdoom wrote:
       | Technocrats are the new railway tycoons
        
       | new_realist wrote:
       | Studies have shown that CCTV reduces crime
       | (https://whatworks.college.police.uk/toolkit/Pages/Interventi...).
       | I expect results here will be even better.
       | 
       | This technology uses secret sharing to ensure a threshold of
       | matching images is met before photos are flagged. In this case,
       | it's even more private than CCTV.
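       | 
       | Roughly, the idea - a toy Shamir-style sketch in Python, not
       | Apple's exact construction - is that each matching photo
       | reveals one share, and the account-level secret only becomes
       | recoverable once any k shares exist:
       | 
       |   import random
       | 
       |   P = 2**127 - 1  # Mersenne prime, the field modulus
       | 
       |   def make_shares(secret, k, n):
       |       # Random degree-(k-1) polynomial, constant term = secret
       |       cs = [secret] + [random.randrange(P) for _ in range(k - 1)]
       |       def f(x):
       |           return sum(c * pow(x, i, P) for i, c in enumerate(cs)) % P
       |       return [(x, f(x)) for x in range(1, n + 1)]
       | 
       |   def reconstruct(shares):
       |       # Lagrange interpolation at x = 0
       |       s = 0
       |       for xi, yi in shares:
       |           num = den = 1
       |           for xj, _ in shares:
       |               if xj != xi:
       |                   num = num * -xj % P
       |                   den = den * (xi - xj) % P
       |           s = (s + yi * num * pow(den, -1, P)) % P
       |       return s
       | 
       |   shares = make_shares(secret=42, k=3, n=10)
       |   assert reconstruct(shares[:3]) == 42  # any 3 shares work
       |   # any 2 shares reveal nothing about the secret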
       | 
       | Totalitarian regimes do not need some magic bit of technology
       | to abuse citizens; that's been clear since the dawn of time.
       | Those who are concerned about abuse would do well to direct their
       | efforts towards maintenance of democratic systems: upholding
       | societal, political, regulatory and legal checks and balances.
       | 
       | Criminals are becoming better criminals by taking advantage of
       | advancements in technology right now, and, for better or worse,
       | it's an arms race and society will simply not accept criminals
       | gaining the upper hand.
       | 
       | If not proven necessary, society is capable of reverting to prior
       | standards (Habeas Corpus resumed after the Civil War, and parts
       | of the Patriot Act have expired, for example).
        
         | kappuchino wrote:
         | You link to an article that says "Overall, the evidence
         | suggests that CCTV can reduce crime," and then continues to
         | mention that specific context matters: vehicle crime ... oh
         | well, I wonder if we could combat that without surveillance,
         | like better locks or remote disabling of the engine. There as
         | here with the phones, society has to evaluate the price of the
         | loss of privacy and abuse by totalitarian systems, which will
         | happen - we just can't say when. This is why some - like me -
         | resist backdoors entirely, even if the price is "more crime".
        
       | RightTail wrote:
       | This is going to be used to suppress political dissidents, aka
       | the "populist/nationalist right", aka the new al-Qaeda.
       | 
       | Searching for CP is just the original pretext.
        
         | anthk wrote:
         | More like the reverse, fool. Those in power love right-wing
         | people and racists.
         | 
         | If anything, the left and progressive left will be persecuted.
         | 
         | China? They even attacked Marxist demonstrations in
         | universities. The current ideology in China is just jingoism,
         | or "keep shit working no matter what".
        
         | robertoandred wrote:
         | How? Please be specific.
        
           | gruez wrote:
           | Presumably by adding signatures for "populist/nationalist
           | right" memes.
        
       | [deleted]
        
       | iamleppert wrote:
       | It's pretty trivial to iteratively construct an image that has
       | the same hash as another, completely different image if you know
       | what the hash should be.
       | 
       | All one needs to do, in order to flag someone or get them caught
       | up in this system, is to gain access to this list of hashes and
       | construct an image. This data is likely to be sought after as
       | soon as this system is implemented, and it will only be a matter
       | of time before a data breach exposes it.
       | 
       | Once that is done, the original premise and security model of the
       | system will be completely eroded.
       | 
       | That said, if this does get implemented I will be getting rid of
       | all my Apple devices. I've already switched to Linux on my
       | development laptops. The older I get, the less value Apple
       | products have to me. So it won't be a big deal for me to cut them
       | out completely.
        
         | jjtheblunt wrote:
         | Cryptographic hashes are exactly not trivial to "dupe".
         | 
         | https://en.wikipedia.org/wiki/Cryptographic_hash_function
        
           | pseudalopex wrote:
           | Perceptual hashes aren't cryptographic.
        
             | handoflixue wrote:
             | Is there anything stopping them from using an actual
             | cryptographic hash, though?
        
               | layoutIfNeeded wrote:
               | Ummm... the fact that changing a single pixel will let
               | the baddies evade detection?
        
               | pseudalopex wrote:
               | Even the smallest change to an image changes a
               | cryptographic hash.
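               | 
               | A quick illustration with Python's hashlib:
               | 
               |   import hashlib
               | 
               |   a = hashlib.sha256(b"image bytes").hexdigest()
               |   b = hashlib.sha256(b"image byteZ").hexdigest()
               |   # One byte differs, yet a and b share no visible
               |   # structure (avalanche effect) - useless for
               |   # near-duplicate detection, hence perceptual hashes.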
        
           | kickopotomus wrote:
           | They are not using cryptographic hashes. They are using
           | perceptual hashes[1] which are fairly trivial to replicate.
           | 
           | [1]: https://en.wikipedia.org/wiki/Perceptual_hashing
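           | 
           | A minimal average-hash sketch in Python (using Pillow;
           | illustrative only - NeuralHash is a learned embedding, not
           | aHash) shows why small edits survive matching, and why
           | collisions are craftable:
           | 
           |   from PIL import Image
           | 
           |   def average_hash(path, size=8):
           |       # Downscale to 8x8 grayscale; small edits vanish
           |       img = Image.open(path).convert("L")
           |       img = img.resize((size, size))
           |       px = list(img.getdata())
           |       avg = sum(px) / len(px)
           |       # One bit per pixel: brighter than average or not
           |       bits = "".join("1" if p > avg else "0" for p in px)
           |       return int(bits, 2)
           | 
           |   def hamming(a, b):
           |       return bin(a ^ b).count("1")
           | 
           |   # Near-duplicates land within a small Hamming distance,
           |   # so matching is hamming(h1, h2) < threshold, not
           |   # equality - and an attacker can hill-climb pixel tweaks
           |   # on an unrelated image until it matches.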
        
             | tcoff91 wrote:
             | This seems dumb. I'm sure that sophisticated bad people
             | will just alter colors and things to defeat the hashes and
             | meanwhile trolls will generate collisions to cause people
             | to falsely be flagged.
        
         | tcoff91 wrote:
         | what is the hashing scheme? I assume it must not be a
         | cryptographically secure hashing scheme if it's possible to
         | find a collision. It's not something like sha256?
        
           | cyral wrote:
           | They call it NeuralHash; there is a lengthy technical spec
           | and security analysis in their announcement.
        
       | swiley wrote:
       | I'm really worried about everyone. Somehow I've missed this until
       | now and I've felt sick all day since hearing about it.
        
       | andrewmcwatters wrote:
       | I suspect Apple is subject to government demands and gag orders,
       | and that Microsoft has already been doing this with OneDrive but
       | no one has heard about it yet.
        
       | wellthisisgreat wrote:
       | Apple's parental controls are HORRIBLE. There are at least 20%
       | false positives, flagging all sorts of absolutely benign sites
       | as "adult".
       | 
       | Any kind of machine-based contextual analysis of users' content
       | will be a disaster.
        
         | robertoandred wrote:
         | Good news! It's not doing contextual analysis of content. It's
         | comparing image hashes.
        
           | wellthisisgreat wrote:
           | oh that's actually kind of good news then. I couldn't believe
           | Apple wouldn't know about the inadequacy of their parental
           | controls
           | pseudalopex wrote:
           | You mixed up the 2 new features. The child pornography
           | detection compares perceptual hashes. The iMessage filter
           | tries to classify sexually explicit images.
        
       | threatofrain wrote:
       | Recent relevant discussion.
       | 
       | https://news.ycombinator.com/item?id=28068741
       | 
       | https://news.ycombinator.com/item?id=28075021
       | 
       | https://news.ycombinator.com/item?id=28078115
        
       | new_realist wrote:
       | Moral panics are nothing new, and have now graduated into the
       | digital age. The last big one I remember was passage of the DMCA
       | in 1998; it was just absolutely guaranteed to kill the Internet!
       | And as per usual, the Chicken Littles of the world were proven
       | wrong. The sky will not fall in this case, either. Unfortunately,
       | civilization has produced such abundance and free time that
       | outrage viruses like this one will always circulate. Humans need
       | something to spend their energy on.
        
         | nopeYouAreWrong wrote:
         | uhhh... the DMCA has been a cancer and destroyed people... so
         | the fears weren't exactly unfounded
        
       | kevin_thibedeau wrote:
       | It would be a shame if we had to start an investigation into your
       | anti-competitive behavior...
        
       | klempotres wrote:
       | Technically speaking, if Apple plans to perform PSI on device (as
       | opposed to what Microsoft does), how is it that "the device will
       | not know whether a match has been found"?
       | 
       | Is there anyone who's familiar with the technology so they can
       | explain how it works?
        
         | gruez wrote:
         | >how come that "the device will not know whether a match has
         | been found"
         | 
         | Probably using some sort of probabilistic query like a bloom
         | filter.
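         | 
         | A toy Bloom filter in Python (illustrative only - Apple's
         | private set intersection protocol is more involved):
         | 
         |   import hashlib
         | 
         |   class BloomFilter:
         |       def __init__(self, size=1024, hashes=3):
         |           self.size, self.hashes = size, hashes
         |           self.bits = [False] * size
         | 
         |       def _positions(self, item):
         |           for i in range(self.hashes):
         |               h = hashlib.sha256(f"{i}:{item}".encode()).digest()
         |               yield int.from_bytes(h[:4], "big") % self.size
         | 
         |       def add(self, item):
         |           for p in self._positions(item):
         |               self.bits[p] = True
         | 
         |       def __contains__(self, item):
         |           # May say True for items never added (a false
         |           # positive); never says False for added items
         |           return all(self.bits[p] for p in self._positions(item))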
        
       | c7DJTLrn wrote:
       | Catching child pornographers should not involve subjecting
       | innocent people to scans and searches. Frankly, I don't care if
       | this "CSAM" system is effective - I paid for the phone, it should
       | operate for ME, not for the government or law enforcement.
       | Besides, the imagery already exists by the time it's been found -
       | the damage has been done. I'd say the authorities should
       | prioritise tracking down the creators but I'm sure their
       | statistics look much more impressive by cracking down on small
       | fry.
       | 
       | I've had enough of the "think of the children" arguments.
        
         | burself wrote:
         | The algorithms and data involved are too sensitive to be
         | discussed publicly, and the reasoning sounds acceptable enough
         | even to the most knowledgeable people. They can't even be
         | pressured to prove that the system is effective.
         | 
         | This is the perfect way to begin opening the backdoors.
        
         | bambax wrote:
         | Yes. I'm not interested in catching pedophiles, or drug
         | dealers, or terrorists. It's the job of the police. I'm not the
         | police.
        
         | 2OEH8eoCRo0 wrote:
         | Why is it always "think of the children"? It gets people
         | emotional? What about terrorism, murder, or a litany of other
         | heinous violent crimes?
        
           | falcolas wrote:
           | I invite you to look up "The Four Horsemen of the
           | Infocalypse". Child Pornography is but one of the well
           | trodden paths to remove privacy and security.
        
             | kazinator wrote:
             | And remember, a minor who takes pictures of him or herself
             | is an offender.
        
         | zionic wrote:
         | I'm furious. My top app has 250,000 uniques a day.
         | 
         | I'm considering a 24h blackout with a protest link to Apple's
         | support email explaining what they've done.
         | 
         | I wonder if anyone else would join me?
        
         | mrits wrote:
         | There isn't any reason to believe the CSAM hash list is only
         | images. The government now has the ability to search for
         | anything in your iCloud account with this.
        
       | geraneum wrote:
       | Didn't they [Apple] make the same points that EFF is making now,
       | to avoid giving FBI a key to unlock an iOS device that belonged
       | to a terrorist?
       | 
       | " Compromising the security of our personal information can
       | ultimately put our personal safety at risk. That is why
       | encryption has become so important to all of us."
       | 
       | "... We have even put that data out of our own reach, because we
       | believe the contents of your iPhone are none of our business."
       | 
       | " The FBI may use different words to describe this tool, but make
       | no mistake: Building a version of iOS that bypasses security in
       | this way would undeniably create a backdoor. And while the
       | government may argue that its use would be limited to this case,
       | there is no way to guarantee such control."
       | 
       | Tim Cook, 2016
        
         | rubatuga wrote:
         | Think of the children!!!
        
       | bississippi wrote:
       | First they built a walled garden, beautiful on the inside, and
       | excoriated competitors [1] for their lack of privacy. Now that
       | the frogs have walked into the walled garden, they have started
       | to boil the pot [2]. I don't think the frogs will ever figure out
       | when to jump out of the pot.
       | 
       | [1] https://www.vox.com/the-goods/2019/6/4/18652228/apple-
       | sign-i...
       | 
       | [2] https://en.wikipedia.org/wiki/Boiling_frog
        
       | roody15 wrote:
       | My two cents: I get the impression this is related to NSO's
       | Pegasus software. Once the Israeli firm's leaks were made public,
       | Apple had to respond and patched some security holes that were
       | exposed publicly.
       | 
       | NSO used exploits in iMessage to enable them to grab photos,
       | texts among other things.
       | 
       | Now, shortly after Apple's security patches, we see them pivot
       | and want to "work" with law enforcement. Hmmm, almost like once
       | access was closed Apple needs a way to justify "opening" access
       | to devices.
       | 
       | Yes I realize this could be a stretch based on the info. Just
       | seems like an interesting coincidence... back door exposed and
       | closed.... now it's back open... almost like governments demand
       | access
        
       | Spooky23 wrote:
       | This article is irresponsible hand-waving.
       | 
       | " When Apple releases these "client-side scanning"
       | functionalities, users of iCloud Photos, child users of iMessage,
       | and anyone who talks to a minor through iMessage will have to
       | carefully consider their privacy and security priorities in light
       | of the changes, and possibly be unable to safely use what until
       | this development is one of the preeminent encrypted messengers."
       | 
       | People sending messages to minors that trigger a hash match have
       | more fundamental things to consider, as they are sending known
       | photos of child exploitation to a minor.
       | 
       | The EFF writer knows this, as they describe the feature in the
       | article. They should be ashamed of publishing this crap.
        
         | morpheuskafka wrote:
         | You've got it mixed up. The messages are scanned for any
         | explicit material (which in many but not all cases is illegal),
         | not specific hash matches. That's only for uploads to iCloud
         | Photos.
         | 
         | Additionally, you are not "obliged" to report such photos to
         | the police. Uninvolved service providers do have to submit some
         | sort of report iirc, but to require regular users to do so
         | would raise Fifth Amendment concerns.
        
         | itake wrote:
         | > they are sending known photos of child exploitation to a
         | minor
         | 
         | How do you know it's a known photo of child exploitation? The
         | original image was hashed and then deleted. Two completely
         | different images can have the same hash.
         | 
         | WhatsApp automatically saves images to photos. What if you
         | receive a bad image and are reported due to someone else
         | sending the image to you?
        
           | Spooky23 wrote:
           | You're obliged to report that image to the police. These
           | types of images are contraband.
        
             | itake wrote:
             | > You're obliged to report that image to the police.
             | 
             | Is this a legal obligation for all countries that iPhones
             | operate in? I wasn't able to find a law via a quick google
             | search for the US.
             | 
             | For US law, are there protections for people who report
             | the contraband? I'm not sure if Good Samaritan or
             | whistleblower laws protect you.
        
       | temeritatis wrote:
       | the road to hell is paved with good intentions
        
       | NazakiAid wrote:
       | Wait until a corrupt government starts forcing Apple or Microsoft
       | to scan for leaked documents exposing them and then automatically
       | notifying them. Just one of the many ways this could go wrong in
       | the future.
        
       | m3kw9 wrote:
       | Gonna get downvoted for this, but I may be one of the few who
       | supports this. I hope they catch these child exploiters by the
       | boatload, save 1000s of kids from traffickers, and jail their
       | asses
        
         | pseudalopex wrote:
         | The child pornography detection only tries to find known child
         | pornography. It does nothing to stop traffickers.
        
       | panny wrote:
       | I left Apple behind years ago after using their gear for more
       | than a decade. I recently received a new M1 laptop from work and
       | liked it quite a bit. It's fast, it's quiet, it doesn't get hot.
       | I liked it so much, that I was prepared to go back full Apple for
       | a while. I was briefly reviewing a new iPhone, a M1 mini as a
       | build server, a display, and several accessories to go along with
       | a new M1 laptop for myself. (I don't like to mix work and
       | personal)
       | 
       | Then this news broke. Apple, you just lost several thousand
       | dollars in sales from me. I had items in cart and was pricing
       | everything out when I found this news. I will spend my money
       | elsewhere. This is a horrendous blunder. I will not volunteer
       | myself up to police states by using your gear now or ever again
       | in the future. I've even inquired about returning the work laptop
       | in exchange for a Dell.
       | 
       | Unsafe at any speed. Stallman was right. etc etc etc.
        
       | shmerl wrote:
       | Is anyone even using Apple if they care about privacy and
       | security?
        
       | cwizou wrote:
       | The FT article mentioned it was US only (at least according to
       | the title, please correct me if wrong), but I'm more afraid of
       | how other governments will try to pressure Apple to adapt said
       | technology to their needs.
       | 
       | Can they trust a _random_ government to give them a database of
       | only CSAM hashes and not insert some extra politically motivated
       | content that they deem illegal?
       | 
       | Because once you've launched this feature in the "land of the
       | free", other countries will require for their own needs their own
       | implementation and want to control said database.
       | 
       | And how long until they also scan browser history for the same
       | purpose? Why stop at pictures? This is opening a very dangerous
       | door that many here will be uncomfortable with.
       | 
       | Scanning on their own premises would be a much better choice;
       | this is everything but (as the linked "paper" tries to claim)
       | privacy forward.
        
         | aalam wrote:
         | The initial rollout is limited to the US, with no concrete
         | plans reported yet on expansion.
         | 
         | "The scheme will initially roll out only in the US. [...]
         | Apple's neuralMatch algorithm will continuously scan photos
         | that are stored on a US user's iPhone and have also been
         | uploaded to its iCloud back-up system."
         | 
         | Researchers interviewed for the article would agree with your
         | analysis. "Security researchers [note: appears to be the named
         | security professors quoted later in the article], while
         | supportive of efforts to combat child abuse, are concerned that
         | Apple risks enabling governments around the world to seek
         | access to their citizens' personal data, potentially far beyond
         | its original intent."
         | 
         | Article link for ease of access:
         | https://www.ft.com/content/14440f81-d405-452f-97e2-a81458f54...
        
           | cwizou wrote:
           | Thanks, after some fiddling I managed to finally read the
           | full text from the article and it's definitely short on
           | details on the rollout. Let's hope they rethink this.
        
       | falcolas wrote:
       | Apple,
       | 
       | Not that you care, but this is the straw that's broken this
       | camel's back. It's too ripe for abuse, it's too invasive, and I
       | don't want it.
       | 
       | You've used one of the Four Horsemen of the Infocalypse
       | perfectly... and so I'm perfectly happy to leave your ecosystem.
       | 
       | Cheers.
        
       | FpUser wrote:
       | Luckily I only use my phone to make phone calls, offline GPS, and
       | to control some gizmos like drones. I do not even have a data
       | plan. Not an Apple customer either, so I guess my exposure to the
       | things mentioned is more limited.
        
       | imranhou wrote:
       | I think it's easy to say no to any solution, but harder to say
       | "this is bad, but we should do this instead to solve the
       | problem". In a world with ubiquitous/distributed communication,
       | the ideas that come up would generally avoid direct interception
       | but still need some way to identify a malicious transaction.
       | 
       | I only urge that whenever we say no to ideas like this, we do so
       | only when accompanied by our thoughts on an alternative solution.
        
       | trangus_1985 wrote:
       | I've been maintaining a spare phone running LineageOS exactly in
       | case something like this happened - I love the Apple Watch and
       | Apple ecosystem, but this is such a flagrant abuse of their
       | position as Maintainers Of The Device that I have no choice but
       | to switch.
       | 
       | Fortunately, my email is on a paid provider (Fastmail), my
       | photos are on a NAS, and I've worked hard to get all of my
       | friends onto Signal. While I still use Google Maps, I've been
       | trialing OSM alternatives for a minute.
       | 
       | The things they've described are, in general, reasonable and
       | probably good in the moral sense. However, I'm not sure that I
       | support what they are implementing for child accounts (as a queer
       | kid, I was terrified of my parents finding out). On the surface,
       | it seems good - but I am concerned about other snooping features
       | that this portends.
       | 
       | However, with the iCloud Photos CSAM scanning, it is also a
       | horrifying precedent that the device I put my life into is
       | scanning my photos and reporting on bad behavior (even if the
       | initial dataset is the most reprehensible behavior).
       | 
       | I'm saddened by Apple's decision, and I hope they recant, because
       | it's the only way I will continue to use their platform.
        
         | rasengan wrote:
         | Your original post said postmarketOS. That is weird that you
         | changed it to lineage (and misspelled that).
        
           | trangus_1985 wrote:
           | Yeah, sorry, I mixed them up in my head. I'm currently
           | running Lineage on a PH-1, not Postmarket. I would not
           | consider what I have set up to be "production ready", but I'm
           | going to spend some time this weekend looking into what
           | modern hardware can run Lineage or other open mobile OSes
        
           | trangus_1985 wrote:
           | Oh hey wait you're the freenode guy. While we're on the topic
           | of hostile actions by a platform provider...
        
             | rasengan wrote:
             | Doesn't change that you're a liar.
        
           | hncurious wrote:
           | Why is that weird?
        
         | artimaeis wrote:
         | It's not the device that's less secure or private in this
         | context, it's the services. There's no reason you couldn't just
         | continue using your NAS for photo backup and Signal for
         | encrypted communications, completely unaffected by this.
         | 
         | Apple seems to not have interest in users devices, which makes
         | sense -- they're not liable for them. They _do_ seem interested
         | in protecting the data that they house, which makes sense,
         | because they're liable for it and have a responsibility to
         | remove/report CSAM that they're hosting.
        
           | [deleted]
        
         | Andrew_nenakhov wrote:
         | Signal is still a centralised data silo where by default you
         | trust a CA to verify your contacts' identity.
        
           | chimeracoder wrote:
           | > Signal is still a centralised data silo where by default
           | you trust a CA to verify your contacts' identity.
           | 
           | You can verify the security number out-of-band, and the
           | process is straightforward enough that even nontechnical
           | users can do it.
           | 
           | That's as much as can possibly be done, short of an app that
           | literally prevents you from communicating with anyone without
           | manually providing their security number.
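           | 
           | Conceptually, the out-of-band check is both parties deriving
           | the same fingerprint from the two public keys and comparing
           | it over another channel (a hypothetical scheme in Python -
           | not Signal's actual safety-number derivation):
           | 
           |   import hashlib
           | 
           |   def safety_number(key_a: bytes, key_b: bytes) -> str:
           |       # Order-independent: both sides get the same string
           |       digest = hashlib.sha256(b"".join(sorted([key_a, key_b])))
           |       hexstr = digest.hexdigest()
           |       return " ".join(hexstr[i:i + 5] for i in range(0, 30, 5))
           | 
           |   # Read it aloud or compare QR codes; a MITM who swapped
           |   # keys would produce a different fingerprint.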
        
             | Andrew_nenakhov wrote:
             | I said, 'by default'. I know that it is possible to do a
             | manual verification, but I have yet to have a chat with a
             | person who would do that.
             | 
             | Also, Signal does not give any warning or indication that a
             | chat partner's identity is manually verified. Users are
             | supposed to trust Signal and not ask difficult questions.
        
               | chimeracoder wrote:
               | > I said, 'by default'. I know that it is possible to do
               | a manual verification, but I have yet to have a chat with
               | a person who would do that.
               | 
               | I'm not sure what else you'd expect. The alternative
               | would be for Signal not to handle key exchange at all,
               | and only to permit communication after the user manually
               | provides a security key that was obtained out-of-band.
               | That would be an absolutely disastrous user experience.
               | 
               | > Also, Signal does not give any warning or indication
               | that a chat partner's identity is manually verified
               | 
               | That's not true. When you verify a contact, it adds a
               | checkmark next to their name with the word "verified"
               | underneath it. If you use the QR code to verify, this
               | happens automatically. Otherwise, if you've verified it
               | manually (visual inspection) you can manually mark the
               | contact as verified and it adds the checkmark.
        
               | Andrew_nenakhov wrote:
               | > I'm not sure what else you'd expect.
               | 
               | Ahem. I'd expect something that most XMPP clients could
               | do 10+ years ago with OTR: after establishing an
               | encrypted session, the user is given a warning that the
               | chat partner's identity is not verified, and is given
               | options on how to perform this verification.
               | 
               | With a CA you can show a mild warning that the identity
               | is verified by Signal, and give options to dismiss the
               | warning or perform out-of-band verification.
               | 
               | Not too disastrous, no?
               | 
               | > That's not true. When you verify a contact, it adds a
               | checkmark next to their name with the word "verified"
               | 
               | It has zero effect if the user is given no indication
               | that there should be the word _verified_.
               | 
               | What you say is not true. _This_ [1] is what a new
               | user sees in Signal - absolutely zero indication. To
               | verify a contact, the user must go to "Conversation
               | settings" and then "View safety number". I'm not
               | surprised nobody ever established a verified session
               | with me.
               | 
               | [1]: https://www.dropbox.com/s/ab1bvazg4y895f6/screenshot
               | _2021080...
        
               | int_19h wrote:
               | I did this with all my friends who are on Signal, and
               | explained the purpose.
               | 
               | And it does warn about the contact being unverified
               | directly in the chat window, until you go and click
               | "Verify". The problem is that people blindly do that
               | without understanding what it's for.
        
           | trangus_1985 wrote:
           | Yeah, but it's also useful for getting my friends on board. I
           | think it's likely that I eventually start hosting matrix or
           | some alternative, but my goal is to be practical here, yet
           | still have a privacy protecting posture.
        
             | Sunspark wrote:
             | Your friends aren't going to want to install an app to have
             | it connect to trangus_1985's server. Be happy just getting
             | them on Signal.
        
         | Saris wrote:
         | I think no matter what devices you use, you've nailed down the
         | most important part of things which is using apps and services
         | that are flexible, and can be easily used on another platform.
        
           | trangus_1985 wrote:
           | I knew that eventually it'd probably matter what devices I
           | used, I just didn't expect it to be so soon.
           | 
           | But yeah, I could reasonably use an iphone without impact for
           | the foreseeable future with some small changes.
        
         | OJFord wrote:
         | > While I still use google maps
         | 
         | I use Citymapper simply because I find it better (for the city-
         | based journeys that are my usual call for a map app) - but it
         | not being a Google ~data collection device~ service is no
         | disadvantage.
         | 
         | At least, depending why you dislike having everything locked up
         | with Google or whoever, I suppose. Personally it's more having
         | _everything_ in one place that troubles me; I'm reasonably happy
         | with spreading things about. I like self-hosting things too, it
         | just needs a value-add I suppose; that's not a reason in itself
         | _for me_.
        
         | JumpCrisscross wrote:
         | > _with icloud photos csam, it is also a horrifying precedent_
         | 
         | I'm not so bugged by this. Uploading data to iCloud has always
         | been a trade of convenience at the expense of privacy. Adding a
         | client-side filter isn't great, but it's not categorically
         | unprecedented--Apple executes search warrants against iCloud
         | data--and can be turned off by turning off iCloud back-ups.
         | 
         | The scanning of childrens' iMessages, on the other hand, is a
         | subversion of trust. Apple spent the last decade telling
         | everyone their phones were secure. Creating this side channel
         | opens up all kinds of problems. Having trouble as a controlling
         | spouse? No problem--designate your partner as a child.
         | Concerned your not-a-tech-whiz kid isn't adhering to your
         | house's sexual mores? Solved. Bonus points if your kid's phone
         | outs them as LGBT. To say nothing of most sexual abuse of
         | minors happening at the hands of someone they trust. Will their
         | phone, when they attempt to share evidence, tattle on them to
         | their abuser?
         | 
         | Also, can't wait for Dads' photos of their kids landing them on
         | a national kiddie porn watch list.
        
           | mojzu wrote:
           | If The Verge's article is accurate about how/when the CSAM
           | scanning occurs, then I don't have a problem with that; it
           | sounds like they're moving the scanning from server to client
           | side. The concerns about false positives seem valid to me,
           | but I'm not sure the chance of one occurring has increased
           | over the existing iCloud scanning. Scope creep for other
           | content scanning is definitely a possibility though, so I
           | hope people keep an eye on that.
           | 
           | I'm not a parent but the other child protection features seem
           | like they could definitely be abused by some parents to exert
           | control/pry into their kids private lives. It's a shame that
           | systems have to be designed to prevent abuse by bad people
           | but at Apple's scale it seems like they should have better
           | answers for the concerns being raised
        
             | SquishyPanda23 wrote:
             | > sounds like they're moving the scanning from server to
             | client side
             | 
             | That is good, but unless a system like this is fully open
             | source and runs only signed code there really aren't many
             | protections against abuse.
        
           | js2 wrote:
           | > designate your partner as a child.
           | 
           | That's not how it works, unless you control your partner's
           | Apple ID and you lie about their DOB when you create their
           | account.
           | 
           | I created my kids Apple IDs when they were minors and
           | enrolled them in Family Sharing. They are now both over 18
           | and I cannot just designate them as minors. Apple
           | automatically removed my ability to control any aspects of
           | their phones when they turned 18.
           | 
           | > Dads' photos of their kids landing them on a national
           | kiddie porn watch list.
           | 
           | Indeed, false positives are much more worrying. The idea that
           | my phone is spying on my pictures... like, what the hell.
        
             | odyssey7 wrote:
             | > That's not how it works, unless you control your
             | partner's Apple ID and you lie about their DOB when you
             | create their account.
             | 
             | Rather than reassuring me, this sounds like an achievable
             | set of steps for an abuser to carry out.
        
               | leereeves wrote:
               | More than achievable. Abusers often control their
               | victims' accounts.
        
           | selykg wrote:
           | I feel like you're sensationalizing this a lot.
           | 
           | There's two functions here. Both client side.
           | 
           | First, machine learning to detect potentially inappropriate
           | pictures for children to view. This seems to require parental
           | controls to be on. Optionally it can send a message to the
           | parent when a child purposefully views the image. The image
           | itself is not shared with Apple so this is notification to
           | parents only.
           | 
           | The second part is a list of hashes. So the Photos app will
           | hash images and compare to the list in the database. If it
           | matches then presumably they do something about that. The
           | database is only a list of KNOWN child abuse images
           | circulating.
           | 
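           | In pseudo-Python, the second part amounts to something like
           | this (every value is hypothetical, including the threshold -
           | Apple hasn't published the real one):
           | 
           |   KNOWN_HASHES = {0x2F9C1A, 0x8810FF}  # known-image hashes
           |   MATCH_THRESHOLD = 30  # hypothetical
           | 
           |   def check_upload(photo_hash, match_count):
           |       # Nothing is reportable until the per-account
           |       # match count crosses the threshold
           |       if photo_hash in KNOWN_HASHES:
           |           match_count += 1
           |       return match_count, match_count >= MATCH_THRESHOLD
           | 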
           | Now, not to say I like the second part, but the first one
           | seems fine. The second is sketchy in that it's unclear what
           | happens if there's a hash collision. But either way it seems
           | easy enough to clear that one up.
           | 
           | No father is going to be added to some list for their
           | children's photos. Stop with that hyperbole.
        
             | JumpCrisscross wrote:
             | > _the Photos app will hash images and compare to the list
             | in the database. If it matches then presumably they do
             | something about that. The database is only a list of KNOWN
             | child abuse images circulating._
             | 
             | This seems fine as it's (a) being done on iCloud-uploaded
             | photos and (b) replacing a server-side function with a
             | client-side one. If Apple were doing this to locally-stored
             | photos on iCloud-disconnected devices, it would be nuts.
             | Once the tool is built, expanding the database to include
             | any number of other hashes is a much shorter leap than
             | compelling Apple to build the tool.
             | 
             | > _it seems easy enough to clear that one up_
             | 
             | Would it be? One would be starting from the point of a
             | documented suspicion of possession of child pornography.
        
             | 015a wrote:
             | This is Apple installing code on their users' devices with
             | the express intent to harm their customers. That's it! This
             | is inarguable! If this system works as intended, Apple is
             | knowingly selling devices that will harm their customers.
             | We can have the argument as to whether the harm is
             | justified, whether the users _deserved it_. Sure, this only
             | impacts child molesters. That makes it ok?
             | 
             | "But it only impacts iCloud Photos". Valid! So why not run
             | the scanner in iCloud and not on MY PHONE that I paid OVER
             | A THOUSAND DOLLARS for? Because of end-to-end encryption.
             | Apple wants to have their cake and eat it too. They can say
             | they have E2EE, but also give users no way to opt-out of
             | code, running on 100% of the "end" devices in that "end-to-
             | end encryption" system, which subverts the E2EE. A
             | beautiful little system they've created. "E2EE" means
             | different things on Apple devices, for sure!
             | 
             | And you're ignoring (or didn't read) the central, valid
             | point of the EFF article: _Maybe_ you can justify this in
             | the US. Most countries are far, far worse than the US when
             | it comes to privacy and human rights. The technology
             | exists. The policy has been drafted and enacted; Apple is
             | now alright with subverting E2EE. We start with hashes of
             | images of child exploitation. What's next? Tank Man in
             | China? Photos of naked adult women, in conservative parts
             | of the world? A meme criticizing your country's leader? I
             | want to believe that Apple will, AT LEAST, stop at child
             | exploitation, but Apple has already destroyed the faith I
             | held in them, only yesterday, in their fight for privacy
             | as a right.
             | 
             | This isn't an issue you can hold a middleground position
             | on. Encryption doesn't only kinda-sorta work in a half-ass
             | implementation; it doesn't work at all.
        
         | 2OEH8eoCRo0 wrote:
         | >While I still use google maps
         | 
         | You can still use Google Maps without an account and
         | "incognito". I wish they'd allow app store usage without an
         | account though- similar to how any Linux package manager works.
        
           | trangus_1985 wrote:
           | That's not really the issue. The issue is that for google
           | maps to work properly, it requires that the Play services are
           | installed. Play services are a massive semi-monolithic blob
           | that requires tight integration with Google's backend, and
           | deep, system-level permissions to operate correctly.
           | 
           | I'm not worried about my search history.
        
             | boring_twenties wrote:
             | Last I checked (about a year ago), the Google Maps app did
             | work with microG (a FOSS reimplementation of Google Play
             | Services).
        
               | trangus_1985 wrote:
               | I use maps on my phone on a regular basis - I would
               | vastly prefer to have something less featured and stable
               | versus hacking the crap out of my phone. But that's good
               | to know.
        
             | brundolf wrote:
             | One workaround is to use the mobile web app, which is
             | surprisingly pretty decent for a web app. And because it's
             | a web app, you can even disable things like sharing your
             | location if you want to
        
             | 2OEH8eoCRo0 wrote:
             | Ahhh, gotcha. Did not realize that. Makes sense.
        
             | techrat wrote:
             | People need to remember that most of Android got moved into
             | Play Services. It was the only way to keep a system
             | relatively up to date when the OEMs won't update the OS
             | itself.
             | 
             | Yeah, it's a dependency... as much as the Google Maps APK
             | needing to run on Android itself.
        
           | opan wrote:
           | In addition to F-Droid, you can get Aurora Store (which is on
           | F-Droid) which lets you use an anonymous login to get at the
           | Play Store. I use it for a couple free software apps that
           | aren't on F-Droid for some reason.
        
             | C19is20 wrote:
             | What are the apps?
        
             | sunshineforever wrote:
             | I also recommend Aurora Store as a complete replacement for
             | the Play store. The one thing is that I've never tried
             | using apps that I paid for on it but it works very well for
             | any free apps. There is an option to use a Google account
             | with Aurora but I've only ever used the anonymous account.
             | 
             | The only slight downside is that I haven't figured out how
             | to auto-update apps, so your apps will get out of date
             | without you being notified and you have to manually do it.
             | This problem might literally be solved by a simple setting
             | that I haven't bothered to look for, IDK.
             | 
             | On the plus side, it includes all the official Play Store
             | apps, alongside some that aren't allowed on the Play Store.
             | 
             | For example, NewPipe, the superior replacement YouTube app
             | that isn't allowed on the Play Store due to it subverting
             | advertisements and allowing a few features that are useful
             | for downloading certain things.
        
         | bambax wrote:
         | > _probably good in the moral sense_
         | 
         | How, how is it even morally good?? Will they start taking
         | pictures of your house to see if you store drugs under your
         | couch? Or cook meth in your kitchen??
         | 
         | What is moral is for society to be in charge of laws and law
         | enforcement. This vigilante behavior by private companies who
         | answer to no one is unjust, tyrannical and just plain crazy.
        
         | _red wrote:
         | Yes, my history was Linux 95-04, Mac 04-15, and now back to
         | Linux from 2015 onwards.
         | 
         | It's been clear Tim Cook was going to slowly harm the brand. He
         | was a wonderful COO under a visionary CEO-type, but he holds no
         | particular "Tech Originalist" vision. He's happy to be part of
         | the BigTech aristocracy, and probably feels really at home with
         | the powers it affords him.
         | 
         | Anyone who believes this is "just about the children" is
         | naive. His Chinese partners will use this to crack down on
         | "Winnie the Pooh" cartoons and the like... before long,
         | questioning any Big Pharma product will result in being
         | flagged. Give it 5 years at max.
        
           | ursugardaddy wrote:
           | you make that sound like a bad thing. I'd love to live in a
           | world without child abuse spreading rampant on the internet
           | and not having to suffer through what passes for political
           | speech (memes) these days.
           | 
           | maybe once we detect and stop stuff like this from happening
           | before it gets very bad, we can grow as a society and adjust
           | our forms of punishment accordingly too
        
             | adamrt wrote:
             | Is this a bot comment? Account is two hours old.
             | 
             | You want to not suffer through political memes? And jump to
             | scanning private messages for dissent, by authoritarian
             | governments being okay!?
             | 
             | What?!
        
               | ursugardaddy wrote:
               | No, I'm being serious. technology like this could be very
               | beneficial.
               | 
               | there's a good chance that if we continue to improve
               | surveillance, law enforcement agencies and justice
               | departments could begin to focus on rehabilitation and
               | growing a kinder world.
               | 
               | right now they are like firemen trying to put out fires
               | after the building has been ruined
               | 
               | if it doesn't work we're doomed anyway, so what's the
               | problem?
        
               | empressplay wrote:
               | The problem is that what you're describing is literal
               | fascism.
        
             | Lammy wrote:
             | _Helen Lovejoy voice_ Won't somebody _please_ think of
             | the children!?
        
             | runjake wrote:
             | Once you give them the power, they'll never willingly hand
             | it back.
        
             | withinboredom wrote:
             | I don't think anyone is arguing that making it harder to
             | abuse children is a bad thing. It's what is required to do
             | so that is the bad thing. It'd be like if someone installed
             | microphones all over every house to report on when you
             | admit to being guilty of bullying. No one wants
             | bullying, but I doubt you want a microphone recording
             | everything and looking for certain trigger words. Unless
             | you have an Alexa or something, then I guess you probably
             | wouldn't mind that example.
        
         | LazyR0B0T wrote:
         | Organic Maps on F-Droid is a really clean OSM-based map.
        
           | JackGreyhat wrote:
           | Nearly the same as MagicEarth...I use it all the time.
        
           | crocodiletears wrote:
           | Does it let you select from multiple routes? I've been using
           | Pocketmaps, but it only gives you a single option for
           | routing, which can lead to issues in certain contexts
        
           | Sunspark wrote:
           | I'm impressed, it actually has smooth scrolling unlike OsmAnd
           | which is very slow loading tiles in.
           | 
           | Critical points I'd make about Organic Maps: I'd want a lower
           | inertia setting so it scrolls faster, and a different color
           | palette... they are using muddy tones of green and brown.
        
         | alksjdalkj wrote:
         | Have you found any decent google maps alternatives? I'd love to
         | find something but nothing comes close as far as I've found.
         | Directions that take into account traffic is the big thing that
         | I feel like nobody (other than Apple, MS, etc.) will be able to
         | replicate.
         | 
         | Have you tried using the website? I've had some luck with that
         | on postmarketOS, and it means you don't need to install Play
         | services to use it.
        
           | krobbn wrote:
           | I really like Here WeGo, and it allows you to download maps
           | for specific countries to have available offline, too.
        
           | beermonster wrote:
           | OsmAND
        
           | manuelmagic wrote:
           | I've been using HERE Maps for many years: https://wego.here.com/
        
           | nickexyz wrote:
           | Organic maps is pretty good:
           | https://github.com/organicmaps/organicmaps
        
         | new_realist wrote:
         | The argument from reactionary HN neckbeards is basically,
         | "can't you see that this _could_ be used for great evil?"
         | 
         | No shit. That's obvious to just about... everyone on the
         | planet. Many things in this world can be used for great evil:
         | knives, gasoline, guns, TNT, cars--even most household items
         | when used with creativity. It is quite impossible to create
         | something which can't be abused in some form. But society still
         | allows them, because it judges that the good outweighs the bad,
         | and systems exist to manage the risk of evil use.
         | 
         | In this case, I have every expectation that this scanning will
         | be auditable, and society will eventually work out most of the
         | imperfections in systems like these, and strike the right
         | balance to make the world a better place.
        
           | [deleted]
        
         | threatofrain wrote:
         | What is your home NAS setup like?
        
           | trangus_1985 wrote:
           | FreeNAS, self-signed tightly-scoped CA installed on all of my
           | devices. 1TBx4 in a small case shoved under the stairs.
           | 
           | tbh, i would vastly prefer to use a cloud based service with
           | local encryption - I'm not super paranoid, just overly
           | principled
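           | 
           | The "local encryption" part is simple enough - a sketch with
           | the Python cryptography package (Fernet; key management is
           | the hard part this leaves out):
           | 
           |   from cryptography.fernet import Fernet
           | 
           |   key = Fernet.generate_key()  # stays on your hardware
           |   cipher = Fernet(key)
           | 
           |   with open("photo.jpg", "rb") as f:  # hypothetical file
           |       blob = cipher.encrypt(f.read())
           | 
           |   # Upload blob: the provider stores bytes it cannot
           |   # read, hash-match, classify, or report on
           |   original = cipher.decrypt(blob)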
        
             | quest88 wrote:
             | What do you use to sync phone photos to your NAS? I like
             | Google Photos' smartness, but I also want my photos on my
             | Synology NAS.
        
               | antgiant wrote:
               | I personally am a fan of Mylio for that.
               | https://mylio.com/
        
             | voltaireodactyl wrote:
             | If you haven't already heard of it, cryptomator might be
             | just what you're after.
        
         | lcfcjs wrote:
         | Found the paedo.
        
         | cle wrote:
         | Unfortunately with SafetyNet, I feel like an investment into
         | Android is also a losing proposition...I can only anticipate
         | being slowly cut off from the Android app ecosystem as more
         | apps onboard with attestation.
         | 
         | We've collectively handed control of our personal computing
         | devices over to Apple and Google. I fear the long-term
         | consequences of that will not be positive...
        
           | trangus_1985 wrote:
           | I don't think it's implausible that I carry around a phone
           | that has mail, contacts, calendars, photos, and private chat
           | on it. And then, have a second, older phone that has like
           | Instagram and mobile games. It's tragic.
        
             | sodality2 wrote:
             | Unfortunately a big bulk of the data they profit off of is
             | simply the ads and on-platform communication and behavior.
             | Doesn't really matter if you use a different device if you
             | still use the platform. Sure, it's slightly better, but it
             | really isn't a silver bullet if you're still using it. And
             | this is coming from someone who does this already.
        
           | heavyset_go wrote:
           | > _We 've collectively handed control of our personal
           | computing devices over to Apple and Google_
           | 
           | Hey now, the operating system and app distribution cartels
           | include Microsoft, too.
        
           | techrat wrote:
           | Losing sight of the forest for this one tree.
           | 
           | 1) Google doesn't release devices without unlockable
           | bootloaders. They have always been transparent in allowing
           | people to unlock their Nexus and Pixels. Nexus was for
           | developers, Pixels are geared towards the end user. Nothing
           | changed with regards to the bootloaders.
           | 
           | 2) Google uses Coreboot for their ChromeOS devices. Again,
           | you couldn't get more open than that if you wanted to buy a
           | Chromebook and install something else on it.
           | 
           | 3) To this day, app sideloading on Android remains an option.
           | They've even made it easier for third party app stores to
           | automatically update apps with Android 12.
           | 
           | 4) AOSP. Sure, it doesn't have all the bells and whistles as
           | the latest and greatest packaged up skin and OS release, but
           | all of the features that matter within Android, especially if
           | you're going to de-Google yourself, are still there.
           | 
           | Any one of those points, but consider all four, and I have
           | trouble understanding why people think REEEEEEEE Google.
           | 
            | So you can't play with one ball in the garden (SafetyNet);
            | you've still got the rest of the toys. That's a compromise
            | I'm willing to accept in order to do what I want, the way I
            | want to do it (e.g., rooting or third-party ROMs).
            | 
            | If you don't like what they do on their mobile OS, there's
            | _nothing_ Google is doing to lock you into a Walled Garden
            | where your only option is to completely give up what you're
            | used to...
           | 
           | ...Unlike Apple. Not one iOS device has been granted an
           | unlockable bootloader. Ever.
        
             | shbooms wrote:
             | "1) Google doesn't release devices without unlockable
             | bootloaders. They have always been transparent in allowing
             | people to unlock their Nexus and Pixels. Nexus was for
             | developers, Pixels are geared towards the end user. Nothing
             | changed with regards to the bootloaders."
             | 
             | This is not accurate. Pixels that come from Verizon have
             | bootloaders that cannot be fully unlocked.
        
             | Zak wrote:
              | SafetyNet is becoming a problem, and the trend shows no
              | signs of slowing down.
             | 
             | I shouldn't have to choose between keeping full control
             | over my device and being able to use it to access the
             | modern world.
        
       | Shank wrote:
       | I really love the EFF, but I also believe the immediate backlash
       | is (relatively) daft. There is a potential for abuse of this
       | system, but consider the following too:
       | 
       | 1. PhotoDNA is already scanning content from Google Photos and a
       | whole host of other service providers.
       | 
       | 2. Apple is obviously under pressure to follow suit, but they
       | developed an on-device system, recruited mathematicians to
       | analyze it, and published the results, as well as one in-house
       | proof and one independent proof showing the cryptographic
       | integrity of the system.
       | 
       | 3. Nobody, and I mean nobody, is going to successfully convince
       | the general public that a tool designed to stop the spread of
       | CSAM is a "bad thing" unless they can show concrete examples of
       | the abuse.
       | 
       | For one and two: given the two options, would you rather that
       | Apple implement serverside scanning, in the clear, or go with the
       | on-device route? If we assume a law was passed to require
       | serverside scanning (which could very well happen), what would
       | that do to privacy?
       | 
       | For three: It's an extremely common trope to say that people do
       | things to "save the children." Well, that's still true. Arguing
       | against a CSAM scanning tool, which is technically more privacy
       | preserving than alternatives from other cloud providers, is an
       | extremely uphill battle. The biggest claim here is that the
       | detection tool _could_ be abused against people. And that very
       | well may be possible! But the whole existence of NCMEC is
       | predicated on stopping the active and real danger of child sex
       | exploitation. We know with certainty this is a problem. Compared
       | to a certainty of child sex abuse, the hypothetical risk from
       | such a system is practically laughable to most people.
       | 
        | So, I think again, the backlash is daft. It's been about two
        | days since the announcement became public (via leaks). The
        | underlying mathematics behind the system has barely been
        | published [0]. It looks like the EFF rushed to make a statement
        | here, and in doing so, it doesn't look like they took the time
        | to analyze the cryptographic system, to consider the attacks
        | against it, or to consider possible motivations and outcomes.
        | Maybe they did, and they had advance access to the material. But
        | it doesn't look like it, and in the court of public opinion,
        | optics are everything.
       | 
       | [0]: https://www.apple.com/child-
       | safety/pdf/Alternative_Security_...
        
         | api wrote:
         | (2) is important. Apple put effort into making this at least
         | somewhat privacy-respecting, while the other players just scan
         | everything with no limit at all. They also scan everything for
         | any purpose including marketing, political profiling, etc.
         | 
         | Apple remains the most privacy respecting major vendor. The
         | only way to do better is fully open software and open hardware.
        
         | echelon wrote:
         | > There is a potential for abuse of this system, but consider
         | the following too
         | 
         | > I think again, the backlash is daft.
         | 
          | Don't apologize for this bullshit! Don't let your love of the
          | brand trump the reality of what's going on here.
         | 
         | Machinery is being put in place to detect what files are on
         | your supposedly secure device. Someone has the reins and
         | promises not to use it for anything other than "protecting the
         | children".
         | 
          | How many election cycles or generations will it take for the
          | climate to turn, leaving this as a tool of great asymmetrical
          | power to be used against the public?
         | 
         | What happens when the powers that be see that you downloaded
         | labor union materials, documents from Wikileaks, or other files
         | that implicate you as a risk?
         | 
         | Perhaps a content hash on your phone puts you in a flagged
         | bucket where you get pat downs at the airport, increased
         | surveillance, etc.
         | 
         | The only position to take here is a full rebuke of Apple.
         | 
         | edit: Apple apologists are taking a downright scary position
         | now. I suppose the company has taken a full 180 from their 1984
         | ad centerpiece. But that's okay, right, because Apple is a part
         | of your identity and it's beyond reproach?
         | 
         | edit 2: It's nominally iCloud only (a key feature of the
         | device/ecosystem), but that means having to turn off a lot of
         | settings. One foot in the door...
         | 
         | edit 3: Please don't be complicit in allowing this to happen.
         | Don't apologize or rationalize. This is only a first step. We
         | warned that adtech and monitoring and abuse of open source were
         | coming for years, and we were right. We're telling you - loudly
         | - that this will begin a trend of further erosion of privacy
         | and liberty.
        
           | artimaeis wrote:
           | It's not doing any sort of scanning of your photos while
           | they're just sitting on your device. The CSAM scanning only
           | occurs when uploading photos to iCloud, and only to the
           | photos being uploaded.
           | 
           | Source (pdf): https://www.apple.com/child-
           | safety/pdf/CSAM_Detection_Techni...
        
             | pseudalopex wrote:
             | > It's not doing any sort of scanning of your photos while
             | they're just sitting on your device.
             | 
             | Yet. The point is the new system makes it feasible.
        
         | cblconfederate wrote:
         | What is the point of E2EE vs TLS/SSL based encryption?
        
         | randcraw wrote:
         | You presume Apple and the DoJ will implement this with human
         | beings at each step. They won't. Both parties will automate as
         | much of this clandestine search as possible. With time, the
         | external visibility and oversight of this practice will fade,
         | and with it, any motivation to confirm fair and accurate
         | matches. Welcome to the sloppiness inherent in clandestine law
         | enforcement intel gathering.
         | 
         | As with all politically-motivated initiatives that boldly
         | violate the Constitution (consider the FISA Court, and its
         | rubber stamp approval of 100% of the secret warrants put before
         | it), the use and abuse of this system will go largely
         | underground, like FISA, and its utility will slowly degrade due
         | to lack of oversight. In time, even bad matches will log the
         | IDs of both parties in databases that label them as potential
         | sexual predators.
         | 
         | Believe it. That's how modern computer-based gov't intel works.
         | Like most law enforcement policy recommendation systems,
         | Apple's initial match algorithm will never be assessed for
          | accuracy, nor held accountable for being wrong at least 10% of
         | the time. In time it will be replaced by other third party
         | screening software that will be even more poorly written and
         | overseen. That's just what law enforcement does.
         | 
         | I've personally seen people suffer this kind of gov't abuse and
         | neglect as a result of clueless automated law enforcement
          | initiatives after 9/11. I don't welcome more, nor the gradual
         | and willful tossing of everyone's basic Constitutional rights
         | that Apple's practice portends.
         | 
          | The damage to personal liberty inherent in conducting secret
          | searches without cause or oversight is exactly why the Fourth
          | Amendment requires a warrant before conducting a search. NOW is
          | the time to disabuse yourself of that sense of 'daftness'; not
          | years from now, after the Fourth and Fifth Amendments become
          | irreversibly passé. Or should I say, 'daft'?
        
         | avnigo wrote:
          | I'd be interested to see how Apple executives respond to these
          | concerns in interviews, but I don't expect Apple to issue a
          | press release addressing them.
        
         | vorpalhex wrote:
         | Who verifies CSAM databases? Is there a way to verify the CSAM
         | hashlist hasn't been tampered with and additional hashes
         | inserted?
         | 
         | Would it be ok to use this approach to stop "terrorism"? Are
         | you ok with both Biden and Trump defining that list?
        
         | feanaro wrote:
         | > that a tool designed to stop the spread of CSAM is a "bad
         | thing"
         | 
         | It's certainly said to be designed to do it, but have you seen
         | concerns raised in the other thread
         | (https://news.ycombinator.com/item?id=28068741)? There have
         | been reports from some commenters of the NCMEC database
         | containing unobjectionable photos because they were merely
         | _found in a context alongside some CSAM_.
         | 
         | Who audits these databases? Where is the oversight to guarantee
         | only appropriate content is included? They are famously opaque
         | because the very viewing of the content is illegal. So how can
         | we know that they contain what they are purported to contain?
         | 
         | This is overreach.
        
           | shuckles wrote:
           | That's a problem with NCMEC, not Apple's proposal today.
           | Furthermore, if it were an actual problem, it would've
           | already manifested with the numerous current users of
            | PhotoDNA, which include Facebook and Google. I don't think
           | the database of known CSAM content includes photos that
           | cannot be visually recognized as child abuse.
        
             | tsimionescu wrote:
             | Why do you not think that? As far as I understand, there is
             | no procedure for reviewing the contents, it is simply a
             | database that law enforcement vouches is full of bad
             | images.
        
               | shuckles wrote:
               | NCMEC, not law enforcement, produces a list of embeddings
               | of known images of child abuse. Facebook and Google run
               | all photos uploaded to their platforms against this list.
               | Those which match are manually reviewed and if confirmed
               | to depict such scenes, are reported to CyberTip. If the
               | list had a ton of false positives, you think they
               | wouldn't notice that their human reviewers were spending
               | a lot of time looking at pictures of the sky?
        
           | Shank wrote:
           | > Who audits these databases? Where is the oversight to
           | guarantee only appropriate content is included? They are
           | famously opaque because the very viewing of the content is
           | illegal. So how can we know that they contain what they are
           | purported to contain?
           | 
           | I wholeheartedly agree: there is an audit question here too.
           | The contents of the database are by far the most dangerous
           | part of this equation, malicious or not, targeted or not. I
           | don't like the privacy implications about this, nor the
           | potential for abuse. I would love to see some kind of way to
           | audit the database, or ensure that it's only used "for good."
           | I just don't know what that system is, and I know that
           | PhotoDNA is already in use on other cloud providers.
           | 
           | Matthew Green's ongoing analysis [0] is really worth keeping
           | an eye on. For example, there's a good question: can you just
           | scan against a different database for different people? These
           | are the right questions given what we have right now.
           | 
           | [0]: https://twitter.com/matthew_d_green/status/1423378285468
           | 2091...
        
             | shuckles wrote:
             | Matt should read the release before live tweeting FUD. The
             | database is shipped in the iOS image, per the overview, so
             | targeting users is not an issue (roughly).
        
               | pushrax wrote:
               | Is the database frozen or can they push out updates
               | independently of iOS updates? If not, targeting
               | individual users definitely doesn't seem possible unless
               | you control OS signing.
        
               | shuckles wrote:
               | The database is shipped in the iOS image.
        
               | pushrax wrote:
               | That's what you wrote originally - and to me it doesn't
               | indicate whether it can also be updated from other
               | sources or not.
               | 
               | Lots of content is shipped in the iOS image but can
               | update independently.
        
               | shuckles wrote:
               | The technical summary provides a lot of detail. I don't
               | think Apple would omit remote update functionality from
               | it if such capability existed, especially since database
               | poisoning is a real risk to this type of program. I'm
               | comfortable with interpreting the lack of evidence as
               | evidence of absence of such a mechanism. Explicit
                | clarification would certainly help, but my
               | original point stands: there is positive evidence in the
               | docs which the FUD tweets don't engage with.
               | 
               | In particular, I'm referencing the figure which says that
               | the database of CSAM hashes is "Blinded and embedded"
               | into the client device. That does not sound like an asset
               | the system remotely updates.
        
               | pseudalopex wrote:
               | You should understand the difference between a protocol
               | and an easy to change implementation detail before
               | throwing around words like FUD.
        
               | shuckles wrote:
               | Has Matt redacted any of the FUD from his tweets last
               | night which aren't true given the published details from
               | today? For example, his claim that the method is
               | vulnerable to black box attacks from GANs isn't
               | applicable to the protocol because the attacker can't
               | access model outputs.
               | 
               | Furthermore, if "an easy to change implementation detail"
               | in your threat model is anything which could be changed
               | by iOS update, you should've stopped using iPhone about
               | 14 years ago.
        
               | [deleted]
        
         | indymike wrote:
         | > backlash is daft
         | 
         | Fighting to preserve a freedom is not daft, even if it is David
         | vs. Goliath's bigger, meaner brother and his friends.
        
         | [deleted]
        
         | throwaway888abc wrote:
         | 1. was new to me.
         | 
         | TIL - (2014) PhotoDNA Lets Google, FB and Others Hunt Down
         | Child Pornography Without Looking at Your Photos
         | 
         | https://petapixel.com/2014/08/08/photodna-lets-google-facebo...
        
         | shivak wrote:
         | > recruited mathematicians to analyze it, and published the
         | results, as well as one in-house proof and one independent
         | proof showing the cryptographic integrity of the system.
         | 
         | Apple employs cryptographers, but they are not necessarily
          | acting in your interest. Case in point: their use of private
          | set intersection to preserve the privacy... of law enforcement,
          | not users. Their less technical summary:
         | 
         | > _Instead of scanning images in the cloud, the system performs
         | on-device matching using a database of known CSAM image hashes
         | provided by NCMEC and other child safety organizations. Apple
         | further transforms this database into an unreadable set of
         | hashes that is securely stored on users' devices._
         | 
         | > _Before an image is stored in iCloud Photos, an on-device
         | matching process is performed for that image against the known
         | CSAM hashes. This matching process is powered by a
          | cryptographic technology called private set intersection..._
         | 
         | The matching is performed on device, so the user's privacy
         | isn't at stake. But, thanks to PSI and the hash preprocessing,
         | the user doesn't know what law enforcement is looking for.
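          | 
          | For intuition, here is a toy Diffie-Hellman-style blinded-
          | membership sketch in Python. To be clear: this illustrates
          | only the blinding idea, not Apple's actual PSI protocol,
          | which uses elliptic curves and adds a matching threshold.
          | 
          |     import hashlib, math, secrets
          | 
          |     P = 2**127 - 1  # toy modulus, NOT a secure choice
          | 
          |     def h(item: bytes) -> int:
          |         # hash an item into the multiplicative group
          |         d = hashlib.sha256(item).digest()
          |         return int.from_bytes(d, "big") % P
          | 
          |     def rand_exp() -> int:
          |         # secret exponent, invertible mod group order
          |         while True:
          |             e = secrets.randbelow(P - 1)
          |             if e > 1 and math.gcd(e, P - 1) == 1:
          |                 return e
          | 
          |     k = rand_exp()  # list owner's secret
          |     blinded = {pow(h(x), k, P)
          |                for x in (b"known-1", b"known-2")}
          | 
          |     r = rand_exp()  # client's secret
          |     q = pow(h(b"known-1"), r, P)      # client -> server
          |     a = pow(q, k, P)                  # server -> client
          |     m = pow(a, pow(r, -1, P - 1), P)  # client unblinds
          |     print(m in blinded)  # True
          | 
          | The client learns whether its item matches, but never sees
          | the raw list; the flip side, as noted, is that users cannot
          | audit what is being matched against.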
        
           | xondono wrote:
           | Well, it'd be kind of dumb to make the mistake of building a
           | system to stop child pornography only to have it become the
            | biggest distributor of CP photos in history.
        
         | wayneftw wrote:
          | This is an abuse of my property rights. The device is my property
         | and this activity will be using my CPU, battery time and my
         | network bandwidth. That's the abuse right there.
         | 
         | They should just use their own computers to do this stuff.
        
           | samatman wrote:
           | Photos is just an app.
           | 
           | You can use another photo app, link it to another cloud
           | provider, and be free of the burden.
           | 
           | If you use Photos, you're along for the ride, and you've
           | consented to whatever it does.
           | 
           | You don't get a line-item veto on code you choose to run,
           | that's never been how it works.
           | 
           | For what it's worth, I'm basically with the EFF on this: it
           | looks like the thin end of a wedge, it sucks and I'm not
           | happy about it.
           | 
           | But being histrionic doesn't help anything.
        
             | zionic wrote:
             | No it's not, it's the entire OS.
        
             | jimbob45 wrote:
             | I don't know how true this is. I don't see any way to block
             | Photos from viewing the files on this device and I see no
             | reason that it can't read files from my other apps.
        
           | jdavis703 wrote:
            | Then you have two choices: disable iCloud photo backups or
           | don't upgrade to iOS 15. There are plenty of arguments
           | against Apple's scheme, but this isn't one of them.
        
       | Sunspark wrote:
       | This is going to do wonders for Apple's marketshare once the
       | teenagers realize that Apple is going to be turning them in to
       | the police.
       | 
        | Teens are not stupid. They'll eventually clue in that big brother
       | is watching and won't appreciate it. They'll start by using other
       | messengers instead of imessage and then eventually leaving the
       | ecosystem for Android or whatever else comes down the pike in the
       | future.
        
       | Calvin02 wrote:
       | I think the issue is that what the tech community sees as privacy
       | is different than what the general public thinks of as privacy.
       | 
       | Apple, very astutely, understands that difference and exploited
       | the latter to differentiate its phones from its main competitor:
       | cheap(er) android phones.
       | 
       | Apple didn't want the phones to be commoditized, like personal
       | computers before it. And "privacy" is something that you can't
       | commoditize. Once you own that association, it is hard to fight
       | against it.
       | 
        | Apple also understands that the general public will support its
        | anti-child-exploitation measures and will not see them as a
        | violation of privacy.
        
       | etempleton wrote:
       | I think this is probably the reasonable and responsible thing for
        | Apple to do as a company, even if it goes against their
       | privacy ethos. Honestly they probably have been advised by their
       | own lawyers that this is the only way to cover themselves and
       | protect shareholder value.
       | 
       | The question will be if Apple will bend to requests to leverage
       | this for other reasons less noble than the protection of
       | children. Apple has a lot of power to say no right now, but they
       | might not always have that power in the future.
        
       | websites2023 wrote:
       | Apple's battle is against Surveillance Capitalism, not against
       | state-level surveillance. In fact, there is no publicly traded
       | company that is against state-level surveillance. It's important
       | not to confuse the two.
       | 
       | Think of it this way: If you want to hide from companies, choose
       | Apple. If you want to hide from the US Government, choose open
       | source.
       | 
       | But if your threat model really does include the US government or
       | some other similarly capable adversary, you are well and truly
       | fucked already. The state-level apparatus for spying on folks
        | through metadata and traffic interception is now more than a
       | decade old.
        
         | krrrh wrote:
         | The problem is that as governments gain access to new
         | technological capabilities and exploit crises to acquire more
          | emergency powers, increasingly large numbers of people's threat
         | models begin to include government.
         | 
         | The best hopes against a population-wide Chinese-style social
         | credit system being implemented in the US remain constitutional
         | and cultural, but the more architectural help we get from
         | technology the better. "Code is law" is still a valid
         | observation.
        
         | [deleted]
        
         | tablespoon wrote:
         | > Think of it this way: If you want to hide from companies,
         | choose Apple. If you want to hide from the US Government,
         | choose open source.
         | 
         | It's not just the US government: they've been cooperating with
         | the PRC government as well (e.g. iCloud in China runs on
         | servers owned by a state-owned company, and apparently China
         | rejected the HSM Apple was using elsewhere, so they designed
         | one specifically for China). Apple has some deniability there,
         | but I personally wouldn't be surprised if China could get any
         | data from them that it wanted.
         | 
         | https://www.nytimes.com/2021/05/17/technology/apple-china-ce...
        
       | tomxor wrote:
        | I keep thinking it's like they are _trying_ to be the most
        | ironic company in history...
       | 
       | But then I have to remind myself, the old Apple is long gone, the
       | new Apple is a completely different beast, with a very different
       | concept of what it is marketing.
        
         | amelius wrote:
         | It's the RDF. People still think of Apple as the Old Apple. The
         | rebellious company that stood for creative freedom. The maker
         | of tools that work _for_ the user, not _against_ the user.
        
       | endisneigh wrote:
       | Unless the entire stack you're using is audited and open source
       | this sort of thing is inevitable.
       | 
       | As far as this is concerned, seems like if you don't use iMessage
       | or iCloud you're safe for now.
        
         | _red wrote:
         | >don't use iMessage
         | 
          | 1. Send someone you hate a message with a cartoon making fun
          | of the tyrant-president.
          | 
          | 2. That person is now on a list.
          | 
          | It's swatting-as-a-service.
        
           | ezfe wrote:
            | If you read the article, you'd understand that among ALL the
            | issues, this is not one:
            | 
            | - Photo scanning in Messages is on-device only (no reporting
            | to govt.) and doesn't turn on unless you're an adult who
            | turns it on for a minor via Family Sharing controls.
            | 
            | - iCloud Photos scanning doesn't take effect unless you save
            | the photo and it's already in a database of flagged photos.
            | So in your scenario, you'd have to save the photo received
            | from the unknown number to get flagged.
        
             | _red wrote:
             | >So in your scenario, you'd have to save the photo received
             | from the unknown number to get flagged.
             | 
             | Whew! I was worried there for a minute. Maybe for extra
             | safety I could say "SIRI I DISAVOW OF THIS MESSAGE!"??
        
               | bingidingi wrote:
               | would you not report unsolicited child porn to the FBI
               | anyway?
        
               | samatman wrote:
               | Y'know, I have no idea what I'd do in this situation and
               | I really hope I'll never find out.
               | 
               | If a kilo of heroin just showed up in the back seat of my
               | car, I'd throw it out the window and try not to think
               | about it. I certainly wouldn't bring it to the police,
               | because _mere possession is a serious crime_.
               | 
               | CP is the same way, except it comes with a nice audit
               | trail which could sink me even if I delete it
               | immediately. Do I risk that, or do I risk the FBI
               | deciding I'm a Person of Interest because I reported the
               | incident in good faith?
               | 
               | There are no good choices there.
        
               | tsimionescu wrote:
                | The scan doesn't detect child porn, it detects photos in
                | the CSAM database. The two may or may not be the same
                | thing, now or in the future.
        
             | lijogdfljk wrote:
             | I'm confused - the article explicitly states this scenario
             | - minus the swatting.
             | 
              | I.e., unless you're replying purely to the swatting part,
              | the article seems to support this. Specifically, a
              | prediction that governments will creep toward legally
              | requiring Apple to push custom classifiers:
             | 
             | > Apple's changes would enable such screening, takedown,
             | and reporting in its end-to-end messaging. The abuse cases
             | are easy to imagine: governments that outlaw homosexuality
             | might require the classifier to be trained to restrict
             | apparent LGBTQ+ content, or an authoritarian regime might
             | demand the classifier be able to spot popular satirical
             | images or protest flyers.
        
               | ezfe wrote:
                | That sentence is wrong. It simply isn't an accurate
                | description of the current system. It relies on future
                | changes to the system, not just changes to a database.
                | 
                | The iMessage feature is not a database comparison system;
                | it's there to keep kids from receiving nudes unexpectedly
                | - and it works by classifying those images.
                | 
                | I don't dispute this is a slippery slope - one could
                | imagine a government requiring Apple to modify its
                | classification system. However, that would presumably
                | require a software update, since it happens on device.
        
               | xondono wrote:
                | That refers to the iCloud scanning, the idea being that
                | if the hash database contains propaganda, people
                | uploading that propaganda to iCloud could get reported by
                | their own devices.
        
             | arihant wrote:
              | Didn't Apple also announce a feature for iOS 15 where
              | iMessage photos are somehow automatically collected and
              | shown in iCloud? A way to reduce the hassle of creating
              | shared albums. So with that, I think all users of iCloud
              | Photos are at risk here.
        
         | ncw96 wrote:
         | > As far as this is concerned, seems like if you don't use
         | iMessage or iCloud you're safe for now.
         | 
         | Yes, this is correct. The Messages feature only applies to
         | children under 18 who are in an iCloud Family, and the photo
         | library feature only applies if you are using iCloud Photos.
        
           | withinboredom wrote:
           | I'm fairly certain the age is different per region and
           | hopefully tied to the age of consent (in this particular
           | case).
        
             | rootusrootus wrote:
             | I don't think it has anything to do with age. It has
             | everything to do with you adding the phone to your family
             | under settings and declaring that it belongs to a child.
             | You control the definition of child.
        
               | jdavis703 wrote:
               | I could imagine an abusive partner enabling this to make
               | sure their partner isn't sexting other people. Given the
               | pushback for AirTags I'm surprised people aren't more
               | concerned.
        
               | endisneigh wrote:
               | You're misunderstanding what this is if this is an actual
               | concern of yours.
        
               | jdavis703 wrote:
               | I'm not sure I'm misunderstanding. This is another
               | feature that allows someone with access to another
               | person's phone to enable stalkerware like features.
        
               | rootusrootus wrote:
               | Anyone 13 or older can remove themselves from a family
               | sharing group. The only exception is if screen time is
               | enabled and enforced for their device.
               | 
               | Frankly, if you have an abusive partner with physical
               | control over you and a willingness to do this, the fact
               | that Apple supports this technology is the _least_ of
               | your problems.
        
               | xondono wrote:
                | Except this would require the consent of the abused
                | partner, at account creation, to set an age under 13.
                | 
                | You can't set this on other accounts in your family
                | remotely.
        
           | josh_today wrote:
           | Would artificially inflating every child's age to 18+
            | eliminate the iMessage problem?
        
         | [deleted]
        
       | edison112358 wrote:
       | "This means that when the features are rolled out, a version of
       | the NCMEC CSAM database will be uploaded onto every single
       | iPhone."
       | 
       | So every iPhone will now host the explicit images from the
       | National Center for Missing & Exploited Children database.
        
         | spiznnx wrote:
         | The database contains perceptual hashes, not images.
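          | 
          | For anyone curious what that means, here is a minimal
          | "average hash" sketch in Python. (Apple's NeuralHash is a
          | learned embedding, not this algorithm; this only shows the
          | general idea of a fingerprint that survives resizing and
          | re-encoding.)
          | 
          |     from PIL import Image  # assumes Pillow is installed
          | 
          |     def average_hash(path, size=8):
          |         # downscale to 8x8 grayscale, threshold at mean
          |         img = Image.open(path).convert("L")
          |         img = img.resize((size, size))
          |         px = list(img.getdata())
          |         mean = sum(px) / len(px)
          |         bits = 0
          |         for p in px:
          |             bits = (bits << 1) | (p > mean)
          |         return bits  # 64-bit fingerprint
          | 
          |     def hamming(a, b):
          |         return bin(a ^ b).count("1")
          | 
          |     # visually similar images land a few bits apart;
          |     # a small Hamming distance is treated as a match
          | 
          | Unlike a cryptographic hash, flipping one pixel barely moves
          | the output - which is the point, and also why collisions are
          | a live question.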
        
         | pgoggijr wrote:
         | No, they will host the hashes computed from those images.
        
         | hartator wrote:
         | Yes, everyone in jail! It's probably just the md5 or something
         | like that, but I don't like it either.
        
         | joshstrange wrote:
         | > So every iPhone will now host the explicit images from the
         | National Center for Missing & Exploited Children database.
         | 
         | It's hashes, not the images themselves.
        
           | cblconfederate wrote:
           | And how did the user end up with the hashes? He hashed the
           | original images which he then deleted, your honor!
           | 
           | BTW this is going to be a major target for smearing people
            | that the US doesn't like.
        
             | joshstrange wrote:
              | I'm sorry but this is the most ridiculous thing I've read
              | today. Hashes have never been, and probably never will be,
              | used to "smear" someone the US doesn't like. We can
              | speculate about them planting evidence, but trying to
              | prosecute based on hashes baked into the OS used by
              | millions? That's absurd.
        
         | kevincox wrote:
         | I'm pretty sure this is a non-tech way of saying "a machine
         | learning model" or other parameters which is not a particularly
         | useful form of this database.
        
         | artimaeis wrote:
         | > No user receives any CSAM photo, not even in encrypted form.
         | Users receive a data structure of blinded fingerprints of
         | photos in the CSAM database. Users cannot recover these
         | fingerprints and therefore cannot use them to identify which
         | photos are in the CSAM database.
         | 
         | Source (PDF): https://www.apple.com/child-
         | safety/pdf/Technical_Assessment_...
        
         | zionic wrote:
         | How long until a hacker uses ML to generate collisions against
         | those hashes?
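          | 
          | You may not even need ML against a weak perceptual hash; a
          | blind hill-climb can do it. A toy demo against a simple 8x8
          | average hash (a learned hash like NeuralHash would need a
          | gradient-style attack, but the principle is the same):
          | 
          |     import random
          | 
          |     def ahash(px):
          |         # 1 bit per pixel: above/below the mean
          |         m = sum(px) / len(px)
          |         return [int(p > m) for p in px]
          | 
          |     def dist(a, b):
          |         return sum(x != y for x, y in zip(a, b))
          | 
          |     target = [random.randint(0, 1) for _ in range(64)]
          |     img = [random.randint(0, 255) for _ in range(64)]
          |     best = dist(ahash(img), target)
          |     for _ in range(200_000):
          |         if best == 0:
          |             break
          |         i = random.randrange(64)
          |         old = img[i]
          |         img[i] = random.randint(0, 255)
          |         d = dist(ahash(img), target)
          |         if d <= best:
          |             best = d      # keep the improvement
          |         else:
          |             img[i] = old  # revert
          |     print(best)  # usually reaches 0 quickly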
        
           | outworlder wrote:
           | For what purpose? A collision doesn't mean that you found the
           | source images. Not even close.
        
             | __david__ wrote:
             | Find collisions, spam the colliding photos to people you
             | don't like, watch the mayhem unfold.
        
             | acdha wrote:
             | With a broader rollout to all accounts and simply scanning
              | in iMessage rather than photos, there's one possible
             | scenario if you could generate images which were plausibly
             | real photos: spam them to someone before an election, let
             | friendly law enforcement talk about the investigation, and
             | let them discover how hard it is to prove that you didn't
             | delete the original image which was used to generate the
             | fingerprint. Variations abound: target that teacher who
             | gave you a bad grade, etc. The idea would be credibility
             | laundering: "Apple flagged their phone" sounds more like
             | there's something there than, say, a leak to the tabloids
             | or a police investigation run by a political rival.
             | 
             | This is technically possible now but requires you to
             | actually have access to seriously illegal material. A
             | feasible collision process would make it a lot easier for
             | someone to avoid having something which could directly
             | result in a jail sentence.
        
             | octopoc wrote:
             | So you can upload the colliding images to iCloud and get
             | yourself reported for having child porn. Then after the law
             | comes down on you, you can prove that you didn't ever have
             | child porn. And you can sue Apple for libel, falsely
             | reporting a crime, whatever else they did. It would be a
             | clever bit of tech activism.
        
             | bingidingi wrote:
             | I guess in theory you could poison the well by widely
             | sharing many false positives?
        
             | ursugardaddy wrote:
              | Improved swatting; it's all going to make for a badass
              | October surprise next election.
        
           | bjustin wrote:
            | There is a minimum number of hash matches required; then
            | images are made available to Apple, which manually checks
            | that they are CSAM material and not just collisions. That's
            | what the 9to5Mac story about this says:
           | https://9to5mac.com/2021/08/05/apple-announces-new-
           | protectio...
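            | 
            | Worth noting the threshold is cryptographic, not just a
            | counter: per Apple's summary, each match contributes a
            | share of a decryption key, so nothing can be reviewed
            | until enough matches accumulate. A toy Shamir-style
            | sketch of that idea (illustration only, not Apple's
            | construction):
            | 
            |     import random
            | 
            |     P = 2**61 - 1  # toy prime field
            | 
            |     def make_shares(secret, t, n):
            |         # random degree t-1 polynomial, f(0) = secret
            |         cs = [secret] + [random.randrange(P)
            |                          for _ in range(t - 1)]
            |         def f(x):
            |             acc = 0
            |             for c in reversed(cs):
            |                 acc = (acc * x + c) % P
            |             return acc
            |         return [(x, f(x)) for x in range(1, n + 1)]
            | 
            |     def recover(shares):
            |         # Lagrange interpolation at x = 0
            |         s = 0
            |         for xi, yi in shares:
            |             num = den = 1
            |             for xj, _ in shares:
            |                 if xj != xi:
            |                     num = num * -xj % P
            |                     den = den * (xi - xj) % P
            |             s = (s + yi * num * pow(den, -1, P)) % P
            |         return s
            | 
            |     shares = make_shares(42, t=3, n=10)
            |     print(recover(shares[:3]))        # 42
            |     print(recover(shares[:2]) == 42)  # False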
        
       | strogonoff wrote:
       | If Mallory gets a lawful citizen Bob to download a completely
       | innocuous looking but perceptual-CSAM-hash-matching image to his
       | phone, what happens to Bob? I imagine the following options:
       | 
       | - Bob's info is sent to law enforcement; Bob is swatted or his
       | life is destroyed in some other way. Worst, but most likely
       | outcome.
       | 
        | - An Apple employee (or an outsourced contractor) reviews the
        | photo, comparing it to the CSAM source image used for the hash.
        | Only if the image matches according to human vision is Bob
        | swatted. This requires there to be some sort of database of CSAM
        | source images.
        | 
        | - An Apple employee or a contractor reviews the image for abuse
        | without comparing it to the CSAM source, using their own
        | subjective judgement.
        
       | literallyaduck wrote:
        | It is okay to use the back door when we want to find people:
        | 
        | who are terrorists
        | 
        | who exploit children
        | 
        | who are not vaccinated
        | 
        | who use the wrong politically correct language
        | 
        | who do anything else we don't like
        
       | Drblessing wrote:
        | Use Signal, y'all.
        
       | slaymaker1907 wrote:
        | I'd be surprised if this goes through as-is, since you can't
        | just save this stuff indefinitely. Suppose a 14-year-old sexts a
        | 12-year-old. That is technically child porn, and so retention is
        | often illegal.
        
       | outworlder wrote:
       | > these notifications give the sense that Apple is watching over
       | the user's shoulder--and in the case of under-13s, that's
       | essentially what Apple has given parents the ability to do.
       | 
       | Well, yes? Parents are already legally responsible for their
       | young children and under their supervision. The alternative would
        | be to not even give such young children these kinds of devices to
       | begin with - which might actually be preferable.
       | 
       | > this system will give parents who do not have the best
       | interests of their children in mind one more way to monitor and
       | control them
       | 
       | True. But the ability to send or receive explicit images would
       | most likely not be the biggest issue they would be facing.
       | 
       | I understand the slippery slope argument the EFF is making, but
       | they should keep to the government angle. Having the ability for
       | governments to deploy specific machine learning classifiers is
       | not a good thing.
        
       ___________________________________________________________________