[HN Gopher] Apple clarifies why it abandoned plan to detect CSAM...
       ___________________________________________________________________
        
       Apple clarifies why it abandoned plan to detect CSAM in iCloud
       photos
        
       Author : Anonboxis
       Score  : 98 points
       Date   : 2023-09-01 10:23 UTC (12 hours ago)
        
 (HTM) web link (www.wired.com)
 (TXT) w3m dump (www.wired.com)
        
       | sneak wrote:
        | The vast majority (99%+) of iCloud Photos are not e2ee and
        | are readable by Apple.
       | 
       | You can rest assured that they are scanning all of it serverside
       | for illegal images presently.
       | 
        | The kerfuffle was around clientside scanning, which has
        | reportedly been dropped. I have thus far seen no statement
        | from Apple that they actually intended to stop the
        | deployment of clientside scanning.
       | 
        | Serverside scanning has been possible (and likely) for a
        | long time, which exposes their "slippery slope" argument as
        | farce (unless they intend to force-migrate everyone to e2ee
        | storage in the future).
        
         | h1fra wrote:
         | Where do you get this number?
        
           | sneak wrote:
            | e2ee for iCloud is currently opt-in, without prompts or
            | nudging. Most power users don't have it turned on, or
            | aren't even aware of its existence. The setting is
            | buried in submenus.
            | 
            | Approximately no one uses it.
            | 
            | Hopefully Apple will begin prompting users to migrate in
            | future updates.
        
             | h1fra wrote:
             | Indeed. After looking at the documentation, photos are not
             | e2ee by default.
             | 
             | https://support.apple.com/en-us/HT202303
        
             | 0x000042 wrote:
             | > The setting is buried/hidden in submenus.
             | 
             | Mind sharing where it is on an iPhone and Mac? I have not
             | been able to find it.
        
               | michaelt wrote:
               | It's called "Advanced Data Protection for iCloud"
               | 
               | It's kinda complicated to turn on, as it disables most
               | account recovery options.
        
               | JTyQZSnP3cQGa8B wrote:
               | It was announced some time ago, but remember that you
               | need modern devices on both sides to enable that feature.
        
               | smilespray wrote:
               | How to turn on Advanced Data Protection for iCloud
               | 
               | https://support.apple.com/en-gb/HT212520
        
             | Gigachad wrote:
             | It only just came out this year and it comes with some
             | pretty serious UX issues around potentially getting locked
             | out of your data.
        
         | maxhille wrote:
          | Apple has full control over their customers' devices, so
          | they can access all encryption keys and device-local files
          | anyway. That e2ee setting seems pretty pointless to me...
        
           | kylehotchkiss wrote:
            | You can enable device wipe after 10 wrong passcodes, and
            | E2EE gives Apple pretty broad cover to deny government
            | requests for your data. The appeal of that for me isn't
            | the US government (which has easy access to everything
            | else about you), but other governments around the world
            | with worse human rights records. It's terrifying that,
            | while you're traveling, a policeman could make something
            | up and get details about your home life they aren't
            | entitled to.
        
       | Simulacra wrote:
        | I haven't forgotten about the guy who sent photos of his
        | child to his doctor and was investigated for child
        | pornography. With these systems, in my humble opinion, you
        | are just one innocent photo at the beach away from having
        | your life turned upside down.
        
         | tjpnz wrote:
         | And Google to this day refuse to admit the mistake. They've
         | even gone as far as to insinuate that he still is a pedo
         | despite a police investigation clearing him.
        
       | tabeth wrote:
       | I'm not sure I understand Apple's logic here. Are iCloud Photos
       | in their data centers not scanned? Isn't everything by default
       | for iCloud users sent there automatically to begin with? Doesn't
       | the same logic around slippery slope also apply to cloud scans?
       | 
        | This is not to say they should scan locally, but my
        | understanding of the CSAM proposal was that photos would
        | only be scanned on their way to the cloud anyway, so users
        | who didn't use iCloud would never have been scanned to begin
        | with.
       | 
       | Their new proposed set of tools seems like a good enough
       | compromise from the original proposal in any case.
        
         | Moldoteck wrote:
          | > so users who didn't use iCloud would never have been
          | > scanned to begin with.
          | 
          | So why not implement CSAM scanning for iCloud only,
          | without local scanning?
        
           | Gigachad wrote:
            | Because the idea is that the iCloud data would be
            | encrypted so their servers couldn't scan it. The plan
            | was to do on-device scanning of photos that were marked
            | as being stored on iCloud.
            | 
            | It's objectively better than what Google does, but I'm
            | glad we somehow ended up with no scanning at all.
        
             | Moldoteck wrote:
              | That sounds strange; I'm not sure what the big
              | difference is. If data is scanned on iCloud, that
              | means it's not encrypted, got it. If it's scanned on
              | devices, the data is fully encrypted on iCloud, but
              | Apple has access by scanning it on-device and can send
              | unencrypted matches, so it behaves like an unencrypted
              | system that can be altered at Apple's will, just like
              | iCloud... but still, why scan locally only if iCloud
              | is enabled? Why not scan regardless? Since the policy
              | is meant to "catch bad ppl", why limit it to the
              | iCloud option and not scan all the time?
        
         | matwood wrote:
          | You are correct: the original method would only have
          | scanned items destined for iCloud and only transmitted
          | information about matching hashes. And yes, similar
          | slippery-slope arguments apply to any provider that stores
          | images unencrypted. They are all scanned today, and we
          | have no idea what they are matched against.
          | 
          | I speculated (and now we know) when this new scanning was
          | announced that it was in preparation for full E2EE. Apple
          | came up with a privacy-preserving method of trying to keep
          | CSAM off their servers while also offering E2EE.
          | 
          | The larger community's arguments swayed Apple away from
          | their new detection method, but did not stop them from
          | moving forward with E2EE. At the end of the day they put
          | the responsibility back on governments to pass laws around
          | encryption - where it should be, though we may not like
          | the outcome.
        
         | theshrike79 wrote:
          | In my opinion their goal was to get things to a state
          | where they could encrypt everything on iCloud so that even
          | they can't access it.
          | 
          | To counter the "think of the children" argument
          | governments use to justify surveillance, Apple tried
          | scanning stuff on-device, but the internet threw a
          | collective hissy-fit of intentionally misunderstanding the
          | feature and it was quickly scrapped.
        
           | Shank wrote:
           | > In my opinion their goal was to get stuff to a state where
           | they could encrypt everything on iCloud so that even they
           | can't access it.
           | 
            | They basically did. If you turn on Advanced Data
            | Protection, you get all of the encryption benefits, sans
            | scanning. The interesting thing is that if you turn on
            | ADP, binary file hashes remain unencrypted on iCloud,
            | which would theoretically allow someone to ask for those
            | hashes in a legal request. But they're obviously not as
            | useful for CSAM detection as, say, PhotoDNA hashes. See:
            | https://support.apple.com/en-us/HT202303
        
           | bryan_w wrote:
           | > so that even they can't access it.
           | 
           | > scanning stuff on-device
           | 
           | What do you think they were going to do once the scanning
           | turned up a hit? Access the photos? Well that negates the
           | first statement.
        
             | theshrike79 wrote:
             | Who is this "they" who will access the photos on-device?
        
         | turquoisevar wrote:
         | > Are iCloud Photos in their data centers not scanned?
         | 
          | No outright statement confirming or denying this has ever
          | been made, to my knowledge, but the implication, based
          | both on Apple's statements and those of stakeholders, is
          | that this isn't currently the case.
         | 
         | This might come as a surprise to some, because many companies
         | scan for CSAM, but that's done voluntarily because the
         | government can't force companies to scan for CSAM.
         | 
          | This is because, based on case law, companies forced to
          | scan for CSAM would be considered deputized, and the
          | scanning would thus be a breach of the 4th Amendment's
          | safeguards against "unreasonable search and seizure".
          | 
          | The best the government can do is force companies to
          | report "apparent violations" of CSAM laws. This seems like
          | a distinction without a difference, but the difference is
          | between being required to actively search for it (and thus
          | becoming deputized) v. reporting it when you come across
          | it.
         | 
         | Even then, the reporting requirement is constructed in such a
         | way as to avoid any possible 4th amendment issues. Companies
         | aren't required to report it to the DOJ, but rather to the
         | NCMEC.
         | 
         | The NCMEC is a semi-government organization, autonomous from
         | the DOJ, albeit almost wholly funded by the DOJ, and they are
         | the ones that subsequently report CSAM violations to the DOJ.
         | 
         | The NCMEC is also the organization that maintains the CSAM
         | database and provides the hashes that companies, who
         | voluntarily scan for CSAM, use.
         | 
          | This construction has proven to be pretty solid against
          | 4th Amendment challenges: courts have historically found
          | that this separation between companies and the DOJ, and
          | the fact that only confirmed CSAM makes its way to the DOJ
          | after review by the NCMEC, create enough distance between
          | the DOJ and the act of searching through a person's data
          | that there aren't any 4th Amendment concerns.
         | 
          | The Congressional Research Service did a write-up on this
          | last year for those who are interested[0].
         | 
         | Circling back to Apple, as it stands there's nothing indicating
         | that they already scan for CSAM server-side and most comments
         | both by Apple and child safety organizations seem to imply that
         | this in fact is currently not happening.
         | 
          | Apple's main concerns, however, as stated in Apple's
          | letter, echo the same concerns security experts raised
          | back when this was being discussed: namely, that it
          | creates a target for malicious actors, that it is
          | technically not feasible to create a system that can never
          | be reconfigured to scan for non-CSAM material, and that
          | governments could pressure/regulate Apple into
          | reconfiguring it for other materials as well (and place a
          | gag order on them, prohibiting them from informing users).
         | 
         | At the time, some of these arguments were brushed off as
         | slippery slope FUD, and then the UK started considering
         | something that would defy the limits of even the most cynical
         | security researcher's nightmare, namely a de facto ban on
         | security updates if it just so happens that the UK's
         | intelligence services and law enforcement services are
         | currently exploiting the security flaw that the update aims to
         | patch.
         | 
         | Which is what Apple references in their response.
         | 
         | 0: https://crsreports.congress.gov/product/pdf/LSB/LSB10713
        
         | no_time wrote:
         | > I'm not sure I understand apples logic here. Are iCloud
         | Photos in their data centers not scanned? Isn't everything by
         | default for iCloud users sent there automatically to begin
         | with? Doesn't the same logic around slippery slope also apply
         | to cloud scans?
         | 
         | I don't see the problem with this status quo. There is a clear
         | demarcation between my device and their server. Each serving
         | the interests of their owner. If I have a problem with their
         | policy, I can choose not to entrust my data to them. And
         | luckily, the data storage space has heaps of competitive
         | options.
        
       | beej71 wrote:
       | I think they likely also considered the lawsuit exposure. If just
       | 0.0001% of users sued over false positives, Apple would be in
       | serious trouble.
       | 
       | And there's another dynamic where telling your customers you're
       | going to scan their content for child porn is the same as saying
       | you suspect your customers of having child porn. And your average
       | non-criminal customer's reaction to that is not positive for
       | multiple reasons.
        
       | neonate wrote:
       | http://web.archive.org/web/20230901190025/https://www.wired....
       | 
       | https://archive.ph/HZVdd
        
       | gruturo wrote:
       | > "Scanning every user's privately stored iCloud data would
       | create new threat vectors for data thieves to find and exploit"
       | 
       | > "It would also inject the potential for a slippery slope of
       | unintended consequences. Scanning for one type of content, for
       | instance, opens the door for bulk surveillance and could create a
       | desire to search other encrypted messaging systems across content
       | types."
       | 
        | Yes, and it was patently obvious from the outset. Why did it
        | take a massive public backlash to actually reason about
        | this? Can we get a promise that future initiatives will be
        | evaluated a bit more critically before crap like this
        | bubbles to the top again? Come on, you DO hire bright
        | people; what's your actual problem here?
        
       | jmyeet wrote:
        | Part of the reason why this was (and is) a terrible idea is
        | how these companies operate and the cost and stigma of a
        | false positive.
       | 
       | Companies don't want to employ people. People are annoying. They
       | make annoying demands like wanting time off and having enough
       | money to not be homeless or starving. AI should be a tool that
       | enhances the productivity of a worker rather than replacing them.
       | 
        | Fully automated "safety" systems _always_ get weaponized.
        | This is really apparent on TikTok, where reporting users you
        | don't like is clearly brigaded, because a certain number of
        | reports in a given period triggers automatic takedowns and
        | bans regardless of assurances that there is human review
        | (there isn't). It's incredibly obvious when a duet with a
        | threatening video gets taken down while the original video
        | doesn't (with reports showing "No violation").
       | 
       | Additionally, companies like to just ban your account with
       | absolutely no explanation, accountability, right to review or
       | right to appeal. Again, all those things would require employing
       | people.
       | 
       | False positives can be incredibly damaging. Not only could this
       | result in your account being banned (possibly with the loss of
       | all your photos on something like iCloud/iPhotos) but it may get
       | you in trouble with law enforcement.
       | 
        | Don't believe me? Hertz falsely reported their cars as being
        | stolen [1], which created massive problems for those
        | affected. In a better world, Hertz executives would be in
        | prison for making false police reports (which, for you and
        | me, is a crime) but that will never happen to executives.
       | 
       | It still requires human review to identify offending content.
       | Mass shootings have been live streamed. No automatic system is
       | going to be able to accurately differentiate between this and,
       | say, a movie scene. I guarantee you any automated system will
       | have similar problems differentiating between actual CSAM and,
       | say, a child in the bath or at the beach.
       | 
       | These companies don't want to solve these problems. They simply
       | want legal and PR cover for appearing to solve them, consequences
       | be damned.
       | 
       | [1]: https://www.npr.org/2022/12/06/1140998674/hertz-false-
       | accusa...
        
       | [deleted]
        
       | menzoic wrote:
        | Pretty ridiculous idea. Bad actors simply won't use the
        | platform if this is in place. It would only end up scanning
        | private data from all the people who aren't committing
        | crimes.
        
         | stuartjohnson12 wrote:
          | You'd be surprised. Lots of offenders are very low
          | sophistication. If you read news articles about how a
          | particular offender was caught with illegal material, very
          | often it's because they uploaded it to a cloud provider.
          | It's not a one-sided tradeoff here.
        
           | akira2501 wrote:
           | What percentage of offenders victimize children and never
           | record it in any way? If that's the overwhelming majority of
           | abuse cases, what are we even doing here?
        
         | [deleted]
        
         | [deleted]
        
         | [deleted]
        
       | formerly_proven wrote:
       | The who is often interesting with these stories.
       | 
       | > a new child safety group known as Heat Initiative
       | 
       | Doesn't even have a website or any kind of social media presence;
       | it literally doesn't appear to exist apart from the reporting on
       | Apple's response to them, which is entirely based on Apple
       | sharing their response with media, not the group interacting with
       | media.
       | 
       | > Sarah Gardner
       | 
       | on the other hand previously appeared as the VP of External
       | Affairs (i.e. Marketing) of Thorn (formerly DNA Foundation):
       | https://www.thorn.org/blog/searching-for-a-child-in-a-privat...
       | 
       | So despite looking a bit fishy at first, this doesn't seem to
       | come from a christofascist group.
        
         | figlett wrote:
         | > So despite looking a bit fishy at first, this doesn't seem to
         | come from a christofascist group.
         | 
         | Why would you assume this in the first place?
        
           | bsenftner wrote:
           | They use hysteria to generate power in society.
        
           | krapp wrote:
            | The main impetus behind "child safety" advocacy nowadays
            | seems to come from cells of extremist right-wing
            | Christian / QAnon types who believe in conspiracy
            | theories like Pizzagate and the "gay groomer" panic.
            | It's a reasonable assumption to make about any such
            | group mentioned in the media that doesn't have an
            | established history at least prior to 2016.
        
             | figlett wrote:
             | It sounds like an entirely unreasonable assumption to me.
             | Advocating for child safety is something that transcends
             | political differences, and generally unifies people across
             | the political spectrum.
             | 
             | I mean, there aren't many people who want paedophiles to be
             | able to amass huge collections of child abuse imagery from
             | other paedophiles online. And pretty much every parent
             | wants their child to be kept safe from predators both
             | online and offline.
        
               | krapp wrote:
               | I didn't claim otherwise. The fact remains that a
               | specific subset of a specific political party has been
               | using "advocating for child safety" as a pretext to
               | accelerate fear of and harassment against the LGBT
               | community and "the left" in general for years now, and
               | they put a lot of effort into appearing legitimate.
               | 
               | And yes, because their politics are becoming normalized
               | within American culture, it is necessary to be skeptical
               | about references to any such group. Assuming good faith
               | is a rule on HN but elsewhere, where bad faith is what
               | gets visibility, it's naive.
        
               | figlett wrote:
               | Well, paedophiles hijacking leftist movements for their
               | own ends is a known problem, it's happened before and it
               | will happen again. One particularly infamous instance
               | occurred in the UK back in the 1970s:
               | 
               | https://www.theguardian.com/politics/2014/mar/02/how-
               | paedoph...
               | 
               | So if there are indeed some right-wing groups talking
               | about this, maybe it's best not to brush off their claims
               | without some scrutiny first. And I say this as someone
               | who mostly agrees with the left on most things.
               | 
               | Anyway I don't think that any of this has much to do with
               | Apple being asked to implement specific technical
               | measures for detecting child abuse imagery.
        
               | formerly_proven wrote:
               | > So if there are indeed some right-wing groups talking
               | about this, maybe it's best not to brush off their claims
               | without some scrutiny first. And I say this as someone
               | who mostly agrees with the left on most things.
               | 
               | Figlet indeed.
               | 
               | > figlett 5 months ago [flagged] [dead] | parent |
               | context | prev [-] | on: Florida courts could take
               | 'emergency' custody of k...
               | 
               | > This is excellent news for children at risk of being
               | abused by militant transgenders and the medical
               | establishment who are enabling them. Thank you Florida
               | for trying to put an end to this menace.
               | 
               | https://news.ycombinator.com/item?id=35029166
        
               | figlett wrote:
               | Exactly, this is one area where the political left,
               | particularly in the US, are failing terribly on child
               | safety.
               | 
               | I'm in the UK and we're doing better here though, the
               | main left-wing party is backing away from the particular
               | ideology that has enabled this. I was going to vote for
               | them anyway as we desperately need our public services to
               | be restored and welfare for those less fortunate in
               | society to be improved, but I'm pleased they're moving
               | towards a sensible, harm-reducing stance on this issue
               | rather than assuming everything the gender activists say
               | is reasonable.
        
           | passwordoops wrote:
           | [flagged]
        
             | PrimeMcFly wrote:
             | [flagged]
        
           | tomjen3 wrote:
            | Because when they couldn't win the war on porn, some
            | right-wing Christians decided to cloak their attack in
            | "concerns" about "abuse". See project Excedus. Of course
            | it has nothing to do with abuse and everything to do
            | with their attempts to keep people from seeing pixels of
            | other people having sex. Backpage was shut down despite
            | being good at removing underage and trafficked women -
            | which meant that sex workers had to find other places
            | that didn't have nearly as good protections.
           | 
           | So yeah. When these things pop up I assume malicious intent.
        
             | figlett wrote:
             | But being critical of pornography and considering it to be
             | abuse isn't a view limited to right-wing Christians. For
             | example, here's what Noam Chomsky has to say about it:
             | 
              | > _Pornography is humiliation and degradation of
              | women. It's a disgraceful activity. I don't want to be
              | associated with it. Just take a look at the pictures.
              | I mean, women are degraded as vulgar sex objects.
              | That's not what human beings are. I don't even see
              | anything to discuss._
              | 
              | > Interviewer: But didn't performers choose to do the
              | job and get paid?
              | 
              | > _The fact that people agree to it and are paid is
              | about as convincing as the fact that we should be in
              | favour of sweatshops in China, where women are locked
              | into a factory and work fifteen hours a day, and then
              | the factory burns down and they all die. Yeah, they
              | were paid and they consented, but it doesn't make me
              | in favour of it, so that argument we can't even talk
              | about._
              | 
              | > _As for the fact that it's some people's erotica,
              | well you know that's their problem, doesn't mean I
              | have to contribute to it. If they get enjoyment out of
              | humiliation of women, they have a problem, but it's
              | nothing I want to contribute to._
              | 
              | > Interviewer: How should we improve the production
              | conditions of pornography?
              | 
              | > _By eliminating degradation of women, that would
              | improve it. Just like child abuse, you don't want to
              | make it better child abuse, you want to stop child
              | abuse._
              | 
              | > _Suppose there's a starving child in the slums, and
              | you say "well, I'll give you food if you'll let me
              | abuse you." Suppose - well, there happen to be laws
              | against child abuse, fortunately - but suppose someone
              | were to give you an argument. Well, you know, after
              | all a child's starving otherwise, so you're taking
              | away their chance to get some food if you ban abuse. I
              | mean, is that an argument?_
              | 
              | > _The answer to that is stop the conditions in which
              | the child is starving, and the same is true here.
              | Eliminate the conditions in which women can't get
              | decent jobs, not permit abusive and destructive
              | behaviour._
             | 
             | (Source of the above is this interview:
             | https://youtube.com/watch?v=SNlRoaFTHuE)
        
       | baz00 wrote:
       | It's nice that Apple have clarified this. I think that the
       | original intent was a misstep and possibly an internal political
       | situation that they had to deal with. I can see that a number of
       | people would be on each side of the debate with advocacy
       | throughout the org.
       | 
       | There is only one correct answer though and that is what they
       | have clarified.
       | 
       | I would immediately leave the platform if they progressed with
       | this.
        
       | gnfargbl wrote:
       | _> "Scanning every user's privately stored iCloud data would
       | create new threat vectors for data thieves to find and exploit, "
       | Neuenschwander wrote. "It would also inject the potential for a
       | slippery slope of unintended consequences. Scanning for one type
       | of content, for instance, opens the door for bulk surveillance
       | and could create a desire to search other encrypted messaging
       | systems across content types."_
       | 
       | Both of these arguments are absolutely, unambiguously, correct.
       | 
        | The other side of the coin is that criminals are using E2EE
        | communication systems to share sexual abuse material in
        | ways, and at rates, that they were not previously able to.
        | This is, I argue, a bad thing. It is bad for the individuals
        | who are re-victimised on every share. It is also bad for the
        | fabric of society at large, in the sense that if we don't
        | clearly take a stand against abhorrent behaviour then we are
        | in some sense condoning it.
       | 
       | Does the tech industry have any alternate solutions that could
       | functionally mitigate this abuse? Does the industry feel that it
       | has any responsibility at all to do so? Or do we all just shout
       | "yay, individual freedom wins again!" and forget about the actual
       | problem that this (misguided) initiative was originally aimed at?
        
         | PrimeMcFly wrote:
          | It's an _incredibly_ bad thing. It's also an incredibly
          | poor excuse to justify backdooring phones.
         | 
         | Cops need to investigate the same way they always have, look
         | for clues, go undercover, infiltrate, find where this stuff is
         | actually being made, etc.
         | 
         | Scanning everyone's phones would make their jobs significantly
         | easier, no doubt, but it simply isn't worth the cost to us as a
         | society and there is simply no good counter-argument to that.
        
           | [deleted]
        
           | rob74 wrote:
            | If CSAM distribution still worked the way it "always has
            | been" done, then "cops" relying on the methods they've
            | always had would be a valid answer. But since tech has
            | enabled the distribution of CSAM at unprecedented
            | scales, I think the requests by law enforcement to also
            | make their job a bit easier have some merit...
        
             | danaris wrote:
             | The technology has changed distribution, yes.
             | 
              | It hasn't particularly changed production, which is
              | where the actual abuse happens. There are still
              | _actual_ people abusing and filming _actual_ children,
              | and those can be found by the police with the same
              | old-fashioned methods they've always had available.
              | (Plus many new ones that _don't_ violate everyone's
              | civil liberties or destroy the security of every
              | networked device.)
        
             | PrimeMcFly wrote:
             | I don't.
             | 
             | They can find those materials the same way abusers work
             | their way into communities to have access to them in the
             | first place.
             | 
             | The increased scale only means they need more people
             | working on it.
        
               | rob74 wrote:
               | More people. Paid from taxes. Ok, let's increase taxes!
               | Oh, wait...
        
               | PrimeMcFly wrote:
                | Well yes, let's absolutely increase taxes for the
                | ultra-wealthy, who don't pay nearly enough. I don't
                | see the problem.
                | 
                | That aside, we could also, at least in the US, stop
                | taking such a ridiculous stance against drugs and
                | instead prioritize finding the makers of CSAM.
                | 
                | The people are there, they are just not being
                | utilized effectively.
        
           | theshrike79 wrote:
           | Let's take a step back here and bring in some facts.
           | 
           | "Apple" wasn't scanning your phone, neither was there a
           | "backdoor".
           | 
            | If you had iCloud upload enabled (meaning you'd be
            | uploading all your photos to Apple's servers, a place
            | where they could scan ALL of your media anyway), the
            | phone would've downloaded a set of hashes of KNOWN and
            | HUMAN-VERIFIED photos and videos of sexual abuse
            | material. [1]
            | 
            | After THREE matches of _known and checked_ CSAM, a check
            | done 100% on-device with zero data moving anywhere, a
            | "reduced-quality copy" would've been sent to a human for
            | verification. If someone sent you hashbombs of
            | intentional false matches, or an innocuous pic matched
            | because of some mathematical anomaly, the actual human
            | would notice this instantly and no action would've been
            | taken.
            | 
            | ...but I still think I was the only HNer who actually
            | read Apple's spec instead of just going with Twitter
            | hot-takes, so I'm tilting at windmills over here.
           | 
            | Yes, there is always the risk that an authoritarian
            | government could force Apple to insert checks for stuff
            | other than CSAM into the downloaded database. But the
            | exact same risk exists when you upload stuff to the
            | cloud anyway, and on an even bigger scale. (See the
            | point above about local checks not being enabled unless
            | iCloud sync is enabled.)
           | 
            | [1] It wasn't a SHA-1-style hash where changing a single
            | bit in the source would break the match; the people
            | designing it were actually competent.
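            | 
            | A minimal sketch of that threshold flow in Python
            | (hypothetical digests and database; the real design used
            | NeuralHash with private set intersection and threshold
            | secret sharing, not a plain counter like this):
            | 
            |     import hashlib
            | 
            |     THRESHOLD = 3  # matches needed before human review
            | 
            |     # Hypothetical stand-in for the downloaded database
            |     # of known, human-verified image digests.
            |     KNOWN_DIGESTS = {"2c26b46b68ffc68f..."}
            | 
            |     def digest(photo: bytes) -> str:
            |         return hashlib.sha256(photo).hexdigest()
            | 
            |     def photos_for_review(photos):
            |         """Flag photos only once matches reach THRESHOLD."""
            |         matches = [p for p in photos
            |                    if digest(p) in KNOWN_DIGESTS]
            |         return matches if len(matches) >= THRESHOLD else []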
        
             | PrimeMcFly wrote:
             | > "Apple" wasn't scanning your phone, neither was there a
             | "backdoor".
             | 
              | Yes, you're right. But I've seen calls for phones to
              | scan all content before it's uploaded or encrypted,
              | and it often feels, at least in some countries, like
              | that could still plausibly happen. I suppose that's
              | what I had in mind when I wrote my comment.
             | 
             | > But the exact same risk exists when you upload stuff to
             | the cloud anyway and on an even bigger scale.
             | 
              | There's a difference between them actively doing it
              | and announcing it, vs. the possibility that they are
              | doing it silently without consent.
        
             | taikahessu wrote:
              | You might've read the spec, but you're missing the
              | point and your approach is naive. For me it's about
              | crossing the line. If you want to be snooping around
              | my phone or my house, you need a warrant and must go
              | through the official channels provided by my gov
              | officials. And you really think it's as simple as
              | picking apples from oranges? I mean, come on. Yes,
              | it's easy to implement a hash check to see if you have
              | some known child porn in your cloud. But was that the
              | use case for the advocates? No. Their use case was to
              | try to find abuse, and that would need more thorough
              | scanning. And once we're there, we have to make hard
              | decisions on what is porn or abuse. If you think
              | that's easy, you need to think it through harder.
              | Think of some picture from a sauna with a naked family
              | in it: might be harmful? But normal here in Finland.
              | What about a stick-figure cartoon that depicts some
              | shady sexual positions with a smaller child-like
              | figure in it? Or what about grooming, asking for naked
              | pics? How is this system going to prevent that? I
              | mean, I get why ppl would want something like this.
              | But it isn't the right solution imho.
        
             | Moldoteck wrote:
              | I thought they can't scan the media in iCloud, since
              | the media is encrypted, no?
              | 
              | Also: "If it was someone sending you hashbombs of
              | intentional false matches or an innocuous pic that
              | matched because of some mathematical anomaly, the
              | actual human would notice this instantly and no action
              | would've been taken." If someone is doing this,
              | imagine the scale: thousands of pics that should be
              | human-evaluated, scaled to thousands of people. It'll
              | just be plain ignored, meaning the system loses its
              | purpose. Also, you say it'll be enabled only if iCloud
              | backup is enabled, but that's not guaranteed; this
              | assumption can later change. And it doesn't make
              | sense; for me your two statements contradict each
              | other:
              | 
              | - If Apple can scan your photos in iCloud AND, for
              | this feature to be enabled, you must enable iCloud,
              | why should they send hashes to you? They can scan the
              | photos anyway in iCloud, since all your photos are
              | backed up there. Unless... they can't scan photos in
              | iCloud since these are encrypted, meaning scanning can
              | be done only locally before photos are sent, meaning
              | iCloud enabling is not mandatory and it could work
              | without it.
              | 
              | Either way the CSAM scanning is imo pointless: on one
              | hand because of privacy reasons (and we've seen that
              | if a state is able to use a backdoor, it'll use it
              | when needed), and on the other hand because of
              | generative algorithms: photos can be manipulated to
              | trigger CSAM matches even if the human eye sees
              | another thing (aka a hashbomb), OR a sick/ill-
              | intentioned person can generate a legit-looking CSAM
              | photo just by using a target person's face (or a
              | description of their face). In this case I don't even
              | know if they are breaking the law or not, since the
              | image is totally generated but looks totally illegal.
        
               | theshrike79 wrote:
                | You do know that we currently have "thousands of
                | people" watching for and tagging the most heinous
                | shit people upload to social media, right? There are
                | multiple sources on how outsourced workers from
                | Africa and Asia are used to weed through all of the
                | filth people upload to FB alone.
                | 
                | "Looking illegal" isn't enough to trigger a CSAM
                | check in this case. It's perfectly normal to take
                | pictures of your own kids without clothes in most of
                | Europe, for example. Nothing illegal.
                | 
                | That's why the checks would've been done explicitly
                | against _known_ and confirmed CSAM images. It wasn't
                | some kind of check_if_penis() algorithm, or one of
                | the shitty ones that trigger if there's too much
                | (white) skin colour in an image.
        
               | Moldoteck wrote:
                | Again, somebody can train an algorithm to create
                | false positives, or real-CSAM-like pictures close
                | enough to trigger the check. Afaik CSAM matching is
                | not about exact matches but rather close-enough
                | matches based on a clever hashing algorithm, and in
                | this case the algorithm can be induced into false
                | positives (and, to my limited knowledge, hashing can
                | have collisions) or even true positives that are
                | fully generated (and afaik generated images are not
                | illegal, but I guess it depends on the country).
                | 
                | Outsourcing work for this (afaik) isn't possible,
                | since it's private data, not public, and only
                | specific organisations can have full access to
                | potential triggers.
                | 
                | But in the end it also doesn't matter, because there
                | are other problems too, like how to make the final
                | list easily checkable, so that we are sure
                | governments/companies do not alter the list to
                | target specific ppl/groups for their own interest.
                | Or how to be sure the algorithm isn't modified under
                | the hood to check not just images but also
                | text/files.
        
         | liveoneggs wrote:
         | How many times are you okay with having your own children taken
         | from you while the thought police make sure your latest family
         | beach vacation wasn't actually trafficking?
         | 
         | How many times will actual abusers be allowed to go free while
         | your own family is victimized by the authorities and what ratio
         | do you find acceptable?
        
         | seanieb wrote:
          | We were not even at a point where that question needed to
          | be asked.
          | 
          | Federal and state police, some of the best funded,
          | equipped and trained police in the world, are so inundated
          | with cases that they are forced to limit their
          | investigations to just toddlers and babies. What use is it
          | to add more and more cases to a mountain of uninvestigated
          | crimes? What's needed is more police clearing the existing
          | caseload.
        
         | sebstefan wrote:
         | >Both of these arguments are absolutely, unambiguously,
         | correct.
         | 
          | Oh, please. As if we couldn't just compare the hashes of
          | the pictures people are storing against a regularly
          | updated database of CSAM hashes.
          | 
          | When this was proposed, people would respond "But they
          | could just mirror the pictures or cut a pixel off!"
         | 
          | Who cares? You got that picture from some place on the
          | dark web, and eventually someone will stumble upon it and
          | add it to the database. Unless the person individually
          | edits the pictures as they store them, they're never sure
          | their hashes won't retroactively start matching against
          | the DB.
          | 
          | People who wank off to CSAM have user behavior similar to
          | any other porn user: they don't store 1 picture, they
          | store dozens, and just adding that step makes them likely
          | to trip up, or straight up just use another service
          | altogether.
         | 
         | "What if there's a collision?" I don't know, go one step
         | further with hashing a specific part of the file and see if it
         | still matches?
         | 
         | This whole thing felt like an overblown fearmongering campaign
         | from "freedom ain't free" individualists. I've never seen
         | anything wrong with content hosters using a simple hash against
         | you like this.
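          | 
          | A minimal sketch of that hash-list check in Python
          | (hypothetical digests; deployed systems use perceptual
          | hashes such as PhotoDNA rather than plain SHA-256,
          | precisely so that trivial edits don't evade the match):
          | 
          |     import hashlib
          | 
          |     # Hypothetical, regularly updated digest database.
          |     KNOWN_BAD = {"e3b0c44298fc1c14..."}
          | 
          |     def flag_upload(data: bytes) -> bool:
          |         """True if the stored file matches the database."""
          |         return hashlib.sha256(data).hexdigest() in KNOWN_BAD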
        
           | ksaj wrote:
            | The hashes cannot realistically collide anymore, because
            | modern forensics hashes with _both_ md5 and sha512, and
            | both hashes must match together for use in any legal
            | case. The odds of both of them colliding at once are
            | small enough to flat out say it's not going to happen.
            | 
            | But even if there was an md5 hash collision back when
            | md5 was the only hash used, it still doesn't matter:
            | upon viewing the image that matched, if it's not CSAM,
            | it doesn't matter. Having said that, the chance of
            | _dozens_ of images matching hashes known to be
            | associated with CSAM is also so unlikely as to be
            | unthinkable. Where there is smoke, there is fire.
            | 
            | And further, a hash alone is meaningless, since in court
            | there must be a presentation of evidence. If the image
            | that set off the CSAM alarm by hash collision is, say,
            | an automobile, there is no case to be had. So all this
            | talk about hash issues is absolutely moot.
            | 
            | Source: I have worked as an expert witness and presented
            | in cases involving CSAM (back when we called it Child
            | Pornography, because the CSAM moniker hadn't come about
            | yet), so the requirements are well known to me.
           | 
           | Having said all that, I am an EFF member, and I prefer
           | cryptography to work, and spying on users to be illegal.
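            | 
            | The dual-digest rule is easy to sketch in Python
            | (hypothetical known values; the point is that a false
            | match would require a simultaneous collision in both
            | md5 and sha512):
            | 
            |     import hashlib
            | 
            |     def matches(data: bytes, known_md5: str,
            |                 known_sha512: str) -> bool:
            |         """A match requires both digests to agree."""
            |         return (hashlib.md5(data).hexdigest() == known_md5
            |                 and hashlib.sha512(data).hexdigest()
            |                 == known_sha512)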
        
             | pseudalopex wrote:
             | Apple's system used a perceptual hash. Not cryptographic
             | hashes. The hash databases were not auditable and were
             | known to contain false positives. The threshold for viewing
             | reported matches was not auditable and could have been
             | changed at any time. I hope your expert testimony was more
             | careful.
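              | 
              | For contrast, here is a classic average-hash sketch in
              | Python using Pillow (an illustration only; NeuralHash
              | is a learned embedding, not aHash). Visually similar
              | images land within a small Hamming distance, which is
              | what makes both near-duplicate detection and false
              | positives possible:
              | 
              |     from PIL import Image  # pip install Pillow
              | 
              |     def average_hash(path: str) -> int:
              |         """64-bit hash: 8x8 grayscale, threshold at mean."""
              |         img = Image.open(path).convert("L").resize((8, 8))
              |         px = list(img.getdata())
              |         mean = sum(px) / 64
              |         bits = 0
              |         for p in px:
              |             bits = (bits << 1) | (p > mean)
              |         return bits
              | 
              |     def hamming(a: int, b: int) -> int:
              |         return bin(a ^ b).count("1")
              | 
              |     # hamming() <= ~5 counts as a "match" -- sometimes
              |     # including unrelated images that merely share
              |     # coarse structure (a false positive).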
        
         | draw_down wrote:
         | [dead]
        
           | danpalmer wrote:
           | [flagged]
        
             | draw_down wrote:
             | [dead]
        
         | danpalmer wrote:
         | I agree that those statements are correct, however my reading
         | of the proposed Apple implementation was that it struck a good
         | balance between maximising the ability to discover CSAM,
         | minimising the threat vectors, minimising false positives, and
         | minimising the possibility that a malicious government could
         | force Apple to implement bulk surveillance.
         | 
         | I'm all for privacy, but those who put it above all else are
         | already likely not using Apple devices because of the lack of
         | control. I feel like for Apple's target market the
         | implementation was reasonable.
         | 
          | I think Apple backed down on it because a vocal minority
          | of privacy zealots (for want of a better term) decided it
          | wasn't the right set of trade-offs for them. Given Apple's
          | aim to be a leader in privacy, they had to appease this
          | group. I think that community provides a lot of value and
          | oversight, and I broadly agree with their views, but in
          | this case it feels like we lost a big win in the fight
          | against CSAM in order to gain minor, theoretical benefits
          | for user privacy.
        
           | HelloNurse wrote:
           | But "the ability to discover CSAM" is by itself an excuse for
           | mass surveillance, not a bona fide goal. It is certainly
           | possible, instead, to investigate, then find likely
           | pedophiles, and then get a search warrant.
        
             | danpalmer wrote:
             | Discovering users sharing CSAM is a goal isn't it? That's
             | why governments around the world require cloud storage
             | providers to scan for it - because waiting until the police
             | receive a report of someone is not really feasible. A
             | proactive approach is necessary and mandated in many
             | countries.
        
               | Moldoteck wrote:
                | Imo diminishing ppl's privacy is a goal. Apple's
                | CSAM scanning could be tricked in different ways,
                | esp. with generative algorithms. Say a malicious
                | person sends you an album with 100+ normal-looking
                | photos (to the eye) that were altered to trigger
                | CSAM matches; now the govt needs to check 100+
                | photos per person per send and dismiss the false
                | positives. Since this can be replicated, imagine the
                | gov't needing to scan 100k similar usecases for just
                | 1k ppl. That's insane: either they don't check them,
                | so the system becomes obsolete (bc in this case
                | ill-intentioned ppl can just send an album of 5k
                | photos, all triggering CSAM, of which only a handful
                | are real CSAM; multiplied by the nr of these ill
                | ppl, you understand the system is easy to game), or
                | they spend thousands of hours checking all these
                | photos and checking each person. Another vector of
                | attack is generation of legit-looking CSAM, bc
                | generative algorithms are too good now, but in this
                | case (afaik) it's not a crime, since the image is
                | fully generated (either by only using ppl's face as
                | a starting point or by using a description of their
                | face tweaked enough to look realistic). So what we
                | get is:
                | 
                | - a system that can be gamed in different ways
                | 
                | - a system that's not proven effective before
                | release
                | 
                | - a system that may drive those ppl to other
                | platforms with e2ee that don't have the CSAM scan (I
                | assume since they know what e2ee is, they can find a
                | platform without CSAM scanning), so again obsolete
                | 
                | AND:
                | 
                | - a system that can't be verified by users (is the
                | CSAM list legit, can it trigger other things, is the
                | implementation safe?)
                | 
                | - a system that can be altered by a govt, by
                | altering the CSAM list to target specific ppl (idk,
                | Snowden or some journalist that found something
                | sketchy)
                | 
                | - a system that can be altered by Apple/another
                | company, by altering the CSAM list for ad-targeting
                | purposes
                | 
                | Idk, maybe I'm overreacting, but I've seen what a
                | repressive gov can do, and with such an instrument
                | it's frightening what surveillance vectors can be
                | opened.
        
         | version_five wrote:
         | Mass surveillance is never an appropriate solution, let's start
         | with that.
         | 
          | I don't believe tech has an outsized responsibility to
          | solve society's problems, and in fact it's generally
          | better if we don't try to pretend more tech is the answer.
         | 
         | Advocating for more money and more prioritization for this area
         | of law enforcement is still the way to go if it's a priority
         | area. Policing seems to be drifting towards "mall cop" work,
         | giving easy fines, enabled by lazy electronic surveillance
         | casting a wide net. Let's put resources towards actual
         | detective work.
        
           | SoullessSilent wrote:
            | I would prefer advocating more money for mental health,
            | as it would provide additional benefits in other areas
            | of society down the line too. I can't imagine child porn
            | consumption arising from a healthy mind.
        
           | danpalmer wrote:
           | Mass surveillance is bad, but I think there are versions of
           | it that are far less bad than others.
           | 
            | Apple's proposed solution would theoretically have
            | reported only cases that were overwhelmingly likely to
            | be already-known instances of CSAM (i.e. not pictures of
            | your kids), and if nothing else is reported, can we say
            | that users were really surveilled? In some very strict
            | sense, yes, but in terms of outcomes, no.
        
             | candiodari wrote:
             | How about we start with this version of surveillance:
             | currently it is almost impossible, and frankly stupid, for
             | kids to ask for help with abuse, because they'll end up in
             | this sort of system (with no way out one might add)
             | 
             | https://www.texastribune.org/2022/07/26/texas-foster-care-
             | ch...
             | 
              | So how about we get our "mass surveillance" by giving
              | victims a good reason to report crimes? Starting with
              | not heavily punishing victims who do come forward.
              | Make the foster care system actually able to raise
              | kids reasonably.
             | 
              | Because, frankly, if we _don't_ do it this way, what's
              | the point? Why would we do anything about abuse if we
              | don't fix this FIRST? Are we really going to catch
              | sexual abuse, then put the kids into a
              | state-administered system ... where they're sexually,
              | and physically, and mentally, and financially abused?
             | 
             | WHY would you do that? Obviously that doesn't protect
             | children, it only hides abuse, it protects perpetrators in
             | trade for allowing society to pretend the problem is
             | smaller than it is.
        
             | Muehevoll wrote:
             | [flagged]
        
             | Moldoteck wrote:
                | Ok, and in theory, with new generative algorithms,
                | do you think it's still ok? Suppose Apple implements
                | this; suppose someone finds a way to generate meme
                | images that trigger Apple's algorithm (but a human
                | can't see anything wrong); suppose that someone
                | wants to harm you and sends you a bunch of memes and
                | you save them. What will happen? Or what happens if
                | somebody uses a generative algorithm to create
                | CSAM-like images with a person's face as the base
                | while the rest of the image is generated; should
                | this also trigger CSAM detection?
                | 
                | Also, you cannot guarantee that Apple/Google will
                | use only known instances of CSAM. What if the govt
                | orders them to scan for other types of content under
                | the hood, like documents or god knows what else, bc
                | the govt wants to screw some person (for the sake of
                | example, let's suppose the targeted person is some
                | journalist that discovered shady stuff and the govt
                | wants to put them in prison)? Bc you know, you don't
                | have access to either the algorithms or the CSAM
                | scan list that they are using; the system could be
                | abused, and usually "could" means sometime it will.
        
               | danpalmer wrote:
               | These criticisms are reasonable criticisms of a system in
               | general, but Apple's design featured ways to mitigate
               | these issues.
               | 
               | I agree that the basic idea of scanning on device for
               | CSAM has a lot of issues and should not be implemented.
               | What I think was missing from the discourse was an actual
               | look at what Apple were suggesting, in terms of technical
               | specifics, and why that would be well designed to not
               | suffer from these problems.
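               | 
               | For example, Apple's published design only let the
               | server act once an account crossed a match threshold
               | (stated to be around 30 images), with human review
               | after that. A minimal sketch of the threshold idea,
               | ignoring the threshold secret sharing that enforced
               | it cryptographically:
               | 
               |   # Toy version: the real scheme made the match
               |   # vouchers cryptographically unreadable below the
               |   # threshold; here we merely count them.
               |   MATCH_THRESHOLD = 30
               | 
               |   def escalate_to_human_review(vouchers):
               |       return len(vouchers) >= MATCH_THRESHOLD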
        
               | pseudalopex wrote:
               | Apple's mitigations and their inadequacy were discussed.
        
             | figlett wrote:
             | Mass surveillance isn't necessarily bad. It depends on
             | how it's implemented. The solution you describe is
             | basically how it works with the intelligence agencies,
             | in that only a minuscule fraction of the data collected
             | in bulk ever reaches human eyes. The rest ends up being
             | discarded after the retention period.
             | 
             | In terms of outcomes, almost nobody is actually
             | surveilled, as the overall effect is the same as if no
             | data had been collected on them in the first place.
             | 
             | That said, I am personally more comfortable with my
             | country's intelligence agencies hoovering up all my
             | online activity than I am with the likes of Apple. The
             | former is much more accountable than the latter.
        
               | jtbayly wrote:
               | Correction: the rest just ends up getting _searched_
               | thousands of times, in direct violation of the US
               | constitution.
        
           | flir wrote:
           | Every other aspect of life has been impacted by the
           | computer's ability to process lots of information at speed.
           | To say "no, policing must not use these tools but everyone
           | else can" seems - well, quixotic, maybe?
           | 
           | If illegal data (CP) is being transferred on the net,
           | wiretapping that traffic and bringing hits to the attention
           | of a human seems like a proportional response.
           | 
           | (Yes, I know, it's not going to be 100% effective, encryption
           | etc, but neither is actual detective work.)
        
             | danbruc wrote:
             | If you have reasonable evidence, wiretapping a suspect
             | to gain more evidence is fine. On the other hand,
             | wiretapping everyone in the hope of finding some
             | initial evidence is not okay at all.
        
               | figlett wrote:
               | Why is it not okay at all? That's what our intelligence
               | agencies do with their bulk data collection capabilities,
               | and they have an immense positive impact on society.
        
               | omniglottal wrote:
               | s/positive/negative/
        
               | danbruc wrote:
               | _[...] and they have an immense positive impact on
               | society._
               | 
               | For this I need proof.
        
               | jtbayly wrote:
               | If you want to argue that they can scan people outside
               | the country and not US citizens, and that that has a
               | benefit, go ahead and make that argument. You might even
               | convince me.
               | 
               | But it's just begging the question to say there's immense
               | benefit to them searching US citizens' communications
               | without a reason.
               | 
               | That's the whole question.
               | 
               | Show me why we should change the constitution which
               | guarantees us freedom from this sort of government
               | oppression.
        
               | figlett wrote:
               | I'm writing from a UK perspective so there's no
               | underlying constitutional issue here like there might be
               | in the US. Bulk data collection is restricted by specific
               | laws and this mandates regular operational oversight by
               | an independent body, to ensure that both the collection
               | and each individual use of the data is necessary and
               | proportionate.
               | 
               | Some of this will include data of British citizens,
               | but the thing is, we have a significant home-grown
               | terrorism problem and serious organised criminal gang
               | activity happening within the country. If
               | intelligence analysts need to look at, for example,
               | which phone number contacted which other phone number
               | on a specific date in the recent past, there is no
               | way to do this other than to bulk-collect all phone
               | call metadata from the various telecom operators and
               | store it ready for searching.
               | 
               | The vast majority of that data will never be seen by
               | human eyes, only indexed and searched by automated
               | systems. All my phone calls and internet activity will be
               | in there somewhere, I'm sure, but I don't consider that
               | in itself to be government oppression. Only if it's used
               | for oppressive purposes, would it become oppressive.
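               | 
               | A hedged sketch of the kind of lookup described above
               | (the numbers, dates, and record layout here are all
               | hypothetical):
               | 
               |   from collections import defaultdict
               | 
               |   # (caller, callee, date) tuples as bulk-collected
               |   # from the telecom operators.
               |   call_records = [
               |       ("+447700900001", "+447700900002", "2023-08-30"),
               |       ("+447700900001", "+447700900003", "2023-08-31"),
               |   ]
               | 
               |   index = defaultdict(list)
               |   for caller, callee, date in call_records:
               |       index[(caller, date)].append(callee)
               | 
               |   # A targeted query touches one key; the rest of the
               |   # stored data is never seen by a human.
               |   print(index[("+447700900001", "2023-08-31")])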
        
               | flir wrote:
               | But that's just a restatement of the OP's position ("Mass
               | surveillance is never an appropriate solution"). You're
               | not attempting to justify that position.
        
               | danbruc wrote:
               | That's just an axiom for me, no justification needed.
               | My life is my life, and it is not the business of the
               | state to watch every step I take as long as I am not
               | affecting others in any relevant way. If you convince
               | me that I, or society as a whole, would be better off
               | if I allowed the state to constantly keep an eye on
               | me, then I might change my opinion and grant the
               | state permission to violate my privacy.
        
               | flir wrote:
               | > That's just an axiom for me, no justification needed
               | 
               | Congrats, you've got a religion.
        
               | troupo wrote:
               | [flagged]
        
               | danbruc wrote:
               | That's nonsense; every worldview must be grounded in
               | some axioms, and that does not make it a religion. I
               | can break it down somewhat more for you. The state
               | has no powers besides the ones granted by its
               | citizens. I value my privacy highly and need very
               | good reasons to grant the state permission to violate
               | it. Catching criminals does not clear the bar; there
               | are other ways to do that which do not violate my
               | privacy.
        
               | jtbayly wrote:
               | It's guaranteed by our constitution, among other reasons.
               | Search of my communications for no reason is, by
               | definition "unreasonable search."
        
               | flir wrote:
               | That document with 27 amendments?
        
               | jtbayly wrote:
               | Yes?
               | 
               | If you want to get it amended, then by all means, make a
               | case for why it should be amended.
               | 
               | In the meantime you wanted to know why mass surveillance
               | isn't an option. The answer "because it's against the
               | law" is a simple, good answer.
               | 
               | If you want to know why we decided as a nation to make
               | that such a fundamental law that it is in our
               | constitution, you could do worse than reading about what
               | prompted the writing of the Bill of Rights.
               | 
               | I agree with a lot of the original reasoning.
        
               | danbruc wrote:
               | _The answer "because it's against the law" is a simple,
               | good answer._
               | 
               | While often true, at all times there have also been
               | morally wrong laws, so it would not be unreasonable
               | to counter that being written into law in itself
               | means nothing. So you should always be prepared to
               | pull out and defend the reasoning behind a law, which
               | you also hinted at in your following sentences.
        
           | DSingularity wrote:
           | When tech creates problems, should tech try to solve
           | them, or should tech be limited?
           | 
           | We are honestly deceiving ourselves if we pretend we have
           | not created new realities which are problematic at scale.
           | We have. They are plentiful. And if people aren't willing
           | to walk back tech to reduce the problems, and people
           | aren't willing to accept technical solutions which are
           | invasive, then what are we to do? Are we just to accept a
           | new world with all these problems stemming from
           | unintended consequences of tech?
        
             | isykt wrote:
             | "Tech"? What do you mean by "tech?" Do you expect Apple to
             | remove the camera, storage, and networking capabilities of
             | all their devices? That's the "tech" that enables this.
        
               | glogla wrote:
               | I mean "tech" did a lot of messed up things - there is a
               | reason why "what is your favorite big tech innovation: 1)
               | illegal cab company 2) illegal hotel company 3) fake
               | money for criminals 4) plagiarism machine" is a funny
               | joke.
               | 
               | Enabling people to talk to each other without all their
               | communication being wiretapped and archived forever is
               | not one of those, I would say.
        
               | danaris wrote:
               | Those aren't really "tech innovations", though, aside
               | from maybe the plagiarism machine.
               | 
               | Uber and AirBnB are just using very-widely-available
               | technology--that some taxi services and hotels are
               | _also_ using!--and claiming that they're _completely
               | different_, when the main difference is that they're
               | just ignoring the laws and regulations around their
               | industries.
               | 
               | Cryptocurrencies are using a tech innovation as a
               | _front_ for what's 99.9999% a financial
               | "innovation"...which is really just a Ponzi scheme
               | and/or related scams in sheep's clothing.
               | 
               | LLMs are genuinely a tech innovation, but the primary
               | problem they bring to the fore is really a conversation
               | we've needed to have for a while about copying in the
               | digital age. The signs have been there for some time that
               | such a shift was coming; the only question was exactly
               | when.
               | 
               | In none of these cases is technology actually _doing_
               | anything "messed up". Companies that denote
               | themselves as being "in the tech industry" do bad
               | things all the time, but blaming the _technology_ for
               | the _corporate_ (and otherwise human) malfeasance is
               | very unhelpful. In particular, trying to _limit
               | technological progress_, or _ban widely useful
               | technological innovations_, because a small minority
               | of people use them for ill is horrifically
               | counterproductive.
               | 
               | Enforce the laws we have better, be more willing to turn
               | the screws on people even if they have lots of money, and
               | where necessary put new regulations on human and
               | corporate behavior in place (eg, requiring informed
               | consent to have works you created included in the
               | training set of an LLM or similar model).
        
             | BLKNSLVR wrote:
             | > When tech creates problems should tech tried to solve it
             | or should tech be limited?
             | 
             | You haven't explained the problem 'tech' has created,
             | so I'm confused as to what your point is.
             | 
             | CSAM isn't a problem caused by 'tech' unless you're
             | going back to the invention of the camera, and I think
             | that toothpaste is well out of the tube.
             | 
             | Additionally, and this is where a whole lot of
             | arguments about this go wrong, the important part, the
             | actual literal abuse, is human to human. There is no
             | technology involved whatsoever.
             | 
             | Technological involvement may be an escalation of the
             | offense, but it's vanishingly secondary.
        
         | tshaddox wrote:
         | > Both of these arguments are absolutely, unambiguously,
         | correct.
         | 
         | I don't really buy any "slippery slope" arguments for this
         | stuff. Apple already can push any conceivable software it wants
         | to all of its phones, so the slope is already as slippery as it
         | can possibly be.
         | 
         | It just doesn't make sense to say "Apple shouldn't implement
         | this minimal version of photo-scanning now even though I
         | don't think it's bad, because that's a slippery slope for
         | them to implement some future version of scanning that I
         | _do_ think is bad." They already have the capability to push
         | _any_ software to their phones at _any_ time! They could
         | just skip directly to the version you think is bad!
        
           | mrits wrote:
           | Your comment confused me. Isn't Apple still scanning
           | iPhones for CSAM, just not iCloud? I don't see any
           | additional threat vectors from doing it locally.
        
         | traceroute66 wrote:
         | > criminals are using E2EE communication systems to share
         | sexual abuse material
         | 
         | Blah blah blah, the same old argument given by the "think of
         | the children" people.
         | 
         | There are many ways to counter that old chestnut, but
         | really, we only need to remember the most basic fundamental
         | facts:
         | 
         | 1) Encryption is mathematics
         | 
         | 2) Criminals are criminals
         | 
         | Can you ban mathematics? No. Can you stop criminals being
         | criminals? No.
         | 
         | So, let's imagine you are able to successfully backdoor E2EE
         | globally, on all devices and all platforms.
         | 
         | Sure, the "think of the children" people will rejoice and
         | start singing "Hallelujah". And the governments will rub
         | their hands with glee with all the new data they have access
         | to.
         | 
         | But the criminals? Do you honestly think they'll think "oh
         | no, game over"?
         | 
         | No, of course not. They'll pay some cryptographer in need of
         | some money to develop a new E2EE tool and carry on. Business
         | as usual.
        
           | danaris wrote:
           | > Can you stop criminals being criminals ? No.
           | 
           | [Citation needed]
           | 
           | This mindset--that assigns people into immutable categories,
           | "criminal" and "not criminal"--is actually one of the biggest
           | things that needs to change.
           | 
           | We absolutely _can_ stop criminals from being criminals.
           | We just can't do so by pointing at them and saying "Stop!
           | Bad!" We have to change the incentives, remove the reasons
           | they became criminals in the first place (usually
           | poverty), and make it easier, safer, and more acceptable
           | to go from being a criminal to being a not-criminal again.
        
           | ben_w wrote:
           | > But the criminals ? Do you honestly think they'll think "oh
           | no, game over" ?
           | 
           | > No of course not. They'll pay some cryptographer in need of
           | some money to develop a new E2EE tool and carry on. Business
           | as usual.
           | 
           | I used to think this, I changed my mind: just as it's
           | difficult to do security correctly even when it's a legal
           | requirement, only the most competent criminal organisations
           | will do this correctly.
           | 
           | Unfortunately, the other issue:
           | 
           | > And the governments will rub their hands with glee with all
           | the new data they have access to.
           | 
           | Is 100% still the case, and almost impossible to get anyone
           | to care about.
        
             | traceroute66 wrote:
             | > only the most competent criminal organisations will do
             | this correctly.
             | 
             | All it takes is for one criminal to write a one-page
             | guide to using GPG and circulate it to the group...
             | 
             | I know I mentioned paying a cryptographer earlier, but
             | in reality downloading and using GPG is a crude and
             | effective way of defeating an E2EE backdoor.
             | 
             | Are the GPG devs going to backdoor GPG to satisfy
             | governments? Probably not.
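             | 
             | The layering point, sketched minimally in Python (the
             | third-party "cryptography" package stands in for GPG
             | here, and the message text is made up):
             | 
             |   from cryptography.fernet import Fernet
             | 
             |   key = Fernet.generate_key()  # shared out-of-band, once
             | 
             |   # Encrypt *before* the message ever touches the
             |   # (hypothetically backdoored) messaging platform.
             |   token = Fernet(key).encrypt(b"text the backdoor never sees")
             | 
             |   # Whatever transport carries `token`, only holders of
             |   # `key` can recover the plaintext.
             |   assert Fernet(key).decrypt(token) == b"text the backdoor never sees"
             | 
             | A backdoored platform can hand over only what it
             | carried: the already-encrypted blob.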
        
               | ben_w wrote:
               | > All it takes is for one criminal to write a one-page
               | guide to using GPG and circulate it to the group
               | 
               | If cybersecurity was that easy, we wouldn't have so many
               | examples of businesses getting it wrong.
               | 
               | Just because everyone here can follow instructions like
               | that, doesn't make it common knowledge for anyone else.
        
               | dnh44 wrote:
               | > If cybersecurity was that easy, we wouldn't have so
               | many examples of businesses getting it wrong.
               | 
               | There are almost no consequences for anyone working at a
               | business that gets it wrong.
               | 
               | The consequences for being a nonce are quite severe so
               | the motivation to get it right will be quite high.
        
               | newscracker wrote:
               | > If cybersecurity was that easy, we wouldn't have so
               | many examples of businesses getting it wrong.
               | 
               | I can only partially agree with this point.
               | Businesses getting cybersecurity wrong has almost no
               | material consequences. At best, they get a tiny slap
               | on the wrist or are asked to answer some questions.
               | Nobody in said businesses goes to jail for it or
               | personally pays any fines. Compare that to criminals,
               | who have a lot more to lose if they get caught: jail
               | time, fines to pay, the loss of freedom for quite
               | some time, life not being the same after they've
               | served their sentence, and more. Businesses have it
               | extremely easy compared to this. No wonder
               | cybersecurity is so poor among all businesses,
               | including very large ones (like Microsoft, as a
               | recent example).
        
               | ben_w wrote:
               | > jail time, fines they have to pay
               | 
               | Fear of these is the reason for the (maliciously
               | compliant) GDPR popups, and _that_ despite discussion
               | about extra-territoriality and regulators' relatively
               | limited capacity-to-websites ratio.
               | 
               | The law and threats of punishment are clearly not
               | hugely significant to anyone involved in the specific
               | topic of this thread regardless; in the UK at least,
               | it's the kind of thing where, if someone is lynched
               | for it, the vigilantes have to be extremely stupid
               | (like attacking a paediatrician because they can't
               | tell the difference, which happened) not to get
               | public sympathy.
        
             | Moldoteck wrote:
             | They can just switch to unpatched devices with some
             | open-source E2EE without much effort. The result?
             | Criminals will continue doing criminal stuff, while the
             | rest of the planet lives under a surveillance system
             | that can be altered at a government's or company's
             | will, without your knowledge, to target either specific
             | people (for the government) or groups of people (for
             | 'relevant' ads).
        
               | ben_w wrote:
               | > they can just use/switch to unpatched devices with some
               | opensource e2ee without big effort.
               | 
               | Assuming no new zero-days, and that they were even
               | following best practices in the first place. How many
               | legit companies follow best practices?
        
               | Moldoteck wrote:
               | Zero-days are irrelevant, IMO: zero-days are for
               | targeted attacks (assuming it's a government), and
               | exploiting all zero-days on all devices is not that
               | productive. A CSAM scan, on the other hand, can
               | handle both untargeted and targeted surveillance:
               | untargeted by spotting bad actors from a generic CSAM
               | list, and targeted by adding the target's face or
               | other specific things to that list in order to locate
               | and monitor them.
               | 
               | That's the point: bad actors can circumvent the
               | system if they feel threatened, but once rolled out
               | globally the system can be exploited by governments
               | or companies to target any user. So we get something
               | that may not be very effective against bad actors but
               | poses a great risk of being misused by a government
               | or company in its own interests without users
               | knowing. I've seen how the authoritarian government
               | in my country targets people because they are
               | inconvenient to the system, and this algorithm opens
               | another potential vector of attack.
        
               | ben_w wrote:
               | I don't buy your argument, but as for:
               | 
               | > I've seen how an authoritarian gov in my country is
               | targeting ppl bc they are uncomfortable for the system
               | and this algorithm opens another potential vector of
               | attack.
               | 
               | As per my last sentence in my initial comment in this
               | chain:
               | 
               | --
               | 
               | > And the governments will rub their hands with glee with
               | all the new data they have access to.
               | 
               | Is 100% still the case, and almost impossible to get
               | anyone to care about.
        
         | phpisthebest wrote:
         | >>and at rates which they were not previously able to.
         | 
         | What is your source for that?
         | 
         | >> in the sense that if we don't clearly take a stand
         | against abhorrent behaviour then we are in some sense
         | condoning it.
         | 
         | No. This narrative of "silence is violence" and "no action
         | is support" etc. is 100% wrong.
         | 
         | You started out great, but I can not get behind this type of
         | thinking...
         | 
         | >>Does the tech industry have any alternate solutions that
         | could functionally mitigate this abuse?
         | 
         | Why is it a "tech industry" problem?
         | 
         | >>Or do we all just shout "yay, individual freedom wins
         | again!"
         | 
         | For me the answer is simple... Yes, individual freedom is
         | more important than everything else. I will never support
         | curbing individual freedom on the altar of any proclaimed
         | government solution to a social safety problem, largely
         | because I know enough about history to understand that not
         | only will they not solve that social safety problem, many in
         | government are probably participating in the problem and
         | have the power to exempt themselves, while abusing the very
         | tools and powers we give them to fight X for completely
         | unrelated purposes.
         | 
         | Very quickly, any tool we would give them to fight CSAM
         | would be used for drug enforcement, terrorism, etc. It would
         | not be long before the AI-based perceptual hashes flag some
         | old lady's tomato plants as weed and we have an entire DEA
         | paramilitary unit raiding her home...
        
         | numpad0 wrote:
         | I don't like how these sentiments are written as if (C)SAM
         | sharing were the only type of crime ever committed:
         | devastating, PTSD-inducing, and life-crippling, but, say,
         | not a murder. The crime in question could be one, or it
         | could be major financial crime, terrorism conspiracy, et
         | cetera.
         | 
         | Yet the only justification offered for mass surveillance of
         | camera pictures, the important societal matter that
         | supposedly rests literally on "the other side of the coin",
         | is "some minority of people could be sharing naked photos of
         | young members of society for highly unethical acts of
         | viewing".
         | 
         | wtf.
        
         | jimkoen wrote:
         | I'd very much like a source for your claim that "[...]
         | criminals are using E2EE communication systems to share
         | sexual abuse material in ways and at rates which they were
         | not previously able to."
         | 
         | > Does the tech industry have any alternate solutions that
         | could functionally mitigate this abuse? Does the industry
         | feel that it has any responsibility at all to do so? Or do
         | we all just shout "yay, individual freedom wins again!" and
         | forget about the actual problem that this (misguided)
         | initiative was originally aimed at?
         | 
         | The issue at large here is not the tech industry, but law
         | enforcement agencies and the correctional system. Law
         | enforcement has itself proven time and time again that the
         | most effective way to apprehend large criminal networks in
         | this area is undercover investigation.
         | 
         | So no, I don't think it is the tech industry's role to play
         | the extended arm of some ill-conceived surveillance state.
         | Because Apple is right: this is a slippery slope, and anyone
         | who doesn't think malicious political actors will use this
         | as a foot in the door to argue for more invasive
         | surveillance measures, using this exact pre-filtering
         | technology, is a naive idiot, in my opinion.
        
         | acumenical wrote:
         | [flagged]
        
           | mvonballmo wrote:
           | I honestly hadn't considered that when I asked someone to use
           | Signal or Threema instead of Facebook Messenger that they
           | would think I was a pedophile or drug addict. Food for
           | thought.
        
             | acumenical wrote:
             | For what it's worth, I don't think using Signal or Threema
             | is enough to make you an E2EE enthusiast, and wanting to
             | speak without your speech later used against you is maybe
             | the purest reason for E2EE. I meant more so the type of
             | people who are into Tor or I2P.
        
         | isykt wrote:
         | How are individuals re-victimized with every share? That
         | makes no sense. Your LinkedIn profile photo could be on a
         | billboard in rural China; what would that be to you?
        
           | Simulacra wrote:
           | There are several cases of victims being harassed and
           | haunted by photos and videos of themselves being sexually
           | abused as children. This is one of the reasons for
           | Masha's Law.
           | 
           | https://www.nbcphiladelphia.com/news/local/child-porn-
           | victim...
        
             | akira2501 wrote:
             | This makes no mention of direct harassment. How easy is
             | it to connect random pictures of children with actual
             | living adults? She's suing people she's never had any
             | direct contact with. The government itself notifies her
             | each time someone is arrested in possession of an image
             | of her; why, it does not say. None of this sounds like
             | a necessary, healthy resolution to the underlying
             | problem.
        
         | bigDinosaur wrote:
         | The extreme hysteria created by anything related to children
         | often seems to be carte blanche to destroy privacy and
         | implement backdoors in applications. Most child abuse comes
         | from family members (which must be solved at the source), and
         | the ultra extreme cases simply make awful law (doing away with
         | E2EE or instituting mass surveillance to catch an incredibly
         | small minority is absurd).
         | 
         | Much like other 'tough on crime' measures (of which destroying
         | E2EE is one) the real problems need to be solved not at the
         | point of consumption (drugs, guns, gangs, cartels) but at the
         | root causes. Getting rid of E2EE just opens the avenue for the
         | abuse of _us_ by the government but in no way guarantees
         | we'll meaningfully make children safer.
         | 
         | And no, we are not 'condoning' it when we declare E2EE an
         | overall good thing. Real life is about tradeoffs not absolutes,
         | and the tradeoff here is protection from the government for
         | potentially billions vs. _maybe_ arresting a few thousand more
         | real criminals. This is a standard utilitarian tradeoff that
         | 'condones' nothing.
        
           | gnfargbl wrote:
           | _> Most child abuse comes from family members (which must be
           | solved at the source)_
           | 
           | Yes. Since becoming an abuser is a process and not a moment,
           | part of the solution must be making access to CSAM much
           | harder.
           | 
           |  _> And no, we are not 'condoning' it when we declare
           | E2EE an overall good thing._
           | 
           | Agreed. I'm sorry if I worded things in a way that caused you
           | to see an implication which was not intended. To be clear:
           | E2EE is a good thing. Championing E2EE is _not_ equivalent to
           | condoning CSAM.
           | 
           | What I did say is that in failing to try and provide any
           | meaningful solutions to this unintended consequence of E2EE,
           | the industry is effectively condoning the problem.
           | 
           |  _> This is a standard utilitarian tradeoff_
           | 
           | If that's the best we can do, I'm very disappointed.
           | That position says that to achieve privacy, I must
           | tolerate CSAM. I want _both_ privacy and for us not to
           | tolerate CSAM. I don't know what the solution is, but
           | that is what I wish the industry were aiming for. At the
           | moment, the industry seems to be aiming for nothing but a
           | shrug of the shoulders.
        
             | [deleted]
        
             | pseudalopex wrote:
             | It is not clear if access to child pornography increases,
             | decreases, or has no significant effect on child sex
             | abuse.[1]
             | 
             | [1] https://en.wikipedia.org/wiki/Relationship_between_chil
             | d_por...
        
             | phpisthebest wrote:
             | >>That position says that to achieve privacy, I must
             | tolerate CSAM. I want both privacy and for us not to
             | tolerate CSAM.
             | 
             | I want to have more money than Elon Musk... sometimes
             | life is not fair and we can not always get what we
             | want...
             | 
             | Any "backdoor" or "frontdoor" in encryption is a total
             | failure of encryption. That is an immutable truth, more
             | fixed in reality than the speed of light is in physics.
        
             | danbruc wrote:
             | _That position says that to achieve privacy, I must
             | tolerate CSAM. I want both privacy and for us not to
             | tolerate CSAM._
             | 
             | Not true, you can have privacy and at the same time not
             | tolerate child pornography, those are two perfectly
             | compatible positions and arguably the current state. What
             | you can not have - by definition - is privacy on the one
             | hand and on the other hand no privacy in order to look for
             | child pornography. You can still fight child pornography in
             | any other way, but when it comes to privacy, you have to
             | make a choice - give people their privacy or look through
             | their stuff for illegal content, you can not have both. If
             | you have enough evidence, a judge might even grant law
             | enforcement permission for privacy-violating measures;
             | it should just not be the default position that your
             | privacy gets violated.
        
               | [deleted]
        
             | BLKNSLVR wrote:
             | > Since becoming an abuser is a process and not a moment,
             | part of the solution must be making access to CSAM much
             | harder.
             | 
             | In my opinion CSAM is a symptom, not a cause.
             | 
             | It's difficult to "stumble across" that kind of material
             | unless you're already actively looking for it, which means
             | some amount of "damage" is already done.
             | 
             | I also highly doubt that someone with no proclivities in
             | that direction would 'turn' as a result of stumbling across
             | CSAM. I'd guess they'd go the other way and be increasingly
             | horrified by it.
        
             | Guvante wrote:
             | Your logic is fundamentally flawed.
             | 
             | Should we disallow encrypted traffic between machines
             | because that encrypted traffic could be CSAM?
             | 
             | Saying "X is bad so we should be rid of it" requires a
             | method to get rid of it.
             | 
             | Obviously giving up privacy and allowing more
             | surveillance can reduce something, but that isn't
             | necessarily the right choice.
             | 
             | And what do you do if they start sharing files the
             | users encrypt themselves?
        
             | Manuel_D wrote:
             | > Yes. Since becoming an abuser is a process and not a
             | moment, part of the solution must be making access to CSAM
             | much harder.
             | 
             | This is a very big assumption. Sexual abuse of minors has
             | existed long before the internet, and long before
             | photography. It is not at all clear that reduced
             | availability of CSAM leads to less real-world abuse.
             | 
             | > If that's the best we can do, I'm very disappointed. That
             | position says that to achieve privacy, I must tolerate
             | CSAM. I want both privacy and for us not to tolerate CSAM.
             | I don't know what the solution is, but that is what I wish
             | the industry were aiming for. At the moment, the industry
             | seems to be aiming for nothing but a shrug of the
             | shoulders.
             | 
             | As other commenters have pointed out, the solution is to
             | prevent children from being abused in the first place. Have
             | robust systems in place to address abuse, and give kids
             | effective education and somewhere to speak out if it
             | happens to them or someone they know.
        
           | practice9 wrote:
           | > which must be solved at the source
           | 
           | Governments are not good at this type of thing. It requires
           | careful analysis, planning and actual decisions.
           | 
           | But slap on a regulation and require private companies to do
           | the hard work for you - now we are talking!
        
         | glogla wrote:
         | [flagged]
        
           | red_admiral wrote:
           | This morality may not be so unusual outside the tech "filter
           | bubble". And wherever someone, like the OP, appears to be
           | serious, my own personal morality says the absolute least
           | they deserve is an equally serious answer.
        
             | gnfargbl wrote:
             | I'm confused by what you mean by "morality" here. The only
             | moral position that I am communicating is that child sexual
             | abuse is a real thing that really happens, and it is bad
             | for both the individual and for society. That's it. There's
             | no subtext. There is explicitly no refutation of the
             | arguments against client-side CSAM scanning which, I will
             | say again, are unambiguously correct.
             | 
             | Is being against child sexual abuse _really_ an unusual
             | opinion in the tech industry? Have we really all gone
             | so far along the Heinlein/Rand road that any mention of
             | a real negative outcome gets immediately dismissed with
             | the empathy-free, thought-terminating cliche _"think of
             | the children"_?
        
               | glogla wrote:
               | > Is being against child sexual abuse really an
               | unusual opinion in the tech industry?
               | 
               | Nobody said that. You are being manipulative and
               | trying to make it look like people who disagree with
               | you are somehow pro-child abuse.
               | 
               | In saying "Does the tech industry have any alternate
               | solutions that could functionally mitigate this
               | abuse?" you are trying to paint a picture in which
               | child abuse is somehow the "tech industry's" fault.
               | 
               | You are also trying to paint a complete Panopticon,
               | in which every interpersonal communication is
               | subjected to surveillance by the state, as somehow
               | the default that end-to-end encrypted electronic
               | communication is changing - while the truth is that
               | personal communication was private for hundreds of
               | years, because it was impossible for the state to
               | listen in on everything.
        
               | gnfargbl wrote:
               | This thread is tending towards flamewar so I'll try to
               | dial back, but I do want to respond.
               | 
               |  _> You are being manipulative and trying to make it
               | look like people who disagree with you are somehow
               | pro-child abuse._
               | 
               | I am not doing that. You described my position as an
               | "alien morality", to which another poster seemed to
               | agree. I was responding to that by clarifying the
               | actual moral point I was making. For the avoidance of
               | doubt, I am not arguing that you are pro-child abuse.
               | 
               |  _> In saying "Does the tech industry have any
               | alternate solutions that could functionally mitigate
               | this abuse?" you are trying to paint a picture in
               | which child abuse is somehow the "tech industry's"
               | fault._
               | 
               | Yes, this is basically a correct understanding of my
               | position. I am stating that the problem has been
               | massively exacerbated by the adoption of E2EE by the tech
               | industry, and that the tech industry therefore has a
               | moral responsibility to deal with the unintended
               | consequences of its own action.
               | 
               |  _> the truth is that personal communication was private
               | for hundreds of years, because it was impossible for the
               | state to listen in on everything._
               | 
               | The truth is that in pre-internet times, networks of
               | pedophiles could be dismantled by the state, partly
               | through surveillance -- see https://en.wikipedia.org/wiki
               | /Paedophile_Information_Exchang... for a supporting
               | example.
        
               | red_admiral wrote:
               | > I am stating that the problem has been massively
               | exacerbated by the adoption of E2EE by the tech industry
               | 
               | I understand that most information on how the state
               | fights organised crime will be classified, but if
               | there is any publicly available evidence for this
               | claim that you can share, I would be interested in
               | reading it (and I hope others on this thread would be
               | too). I'm not saying I doubt you - you give the
               | impression you know what you're talking about - so
               | please take this in the spirit it's intended; as one
               | of my former supervisors once said, "In academia,
               | asking for citations/references is an expression of
               | interest, not of doubt".
        
               | gnfargbl wrote:
               | It's completely reasonable to ask for evidence. Don't
               | apologise for asking!
               | 
               | I'm not part of any state, and I don't have access to any
               | special knowledge that you can't find on the internet.
               | 
               | I'm also not aware of any study that provides the very
               | direct link you're asking for. Because of the nature of
               | E2EE, I don't know if it would be possible to produce
               | one. What I can do is link to evidence such as
               | https://www.weprotect.org/global-threat-
               | assessment-21/#repor..., which has (to me) some fairly
               | compelling data showing that the magnitude of the problem
               | is increasing.
        
               | red_admiral wrote:
               | I don't think anyone would disagree with you that child
               | abuse exists - and if they did, that's an empirical
               | question, and it resolves to you being correct.
               | 
               | The moral part is whether and how much society / the
               | state / the tech industry should invest in combating it,
               | and how the advantages and disadvantages of mandating
               | government access to E2E encrypted communications or
               | people's cloud storage weigh up.
               | 
               | For what it's worth, my own position is that the state
               | should do more about it, and should in principle have
               | more resources allocated to do so. I would support higher
               | taxes in exchange for more police (and better trained
               | police), who could do more about many kinds of crime
               | including child abuse. I wouldn't mind more resources
               | being allocated to policing specifically for fighting
               | child abuse, too. But I could think of a lot of other
               | places besides legislating access to people's messenger
               | apps where such resources could be invested.
               | 
               | I'm still undecided on whether legally mandated backdoors
               | in E2E encrypted storage and communications would be
               | _effective_ in fighting child abuse, which is a question
               | I would need more technical knowledge on before I could
               | take an informed position (I know a fair bit about
               | cryptography but less about how organised crime
               | operates). If it turns out that this would be an
               | ineffective measure (maybe criminals fall back on other
               | means of communication such as TOR relays) then it would
               | be hard to justify such a measure morally, especially as
               | it could have a lot of disadvantages in other areas.
        
         | 0x000042 wrote:
         | > Both of these arguments are absolutely, unambiguously,
         | correct.
         | 
         | Indeed, they are correct. And they were also brought up when
         | Apple announced that they would introduce this Orwellian
         | system. Now they act like they just realized this.
        
         | ben_w wrote:
         | > It is bad for the individuals who are re-victimised on
         | every share.
         | 
         | Absolutely.
         | 
         | > It is also bad for the fabric of society at large, in the
         | sense that if we don't clearly take a stand against abhorrent
         | behaviour then we are in some sense condoning it.
         | 
         | Much less clear. There's always been the argument: does it
         | provide an outlet with no _new_ (emphasis on "new" so people
         | don't skim that word) victims, or does it encourage people
         | to act out their desires for real?
         | 
         | I don't have any answer to this question; but the answer
         | matters. I do have a guess, which is "both at the same time in
         | different people", because humans don't have one-size-fits-all
         | responses.
         | 
         | Even beyond photographs, this was already a question with
         | drawings; now we also have AI, creating new problems with both
         | deepfakes of real people and ex-nihilo (victimless?) images.
         | 
         | > Does the tech industry have any alternate solutions that
         | could functionally mitigate this abuse?
         | 
         | Yes.
         | 
         | We can build scanning into the display devices themselves,
         | or use a variation of Van Eck phreaking to the same effect.
         | 
         | We can modify WiFi to act as wall-penetrating radar with the
         | capacity to infer pose, heart rate, and breathing of multiple
         | people nearby even if they're next door, so that if they act
         | out their desires beyond the screen, they'll be caught
         | immediately.
         | 
         | We can put CCTV cameras everywhere, watch remotely what's on
         | the screens, and also detect, through a combination of eye
         | tracking and infrared (or just noticing a change in
         | geometry), who is aroused while looking at a forbidden
         | subject, and require such people to be on suppressants.
         | 
         | Note however that I have not said _which_ acts or images: this
         | is because the options are symmetrical under replacement for
         | every other act and image, including (depending on the option)
         | non-sexual ones.
         | 
         | There are places in the world where being gay carries the
         | death penalty. And if I remember my internet meme history
         | right, whichever state Mr Hands was in _accidentally_
         | decriminalised his act when the Federal courts decided
         | states couldn't outlaw being gay, because that state had
         | only one word in law for everything it deemed "unnatural".
        
         | kafrofrite wrote:
         | My two cents regarding this.
         | 
         | Creating backdoors that allow encryption schemes to be
         | subverted is _fundamentally_ going to cause harm on the
         | internet, and will eventually fail the weakest users, those
         | who need privacy and security the most.
         | 
         | A mechanism that can subvert cryptographic protocols can be
         | used by any party with the resources, will, and knowledge to
         | exploit it, including oppressive regimes and private
         | entities. Backdoors harm both trust on the web (which has an
         | impact on economic transactions, among many other things)
         | and the people who need security and privacy the most. In
         | the meantime, criminals will wise up and move their
         | operations elsewhere, where no backdoors exist.
         | 
         | We basically end up with a broken internet, we put people in
         | harm's way, and the criminals we are targeting just update
         | their OPSEC/MO to no longer rely on E2EE.
        
         | mrtksn wrote:
         | >The other side of the coin is that criminals are using E2EE
         | communication systems to share sexual abuse material in ways
         | and at rates which they were not previously able to.
         | 
         | I think that companies might need to enable some kind of
         | mechanism for offline investigation of devices, though. CSAM
         | is a real problem, there are real predators out there, CSAM
         | isn't the only risk, and law enforcement really does need a
         | way to investigate devices. Previously, my proposal was the
         | ability to force a device to scan the user's content for
         | fingerprints of suspected material, but only with physical
         | access. Requiring physical access forces law enforcement to
         | run a real, official investigation, with reasons strong
         | enough to justify spending the resources and risking
         | repercussions if it is done improperly.
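         | 
         | A minimal sketch of that kind of physical-access fingerprint
         | scan in Python (the mount point and digest list are
         | hypothetical, and a real system would use perceptual hashes
         | rather than exact digests):
         | 
         |   import hashlib
         |   from pathlib import Path
         | 
         |   # Placeholder digests of suspected material, supplied
         |   # under a warrant.
         |   suspect_digests = {
         |       "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
         |   }
         | 
         |   def scan(root):
         |       # Hash every file under `root`, report exact matches.
         |       return [p for p in Path(root).rglob("*")
         |               if p.is_file()
         |               and hashlib.sha256(p.read_bytes()).hexdigest()
         |               in suspect_digests]
         | 
         |   print(scan("/mnt/seized_device"))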
         | 
         | However, the project of scanning all user content in order
         | to police the users was one that irked me, and I was
         | relieved when Apple abandoned it.
         | 
         | Apple's explanation is good and I agree with them, but IMHO
         | the more important aspects are:
         | 
         | 1) Being able to trust your devices to be on your side. That
         | is, your device shouldn't be policing you and shouldn't be
         | snitching on you. At this time you might think that the
         | authorities who would have controlled your device are on
         | your side, but don't forget that those authorities can
         | change. Today the devices may be catching CSAM; some day the
         | authorities could demand they catch people opposing
         | vaccines, and an election or a revolution later they could
         | start catching people who want to have an abortion, have
         | premarital sexual relations, or engage in other non-kosher
         | affairs.
         | 
         | 2) Being free of the notion that you are always watched. If
         | your device can choose to reveal your private thoughts or
         | business, be it by mistake or by design, you can no longer
         | have thoughts that are unaligned with the official ones.
         | This is like the idea of a god who is always watching you,
         | except that instead of a creator and angels you get C-level
         | businessmen and employees who go through your stuff when the
         | device triggers decryption of your data (by false positives
         | or by true positives).
         | 
         | Anyway, policing everyone all the time must be an idea that
         | is rejected by the free world, if the free world doesn't
         | intend to be as free as the Democratic People's Republic of
         | Korea is democratic.
        
         | FredPret wrote:
         | No, no, no.
         | 
         | There is no parallel to be drawn between better encryption and
         | worse outcomes for kids. Should we also outlaw high-performance
         | cars because these sometimes serve as effective getaway
         | vehicles for criminals?
         | 
         | CSAM producers and consumers should be found and punished via
         | old-fashioned methods. How was this done in the past? Did we
         | just never catch any human traffickers / rapists? No, we had
         | detectives who went around detecting and presumably kicking
         | down doors.
         | 
         | To outlaw large sections of mathematics because of this is
         | absurd. And given the amount of power it would hand to big
         | governments and big businesses, the fabric of society
         | wouldn't stand a chance.
        
           | hx8 wrote:
           | I think it's a pretty hardline opinion to state that law
           | enforcement should be confined to "old-fashioned" methods.
           | Tech is changing the world. Let's not let it be a de facto
           | lawless world.
           | 
           | Yea, LE/IC clearly have gone too far in many modern tactics.
           | 
           | Yea, it's possible to build a surveillance/police state much
           | more efficiently than ever before.
           | 
           | Yea, we should be vigilant against authoritarianism.
        
           | techsupporter wrote:
           | > Should we also outlaw high-performance cars because these
           | sometimes serve as effective getaway vehicles for criminals?
           | 
           | What if we change the last bit after the "because" to "these
           | sometimes are used at unsafe speeds and, intentionally or
           | not, kill people who are not in cars?"
           | 
           | Because, at least for me, the answer is an unambiguous yes.
           | 
           | I agree that privacy and security should be available to
           | everyone. But we also shouldn't count on being able to find
           | people who are doing vile things--to children _or_ adults--
           | because the person messed up their opsec. I think Apple is
           | correct here but as an industry we have to be putting our
           | brains to thinking about this.  "To outlaw large sections of
           | mathematics" is hyperbole because we _use mathematics_ to do
           | a lot of things, some useful and some not.
        
           | threeseed wrote:
           | > How was this done in the past? Did we just never catch any
           | human traffickers / rapists?
           | 
           | Recently invented encrypted chat rooms allow people to
           | coordinate and transfer CSAM without any government
           | official being able to infiltrate them. And just being
           | able to discuss freely has been shown to make the problem
           | worse, as it facilitates knowledge transfer.
           | 
           | This is all completely different from the past, where this
           | would have been done in person. So the argument that we
           | should just do what we did in the past makes no sense. As
           | technology advances we need to develop new techniques in
           | order to keep up.
        
         | tomjen3 wrote:
         | The real issue is that the FBI/NSA/CIA have abused our trust
         | so completely that we have to build E2E communication. From
         | assassinating people like Fred Hampton to national security
         | letters, the government has completely lost the trust of
         | tech.
         | 
         | That is a bigger problem, and it will take a long time to
         | fix. So long that I suspect anybody reading this will be
         | long dead by then, but it's like the saying about planting
         | trees.
        
         | Manuel_D wrote:
         | > The other side of the coin is that criminals are using E2EE
         | communication systems to share sexual abuse material in ways
         | and at rates which they were not previously able to.
         | 
         | ...regardless of whether Apple rolls out E2EE, right? End-to-end
         | encryption is available through a whole host of open-source
         | tools, and should Apple deploy CSAM scanning, the crooks will
         | just migrate to a different chat tool.
        
         | themgt wrote:
         | _It is also bad for the fabric of society at large, in the
         | sense that if we don't clearly take a stand against abhorrent
         | behaviour then we are in some sense condoning it. Does the tech
         | industry have any alternate solutions that could functionally
         | mitigate this abuse?_
         | 
         | I'd suggest there's a lot the not-tech industry could do to
         | stop condoning abhorrent behavior that stops short of
         | installing scanners on billions of computing devices. It's
         | become a bit of a trope at this point, but it's bizarre to see
         | a guy who is/was a spokesman for a "minor attracted persons"
         | (i.e. pedos) advocacy group get published negatively
         | reviewing the controversial new sex-trafficking movie ... in
         | Bloomberg:
         | 
         | https://www.bloomberg.com/opinion/articles/2023-07-15/qanon-...
         | 
         | For some background:
         | 
         | https://www.opindia.com/2021/08/meet-noah-berlatsky-prostasi...
         | 
         | His 501(c)(3) non-profit also advocates in favor of pedophilic
         | dolls and has a "No Children Harmed certification seal" program
         | for pedophilic dolls/etc:
         | 
         | https://prostasia.org/blog/dolls-prevent-child-sexual-abuse/
         | 
         | https://prostasia.org/no-children-harmed/
         | 
         | I'm not sure you can criminalize stuff like this, but it sets
         | off my alarm bells when pedophile advocates are being published
         | in mainstream news at the same time there's a moral panic
         | around the need to scan everyone's hard drives. Is society
         | actually trying to solve this problem, or is this more like
         | renewing the Patriot Act to record every American's phone calls
         | at the same time we're allied with al Qaeda offshoots in Syria?
         | Interesting how terrorism has been the primary other argument
         | for banning/backdooring all encryption.
         | 
         | ----
         | 
         | As an aside, I couldn't find ~anything about this group "Heat
         | Initiative" that Apple is responding to, other than a TEDx
         | talk by the founder a couple of years ago, which again seems
         | very focused on "encrypted platforms" as the primary problem
         | that needs solving:
         | https://www.ted.com/talks/sarah_gardner_searching_for_a_chil...
        
           | vacuity wrote:
           | Can't solve social problems with technology, as they say. And
           | as mentioned elsewhere, most child abuse is perpetrated by
           | family members and other close connections.
        
         | [deleted]
        
         | thinking_ape wrote:
         | [dead]
        
         | lamontcg wrote:
         | Are you willing to go to jail after someone hacks into your
         | phone and uploads CSAM to it which triggers this detection
         | mechanism?
        
       | _boffin_ wrote:
       | So... they're just doing on-device scanning instead of iCloud
       | scanning and calling it by a different name?
        
         | sheepscreek wrote:
         | Yes - and there's a huge difference between the two.
         | 
         | In a word, decentralization.
         | 
         | By detecting unsafe material on-device / while it is being
         | created, they can prevent it from being shared. And because
         | this happens on individual devices, Apple doesn't need to know
         | what's on people's iCloud. So they can offer end-to-end
         | encryption, where even the data on their servers is encrypted.
         | Only your devices can "see" it (to Apple's servers it's a
         | black box - gibberish without the correct decryption key).
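         | 
         | Roughly, in toy Python form (Fernet here is just a stand-in
         | for Apple's real key hierarchy, which is far more involved):
         | 
         |     from cryptography.fernet import Fernet
         | 
         |     # Key generated and held on the user's devices only;
         |     # Apple's servers never see it.
         |     device_key = Fernet.generate_key()
         | 
         |     def upload(photo: bytes) -> bytes:
         |         # Encrypt on-device; the server stores only ciphertext.
         |         return Fernet(device_key).encrypt(photo)
         | 
         |     def download(blob: bytes) -> bytes:
         |         # Only a device holding the key can recover the photo.
         |         return Fernet(device_key).decrypt(blob)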
        
         | meepmorp wrote:
         | Not really? It looks like the nudity detection features are all
         | on device, aren't CSAM specific, and seem to be mostly geared
         | towards blocking stuff like unsolicited dick pics.
         | 
         | The earlier design was a hybrid model that scanned for CSAM on
         | device, then had flagged files reviewed on upload.
        
         | olliej wrote:
         | No, the terrible misfeature that this group wants is
         | "government provides a bunch of opaque hashes that are 'CSAM',
         | all images are compared against those hashes, and if any hash
         | matches then the user's details are handed to police"
         | 
         | Note that by design the hashes cannot be audited (though in the
         | legitimate case I don't imagine doing so would be pleasant), so
         | there's nothing stopping a malicious party inserting hashes of
         | anything they want - and then the news report will be "person X
         | brought in for questioning after CSAM detector flagged them".
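         | 
         | To make that concrete, a toy sketch (plain SHA-256 standing
         | in for the perceptual hashes, e.g. NeuralHash, a real system
         | would use; the list entries are invented):
         | 
         |     import hashlib
         | 
         |     # The client only ever receives opaque digests. Whether a
         |     # digest denotes CSAM or something else is unverifiable.
         |     opaque_list = {
         |         hashlib.sha256(b"known contraband image").hexdigest(),
         |         hashlib.sha256(b"protest flyer").hexdigest(),  # slipped in
         |     }
         | 
         |     def scan(photo: bytes) -> bool:
         |         # Identical code path either way; a match reports the user.
         |         return hashlib.sha256(photo).hexdigest() in opaque_list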
         | 
         | That's before countries just pass explicit laws saying that
         | the filter must include LGBT content (in the US, several
         | states consider books with LGBT characters to be sexual
         | content, so an LGBT teenager's photos would be de facto
         | CSAM); in the UK, the IPA has been used to catch people not
         | collecting dog poop, so trusting them not to expand scope is
         | laughable; in Iran, a picture of a woman without a hijab
         | would obviously be reportable; etc.
         | 
         | What Apple has done is add the ability to filter content
         | (e.g. block dick pics) and, for child accounts, extra steps
         | (including providing contact numbers, I think?) if a child
         | attempts to send pics with nudity, etc.
        
           | Dig1t wrote:
           | >in the UK the IPA is used to catch people not collecting dog
           | poop
           | 
           | What does this mean? What is the IPA? I tried Googling for
           | it but I'm not finding much. I would love to learn more
           | about that.
        
             | olliej wrote:
             | The Investigatory Powers Act.
             | 
             | It was passed to stop terrorism - because previously,
             | having multiple people (friends, family, etc.) report
             | that someone was planning a terrorist attack had failed
             | to stop the attack.
        
       | rafale wrote:
       | False positives would constitute a huge invasion of privacy.
       | Even actual positives could be: a mom takes a private picture
       | of her naked baby - how can you report that? They did well
       | dropping this insane plan. The slippery slope argument is also
       | a solid one.
        
         | cmcaleer wrote:
         | NYT article about exactly this situation[0]. Despite the
         | general technical competence of the HN readership, I imagine
         | a lot of people here would find themselves completely fucked
         | if this situation happened to them.
         | 
         | The tl;dr: this man ultimately had his name cleared by the
         | police after his entire Google account history (not just his
         | cloud data) was searched, along with logs from a warrant
         | served to his ISP - yet Google closed his account when the
         | alleged CSAM was detected and never reinstated it. He lost his
         | emails, cloud pictures, and phone number (the loss of which
         | prevented the police from contacting him by phone), and more,
         | all while going through a gross, massive invasion of his
         | privacy because he was trying to do right by his child at a
         | time when face-to-face doctor appointments were difficult to
         | come by.
         | 
         | This should be a particularly salient reminder to people to
         | self-host at the very least the domain for their primary and
         | professional e-mail.
         | 
         | [0] https://www.nytimes.com/2022/08/21/technology/google-
         | surveil...
        
         | [deleted]
        
         | Gigachad wrote:
         | The Apple one was only matching against known images, not
         | trying to detect new ones.
         | 
         | The Google one actually does try to detect new ones, and there
         | are reported instances of Google sending the police after
         | normal parents over photos they took for the doctor.
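         | 
         | The difference in toy form (the digest is a placeholder,
         | nudity_score is a made-up stand-in for Google's actual
         | classifier, and Apple's real design used perceptual hashes
         | rather than plain digests):
         | 
         |     import hashlib
         | 
         |     KNOWN = {"<digest of a known abuse image>"}  # fixed list
         | 
         |     def apple_style(photo: bytes) -> bool:
         |         # Flags only (near-)copies of already-catalogued images.
         |         return hashlib.sha256(photo).hexdigest() in KNOWN
         | 
         |     def nudity_score(photo: bytes) -> float:
         |         return 0.0  # placeholder for a trained ML model
         | 
         |     def google_style(photo: bytes) -> bool:
         |         # A model guesses about brand-new images - hence the
         |         # false positives on parents' medical photos.
         |         return nudity_score(photo) > 0.9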
        
       ___________________________________________________________________
       (page generated 2023-09-01 23:00 UTC)