[HN Gopher] The EARN IT act is an attack on end-to-end encryption
       ___________________________________________________________________
        
       The EARN IT act is an attack on end-to-end encryption
        
       Author : jmsflknr
       Score  : 209 points
       Date   : 2020-03-06 17:13 UTC (5 hours ago)
        
 (HTM) web link (blog.cryptographyengineering.com)
 (TXT) w3m dump (blog.cryptographyengineering.com)
        
       | motohagiography wrote:
       | The risk of backdoors in encryption has changed since the last
       | crypto wars.
       | 
        | With Huawei entering the 5G market, all ostensible law
        | enforcement encryption backdoors now become de-facto Chinese
        | communist party backdoors because of the pervasiveness of their
        | equipment and that nation's interception capabilities.
       | 
       | The UK and Canada have approved Huawei to supply critical
       | networks, and now end-to-end encryption on our personal devices
       | is the _only_ thing preventing interception by Beijing.
       | 
        | It also explains why the US president was so angry with Bojo
        | over approving Huawei: if the U.S. allows Huawei, it must also
        | allow end-to-end encryption so citizens can protect themselves.
        | The national security priority of mitigating that aggressive
        | foreign interception capability for every business in the
        | country should outweigh the special interest of law enforcement
        | using victims' groups as human shields.
        
         | vkou wrote:
          | > With Huawei entering the 5G market, all ostensible law
          | enforcement encryption backdoors now become de-facto Chinese
          | communist party backdoors because of the pervasiveness of
          | their equipment and that nation's interception capabilities.
         | 
         | How is this different from those backdoors being CIA/NSA
         | backdoors, when that equipment was made by US vendors?
         | 
         | Nothing fundamentally changed. We should be building protocols
         | that don't require us to trust the underlying network.
        
         | jszymborski wrote:
         | > The UK and Canada have approved Huawei to supply critical
         | networks,
         | 
         | If you're referring to 5G, Huawei has yet to be given a green
         | light in Canada. Perhaps you're referring to some other
         | infrastructure, however.
        
           | mrybczyn wrote:
            | You mean other than Huawei 5G being chosen and deployed (or
            | about to be) at Telus, one of the 3 major carriers?
           | 
           | And the fact that both Telus and Bell use Huawei for their
           | 4g/LTE networks?
        
             | dblohm7 wrote:
             | That does not mean that the government has approved it. In
             | fact, the telcos are freaked out right now that the
             | government will say no and force the telcos to replace the
             | Huawei stuff.
        
       | seemslegit wrote:
        | FB and Google aside - why does something like Signal need
        | Section 230 protections to operate?
        
         | unnouinceput wrote:
          | World-wide? It doesn't. In the US? It does. Also, once Google
          | obeys and pulls Signal from its store, usage will decrease
          | (and it's not widely used currently to begin with).
          | 
          | Short-term effects? Power back to surveillance / power
          | grabbers. They don't care about CSAM, they care about
          | money/power.
          | 
          | Long-term effects? Like the dark web, Signal (or something
          | similar) will be used mostly by criminals, so EARN IT will
          | fail its "honorable" goal 100%, while achieving its hidden
          | goal (stripping privacy from ordinary citizens).
          | 
          | Also, this will accelerate the Splinternet. The future looks
          | bleak; welcome to it.
        
           | seemslegit wrote:
            | Why does it need them in the US? From what I understand,
            | Section 230 protects a company from lawsuits related to
            | content posted by its users; Signal is not a content
            | platform, as it does person-to-person communication.
        
             | Klonoar wrote:
             | Signal also doesn't retain the content, I believe, which...
             | well, if it's more or less auto-killed, there's no
             | argument, no?
             | 
             | Note that I don't support doing away with Section 230, but
             | I find this train of thought interesting.
        
       | toss1 wrote:
       | Obviously, as TFA describes, this is a huge end-run to create a
       | problem.
       | 
       | But, would this leave a loophole for text-only and/or highly
       | bandwidth-limited communications to remain end-to-end encrypted?
       | 
        | If you cannot send a photo, audio or video, it's kind of hard to
        | send CSAM, yet end-to-end real-time SMS-type messages are still
        | somewhat useful in many instances (better than nothing).
       | 
       | Anyone with more detailed info?
        
         | ISO-morphism wrote:
         | Base64 encode, no such thing as text-only.
        
           | toss1 wrote:
            | Good point, but if you bandwidth-limit and message-size-limit
            | the channel to something like the speed of a world-record
            | typist, sending a Base64-encoded pic of any good resolution
            | would take days, rendering it essentially useless for that
            | sort of thing.
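            | 
            | A rough back-of-the-envelope sketch of that claim (the ~2 MB
            | photo size and ~1,000 chars/minute typing rate are my own
            | assumptions, not exact figures):
            | 
            |     import base64
            |     
            |     photo_bytes = 2 * 1024 * 1024   # assume a ~2 MB photo
            |     b64_chars = len(base64.b64encode(b"x" * photo_bytes))
            |     chars_per_minute = 1000         # ~world-record typing speed
            |     minutes = b64_chars / chars_per_minute
            |     print(f"{minutes / 60 / 24:.1f} days to type one photo")  # ~1.9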
        
             | DuskStar wrote:
             | And now you've blocked me copy-pasting my novel's
             | manuscript...
        
       | cassalian wrote:
        | I have a feeling that the EARN IT act will pass based on how
        | cleverly it disguises its ability to ban end-to-end
        | encryption... I imagine most elected officials will hear
        | something like this:
        | 
        | Person explaining: "It establishes a committee to make sure
        | people are using best practices to ensure child pornography etc.
        | isn't being distributed on their platforms"
       | 
       | Elected official: "Hmm, it actually sounds like this will help
       | the children!"
       | 
        | Maybe it's just me _shrug_, but I have little faith in our
       | elected officials to parse out the ramifications to encryption
       | based on how the act is written.
        
         | tapoxi wrote:
         | >I have little faith in our elected officials to parse out the
         | ramifications to encryption based on how the act is written.
         | 
          | Then it's your duty to contact your Senators/Representative.
          | It's not hard.
        
       | bilekas wrote:
        | There is another reason it's not really effective to try to ban
        | or even limit e2e encryption. If your intentions are nefarious,
        | and you have a group that you need to communicate with, you will
        | just implement the encryption yourself. Or buy burner phones,
        | etc.
        | 
        | Honestly, with my cynical hat on, I feel this is actually being
        | pushed by the marketing and advertisement lobbyists, such as
        | Facebook, Google and others, in order to mine your
        | communications for advertisers.
       | 
       | Maybe my tinfoil hat is a little bit too big, but I really use
       | encrypted communications so that at least there is one dialogue
       | that's not being warped into advertisements for me.
       | 
        | I genuinely fear for future generations' privacy, in all
        | regards. It's worrying and it really does deserve more
        | attention. It's so crazy to my younger nieces and nephews that
        | when I was their age, I didn't have a phone. It blows their
        | minds. And I'm only in my 30s.
        
         | ta999999171 wrote:
         | It's not too big. Metaphorically.
         | 
         | Realistically, it will amplify signals/readings, so, it's
         | outdated.
         | 
         | Anyway, what happens when messaging with ciphertext becomes
         | illegal?
        
         | fenwick67 wrote:
          | Also, a one-time pad is uncrackable and dead simple to
          | program, and you can even do it with pen and paper. The cat's
          | out of the bag for the baddies.
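          | 
          | A minimal sketch of the idea (illustrative only; it assumes a
          | truly random pre-shared pad at least as long as the message,
          | used exactly once):
          | 
          |     import secrets
          |     
          |     msg = b"attack at dawn"
          |     pad = secrets.token_bytes(len(msg))   # pre-shared, never reused
          |     ciphertext = bytes(m ^ p for m, p in zip(msg, pad))
          |     plaintext = bytes(c ^ p for c, p in zip(ciphertext, pad))
          |     assert plaintext == msg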
        
           | gph wrote:
            | I've seen OTPs described as a replacement for e2e encryption
            | before, but I just don't see them as actually usable outside
            | of large established organizations (NSA, cartels, etc.)
            | passing short text messages. And even then the usability
            | seems cumbersome enough that they'd only use it when
            | strictly necessary.
           | 
           | Am I missing something?
        
             | aianus wrote:
             | I don't see why it's limited to short text messages.
             | 
             | Your pre-shared OTP can be a rack of 8TB hard drives
             | delivered to an embassy by the Marine Corps which covers a
             | whole lot of documents and media before it's exhausted.
        
               | bilekas wrote:
               | Looks like a legit solution to me!
        
           | bilekas wrote:
            | Exactly, the problem for the baddies has been solved since
            | at least 1882 (Frank Miller, IIRC - edit: Wiki says yes). So
            | this argument about security is obviously really about what
            | information they're getting.
           | 
           | You can't blame someone for not knowing what they don't know,
           | but lawmakers are supposed to enact laws based on the benefit
           | for their citizens. Seems not to be the case here.
        
       | squarefoot wrote:
       | I might be playing Captain Obvious here, but anyway, this
       | worldwide coronavirus pandemic state of panic is the best
       | possible scenario in which corrupt politicians in any country
       | could enact restrictive laws or do nasty things while their
       | citizens look elsewhere. If they were waiting for a good mass
       | distraction weapon, well, this is it.
        
       | carapace wrote:
       | > The new bill would make it financially impossible for providers
       | like WhatsApp and Apple to operate services unless they conduct
       | "best practices" for scanning their systems for CSAM.
       | 
       | I'm okay with that, as long as e2e isn't actually banned.
        
         | cfv wrote:
          | As the author states, encrypting your data impedes scanning
          | it, so any encryption that is actually worth anything
          | immediately puts them at fault, so it won't be applied.
         | 
         | Any unmonitored user upload is at risk here.
         | 
         | This is dark.
        
       | andrewflnr wrote:
       | Is there any real evidence that restricting distribution of CSAM
       | creates better outcomes for victims? More to the point is there
       | evidence that increased aggression in enforcement of CSAM
       | possession laws produces proportionately better outcomes? I would
       | expect the ROI curve to go flat pretty quickly.
        
         | mindslight wrote:
         | It's a hot button for lawmakers because they want the privilege
         | reserved for themselves. I mean that is why these psychopaths
         | lust after power over others, right? To partake in what money
         | cannot buy.
         | 
         | It's similar cognitive dissonance to homophobic agendas being
         | pushed by closeted politicians.
        
       | wilkystyle wrote:
       | The argument from those in power against encryption is mind-
       | bogglingly stupid, and I don't know if it is due to an extreme
       | ignorance, or because of a lust for the power and leverage that
       | mass surveillance grants a governing entity (I suppose it could
       | be both?)
       | 
       | Regardless, using the argument of child pornography and sex
       | trafficking is an emotional play, and it is solely designed to
       | resonate with those who also do not understand the technology.
       | 
          | If this same argument took another form, e.g. if this were an
          | attempt to ban walls made out of non-transparent material
          | because opaque walls allow child abuse to occur hidden from
          | sight, the obvious violation of privacy would be evident to
          | the average person.
       | 
       | What inevitably happens with laws that create such a drastic
       | power imbalance between your average citizen and the governing
       | entity is that those with power and status are exempt.
        
         | duxup wrote:
          | Not just the governing entity...
          | 
          | ALL the governing entities, or any entity with bad intentions,
          | because whatever magical backdoor is available will surely
          | leak, and in that case everyone is at the mercy of not just
          | their own government (let's say they're responsible folks) but
          | other government groups who do not care what happens to anyone
          | outside their borders.
        
         | throwaway55554 wrote:
         | > The argument from those in power against encryption is mind-
         | bogglingly stupid...
         | 
         | Is it, though? I mean, most people think all sorts of bad stuff
         | happens on the internet, so playing to those emotions would
         | garner more support.
         | 
         | > ... extreme ignorance, or because of a lust for the power and
         | leverage...
         | 
         | The latter. They believe that tech companies have too much
         | power right now.
        
           | TrueDuality wrote:
           | The funny thing is that true end-to-end encryption actually
           | weakens the power a lot of tech companies have. That content
           | isn't available for them to scrape.
           | 
            | Mandating that encryption can't be end-to-end guarantees
            | these large companies access to private data they wouldn't
            | otherwise have. Data they can then use for their own gain.
            | When people complain that they have that access, they'll now
            | have the excuse that the government made them do it.
        
           | cjfd wrote:
           | Whether or not the tech companies have too much power does
            | not sound like the most relevant question here. The lack of
            | power of citizens to communicate privately is the more
            | worrisome thing. And maybe some random person is right to
            | say that all of his communications are not something to
            | hide. It becomes something else if lawyers and journalists
            | can't protect their communications. If the state cannot
            | suffer that, it is rather likely that it is engaging in
            | dirty business that it wants to hide, as pretty much all of
            | them are.
        
         | CJefferson wrote:
         | Encryption isn't like transparent walls in my opinion.
         | 
         | In the real world, the police can get into anywhere, and get
         | basically any physical object, once they have a warrant. Most
         | people seem to agree that is reasonable -- I don't think there
         | is a big push for an easy way for people to hide physical
         | objects from police.
        
         | saber6 wrote:
         | > The argument from those in power against encryption is mind-
         | bogglingly stupid, and I don't know if it is due to an extreme
         | ignorance, or because of a lust for the power and leverage that
         | mass surveillance grants a governing entity (I suppose it could
         | be both?)
         | 
         | The people we are talking about are not stupid. We both know
         | which reality is true, as sad as the admission feels.
        
         | bilekas wrote:
         | > using the argument of child pornography and sex trafficking
         | is an emotional play
         | 
          | It is, and it's used every time something like this comes up,
          | actually. But take it for what it is: most of the lawmakers
          | don't fully understand what they're deciding on, so they
          | depend on lobbyists for their info, and unfortunately those
          | lobbyists are always just interested in making money, so the
          | lawmakers get a skewed view and some nice talking points. I'm
          | not sure whether to feel bad for them or to just be confident
          | that the next generation of life long politicians might be
          | people like us who are aware of this problem and enact laws to
          | protect privacy.
          | 
          | The old swinging-pendulum example.
        
           | inetknght wrote:
           | > _just be confident that the next generation of life long
           | politicians might be people like us who are aware of this
           | problem and enact laws to protect privacy._
           | 
              | Do you want a pay cut? I don't want a pay cut. Unless we
              | take a pay cut and go get elected, it won't be people like
              | us who are the next generation of life long politicians.
        
             | bilekas wrote:
              | Personally, no, I'll be the first to put my hand up and
              | say I don't have the patience to try and deal with the
              | bureaucracy, but I do believe that there are some people
              | who go into politics to genuinely make a difference and
              | improve society. As cynical as we can be about
              | politicians, there are surely people out there who would
              | just like to 'fix' things. Usually the roadblock to that
              | is the interest groups with influence. It's annoying at
              | best, but I hold out hope...
        
       | Klonoar wrote:
       | Has there been any public statement from Apple, Google, and
       | assorted companies? I'd imagine they have to be concerned about
       | this.
       | 
       | This one feels kind of weird in that I'm not seeing the same
       | level of uproar/pushback as I've seen in the past, which is a
       | slightly frightening bit.
        
         | cft wrote:
          | If one of them is behind on encryption relative to the others,
          | it's a competitive disadvantage. If all of them are forced to
          | abolish encryption, it's a welcome opportunity for better ad
          | targeting. Additionally, this increased regulatory and
          | technological burden is a good start-up deterrent. What's not
          | to like?
        
           | Klonoar wrote:
           | The bill is specifically going after Section 230 on the
           | surface, though, which I would think would be of more
           | immediate concern to them. Most blog posts even note that the
           | encryption aspect is being targeted as a run-around.
           | 
           | I'm willing to entertain the logic, sure, just not sure I
           | agree with it. Feels like there's more at stake for them
           | (collectively) here.
        
       | ctoth wrote:
       | Please won't someone think of the children?
        
         | bostik wrote:
         | I'd like it if someone actually thought of the adults, for a
         | change.
        
       | clarry wrote:
       | > So in short: this bill is a backdoor way to allow the
       | government to ban encryption on commercial services.
       | 
       | I'm all for this. Suddenly people in tech would have to start
       | taking free & open source decentralized services seriously
       | instead of lazily relying on Google and Facebook while
       | complaining about how evil they are.
        
         | danShumway wrote:
         | We want decentralized systems so that people will be more free.
         | Giving up freedom in order to get them is completely
         | counterproductive.
         | 
         | Decentralized systems aren't nothing; they're more resilient
         | against these types of attacks than the alternatives. However,
         | decentralized systems don't _welcome_ these attacks any more
         | than anybody else does. These attacks still hurt us, and they
          | still make our life harder -- China isn't _more_ free because
         | its centralized services are all back-doored.
         | 
         | There's a (thankfully fringe) group of people who keep saying
         | that if Section 230 goes away everyone will just switch to more
         | Open protocols and it'll be fine. But this is painfully naive;
         | a pseudo-requirement to backdoor communications will make it
         | harder to build Open platforms and onboard users, because no
         | commercial host will want to touch a platform that exposes them
         | to liability. Getting rid of Internet freedom turns the people
         | using these systems into criminals, which will make Open
         | platforms much more dangerous to use, much more risky to
         | sponsor, and much harder to advertise or develop.
         | 
         | What happens when you go to host your private encrypted email
         | on Linode, and Linode says, "no, because then we can't scan
         | your server for CSAM"? What happens when the Matrix org tries
         | to set up a free server to onboard new users and the government
         | prosecutes them? What happens when every user running a Tor
         | exit node becomes liable for content traced back to that IP?
         | What on earth makes you think the DOJ won't prosecute hosts of
         | Open services?
         | 
         | There are so many ways this law can go wrong, and so many ways
         | it can be expanded from here to shut down the projects you
         | think it's helping.
        
           | clarry wrote:
           | > There are so many ways this law can go wrong, and so many
           | ways it can be expanded from here to shut down the projects
           | you think it's helping.
           | 
           | It'd have to be expanded by a ridiculous amount. To the point
           | where it's practically speaking illegal for anyone to run
           | non-approved software on their computers, if it can send
           | messages over the internet. I'm not saying they wouldn't try
           | that, but in practice it's completely unrealistic. Which
           | means, at best, it'll be a law that criminalizes everyone and
           | nobody really cares.
           | 
           | Hilariously, the same tech behemoths that people give up
           | their freedom to are doing everything they can to push us
           | into the same exact situation, with their walled gardens and
           | power asymmetry that allows them to squeeze out competition.
           | 
           | > But this is painfully naive; a pseudo-requirement to
           | backdoor communications will make it harder to build Open
           | platforms and onboard users, because no commercial host will
           | want to touch a platform that exposes them to liability.
           | 
           | If you play your cards right, these "platforms" are more like
           | internet routers and load balancers that facilitate message
           | exchange between computers. You build your platform on top of
           | this technology. Banning the technology would be akin to
           | banning TCP/IP+TLS or UDP+DTLS. I don't see hosting providers
           | having a case for banning encrypted transport protocols. And
           | the law in question doesn't go that far; again, for it to go
           | there we'd need a law that virtually bans all encrypted
           | message exchange on the internet.
           | 
           | > What happens when the Matrix org tries to set up a free
           | server to onboard new users and the government prosecutes
           | them?
           | 
           | Once decentralization becomes a thing techies care about
           | because they need it, I'm sure we can spread software by word
           | of mouth just like we did back in the early days of Kazaa,
            | DC++, Torrents, etcetera. In fact that's pretty much how
           | software gets adopted today. You only need faddy & flashy
           | onboarding when you're trying to growth hack a product that
           | has no intrinsic demand for it. We'd be way past that point.
           | 
           | > What happens when every user running a Tor exit node
           | becomes liable for content traced back to that IP?
           | 
           | That's already something you should worry about if you're
           | about to run a Tor exit node. I don't recommend it.
           | 
           | Exit nodes are just bowing down to the centralized clearnet.
           | A proper decentralized network is one where the exchange
           | stays within, and doesn't rely on a single point source as a
           | hosting node that can be taken down. If a message is in the
           | network, it can be anywhere, and it can and will replicate
           | itself if requested.
           | 
           | > What on earth makes you think the DOJ won't prosecute hosts
           | of Open services?
           | 
           | The easy way out for them is that everyone uses the same
           | handful of services provided by a handful of tech behemoths.
           | Then they only need to prosecute those, if they don't play
           | along. That's largely where we've been headed, and I think it
            | really sucks, because I lose my freedom and privacy to both
            | these tech companies and their government.
           | 
           | If individual people started participating in a network and
           | hosting their own nodes, the situation becomes much harder.
           | If they seriously tried to prosecute everyone, it'd end up
           | being much like the neverending war on drugs (or piracy or
           | similar). Except that in this case, you wouldn't be up
           | against just potheads and kids downloading movies, you'd also
           | be up against professionals (and not only in tech) keeping
           | their comms confidential. But everyone, individually, is a
           | small fry, so fighting a legal battle against everyone is
           | very nonproductive, unlike slapping one megacorp with
           | millions in fines they can actually pay.
           | 
           | The status quo is terrifying, and it is not getting better
           | because even technical users are too lazy to care if they
           | don't have to. If I move to decentralized services that try
           | to provide freedom and anonymity, I'm just isolating myself
           | from everything else and simultaneously painting a target on
           | my back. That's because these things do not have enough
           | mindshare, and they will never have enough mindshare when
           | techies just encourage everyone to keep using google & fb &
           | co.
        
       | pat2man wrote:
        | Couldn't Apple and Google just ship a client-side model for
        | detecting certain images, which apps could use to detect CSAM
        | without data leaving the device?
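        | 
        | A toy sketch of what on-device matching might look like. This
        | uses an exact SHA-256 hash against a hypothetical local list;
        | real systems use perceptual hashes (e.g. PhotoDNA), which this
        | does not implement:
        | 
        |     import hashlib
        |     
        |     # hypothetical list of known-bad hashes shipped with the app
        |     BLOCKLIST = {
        |         "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
        |     }
        |     
        |     def flagged(image_bytes: bytes) -> bool:
        |         """Return True if the image's hash appears on the local list."""
        |         return hashlib.sha256(image_bytes).hexdigest() in BLOCKLIST
        |     
        |     print(flagged(b""))  # True only because the sample hash is SHA-256 of empty input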
        
         | SkyBelow wrote:
          | How long before someone extracts the algorithm and turns it
          | into an application to test images before they are shared? It
          | could even be used as a benchmark for software trying to hide
          | from the authorities.
         | 
         | One thing to remember is that "The Net interprets censorship as
         | damage and routes around it." applies regardless of how
         | agreeable the censorship is.
        
         | bostik wrote:
          | The US regulators are painfully aware of device rooting and
          | how it allows subverting any client-side security measures.
         | 
         | Disclosure: I have been dealing with certain state regulators
         | and their morbid fear of forged geo-location data for a number
         | of months. It would be inane to assume other regulators would
         | be any less informed about the threat vector.
        
         | kodablah wrote:
         | Linked to from the article:
         | https://blog.cryptographyengineering.com/2019/12/08/on-clien...
        
       | throwbackThurs wrote:
        | It's also very easy to implement your own client-side encryption
        | methods with wrappers around service provider APIs. Sure, they
        | may be private APIs, but if people are desperate, libraries and
        | apps will appear to do this.
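        | 
        | As a rough illustration of the wrapper idea (a sketch using the
        | third-party Python "cryptography" package; send_via_provider is
        | a hypothetical stand-in for whatever provider API you wrap):
        | 
        |     from cryptography.fernet import Fernet
        |     
        |     key = Fernet.generate_key()   # shared out-of-band with the recipient
        |     f = Fernet(key)
        |     
        |     def send_via_provider(blob: bytes) -> None:
        |         ...  # hypothetical: hand the opaque blob to any messaging API
        |     
        |     # the provider only ever sees ciphertext
        |     send_via_provider(f.encrypt(b"meet at noon"))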
       | 
        | Really, anyone who wants to encrypt data WILL; the algorithms
        | are public knowledge, and it ain't going anywhere.
       | 
        | Why is the English-speaking world becoming so authoritarian in
        | its policing of its people? This isn't going to end well; I
        | honestly believe the English Empire is burning right now.
        
       | tboyd47 wrote:
       | How do you figure? The word "encryption" doesn't appear anywhere
       | in the bill.
        
       | faeyanpiraat wrote:
        | Couldn't homomorphic encryption be part of a possible solution
        | to enable end-to-end encryption and server-side scanning as
        | well?
        
         | nullc wrote:
          | Fully homomorphic encryption doesn't provide an ability to
          | operate on encrypted data and get a decrypted output. It is
          | also absurdly slow, with order-of-magnitude performance of a
          | minute per AND gate in the operation being performed.
         | 
          | But, let's forget the terms you used and consider the
          | question of "can fancy crypto do something here?"
         | 
         | A protocol could be created using a zero knowledge proof and a
         | private set intersection that could do the following: I compute
         | the hash of a file, blind it, and then submit it for you to
         | query against a secret database of naughty hashes (Private set
         | intersection / Private information retrieval). Then I encrypt
         | the file, send it, the opened intersection result, and a ZKP
         | that the encrypted file has a hash corresponding to the query.
         | 
         | The server only learns if the encrypted file was a hit on your
         | database, it doesn't even learn the file's hash if it wasn't a
          | hit. If the private intersection scheme is set up right, the
          | user doesn't learn if it was a hit or not.
         | 
          | Assuming the naughty hash database was reasonably small (like
          | tens of thousands of items), and users were assumed to be on
          | very fast smartphones or desktops... then this could actually
          | have workable performance with existing tech -- on the order
          | of tens of seconds of processing on the client, milliseconds
          | on the server.
         | 
          | But this kind of scheme is pointless: I just make a one-bit
          | change to every file I send and it'll never match. You could
          | invoke some kind of fuzzy match, but then false positives are
          | a real problem, the fuzzy hashing is a lot more expensive to
          | perform inside the ZKP (now you need every user on an 8-core
          | desktop), and the fuzzy matching is prohibitively expensive
          | inside the private intersection (so the server side scales
          | poorly).
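          | 
          | (To make the one-bit point concrete, a quick sketch: flipping
          | a single bit yields a completely unrelated hash, so an exact-
          | match database never fires.)
          | 
          |     import hashlib
          |     
          |     data = b"some image bytes"
          |     tweaked = bytes([data[0] ^ 0x01]) + data[1:]   # flip a single bit
          |     print(hashlib.sha256(data).hexdigest())
          |     print(hashlib.sha256(tweaked).hexdigest())     # completely different digest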
         | 
         | You could go further and make it so that the server could
         | decrypt the entire file if and only if there was a fuzzy match
          | (and even then, the user still can't tell if a match happened) --
         | but even that would create really bad incentives to stuff the
         | database with false-positive producing data (or just loads of
         | legally protected speech which they'd like to covertly
         | monitor). You couldn't make the database public and transparent
         | without distributing the naughty-data yourself and without
         | making it easy for users to self-censor anything that might
         | match.
         | 
         | ... and that kind of supercharged scheme is also still easily
         | defeated by just pre-encrypting the data.
         | 
         | So you'd have a system that was absurdly expensive to create,
         | expensive to operate (both for the client and server),
         | extremely non-transparent due to its complexity (even if it was
         | open source) and private database, and would have extremely
         | limited ability to do its job. Users would be subjected to an
         | uncertain and non-transparent level of non-privacy. To me that
         | seems more dystopian than a transparent "we're gonna watch
         | everything you do", at least with a simple surveillance state
         | you know where you stand and you'll refrain from complaining
         | about Dear Leader online, where it might result in you ending
         | up in a prison camp.
         | 
          | The whole discussion misses the point that the real goal of
          | these systems isn't to protect children, stop child porn,
          | etc. (which, as awful as it is, is essentially a rounding
          | error in the risks we face); the real purpose is to subject
          | the population to pervasive whole-take, retroactively
          | accessible surveillance.
        
         | seemslegit wrote:
          | No, by definition, allowing a 3rd party to derive any
          | information about the semantics of the content means it's not
          | end-to-end encryption, or not a two-way communication (the
          | 3rd party is assumed to be trusted).
         | 
         | Also - this is hardly the point.
        
         | jagged-chisel wrote:
         | Perhaps. What's the state-of-the-art in homomorphic encryption?
         | I only recall vaporware, platitudes, and non-starters in the
         | last year.
        
       ___________________________________________________________________
       (page generated 2020-03-06 23:00 UTC)