[HN Gopher] Debunking NIST's calculation of the Kyber-512 securi...
       ___________________________________________________________________
        
       Debunking NIST's calculation of the Kyber-512 security level
        
       Author : bumbledraven
       Score  : 172 points
       Date   : 2023-10-03 19:49 UTC (3 hours ago)
        
 (HTM) web link (blog.cr.yp.to)
 (TXT) w3m dump (blog.cr.yp.to)
        
       | greggsy wrote:
        | It would be interesting to see Signal Sciences' response to
        | this post of Bernstein's.
        
       | codeflo wrote:
       | That's more of a diary than an article -- jargony, disorganized,
       | running in circles, very hard to follow. But the information
       | might be important regardless. There's a strong implication that
        | NIST, with the help of the NSA, intentionally standardized on a weak
       | algorithm.
       | 
       | We all know that's possible.
       | 
       | But can someone who follows some of this stuff more closely
       | explain what the play would be? I always assumed that weakening
       | public cryptography in such a way is a risky bet, because you
       | can't be sure that an attacker doesn't independently find out
       | what you know. You can keep a secret backdoor key (that was the
       | accusation when they released Dual_EC_DRBG), but you can't really
       | hide mathematical results.
       | 
       | Why would they be willing to risk that here?
        
         | aaomidi wrote:
         | > Why would they be willing to risk that here?
         | 
          | Certain types of backdoors basically require you to have a
          | specific private key in order to exploit them. That's the current
         | guess on what may be happening with the NIST ECC curves.
         | 
         | If so, this can be effectively a US-only backdoor for a long,
         | long time.
        
           | tptacek wrote:
           | I don't believe that is anybody's guess on what may be
           | happening with the NIST ECC curves. Ordinarily, when people
           | on HN say things like this, they're confusing Dual EC, a
           | public key random number generator, known to be backdoored,
           | with the NIST curve standards.
        
             | dfox wrote:
             | The issue with the NIST curves is that they were generated
             | from a PRNG with some kind of completely random seed. The
              | conspiracy theory there is that the seed was selected so
             | as to make the curve exploitable for NSA and NSA only.
              | Choosing such a seed would be somewhat harder than a
              | complete break
             | of the hash function (IIRC SHA-2) used in the PRNG that was
             | used to derive the curve.
             | 
              | On the other hand, there are a lot of reasons to use an
              | elliptic curve that was intentionally designed, i.e., DJB's
              | designs. And well, in 2009 I would not have imagined that the
              | kind of stuff DJB publishes would end up in TLS 1.3.
        
               | tptacek wrote:
               | It's very _unlikely_ the seeds were random, and they
                | weren't even ostensibly generated from a PRNG, as I
               | understand it. Rather, they were passed through SHA1
               | (remember: this is the 1990s), as a means to destroy any
               | possible structure in the original seed. The actual seeds
               | themselves aren't my story to tell, but are a story that
               | other people are talking about. For my part, I'll just
                | point again to Koblitz and Menezes on the actual
               | cryptographic problems with the NIST P-curve seed
               | conspiracy:
               | 
               | https://eprint.iacr.org/2015/1018.pdf
        
               | f33d5173 wrote:
                | A hash function is a (CS)PRNG. It has the key property of
                | being indistinguishable from randomness while
               | being generated deterministically.
        
               | tptacek wrote:
                | In fact, `echo "This is my seed" | openssl dgst -sha256`
               | is not really a CSPRNG. Hash functions are the bases of
               | many PRNGs. But I think you're abusing an ambiguity with
               | the word "random" here. At any rate: we should be clear
               | now on the point being made about the P-curve seeds.
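                | 
                | To make the distinction concrete, here's a toy sketch (an
                | illustrative counter-based construction, not any
                | standardized DRBG): a single hash call yields one fixed
                | digest, while a generator keeps state and ratchets it to
                | produce an arbitrarily long deterministic stream.
                | 
                |     import hashlib
                | 
                |     seed = b"This is my seed"
                | 
                |     # One hash call: a single fixed 32-byte digest, not a
                |     # generator of anything.
                |     digest = hashlib.sha256(seed).digest()
                | 
                |     # Toy hash-based generator (illustration only): keep
                |     # internal state, ratchet it, emit blocks on demand.
                |     def toy_hash_stream(seed, n_blocks):
                |         state = hashlib.sha256(b"init|" + seed).digest()
                |         out = b""
                |         for i in range(n_blocks):
                |             out += hashlib.sha256(state + i.to_bytes(4, "big")).digest()
                |             state = hashlib.sha256(b"ratchet|" + state).digest()
                |         return out
                | 
                |     stream = toy_hash_stream(seed, 4)  # 128 bytes, same every run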
        
             | api wrote:
             | Also: if the NIST ECC curves actually are backdoored then
             | why would the NSA need to try to push a backdoored random
             | number generator? Just exploit the already-backdoored
             | curves.
        
               | PeterisP wrote:
               | Redundancy, so if one backdoor is closed/fixed/avoided,
               | you still have more.
        
             | aaomidi wrote:
             | Yeah I've noticed people mixing them up. They happened
             | around the same time, so I can excuse it a bit.
             | 
              | The problem with the NIST ECC curves is that we still do
             | not know where the heck that seed came from and why that
             | seed specifically.
        
               | tptacek wrote:
                | See Koblitz and Menezes:
               | 
               | https://eprint.iacr.org/2015/1018.pdf
        
         | manonthewall wrote:
         | It's all far too conspiratorial for me. Just show me the math
          | as to why it's broken; I don't need a conspiratorial mind map
         | drawing speculative lines between various topics. Do an
         | appendix or two for that.
        
         | Exoristos wrote:
         | You're making an assumption that the NSA cares about the
         | efficacy of cryptography for other people. Why would they care
         | about that?
        
       | nmitchko wrote:
        | Unfortunately, the NSA & NIST are most likely recommending a
        | quantum-proof algorithm that they've developed cryptanalysis
        | against, either through high q-bit proprietary technology or
        | specialized de-latticing algorithms.
        | 
        | The NSA is very good at math, so I'd be thoroughly surprised if
        | this analysis was an error by mistake rather than an error
        | through intent.
        
         | tptacek wrote:
         | "Specialized de-latticing algorithms"?
        
         | [deleted]
        
         | [deleted]
        
         | [deleted]
        
         | jrochkind1 wrote:
          | The NSA also has a mission-based interest in _breaking_ other
          | people's crypto though, which is generally known.
          | 
          | So I'm surprised by your argument. Even if the NSA knows more
          | than they are telling us, this doesn't leave most of us feeling
          | less worried, as their ends may not be strengthening the
          | public's cryptography!
        
           | aaomidi wrote:
           | Yes: https://en.wikipedia.org/wiki/Dual_EC_DRBG
           | 
            | Also, to this day we still do not know where the seeds for
            | P256 and P384 came from. And we're using those curves
            | everywhere.
           | There is a non-zero chance that the NSA basically has a
           | backdoor for all NIST ECC curves, and no one actually seems
           | to care.
        
             | brohee wrote:
             | Or you find it somewhat credible but still use them because
             | fending off the NSA is not something you want to spend
              | energy on, and you are confident that the NSA thinks
             | no one else can find the backdoor.
        
             | sweis wrote:
              | The NIST P-256 curve seed came from the X9.62 specification
             | drafted in 1997. It was provided by an NSA employee, Jerry
             | Solinas, as an example seed among many other seeds,
             | including those provided by Certicom. Read this for more
             | details: https://eprint.iacr.org/2015/1018
        
           | garba_dlm wrote:
           | I just find it sad that it's things like these that make it
           | impossible for the layman to figure out what is going on
           | with, for example, Mochizuki's new stuff
           | 
           | I have no reason to doubt that a lot of math has been made
           | more difficult than necessary just because it is known to
           | give a subtle military advantage in some cases, but this
           | isn't new;
        
           | cosmojg wrote:
           | Isn't that what the person you're replying to said?
        
           | yung-africas wrote:
           | [flagged]
        
             | rideontime wrote:
             | what do you have against dodo birds
        
         | sweis wrote:
         | "High q-bit proprietary technology" and "specialized de-
         | latticing algorithms" are made up terms that nobody uses.
        
           | tptacek wrote:
           | I'm stuck on trying to work out what it would mean to de-
           | lattice something. Would that transform a lattice basis into
           | a standard vector space basis in R or something, or, like
           | MOV, would it send the whole lattice to an element of some
           | prime extension field?
           | 
           | In my mind's eye, it's cooler: it's like, you render the
           | ciphertext as a raster image, and then "de-lattice" it to
           | reveal the underlying plaintext, scanline by scanline.
        
             | garba_dlm wrote:
             | i'm still working on understanding lattices better
             | 
             | but i can imagine, based on my own ignorance, creativity,
              | and lack of correct understanding, that it would be some kind of
             | factorization.
             | 
              | as I think about it while trying to better understand what
              | a lattice is, I
             | imagine a lattice like a coordinate pair, but instead of
             | each coordinate existing on a line, they exist on a binary
             | tree (or some other directed graph explored from a root
             | outwards without cycles)
             | 
             | which means you have two such binary-trees (not necessarily
             | binary, but it's just easier to work with them seemingly)
             | 
             | and then you combine these into ONE lattice. so then, to
             | de-lattice means to recover the binary trees.
             | 
             | but when I say binary tree I'm thinking about rational
              | numbers (because Stern-Brocot trees)
        
               | tptacek wrote:
               | A lattice is like a vector space, but with exclusively
               | integer coefficients. It's not a coordinate pair. If you
               | think of vectors as coordinate pairs, a vector space is a
               | (possibly unbounded) set of coordinate pairs. If you
               | haven't done any linear algebra, a decent intuition would
               | be mathematical objects like "the even numbers" or "the
               | odd numbers", but substituting vectors (fixed-sized
               | tuples of numbers) for scalars.
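                | 
                | To make that concrete, a tiny sketch (made-up 2-D basis,
                | chosen only for illustration): the lattice is every
                | _integer_ combination of the basis vectors, and nothing in
                | between.
                | 
                |     from itertools import product
                | 
                |     # Hypothetical 2-D basis, for illustration only.
                |     b1 = (3, 1)
                |     b2 = (1, 2)
                | 
                |     # All integer combinations a*b1 + c*b2 within a box.
                |     def lattice_points(bound):
                |         return {(a * b1[0] + c * b2[0], a * b1[1] + c * b2[1])
                |                 for a, c in product(range(-bound, bound + 1), repeat=2)}
                | 
                |     pts = lattice_points(3)
                |     # Coefficients like 0.5 are not allowed; that is the
                |     # whole difference from the vector space spanned by the
                |     # same two vectors.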
        
       | thadt wrote:
        | The unfortunate reality of this is that while he may be _right_,
       | it is difficult to classify the responses (or non-response) from
       | the NIST people as deceptive vs just not wanting to engage with
       | someone coming from such an adversarial position. NIST is staffed
       | by normal people who probably view aggressively worded requests
       | for clarification in the same way that most of us have probably
       | fielded aggressively worded bug reports.
       | 
       | Adding accusatory hyperbolic statements like: "You exposed three
       | years of user data to attackers by telling people to use Kyber
       | starting when your patent license activates in 2024, rather than
       | telling people to use NTRU starting in 2021!" doesn't help.
        | Besides the fact that nobody will be deploying standalone PQ for some
       | time, there were several alternatives that NIST could have
       | suggested in 2021. How about SIKE? That one was pretty nice until
       | it was broken last year.
       | 
       | Unfortunately, NIST doesn't have a sterling reputation in this
       | area, but if we're going to cast shade on the algorithm and
       | process, a succinct breakdown of why, along with a smoking gun or
        | two, would be great. Pages and pages of email analysis, comparison
       | to (only) one other submission, and accusations that everyone is
       | just stalling so data can be vacuumed up because it is completely
        | unprotected make it harder to take seriously. If Kyber-512 is
       | actually this risky, then it deserves to be communicated clearly.
        
       | nonrandomstring wrote:
        | Love the narrative style of this piece, second-guessing the
       | erroneous thought processes. Are they deceptive? Who knows.
       | 
       | What worries me is that it's neither malice nor incompetence, but
       | that a new darker force has entered our world even at those
       | tables with the highest stakes.... dispassion and indifference.
       | 
       | It's hard to get good people these days. A lot of people stopped
       | caring. Even amongst the young and eager. Whether it's climate
       | change, the world economic situation, declining education, post
       | pandemic brain-fog, defeat in the face of AI, chemicals in the
       | water.... everywhere I sense a shrug of slacking off, lying low,
       | soft quitting, and generally fewer fucks are given all round.
       | 
       | Maybe that's just my own fatigue, but in security we have to
        | be vigilant _all the time_ and there's only so much energy humans
       | can bring to that. That's why I worry that we will lose against
       | AI. Not because it's smarter, but because it doesn't have to
       | _care_, whereas we do.
        
         | r3trohack3r wrote:
         | Bad systems beat good people.
         | 
         | There are a lot of symptoms to distract yourself with. Focus on
         | the game instead.
         | 
         | A society full of good people will sort out the rest.
        
           | bdamm wrote:
           | This apathy is an interesting phenomenon, let's not ignore
           | it. The Internet has brought us a wealth of knowledge but it
           | has also shown us how truly chaotic the world really is. And
           | negativity is a profitable way to drive engagement, so damn
           | near everyone can see how problematic our society is. And
           | when the algorithm finds something you care to be sad about,
           | it will show you more, more, and ever more all the way into
           | depression.
           | 
           | This is the lasting legacy of the Internet, now. Not freedom
           | for all to seek and learn, but freedom for the negativity
           | engines to seek out your brain and suck you into personal
           | obliteration.
           | 
           | A society of good people? Nobody really cares any more. And I
           | do agree with the gp; if you look, you can see it everywhere.
           | What is this going to become? Collective helplessness as we
            | eke out what little bits of personal fulfillment we can get
           | in between endless tragedy and tantalizing promise?
        
       | perihelions wrote:
       | Related thread from last year, with 443 comments:
       | 
        | https://news.ycombinator.com/item?id=32360533 (_"NSA, NIST, and
       | post-quantum crypto: my second lawsuit against the US government
       | (cr.yp.to)"_)
        
       | neonate wrote:
       | http://web.archive.org/web/20231003195013/https://blog.cr.yp...
       | 
       | https://archive.ph/NrOG6
        
       | fefe23 wrote:
       | If you have never heard of Bernstein, this may look like mad
       | ramblings of a proto-Unabomber railing against THE MAN trying to
       | oppress us.
       | 
       | However, this man is one of the foremost cryptographers in the
       | world, he has basically single-handedly killed US government
        | crypto export restrictions back in the day, and (not least of
       | all because of Snowden) we know that the NSA really is trying to
       | sabotage cryptography.
       | 
       | Also, he basically founded the field of post-quantum
       | cryptography.
       | 
       | Is NIST trying to derail his work by standardizing crappy
       | algorithms with the help of the NSA? Who knows. But to me it does
       | smell like that.
       | 
       | Bernstein has a history of being right, and NIST and the NSA have
       | a history of sabotaging cryptographic standards (google
       | Dual_EC_DRBG if you don't know the story).
        
         | NovemberWhiskey wrote:
         | > _If you have never heard of Bernstein, this may look like mad
         | ramblings of a proto-Unabomber railing against THE MAN trying
         | to oppress us._
         | 
         | Can I point out that Ted Kaczynski was also actually a
         | mathematical prodigy, having been accepted into Harvard on a
         | scholarship at 16?
        
           | skeaker wrote:
           | If you want, sure, but I think the reason he was mentioned
            | with a negative connotation might have more to do with the
           | murders he committed.
        
         | [deleted]
        
         | kpdemetriou wrote:
         | Bernstein is often right, despite the controversy around the
         | Gimli permutation.
         | 
         | In this particular case it's worth noting that neither BSI
         | (Germany) nor NLNCSA (The Netherlands) recommend Kyber.
         | 
         | Unfortunately, alternative algorithms are more difficult to
         | work with due to their large key sizes among other factors, but
         | it's a price worth paying. At Backbone we've opted not to go
         | down the easy route.
        
         | throw0101a wrote:
         | > _If you have never heard of Bernstein, this may look like mad
         | ramblings of a proto-Unabomber railing against THE MAN trying
         | to oppress us._
         | 
         | > _However, this man is one of the foremost cryptographers in
         | the world_ [...]
         | 
         | It's possible to be both (not saying Bernstein is).
         | 
         | Plenty of smart folks have 'jumped the shark' intellectually:
         | Ted Kaczynski, the Unabomber, was very talented in mathematics
         | before he went off the deep end.
        
         | ignoramous wrote:
         | An interesting set of comments (by tptacek) from a thread in
         | 2022 (I wonder if they still hold the same opinion in light of
         | this latest post on NIST-PQC by djb):
         | 
          | > _The point isn't that NIST is trustworthy. The point is that
         | the PQC finalist teams are comprised of academic cryptographers
         | from around the world with unimpeachable reputations, and it's
         | ludicrous to suggest that NSA could have compromised them. The
         | whole point of the competition structure is that you don't
         | simply have to trust NIST; the competitors (and cryptographers
         | who aren't even entrants in the contest) are peer reviewing
         | each other, and NIST is refereeing._
         | 
         | > _What Bernstein is counting on here is that his cheering
          | section doesn't know the names of any cryptographers besides
         | "djb", Bruce Schneier, and maybe, just maybe, Joan Daemen. If
         | they knew anything about who the PQC team members were, they'd
         | shoot milk out their nose at the suggestion that NSA had
         | suborned backdoors from them. What's upsetting is that he knows
         | this, and he knows you don't know this, and he's exploiting
         | that._
         | 
         | ---
         | 
         | > _I spent almost 2 decades as a Daniel Bernstein ultra-fan ---
          | he's a hometown hero, and also someone whose work was
         | extremely important to me professionally in the 1990s, and, to
         | me at least, he has always been kind and cheerful... I know
         | what it's like to be in the situation of (a) deeply admiring
         | Bernstein and (b) only really paying attention to one
         | cryptographer in the world (Bernstein)._
         | 
         | > _But talk to a bunch of other cryptographers --- and, also,
         | learn about the work a lot of other cryptographers are doing
          | --- and you're going to hear stories. I'm not going to say
         | Bernstein has a bad reputation; for one thing, I'm not
         | qualified to say that, and for another I don't think "bad" is
         | the right word. So I'll put it this way: Bernstein has a fucked
         | up reputation in his field. I am not at all happy to say that,
         | but it's true._
         | 
         | ---
         | 
          | > _What's annoying is that [Bernstein is] usually right, and
         | sometimes even right in important new ways. But he runs the
         | ball way past the end zone. Almost everybody in the field
         | agrees with the core things he's saying, but almost nobody
         | wants to get on board with his wild-eyed theories of how the
         | suboptimal status quo is actually a product of the Lizard
         | People._
         | 
         | (https://news.ycombinator.com/item?id=32365259,
         | https://news.ycombinator.com/item?id=32368598,
         | https://news.ycombinator.com/item?id=32365679)
        
           | tptacek wrote:
           | I hope he finds all sorts of crazy documents from his FOIA
           | thing. FOIA lawsuits are a very normal part of the process
           | (I've had the same lawyers pry loose stuff from my local
           | municipality). I would bet real money against the prospect of
           | him finding anything that shakes the confidence of practicing
           | cryptography engineers in these standards. Many of the
           | CRYSTALS team members are quite well regarded.
        
           | mort96 wrote:
           | I don't think the "these finalist teams are trustworthy"
           | argument is completely watertight. If the US wanted to make
           | the world completely trust and embrace subtly-broken
           | cryptography, a pretty solid way to do that would be to make
            | a competition where a whole bunch of great, independent teams
           | of cryptography researchers can submit their algorithms, then
           | have a team of excellent NSA cryptographers analyze them and
           | pick an algorithm with a subtle flaw that others haven't
            | discovered. Alternatively, NIST or the NSA would just need to
           | plant one person on one of the teams, and I'm sure they could
           | figure out some clever way to subtly break their team's
           | algorithm in a way that's really hard to notice. With the
            | first option, no participant in the competition has to know that
           | there's any foul play. In the second, only a single
           | participant has to know.
           | 
           | Of course I'm not saying that either of those things
           | happened, nor that they would be easy to accomplish. Hell,
           | maybe they're literally impossible and I just don't
           | understand enough cryptography to know why. Maybe the NIST
           | truly has our best interest at heart this time. I'm just
           | saying that, to me, it doesn't seem impossible for the NIST
           | to ensure that the winner of their cryptography contests is
           | an algorithm that's subtly broken. And given that there's
           | even a slight possibility, maybe distrusting the NIST
           | recommendations isn't a bad idea. They do after all have a
           | history of trying to make the world adopt subtly broken
           | cryptography.
        
             | tptacek wrote:
             | If the NSA has back-pocketed exploits on the LWE submission
             | from the CRYSTALS authors, it's not likely that a purely
             | academic competition would have fared better. The CRYSTALS
             | authors are extraordinarily well-regarded. This is quite a
             | bank-shot theory of OPSEC from NSA.
        
               | mort96 wrote:
               | It's true that nothing is 100% safe. And to some degree,
               | that makes the argument problematic; regardless of what
               | happened, one could construct a way for US government to
               | mess with things. If you had competition of the world's
               | leading academic cryptographers with a winner selected by
               | popular vote among peers, how do you know that the US
               | hasn't just influenced enough cryptographers to push a
               | subtly broken algorithm?
               | 
               | But we must also recognize a difference in degree. In a
               | competition where the US has no official influence over
               | the result, there has to be a huge conspiracy to affect
               | which algorithm is chosen. But in the competition which
               | actually happened, they may potentially just need a
               | single plant on one of the strong teams, and if that
               | plant is successful in introducing subtle brokenness into
               | the algorithm without anyone noticing, the NIST can just
               | declare that team's algorithm as the winner.
               | 
               | I think it's perfectly reasonable to dismiss this
               | possibility. I also think it's reasonable to recognize
               | the extreme untrustworthiness of the NIST and decide to
               | not trust them if there's even a conceivable way that
               | they might've messed with the outcome of their
               | competition. I really can't know what the right choice
               | is.
        
               | tptacek wrote:
               | That's an argument that would prove too much. If you
               | believe NSA can corrupt academic cryptographers, then you
               | might as well give up on all of cryptography; whatever
               | construction you settle on as trustworthy, they could
               | have sabotaged through the authors. Who's to say they
               | didn't do that to Bernstein directly? If I'd been
               | suborned by NSA, I'd be writing posts like this too!
        
               | mort96 wrote:
               | You're still not recognizing the difference between
               | corrupting a single academic cryptographer and corrupting
               | a whole bunch of academic cryptographers. This isn't so
               | black and white.
               | 
               | For what it's worth, I do think the US government could
               | corrupt academic cryptographers. If I was an academic
               | cryptographer, and someone from the US government told me
               | to do something immoral or else they would, say, kill my
               | family, and they gave me reason to believe the threat was
               | genuine, I'm not so sure I wouldn't have done what they
               | told me. And I know this sounds like spy movie shit, but
               | this is _the US government_.
               | 
               | One last thing though, if you're giving me the black and
               | white choice between blindly trusting the outcome of a US
               | government cryptography standard competition or
               | distrusting the field of cryptography altogether, I
               | choose the latter.
        
               | tptacek wrote:
               | As long as we're clear that your concern involves spy
               | movie shit, and not mathematics or computer science, I'm
               | pretty comfortable with where we've landed.
        
               | mort96 wrote:
               | If your argument is: "assuming the US government wouldn't
               | be able to make someone act against their will and stay
               | silent about it, the NIST recommendation is trustworthy",
               | I'm certainly more inclined to distrust this
               | recommendation than I was before this conversation.
               | 
               | Note that the "forcing someone to comply" thing was just
               | meant as one possibility among many, I don't see why you
               | completely dismiss the idea of someone who's good at
               | cryptography being in on the US's mission to intercept
               | people's communications. I mean the NSA seems to be full
               | of those kinds of people. You also dismiss the
               | possibility that they just ... picked the algorithm that
               | they thought they could break after analysing it, with no
               | participant being in on anything. But I get the feeling
               | that you're not really interested in engaging with this
               | topic anymore, so I'll leave it at that. It's already
               | late here.
        
         | zahllos wrote:
         | This comment is factually incorrect on a number of levels.
         | 
         | 1) single-handedly killed US government crypto export
         | restrictions - Bernstein certainly litigated, but was not the
         | sole actor in this fight. For example, Phil Zimmerman, the
         | author of PGP, published the source code of PGP as a book to
         | work around US export laws, which undoubtedly helped highlight
         | the futility of labelling open source software as a munition:
         | https://en.wikipedia.org/wiki/Pretty_Good_Privacy#Criminal_i...
         | 
         | 2) Bernstein "founded" the field of post quantum cryptography:
         | Uh. Ok. That's not how academia works. Bernstein was certainly
         | an organiser of the first international workshop on post
         | quantum cryptography, but that's not the same as inventing a
         | field. Many of the primitives that are now candidates were
         | being published long before this, McEliece being one of the
          | oldest, but even Ajtai's lattice reductions go back to '97.
         | 
          | 3) The dual_ec rng was backdoored (this previously read "was
          | and is fishy"; poor wording on my part), but nobody at the time
          | wanted
         | NIST to standardize it because it was a _poor PRNG anyway_:
         | slow and unnecessarily complicated. Here is a patent from Scott
         | Vanstone on using DUAL_EC for "key escrow" which is another way
         | of saying "backdoor":
         | https://patentimages.storage.googleapis.com/32/9b/73/fe5401e...
         | - filed in 2006. In case you don't know Scott Vanstone, he's
         | the founder of Certicom. So at least one person noticed. This
         | was mentioned in a blog post as a result of the Snowden leaks
         | working out how the backdoor happened:
         | https://blog.0xbadc0de.be/archives/155
         | 
         | NSA have been caught in a poor attempt to sabotage a standard
         | that nobody with half a brain would use. On the other hand NSA
         | also designed SHA-2, which you are likely using right now, and
         | I'm not aware of anyone with major concerns about it. When I
         | say NSA designed it, I don't mean "input for a crypto
         | competition" - a team from the NSA literally designed it and
         | NIST standardized it, which is not the case for SHA-3, AES or
         | the current PQC process.
         | 
         | DJB is a good cryptographer, better than me for sure. But he's
         | not the only one - and some very smart, non-NSA, non-US-citizen
         | cryptographers were involved in the design of Kyber, Dilithium,
         | Falcon etc.
        
           | tptacek wrote:
           | Dual EC is virtually certain to be a backdoor.
           | 
           | I had the same take on Dual EC prior to Snowden. The big
           | revelation with Snowden wasn't NSA involvement in Dual EC,
           | but rather that (1) NSA had intervened to get Dual EC
           | defaulted-on in RSA's BSAFE library, which was in the late
           | 1990s the commercial standard for public key crypto, and (2)
           | that major vendors of networking equipment were --- in
           | defiance of all reason --- using BSAFE rather than vetted
           | open-source cryptography libraries.
           | 
           | DJB probably did invent the _term_ "post-quantum
           | cryptography". For whatever that's worth.
        
             | zahllos wrote:
             | DualEC: agree. Wanted to point out that it was a poor PRNG
             | _anyway_ and point out that the NSA's attempt at
             | backdooring the RNG wasn't that great - as you say, RSA
             | BSAFE used it and it made no sense. We could also point out
             | they went after the RNG rather than the algorithm directly,
             | which is a less obvious strategy.
             | 
             | I'll believe he invented the term - I have a 2009 book so-
             | named for which he was an editor surveying non-DLP/non-RSA
              | algorithms. Still, the idea that he's "the only one who can
              | produce the good algorithms" and that literally everyone else
              | on the pqc list (even if we subtract all the NIST people) is
              | wrong, is bonkers.
        
               | ziddoap wrote:
               | While I agree with a lot of what you have said,
               | 
                | > _Still, the idea that he's "the only one who can
               | produce the good algorithms"_
               | 
               | The parent post did not, at all, make the claim that
               | Bernstein is the _only one_.
        
       | [deleted]
        
       | jeffrallen wrote:
       | Something I've learned from a career of watching cryptographer
       | flame wars: Don't bet against Bernstein, and don't trust NIST.
        
       | jcranmer wrote:
       | Notwithstanding DJB's importance to cryptography, and the fact
       | that I'm ignorant of a large number of details here, there was a
       | point where he lost a lot of credibility with me.
       | 
       | Specifically, when he gets to the graphs, he says "NIST chose to
       | deemphasize the bandwidth graph by using thinner red bars for
       | it." That is just not proven by his evidence, and there is a very
       | plausible explanation for it. The graph that has the thinner bars
       | is a bar chart that has more data points than the other graph.
       | Open up your favorite charting application, and observe the
       | difference in a graph that has 12 data points versus one with
       | 9... of course the one with 12 data points has thinner lines! At
       | this point, it feels quite strongly to me that he is trying to
       | interpret every action in the most malicious way possible.
       | 
       | In the next bullet point, he complains that they're not using a
       | log scale for the graph... where everything is in the same order
       | of magnitude. That doesn't sound like a good use case for log
       | scale, and I'm having a hard time trying to figure out why it
       | might be justified in this case.
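        | 
        | You can see both effects with a quick sketch (made-up numbers,
        | not NIST's figures; matplotlib assumed): the same axis width with
        | 9 vs 12 bars, plus a log-scale variant where everything already
        | sits within one order of magnitude.
        | 
        |     import matplotlib.pyplot as plt
        | 
        |     # Made-up values, all within one order of magnitude.
        |     nine = [3, 5, 2, 8, 6, 4, 7, 5, 9]
        |     twelve = nine + [4, 6, 3]
        | 
        |     fig, axes = plt.subplots(1, 3, figsize=(12, 3))
        |     axes[0].bar(range(9), nine)
        |     axes[0].set_title("9 bars")
        |     axes[1].bar(range(12), twelve)
        |     axes[1].set_title("12 bars, same width: thinner bars")
        |     axes[2].bar(range(12), twelve)
        |     axes[2].set_yscale("log")
        |     axes[2].set_title("log scale, values span < 1 decade")
        |     plt.tight_layout()
        |     plt.show()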
       | 
       | Knowing that DJB was involved in NTRU, it's a little hard to
       | shake the feeling that a lot of this is DJB just being salty
       | about losing the competition.
        
         | aaomidi wrote:
         | > Knowing that DJB was involved in NTRU, it's a little hard to
         | shake the feeling that a lot of this is DJB just being salty
         | about losing the competition.
         | 
          | There aren't a lot of people in the world with the technical
         | know-how for cryptography. It's clear that competitors in this
          | space are going to be reviewing each other's work.
        
           | tptacek wrote:
           | Yes, that was the premise of the competition, and was in fact
           | what happened.
        
           | pnpnp wrote:
            | Sure, but this was just a weird thing to home in on.
        
         | ziddoap wrote:
         | > _At this point, it feels quite strongly to me that he is
         | trying to interpret every action in the most malicious way
         | possible._
         | 
         | Given the long and detailed history of various governments and
         | government agencies purposefully attempting to limit the public
         | from accessing strong cryptography, I tend to agree with the
         | "assume malice by default" approach here. Assuming anything
         | else, to me at least, seems pretty naive.
        
         | [deleted]
        
         | DangitBobby wrote:
         | If you continue reading, you'll find that they aren't
         | responding to requests for clarification on their hand-waving
         | computations. Suspicion is definitely warranted.
        
       | tptacek wrote:
       | An important detail you really want to understand before reading
       | this is that NIST (and NSA) didn't come up with these algorithms;
       | they refereed a competition, in which most of the analysis was
       | done by competitors and other academics. The Kyber team was
       | Roberto Avanzi, Joppe Bos, Leo Ducas, Eike Kiltz, Tancrede
       | Lepoint, Vadim Lyubashevsky, John M. Schanck, Gregor Seiler,
       | Damien Stehle, and also Peter Schwabe, a collaborator of
       | Bernstein's.
        
         | wnevets wrote:
          | Correct me if I'm wrong, but everything is also being done out
          | in the open for everyone to see. The NIST isn't using some
          | secret
         | analysis to make any recommendations.
        
           | tux3 wrote:
           | Teams of cryptographers submit several proposals (and break
           | each other's proposals). These people are well respected,
           | largely independent, and assumed honest. Some of the mailing
           | lists provided by NIST where cryptographers collaborated to
           | review each other's work are public
           | 
           | NIST may or may not consort with your friendly local
           | neighborhood NSA people, who are bright and talented
           | contributors in their own right. That's simply in addition to
           | reading the same mailing lists
           | 
           | At the end, NIST gets to pick a winner and explain their
            | reasoning. What influenced the decision is surely a
           | combination of things, some of which may be internal or
           | private discussions
        
           | tptacek wrote:
           | You don't really know, but you can be reasonably sure that
           | they didn't sabotage the submissions themselves.
        
           | pclmulqdq wrote:
           | There is a final standardization step where NIST selects
           | constants, and this is done without always consulting with
           | the research team. Presumably, these are usually random, but
           | the ones chosen for the Dual-EC DRBG algorithm seem to have
           | been compromised. SHA-3 also had some suspicious
              | constants/padding, but that hasn't been shown to be vulnerable
           | yet.
        
             | tptacek wrote:
             | The problem with Dual EC isn't the sketchy "constants", but
             | rather the structure of the construction, which is a random
             | number generator that works by doing a public key
             | transformation on its state. Imagine CTR-DRBG, but
             | standardized with a constant AES key. You don't so much
              | wonder about the provenance of the key as wonder
             | why the fuck there's a key there at all.
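              | 
              | To make the analogy concrete, here's a toy sketch of
              | "CTR-DRBG, but with a hard-coded key" (using the
              | pyca/cryptography package; not the real NIST construction,
              | where the key is part of the secret state): anyone who knows
              | the fixed key can decrypt one output block, recover the
              | counter, and predict everything that follows.
              | 
              |     import os
              |     from cryptography.hazmat.primitives.ciphers import (
              |         Cipher, algorithms, modes)
              | 
              |     CONST_KEY = bytes(16)  # imagine a standard fixing this
              | 
              |     def output_blocks(v, n):
              |         # CTR-style stream: AES_K(V), AES_K(V+1), ...
              |         enc = Cipher(algorithms.AES(CONST_KEY),
              |                      modes.ECB()).encryptor()
              |         return [enc.update((v + i).to_bytes(16, "big"))
              |                 for i in range(n)]
              | 
              |     secret_v = int.from_bytes(os.urandom(12), "big")
              |     out = output_blocks(secret_v, 3)
              | 
              |     # Whoever knows CONST_KEY recovers V from one block and
              |     # predicts the rest of the stream.
              |     dec = Cipher(algorithms.AES(CONST_KEY),
              |                  modes.ECB()).decryptor()
              |     v = int.from_bytes(dec.update(out[0]), "big")
              |     assert output_blocks(v, 3) == out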
             | 
             | I don't know of any cryptographer or cryptography engineer
             | that takes the SHA3 innuendo seriously. Do you?
             | 
             | Additional backstory that might be helpful here: about 10
             | years ago, Bernstein invested a pretty significant amount
             | of time on a research project designed to illustrate that
             | "nothing up my sleeves" numbers, like constants formed from
             | digits of pi, e, etc, could be used to backdoor standards.
             | When we're talking about people's ability to cast doubt on
             | standards, we should keep in mind that the paragon of that
             | idea believes it to be true of pi.
             | 
             | I'm fine with that, for what it's worth. Cryptography
             | standards are a force for evil. You can just reject the
             | whole enterprise of standardizing cryptography of any sort,
             | and instead work directly from reference designs from
             | cryptographers. That's more or less how Chapoly came to be,
             | though it's standardized now.
        
               | pclmulqdq wrote:
               | I do know a few cryptographers who were suspicious of
                | SHA-3 when it came out, but after some napkin math turned
                | up no obvious hole, they were fine with it. The
               | actual goal of that extra padding was to get extra one
               | bits in the input to avoid possible pathological cases.
               | 
               | My understanding of the Dual-EC problem may be different
               | than yours. As I understand it, the construction is such
               | that if you choose the two constants randomly, it's fine,
                | but if you derive them from a known secret, the output
                | is predictable for anyone who knows the secret. The NIST
               | did not provide proof that the constants used were chosen
               | randomly.
               | 
               | Random choice would be equivalent to encrypting with a
               | public key corresponding to an unknown private key, while
               | the current situation has some doubt about whether the
               | private key is known or not.
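                | 
                | A toy sketch of that shape (small made-up numbers in a
                | multiplicative group; the real thing uses elliptic-curve
                | points P and Q and truncates the output): if Q was
                | manufactured from P with a known exponent, the trapdoor
                | holder can recover the internal state from one output and
                | predict every later one.
                | 
                |     # g generates a subgroup of prime order q mod p
                |     p, q, g = 2039, 1019, 4
                | 
                |     P = g
                |     d = 777             # secret of whoever made Q
                |     Q = pow(P, d, p)    # the two published "constants"
                |     e = pow(d, -1, q)   # trapdoor: P == pow(Q, e, p)
                | 
                |     def drbg(state, n):
                |         outs = []
                |         for _ in range(n):
                |             state = pow(P, state, p) % q   # state update
                |             outs.append(pow(Q, state, p))  # output
                |         return outs
                | 
                |     outs = drbg(123, 4)
                | 
                |     # One output reveals the *next* internal state:
                |     s = pow(outs[0], e, p) % q
                |     assert pow(Q, s, p) == outs[1]  # next output predicted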
        
               | tptacek wrote:
               | Who were those cryptographers?
        
           | codr7 wrote:
           | My rule of thumb in these situations is always: if they
           | could, they would.
           | 
            | I've seen enough blatant disregard for humanity not to assume
            | any kind of honesty in the powers that were.
        
         | [deleted]
        
         | kpdemetriou wrote:
         | [delayed]
        
       ___________________________________________________________________
       (page generated 2023-10-03 23:00 UTC)