Podcast Jul 08, 2022

Root Causes 232: NIST Announces Its Post Quantum Crypto Selections

NIST has announced its winning algorithms for round 3 of its post-quantum cryptography "contest." Join us as we name the winning algorithms and explain why they were chosen. We discuss the continuing effort to arrive at additional algorithms, and we talk about the next steps coming out of this announcement.

  • Original Broadcast Date: July 8, 2022

Episode Transcript

Lightly edited for flow and brevity.

  • Tim Callan

    We’ve been talking for years about quantum-safe cryptography and the contest, as they call it, that NIST has led and championed to come up with quantum-safe or post-quantum algorithms, and on July 5, very recently, NIST announced the winners in NIST IR 8413. We’ll get to this.

  • Jason Soroko

    Everybody who is interested in this - - and you should be - - there’s a heavy, heavy document that you passed over to me, Tim. If you search on NIST IR 8413, there’s a really, really meaty PDF file that NIST put out, very well-considered, very well-written. If you don’t have a Ph.D. in math, it will probably fry your brain. It just about fried mine.

  • Tim Callan

    It is technically deep. It is very technically deep. I could not remotely follow the math, but it also includes the implications, which is what you and I were focused on. So, we have winners! So, who are the winners?

  • Jason Soroko

    So, Tim, there really are two major categories in the third round of candidates, which is what we’re talking about: the key encapsulation methods and the signature primitives. So, what are we talking about? Everyone wants to know the names of the cryptographic algorithms, and so, for the KEMs, it is CRYSTALS-Kyber.
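For listeners who want a concrete picture, a KEM’s interface boils down to three operations: key generation, encapsulation, and decapsulation. The sketch below is a toy stand-in built from plain hashing - - it is not Kyber and offers no real security - - but it shows the shape of the interface that Kyber implements.

```python
# Toy sketch of the three-operation KEM interface that Kyber implements:
# key generation, encapsulation, decapsulation. This stand-in uses plain
# hashing, so it is NOT secure and NOT Kyber -- it only shows the shape
# of the interface and the property that both sides end up with the same
# shared secret.
import hashlib
import secrets

def keygen():
    # Receiver makes a keypair; the public key goes out, the secret stays.
    sk = secrets.token_bytes(32)
    pk = hashlib.sha256(sk).digest()
    return pk, sk

def encapsulate(pk):
    # Sender derives a fresh shared secret plus a ciphertext for the receiver.
    r = secrets.token_bytes(32)
    shared = hashlib.sha256(pk + r).digest()
    return r, shared  # a real KEM would hide r under the public key

def decapsulate(sk, ciphertext):
    # Receiver uses the secret key to recover the same shared secret.
    pk = hashlib.sha256(sk).digest()
    return hashlib.sha256(pk + ciphertext).digest()

pk, sk = keygen()
ciphertext, sender_secret = encapsulate(pk)
receiver_secret = decapsulate(sk, ciphertext)
assert sender_secret == receiver_secret
```

Both parties then feed that shared secret into fast symmetric encryption; the KEM’s only job is to agree on it safely.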

  • Tim Callan

    CRYSTALS-Kyber, or people just normally refer to it as Kyber.

  • Jason Soroko

    Whenever I’ve been talking to people who are really, truly in the know in the quantum world - - quantum-resistant cryptographic algorithms - - we will typically refer to it as Kyber. But let’s stick with KEMs for a moment, because what’s interesting here, Tim, first of all, is that when we are talking about selection, they have now officially been selected for standardization - - which basically means the standardization has not happened yet, but there has now been an official selection to go to that stage.

  • Tim Callan

    Official-ish. Let’s be clear. NIST does not actually control these standards organizations. There is no legally binding power in this. However, that said, these standards organizations pay rapt attention to NIST’s work. They have been paying rapt attention to this. They have followed NIST’s recommendations in other areas of updating our cryptography like replacing SHA-1, for example, and it would be a ludicrously absurd outcome if they didn’t follow NIST’s recommendations. So this is, this is in the bag, but it’s not official-official.

  • Jason Soroko

    I think where the rubber is truly gonna hit the road is with the standardization. There’s gonna be a ton more work to do, and in fact, I don’t think we’re gonna see standardization finished until at least 2024. We are talking another year and a half, maybe up to two years from where we are today.

  • Tim Callan

    But at least we can get started.

  • Jason Soroko

    So for the KEMs, Tim, there were nine. By Round 3, we had gotten down to nine names. We now have one that has absolutely been selected for standardization and another four that will advance to a fourth round.

  • Tim Callan

    That’s funny, and they announced some months ago that there would be a fourth round, and that was not originally the expectation, I think, either for NIST or for the rest of us, so tell us why that’s going on.

  • Jason Soroko

    BIKE, Classic McEliece, HQC and SIKE - - these are the names, of course, of the cryptographic algorithms in question - - and I think they wanna cut that number down from four to something lower. The reason we have one chosen to standardize is just like NIST said in their previous rounds: the way they are choosing what to standardize is based on all-around capabilities for performance and security, and I think the best balance of the two was Kyber. When you start looking at the others, what becomes very, very clear is that you’d have to really know a reason why you would choose one of them over Kyber. In other words, Kyber would probably have a shorter bit length than most of these others. However, there may be some performance advantages or some security advantages with these other ones that would make them say, you know what, we’re gonna allow these to go toward standardization; however, it’s only gonna be for x or y or z use case.

  • Tim Callan

    Also there was a sense of just a little bit of an insurance policy. Like, what if some genius at a whiteboard somewhere is thinking about these problems and mathematics, in general, in a way that nobody else has before. Because this happens every once in a while, and as a consequence a paper gets published tomorrow, and we realize that the implication of this is that Kyber is not gonna be any good. Under those circumstances, they don’t want us to have nothing. No matter how unlikely that is, cause I think we all think that’s phenomenally unlikely, it’s hard to declare that it is categorically impossible. Therefore, they wanna have some kind of life raft against the phenomenally unlikely possibility that that occurs.

  • Jason Soroko

    I wanna dig into what you just said a little more right after we finish this off, Tim, because once we talk about the signature cryptographic algorithms, then we can get into what they have in common and the reasons why NIST is kind of holding out for a fourth round and actually made an interesting statement as well, and we’ll talk about that.

  • Tim Callan

    So, I think that’s great. Why don’t we talk about Dilithium, and then we’ll come back to that.

  • Jason Soroko

    So for the signatures you’ve got, of course, CRYSTALS-Dilithium - - as you say always, Tim, typically we just call it Dilithium - - and that is the cryptographic algorithm that is going to be standardized, along with FALCON and SPHINCS+. FALCON and SPHINCS+ were both really good. The issue with both of them is - - well, actually, one of the advantages they have over Dilithium is that they have really short bit lengths compared to Dilithium. And so there are going to be some use cases where they’re going to be useful, so all three of those - - Dilithium, FALCON and SPHINCS+ - - are going to move toward standardization, or those are the recommendations to go toward standardization. Three more got kicked out of the six.

  • Tim Callan

    To be clear - - let me just make sure I’m stating it clearly, and tell me if you don’t agree with this - - we’re expecting Dilithium to be in use 99.99999% of the time, and it’s gonna be very specific, controlled environments, where someone has a real good reason to go with, let’s say, FALCON, that you will see those other algorithms in play.

  • Jason Soroko

    100%, and Tim, we will cover in future podcasts some of these that have moved forward into standardization and what they would be best used for. We’ll talk about it in real layman’s language so that it’s a lot more clear. In fact, now that we know it’s basically Kyber and Dilithium - - if you wanna have the names on your lips, those are the two names, Kyber and Dilithium. One thing that needs to be kept in mind is that these are both ultimately lattice-based. Let’s go back in time - - let’s go back to the algorithms we use today, RSA and ECC. You could categorize both of them as hard number theory problems: factorization for RSA, the discrete logarithm for ECC.

    In other words, you’re forcing a classical computer to solve a difficult mathematical problem in exponential time. As your bit length goes up and up, the amount of time that it takes to solve the mathematical problem posed by these algorithms expands to the point where - -

  • Tim Callan

    Scaling is your friend, and by adding relatively few bits, you can make it impossibly hard to brute force.
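Tim’s point about scaling can be made concrete with a few lines of arithmetic: each bit added to a key doubles the classical brute-force search space, so attack cost grows exponentially while key size grows only linearly.

```python
# A concrete look at "scaling is your friend": adding one bit to a key
# doubles the number of candidates a classical brute-force attacker must
# try, so key size grows linearly while attack cost grows exponentially.
def brute_force_candidates(bits):
    """Size of the search space for a brute-force attack on a bits-bit key."""
    return 2 ** bits

for bits in (56, 80, 128, 256):
    print(f"{bits:3d}-bit key: about {brute_force_candidates(bits):.2e} candidates")
```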

  • Jason Soroko

    What’s very important to note, of course, is that there is no known classical mathematics that makes factoring the product of large prime numbers easy. In other words, it would take quantum computing and Shor’s algorithm to do so. That’s RSA. For ECC, the hard problem is solving for points on an elliptic curve, and the argument that that is always difficult, with no shortcuts, actually came out of the work around Fermat’s Last Theorem and some other really good mathematical work. So a little later in time came elliptic curve cryptography, and we currently use that. What we’re going to see is that lattice-based math is probably gonna dominate early forms of post-quantum cryptographic algorithms, and I think, Tim, it’s worth us talking through that in a separate podcast - - because, like all math, it’s hard, but I’m gonna attempt - - maybe I’ll fall on my face, you never know - - to explain why it’s hard.

  • Tim Callan

    It’s great. That’s a great topic. That deserves its own podcast for sure, but that’s a great topic. Let’s do that.

  • Jason Soroko

    The last point, and it goes right back to what you were saying: okay, we have lattice-based cryptography, with Kyber and Dilithium both falling in that category. I think what NIST is hoping for is something other than lattice-based as an alternative, in case one day lattice-based falls on its face because somebody has this Eureka moment and goes, oh, I know how to bypass that.

  • Tim Callan

    Well, right. And they were pretty explicit about that when they explained in the paper why Classic McEliece was included. This is me horribly paraphrasing, but the gist of it goes: this has been around a long time, it’s been pounded on a lot, and we think if there were any great big skeletons in the closet, we would’ve found them by now. So that’s our little insurance policy. And we don’t like it because the keys are bigger than we’d prefer, but if it was that or don’t have cryptography at all, we could all cope.

  • Jason Soroko

    If we now reflect on what you just said and take a look at everything that’s in this document from NIST after the third round - - I mean, I would love for somebody to tell me otherwise, but I think we can start feeling good that, at the very least, our mathematical basis is very sound, and that’s a good place to be. I don’t know exactly how long lattice math has been around, but if you take a look at some of the earlier studies of it - - people have been trying to solve integer-based lattice problems, and number theory in general, going way back; I think the Babylonians were even messing with that stuff. This is a known quantity, and therefore it’s math that’s been pounded on for a long time.

  • Tim Callan

    Let me get your perspective on this, Jason. Is this the kind of thing, like is it possible with an elegant enough or smart enough mathematical proof to actually prove that one of these algorithms doesn’t have any flaws or vulnerabilities, or is this more of a black swan problem where we never know for sure? We just know we haven’t found it, and we think we would have found it by now?

  • Jason Soroko

    Perfect. So, let’s compare and contrast with ECC: we’ve got Fermat’s Last Theorem, and a piece of what came out of that solution was a proof that as long as you parameterize your ECC problem correctly, it is a very hard problem.

    With lattice - - and by the way, this is now a preview of the next podcast, to give you a sneak peek - - what makes lattices interesting is that you gotta kinda twist your mind for a moment and realize that fundamentally, the problem is finding the shortest vector in the lattice: the shortest vector problem, or SVP, which is the acronym you’ll often see. What will be extremely important in the standardization is how big your boundaries need to be. In other words, again, it’s about your parameterization, because you have to make the problem big enough for the SVP to take a quantum computer millions of years to solve. That will be the challenge in standardization.

    I think, though, that with what NIST has done here in choosing the third round - - and we’re eventually seeing the fourth round - - you now see these lattice-based problems as the raw material, the raw math, plus some incredibly clever additions to the ways to do these things. It’s not just “let’s go off and solve for a large prime number” with a simple formula. This kind of post-quantum cryptography involves multiple forms of mathematics all working in conjunction with each other, and this is why the NIST document is so absolutely heavy. You don’t have to be an expert in just one form of math; you have to be an expert in multiple forms of mathematics to fully appreciate the cleverness that made solving these problems computationally operational in a real-world example.
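As a tiny, illustrative taste of the shortest vector problem Jason describes - - a toy two-dimensional example only, since real lattice schemes work in hundreds of dimensions, where this kind of brute-force search is hopeless:

```python
# Toy version of the shortest vector problem (SVP): given two basis
# vectors, find the shortest nonzero integer combination of them.
# Brute force is easy in two dimensions with small coefficients; real
# lattice cryptography uses hundreds of dimensions, where this search
# blows up combinatorially.
from itertools import product
from math import hypot

def shortest_vector(b1, b2, bound=10):
    best, best_len = None, float("inf")
    for a, b in product(range(-bound, bound + 1), repeat=2):
        if a == 0 and b == 0:
            continue  # the zero vector does not count as a solution
        x = a * b1[0] + b * b2[0]
        y = a * b1[1] + b * b2[1]
        if hypot(x, y) < best_len:
            best, best_len = (x, y), hypot(x, y)
    return best, best_len

# A deliberately skewed basis: the shortest lattice vector, (1, 1) up to
# sign, is much shorter than either basis vector.
vec, length = shortest_vector((101, 100), (100, 99))
print(vec, round(length, 3))
```

The cryptographic parameterization question Jason raises is exactly about choosing dimensions and coefficients large enough that no search of this kind, classical or quantum, can finish in any useful amount of time.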

  • Tim Callan

    That was why it was a global community effort because there is nobody who is truly expert in every aspect of this, and so what had to happen is we had the best in the world focus in on their particular areas of extreme knowledge and capability and test these algorithms within the parameters of what they know.

  • Jason Soroko

    Exactly, Tim. So, to answer your question as succinctly as possible - - to flip the question a little bit - - you may one day see a proof that says: alright, if you parameterize the Kyber or Dilithium lattice a specific way, we can at least show that Shor’s algorithm on a quantum computer of x bits has no way to downgrade it, no way to solve for it. If you’re really interested in that topic, there are at least two different ways to slowly break down the problem. In other words, yes, it will take you millions of years, but if you want to start understanding the true attacks against these forms of mathematics, then I think you need to go back to basics and understand how you would actually spend that million years, and then, block by block, do the brute-force math of working toward solving the SVP. Once mathematicians go back to that, and once the standardizations come out, you’re gonna have a set of parameters chosen based on some good math - - but then you’re gonna have the flip side, which is people attacking that to the max to make sure that there are no clever ideas against it.

  • Tim Callan

    That would be the worry: no matter how much resource was put against it in this contest, and it wasn’t trivial, imagine how that compares to the resource that will be put against this problem once it’s locking everything in the world. If there is anything to be found, we will find it. That’s why we’d rather know ahead of time.

  • Jason Soroko

    You got it, Tim. So, there it is. We’re finally there. I think they were a little bit late. I think we were expecting this a little earlier in the year, but we’ve got it.

  • Tim Callan

    The original published schedule was that it was gonna be right around the New Year, so it’s about six months late. When I was at the RSA Conference, there was a cryptography panel, and Dr. Dustin Moody was on it, and someone asked him, how come it’s late? He said, it’s the government; things have to take the time they take. That was it. They just had to go through their process.

  • Jason Soroko

    Tim, the one thing I’m gonna say about NIST IR 8413 is that everybody should take a look at it, because a lot of people listening to this podcast are concerned about not just security but also performance. There is just terrific benchmarking on specific types of computers that are available today, showing how long key generation takes, how long encapsulation takes, and how long decapsulation takes. These are all very important things in a real-world scenario that practitioners of cryptography are gonna wanna know about, and this document does a great job of laying it all out, so check it out.
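The report’s benchmarks time exactly those three operations. As a rough sketch of how such a measurement is shaped - - with cheap placeholder functions standing in here, since no actual Kyber implementation is assumed:

```python
# Sketch of the kind of benchmarking NIST IR 8413 reports for each
# candidate: timing key generation, encapsulation, and decapsulation.
# The three operations below are cheap placeholders, NOT a real KEM --
# in practice you would call an actual KEM library's functions here.
import time
import statistics
import hashlib
import secrets

def bench(fn, runs=100):
    """Median wall-clock time of fn() over the given number of runs."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

# Placeholder operations standing in for keygen / encaps / decaps.
operations = {
    "keygen": lambda: secrets.token_bytes(32),
    "encaps": lambda: hashlib.sha256(secrets.token_bytes(32)).digest(),
    "decaps": lambda: hashlib.sha256(b"ciphertext").digest(),
}

for name, op in operations.items():
    print(f"{name}: {bench(op) * 1e6:.2f} microseconds (median of 100 runs)")
```

Taking the median rather than the mean keeps one-off scheduler hiccups from skewing the numbers, which is why published benchmarks usually report medians or minimums.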

  • Tim Callan

    Cool. Awesome. So, big news. Glad we covered it. Sounds like there’s a lot more we’re gonna have to dig into, and like you said, it’s a giant, meaty paper. I’m sure we’ll be returning to this topic on multiple occasions, but this is a good place to stop. So thank you, Jason.