
Root Causes 06: Quantum-Resistant Cryptography

The pending cryptographic Quantum Apocalypse requires that we replace the hashing and encryption algorithms used throughout the internet, enterprise networks, mobile services, and popular devices. Join our experts to learn more about the requirements quantum-resistant algorithms must meet to survive the Quantum Apocalypse.

  • Original Broadcast Date: February 19, 2019

Episode Transcript

Lightly edited for flow and brevity.

  • Tim Callan

    This is part two of our ongoing series on quantum computing and the effects it ultimately will have on PKI. We already had a nice conversation about the looming Quantum Apocalypse, as we’ve been calling it, what needs to be done, and the time frame it needs to be done in. As a reminder for the listeners, this isn’t going to be a problem tomorrow, but if we don’t get started on it now, we won’t be prepared to solve it when it does become a problem.

  • Jason Soroko

    If you take a look at any kind of estimate for how long until we enter the post-quantum world, which obviously affects PKI, it impacts the world we live in in terms of encryption and the algorithms that underpin it. It’s going to take at least the next decade to nail down the algorithms and the primitives, but it is quantum resistance that we’re searching for today. I don’t think anybody is talking about making anything completely impervious. The key term is resistance.

  • Tim Callan

    That’s a good point. Of course, it’s not just a matter of getting the right cryptography, of solving the cryptographic problems. It’s also a matter of deploying it across an entire functioning ecosystem without shutting down everything that we all depend on every day. That’s part of why we need time as well.

  • Jason Soroko

    Take a simple question: “Which is the best algorithm, or which are the best algorithms, currently on the candidate list?” Answering it is a really tough balancing act, and it’s that balance that’s going to take us a long time to home in on.

  • Tim Callan

    So how are our brightest minds thinking about this problem? Frame the problem for us. How does one approach this?

  • Jason Soroko

    It’s best to go back to the analogy of prime numbers, and how difficult it is for a classical computer to factor a large number back into its primes, because a lot of people have heard that analogy. If you’re listening to this podcast, most likely you have. It’s obviously very, very difficult because, as fast as classical computers are, they’re simply unable to run through the enormous space of possible large prime factors quickly. There’s no known way to solve it quickly.
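
To make the factoring analogy concrete, here is a minimal, illustrative Python sketch (not part of the conversation) showing how naive trial division bogs down as the modulus grows. Real RSA moduli are 2048 bits or more, far beyond anything this loop could ever finish.

```python
import time

def trial_division(n):
    """Recover the prime factors of n by naive trial division."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

# Toy semiprimes built from progressively larger primes.
for n in (101 * 103, 1_000_003 * 1_000_033, 10_000_019 * 10_000_079):
    start = time.perf_counter()
    factors = trial_division(n)
    print(f"{n.bit_length():>3}-bit modulus -> {factors} "
          f"in {time.perf_counter() - start:.2f}s")

# Each step up in size costs dramatically more. A real 2048-bit RSA modulus
# would take this approach longer than the age of the universe, which is the
# classical hardness being described here.
```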

    With Elliptic Curve Cryptography (ECC) and the algorithms underlying it, it’s the same thing. There is no known way to quickly solve for points on the curve, and therefore ECC is something that we use today. But perhaps even more interesting than that, Tim, is that one of the reasons we use ECC is that it has even smaller key sizes and is extremely efficient, which makes it viable for a classical computer to run. The exact same considerations apply to post-quantum algorithms.

  • Tim Callan

    Please elaborate.

  • Jason Soroko

    You need something that’s difficult to solve. It needs to present a problem that is difficult for a quantum computer to solve and is therefore resistant to things like Shor’s algorithm and other known attacks. It needs to be able to generate keys relatively quickly. The key sizes need to be reasonable. All these things that we’re used to from the current world will apply here, because there’s all kinds of fancy math out there that might be very, very resistant to quantum computing but might not be terribly efficient when it comes to actually generating those keys.
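
To put rough numbers on those efficiency criteria, here is a small sketch that assumes a recent version of the pyca/cryptography package is installed; exact timings vary by machine, but the gap in key size and generation cost between RSA and ECC is the same kind of trade-off post-quantum candidates are judged on.

```python
import time
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ec, rsa

def report(label, generate):
    """Time key generation and report the encoded public key size."""
    start = time.perf_counter()
    key = generate()
    elapsed = time.perf_counter() - start
    pub = key.public_key().public_bytes(
        serialization.Encoding.DER,
        serialization.PublicFormat.SubjectPublicKeyInfo,
    )
    print(f"{label}: keygen {elapsed * 1000:.1f} ms, public key {len(pub)} bytes")

# RSA-3072 and ECC P-256 offer roughly comparable (~128-bit) classical
# security, but differ sharply in key size and generation cost.
report("RSA-3072 ", lambda: rsa.generate_private_key(public_exponent=65537, key_size=3072))
report("ECC P-256", lambda: ec.generate_private_key(ec.SECP256R1()))

# Post-quantum candidates face the same yardsticks: how quickly can keys be
# generated, and how much extra data must protocols like TLS carry around?
```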

  • Tim Callan

    Is there a set of candidates? Is there an identified set of at least good potential algorithmic approaches that our community is considering?

  • Jason Soroko

    Absolutely. And that list is not growing; it’s a fairly set list right now. I will describe one category, known as lattice-based algorithms. Without getting too much into the math or the geometry of it, think of a lattice as a grid of points and the problem of solving for various vectors within those points. Once again, there’s presumably no easy way for a quantum computer to solve for minimum distances and the other kinds of problems within that lattice that are difficult even for quantum computing.
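
Here is a toy sketch, purely to make that geometry concrete: a two-dimensional lattice described by a deliberately skewed basis, and a brute-force search for its shortest nonzero vector. In two dimensions this is trivial; in the hundreds of dimensions used by lattice-based candidates, no efficient classical or quantum algorithm is known for problems like this.

```python
from itertools import product
from math import hypot

# Every lattice point is an integer combination a*b1 + b*b2 of the basis
# vectors. This basis is deliberately "skewed" so the short vector is hidden.
b1 = (201, 37)
b2 = (1648, 297)

shortest = None
for a, b in product(range(-50, 51), repeat=2):
    if a == 0 and b == 0:
        continue
    x, y = a * b1[0] + b * b2[0], a * b1[1] + b * b2[1]
    length = hypot(x, y)
    if shortest is None or length < shortest[0]:
        shortest = (length, (x, y), (a, b))

length, vector, coeffs = shortest
print(f"shortest vector: {vector}, length {length:.2f}, coefficients {coeffs}")
# -> (40, 1), reachable only by combining the basis vectors in a non-obvious way.

# Brute force works in 2 dimensions, but the search space grows exponentially
# with the dimension, which is what lattice-based candidates lean on.
```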

    When you consider a particular lattice-based algorithm, the reason it’s on the candidate list is mostly that it is very, very efficient. Key sizes and so on are fairly reasonable. What’s also interesting is that it can run on classical computers, and that’s an important point. I’ve heard a lot of people assume that the encryption itself will also need a quantum computer. That’s what we’re trying to avoid.

  • Tim Callan

    Because otherwise you have to change out everything; all the hardware.

  • Jason Soroko

    You got it.

  • Tim Callan

    With deeply entrenched systems in place that’s a heinously difficult task.

  • Jason Soroko

    And therefore the post-quantum world will not look like a world where everybody is using a quantum computer. Far from it. The majority of people will still be using classical computers. It’s just that there will be quantum computers in existence with enough qubits to theoretically break the RSA and ECC algorithms in a reasonable amount of time.

  • Tim Callan

    The basic strategy here is to say that some encryption algorithms are vulnerable to the specific strengths of quantum computing and others are not. If the load on our traditional non-quantum computers stays about the same, but we switch to one of these approaches that is less friendly to the advantages of quantum computing, we can basically nullify the unfair advantage a quantum computer would have in breaking our encrypted files.

  • Jason Soroko

    That’s one of the important points. Let’s put it into perspective, using lattice-based algorithms as an example of the challenge we face. Not only do we need resistance to quantum algorithms as an attack, but, believe it or not, at this moment we do not yet have a mathematical proof that lattice-based algorithms are in fact secure against quantum computing algorithms. We don’t yet have the same depth of analysis behind them that we have with RSA and ECC. So that is the level we are starting from.

  • Tim Callan

    It’s possible, maybe, that we won’t have that proof. That is one possible outcome.

  • Jason Soroko

    Yes. We may end up one day reading the technical press and finding out, “Oh geez, one of these candidates is now off the list because somebody had a light-bulb moment and, some very long equation later, realized that lattice-based algorithms can be unraveled in some way.”

    But I really do want to get through some of the other properties. It’s not just resistance against quantum computing and quantum algorithms.

  • Tim Callan

    So what are they?

  • Jason Soroko

    You also have to be secure against traditional attacks. In other words, there is phenomenal math being developed right now that is not quantum-based and could potentially also unravel these things. And if that’s the case, then it’s really, really challenging to even think about.

    Randomness is still part of the equation here. Randomness necessarily means that you’re going to have a whole lot of numbers. If you think about prime numbers, at what size does a classical computer start to have difficulty solving for them? Well, you have to get into quite large numbers. For the smaller prime numbers there actually is a list of them, and if you just run through that list you could probably crack a small RSA key fairly rapidly. So security is going to be derived from a mild worst-case complexity assumption.

    In other words, you need the cryptography to be hard to break for essentially all random choices of keys. Those are also properties of the new candidate quantum-resistant algorithms that have not been established yet, because nobody has run through all the different mathematical possibilities.
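
To put a number on the difference between “small primes you can list” and the primes used in real keys, here is a back-of-the-envelope sketch based on the prime number theorem (an estimate, not an exact count):

```python
from math import log, log10

def log10_prime_count(bits):
    """Rough prime number theorem estimate, as a power of ten, of how many
    primes have exactly `bits` bits: about 2**(bits - 1) / (bits * ln 2)."""
    return (bits - 1) * log10(2) - log10(bits * log(2))

# Small primes really can be tabulated and tried one by one...
print(f"20-bit primes:   about 10^{log10_prime_count(20):.1f}")    # ~10^4.6

# ...but the primes inside a 2048-bit RSA modulus are drawn from a space so
# vast that no list could ever be written down, let alone searched.
print(f"1024-bit primes: about 10^{log10_prime_count(1024):.1f}")  # ~10^305
```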

  • Tim Callan

    So it needs to be resistant to a quantum computing attack. It also needs to be resistant to traditional attacks which are continuing to develop and improve. What other criteria do we have?

  • Jason Soroko

    I think you mentioned it before: the simplicity and flexibility of working on a classical computer. Can it serve as a drop-in replacement? Is it compatible with existing protocols and networks? Because obviously, if it doesn’t work that way, it’s going to be difficult for it to be viable.

    There’s also resistance to side-channel attacks: some sort of built-in countermeasure, something that is inherently difficult for people attempting to use that as an attack vector. Does it have the capability of perfect forward secrecy? What’s its resistance to misuse?

    You know that one came up with ECC. Those of you listening probably remember that some of the first sets of curve parameters published by NIST were actually thrown away because they were found to be weak. ECC needed to be very carefully studied because of its potential for misuse in terms of the parameters that were chosen. And therefore post-quantum candidates will be selected in part on: “Is it resistant to that kind of misuse, or are there very few mathematical possibilities of that actually happening?”
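
Side-channel resistance is partly about the math and partly about how it can be implemented. As a familiar, small-scale illustration of the implementation side (using only Python’s standard library, and not specific to any candidate): comparing secret values with a plain == can stop at the first mismatching byte, so its running time can leak information, which is why constant-time helpers exist. Candidates are scrutinized for the same kind of issue at the algorithm level.

```python
import hmac

stored_tag = bytes.fromhex("6c62272e07bb014262b821756295c58d")
candidate  = bytes.fromhex("6c62272e07bb014262b821756295c58e")

# Naive comparison: may return as soon as a byte differs, so its timing can
# hint at how much of the secret an attacker has already guessed correctly.
leaky = stored_tag == candidate

# Constant-time comparison: inspects every byte regardless of where the
# mismatch is, removing that timing signal.
safe = hmac.compare_digest(stored_tag, candidate)

print(leaky, safe)  # False False -- same answer, different timing behavior
```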

  • Tim Callan

    That’s a lot of things to scrutinize. So how does this happen? How does the community get to the bottom of these questions and pick a candidate, whether it’s a lattice-based approach or something else, as the right path forward?

  • Jason Soroko

    NIST has definitely brought about a coming together of a lot of people who were doing their own individual research, and what they’ve asked for (and I believe this started back before 2015) was, “Give us your candidate algorithms and show us your proofs for why you think they’re reasonable.” That list has been compiled and categorized. There’s a handful of algorithms within each category, such as lattice-based algorithms.

    It will take a really long time for each of these to be worked through, and it won’t necessarily be NIST itself doing it. It’s like any scientific community coming together: there will be a lead organization with leading research, and then there will be peer review of that research to work through all the different mathematical proofs that will be necessary.

    An important point that NIST has made is that in the end there will probably not be a single winner. This isn’t an NBA bracket. At the end of the day we’re probably going to end up with two or three or perhaps even more candidates that come out on top, and probably multiple ones in use. It’ll perhaps be similar to today, where we have RSA algorithms living shoulder to shoulder with ECC.

  • Tim Callan

    So how, ultimately, will we decide? You’re right, that will be up to the individual software vendors. They will look at the common consensus. They will be motivated to build things that work with their computing systems, are fast, and aren’t going to be broken, and those will be the solutions they adopt.

    Wow. This is exciting. What would you say is the time frame, if you had to guess, to get to the point where we’re feeling pretty good about what those go-forward algorithms are and they’re actually being implemented?

  • Jason Soroko

    Take a look at how long it took ECC, which I think is a good analogy here. I don’t think ten years is unreasonable. In years seven, eight, and nine we’ll definitely be doing very different research than we’re doing now; we’ll probably be narrowing things down a lot further.

    If you take a look at the sheer amount of work it’s going to take to determine what these properties actually are, and especially all the different mathematical proofs that have to be devised, we’re probably looking at a decade. That may line up fairly well with the point at which quantum computers sustain a number of qubits that threatens ECC and RSA.

  • Tim Callan

    That’s going to be super important. You know, when you ask, “How long did ECC take?” Yeah, it was about a decade, and gosh, it doesn’t sound like we have a lot of extra time. So this feels like a very important thing that the community and the world need to get on and really make sure happens.

    I'm going to ask you a completely unfair question. What is your level of confidence that we will work out these questions – have one or more viable algorithmic choices and have them in place and in production – before we do have a quantum computing crisis?

  • Jason Soroko

    I actually feel pretty good about it.

  • Tim Callan

    Good.

  • Jason Soroko

    Simply because I think people realize the real importance of this. There are real dollars being spent on it and really, really smart people being put on it. I think what’s going to allow us to actually break through here, Tim, is that we’re probably not going to end up with a VHS/Betamax stalemate at the end. Obviously there will be commercial interests that say, “Hey, this is my preference versus another,” but I think that won’t stop anything. What it will do is spur on the commercial push behind this research to get to better places, and it won’t stop actual progress. Which is good.

  • Tim Callan

    And to your point, it’s software, not hardware. This is where the Betamax analogy doesn’t quite hold. I think the ECC/RSA analogy you gave earlier is much better: it is viable for multiple approaches to live side by side, and maybe different ones are better for different circumstances, or maybe they’ll both just exist and we’ll see which one ultimately is more secure. But that is a thing we can do with our systems.

  • Jason Soroko

    Remember, Tim, you’ve got all kinds of primitives, right? You’ve got hashing. What are the use cases you’re actually going to be using these algorithms for? Signing documents, encryption, and even things that are quite exotic today but may become fairly normal tomorrow.

    I think maybe where we can end it, Tim, is with kind of an exciting kernel: things like fully homomorphic encryption, which is when Alice and Bob’s communications never leave an encrypted state, and the ability to put their communication up into the cloud and take measurements off of it without ever fully decrypting it. That kind of property will potentially be possible down the road, because some of these candidate algorithms, including lattice-based ones, have properties that enable things like fully homomorphic encryption, which RSA and ECC do not.
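
To give a flavor of “computing on data that never leaves its encrypted state,” here is a toy sketch of the Paillier cryptosystem. Paillier is only partially homomorphic (it supports addition on ciphertexts), it is not quantum-resistant, and the parameters below are laughably small; it simply illustrates the kind of property that lattice-based schemes extend all the way to fully homomorphic encryption.

```python
import secrets
from math import gcd

# Toy Paillier setup -- real deployments would use primes of 1024+ bits.
p, q = 293, 433
n, n2 = p * q, (p * q) ** 2
g = n + 1                                     # standard simple generator choice
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                          # inverse of L(g^lam mod n^2); equals lam^-1 since g = n + 1

def encrypt(m):
    """Encrypt m < n with fresh randomness, so equal plaintexts look different."""
    while True:
        r = secrets.randbelow(n - 2) + 1
        if gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# The server adds Alice's values WITHOUT ever decrypting them, simply by
# multiplying the ciphertexts together.
c1, c2 = encrypt(12345), encrypt(6789)
print(decrypt((c1 * c2) % n2))   # 19134 == 12345 + 6789
```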

  • Tim Callan

    That’s not even the motivator, but that absolutely could be a benefit that comes out of this.

  • Jason Soroko

    In fact, it is on the NIST list of properties: when they’re considering candidates, some of these extra benefits of the algorithms themselves are obviously going to weigh on which are considered the better candidates. But you know as well as I do, it could also spur the VHS/Betamax type of situation, as some commercial groups push one versus another because of these special properties.

  • Tim Callan

    That’s going to be interesting to watch. So, to the point that we can watch this develop in real time, obviously we will.