Root Causes 36: Quantum Apocalypse - The Search for Quantum Resistant Crypto
Finding the new quantum-resistant cryptography we will need to replace RSA and ECC is a difficult task requiring the coordinated effort of academics, industry, and government. NIST has stepped in to lead this volunteer community. Join us to learn about this project to discover and vet going-forward crypto candidates, where we stand in the process, and where we go from here.
- Original Broadcast Date: September 3, 2019
Episode Transcript
Lightly edited for flow and brevity.
-
Tim Callan
We’re following up on our last podcast on the topic of quantum computing and building, implementing, and deploying a quantum resistant cryptographic backbone to future-proof our systems against the world of quantum computing. Where we left off last time, Jay, I believe, was that we had talked about the concept of the Z date, which is how long until we need this, essentially. And I asked you, are we going to get there on time? And you said stay tuned. So here we are. I'm going to ask you, are we going to get there on time?
-
Jason Soroko
I think the answer is maybe.
Here’s the good news. There is work going on right now to get us toward a quantum resistant algorithm or algorithms that will replace what we use today to issue certificates for SSL / TLS authentication and encryption, all those good things we take for granted right now. Take RSA-2048, for example, and Elliptic Curve Cryptography (ECC): those cryptographic algorithms are very well-known, very well-used, very mature. Unfortunately, we’re not at that maturity level yet with quantum resistant algorithms. That’s what NIST is currently helping with.
-
Tim Callan
The next step to get there is that NIST has stepped in and taken a leadership position. They do this with a lot of security initiatives, and they’re orchestrating and coordinating the effort of a lot of smart people, many of them academics, to put together a plan for quantum resistant cryptography.
-
Jason Soroko
Back in December 2016, there was a formal call for submissions of post quantum algorithms. I forget the exact number of total submissions. It was under 100, somewhere in the eighties.
-
Tim Callan
That’s a lot.
-
Jason Soroko
That is a lot. Especially when you consider the sheer amount of work that went into any given one of them.
But a year later (it took a year, Tim), in December 2017, sixty-nine of those submissions were declared complete and proper for entry into what we’ll call Round One.
-
Tim Callan
What does Round One do and how does it work?
-
Jason Soroko
Round One was all about kicking the tires of the different ideas that had been submitted. It was more like a brainstorming session, just a call of: who has an algorithm? Describe it to us. And then the encouragement was, “Let’s try to attack these things. Are there obvious errors in the math? Will there be a leader of one specific type of cryptography versus another?” Because there’s not just one mathematical approach. There are actually at least a handful. This was a nice broad stroke by NIST and all the researchers to fill the pipeline with technology and see what comes out the other side.
-
Tim Callan
So the ideal, which maybe they didn’t get to, would’ve been to uncover every viable alternative algorithm, shine a light on all of them, and let the global community poke at them to see which ones prove to be infeasible or inappropriate or inadequate so that we can wind up with a narrowed list of viable candidates.
-
Jason Soroko
That’s right. I don’t think the emphasis was on implementability. Some of the submitted algorithms had extremely long key lengths and would therefore require incredibly wide networking paths to physically implement. So material that might not survive to a second or third round, if there even is a third round, was still allowed into Round One. The first round was deliberately academic and theoretical, so a lot of stuff that might never make it to the end was submitted.
-
Tim Callan
So where are we in Round One?
-
Jason Soroko
We’re actually in the middle of Round Two, which I believe will hit a major milestone in August of 2019. I think it’s going to be in the second or third week of August 2019, which is the Second Post-Quantum Cryptography Standardization Conference.
I think we’re down now to twenty-six algorithms that are currently being evaluated within Round Two. They’ve advanced to what NIST is calling the post quantum crypto semifinals. Seventeen of those are public key encryption and key agreement algorithms, and nine are digital signature algorithms.
Now Round Two is going to be a lot more about implementability, so I’m going to rhyme off three things here.
Obviously the size of the encryption keys and signatures is important. If they’re just too huge, too large, it’s difficult to implement. So even if an algorithm is strong and resistant to attack, it still has to be manageable.
-
Tim Callan
It might be infeasible to use in practice.
-
Jason Soroko
That’s correct.
Number two is the time required to encrypt and decrypt on each end of a communication channel, to sign messages, to verify signatures, and so on.
And number three, the total amount of traffic sent over the wire required to complete the encryption or decryption.
Those are the three most important things in Round Two, but we’re also going to continue to kick the tires as hard as possible and allow people to do mathematical proofs of attack against these algorithms. Algorithms that are successfully attacked are obviously going to be moved to the back.
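To make those three criteria concrete, here is a minimal sketch of the kind of measurement Round Two implies. The keygen, sign, and verify callables are hypothetical stand-ins for whatever implementation a given candidate supplies; this is an illustration, not a real NIST interface.

```python
import time

def benchmark_candidate(keygen, sign, verify, message=b"hello world"):
    """Rough harness for the three Round Two criteria discussed above.

    keygen/sign/verify are placeholders for a candidate's own
    implementation; this is an illustration, not a NIST API.
    """
    public_key, private_key = keygen()

    # Criterion 2: time required on each end of the channel.
    t0 = time.perf_counter()
    signature = sign(private_key, message)
    sign_seconds = time.perf_counter() - t0

    t0 = time.perf_counter()
    assert verify(public_key, message, signature)
    verify_seconds = time.perf_counter() - t0

    return {
        # Criterion 1: key and signature sizes.
        "public_key_bytes": len(public_key),
        "signature_bytes": len(signature),
        # Criterion 2: operation timings.
        "sign_seconds": sign_seconds,
        "verify_seconds": verify_seconds,
        # Criterion 3: total traffic over the wire, approximated
        # here as everything the two ends must exchange.
        "wire_bytes": len(public_key) + len(signature),
    }
```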
-
Tim Callan
So there’s a sense that these things have to be practical in the real world. And we know what the consequences are, right? When I click on a webpage I don’t want to have to sit and count to twenty before I get my results, because it just screws everything up. I don’t want to be unable to connect on my telephone because it takes too long. Are there standards in place for these three requirements, size, time, and total traffic? Are there thresholds to get over, or is this just part of what we’re considering when we’re looking at the aggregate viability of an encryption algorithm?
-
Jason Soroko
NIST is taking the approach right now where they’re not using a threshold. Obviously they want to be able to get the best of all worlds. If all of the algorithms fall apart except for one and it happens to have a very large key length, well maybe it has to be considered, right?
So therefore, I don’t think NIST has ever even said, “Look, if some new idea gets dreamt up, we won’t consider it in a third round.” That’s just not the case. In fact, what you’re seeing, and I think this is what NIST is encouraging, is that a lot of people who were in the first round with similar ideas, where one may have had, you know, a more efficient key length and another might’ve had stronger security overall, have been asked to merge. Which is an interesting concept.
-
Tim Callan
They’re saying, “Why don’t you guys work together? Because I bet you that working together you can come up with something that’s better than what either of you came up with individually.”
-
Jason Soroko
That’s exactly what is happening, and it’s happening within Round Two. So there have already been announced mergers of algorithms. I won’t talk about them here. It’s too much detail, but that’s all posted on the internet. It can be researched and checked out. It’s very interesting.
-
Tim Callan
So we’re expecting Round Two to be worked out entirely by the end of August. What are we expecting to get? What’s the deliverable at the end of that?
-
Jason Soroko
I think what you’re going to find is another re-rendering of the list: “Alright, here are the ones we’re going to throw out because they were successfully attacked. Here are the ones with key lengths that are just abominable,” you know, just too much to deal with.
And then maybe there are a handful left at the end of the day that have all the properties we want and look implementable. Those are the ones that end up very, very shortlisted.
-
Tim Callan
This might be a horribly unfair question, Jay, but I'm going to ask it. Is there an expectation or a target for how many algorithms we’ll have at the end of Round Two?
-
Jason Soroko
I think it’s a completely fair question. It’s not going to be down to one or two. Fewer than twenty is my guess. Perhaps even down to ten or fewer than that. But not just a couple.
-
Tim Callan
So it still seems like there’s a long distance between ten candidates and “Here’s what we’re moving forward with.” What’s the path to getting this decided?
-
Jason Soroko
There may very well be a third round, and I think that the third round would essentially be just more of what we’re doing right now. However, this is the tire kicking but amped up to eleven.
-
Tim Callan
This is hugely important. If you think about RSA, RSA ran in a really limited set of use cases for more than a decade before, all of a sudden, we saw this explosive growth of the World Wide Web and people started to say, “Let’s use this.” So it had had a chance to be proven; plenty of people had poked at it, tried to beat it, and used it in real world implementations on their networks. It had all that time to be refined and proven.
Now all of a sudden we’ve got this highly visible, very important need for a new set of algorithms where whatever we start implementing is going to have crosshairs on it from the day it shows up in production systems, and it feels like all of this has to happen lickety-split.
-
Jason Soroko
Let’s consider this for a moment, Tim. We’re now into the prognostication part of the podcast here.
Dustin Moody over at NIST has said that a very wide range of mathematical ideas are represented by what he calls three different families of algorithms: lattice-based, code-based, and multivariate-based. There are also a few miscellaneous types.
Think about it, Tim: you just said RSA took a long time, and ECC took a long time, and each of them would be considered its own class of algorithm. NIST is keeping at least three distinct groups, plus the miscellaneous list. In his words, that’s the hedge against the possibility that if someone breaks one, we could still use another.
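One deployment-side expression of that hedge, a related idea rather than part of the NIST process itself, is hybrid or dual-algorithm signing: combine signatures from two unrelated families so a break of one does not break the bundle. A minimal sketch, assuming hypothetical signer objects with sign and verify methods:

```python
def hybrid_sign(classical_signer, pq_signer, message: bytes) -> dict:
    """Sign with two schemes from unrelated mathematical families.

    classical_signer and pq_signer are hypothetical signer objects
    (say, an ECC signer and a lattice-based signer). The bundle
    stays trustworthy as long as at least one family holds up.
    """
    return {
        "classical_sig": classical_signer.sign(message),
        "pq_sig": pq_signer.sign(message),
    }

def hybrid_verify(classical_signer, pq_signer,
                  message: bytes, bundle: dict) -> bool:
    # Require both signatures to check out; if one family is ever
    # broken, verifiers can fall back to requiring only the other.
    return (classical_signer.verify(message, bundle["classical_sig"])
            and pq_signer.verify(message, bundle["pq_sig"]))
```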
-
Tim Callan
So let’s say there is a breakthrough that nobody has thought about that is fundamental at its core, and it turns out that the whole code-based approach—just to pick an example at random—is rendered invalid. Someone does some really smart work on a chalkboard and comes back and says, “Guess what, folks?” And under those circumstances we say, there’s a Plan B. There’s a Plan C.
-
Jason Soroko
I think with ECC itself, Elliptic Curve, there were a lot of question marks around it. People were saying, “Hey, it’s a great idea. It’s not prone to advances in mathematics around factorization.” That’s the reason why we have ECC. The big question about it was, “Geez, is it mathematically sound, or is there some means to completely break it down?”
I don’t think real confidence in the underlying math came about until around the solving of Fermat’s Last Theorem, believe it or not. It took a mathematical Eureka moment before ECC was really considered rock solid. And even when it was considered rock solid, what we learned, because of some politics that happened towards the end of that process, was that you have to choose your parameters incredibly carefully or otherwise you can unravel it.
-
Tim Callan
Ok, wow. You and I talked in our last episode about this concept of X plus Y, where X is the amount of time it takes the community to arrive at these going-forward algorithms, this algorithm or these algorithms, and Y is the amount of time it takes to implement them across our global infrastructure, so that it’s real and in production before Z rolls around, which is the date that quantum computing can break the paradigms we have right now.
The idea is that X plus Y must be less than Z, or bad things happen. So how much time are we going to need for Y, and how much time can we allow for X? Because it seems like the clock is ticking.
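The inequality itself is simple arithmetic; this small sketch just encodes it, with placeholder numbers that are purely illustrative, not estimates from anyone at NIST or on this podcast:

```python
def mosca_check(x_years: float, y_years: float, z_years: float) -> bool:
    """X (time to settle on algorithms) plus Y (time to deploy them
    across global infrastructure) must come in under Z (time until
    quantum computers can break today's crypto)."""
    return x_years + y_years < z_years

# Purely illustrative numbers, not forecasts.
print(mosca_check(x_years=5, y_years=10, z_years=15))  # False: bad things happen
print(mosca_check(x_years=4, y_years=8, z_years=15))   # True: we make it in time
```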
-
Jason Soroko
If you think about it, how long did it take for RSA? How long did it take for ECC? Some of the math we’re talking about here is as substantial if not more so, and therefore perhaps it takes about that long. I don’t know, Tim. Maybe we are talking on the timeframe of a decade in order to kick it hard enough. That’s not a fact. That’s just me saying it could take that long.
-
Tim Callan
One would hope that with much more attention on this than was on RSA or ECC, hopefully we could accelerate that to some degree. We also can try to accelerate Y because we have fundamentally much more crypto-agile systems now than we did even a decade ago, let alone longer than that. Ideally that means that these updates can be pushed out and hit the overwhelming majority of the digital systems without a great deal of breakage and without an exorbitant amount of work required by those companies or those technology providers or those individuals.
But gosh. As I’ve mentioned before, we have much more complexity in our systems. We have more devices. We have more kinds of data. We are rapidly moving into a world where everything needs and has crypto. When your refrigerator has to have crypto, that’s a very different world than it was even when ECC came in.
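Crypto-agility in the sense Tim describes often comes down to not hard-coding the algorithm. One common pattern, sketched here with entirely hypothetical names, is a registry keyed by algorithm identifier so that retooling is a configuration change rather than a code change:

```python
# Registry of available signature schemes, keyed by identifier.
SIGNERS = {}

def register(algorithm_id: str):
    """Decorator that makes a signer class selectable by name."""
    def wrap(cls):
        SIGNERS[algorithm_id] = cls
        return cls
    return wrap

@register("ecc-p256")
class EccSigner:
    def sign(self, message: bytes) -> bytes: ...

@register("pq-candidate")  # placeholder for a future NIST selection
class PqSigner:
    def sign(self, message: bytes) -> bytes: ...

def make_signer(config: dict):
    # Ideally, much of the "Y" migration is flipping this config value.
    return SIGNERS[config["signature_algorithm"]]()
```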
-
Jason Soroko
Yeah, there’s absolutely no doubt about it. Mosca, who we reference quite often on these post quantum podcasts, Tim, is the University of Waterloo researcher who kind of owns that Z date concept through Mosca’s Inequality: that whole idea of how long your current crypto needs to last, how long it will take to retool, and how long until large scale quantum computing can break what we currently have, which is Z.
That whole concept coming out of the University of Waterloo: whenever he’s presenting, one of his conclusion slides quite often is an appeal to the commercial side of the world. Obviously there’s a ton of academic researchers working on this feverishly right now, but the commercial side of the world can’t wait until NIST has finished its process completely to figure out what kind of tooling we need in order to spit out an SSL certificate that can do this. That’s absolutely the case. And that’s the reason why we’re talking about this, Tim. That’s the reason why behind the scenes we are heavily involved in understanding this, researching it, and getting ready for the retooling ourselves, so that none of this comes as a surprise when our commercial customers need it.
-
Tim Callan
And this is an example of a larger concept that’s so often true in the world of IT security and in the world of PKI, which is that it’s like an iceberg. Most of what goes on is under the surface of the water and you never see it. There’s a bunch of work that technology people do and vendors do to make sure that their systems are hardened and secure and they just want the end users, whether that’s IT professionals or individuals using devices, not to have to worry about it. And so you do the hard, specialized work. You do it correctly and then the systems just work. That’s really the idea.
-
Jason Soroko
In fact I gave a couple of stage talks not that long ago, reminding people of just how pervasive Public Key Infrastructure and cryptography are in their lives, whether they know it or not. You know, it even surprised me when I put together that slide, trying to come up with examples of all the things that happen on a daily basis that they’re involved in.
-
Tim Callan
It's everything.
-
Jason Soroko
It's everything.
-
Tim Callan
That’s how we got into talking about mohawks. And if you’re curious about that, go back and listen to our last episode on Mosca’s Inequality.
-
Jason Soroko
It’s a really good analogy. We might bring up your mohawk, you know, as a joke, more than once, because it really makes the point of: what happens if all this goes sideways? Your point was, we’re looking at a Mad Max world.
-
Tim Callan
It really is Mad Max. It truly is. And that’s why we can’t allow it to happen.
-
Jason Soroko
That's right.
-
Tim Callan
The good news is, we’re not going to allow it to happen. All sorts of people understand the gravity of this, the effort is being put in, and it is not reasonable to expect that it will be allowed to happen. But nonetheless I think it’s worth acknowledging that the stakes are that high and that this level of resourcing and attention is appropriate.
-
Jason Soroko
So for those of you interested, NIST’s Computer Security Resource Center has a great website, with more information than you could possibly shake a stick at or ever read. Search for their post-quantum cryptography project. All kinds of work there, and Dustin Moody is the author of quite a lot of it. And if you want to read about these Round One submissions… you’ll notice, Tim, I resisted the urge to rhyme off the algorithm names within each of the categories, and the twenty-six that were chosen for Round Two.
-
Tim Callan
Too many.
-
Jason Soroko
There are too many, so go to that website and check it out. It’s fascinating stuff.
-
Tim Callan
This is a fascinating topic and, as with many topics, there’s so much more that could be discussed. Let me just remind listeners that our Episodes 5 and 6 lay some groundwork for this; they’re good if you want to understand the basics. Listen to the episode right before this one, where we really talk about the concept of the Z date, how it works, and when it is. And of course, stay tuned to Root Causes because we will continue to cover this topic as it develops.