
Root Causes 265: A Banner Year for Post-quantum Cryptography

2022 was post-quantum cryptography's biggest year so far. Our hosts are joined by guest Bruno Couillard, CEO and CTO of Crypto4A. We go over many developments in PQC, including the announcement of the NIST round 3 winners, the defeat of several late candidate algorithms, isogeny-based cryptography, hybrid certificates, and the significance of April 14, 2030.

  • Original Broadcast Date: December 28, 2022

Episode Transcript

Lightly edited for flow and brevity.

  • Tim Callan

    It’s a guest episode. We love our guest episodes. We are very lucky to have Bruno Couillard. Bruno is the CEO and CTO at Crypto4A. How are you doing today, Bruno?

  • Bruno Couillard

    I am doing great guys. Thank you for having me.

  • Tim Callan

    Thank you so much for joining us. This is one of our year-end lookback episodes. Regular listeners know that every year we go back and look at the big events that happened that year, and this was definitely, so far in human history, the biggest year for post-quantum cryptography, so we couldn’t do our year-end lookbacks without talking about it. We are very happy to have Bruno here because that’s an area of expertise for you, and it’s something we want to make sure we talk about today.

    So, yes. Gentlemen, it was a big year for crypto, for post-quantum crypto. I’m not even sure where we want to start. Actually, I know for sure where we need to start. The biggest part of it by far was on July 5, when NIST announced the winners of its Round 3 contest. So, tell us about that.

  • Bruno Couillard

    Yes. I will definitely say a few words about that. I think this was a very happy day for the entire community. Most of us had been waiting for that announcement for months.

  • Tim Callan

    It was originally expected at the end of 2021. We were all expecting it in Quarter 4, and it came at the beginning of Quarter 3 of 2022.

  • Bruno Couillard

    Dr. Moody from NIST was being asked at every panel he attended when it was coming out, and it was hard for him to not have a date, so everybody was really relieved when it came out, for sure. And this is not as well known, I think: when NIST started the entire post-quantum migration effort, they declared very much upfront that two stateful hash-based algorithms, LMS and XMSS, were already post-quantum ready, and those were put on a sideline, on a side track. They exist, but they are kind of strange algorithms to use; they have this state management challenge to them. So in total, at the moment, we have the four algorithms that came out in July and the two that had been declared valid in 2020. It’s a good toolkit. Very different from what we’ve been used to for the last 30 years.
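To illustrate the state management challenge Bruno mentions: stateful hash-based schemes like LMS and XMSS draw from a finite pool of one-time keys, and reusing a one-time key is catastrophic, so a signer has to persist its position before it releases each signature. The toy Python sketch below shows only that bookkeeping; the hashing is a stand-in, not a real LMS or XMSS implementation.

```python
# Toy illustration of the state-management burden in stateful hash-based
# signatures (LMS/XMSS). Not real cryptography -- it only demonstrates why
# the signer must advance and persist its one-time-key index *before*
# releasing a signature, so a crash can never cause index reuse.

import json
import hashlib
from pathlib import Path

class StatefulSigner:
    def __init__(self, state_file: str, capacity: int = 1024):
        self.state_path = Path(state_file)
        self.capacity = capacity   # total number of one-time keys available
        if self.state_path.exists():
            self.next_index = json.loads(self.state_path.read_text())["next_index"]
        else:
            self.next_index = 0

    def sign(self, message: bytes) -> dict:
        if self.next_index >= self.capacity:
            raise RuntimeError("key exhausted: every one-time key has been used")
        index = self.next_index
        # Persist the advanced state FIRST; only then produce the signature.
        self.next_index = index + 1
        self.state_path.write_text(json.dumps({"next_index": self.next_index}))
        # Stand-in for the real one-time signature computation.
        digest = hashlib.sha256(index.to_bytes(4, "big") + message).hexdigest()
        return {"index": index, "sig": digest}

signer = StatefulSigner("lms_state.json")
print(signer.sign(b"firmware image v1.0"))
```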

  • Tim Callan

    One of the things that we’ve talked about of course in previous episodes was that it is a good toolkit, but there are a lot of eggs in the lattice basket. And so one of the things we’ve seen, of course, is that NIST is proceeding with its Round 4, which has a few purposes, but one of those purposes is to diversify our approach to cryptography a little bit, because I think there’s a fear – and tell me if you don’t agree with this, Bruno or Jason – but I think there’s a fear that somewhere some genius at a whiteboard is figuring out a whole new way to think about lattice-based encryption, and if that happens we don’t want to be left with nothing.

  • Bruno Couillard

    Exactly. And I think that fear certainly gained credibility when we saw SIKE and Rainbow, which were also contending for, vying for a spot in the limelight, get their hopes dashed at the last minute - -

  • Tim Callan

    And not just dashed but just destroyed. Like they were both just roundly and completely defeated to the point where you just had to throw the strategy out.

  • Bruno Couillard

    Exactly. And it goes to show that we are entering a world of very new ideas and new concepts, and there are lots of people out there who fear that we may end up finding techniques to break lattice-based cryptography. We have to be careful and on the lookout, and always have a backup plan. That’s always good in this kind of space.

  • Tim Callan

    It seems like lattice has been pretty reasonably pounded on. It’s not a completely new strategy, so that’s good. Generally, there’s a sense that the more scrutiny these things have undergone, the less likely it is that an ah-ha moment has gone undiscovered. Intuitively that makes sense, but it doesn’t mean there isn’t an ah-ha moment that has gone undiscovered. I know there are a lot of negatives in there, but it still could be that the ah-ha moment exists and we just plain haven’t discovered it yet. And for that matter, that could happen for RSA, even though it seems very unlikely.

  • Bruno Couillard

    That is absolutely correct, and to borrow a good quote from Yogi Berra, “It’s tough to make predictions, especially about the future.”

  • Tim Callan

    One of my favorite quotes. I love that quote. Absolutely. Yes.

  • Bruno Couillard

    And it applies to our world. We never know – as you suggested – some very smart person out there in the world today or tomorrow could come back with a genius idea that breaks something that we hadn’t anticipated. So you have to be prepared in essence.

  • Tim Callan

    And so this brings us directly into the NIST Round 4. So, tell us about NIST Round 4.

  • Bruno Couillard

    Well, they are now engaging in this next series of algorithm exploration. They are collecting more inputs, more candidates. The idea, as you suggested, Tim, is to diversify as much as possible. They’re also asking for input on use cases. One example: NIST was asking recently whether they should standardize Classic McEliece, which has been established for many years.

    And so, NIST is actually canvassing the cryptographic community to figure out how to go about Round 4 – how to design the next wave of algorithms so that they can complement what has already been selected in July – but I wouldn’t be surprised if we end up with a Round 5 and 6 and on and on.

  • Tim Callan

    That’s just what I was gonna ask you. Like what are your thoughts? Like how long does Round 4 go on? How is Round 4 resolved? And do we just keep having rounds forever? Is this just the new normal?

  • Bruno Couillard

    I think once - - my feeling is that - - we’ve used cryptography commercially for almost 30 years now.

    And if you guys recall, in the early days of the internet – mid-90s/late-90s – there was this big, big issue with cryptography: export-controlled crypto, the Clipper chip. That was the era where designers of protocols and designers of APIs had to be careful not to allow for flexibility – it was anti-crypto-agility, if you wish. That legacy remains to this day. We basically have one algorithm that does a job, and it’s only one algorithm. We don’t have many protocols that have this crypto-agility mindset, because it was against the kind of rules that created the internet in the first place. What post-quantum is causing the world to adopt and rethink is this idea that maybe we need to have crypto-agility at all layers – at the cryptographic engines, at the APIs, at the protocols.

    And once you adopt this mantra, switching and modifying and migrating to new algorithms may become less of a pain in a world where we are going from a kind of monolith construct to having to adopt and adapt ourselves and our protocols to this new era. I do think once we get there and we’ve built up the tools and we have refactored our cryptographic stacks to deal with a multiplicity of capabilities, we may end up with less pushback to constantly allowing new changes and new updates, because cryptography continues to evolve. But the tools we are gonna use in the future to break crypto will be quantum tools, quantum computing tools. We will learn to use those quantum computers, probably doing things we have not yet imagined, and I suspect quantum computers of the future will have much more grit to break our crypto than we are thinking of today. So I suspect we will be having a bit of a churn there.
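One concrete reading of crypto-agility at the API layer is that callers name an algorithm rather than hard-coding one, so new algorithms – including post-quantum ones – can be registered without touching the callers. Below is a minimal, illustrative Python sketch of that idea; the registered “algorithms” are HMAC stand-ins, not the real signature schemes a production stack would plug in.

```python
# A minimal sketch of an algorithm-agile signing API: callers pass an
# algorithm name, and new algorithms are added by registering them, not by
# changing caller code. The entries here are HMAC stand-ins for illustration.

import hmac
import hashlib
from typing import Callable, Dict

Signer = Callable[[bytes, bytes], bytes]   # (key, message) -> signature

REGISTRY: Dict[str, Signer] = {}

def register(name: str, signer: Signer) -> None:
    REGISTRY[name] = signer

def sign(algorithm: str, key: bytes, message: bytes) -> bytes:
    if algorithm not in REGISTRY:
        raise ValueError(f"unknown algorithm: {algorithm}")
    return REGISTRY[algorithm](key, message)

# Today's stand-ins; a real stack might register ECDSA, Dilithium, and so on.
register("hmac-sha256", lambda k, m: hmac.new(k, m, hashlib.sha256).digest())
register("hmac-sha3-256", lambda k, m: hmac.new(k, m, hashlib.sha3_256).digest())

print(sign("hmac-sha256", b"secret-key", b"hello").hex())
```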

  • Tim Callan

    It’s interesting. You are bringing up a very good point, which is that cryptography in one way is one of these kind of old, like you said, monolithic things. RSA was invented in the 1970s and everybody just kind of said, well, this is how it’s done. It’s just sort of foundational and it’s always there, and I am reminded of Y2K, where all of these decisions had been made – again, back in the 50s, 60s, 70s – and nobody could imagine a time when years didn’t start with 19, and then suddenly it came, and everything had to change. And it was this huge effort to go sort of rip out the ossified and put in the agile. So, this is a similar thing here. As we are ripping out the ossified, what you are saying is make sure we put in the agile. Don’t put in a different flavor of ossified, because if we do that then we are back to the same problem fundamentally.

  • Bruno Couillard

    I think typically, if you leave engineers and the designers of protocols and stacks to fix these things, they fix them for the long haul. A perfect example is how we now represent time – I’m not sure we’ll run out of bits to express time anytime soon.

    Another one where we have fixed it, or given ourselves more longevity, is the move from IPv4 to IPv6. We’ve expanded the space so much that I don’t think we are gonna run out of IPv6 addresses anytime soon. I think we are gonna end up more or less building our cryptography 2.0 stack, because we’ve lived under 1.0 until now, and we are now in the process of having to deal with the fact that we need to revisit all of these decisions of the past. But I do think when we get to cross that chasm, we will likely end up with a very flexible, much more agile, and built-to-remain-agile sort of construct. That is what I believe. Because we tend to have that approach as we come across these big barriers to progress. Usually, when we do put together the next generation, we tend to ensure that this barrier is not gonna be on our roadmap anytime soon in the future.

  • Jason Soroko

    Hey, Bruno. Just a quick thought on what you just said. I’m trying to anticipate things where I know we are going to need agility anyway. I can see isogeny-based cryptography coming back at some point in the deeper future. I can’t see NIST wanting to just completely ignore it, and we are probably gonna see a new implementation even after the SIKE incident. I can also see, as you just said, quantum computers evolving, probably at a linear rate, perhaps into almost eternity, where quantum computers become more and more powerful. So, what we saw in the past with traditional cryptographic algorithms – where we had to deprecate older styles, keeping the same algorithm but upping the bit length – I can even see that being another thing we have to be agile for. And, Tim, I know you had a topic coming, but I do want to bring up how we’re gonna address that agility, Bruno, which brings up the topic of hybrid certificates and some of the recent announcements about that. But, anyway, that’s my two cents in terms of the fact that I just can’t agree more about the need for agility going forward.

  • Bruno Couillard

    I’m in full agreement here as well, and again, it’s amazing to see how much of an impact the last 30 years of evolution of the internet have had. If you guys think back to 1994-ish, Netscape came out, SSL came out, PKI came out, pretty much in the same years. Just a few years before that, the web, or HTTP, had been invented. From that timeline to today, if you sit back and think of how impactful the internet is in our lives, in our day-to-day lives – whether it’s ordering things on Amazon, or the delivery, the logistics behind the scenes that get food into stores, or the current conversation we are having – everything nowadays somehow has a touchpoint to the internet and the security layers that are very much at the foundation of it.

    So, I’ve often used the analogy, especially with my parents: what we are having to do here as a cryptographic community is almost the same thing Boston did with the Big Dig. You have infrastructure that is hugely important to the livelihood of a city, and you have to build underneath it. So you have to lift the current infrastructure up in the air while it’s running, maintaining its fluidity and functionality, then go dig underneath, replace the entire gut of it, put the new thing in place, then bring everything back down – and hopefully you haven’t stopped the traffic, and you’ve extended its life for years and years and years. I usually use that to explain what we are about to do from a crypto migration standpoint. We are gonna have to do pretty much the same thing on a global basis. Every place you think there’s crypto, you have to lift it, change it, and then replace it. So it will definitely be a fun journey. I can assure you of that.

  • Tim Callan

    I agree with you, and we’ve talked about how I think in a lot of ways the next step is standards. Standards bodies, now that they have these primitives, have to go in at a much deeper level and say, ok, we’re gonna explain to you how you have to implement this so that it is interoperable, secure, future-proofed, backward compatible, etc. And all of those things definitely are gonna have to go on now. The other thing that’s interesting, I think – and, Bruno, I know that you understand this well – is there’s been a lot of direction from the U.S. Government on this topic.

  • Bruno Couillard

    Absolutely. Looking back at 2022, there was a definite drumbeat of flags, or signals – strong signals – starting with a first memo coming out of the White House on January 19, where they indicated that cyber security is critical and quantum computing is arriving, so they were basically signaling: let’s gear up here. We’ll have to go and visit our current crypto inventory. We’ll have to figure out where things are at risk. What can we change? What are the priorities? That was the first kick at the can, I would call it. Very quickly after that, the White House came out in May with an even more, not stringent, but more detailed plan for how their departments would gear up to be quantum safe. And as the year went by, as you said, there was the announcement by NIST, and as soon as that came out, NSA came out in September with their update. They had said all along that they would follow NIST’s lead. So, NSA, in September, came out with an update to CNSA, their policy for cryptography, and they said, in essence, we will use the standards that NIST is working on. And they went further and suggested – well, they kind of stated – that in 2025 they are expecting vendors to use hash-based signatures, stateful hash-based signature algorithms. So this is the point I was making earlier. Thou shalt use those algorithms to sign your code and your firmware on machines you deploy in national security systems. I thought this was very powerful. 2025, in the world we live in, is tomorrow. I mean, you don’t have much time here.

    And then on the 18th of November, there was another memo, this one from the Office of Management and Budget, that absolutely took me by surprise. The info in there again built on what had already been stated, but this one had a slightly increased level of urgency, if you wish. In that memo, they were suggesting that departments could and should start working with prototype products even if they are not standard – go test them out, go deploy them in your operational environment to figure out how they are gonna behave in the real world. That was the first time they were opening the door to doing work with non-standard products and technology, and it looked to me like, wow, things need to be done soon here. Someone somewhere wants to get things done. Fast. So, I do think there was certainly a plethora of very strong indicators – at least from my perspective – from the U.S. White House and NIST and NSA that this is a real and serious issue and we have to get going.

  • Tim Callan

    This is one of the things that we’ve been trying in our small way to emphasize: don’t just sit around and wait for it to be handed to you. Get knowledgeable. And part of that is hybrid certs. So another interesting announcement this year was that ISARA put its hybrid certificate patent – or technology, I’m not sure what word I want to use – into the public domain. So, tell us about that.

  • Bruno Couillard

    It depends on how you think about the transition. For example, you as a public CA, when you issue a certificate to a server, there’s a good chance that server will be contacted by thousands and thousands of browsers and clients that may not all be ready on the same day to have transitioned to post-quantum.

  • Tim Callan

    They’re guaranteed not to be.

  • Bruno Couillard

    Pretty much. It would be a miracle if it ever happened.

    So the idea and the beauty of the concept of hybrid certificates is that for those machines that have not yet transitioned, they can be sent an X.509 certificate which looks and feels like a normal X.509 certificate. It has an outer body. It has a signature. All seems ok. But one difference is that it has very bulgy extensions at its core, and those extensions are non-critical. The idea here is that every piece of your post-quantum material – your post-quantum signature, the public key to validate that signature or the public key of the subscriber, and the algorithms – is carried as an extension, an X.509 v3 extension, that is part of your general-purpose X.509 certificate.

    So, if I receive this and I don’t have a clue as to what these extensions mean, given that they are not critical, I simply skip over them and move on. So as far as I am concerned, if I’m the client receiving this hybrid certificate and I’m not equipped to read and use this post-quantum material, I’ll simply ignore it for the time being. But if at some future date I get my firmware update or my software gets updated, and I’m now able and capable of reading these extensions, I now have the ability to verify my certificate both on its classic signature, which is the outer shell, as well as validating the post-quantum signature that’s now inside this shell. So hybrid certificates have this ability to allow for a smooth transition, in a world where, as you said, Tim, having everybody ready on the same day would be a miracle – and I’ve never, ever, ever seen it.
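For readers who want to see the mechanism, the non-critical-extension idea can be sketched with the pyca/cryptography package, assuming a recent release. The OID and payload below are made up purely for illustration – they are not the actual ISARA or ITU-T alternative-signature extensions – and the outer signature here is ordinary ECDSA on a self-signed certificate.

```python
# Sketch: embed placeholder "post-quantum" material in a NON-CRITICAL X.509
# extension so that legacy clients can safely ignore it. Illustrative only.

import datetime
from cryptography import x509
from cryptography.x509 import ObjectIdentifier, UnrecognizedExtension
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

key = ec.generate_private_key(ec.SECP256R1())
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, u"hybrid.example.test")])

# Made-up OID standing in for a post-quantum signature extension.
PQ_SIG_OID = ObjectIdentifier("1.3.6.1.4.1.99999.1")

cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
    .add_extension(
        UnrecognizedExtension(PQ_SIG_OID, b"placeholder post-quantum signature"),
        critical=False,  # non-critical: clients that don't understand it skip it
    )
    .sign(key, hashes.SHA256())  # the classic outer signature
)

# A legacy relying party never looks this OID up; an upgraded one can:
for ext in cert.extensions:
    if ext.oid == PQ_SIG_OID:
        print("post-quantum material present:", ext.value.value)
```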

  • Tim Callan

    And otherwise, it just is so much harder. Like, you imagine trying to accomplish this without hybrid certificates and it’s just a nightmare.

  • Bruno Couillard

    Yes. I’ve done some projects way back in my life where a very, very big and smart and well-funded customer was contemplating, as one of their ideas, setting up two parallel PKIs. And, again, they had resources; they had money; they had the wherewithal to achieve this; and in the end, they abandoned the idea because of the massive amount of headaches it was causing. So, transitioning from classic to post-quantum in a world with large, large numbers of customers – if you don’t have hybrid certificates, I don’t know how you do it. I really don’t. It’s too much of a challenge.

  • Jason Soroko

    I think with hybrid certificates as well, Bruno – talking about that agility topic from earlier, which is just so darned important – even the ability to make lateral movements from one post-quantum algorithm to another becomes possible, and even the ability to switch between RSA and ECC, the classic algorithms, is possible.

  • Tim Callan

    Or a new thing that we don’t use today that is a result of Contest 4 or Contest 5 or Contest 6.

  • Bruno Couillard

    I was actually talking with someone just earlier today. I think – and I’ll probably be shut down by many for saying this – but I do think you may end up with a world of tomorrow where, say, we become satisfied with the ability to validate both Dilithium and ECDSA signatures, as an example. So then, one day in the future, we make our certificates carry those two signatures as what they call composite signatures. Both signatures have to be validated. But you could end up with a world where you’ve got an X.509 certificate that has an outer shell requiring two validations with two different algorithms – say ECDSA and Dilithium, as an example – and you could still carry inside that certificate, as a hybrid kind of delivery mechanism, a brand-new idea that the world of tomorrow will come up with. So, in essence, you could have a composite that embeds the hybrid idea. That is maybe the extreme version of crypto agility, or certificate agility, but I do think we may end up having use cases where that will have to be part of our ecosystem in the future as well.
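A small sketch may help pin down the distinction being drawn here, under illustrative assumptions: a composite verifier requires every listed signature to check out, while a hybrid-style verifier checks whichever algorithms it understands and skips the rest. The verifier callables below are stand-ins, not real ECDSA or Dilithium routines.

```python
# Composite vs. hybrid verification logic, with stand-in verifiers.
# Composite: ALL signatures must be understood and must verify.
# Hybrid:    verify the algorithms this client knows; ignore the rest.

from typing import Callable, Dict

Verifier = Callable[[bytes, bytes], bool]   # (message, signature) -> valid?

def verify_composite(message: bytes, sigs: Dict[str, bytes],
                     verifiers: Dict[str, Verifier]) -> bool:
    return all(alg in verifiers and verifiers[alg](message, sig)
               for alg, sig in sigs.items())

def verify_hybrid(message: bytes, sigs: Dict[str, bytes],
                  verifiers: Dict[str, Verifier]) -> bool:
    known = {alg: sig for alg, sig in sigs.items() if alg in verifiers}
    return bool(known) and all(verifiers[alg](message, sig)
                               for alg, sig in known.items())

# Toy stand-in verifiers; real ones would wrap ECDSA / Dilithium libraries.
verifiers = {
    "ecdsa-p256": lambda m, s: s == b"ecdsa-ok",
    "dilithium":  lambda m, s: s == b"dilithium-ok",
}
sigs = {"ecdsa-p256": b"ecdsa-ok", "dilithium": b"dilithium-ok"}

print(verify_composite(b"msg", sigs, verifiers))  # True only if both pass
print(verify_hybrid(b"msg", sigs, verifiers))     # True for whatever is known
```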

  • Tim Callan

    So that’s an interesting perspective. Don’t just think of hybrid certificates as a transitional thing for PQC. They may also just be part of our future in a more crypto-agile world?

  • Jason Soroko

    They can also be part of right now, Bruno. Which means, if you take a look at what Cloudflare is doing, just as an example – they have a press release on this where they said they are gonna be implementing hybrid certificates and draft standards of post-quantum algorithms. So lattice-based post-quantum cryptography is being used right now, even though the official standards are not finished, and they are using hybrid certificates in order to give themselves the ability and the agility to move from one draft standard to the next. It’s just a really good example.

  • Tim Callan

    It’s crazy. And if anyone is gonna do it, it’s gonna be Cloudflare. They have such a giant network that they control. So, there’s footprint. There’s technical resources. It’s a very technically adept company and it’s an extremely agile company, and when you put all those things together, that would be the people who come out with something like this. But, still, like, wow. You want to talk about being very early to the party – to have Cloudflare implementing these things the same year the winning algorithms were announced is just super fast.

  • Bruno Couillard

    And I think, thanks to their efforts and their process, we will keep encountering places where these new constructs and the size of these certificates or keys will cause us to revisit pieces and components that we hadn’t even anticipated changing. So, we need to explore. We need to test. We need to be trying out those combinations, and we need to do it as fast as possible, because it sounds like quantum computers are potentially more advanced than we might think.

  • Tim Callan

    That’s possible as well.

    So, Bruno, I can’t let you leave without asking you to reiterate something that you told Jason and me – not on this podcast but just separately – you told us about I guess an announcement that came out earlier this year from the Cloud Security Alliance. Why don’t you tell us about that?

  • Bruno Couillard

    Yes. I thought it was awesome. On March 9, the Cloud Security Alliance decided to set the date by which everyone had to be quantum ready and that date, they picked the 14th of April 2030.

  • Tim Callan

    April 13, you are fine. April 15, you are too late.

  • Bruno Couillard

    That’s it. That’s it. And I thought it was awesome. When I saw that I thought, wow, that’s cool.

  • Tim Callan

    That’s cool. Like, Jason and I have been talking about this a lot for years, haven’t we, Jason? And, um, it’s just been such an eventful year. Probably the last question, gentlemen, unless I missed something: when we look back 365 days from today and we look at 2023, are we gonna say it was the most eventful year in the history of post-quantum crypto? Is that the new trajectory we are on?

  • Bruno Couillard

    That’s an interesting one. Definitely the future is getting harder to predict.

  • Tim Callan

    Yes. Yogi, what’s your opinion on that? What do you think? What do you guys think?

  • Jason Soroko

    I think, Tim, that Bruno is onto something suspecting that this big push by the U.S. Government specifically means that there is something probably going on that we are not completely in the know about yet and it might come out a little more in 2023 and if it does, then I guarantee 2023 will be the biggest year in post-quantum history and I suspect that there’s something spooky at the spooky level that’s going on that’s causing this push and you heard it here from Bruno just how hard they’re pushing it and the timeframes being pushed. So more of this may come out and there may even be a harder push next year. That’s what I think.

  • Tim Callan

    And even without that, we’re gonna see all of industry spinning up. Like, it’s gonna be the year that industry spins up. And that’s gonna mean lots of events, lots of change, lots of progress is gonna occur in 2023.

  • Bruno Couillard

    And my personal opinion is that NIST’s National Cybersecurity Center of Excellence, or NCCoE, has done a really awesome job, I think, at trying to demonstrate this capability and industry’s ability to address the changes on the go here. So if anyone is interested, I would suggest checking out the NCCoE post-quantum migration project. You will find many companies that are working together to figure out those challenges as a team. And there’s a lot. I could probably go on for hours. Being an HSM guy – the interfaces, the API changes, the concepts. It used to be that you called in for a specific key for your algorithm. In the future, you’ll have to think of it as: well, do I use a key? Is it multi-key? How do I refer to it? There’s a whole lot of different places where assumptions that we made back in the 90s will have to all be revisited.
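One illustrative way to picture the API shift Bruno alludes to is a key handle that refers to a bundle of per-algorithm keys rather than a single key for a single algorithm. The names and structure below are hypothetical – not any real HSM interface – and the signatures are placeholder bytes.

```python
# Hypothetical sketch of a "multi-key" handle: one logical signing identity
# backed by several per-algorithm keys (e.g. a classic key plus a
# post-quantum key), signed with as many algorithms as the caller asks for.

from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class KeyBundle:
    label: str
    # algorithm name -> opaque per-algorithm key reference (e.g. an HSM slot)
    members: Dict[str, str] = field(default_factory=dict)

    def algorithms(self) -> List[str]:
        return list(self.members)

def sign_with_bundle(bundle: KeyBundle, message: bytes,
                     algorithms: Optional[List[str]] = None) -> Dict[str, bytes]:
    """Return one placeholder signature per requested algorithm."""
    wanted = algorithms if algorithms is not None else bundle.algorithms()
    signatures = {}
    for alg in wanted:
        key_ref = bundle.members[alg]
        # A real implementation would dispatch to the HSM for this key_ref.
        signatures[alg] = f"sig({alg},{key_ref},{len(message)} bytes)".encode()
    return signatures

bundle = KeyBundle("code-signing",
                   members={"ecdsa-p256": "slot-17", "dilithium": "slot-18"})
print(sign_with_bundle(bundle, b"firmware image"))
```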