Podcast Jan 10, 2025

Root Causes 455: PQC Standardization in IETF

We talk with guest Sofía Celi of Brave, who chairs the IETF's PQC standardization work, about the process of setting standards for PQC-compatible digital certificates. We learn about expected timelines, hybrid strategies, the role of the NIST PQC onramp, and more.

  • Original Broadcast Date: January 10, 2025

Episode Transcript

Lightly edited for flow and brevity.

  • Tim Callan

    Thrilled to be having a guest. I always like having guests, and I'm especially thrilled because this is a guest we've been trying to get on the episode for, what, maybe nine months. Before you answer that, let me introduce Sofía Celi. Sofía, welcome.

  • Sofía Celi

    Thank you. I'm happy to be here.

  • Tim Callan

    I think it's been like nine months we've been trying to get you on this show. You do a lot of things, and I don't want to list all of them - maybe in a future episode, if we have you back - but a couple of things I think are relevant to us today. First of all, you are a cryptographer for the Brave browser. In addition to that, I know you chair several working groups in the IETF, including the postquantum cryptography working group. Tell us about that.

  • Sofía Celi

    Yes, that is correct. I have several involvements in the IETF, either in the research part, which is called the IRTF and focuses on forward-looking research, or in the IETF proper, which standardizes protocols for use on the internet. I chair two groups there: one research group, called Human Rights Protocol Considerations, and then, on the IETF side, the postquantum use in protocols working group - the PQUIP working group. I'm also part of the Ombudsteam of the IETF.

  • Tim Callan

    Like I said, you do a lot of things. I'd love to focus today - and correct me if you don't think this is right, but I believe it is - on what's kind of a cascading protocols and standards process. First we had to get those FIPS standards out of NIST: we had to have the contest, we had to get the winners, we had to convert those into FIPS standards. That needs to be turned into certificate standards and communication protocols, which we will do through IETF, and then that can flow down to other bodies - CA/Browser Forum is one, ETSI is another - so they can build their own standards. Am I articulating that correctly?

  • Sofía Celi

    That is correct. Usually we have regional standardization bodies that standardize cryptography for national-level interests. That is the case with the National Institute of Standards and Technology in the US. Other regional bodies also sometimes standardize cryptography - that has happened in Germany, and sometimes at the EU level. We do have a place where we standardize cryptography at the IETF, which is called the CFRG, the Crypto Forum Research Group, part of the research arm of the IETF. But in general, if there is cryptography that has already been standardized by the regional bodies, we use it directly in the big protocols that secure the communications of the internet. So you will use it in DNSSEC, or in TLS, or in any of the protocols the IETF defines.

    And it is true that eventually, once they become standards that are incorporated into the protocols that secure the transport of the internet, a lot of companies and organizations adopt those standards and push them. The big CDNs and the server side of things will adopt them, and on the client side the browsers will start adopting them too - and, of course, all of the parallel concerns that surround them, like certificate management, key management, even UI considerations. There are a lot of places that the internet eventually touches.

  • Tim Callan

    You guys are critical to the rollout of PQC. Like, in a realistic way, we need the work that IETF does to roll out PQC. And this isn't me buttering you up. This is me leading up to a question which is, what is IETF up to in this regard? Where does it stand? What are the plans? When are we going to see them? What are you guys doing?

  • Sofía Celi

    We actually started pretty early to consider postquantum cryptography in internet protocols, because even while the competition at the NIST level was still ongoing, there were already proposals for how to integrate the candidates into communication protocols.

    For example, one of the biggest drafts we currently have is at the TLS working group: how to add a KEM into the key exchange of the TLS 1.3 protocol. That draft has existed for years now, so we were almost on par, but we were still waiting for NIST to make the final decision on who the winners were and actually publish the set of standards, in order to turn those drafts into RFCs.

    At the moment, because those drafts already exist and have been parked for a while, what is happening is that we're actually trying to transform them into proper RFCs so people can use them. There are a bunch of drafts currently: a lot of them at the TLS working group, some at the MLS working group, one at the ACME working group. A lot of working groups have already started preparing. The focus has been mostly on the confidentiality parts of the protocols. The way we establish a shared secret is where we are putting most of our focus, because we consider that there is a concrete attack against the confidentiality part.

    Someone can harvest now and decrypt later. There has been less focus on the authentication part, though I do think that next year we will see the rise of the authentication part, mostly because two or three weeks ago Amazon published that they are going to be issuing certificates for their internal, private CA with ML-DSA. That means the industry is already trying to issue these certificates, even for private use cases. This is a private CA, but it already signals that there's interest in actually experimenting with and deploying the authentication part.
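
Editor's note: For readers who want to see the shape of what Sofía describes, here is a minimal Python sketch of a KEM-style key exchange as used in the TLS 1.3 drafts. The toy KEM below is a stand-in for ML-KEM (FIPS 203) and is not secure; it only illustrates the keygen/encapsulate/decapsulate interface and the message flow.

```python
import hashlib
import os

# Toy KEM standing in for ML-KEM. NOT secure - interface illustration only.
def kem_keygen():
    dk = os.urandom(32)                           # decapsulation (private) key
    ek = hashlib.sha256(b"ek" + dk).digest()      # encapsulation (public) key
    return ek, dk

def kem_encap(ek):
    r = os.urandom(32)
    mask = hashlib.sha256(b"mask" + ek).digest()
    ct = bytes(a ^ b for a, b in zip(r, mask))    # "ciphertext" sent back
    ss = hashlib.sha256(b"ss" + r).digest()       # shared secret
    return ct, ss

def kem_decap(dk, ct):
    ek = hashlib.sha256(b"ek" + dk).digest()
    mask = hashlib.sha256(b"mask" + ek).digest()
    r = bytes(a ^ b for a, b in zip(ct, mask))
    return hashlib.sha256(b"ss" + r).digest()

# Shape of the TLS 1.3 flow with a KEM key share: the client sends an
# encapsulation key in its ClientHello, the server encapsulates and returns
# the ciphertext, and both sides feed the shared secret into the key schedule.
client_ek, client_dk = kem_keygen()       # ClientHello: key_share = client_ek
ct, server_ss = kem_encap(client_ek)      # ServerHello: key_share = ct
client_ss = kem_decap(client_dk, ct)
assert client_ss == server_ss
```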

  • Tim Callan

    There are a bunch of directions we can branch from this. I want to grab that last bit you just said - maybe we'll return to some of the other ones - but it's interesting you bring this up. We see pre-standards, homegrown, walled-garden implementations from technology leaders. Amazon is not the first. I think Cloudflare was the first I became aware of, where they just sort of said, none of this has been worked out yet, but we control both ends, so we're just going to make something, and we'll change it later if we have to. How do you feel about that stuff, as somebody who makes standards? Is this good? Is this bad?

  • Sofía Celi

    That's actually great, in my opinion, because one of the reasons why, at the IETF and other standardization levels, we have felt comfortable putting in some of this cryptography is that we have actual research and actual benchmarks that show us that if we migrate to postquantum cryptography, nothing breaks.

    There was a really early experiment launched between Cloudflare and Google Chrome, in 2016 I think, that showed it was feasible to deploy a hybrid key exchange. I think they used an isogeny-based one plus an elliptic curve algorithm. And it showed that it was okay, that nothing really broke. There was some impact on the round trips at the TCP layer, and there were some middleboxes that couldn't handle it, but it was something we could solve. And we saw that quite a lot for the key exchange part. For the confidentiality part, there were a bunch of experiments performed by a lot of companies - there were similar experiments done by Amazon. We saw that less on the authentication part, because it's really difficult to run those emulated experiments: you have to have a way to issue the certificates, and you have to do a coordinated experiment between public CAs and the server and the client. That's much more difficult to launch. So it's great to see this experimentation, and it's exciting, at least from Amazon with a private CA, even when they control both endpoints. Because if they succeed and show us that nothing went terribly wrong, then maybe the public CAs will be able to start migrating, because now we have assurances from the experimentation - actual data and actual numbers that show us everything was fine.

  • Tim Callan

    I get it. Real-world data is always valuable. You said there are a lot of drafts in the works right now, and in particular that you're focused much more on the KEM - ML-KEM - side of things than the signature side of things. Are we therefore expecting to see standards actually come out in stages? Like, are we expecting to see a standard for key exchange before we get a standard for digital signatures?

  • Sofía Celi

    Yes, I would say that's probably what is going to happen. One of the most active groups currently undertaking this postquantum migration is the TLS one, because TLS protects the majority of communications on the internet, and there already exist, I think, at least four proposals for how to integrate ML-KEM into the key exchange part of TLS. Those drafts stand alone, so they don't touch the signature and authentication parts of TLS. We'll have to wait for a separate draft that states how to do the signature part in a safe way.

  • Tim Callan

    I'm just gonna put you on the spot. When are we expecting to see a standard for TLS key exchange?

  • Sofía Celi

    That, I really hope, happens next year, because that specific draft - called the Stebila draft, because it was authored by the cryptography researcher Douglas Stebila from the University of Waterloo - has already been in working group last call for a really long time. I really hope it's going to be published as an RFC in 2025.

  • Tim Callan

    When do you think in 2025? That's a lot of range.

  • Sofía Celi

    Well, the IETF holds three global meetings a year, and I hope it's at the third one, which usually happens - and I'd have to check this - by November at the latest. That would be my hope.

  • Tim Callan

    We’re shooting for this year, but we're really shooting for the end of this year.

  • Sofía Celi

    One of the discussions that has been happening in that working group, which is very interesting, is that for the longest time we thought the approach we were going to take on the confidentiality part was always going to be hybrid. But we have heard from certain organizations that they are fine with not having a hybrid and just directly migrating to the postquantum algorithm - not, for example, doing ML-KEM plus X25519 or one of the NIST curves, but rather just using ML-KEM. There's some debate at the moment about what is most appropriate: hybrid, pure postquantum, or both.
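
Editor's note: A minimal sketch of the hybrid idea under debate, assuming the concatenation approach taken by the TLS hybrid key-exchange draft: the classical and postquantum shared secrets are concatenated and fed into the key schedule (HKDF stands in for it here). The X25519 half uses the real `cryptography` library; the ML-KEM shared secret is a placeholder.

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical half: a real X25519 exchange.
client_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()
ecdh_ss = client_priv.exchange(server_priv.public_key())
assert ecdh_ss == server_priv.exchange(client_priv.public_key())

# Postquantum half: placeholder for an ML-KEM-768 shared secret (in a real
# handshake both sides derive this via encapsulate/decapsulate).
mlkem_ss = os.urandom(32)

# Hybrid combiner: concatenate both shared secrets and derive the session
# secret from the concatenation, as in the TLS hybrid key-exchange design.
session_secret = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid demo",   # a real handshake uses the TLS 1.3 key schedule
).derive(ecdh_ss + mlkem_ss)
print(session_secret.hex())
```

The appeal of this construction is that the derived secret stays safe as long as either component holds up: an attacker must break both X25519 and ML-KEM to recover it.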

  • Tim Callan

    This is something really big you're saying now. I want to make sure I'm understanding this right. You're saying that it is not yet determined whether or not a hybrid standard will be offered for TLS.

  • Sofía Celi

    I think that one will go through, because it has had a lot of support from the working group. What has yet to be determined is whether we are also going to offer pure postquantum. Anybody can do that already, because there are code points registered with IANA that allow you to use only a postquantum algorithm; it's more about whether we're going to create a standard that says it is recommended by the TLS working group to also use only the postquantum algorithm.
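
Editor's note: To make the code-point remark concrete, these are some of the TLS supported-group code points involved. The values reflect the IANA registry as best the editors can determine and are worth double-checking there.

```python
# TLS "supported groups" code points (IANA registry; verify before relying).
NAMED_GROUPS = {
    "x25519":            0x001D,  # classical ECDH
    "X25519MLKEM768":    0x11EC,  # hybrid: X25519 + ML-KEM-768
    "SecP256r1MLKEM768": 0x11EB,  # hybrid: NIST P-256 + ML-KEM-768
    "MLKEM768":          0x0201,  # pure postquantum ML-KEM-768
}

for name, code in NAMED_GROUPS.items():
    print(f"{name:>18}: 0x{code:04X}")
```

A client can offer any of these in its ClientHello today; what the working group is still deciding is which ones a standard will recommend.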

  • Tim Callan

    So I might have two options: I might be able to use a hybrid implementation or a pure PQC implementation. Then, of course, from there, people can proceed to offer hybrid certificates and pure PQC certificates, presumably. Does all of that get yoked into the November release, or is it possible that it winds up being staged releases over time to get through those?

  • Sofía Celi

    I think there will be staged releases. We do have these big meetings at the IETF where, because we meet together, we reach consensus much more easily. But if the discussion happens on the IETF mailing lists and it turns out we reach consensus fairly early, then it can happen that way too.

  • Tim Callan

    Are there any roadblocks or sources of worry? Do you feel good that this will eventually get done and it's just got to get done, or are there risks involved in all of this?

  • Sofía Celi

    I actually think that, for example, publishing all of the drafts on hybrid key exchange for any of the protocols at the IETF - I don't see that there's much risk there, because we already have experimentation from companies showing that it's okay to do. Some middleboxes choked when Google Chrome changed to a different algorithm, but that was eventually resolved. We also have assurances on the security of the algorithms, because NIST has been standardizing them for a really long time. So that seems less of a worry on the key exchange part.

    The authentication part, on the contrary, has been delayed, I think, because it's much more difficult to migrate authentication systems. You have to coordinate with the different Certificate Authorities. You have to come up with a way to expire the now-old certificates - and expiring certificates has always been done horribly. You have to coordinate with a lot of parties. Also, the majority of the signature algorithms that NIST has proposed to standardize have public key and signature sizes that are much larger than their classical counterparts. And we send so many signatures - at the TLS layer, I think we send six signatures in a handshake - that the impact of the sizes is felt much more at the authentication level than at the key exchange level.
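
Editor's note: Some rough arithmetic behind the size concern. The ML-DSA figures come from FIPS 204; the classical figures are typical encoded sizes. The six-signatures-per-handshake count is Sofía's estimate above, and the handshake model is deliberately simplified.

```python
# Approximate signature sizes in bytes.
SIG_BYTES = {
    "ECDSA P-256": 72,     # typical DER-encoded signature
    "RSA-2048":    256,
    "ML-DSA-44":   2420,   # FIPS 204
    "ML-DSA-65":   3309,   # FIPS 204
}

SIGS_PER_HANDSHAKE = 6     # estimate quoted in the conversation

for alg, size in SIG_BYTES.items():
    total = SIGS_PER_HANDSHAKE * size
    print(f"{alg:>12}: {size:>5} B each, ~{total:>6,} B per handshake")
```

With six signatures, ML-DSA-65 adds roughly 19 KB to a handshake that carries well under 2 KB of signatures today, which is why the authentication side is where the size impact is felt most.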

  • Tim Callan

    Sofía, while I've got you - you've already covered so much territory, and I'm taking notes here. As a Certificate Authority, I have a lot of interest in thinking through the signing process and the CSR, the Certificate Signing Request. We know from Bas Westerbaan, and just from thinking it through, that a lot of these postquantum algorithms have much larger signature sizes than we've dealt with with RSA or ECC. I happen to know IETF is working with Bas Westerbaan and others on things like Merkle tree certificates, and also on another project you just mentioned - a lot of people who listen to this podcast are very interested in ACME, and because the CSR process is part of ACME, you at IETF are working on PQC ACME. This whole world around the CSR process, large signature sizes, and how that cascades into necessary changes could probably be a whole episode by itself, but I'd love to hear you characterize it. What is IETF doing? What's the thinking around it in general? Is it a later project, and what's generally going on to help solve this problem?

  • Sofía Celi

    Yes, and you touched on a really interesting point, which is that what we are accustomed to using for authentication in the majority of protocols is signatures. But from a cryptographic standpoint, you can also arrive at authentication without using signatures. You could have a slightly more complex system, or change things around, in order to save some bytes.

    For example, one of the goals of the Merkle tree certificates proposal is to rethink the way certificates are handled so that it's more efficient, without having to think so much about the size of the public key. Or, for example, there's another proposal that I co-authored, called KEM-TLS, which, instead of using signatures, uses KEMs to arrive at authentication - that's also an approach you could take. There are other things you could use for authentication as well.

    At the moment we seem to be fixated on signatures, but maybe it turns out that we cannot handle those signatures in the real world, and hence we have to move to other authentication mechanisms. It may also mean that some protocols will not be able to cope. Take the case of DNSSEC: if it turns out that we cannot simply migrate DNSSEC's signatures to postquantum ones, then maybe we'll have to redo the whole DNSSEC protocol. That's also an approach we could take.

    There are a lot of open avenues. At the moment they are all paused, because we're focusing more on the key exchange part. But if, for example, some of the experiments that Amazon or Cloudflare are running at the moment really show us that the signatures make connections choke, then maybe we should think about other authentication mechanisms.
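
Editor's note: A minimal sketch of the KEM-TLS idea Sofía mentions, reusing the insecure toy KEM from the earlier note. The point is only the authentication logic: the server's certificate would certify a KEM public key, and possession of the matching private key is proven implicitly by deriving the same secret - no online signature is needed.

```python
import hashlib
import os

# Toy KEM (NOT secure - interface illustration only, as in the earlier note).
def kem_keygen():
    dk = os.urandom(32)
    return hashlib.sha256(b"ek" + dk).digest(), dk

def kem_encap(ek):
    r = os.urandom(32)
    mask = hashlib.sha256(b"mask" + ek).digest()
    return bytes(a ^ b for a, b in zip(r, mask)), hashlib.sha256(b"ss" + r).digest()

def kem_decap(dk, ct):
    ek = hashlib.sha256(b"ek" + dk).digest()
    mask = hashlib.sha256(b"mask" + ek).digest()
    r = bytes(a ^ b for a, b in zip(ct, mask))
    return hashlib.sha256(b"ss" + r).digest()

# KEM-based authentication: the certificate pins server_ek. The client
# encapsulates to it; only a server holding server_dk can decapsulate, so
# agreeing on the same secret implicitly authenticates the server.
server_ek, server_dk = kem_keygen()    # server_ek would live in the certificate
ct, client_ss = kem_encap(server_ek)   # client -> server
server_ss = kem_decap(server_dk, ct)   # proves possession of the private key
assert client_ss == server_ss
```

One trade-off worth noting: the client only learns the server is authentic once the server demonstrates it derived the same key, which is one reason KEM-TLS rearranges the handshake.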

  • Tim Callan

    This is an important point. I want to make sure that we're communicating this correctly to the listeners, so I'm going to play it back, and I want you to correct me if I'm getting this wrong. You're saying that we're going to try to do authentication with signatures, but we recognize that may not work, and in the event that it doesn't - for some protocols, or possibly many - we may actually switch authentication strategies and release standards that use a different strategy for authentication. That is a possibility.

  • Sofía Celi

    Yes, that could be a possibility. So protocols can also change.

  • Jason Soroko

    Sofía, I have to then ask, with that in mind - and I love the words you used: there are a lot of open avenues. We're going to start to feel that we're closer once certain routes are blocked off and we decide we're going down certain routes, because we've learned things. My question to you is, with NIST having now set deprecation dates for RSA and ECC at the current bit lengths, I'm very curious to know whether that has lit a fire underneath IETF. I know you've already been working for a very long time on a lot of these things, but in order to get to that next phase - getting to CA/Browser Forum and the other standards and guidance bodies that need this output from you - there's only X number of years left before it's go time. I'm just curious, did that deprecation date from NIST heighten the need to move quickly within IETF?

  • Sofía Celi

    It actually does heighten it, but the IETF usually takes a more independent approach, because there's also the possibility that certain regional bodies or certain agencies from different countries will say they need to deprecate something, and in that case the IETF doesn't block them. You can always register a code point for an algorithm with IANA and then run TLS 1.3, for example, just by switching the algorithm if you want. That's not constrained by any of the standards.

    If you want something to actually become a standard of the IETF, that's different, because then what you have to do is present the proposal and get it standardized by the IETF. But nothing prevents people from assigning a code point to an algorithm and using it in the infrastructure. The deprecation dates are definitely something we keep in mind, but the IETF forms its own opinions independently. It is a point we take into account, but we also decide independently what the consensus of the different parts of the internet is.

  • Jason Soroko

    That's good, Sofía. I completely understand what you're saying: certain organizations can go off and do their own thing, and there are positive aspects to that, as you've just said, with some of the experimental work going on at Cloudflare, Amazon, and others. I guess for me, though, it comes down to the fact that things like the CA/Browser Forum are important to the entire internet, so there are dependencies there that are critical. But that's it. Thank you for the answer.

  • Sofía Celi

    Yes, but I think at least in the case of the CA/Browser Forum there will be more coordination with the IETF, because usually there is. In the past there has been some coordination.

  • Tim Callan

    Honestly, I'm expecting CA/Browser Forum will be able to mostly take what IETF hands us, wrap a little bit around it, and be ready to go. I'm expecting CA/Browser Forum's work here to be pretty light, because the heavy lifting, I think, will all be done by IETF. Which is good, because CA/Browser Forum is going to want to move fast. I asked that question partly out of interest for the listeners, but partly because I care. I want to know. I'm in CA/Browser Forum. That's when we pick up the baton and run.

  • Sofía Celi

    Just to add one extra thing here: another thing we're waiting on at the IETF is the NIST postquantum onramp process that is happening, because the submissions to that specific new NIST process include signature algorithms that seem to have smaller public key sizes, or at least running times that are much more efficient. So maybe, as a result of NIST's onramp competition process, we'll have an algorithm that is much better suited for the IETF.

  • Tim Callan

    Is there time for that? Like, the first contest took, how many years? Seven years?

  • Sofía Celi

    Yes. A lot.

  • Tim Callan

    Do we expect the Onramp to be significantly shorter, because I've always been assuming, and maybe I'm wrong here, and please, correct me if I am - I've been of the impression that we were going to go to market, if you will, with the existing winners and possibly the Round 4 winner and then there would be a future update, if that's the right word, to account for the Onramp. Are you saying that's not the case?

  • Sofía Celi

    There are a lot of companies on the internet who are interested in deploying the winner, which is ML-DSA - which is Dilithium; it is Dilithium - and essentially they will be moving. But at the IETF level there has not been a strong decision to use only ML-DSA, because we're still dealing with the confidentiality part, and there are some parts of the IETF that, on the contrary, have been saying, okay, let's maybe wait a little bit to see what the results are. And it turns out that in the onramp's second round, things are actually moving much faster. I don't know if it's because NIST already has the experience from the first postquantum competition and now feels it can move things faster. But, for example - I'm part of one of the teams - at some point they actually told us they would be announcing some of the second-round selections this year, and they did so. I think this one will actually be a little bit faster than the prior one.

  • Tim Callan

    Are you predicting? I think we should all reconvene to have a separate conversation about the Onramp, because we could spend a whole episode just on that. Are you predicting that we will have Onramp winners in 2025?

  • Sofía Celi

    Oh, 2025? No. Maybe 2026.

  • Tim Callan

    No. Maybe 2026. But surely we're all not going to sit around until 2026 to figure out our authentication problem.

  • Sofía Celi

    No, but I do think one thing that should happen next year is that people start doing these experiments with ML-DSA, as AWS is currently doing, to actually check that everything works. Even if it's a private CA, it will really give us a lot of insight, because the majority of experiments we have on that are just emulated. Actually having real-world data will be great. I know that Cloudflare - Bas - also did an experiment, not with the specific algorithm: what they did is add a lot of bytes so that the handshake had the same size it would have with the postquantum signature algorithm, and with that they saw how the internet reacted to it. Those kinds of experiments will be great to see.

  • Tim Callan

    Now we've talked a lot about ML-DSA. NIST actually awarded three digital signature algorithms as winners and has at least put out two FIPS standards for those. What about the others?

  • Sofía Celi

    Yes, there has been some discussion about them. Two of them are very similar - I think they're just going to be considered almost the same - and in that case maybe ML-DSA is going to be the one that is prioritized.

    Then there is one that is based on hashes. It's a hash-based algorithm that was called SPHINCS+ - I don't remember anymore what NIST is calling that one [SLH-DSA], I don't remember the new name, but it's SPHINCS+ - and that one, for example, has been really interesting for other protocols. It has not been discussed at the TLS level, but some people have actually proposed using it at the DNSSEC layer, because it has some sizes that are more amenable to being used in DNSSEC. But it would mean that you have to change, in a way, how the DNSSEC protocol works. There's still some discussion about whether that's the best path forward.

  • Tim Callan

    Okay, got it. That's a lot of work to do. Am I correct in thinking that probably gets settled a little later than, let's say, KEMs in TLS, because that is the most critical-path item for us to solve?

  • Sofía Celi

    Yes. It's also a question that some people at the IETF have been discussing, at least in regard to protocols such as DNSSEC or IPsec. The industry has not widely adopted either of those protocols because they are a little difficult to implement - I have had to do it, and it's not so nice. Some people have actually proposed that since we're already going to be migrating to postquantum cryptography, why not make this also the time that we make those protocols more usable. We could release some versions of the protocols that are slightly easier to use.

  • Tim Callan

    Well, we may need to try to get you to come back and just have a whole episode just on that. I think this could go forever. I'm going to ask only one more question, and then Jay, feel free to do whatever you want. Are we still going to be using TLS 1.3? Is it going to have to be a new version of TLS? Is it going to have to be something that we're not even calling TLS? What are we going to do with that?

  • Sofía Celi

    I think for a while we're going to be having TLS 1.3. Something really interesting that also happened some weeks ago is that we are publishing an RFC which makes TLS 1.2 frozen. No more features or enhancements to TLS 1.2 will happen, which essentially starts the deprecation path for TLS 1.2, and that means that across the industry - in whole or in part - we want to see the full migration to TLS 1.3. Because even though it has been widely adopted, it is not adopted by everyone. So we would love to see it adopted more widely.

    Then there have also been some people talking about what a future, new TLS would mean. One of the things that proposal will, for sure, have to have is quantum security, and some people have also claimed that it would be great to incorporate the privacy mechanisms that certain extensions of TLS 1.3 have. Maybe good mechanisms to check for certificate expiration, or to check whether certificates are included in transparency logs while preserving privacy when you perform those checks, should be considered as well. Those are open avenues too, if we want to think about a new TLS.
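
Editor's note: On the client side, the TLS 1.3-only posture Sofía hopes for can be opted into today with Python's standard library. A small sketch; `example.com` is just a placeholder host.

```python
import socket
import ssl

# Build a default client context, then refuse anything older than TLS 1.3.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_3

with socket.create_connection(("example.com", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
        print(tls.version())   # expected: 'TLSv1.3'
```

Servers that have not migrated will fail this handshake, which is exactly the pressure the TLS 1.2 freeze is meant to create.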

  • Tim Callan

    Wow. Okay. I think maybe we need to have a conversation about that. Jason, I will certainly hand the floor over to you. Before I do that, I'm going to say: Sofía, I almost feel like you've posed more questions than you've answered in this conversation, and I certainly hope we can coax you into coming back, breaking off some of these other topics, and having a nice in-depth conversation just on them.

  • Sofía Celi

    For me it's a very exciting time, because I feel like the migration is opening so many avenues to explore on the internet. That is very exciting. Open questions are great for research and for moving the industry forward.

  • Jason Soroko

    Thank you, Sofía. I'm going to take this time to throw it right back at you, in terms of giving you the chance to address this audience - a very, very interested audience in the specific topic area you're speaking to. I don't think a lot of people realize just how important you are to the overall process of how this is happening. I'd love to give you the chance to have the final word: what would you like to say on behalf of IETF to the rest of the world listening right now, at the end of 2024 going into 2025?

  • Sofía Celi

    From the IETF level I want to say: we want more people coming to the IETF with different experiences and different things to contribute. For example, if you're thinking of adding more privacy-preserving technologies to certain parts of internet protocols, or of changing some protocols so that they are more amenable to using postquantum cryptography, that is definitely something great that we value a lot. If you come from an underrepresented community, whose voices we sometimes don't hear, that is also great, because sometimes when we create the threat models for different internet protocols we fail to capture them all, since we don't always have those specific people sitting in the IETF conversations. If you come from there, that would be great.

    If you have exposure to how the internet works in different regions of the world - because you see different types of censorship, or you see internet degradation because connectivity is not great - that is also something the IETF is deeply interested in, and it touches a lot on postquantum cryptography, because the majority of the experiments we have done have been under perfect internet conditions. But that's not the whole world. If you have actual experience of connectivity that is not great or is heavily censored, and you tried to deploy postquantum cryptography and it failed because it made things even worse, then we would really love to hear that specific perspective. The perspective of the IETF at the moment, in my opinion, is actually to listen more to all of the different cases that the world has.

  • Jason Soroko

    My final thought, then, is that as we approach 2025, with not a lot of years left, I would love to hear when you think - and this is not a question to answer now; we'll have you on a future podcast - it's going to start to reverse, when we start to close off the avenues and get closer to final decisions on a lot of the things that need to be done. Because there are just not a lot of years left before we have to come up with solutions that really are very important to the entire internet.

  • Sofía Celi

    My hope would be that no later than 2027 we would have at least the standards for the confidentiality part of the protocols, and a good path for the authentication part of the protocols. That would be my hope - no later than that. But I also want to say to people: if you want to work on privacy-preserving technologies - sometimes in other standardization bodies, or on things that are not quantum-safe - it is also great to have you thinking about that. It's okay, because we still need to incorporate more privacy into certain parts of the protocols.

  • Tim Callan

    Again, I think we could keep going. I have all sorts of questions popping into my mind, but somewhere along the line we have to let the listeners get back to their day. Sofía, I really, really hope you come back and talk to us more, because I think there's so much here and we've barely scratched the surface. It's a great introduction to the topic, and with luck we'll be able to come back and do a deeper dive. I want to thank you for joining us today. This has been a terrific conversation.