Podcast Sep 07, 2022

Root Causes 240: Hyundai Production Private Key Found in How-to Manual

A white hat researcher recently defeated a production automobile's PKI by searching for the private key on Google. Join us as we describe the implementation error making this possible and how it might have come about.

  • Original Broadcast Date: September 7, 2022

Episode Transcript

Lightly edited for flow and brevity.

  • Tim Callan

    So, what we wanna talk about today is, I would fairly say, a jaw-dropping article. Is jaw-dropping a fair description, Jay?

  • Jason Soroko

    I think so, Tim.

  • Tim Callan

    That we recently saw in The Register, dated Wednesday, August 17, and here’s the headline: “Software Developer Cracks Hyundai Car Security with Google Search.” Thomas Claburn is the author. Tell us what we learned from the article, Jay.

  • Jason Soroko

    Sure, Tim. I’m gonna keep this very high level, because as soon as you get into automotive security, automotive PKI, it gets really deep, real fast.

    So at the highest level, you can imagine that there’s software in your car. Just about every car right now has, in fact, a pile of software, within your dashboard but also within the engine management system, the computers that run your engine. Anyway, after a whole lot of black hat attacks, and after white hat researchers have shown that you can take control of cars because they’re just so darn computerized and also connected, and that the firmware running these embedded computers can be messed with - and by messed with I mean replaced with malicious firmware or attacked with all kinds of other techniques that the bad guys or the white hat researchers have - it has become common practice to Code Sign that software to make sure that it’s legit. The genuineness of the firmware on these computer systems is very, very important.

  • Tim Callan

    Not only does Code Signing show us that the source is legitimate - it’s who we think they are - but it can also help by making sure you know when the update was timestamped, so you know it’s the current update, and also that the bits in that update haven’t been changed in any way. So those are three very valuable things that come with Code Signing your firmware. But go on.
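
To make that concrete, here is a minimal sketch of the kind of check a firmware verifier performs before accepting an update. It is not Hyundai's actual mechanism; the file names (vendor_pub.pem, firmware.bin, firmware.sig) and the choice of an RSA key with PKCS#1 v1.5 padding are illustrative assumptions, and the timestamping piece Tim mentions is omitted.

```python
# Minimal sketch: verify a detached signature over a firmware image.
# Assumes an RSA public key in PEM form and PKCS#1 v1.5 / SHA-256 signatures.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

with open("vendor_pub.pem", "rb") as f:    # hypothetical vendor signing key (public half)
    public_key = serialization.load_pem_public_key(f.read())

with open("firmware.bin", "rb") as f:      # hypothetical update image
    firmware = f.read()
with open("firmware.sig", "rb") as f:      # hypothetical detached signature
    signature = f.read()

try:
    # Fails if even one bit of the image changed after signing,
    # or if it was signed with a key other than the vendor's.
    public_key.verify(signature, firmware, padding.PKCS1v15(), hashes.SHA256())
    print("Firmware signature is valid")
except InvalidSignature:
    print("Firmware was modified or signed by the wrong key")
```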

  • Jason Soroko

    So we have seen mistakes in implementation in the past, and in some of those mistakes only the header of the software on the embedded device was Code Signed, and the rest of it was free to be whatever it was going to be, and so the bad guys were able to modify the firmware while leaving the header in place. So if you go through the entire history of what the white hats have shown us, just about every mistake in the book has been made. Well, here’s another one, Tim. That’s what this article is about. It’s essentially another mistake in implementation, but it’s a doozy.

    Basically, a white hat researcher taking a look directly at the firmware of one of these embedded systems in a car said, I don’t know, I’ve seen this somewhere before. Once they saw a piece of this embedded software, they essentially did a Google search and determined that the private key that had signed the firmware came directly out of the example in a NIST manual, in other words, 800-38A, which is basically here’s-how-you-implement-PKI 101. It was literally straight out of that book. In other words, somebody at Hyundai - and believe me, I’m not picking on companies; computers are hard, Tim, we know this, but it was them, they were named in the article. First of all, there’s the wow moment of realizing somebody decided to take a published example from probably one of the most widely read pieces of guidance there is on how to perform PKI, and chose to use not just the technique but the exact private key example out of the book.

    It’s a demonstration of a fundamental misunderstanding of what that guidance was trying to give you. It wasn’t trying to say, here’s the private key everybody in the world should use. It was an example of what a private key looks like: don’t use this one, but this is what yours is going to look like. Somebody at Hyundai decided to use it literally.
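
Backing up to the header-only mistake Jason describes above, the weakness is easy to sketch: if the signature only covers a fixed-size header, everything after it can be swapped without detection. The 256-byte header length and the sample payloads below are illustrative assumptions, and a real verifier would check a signature over the digest rather than compare digests directly.

```python
# Minimal sketch: digesting only a header vs. digesting the whole image.
import hashlib

def digest_header_only(image: bytes, header_len: int = 256) -> bytes:
    # Weak input for a signature: bytes after the header are not covered.
    return hashlib.sha256(image[:header_len]).digest()

def digest_full_image(image: bytes) -> bytes:
    # Correct input for a signature: every byte of the firmware is covered.
    return hashlib.sha256(image).digest()

original = b"HDR" + b"\x00" * 253 + b"original payload"    # 256-byte header + payload
tampered = b"HDR" + b"\x00" * 253 + b"malicious payload"   # same header, altered payload

assert digest_header_only(original) == digest_header_only(tampered)  # tampering goes unnoticed
assert digest_full_image(original) != digest_full_image(tampered)    # tampering is caught
```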
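
As for the mistake at the heart of this episode, the alternative to copying a key pair out of a published document is simply to generate a fresh one. This is a generic sketch rather than Hyundai's or NIST's procedure; the RSA parameters are common defaults chosen for illustration, and in practice the private half would live in an HSM or a managed signing service, not in an unencrypted PEM blob as shown here.

```python
# Minimal sketch: generate a fresh signing key pair instead of reusing a published example.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)

# The public half can be published and baked into the vehicle's verifier.
public_pem = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)

# The private half should stay in protected storage; NoEncryption() here
# is only so the sketch remains self-contained.
private_pem = private_key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.NoEncryption(),
)

print(public_pem.decode())
```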

  • Tim Callan

    So somebody was developing software inside of Hyundai and went in and used the NIST guidance, and did not understand - I would contend - did not fundamentally understand what the public/private key exchange was or did or why we have it when they were working on the software. Or they would have realized that they can’t do this.

  • Jason Soroko

    You’re right. We’ve said umpteen times in this podcast, don’t roll your own crypto. That’s an old trope. We’ve also said things such as, partner with a good security vendor, especially in PKI. It’s not something you wanna host on your own unless you have a truly expert team in PKI, and very few companies have that. And so, if you’re implementing PKI, you don’t hand this off to a generalist developer who might not fully appreciate what PKI really and truly is. I think that’s an example of what’s happened here.

  • Tim Callan

    Or at least, somehow, there has to be something in the technology process for a system like this - or, if we want to frame this as an IoT device, for anyone in the IoT space. There’s gotta be something in your technology process where things like encryption and public keys go through some level of technical review, technical understanding, technical audit by the people inside of your organization who get this. And there absolutely are people at Hyundai who understand PKI. It just seems like this bit of software didn’t get under their eyeballs.

  • Jason Soroko

    That’s right, Tim, and obviously, the automotive industry is highly, highly integrated. There could be Tier 1 providers of these computers that could have also been involved, and this is why I’m not poking at Hyundai. It might not have even been one of their engineers.

  • Tim Callan

    It could’ve been an OEM. But even if it were an OEM, then there needs to be a process for ensuring that that’s correct too.

  • Jason Soroko

    And that’s an incredibly good point. The OEM, Hyundai in this case, probably dealing with a Tier 1 provider, needed to better understand, well, what is the chain of command here in terms of handing off the technologies. And there is even another point I want to make on that exact piece, which is so common in automotive: quite often a factory will provide what I like to call a transport key. Think of it almost like a birth certificate or a serial number, but it comes in the form of an asymmetric key pair.

    Quite often you would hook into that just to say, okay, that’s obviously a genuine part that comes from a specific factory, but then for the purposes of what you’re doing with PKI, in terms of encryption or signing or whatever it is you’re doing cryptographically, you provide your own key pair. And in this case, wow, did they ever miss that.
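
Here is a minimal conceptual sketch of that handoff, under stated assumptions: the function names (part_is_genuine, provision_operational_key), the ECDSA P-256 keys, and the challenge/response shape are hypothetical illustrations rather than an actual automotive provisioning protocol. The point is only that the factory-issued transport key proves provenance, while a separately generated key pair does the real cryptographic work.

```python
# Minimal sketch: factory "transport" key attests the part; operations use a separate key pair.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

def part_is_genuine(factory_pub, challenge: bytes, response_sig: bytes) -> bool:
    """Check a challenge signed by the part's factory-issued transport key."""
    try:
        factory_pub.verify(response_sig, challenge, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

def provision_operational_key():
    """Generate a separate key pair for code signing or encryption duties."""
    return ec.generate_private_key(ec.SECP256R1())

# Simulated usage: the transport key proves provenance, nothing more.
factory_priv = ec.generate_private_key(ec.SECP256R1())   # stands in for the factory's key
challenge = b"random-nonce-from-integrator"
response = factory_priv.sign(challenge, ec.ECDSA(hashes.SHA256()))

if part_is_genuine(factory_priv.public_key(), challenge, response):
    operational_key = provision_operational_key()        # never reuse the transport key for signing
```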

  • Tim Callan

    It’s true. That solves that, doesn’t it?

    And even if this were a public/private key pair that had not been published in a place where people could find it, at the end of the day you’re rolling forward with a firmware update system using a private key that was provided to you by an outside party, if this did come from an OEM. If this came from an OEM, then someone decided, we’re gonna use this private key that was given to me from outside my organization, and you shouldn’t do that either, because that now means your private key is exposed.

  • Jason Soroko

    That’s it. And in fact, it can be even more complicated than that, in the sense that quite often the underlying ARM chip may have its own key pair that’s created, which then gets embedded into a chipset, which then gets embedded into the automotive computer, which then gets embedded into a car. Tim, we might be talking about four to five providers for this individual piece of the engine management system.

  • Tim Callan

    You don’t know how far back it goes, and chips go way back. Chips go that far up the chain. So, in the event that something gets incorporated that far up the chain and it rolls through, you could see that coming at you. That’s why the IoT supply chain, especially for something complex like automotive, is so complicated.

  • Jason Soroko

    It’s incredibly complicated, and in fact, we have talked about that in the past. It’s something we haven’t talked about as recently, but in the automotive industry, Tim, I have a theory. And I could be totally, totally wrong. I think the chip shortage has made the automotive assemblers, the big companies like Hyundai, so darn desperate for chips wherever the heck they can get them that they may have just thrown out the QA. The process.

  • Tim Callan

    Is breaking.

  • Jason Soroko

    That process you were talking about, yes. I think that’s what’s happening.

  • Tim Callan

    But that’s dangerous, and that’s scary too, because you could have problems. I mean, this was some white hat finding it, dinking around with his own car, but what if we discover that the hard way when cars are going down the freeway?

  • Jason Soroko

    Well, this is the problem. I know people like Charlie Miller, and I’ll paraphrase something he has said, which is, look, when he was hacking cars, it wasn’t easy, and so it’s not gonna be the average hacker doing this, maliciously or otherwise. And I think that’s very, very true. On the other hand, if the supply chain process for security has broken down to this level, that’s something to think about, and this is an example that should raise some alarms: geez, what’s the state of the industry right now? Are the different car companies paying as close attention as they should? It’s a legitimate question.

  • Tim Callan

    It’s a legitimate question. Now, I’m gonna ask you one other question. I’m gonna pivot a little, which is to say, was it really important and valuable that, in the NIST guidance, their example public/private key pair was in there as a real key pair? Like, could the guidance simply have said, [insert key here]? Because that would have avoided this problem too.

  • Jason Soroko

    True, true. However, if I think of myself 25 years ago, when I was just learning this, seeing what the blob actually looked like, for realsies, listed by NIST, would have been really helpful. In other words, if you’re a truly down-in-the-weeds computer practitioner, especially in PKI, seeing what that private key really and truly looks like with your own eyes - it’s amazing how the human brain works. This isn’t so much a computing thing as it is a way-the-human-brain-works thing. The ability to pattern recognize based off of a literal example is powerful, and that’s why NIST did it, and I wouldn’t change 800-38A. I think they’ve done a good job. However, to your point, maybe they need to make a modification to the document and put a great big arrow and say, don’t use this. It’s just an example.

  • Tim Callan

    I haven’t looked at the actual document, so I can’t comment on whether there’s any explanation in there that says, don’t use these. And I get that they may very well say, we’re writing this document for people who know what they’re doing; if you don’t know what you’re doing, you have no business reading this document. Maybe that is a perfectly valid position to take. It just made me wonder if that was an avoidable error, or if it would be possible to publish what looked like a public and private key pair that wouldn’t work. That might still accomplish your objective, Jason, of knowing what the blob is supposed to look like. But if nothing else, what if they didn’t match each other? At that point, you wouldn’t actually allow somebody to make this error. Maybe there were ways that could have been done on that side, too.

  • Jason Soroko

    I won’t deny it. I think you could be absolutely correct that you could physically make it impossible to take an example out of the book. You’re right. However, the way a lot of computer scientists are, they’re pedantic, and for good reason, and so therefore it needs to be right on the nose. That’s what they’ve done. That is what they’ve chosen to do.

  • Tim Callan

    Flip a bit halfway through the public key, and you don’t have this problem anymore.

  • Jason Soroko

    But then it just becomes another example. So you can bit-flip all you like; the problem is it’s still a valid private key. You’re right, somebody more clever than me could maybe come up with a guardrail so that this couldn’t happen again. On the other hand, I’d like to think that just a big arrow saying, don’t use this, it’s just an example, might be a better way of doing it.
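
Tim's bit-flipping idea amounts to publishing a pair whose halves do not actually correspond, so nothing built on them could quietly work end to end. A minimal sketch of that property, using freshly generated stand-in keys rather than anything from a NIST document:

```python
# Minimal sketch: a deliberately mismatched public/private pair fails verification immediately.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

doc_private = ec.generate_private_key(ec.SECP256R1())                     # "example" private key
unrelated_public = ec.generate_private_key(ec.SECP256R1()).public_key()   # non-matching "example" public key

signature = doc_private.sign(b"firmware image", ec.ECDSA(hashes.SHA256()))

try:
    unrelated_public.verify(signature, b"firmware image", ec.ECDSA(hashes.SHA256()))
    print("pair works end to end")            # only possible with a matching pair
except InvalidSignature:
    print("example pair fails verification")  # copy-pasting it cannot silently work
```

As Jason notes, this does not stop the private half from being a perfectly valid key on its own; it only guarantees the published pair cannot be used together.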

  • Tim Callan

    Maybe there’s a lesson learned there, too. Maybe if you’re providing material to engineers, you could also give a little thought to the usability of that material and the ways it could lead to a pitfall. A little bit of thought to that might be another aha moment out of this. Anyway, very, very interesting article. Nothing either of us expected to see, and it was just like, wow.

  • Jason Soroko

    It could be a sign of the times due to shortages within the supply chain, but on the other hand, perhaps this is just a big red light to all you folks out there who are doing PKI - frankly, in any environment - these are mistakes that could easily be made by people who are otherwise well intentioned. So, Tim, you are dead right. There needs to be process. There need to be double-checks, because friends don’t let friends make mistakes like this.

  • Tim Callan

    This was surely some kind of process failure that should not have occurred. It’s not like engineers at Hyundai haven’t thought about this. They certainly have a process that they think prevents this kind of outcome, and it just didn’t happen in this case.

  • Jason Soroko

    Even their Tier 1 suppliers, Tim, are deeply experienced in PKI. At some point, somebody in the chain of process should have said, what’s going on here? Where did this private key come from?

  • Tim Callan

    Anyway. So there you go. There’s your gee-whiz moment for the day.