Podcast Dec 14, 2022

Root Causes 262: The Continuing Erosion of Online Identity

In one of our 2022 wrap-up episodes, we look back at the continued erosion of the idea of reliable online identity throughout the year. We discuss the rise of deep fakes, celebrity phishing, voice biometrics, AI-generated art, trust models, and the failure of Twitter blue checkmarks.

  • Original Broadcast Date: December 14, 2022

Episode Transcript

Lightly edited for flow and brevity.

  • Tim Callan

    So, we are doing our 2022 lookback episodes. It’s one of the things we try to do – we fit a few of these in at the end of every year to see what themes we noticed during the year. Sometimes these are themes we are expecting; I don’t think we really were expecting this one. But one of the interesting things we’ve seen in the last year – and it’s kind of a constellation of different factors that have driven toward it – is an increasing, I don’t know if progress is the right word, it’s the opposite of progress – an increasing destruction or erosion of the concept of a reliable online identity in the digital world.

    So, if you go back in time to, let’s say, the early days of social media, or even before then, the early days of the World Wide Web, I think there existed a great deal of trust that a certain online individual or entity or organization or company talking about itself, or speaking with the voice of that identity or company, really was real, was authentic. Now, obviously, there were some challenges to that, phishing being a big one, where someone pretends to be somebody they’re not to steal your money. But there was a pretty high degree of trust and reliance in people’s identities online. If I saw somebody with my friend’s name and my friend’s picture on a social media site, I didn’t really ask, “is that actually my friend?” And while this has been ongoing – it isn’t new in 2022 – I think in 2022 we saw a sudden uptick in the degree to which these particular ideas were being called into question. Do you agree?

  • Jason Soroko

    Enormously, Tim. We’ve seen it before, but I’ve never seen as much of it as I did in 2022.

  • Tim Callan

    And so, again, to some degree, this isn’t a new story, and we know that catfishing goes back years and years. That’s where you go on a social media site and pretend you are somebody you are not, for one reason or another, and oftentimes they are stealing pictures from real people, so people’s pictures are showing up on false identities and things like that. So, that isn’t new.

    But we saw this escalating in the last year, and I think there were a few things driving it that were kind of systematic. One of which was just the real progression of deep fake technology. It’s now so easy that really anybody can make a video or an audio file that seems to be somebody else talking, and that right there has cast a lot of doubt. I’ve brought this idea up on a few occasions, but very famously, in the 2022 election there was this video that reportedly was Joe Biden falling asleep at the G20 or G12 or one of these, and it just wasn’t real. Just plain wasn’t real. And that, I think, brought into the news something that existed and had existed for a while, which is that it is well within the ability of lots of people – not just highly trained professionals, but more or less anybody who is reasonably competent with a computer – to create these things. And so, as a result, it’s cast a lot of doubt on one of the fundamental things we thought we could rely on: this person’s picture, this person’s voice. Now, suddenly, those aren’t reliable.

  • Jason Soroko

    It used to be, Tim, everybody knew about Photoshop, and it was like, “oh, that photo has been photoshopped.” Well, this is that on incredible steroids, where, you are exactly right, you are talking about a person’s voice, a person’s persona, being able to say anything. And part of the problem is that a lot of these public figures have so much footage available, and so much really crystal-clear, good voice audio available. You throw that all into a machine learning mechanism, utilizing artificial intelligence, and have it spit out: hey, have that person say x, y and z, and model the mouth to match. In fact, Tim, not that long ago I was watching – I think we even talked about it – this ability to take old photographs, like 100-year-old black and white photographs of long-deceased relatives, and throw them into artificial intelligence. What they are doing is modeling against real, live people’s faces that are actually moving around in three dimensions, plotting that old image against that real three-dimensional face, having it move and animate itself, and then even adding a synthesized voice built from recordings of your family member, being able to create a false voice. I can tell you, Tim, it’s used for nostalgia purposes, but the point I’m trying to make is that software - -

  • Tim Callan

    It would have to be. Absolutely. So, I want to take this written account that my grandmother made about the first time she saw a car, and I want to have what appears to be my grandmother telling the story. I get that. Or you want to have Teddy Roosevelt giving some famous speech. Certainly we could do that, but why does it have to stop there? And, of course, the answer is it doesn’t, and we know how it goes. If there’s an exploit, someone is going to exploit it.

  • Jason Soroko

    Tim, I was watching the documentary Roadrunner, about Anthony Bourdain, and apparently there’s up to 30 seconds or so – I forget the exact number – of him speaking where it’s not him speaking. It’s something he wrote, and then artificial intelligence used a bank of all of his recorded voice to speak those words, and I can tell ya, unless you were told, you would never know that he did not record that himself.

  • Tim Callan

    And this is kind of a classic use case for machine learning – you already mentioned machine learning. I mean, the fundamental idea behind machine learning – we used to call this neural network programming back in the day – is that you take all the data, throw it into the system, and let the system figure it out, which, by the way, is how the human brain does it, too. You just let it run until it gets it right, and because computing power is very good and there’s lots of storage and lots of compute available, this works. At the end of the day, you can quite literally take a whole bunch of audio files of the same person, feed them into the machine, and what you get out the other side is something that sounds like that person. And you and I don’t have to know how to do it. The machine learning system learns.
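
As a rough illustration of the pattern described above – gather many recordings of one speaker and let a model iterate until it can reproduce them – here is a minimal Python sketch using PyTorch and torchaudio. The clip filenames and the tiny reconstruction model are hypothetical stand-ins, not how production voice-cloning systems are actually built.

```python
# Toy sketch only: learn to reproduce one speaker's spectrograms from a pile of
# their recordings. Filenames and model are illustrative placeholders.
import torch
import torch.nn as nn
import torchaudio

to_mel = torchaudio.transforms.MelSpectrogram(sample_rate=16000, n_mels=80)

def load_clip(path: str) -> torch.Tensor:
    wav, sr = torchaudio.load(path)                       # (channels, samples)
    wav = torchaudio.functional.resample(wav, sr, 16000)  # normalize sample rate
    return to_mel(wav.mean(dim=0, keepdim=True))          # (1, 80 mel bins, frames)

# Deliberately tiny stand-in model that learns to reconstruct this speaker's audio.
model = nn.Sequential(
    nn.Conv1d(80, 256, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv1d(256, 80, kernel_size=3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()

clips = [load_clip(p) for p in ["speaker_clip_01.wav", "speaker_clip_02.wav"]]
for epoch in range(100):               # "let it run until it gets it right"
    for spec in clips:
        output = model(spec)           # (1, 80, frames)
        loss = loss_fn(output, spec)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```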

  • Jason Soroko

    Tim, I can tell you how much this has become part of popular culture, even toward the end of 2022. Looking at an Instagram account, I’m now seeing various celebrities being really delighted to look at artificial intelligence-generated artwork of themselves. And with some of it, it’s not artwork where you can tell it has been fabricated, as an abstraction. Some of it is just so accurate that some of these actors, speaking about the outputted artwork, say “oh, I don’t remember being there.” It’s like, well, you weren’t.

    So, the point being, the software to do this has reached the point where it is now in popular culture; it’s available to everyone; and that’s what’s gonna make it difficult, Tim, as you said, to really distinguish who is who in terms of identity anymore.

  • Tim Callan

    Well, I’m reminded of something. This is a little orthogonal, but just a few months ago there was an article I read with great interest. I don’t have it in front of me, but someone had won a prize in a fine art competition. I think it was at a state fair, and there was a category for computer-augmented art – so you could take your photograph and retouch it in Photoshop, for instance, or whatever you want. And someone had won with this beautiful picture. You saw it online. It was gorgeous. It was wonderful. And it had been entirely AI-based: someone was using, basically, a program where you enter a nice big long description of what you want, and then it makes that thing and gives it to you. Then you refine what you are getting by refining your description, and this guy did that iteratively, over and over again, and wound up with this really beautiful thing that won the fine art category at the state fair. Looking at it, you would never in a million years guess that this wasn’t a painting by a very adept painter. So, in that case, it’s not someone’s voice, it’s not someone’s ID, but it’s the same idea. As these AI-based capabilities continue to grow, the distinction between what we would call real and authentic or hand-made and what we would call computer-generated becomes very difficult to make.

  • Jason Soroko

    Tim, do you remember back in the day when voice authentication was considered a thing? How quaint.

  • Tim Callan

    I know. You’d never do that now. Absolutely.

  • Jason Soroko

    No. Unfortunately, it’s still out there. It’s quite often used in the banking industry. So, all you folks out there who are protecting your bank accounts with voice biometrics, I’ve got news for you. Your voice is not a secret.

  • Tim Callan

    So, before we move on from this topic, you can check out our Episode 198 about deep voice fakes, because that’s definitely one of the themes, but it’s not the only one.

    Another one we’ve seen – and this one actually happened to me myself in a small way; I’m not a celebrity – is celebrity-based phishing or other kinds of cons. Scams. Social engineering attacks that appropriate the identity of somebody you believe you know and have some degree of trust for.

    So, we know that it was a big year for phishing attacks, and a lot of these were trying to steal your crypto wallet by purporting to include celebrity endorsements for services that those particular celebrities didn’t endorse, because the services didn’t really exist. And so, this whole idea of taking the identity of somebody you believe is real and attaching it to your skeevy activity, in order to get past the defenses and barriers we have in our minds, took off in 2022 in a pretty big way.

  • Jason Soroko

    It certainly did. Using somebody else’s authority and know-how. And, Tim, as you say, it doesn’t have to be somebody who is on the cover of People magazine. It can be a Tim Callan who is known in the - -

  • Tim Callan

    It can be Tim Callan.

  • Jason Soroko

    It can be somebody in the industry who is simply known as, hey, when Tim speaks, he’s an expert in the subject, so I’m gonna listen to what he has to say. Or if he is advertising something, I might go and take a look at what he is trying to sell me. It could be something totally unrelated to - -

  • Tim Callan

    Or if I’m receiving a communication that is supposedly from the company Sectigo, and I know that Tim Callan is associated with Sectigo because he has things like a podcast, then all of a sudden, attaching this identity to this communication – at least in the case of my criminals, they believed – would make it more likely to be accepted, and there you go.

  • Jason Soroko

    Confidence tricks are as old as time and this is a very modern version of it. Isn’t it, Tim?

  • Tim Callan

    It’s a modern update. So, check out our Episode 238 for that one, where we get into the details, and again, it was weird to have it happen to me personally, but this is a thing that’s happening to lots and lots of people, and I think if you are a genuine celebrity – certainly I’m not – then that sort of thing is probably normal and expected to some degree at this point.

  • Jason Soroko

    I think so. Tim, I think in terms of another topic that we covered a little earlier this year, Twitter really did highlight - - you know where I’m going with that.

  • Tim Callan

    Can’t miss the blue checkmarks. Go ahead. Talk about the blue checkmarks.

  • Jason Soroko

    The blue checkmark. Of course, everybody wanted a blue checkmark back in the day because it kind of meant, well, somebody from Twitter recognized me as being a somebody. Maybe a celebrity. Somebody who was influential. And all of a sudden, there was this scheme from Elon Musk: give me 8 bucks and I’ll give you a blue checkmark. Well, all of a sudden, you had Mickey Mouse, Chiquita Banana – everybody was whoever they wanted to be. They just had to pay their 8 bucks, and there was no provisioning part to it. So, Tim, I think you are gonna see more of this.

  • Tim Callan

    Actually, we did a whole episode just on that, too, pretty recently, so that’s one worth listening to as well. Making an authoritative statement about who somebody is – an individual or an organization – there is a lot that goes into that. There’s a lot of learning, there’s a lot of craft, there’s a lot of ways people try to trick you, and there are a lot of years’ worth of discovery and refinement behind the techniques and data sources and methods. We saw how really critical those things are, and you can kind of see the difference: if you open up a certificate, an EV or an OV certificate, and you look at the name of the company, that’s a very reliable piece of information. On the other hand, if you just saw a Twitter checkmark during that period when they were up – those three days or whatever it was – it was worth pretty much zero in terms of the degree of trust you could give it.
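
To make the certificate point concrete: this is a minimal sketch, assuming Python’s standard ssl module and the cryptography package, of what “open up an EV or OV certificate and look at the name of the company” amounts to in practice. The hostname is just an arbitrary example.

```python
# Illustrative sketch: fetch a site's certificate and read the vetted
# Organization (O) field from its subject. The hostname is an arbitrary example.
import ssl
from cryptography import x509
from cryptography.x509.oid import NameOID

pem = ssl.get_server_certificate(("www.example.com", 443))  # leaf certificate, PEM
cert = x509.load_pem_x509_certificate(pem.encode())

org = cert.subject.get_attributes_for_oid(NameOID.ORGANIZATION_NAME)
if org:
    # Present in OV/EV certificates, where a CA has verified the company name.
    print("Organization:", org[0].value)
else:
    print("No organization in subject (likely a DV certificate).")
```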

  • Jason Soroko

    Tim, I’m just thinking right now about how that would even affect me and how I’ve changed my attitudes, my trust toward things. A few years ago, if I saw the Associated Press give a news report that the White House just blew up because aliens came down, I would have thought, oh geez, maybe the site has been hacked, and I should worry about that. Right now, it’s like, no, I bet you that’s a spoof site. There’s that word, spoof. Additionally, you could even think about other types of things, such as moving the stock market with very well-known, publicly traded companies announcing material changes to their, you know - -

  • Tim Callan

    In what are essentially false communications that aren’t really communications from that company. They just pretend to be, but they have real-world effects on those companies. By the way, that’s our Episode 259 – I looked that one up – What Went Wrong with Twitter Blue Checkmarks. But that’s an example of a communication that’s basically false, and sometimes, like with the Twitter blue checkmarks, I think a lot of those were just done by people who were trying to make a point, so you could either call them social commentators or maybe pranksters. I don’t think they were necessarily people seeking to gain advantage. But if you had gotten the Twitter blue checkmark for a major company, and you wanted to short their stock, and you shorted their stock and then put out some kind of announcement saying that everything was going to hell, you could actually use that for stock manipulation in a very real way. So, all that kind of stuff becomes available as we run into this idea of deteriorating trust that an online or digital actor or entity is what it represents itself to be, or what it appears to be.

  • Jason Soroko

    Tim, the only thought I have left, with everything you’ve just mentioned here, is that I now have less trust in everything – just like a lot of us had less trust in images because of Photoshop. Now we’ve got to deal with artificial intelligence and the fact that it’s just so darn easy to declare yourself to be a known entity. I think part of it, in 2023 and beyond, is gonna be that ability to provision each other. You and I, Tim, often use that Alice and Bob example: if we want to do business together, we have to do x, y, z in order to be able to verify each other. That skill just becomes that much more important, and so does the ability to verify even the things that we do trust, to make sure we are not looking at a deep fake – not just something that’s identity blinded, but content you can now consume from these bad guys, where the motivations are enormous. What I’m worried about, Tim, is that there probably will be entirely new ways of us being tricked in 2023, and we won’t be ready for them. And so, that’s what I’m looking at in the future.

  • Tim Callan

    And I think one of the things you see with social engineering in general is that humans are incredibly contextual in their decision making, and so somebody who is trained perfectly well not to fall for a scam in one context can fall for what is structurally the exact same scam if you change up the context. That’s a lot of what we’ve seen over the years: new context, new context. We all started to learn not to fall for phishing attacks, then all of a sudden we started getting text-based messages and we fell for them. Or we weren’t falling for email-based phishing attacks, and then we started to get things on our social media sites and we fell for those, because the context is different. So that’s absolutely something, and the people who make their living by creating and operating social engineering attacks are going to try everything they can think of, and where it works they are going to increase it and enhance it and double down on it and keep doing it. And so, yes, one wonders how certain technologies can help. Where do NFTs, or blockchain in general, fit in to help with this? Where does PKI fit in to help with this? Where do well-understood authentication practices, which have been exhibited and practiced in certain industries for decades, fit in and help with this? There is an opportunity for methods and technology to actually go a long way in solving these problems. We are just so very much not there yet. And maybe we all need to get there.

  • Jason Soroko

    You are absolutely right.

  • Tim Callan

    Like, as a series of computing platforms, and as an ecosystem, and as a society, maybe we need to get there. And I predict that in 2023, like you said, we will see more of this kind of stuff, and the pressure on it will increase. The number of victims will go up. The number of high-visibility attacks will go up. The variety of attacks will go up. This will probably go on this way for some years, and as that happens, the pressure will increase to come up with technology-enabled solutions to these kinds of problems.

  • Jason Soroko

    Tim, think about operating systems where the files themselves are identified by their public key, and that can chain up to a trusted source. These are things that we’ve heard about. If you are at the absolute cutting edge of Web3, for lack of a better term, a lot of the building blocks for what you just said will be the solution to this are inside of Web3, but they’re just not in the hands of everybody yet. They’re in the hands of extreme technologists right now and just waiting to get out there.
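
As a minimal sketch of the building block Jason is gesturing at – content that can be verified against a public key you already trust – here is an illustrative example using Ed25519 from the Python cryptography package. The key pair and file contents are placeholders; a real system would ship the trusted public key (or a chain leading to it) with the platform.

```python
# Illustrative sketch: verify that some content was signed by a key we trust.
# Keys and data are placeholders generated on the spot for the demo.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()       # producer's key (placeholder)
trusted_public_key = private_key.public_key()    # what the verifier would hold

data = b"example file contents"                  # stand-in for a real file's bytes
signature = private_key.sign(data)               # producer signs the content

try:
    trusted_public_key.verify(signature, data)   # consumer checks the signature
    print("Content verified against trusted key")
except InvalidSignature:
    print("Content rejected: signature does not match trusted key")
```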

  • Tim Callan

    Just gotta get there. And for it to work widely, it’s really gotta get to the point where it’s consumable by nearly everybody. That means it has to be built into your device. It has to be built into the sites and services that you use. It has to be unambiguous from a user experience perspective. That’s how we get there, and that’s a lot of work – especially with an incredibly complex ecosystem of actors and systems and devices and services, and a global, multilingual, multicultural set of consumers. That’s a lot of work. Ultimately, I predict that the pressure on this kind of thing will increase year over year for the years to come, and with it the pressure to build better technology-enabled solutions to at least cut down on our ability to be victimized by this kind of thing.

  • Jason Soroko

    Alright, Tim. Well said. I hate making hard promises, but what I really would like to do with you, Tim, in 2023 is to start to flesh out and expose what’s out there in terms of solving some of these problems, because the building blocks and the bones for it – some of it is there. It just needs to be better understood. It needs to be productized, needs to be put into everybody’s hands. But I’d like to talk about what it is, what some of the really cool ideas are that are completely orthogonal to what we are used to. And it all comes down to digital identity, Tim. That’s the central theme here.

  • Tim Callan

    I agree. I think that’s a great forward-looking theme for us. I’m certain we are gonna be returning to this topic, so let’s make sure we do that.