From Deep Fakes to Biometrics: Aaron Painter on the Evolution of Identity Verification
About This Episode
In today's discussion, hosts Vince Spina and Rachael Lyon are joined by Aaron Painter, CEO of Nametag, to delve into the evolving complexities of identity verification and cybersecurity. We'll explore the limitations of current Multi-Factor Authentication (MFA) solutions, with a spotlight on the high-profile MGM attack in 2023, where social engineering compromised IT help desks.
Aaron shares insights on alternatives like biometric authentication, the importance of user provisioning and recovery processes, and the balance between security and user experience. We'll also discuss privacy concerns, innovative consent practices, and the daunting challenges posed by deepfakes.
Rachael Lyon:
Welcome to To The Point cybersecurity podcast. Each week, join Vince Spina and Rachael Lyon to explore the latest in global cybersecurity news, trending topics, and cyber industry initiatives impacting businesses, governments, and our way of life. Now let's get to the point.
Rachael Lyon:
Hello, everyone. Welcome to this week's episode of To the Point podcast. I'm Rachael Lyon, here with my co-host, Vince Spina. Vince, how are you doing, my friend? You've got a good trip coming up.
Vince Spina:
I do.
Rachael Lyon:
Are you excited?
Vince Spina:
I am. I'm on my way to the Middle East and, yeah, very excited. I haven't ever been to that part of the world, to Saudi Arabia. But it's been a while since I've been to the Middle East, so I'm very excited about that.
Rachael Lyon:
And you're gonna be speaking on, I think it's a stage in the round.
Vince Spina:
In the round.
Rachael Lyon:
It's a very exciting opportunity there.
Vince Spina:
Yes. I've been practicing in my bedroom. Just walking around
Rachael Lyon:
a little bit,
Vince Spina:
making sure I talk to all 4 walls. Nice.
Rachael Lyon:
It's gonna be awesome. I look forward to seeing the video once it's ready. So please welcome to the podcast, everyone, Aaron Painter. He's the CEO of Nametag, the world's first identity verification platform designed to safeguard accounts against impersonators and AI-generated deepfakes. Aaron, I am so excited to have you on the podcast. Welcome. And please share a little bit more with our listeners about how Nametag came about, because I think it's just such an interesting story.
Aaron Painter:
Thank you, Rachael and Vince. I'm a big fan too, and I'm excited to be here. You know, sometimes we all have these deeply personal situations, and they lead us to ask, gosh, there must be a better way. And for me, it was the start of the pandemic. I had spent most of the last 20 years living outside the US and had just moved back. I had left my last job running a company based in the UK, and, you know, everything was moving digital, we were all getting into this hunker-down mode, and branches were closed, things were closed physically. And then suddenly I had this wave of friends and family members who had their identity stolen.
Aaron Painter:
And I was like, gosh, what is going on? It feels like the whole world's falling apart. So I thought, this we're gonna fix. We're gonna jump on the phone. I'm gonna be a good friend, I'm gonna be a good son, we're gonna figure this stuff out. And we started calling these help desk and customer support lines, and everyone we called had a very similar and probably familiar experience, which is, well, before I can help you, I need to ask you some, quote, security questions. And those questions were often either wildly easy or so bizarre that no one could answer them.
Aaron Painter:
And it turned out someone had called before we did, and they knew the answers to those questions, and they answered them, and they took over different people's accounts. And it felt like this wave was starting, but I realized it had been around for a long time. You know, this concept that no matter how protected an account is, for example, when you open a new bank account, they might say, we need to go through KYC, know your customer, anti-money-laundering protections, let's verify your identity, and they have ways of doing that. But yet when you call to transact, when you call to access your account, no one's relying on or trusting that initial method. And so I asked, why is it that we go from this, in theory, more secure method to something that's less secure at the moment when it really matters? And that led to this whole journey of exploration and realizing that the technology that had been built to verify who someone was was really built for a different era. It was built for these regulatory, check-the-box sorts of environments, and it wasn't able to protect against what we then called digital manipulation and what you would now think of as deepfakes.
Rachael Lyon:
That's crazy.
Vince Spina:
Yeah. So, hey, Aaron, let's jump in. You were kind of talking about technology of the past, and what probably comes to mind is that most enterprises today are using multi-factor authentication, or MFA. And for basic types of attacks, it's a decent technology, but the attacks are getting much more sophisticated, and these sophisticated attacks can get around a lot of the weaknesses of multi-factor. Can we talk about what some of those are?
Aaron Painter:
Yeah. MFA is necessary for sure. I mean, I'm an MFA super-addict. Every account that offers MFA, I turn it on. The authenticator app on my phone is just wildly full. It's almost overwhelming. MFA is critical. If it's an option, you should turn it on.
Aaron Painter:
The problem is that MFA is not sufficient, because MFA is really only as strong as the reset or recovery process that you have in place around it. As a practical example: after the Silicon Valley Bank issue, as a tech company, we banked at Silicon Valley Bank, I said, okay, we're gonna find a new bank now, a bank that really is committed to security. And I called around to basically every bank and the different commercial banking platforms. And one of them was very excited. They're like, we are the most secure.
Aaron Painter:
You'll love this. We offer YubiKeys to our banking customers. I said, oh, that's great, that feels very progressive. Well, what happens if I lose my YubiKey? They said, don't worry, we just send you an SMS text message. So what's the point? Because if I can just call, or someone can call, and say, well, I lost my YubiKey, and they say, we'll send you a text message, then the YubiKey is really theatrics. It's really only about the SMS verification, because that's the actual layer of protection that's happening when someone claims that they're locked out.
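A sketch of the weakest-link principle Aaron is describing may help. Assuming a toy scoring model (the method names and strength ranks below are illustrative, not from any standard or product), the effective assurance of an account is set by the weakest enabled path, including recovery:

```python
# Toy model: every enabled login or reset path is a way into the account,
# so an attacker simply picks the weakest one.
STRENGTH = {  # illustrative ranks only; higher = harder to defeat
    "security_questions": 1,
    "sms_otp": 2,
    "totp_app": 3,
    "hardware_key": 4,
}

def weakest_path(primary: list[str], recovery: list[str]) -> str:
    """Return the method an attacker would target across all paths."""
    return min(primary + recovery, key=STRENGTH.__getitem__)

# The bank in Aaron's story: YubiKey login, but SMS fallback if you "lose" it.
print(weakest_path(primary=["hardware_key"], recovery=["sms_otp"]))  # sms_otp
```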
Aaron Painter:
And so the concern with MFA today, and oddly, it's not necessarily new. Bad actors have more sophisticated techniques, but in many ways, it's a very traditional route. Today, we would call it social engineering. But in the olden times, we might just call it being a con artist. Like, literally on a website, there was a button for a while that said, forgot my password. Alright, you know, locked out of my password. Great.
Aaron Painter:
You click it, and it would email your account a temporary password. When we added MFA to accounts, that button to unlock yourself went away. And so if you are locked out of MFA as a customer, or as an employee in a workforce situation, the only option you have is basically to call the help desk and say, I'm locked out. And the only tools these poor help desk reps have are typically security questions, or sometimes now, trying to be more advanced, they say, well, let's launch into a video call, a Zoom or a Teams call, and then a whole process ensues, which, by the way, has its own set of risks associated with it. And so even when a process does exist, it's incredibly time consuming, cumbersome. Everyone gets frustrated. It's very expensive, it turns out.
Aaron Painter:
It's also been fascinating for me to learn that 50% of calls to a help desk are access related. They are people who are locked out and need to somehow recover access to their account. So this isn't an obscure thing. It's really the primary reason people are calling for help, and we haven't equipped agents to do it, and we haven't traditionally put in more automated ways so that you don't have to make that call in the first place.
Rachael Lyon:
I think that's a really interesting point. I mean, there are obvious vulnerabilities here with the SMS, right, the email-based and phone call verification. Why are people still using this? Why are they still relying on it? Is it just because we've always used it, so why change it? Is it too hard to change? Why, when we know that this is a big problem?
Aaron Painter:
You know, interestingly, it's really the MFA providers themselves, many of them partners who we love working with. Those are the options inherent in their products today. Right? When you wanna turn on recovery in one of the leading MFA solutions, the options are SMS or use a personal email. Or, interestingly, some of them say, well, you can do things like a face check, but the face check requires you to have the same device. So if you've lost your device, or upgraded your device, or gotten a new device, well, then Microsoft's Face Check model for Entra actually doesn't apply. And so the default standards in these tools today are wildly primitive, and frankly, that's what bad actors have taken note of. This was on the radar of really forward-minded security professionals early last year and so forth, but August was really the breakaway moment. You know, August of 2023, with the MGM attack.
Aaron Painter:
Poor MGM, they are now the poster child for this, but when 60 Minutes does an episode about it, it gets mainstream. And poor MGM, I mean, this is what happened: a bad actor called the IT help desk, pretended to be an employee who was locked out of MFA, and 8 minutes later was able to get their MFA reset and deposit ransomware in the company, and MGM was offline for 2 weeks. And it's just been this epidemic in the months since. I mean, Q4 last year alone, something like 230 enterprises were attacked the same way, and then 2024 has just been wild. Companies of all sizes, particularly industries like health care. We've had a bevy of government warnings, HHS, FBI, pick your government agency, saying, you don't understand, health care is being attacked at the IT help desk, for both patient accounts and practitioner accounts. This is just the method for bad actors to take over an account and then, typically, to deposit ransomware and cause even more harm.
Vince Spina:
Hey, Aaron. I just wanna maybe double-click. I've been listening to you talk, and we're talking about some alternatives to MFA. You talked about facial recognition, so biometric authentication. Earlier, you talked about a YubiKey, you know, hardware security keys. Are you a believer in those types of technologies, or are there other ones, like contextual authentication or adaptive authentication? If you're trying to go beyond MFA, what are some of the more advanced technologies that you would endorse for our listeners?
Aaron Painter:
You really need to think about the before and after, sort of the surroundings of that concept of MFA. So let's say it's hardware based, it's YubiKeys, and you're setting that up, you're issuing those to your users, or you're setting up adaptive MFA. Great. That's an important step, sort of the most extreme version of standard MFA today. All critical things you need to do. The thing to think through is, well, how am I provisioning that user, and how am I recovering that user? That's what I mean by the surroundings: setting that person up, and what to do if they're locked out. So, you know, I worked at Microsoft for 14 years. I started in Seattle, in Redmond.
Aaron Painter:
And, actually, when I started, the first thing I had to do was go to a physical office, a security office, and I showed them my ID. Some security person interviewed me. They gave me an access card, and they literally said, this is how you will access the network. That was great 20 years ago. But in a world where we have remote workers, where a lot of us aren't going to the office and may never go into the office, and where, as a customer of a company's platform, you're probably never going to see someone in person, the provisioning element in particular is very risky from a security perspective. You know, the way we often hire people: HR says, okay, great, IT, go set up their user credentials, and then IT either hands back to HR a temporary password or emails someone's personal account and says, welcome to the company.
Aaron Painter:
You know, whatever personal email address you use, these are gonna be your access credentials. It's kind of scary. All the infrastructure you put in place for authentication and MFA might be fantastic. But let's say someone's committed hiring fraud, which we see a lot of, and they're preparing to outsource their job, or someone is impersonating a more established professional to apply for that job, and they go through a Zoom-based interview process, and, oh, I'm not feeling well today, or my camera's broken, I'll be off camera. Companies hire someone in the US, and they don't do an I-9 verification until 3 to 5 days later. We see a lot of day 1 incidents: the new employee shows up, gets access credentials, and they steal a bunch of IP and disappear. Wow. You don't know who you hired.
Aaron Painter:
That's provisioning risk. So as good as MFA is, it's actually the chasm, the little crevices between, let's say, HR and IT that get exploited. And you say, well, HR is supposed to check your ID at some point, typically after you've been hired, not even in the interview or candidacy process or the offer letter process. IT doesn't really do that. And so it's just an easy exploit. And then the same thing happens at recovery. It's this moment of, well, I'm locked out. Okay.
Aaron Painter:
There are so many scenarios we hear: my home base office is 2 hours away, I'll drive to the base office. Or, some way, you have to be able to go through that same high-fidelity, high-assurance method of making sure the right person's getting back in. And so we could go through a little bit more detail, that was your core question, but to me, the answer isn't individual biometrics on their own, and it isn't contextual behavioral patterns. Biometrics are great.
Aaron Painter:
Face ID, for example, is great. However, Face ID knows that you are the same face that enrolled in Face ID, but Apple doesn't actually know whose face that is. So the classic thing: last February, I think there was a Wall Street Journal article about a dad in Florida, and he said, I'm locked out of my iCloud account. And he called Apple. He offered to fly to Cupertino. He offered $10,000 in cash. He's like, look, I just want my family photos in my iCloud account. I have Face ID set up.
Aaron Painter:
Apple, you know it's me. And they said, well, no. With your iCloud account, we don't know who you are. Yes, Face ID is the same face you enrolled on the phone, but we don't know whose face that is. And so it's that concept of linking, of binding biometrics to forms of government-issued ID or other methods, so you'd know. You need to know who that face is that you're enrolling in some kind of face check, or whose behavior patterns you're watching if you're looking at behavioral biometrics.
Aaron Painter:
You need to know who that person is before you can actually monitor for the changes.
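To picture the distinction Aaron draws: a device biometric proves "same enrollee as last time," while ID-bound verification answers "whose face is this." A minimal sketch, with every class and field name hypothetical and string equality standing in for real biometric matching:

```python
from dataclasses import dataclass

@dataclass
class DeviceBiometric:
    """Face ID-style check: same enrollee as before, identity unknown."""
    enrolled_template: str  # opaque template captured at enrollment

    def same_enrollee(self, live_face: str) -> bool:
        return live_face == self.enrolled_template  # stand-in for matching

@dataclass
class IdBoundIdentity:
    """ID-bound check: the template is tied to a government-issued document."""
    full_name: str
    document_number: str
    portrait_template: str  # face extracted from the ID's photo

    def who_is_this(self, live_face: str) -> str | None:
        # Only answer "who" when the live face matches the ID portrait.
        return self.full_name if live_face == self.portrait_template else None

device = DeviceBiometric(enrolled_template="face-123")
print(device.same_enrollee("face-123"))  # True, but Apple-style: name unknown

identity = IdBoundIdentity("Jane Doe", "D1234567", portrait_template="face-123")
print(identity.who_is_this("face-123"))  # "Jane Doe": biometric bound to an ID
```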
Rachael Lyon:
I get really frustrated with the whole verification process, I'm not gonna lie. I mean, I put off MFA for as long as I can, and that's as somebody in cyber; I shouldn't even say that out loud. But I love the Face ID. Are we ever gonna get to a place where it's actually simpler? You know what I mean? I've been watching Altered Carbon a lot, so maybe that's why I'm there, but there they use DNA: you just kind of put some saliva on a screen and, oh, we've verified who you are. Are we ever gonna get to a place, and I think about this both within organizations and as a regular person, where it's gonna get simpler for us, just one little thing, and they can definitively know we are who we say we are?
Vince Spina:
So you keep the security aspects, but you take the friction out of
Rachael Lyon:
Exactly.
Vince Spina:
The equation, which we're all frustrated with. Yeah.
Aaron Painter:
That is the balance. That is the thing we all have to weigh: how much friction can I put in place, and how much security value do I need? And then increasingly, it's really, what are the privacy levels, or risk tolerance levels, that my organization might have? And that's also a really big part of it, because, take your Altered Carbon DNA-saliva example. I'll tell you, all the enterprises we work with, very big companies, they would ask, well, do I really wanna be collecting my employees' saliva? What's the consent for that? How am I gonna store the data behind the saliva? Am I running the analysis on the DNA? Am I holding their DNA then, as an employer? All those sorts of things.
Rachael Lyon:
Definitely.
Aaron Painter:
They're just complicated realities. You know, we see it even in the simpler world of someone's face, or the photo of a government ID, and we have some really thoughtful and kind of progressive solutions to that stuff, but it is adding privacy into the mix. And increasingly, states, and some national governments around the world, less so at the national level in the US, but certainly at the state level, are putting priorities in place around this, focused on giving end users rights to protect their privacy that hinge on these issues. So the tools we often wanna use to increase security and decrease friction could have privacy trade-offs if they're not thought through fully.
Rachael Lyon:
Yeah. Absolutely. Well, I'd be happy to be part of a pilot program to test that if it happens. I'll give it up.
Vince Spina:
And I think I'm on the opposite side of that equation. That's another podcast in the future, where we can spend an hour on that.
Rachael Lyon:
Uh-huh. I don't know. I get caught up in these shows because they're just so fascinating, and it just seems like a simpler world sometimes, but it's also, like, a 100 years in the future. So
Vince Spina:
Aaron, where do you land on that? I mean, you're both a person and a cyber expert, having to tackle all these kinds of issues that you and your customers are seeing. As a person and an expert, where do you kind of lie on that privacy versus security
Aaron Painter:
aspect? I don't think we get much of a trade-off. And so I hinge very heavily on privacy. Personally, I care enormously about privacy. In the earliest days of our company, we were kind of like, well, we're a privacy company. And then we realized, well, privacy is just an ingredient that has to be there. Okay, we're really more a security company that cares deeply about privacy. And it's funny, we are incredibly forward-thinking in so many of our approaches.
Aaron Painter:
And it's also probably our biggest point of complexity, because we're so progressive on things the average person on a privacy team hasn't seen. We go through an enterprise sales process, let's say, and security loves us, workplace experience loves us, legal loves us, and then you get to privacy and they're like, we haven't seen this kind of stuff. This is all new. What do you mean you're only opt-in based? What do you mean you have multilayered consent, where a person explicitly agrees? They're just a lot of new concepts that we think are the right things in the right direction. But being very candid, one of the hardest things we have to deal with is explaining a lot of the privacy-forward technologies that we've built into our product. Part of that has to do with consent, and so many of the laws you're seeing are fundamentally about consent. And this to me is not, oh, accept all cookies when I load a web page, where you try to find the "no" button and they send you away, or it's buried in a small corner.
Aaron Painter:
This to me is: hey, you are about to do this. Are you okay with it? Hey, someone's about to use your photo for this. Are you sure you're okay? Hey, you were okay last week, but now you're doing something different. Are you sure you're okay with this? Or, let's say, hey, you scanned your ID. One of the features we call privacy masking: you scanned your ID in our experience, but that doesn't mean the person who's asking for information about you needs everything on your ID. Maybe they just need to know you're human, or maybe they just need to know you're over 21 or over 18. Why overshare? Like, when you go to a bar today in the US, you need to be 21 and show a bouncer at the door. Why do they need your home address?
Rachael Lyon:
Right.
Aaron Painter:
Like, that's just not part of the equation. It's kinda creepy. Yeah. And so we're trying to take those sorts of practices and bring them into the digital domain. And frankly, when you're reinventing how this stuff is done, you can be a lot more thoughtful about it. At least that's been our approach.
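The privacy-masking idea, answering only the question asked instead of handing over the whole ID, can be sketched as a claims filter. All field names here are hypothetical; this is an illustration, not Nametag's actual implementation:

```python
from datetime import date

# Hypothetical record parsed from a scanned ID.
ID_RECORD = {
    "name": "Jane Doe",
    "dob": date(1990, 6, 1),
    "address": "123 Main St",
    "document_valid": True,
}

def disclose(record: dict, requested: list[str], consented: bool) -> dict:
    """Release only the claims the verifier asked for, and only with consent."""
    if not consented:
        return {}
    claims = {}
    if "is_over_21" in requested:
        today, dob = date.today(), record["dob"]
        # Derived boolean: the birthdate itself is never shared.
        age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
        claims["is_over_21"] = age >= 21
    if "is_real_person" in requested:
        claims["is_real_person"] = record["document_valid"]
    return claims  # name and home address never leave the record

# The bar learns one boolean, not who you are or where you live.
print(disclose(ID_RECORD, ["is_over_21"], consented=True))
```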
Vince Spina:
Yeah. Great.
Rachael Lyon:
It's interesting you talk about consent, because I'm thinking about some of the airlines now, when you go and you show your face. Have you been through that, where you look into the camera and it verifies your facial identity in order to get on the plane, versus scanning other things? I don't recall opting into that. But I guess I had to at some point, and I just don't recall how it came about. I'm like, oh, how does it even know my face, Aaron? It was bizarre to me. What's your perspective there? Do you have any background on how that came about?
Aaron Painter:
Yeah. And you're seeing the travel experience in many countries, in particular in the US, really leaning heavily on that. Right, with the TSA in particular now, you go up to the TSA agent at the entry of the airport security environment, and they take your photo. There's a big sign: this photo will not be kept. Well, it turns out, I think Vox did an investigative report, a lot of times that photo might be kept. By the way, there might be good reasons for that photo to be kept, but they're setting a market expectation that the photo should never be kept.
Aaron Painter:
Okay, fine. But in those scenarios, you have a choice, but it's kind of like choosing not to go through the metal detector. They are set up to make your life difficult. Right? You have a choice in the US, you don't in many other countries, but you might not want to exercise it. Where it gets interesting to me is end user experience.
Aaron Painter:
Like, this is a very real-life scenario today. If you use HubSpot as a platform: HubSpot for a long time had the same challenge we were talking about, which is, I'm locked out, okay, call the help desk. Users were frustrated, and help desk reps were worried they were gonna let the wrong person back into someone's HubSpot account. So HubSpot implemented this in a really creative way, which they now have in their product on the login page. There's a button that says, I'm having trouble accessing my account, and it gives you 2 options with great clarity. It says, okay, great, you can contact support, click here, but it will take you 48 to 72 hours. Or you can get back in in an automated way that might use biometrics and other things, and you'll scan your ID with your consent, but you get right back in.
Aaron Painter:
Your choice.
Rachael Lyon:
Nice.
Aaron Painter:
And so the majority of people, shall we say, certainly say, I'm gonna go the express route. But you know what? There are some people who might not be comfortable with that, or might not have the right kind of phone, or might be in an environment or a country where that doesn't work, whatever it might be. Usually single-digit percentages, but they're able to make that conscious choice. And imagine if you went to the TSA and there was a line like, hey, I'm prepared to use all this new stuff, and there's a line that says, I'm not. Today, you know, there's a PreCheck and a non-PreCheck line, but they're all using the same tech.
Rachael Lyon:
And I gotta say, I love CLEAR, just to get that out. CLEAR plus PreCheck. I can't advocate enough for that.
Vince Spina:
And Global Entry. Hey, if we could just shift gears here now, I wanted to get your opinion on things like deepfakes, Aaron. Artificial intelligence has been around for 20 years, but in the last 2 years, it has just absolutely exploded. And a byproduct of that is you're seeing a lot more deepfakes, and those deepfakes are getting really, really good. As an expert, can you talk about some of the technologies that bad actors are using to manipulate videos and images and audio files, things like that?
Aaron Painter:
Yeah, it's a great topic. I mean, gen AI in particular has brought back the AI and ML discussions that we've had for decades. You know, when you go to the academic conferences, people are like, we weren't the cool kids, and now we're the cool kids in AI. So it's been around for a while, in a way. The introduction of transformers and true gen AI models, like we live with in ChatGPT, has really been the wake-up call for many to see what's possible. Unfortunately, though, that same technology has enabled bad actors, as new technology often does, to do really advanced things in really easy ways. Take the term deepfake: around 2017, 2018, there was a user on Reddit who started posting adult content that created the likeness of a celebrity, and the term, and the Reddit handle, related to deepfakes came from there.
Aaron Painter:
And that's kind of where the term as we know it came about. It was about imagery, it was about celebrities, and, you know, really about adult content. Since then, we've extrapolated that term to mean a lot of different things, and it can mean, like you said, Vince, audio, video, a variety of things, typically used to impersonate someone else. There are the social media implications of that, you know, the thing with Elon Musk a few weeks ago, all these versions of Elon Musk promoting different investment schemes, and people investing in them and losing money, because someone created a likeness of Elon. There's adult content, teenagers in South Korea harassing each other and making deepfakes of each other, all the way to where we really see it, which is bad actors using deepfakes to impersonate rightful account owners and take over their accounts. The easiest level of that to understand, I think, is voice, and to your point on the technology evolution: a few years ago, 5-ish, if you wanted to create a voice model of someone, there was software, and you would read a script with a set of words. You'd read it for 20 minutes, and you would train the model on what your voice was like, and then it might be able to replicate elements of you. Today, Microsoft researchers have gotten that down to 3 seconds of someone's audio.
Aaron Painter:
And so you can take someone's podcast, someone's social media post, someone sending you a voice message on WhatsApp, whatever it might be, and you can use that to create a voice replication of them, and then have it say basically whatever you want, sounding like them. Now, the quality gets better with more than 3 seconds, obviously. But what's interesting is that for a lot of deepfakes, you don't actually need wildly high fidelity, either to fool people or for it to work. The most interesting viral clips now are like, look, I overheard this person, I caught a video of them saying something, and it's blurry and grainy and the audio quality is bad. You might believe it even more. You don't need beautiful high fidelity projected in 1080p or whatever. So the ability of deepfakes to trick the human mind is there even at low quality. We see it in audio, we see it in voice, we see it in video, and people are using it to different degrees.
Aaron Painter:
People are using it, for example, against voice print services: my voice is my password. Those technologies, unfortunately, their day has passed. If someone offers to enroll you in those, the answer should just be no. They're doing their best to stay competitive, but it's an arms race, and the bad actors are winning. Where it gets very interesting to me is when you go to the extreme. Take the scenario of, I'm locked out of my account, and you're an employee at a company, and that company is following, let's say, the advice of Okta's CSO after MGM, which is: you should do video verification with your users. So you're gonna jump on a video call, a Teams call, or a Zoom call. Well, the problem is those platforms were meant to make it easy for someone to jump on and have a conversation.
Aaron Painter:
They weren't really meant to authenticate who the person is. So just as easily as you can choose a different microphone or camera in, let's say, Zoom, you can choose a piece of software that is a real-time deepfake emulator, which means it can replicate, in that little video box, the voice and video of a person, pretending they are on the call, and you can have someone else behind that screen basically operating it. And that is wildly scary, because we've all now adapted to this world of trusting what we see and hear, and doing so in a virtual way. Those platforms are not set up to authenticate and limit the types of content being fed into them. In the deepfake world, you call that an injection attack: you are injecting a malicious feed, in this case a software emulator, to impersonate someone else.
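To make the injection attack concrete: a conferencing app just enumerates whatever capture devices the operating system reports, and a real-time deepfake emulator registers itself as one more "camera." A toy sketch of why naive screening is weak; the device labels and blocklist are made up for illustration:

```python
# Toy illustration: a conferencing app sees whatever "cameras" the OS lists.
# A real-time deepfake emulator registers as just another capture device.
reported_devices = [
    {"label": "FaceTime HD Camera", "kind": "videoinput"},
    {"label": "DeepFakeCam Virtual Device", "kind": "videoinput"},  # injected feed
]

KNOWN_VIRTUAL = {"obs virtual camera", "deepfakecam"}  # made-up blocklist

def flag_suspicious(devices: list[dict]) -> list[str]:
    """Name-based screening: trivially evaded by renaming, hence weak."""
    return [
        d["label"]
        for d in devices
        if any(v in d["label"].lower() for v in KNOWN_VIRTUAL)
    ]

print(flag_suspicious(reported_devices))  # ['DeepFakeCam Virtual Device']
# The catch: nothing stops the emulator from calling itself "FaceTime HD
# Camera", which is why Aaron argues for hardware attestation instead.
```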
Rachael Lyon:
Wow. So as an organization, as a business, how do you even combat this? It's growing like wildfire, and it is so nebulous, and it's coming from all places. How does an organization even start to try to address this?
Aaron Painter:
It's a real problem. There are interesting organizations trying to address it, and I don't wanna point to single solutions; we spend all our days in my company working on trying to solve it. But my personal belief is that fighting a battle of AI against AI is an arms race. So back to deepfakes: one of the things people most talk about is deepfake detection. There are something like 20 companies that do deepfake detection. That to me is a losing battle, because it's kind of saying, can one AI model detect that another one is an AI model? Someone's always gonna be slightly ahead in their model, and more likely than not, it's gonna be the bad actor. And so deepfake detection, to me, is really not a strategy.
Aaron Painter:
Where I think there is room to be strong and competitive is in using the various elements of technology we have at our disposal against AI. By that I mean things like cryptography and encryption. Right? We've been working on those for decades. Cryptography is actually, in the security world, an incredible advantage. When you can apply cryptography, biometrics, supervised enrollment scenarios, like when someone gets a new passport or a state ID or driver's license issued, and AI, you have a much stronger arsenal to compete with deepfakes and AI. And so our model is trying to use that technology. For us, a lot of that has to do with the mobile phone. So instead of operating in browser-based web environments, we say, you know what? Modern mobile phones today are actually incredibly secure.
Aaron Painter:
The secure enclave environment of non-jailbroken Apple and Android phones is an incredible advantage. If you can operate there and use the attestation functionality of that secure enclave, you get to benefit from cryptography. You get to benefit from the gyroscope and the 3D depth-map camera that powers Face ID. You can actually use it to ask, is someone human, not in the Face ID context, but just using the camera, with AI, with other things. So our logic is a broader solution set: you need to use more than one thing to defend against AI. And we have those technologies at our disposal, and a lot of them happen to be around mobile platforms.
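The cryptographic advantage Aaron points to often reduces to challenge-response with a hardware-held key: the server sends a fresh random challenge, the device signs it with a key that never leaves the secure enclave, and an injected deepfake feed has no way to produce that signature. A minimal sketch using Ed25519 from the Python `cryptography` package, with an in-memory key standing in for a real enclave-resident key (that substitution is an assumption of this sketch):

```python
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Stand-in for a key generated inside a secure enclave; in real hardware the
# private key is non-exportable and only the public half is ever shared.
device_key = Ed25519PrivateKey.generate()
server_copy_of_public_key = device_key.public_key()

# Server issues a fresh random challenge so old signatures can't be replayed.
challenge = os.urandom(32)

# Device proves possession of the enrolled key by signing the challenge.
signature = device_key.sign(challenge)

# Server verifies: only the enrolled device could have produced this.
try:
    server_copy_of_public_key.verify(signature, challenge)
    print("device attested: same hardware that enrolled")
except InvalidSignature:
    print("attestation failed")
```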
Vince Spina:
Aaron, you talked about some technologies. Do you see a play for blockchain technology at all in being able to find deepfakes and mitigate that, or stay away? And I say that personally. I just wanna let you know I'm a big holder of Bitcoin and Ethereum, so I'm always trying to find ways to make that go up. But seriously, do you see a play for blockchain in this as a use case?
Aaron Painter:
Yeah. I think the monetary side of currencies and such has merit to it; maybe that's its own discussion. The concept of blockchain for identity solutions, I am not overly bullish on. And I'll tell you the main reason why: I think the question of identity is a real-time question. It's not a once-ever question. I went very deep on this at a conference, actually, on a panel a couple of years ago with a real blockchain enthusiast, and she had an incredible definition. It was actually very clarifying for me, and it started this conversation. She said, well, let's pretend that you've got a library card, and you got that library card issued on the blockchain.
Aaron Painter:
So that would be great. You don't need to do it again. You have it once. It's immutable. You know it was issued to you. You know the chain of custody, and all that seemed wonderful to me. And I totally got that use case. But then I said, well, what if I move to a new city? Should the library where I lived before still be giving me access to that library? I as a user might want that, but the library probably doesn't.
Aaron Painter:
And to me, it's the same thing. You're issued a credit card, and you have a physical credit card, let's say. But that's only half the equation, or part of it. When you go to transact, you swipe that credit card, and the payment processing network asks: is this valid? Has there been fraud on the account? Do you have available credit for this transaction to go through? And if so, the transaction is processed. That question of financial ability at that moment is a real-time question. So when someone's asking about my identity, I don't need them to say, once upon a time, we checked this. I need them to say, are you still you at this moment? Right? And the classic example of this might be, call it the large house-sharing platforms, the rent-an-apartment type of platforms. They were trying to replicate, for example, in-person hotel check-ins. So if you have your average profile today on, call it, Airbnb or VRBO or others, you can also have a verified profile check mark there.
Aaron Painter:
The challenge is, they did that once. At some point in the process of having an Airbnb account for years, they might say, hey, can you scan your ID, can we make sure that you're you? Great, you have a verified check. But the difference is that they set out to solve the hotel problem, and at a hotel, every time you check in, they say, can I see your ID? Can I see your credit card? And then they check you in. Unfortunately, if someone else is using my Airbnb login credentials, or my account is compromised, that person gets to act fully as if they're me, because there's a little verified check there. And so it's a little bit similar, to me, to blockchain. It's not about the one-time verifications; it's about the ongoing ones, and then trying to find ways to do that that are low friction but high assurance.
Aaron Painter:
Because there are times, you know, maybe you don't care as much on Airbnb. Maybe you should, maybe you don't. But I'll tell you, if you're resetting access to your employee credential, let alone if you're a privileged access user, you absolutely care that it's that same person. You don't wanna know that their ID was once verified last decade. You wanna make sure that the person now in front of the screen is the right one.
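Aaron's credit-card analogy maps neatly onto code: a one-time "verified" badge answers a different question than a check at the moment of use. A small sketch of the contrast, with all names hypothetical:

```python
import time

class Account:
    def __init__(self) -> None:
        self.verified_once_at: float | None = None  # the Airbnb-style check mark
        self.compromised = False                    # unknown to the platform

    def one_time_badge_ok(self) -> bool:
        # "Was this identity ever verified?" ignores everything since then.
        return self.verified_once_at is not None

    def real_time_ok(self, passes_live_check: bool) -> bool:
        # "Is the person here right now the enrolled person?" asked per use,
        # like a card network authorizing each swipe.
        return passes_live_check

acct = Account()
acct.verified_once_at = time.time() - 3 * 365 * 24 * 3600  # verified years ago
acct.compromised = True                                     # credentials stolen

attacker_passes_live_check = False  # attacker can't pass a fresh face/ID check
print(acct.one_time_badge_ok())                       # True: badge still shows
print(acct.real_time_ok(attacker_passes_live_check))  # False: attacker blocked
```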
Vince Spina:
Yeah. I just watched the value of my crypto drop as I...
Aaron Painter:
Well, your currency could be valuable, and that's its own discussion. But they're not trying to solve that question. In a way, they're actually leaning into the anonymity of, maybe we don't know whose money this is. Governments might not like that. You have KYC concerns around that, potential money laundering concerns around that. But the spirit of that currency platform is anonymity. So they're actually not trying to solve an identity problem, necessarily, in the value being portrayed today. I just think the extrapolation of blockchain technology into identity is not necessarily what the market needs today for all of the use cases.
Rachael Lyon:
So, Aaron, can we shift gears a little bit, too? I love this whole deepfake conversation, it's one of my favorite topics. But we're facing an election period here, right? We've got a big election coming up. Everybody knows that. And we've been seeing a lot of deepfake action. We've talked about the one with Elon, but we know there have been other deepfakes of candidates currently in motion, and, you know, public trust, right? How do you even trust what you're seeing on the Internet anymore? How do people navigate this landscape of, what I'm seeing is real, what I'm seeing is not real? And how do we even combat this threat to make sure there's a bit more trust in the electoral process?
Aaron Painter:
Yeah. I think today you really need to confront it with skepticism. And unfortunately, you then need to question the channel the message is coming in on. Hey, is this a big WhatsApp group, or is it an established news organization that maybe I can trust, a channel that has done some sort of additional vetting? Who is the person? Is this someone I know, someone I trust, someone I've seen before, or not? Those are the practical things we can consider as end users. But more broadly, the general approach to deepfakes today has been around what's considered watermarking.
Aaron Painter:
So it's, okay, who created it? Almost like in the world of music sharing, if you remember: can we create an official version of this song that somehow is attested to be the official version? That's the principle behind watermarking, and many of the big tech companies have adopted the methodology. So let's say the White House wants to issue something. They can say, okay, this has been watermarked by us, and so we attest that this is real and not a deepfake. That logic has its place. You might remember, funny enough, when Biden issued the letter saying he wasn't gonna run again, there was even some question: is this a deepfake? His letterhead or something was different. That wasn't a watermarked scenario, and so it's what might have been a good use case for it. Those are the types of opportunities: do you watermark and say, hey, this definitely came from us?
Aaron Painter:
The challenge is that outside of that type of channel, you know, hey, this video popped up where maybe this politician said this, that's not gonna be a watermarked scenario. And so then you get into, well, is it legitimate? And like I was saying, as a practical person, a practical user: can you trust the channel? Can you trust the person? I'm a very big believer that safe communities know their members.
Rachael Lyon:
Right.
Aaron Painter:
And so I think social platforms in particular have a responsibility to know who is behind an account. Now, I'm a huge privacy advocate. I fully support anonymity, aliases, pseudonyms, usernames. Go for it. Amazing. But I think the platforms have a responsibility to know, hey, this is the human behind this account, and, by the way, the fact that it is a human. Because that allows at least some sense of control to say, okay, whatever has been shared, this video, this audio clip, whatever it might be, you can't necessarily trust the file itself, but at least you can put some trust in it.
Aaron Painter:
Alright, well, the person who posted it is likely a known person, likely a known human. The platform has seen non-malicious content from them before, other things. Today, our Internet platforms are largely not set up for this. Actually, when we started our company, when we started Nametag, I thought of dating companies. Oh my gosh, so obvious for this.
Aaron Painter:
Like, the fact that you're gonna join a dating platform today, often with just a social media login: you could be anyone. You could have an unlimited number of those accounts. And then, if it really goes well, you're gonna build trust and hopefully meet up in person with someone. That's a classic scenario of, gosh, you really should know who the person is behind the account. Again, not a priority for those platforms today. And it's sad; it goes very offline. You know, hey, we're about to meet up, and my sister, mother, brother, friend, or the person asks me, hey, would you mind sharing a photo of your ID before we meet up, or can we meet in a public place? It's sad that that is where we are. When I meet sometimes with some of the gig economy platforms, or the meet-up and buy-and-sell type platforms, they say, oh, we only had X number of murders last year. Like, what? How off base are we? That's okay?
Aaron Painter:
So anyway, I firmly believe, although it's certainly not gonna help us in this election cycle, that platforms have a responsibility to know who's behind those accounts. And fundamentally, I think that is the broader long-term solution.
Vince Spina:
Going off that, Aaron, how do you check for integrity and validation of electronic voting machines, digital voting systems, all that? When you say, hey, we have a responsibility to know who's behind that, how do we do that? Because it's becoming more and more prevalent. We're 50 days from that happening in the US, and more and more countries are heading in that direction. How do you validate that?
Aaron Painter:
Yeah. Fortunately, and there's a nuance because that's a big category, but fortunately, there's a concept in the identity world, captured in ISO standards, of what's called supervised enrollment versus unsupervised enrollment. And that's what we're talking about. Did you go into the security office? Did you go into the DMV? Did you go in with someone watching you? And, you know, you mentioned CLEAR: not the case in all of their enrollments, but their airport security enrollments, interestingly enough, are supervised enrollments. Right? You go in, someone is sort of watching you, and it's gonna be difficult to deploy a deepfake in that kind of context.
Rachael Lyon:
Right.
Aaron Painter:
And so even voting, let's say it's a voting machine, but at a physical polling place. Well, what you can get into is, is the machine itself secured, is it encrypted, who has access, a great discussion, but not quite the theme we're on now. Interestingly, that is actually a supervised enrollment. You are going to a voting place, and someone's asking, depends on your state, it gets contentious, but asking to verify that you are the registered voter, shall we say, maybe checking your ID. Someone's there, someone's talking with you, someone's interacting with you, and then they're sending you to the voting machine. So there's an element of human supervision in that. A lot of states have voting by mail, for example. It's still paper based: you send it in, you sign something, people are looking, they're actually doing signature matching there. And there are attestation pages often on those mail-in forms.
Aaron Painter:
So there actually still is a pretty strong human element in those. They're not pure digital. We're not yet today, in many places, at a, hey, log in, click your vote, and log out. And I would also probably say the infrastructure of the Internet is not quite ready for that today.
Rachael Lyon:
Absolutely. And I want to segue to the other popular topic with regard to deepfakes. I think you kind of alluded to some of this with the Reddit thread you mentioned, but legislating deepfakes, particularly deepfake porn. I know there was a Wired article yesterday at the state level: I think there are 23 states that have passed some kind of legislation and 39 that are looking at it, as Congress is trying to assess how to legislate things like the DEFIANCE Act or the Take It Down Act. But is there a way to legislate and regulate deepfakes, Aaron?
Aaron Painter:
Yeah. The biggest one to me is back to this concept of the platforms taking responsibility and knowing who's posting content. And, coincidentally, the adult content platforms themselves, not all of them, but some of the ones we've crossed paths with, have actually been a little bit progressive in this and doing the right thing. The common thing going on with adult content is also around age verification. Simultaneously, in addition to deepfakes, many states and national governments have been passing laws around age verification to keep adult content away from minors. So that's one issue many of these platforms are trying to solve. But the other one they're trying to solve is actually preventing underage content from being uploaded. So prior to deepfakes, they were saying, hey.
Aaron Painter:
We don't want people under 18, let's say, as performers in adult content videos. And so many of the adult platforms today require identity verification for the person uploading the content. Now, there are varying degrees of sophistication in that technology, and are they doing it perfectly? But that is the intent. And in a way, I think that is the right approach. It's the right thing, and it should be on any social platform, including adult content sites: to say, hey, do we know who's uploading this content? Because it turns out that you as a user, let's say a viewer of that site, don't know the identity of the person who uploaded the content.
Aaron Painter:
There's probably some commercial name or a brand or, you know, an alias or an actor's name, whatever, stage name. Fine. You don't need to know the real name necessarily, but the platform does. And by the platform doing that, they can hold that person accountable and say, hey. You operate under this alias. You've uploaded content that's inappropriate. And by the way, we're not gonna let you do it again. We're gonna go back and correct it or make you take it down or other things.
Aaron Painter:
And so that path, the voluntary path some of these platforms have pursued, not universally, and maybe not to the right standards, is to me the inkling of what regulation could look like: requiring these platforms, and platforms more broadly, to know who's uploading content. Because you might create the deepfake, and it might look wildly realistic, but we're gonna hold the person who uploaded it accountable, more so than we can by trying to authenticate the content itself.
Vince Spina:
You've been in this business a long time. You've got a lot of experience; you talked about your time at Microsoft, etcetera, and about how the threat landscape has evolved over that period. What are some of the emerging threats or trends happening today that you personally find intriguing or challenging? What's interesting to you? Or, quite frankly, what scares you? What are you guys thinking about?
Aaron Painter:
Yeah. Not surprisingly, it's related to some of the stuff we've talked about. I worry enormously about deepfakes. Even a few months ago, people were asking, oh, I've heard about this, but is it a real thing? It is a real thing. Gartner says it's a real thing. It is a way that people are taking over accounts and impersonating others in really significant ways. I'd say, though, one of the things I worry most about right now, and I've written a little bit about, is industry specific, and it's related to health care.
Aaron Painter:
I worry that our private health information is about to go the way of the last four of our Social, or the answers to those security questions. Personal health care information, health care data, health care infrastructure is the clear target of bad actors now. It's important. It's valuable, shall we say, unfortunately: it is more valuable on, call it, the dark web for someone to have some of your health care information than your credit card number. And because of that, people are going after that data. So it's risky. Your health care data could be exposed.
Aaron Painter:
People are targeting that. But equally, these bad actors are targeting the infrastructure of health care itself. Providers are in these moments of offering critical care and needing to make critical decisions, but they also have to deal with these elements of protection and security. Imagine you're the doctor who needs access to a given machine to provide critical care, and somehow you're locked out of the account. That better be handled urgently, with speed and with security, because at the same time, that machine can be taken offline. The whole infrastructure of a hospital can go offline. And this is not conceptual anymore. Unfortunately, this is really what we're seeing happen.
Aaron Painter:
So I worry about it enormously in health care. And then, look, I live in Seattle. I worry about our infrastructure. Our library system was attacked a couple weeks ago. Last week, our airport was attacked. The airport still doesn't have updated screens and all the elements of just a normal travel experience, because it suffered a cyberattack. We are not ready, really anywhere in the world. Our digital infrastructure is not prepared for the level of cyberattacks we're seeing, and over 90% of cyberattacks right now are identity related.
Aaron Painter:
They are this concept of impersonating someone else and taking over an account. And funny enough, it's not wild technology. You mentioned sophistication earlier. Bad actors are using more sophisticated tools because they're easy to use with deepfakes, but this is not a wildly advanced thing. This is not minting some new quantum computing ability to break an encryption algorithm. This isn't a tech breakthrough. It's like a social breakthrough.
Vince Spina:
Yeah.
Aaron Painter:
And we're just not prepared, as an industry, as a society, to deal with it. That's what I worry about, and that's what drives me and my team on a daily basis, frankly.
Rachael Lyon:
That's true. And you think about where health care struggles with budgets, and hospitals, and you layer all that on top of it. It's a very difficult problem to solve. And final question, I know we're coming up on time. You've lived everywhere, Aaron. Is it China, France, Brazil, Hong Kong, the United Kingdom, and now the US? Through the lens of being a cyber professional, how has this shaped your thinking about cybersecurity? Has it? Because every country has a different point of view, right, of where they are in economic maturity or infrastructure maturity. Has that impacted how you think about cybersecurity and, as you look ahead, the problems to solve?
Aaron Painter:
Yeah. Living in different parts of the world has been a big part of my life story and my personal journey. So, very practically: when I lived in Brazil, I lived in São Paulo. I love São Paulo. I love the people. I love the culture of Brazil. I love the food. The weather was amazing.
Aaron Painter:
But it was really hard with crime. On a daily basis, whenever you left home, you worried about your personal safety. And if you were in a car or a taxi, the big risk was somebody would drive up on a motorbike, put a gun to the window, and want your mobile phone, your computer, your briefcase, your backpack. My apartment building, and this was standard, had 2 entrances. It was like a prison. You go through a gate. A guard would watch you.
Aaron Painter:
They unlock the second gate. You go in, and then you're inside the building. Incredibly secure, because it had to be. And then you go to the Hamptons, in lovely, rich America, and you see these beautiful, huge palatial estates with not so much as a picket fence, and you can just walk up and knock on someone's front door, and you're like, wow, what a privileged world we live in. And to me, that contrast around physical security, frankly, is quite local. You get to make those decisions because you know the threat landscape of your local environment, you know where you're operating, and you can make decisions: do we need the extra gates, or can we have the big open expanse? You kinda know your local community. Policing, detective work, our whole intelligence apparatus is structured on local intelligence, local safety risks, local threats. In the world of cyber, that becomes wildly flat. The world becomes incredibly flat in terms of cyber, because bad actors can use their best technique from anywhere in the world against you, anywhere you are.
Aaron Painter:
And so the difference between physical security and cybersecurity has never been more clear to me than in seeing, because of my life story in a way, how much local matters in the physical sense and how irrelevant local is in the cyber sense. We are equipping incredibly intelligent, hardworking, creative professionals to do bad stuff anywhere in the world, against anyone, anywhere in the world. And that is scary. The exciting opportunities that a flat world gave us from a technology perspective have also opened up all these risks from a cyber threat perspective. It's just more important than ever, and we don't really have a choice other than to respond, or be proactive in responding.
Rachael Lyon:
That's a great perspective. I didn't really think about it like that, but you're so right.
Vince Spina:
I love the word pictures in that. That description is fantastic. So
Aaron Painter:
Thank you. I've never shared that before, but something I think a lot about, actually, is that concept, that contrast between the physical world and the cyber world.
Rachael Lyon:
Absolutely. Well, this has been such an awesome conversation, Aaron. Thank you so much for joining us. I've really enjoyed it, and I think our listeners got a lot of great insights. So thank you for your time. This has been a lot of fun.
Vince Spina:
Thank you, Aaron. That was fantastic.
Aaron Painter:
This was a ton of fun for me. I really appreciate you having me. I'd love to keep in touch. I'm very active on LinkedIn, posting content and things. Reach out, follow me, follow the stuff we do. We write a lot about this stuff, we comment on industry things, and I just love hearing from folks. So please don't be shy.
Rachael Lyon:
Wonderful. And that's Aaron Painter, CEO of Nametag. So make sure you write it down and follow up. And to all of our listeners, thanks again for joining us for another awesome episode. And as always, Vince, what do we want them to do? We want them to remember to smash that subscribe button, please. They'll get a fresh episode in their inbox every Tuesday.
Rachael Lyon:
To all of our listeners, thank you again, and until next time, stay safe.
Vince Spina:
Thanks for joining us on the To the Point cybersecurity podcast brought to you by Forcepoint. For more information and show notes from today's episode, please visit www.forcepoint.com/podcast. And don't forget to subscribe and leave a review on Apple Podcasts or Google Podcasts.
About our Guest
Aaron Painter, CEO, Nametag
Aaron Painter is the CEO of Nametag Inc., the world's first identity verification platform designed to safeguard accounts against impersonators and AI-generated deep fakes. Nametag has become the trusted choice for leading companies seeking to prevent fraud, reduce support costs, and eliminate the frustrations associated with account lockouts and high-value transaction authorizations.