
The Human Price for Data and Privacy Protection


ABOUT THIS EPISODE

This week Rob McDonald, SVP of Platform at Virtru, joins the podcast to double-click into the data and privacy protection discussion. We explore how the pain of handing over personal data in exchange for ‘free’ services is subsidized, informed consent, why regulation alone isn’t a silver bullet, and what outcomes we could drive when we combine user decisions with regulation.

And he shares insights on the behaviors that come with innovation, data as a common denominator, regulations such as GDPR and CCPA as progress markers (and not the final destination), the criticality of the CIO/CISO as a storyteller, and recognizing that our front-line defenders are people (not robots!).


      [01:12] Juicy Topics: Data and Privacy Protection


      Rachael: We've got some good topics to talk through today. Please welcome to the podcast, Rob McDonald. He's Senior Vice President of Platform at Virtru.

      Rob: I totally agree that the conversation is leading up to juicy topics and fun stuff. I look forward to engaging with you and your audience today.

      Rachael: As Rob so eloquently explained before we got on, it's all about dopamine, serotonin, and all that stuff. TikTok's feeding me the dog videos and the kitten videos. Literally, I'll lose an hour and a half, and I feel empty inside when I turn it off. I feel like it's feeding me a drug. That's bad.

      Rob: I don't know if it's bad. People get value from it. What people don't understand is just how sophisticated the correlation of all the data points you create is, and how well it understands you. TikTok is not doing anything different from the other big tech companies; they just happen to be a lot more controversial based on where that data goes. It is a slippery slope in terms of what you give up, but people accept it because the platform knows them. It has turned something into value for you, and that's the slippery slope.

      Eric: How does it get to know my co-host so well? I can't relate at all. Rob, you can analyze me here. I was 12, and I won't go into a whole lot of specifics, but somebody gave me a beer and I was losing a little control.

       

      How TikTok Figures Out People

      Eric: What do you do when you're 12 and you're a little inebriated? You drink coffee. I burned my mouth like you couldn't believe. It was at work; it was an industrial coffee machine. I took it, took one huge gulp, and burned my lips, my tongue, and my whole mouth. I haven't had coffee since. Why can't Rachael do that with TikTok? Are you saying that Folgers hasn't figured me out, but TikTok's figured out Rachael?

      Rob: Actually, oddly enough, I think you nailed it. First off, I do think two wrongs made a right there. You drink a beer, then you burn yourself, so that worked out for you. But what you said there was that you took that coffee and it burnt your mouth, and you're like, "I'm not touching that again." In our industry, that pain is still being subsidized heavily, so you're getting all this value.

      If your credit card gets stolen, you immediately get refunded. There is still this subsidy that takes some of the bite out of that pain. There is a tsunami coming, however, as people start to realize what equities and what value they're giving up. But the pain is still, I would say, manageable through those subsidies. So TBD on when that threshold tips, but you want to be prepared for that tip. You don't want to get your mouth burnt with hot coffee when that thing tips.

      Eric: Help me understand that. I don't do coffee, coffee ice cream, or coffee with anything. I swore coffee off for life. Not a problem for me.

      Rachael: It's so good. Shameless plug for frosted coffee.

       

      The Price of Not Recognizing the Benefits of Data and Privacy Protection

      Eric: Rachael's hooked on TikTok. There's no dopamine firing and making me want coffee, ever. I'm telling you, I will probably never have coffee in my life. You swore off TikTok last week and you're back already for the crazy dog videos. Obviously, no pain, Rob, is what you're saying. Rachael's pain would be the time she's wasting there and her degraded work performance or podcast prep. What you're saying is she's not recognizing the pain, or the benefits are probably overcoming any pain that she would recognize.

      Rob: As a human, you've adapted. Your wetware is adapted to say pain, don't do that; reward, do that. When you see these breach incidents and data exfiltration events, they're massive numbers with no names on them except for a corporation. You're like, "I don't understand that. How does that impact me?"

      However, if you talk to someone who has had that data exploited across modalities, and they have truly suffered from it from a reputational or financial perspective, I guarantee their pain level is different, especially if that pain has resulted in an inability to get some of that back. There are some things you can't get back. Some reputational things are really hard to get back. It's like a company losing a brand. How long does it take to get a brand back? Maybe never.

      Eric: Let me make sure that I understand this. I see Rachael's on board here. What you're saying is, if I have a credit card stolen and it doesn't cost me much of anything, there's very low pain.

       

      Proactive Measures to Enhance Security

      Eric: I don't necessarily take additional proactive measures in the future to better enhance the security around my credit card, or to not lose it, or something like that. Now, if my identity is stolen and somebody files a fraudulent tax return with the IRS in my name as a result, it's years of pain. I may be more inclined to protect that identity going forward because I felt that pain. It was traumatic. It was impactful. There was some meat behind it, if you will. Is that what you're describing?

      Rob: Yes, that's exactly right. I think this pattern exists everywhere, at the corporate level and the personal level.

      Eric: I want to take it to a corporate cybersecurity level now. I know you were a CIO in the healthcare space, I believe, before your current role. If you had a ransomware attack that took down the business, and let's say it was a wiper that wiped data, you're going to be more inclined, in that role post-recovery or in your next role, to protect the organization going forward. You'll be more inclined to protect against any kind of ransomware attack or vulnerability that would allow things to come in. You're sensitized to it.

      Rob: Yes. Let me separate the parties for a moment, because the technical audience is already sensitized to it. Their mission is aligned and oriented around serving the organization, making sure that the organization is empowered to do what it needs to do with the technology. Technology is there to enable the business. Then there's the non-technical audience. I hate to frame it as just technical versus non-technical; that's not fair. Let's talk about hospitals for a second: the doctors there, but mostly the nurses.

      Eric: Clinicians.

       

      Technology is a Means for Data and Privacy Protection

      Rob: Yes. We'll raise the nurses up a little bit. They do the majority of the work in these hospitals. This is the reality. They're there to take care of the patients, and technology is a means to do that. Anything that gets in the way is no good. When you're a CISO talking to that audience, your job is to translate risk at any given moment in time. Well, there are 30 other things competing. I'm not saying anything that anyone doesn't know.

      Eric: For them, it's the gunshot wound that just came in through the emergency room doors: how do we keep them alive? They're not thinking about cybersecurity, any kind of ransomware attack, any kind of phishing email, anything.

      Rob: They weren't. What you just said is important, because ransomware in particular has resulted in an inability to take care of patients. That has elevated the position of cyber in these organizations. It has not changed the pressures on the CISOs or the unnatural, unfair, ridiculous types of leverage being placed on them. But it has given them some opportunity to further those missions, because now there is clear connective tissue with those operating teams. Yes, this is going to get in the way of me taking care of my patient, my number one mission.

      Eric: If we lose access to patient records while we're prepping for surgery, we may not be able to take Sarah into the operating room and operate because we can't see what we're doing. She may have a life-threatening illness or problem that we've got to deal with. That's a clear way that the clinicians understand how this is all linked together, is what you're saying.

       

      A Good Opportunity

      Rob: I think so. This is also a good opportunity, even more so now that they see that. They can also start to understand how directed and lateral attacks can start affecting IoT devices in hospitals, infusion pumps, and things of this nature. It's more than just "I don't know about this patient"; it is "I'm going to damage this patient directly because the device I have trusted is no longer working the way I thought it would work."

      Rachael: We saw this, I mean, last year with the hospital being infected by ransomware.

      Rob: Are you talking about Ireland?

      Rachael: No, in the US. It was the first time they had been able to directly link a death to ransomware. It was an infant death, because ransomware shut down the systems and they couldn't use the scan to see that the umbilical cord was wrapped around the neck of the baby they delivered. The mother said, "If I'd known, I would've gone to a different hospital," and that's heartbreaking.

      Rob: It's happening, and ultimately, it's the type of pain that we're talking about here. You don't want that type of pain to happen in order to necessitate change. But look, at the end of the day, humans are humans. Sometimes, what is required to necessitate change is to be able to see that happen in real-time. I don't ever want that to happen, but now that things like that have happened, you hope that position is elevated in the organization.

      Eric: Back to your example from a few minutes ago. I think if you were working in that hospital and you knew that story, or if you were part of that family who had that traumatic event, you'd be really sensitized to it.

       

      [11:30] How Do You Convey Risk?

      Eric: If you're working in a hospital two states over and you just have patient after patient coming in, and you're short on nurses. My son's a pediatrician, so I have some level of understanding of what they go through here. Short on nurses, you can't give the proper care to your patients. Are you sensitized enough to that, or is it too remote? Then the follow-up for you would be, as a CIO or a CISO, how do you convey that risk, that sensitivity, to somebody who didn't experience it? You've got to be proactive.

      Rob: Let me start with that one first. CISOs and CIOs have to be storytellers. We are humans at the end of the day, and we still exchange information. Our children learn through mimicry and listening to stories. We still do the same as humans. You have to be able to take these stories to the industry, where you can say, "See this institution over here? They look just like us. They are us. They're no different. Let me tell you what happened to them when they deprioritized this initiative."

      You can't drive change only through fear, but risk is ultimately a quantification of what is likely to happen. If you have a story, and you have a series of stories that result in a pattern, and that pattern looks like you, then yes, that is a higher likelihood, and your risk quantification has to go up for that particular incident. Then your ability to deliver that story as the CISO is going to be what gives you the most leverage. CISOs and CIOs have to be good storytellers.

      Eric: As marketers, we love it. It's all about the story.

       

      Data and Privacy Protection Through Storytelling

      Rachael: It's how you work.

      Eric: I agree with you. In a prior life, I had a storytelling class created. The instructors and creators took it back to the beginning of the human race. We grew up through storytelling, and we've had a lot of guests on the show who talk about stories. That is how we learn and how we communicate. I love that answer, because you don't want to wait until it happens to you.

      Rob: No, we wouldn't be here as a civilization if we waited for everything. Just like in any business strategy, you can accept small failures along the way, but you know what you can't accept: a bunch of catastrophic failures, because then the ship sinks. The story you're telling has to protect you from catastrophic failures, and you have to learn from the smaller failures. Whether you're steering a cybersecurity team or steering a business, it's no different. You're doing the same effective risk quantification and navigation.

      Rachael: Going back to my favorite. I love the privacy topic and TikTok. I guess that's one of the things that I am struggling with today. Until you're personally affected, it's very easy for people to say, "Oh, it'll never happen to me." But it will. But then you have this whole other generation, the TikTok generation. Talk about PII. That's out there, it's on their TikTok. I know where they live, I've got their birthday, their Social Security number, the dog's name, all the things I need if I were a bad actor. But then, you have GDPR and CCPA and all these other things.

      Eric: The regulatory components.

       

      The Real Struggle

      Rachael: If you willingly put it out there, what's the point? I guess that's where I struggle too, and I think it's really critical. But then again, the information's probably already out there.

      Rob: Yes, it is already out there. But let's talk about it for a second, because we have both made progress and let down newer generations. That's the reality. We started in this world of blind trust, where you go to these SaaS providers and you say, "Here's my data," in exchange for a service, and you hope for the best. Hope is the common denominator. GDPR is this move where we say, "Okay, that's not good enough."

      So you give up your data, you get a service, and you surrender your control to a legal proxy. That's what you're doing. You have no ability to determine the comprehensiveness of any action taken. By the way, we're still not solving the problem that you have no idea what you're consenting to anyway. This is what I mean when I say we've let them down.

      Rachael: I signed a 200-page agreement, Rob. I mean, just as an example. Anyway, I was looking at this thing and they're like, "Here's our agreement. You need to sign." It was literally 200 pages long. I was like, "I'm just going to sign this and I just hope it works out."

      Eric: Who doesn't do that?

      Rob: That agreement is designed to protect that company, not you. It is intentionally designed to be confusing to you so that you sign just like that. There are these studies, I think this might have been a Kahneman study, showing that if you're shown 30 options for your 401(k), you just don't pick anything.

       

      Data and Privacy Protection: Pick One and Invest

      Rob: But if there are two or three, you'll pick one and invest. It's no different. So, if we don't think that's done on purpose, we're crazy. It's done on purpose, and legal proxies are not the answer. We need to move to a state where, if you're getting value from a service, great. Have a clear transport medium from the intent of that agreement to your ability to determine whether there's a violation or not and take control yourself.

      What we've done as a society is continue to quantify the value of data. But we're not talking about the thing that is the tsunami, which is that you have given up your sovereignty over control. That's what you've given up. So, data aside for a moment, the tsunami that's coming is when everybody realizes, "I have literally surrendered complete control." Forget about the value of the data. That's the bridge we have to cross next, or rather now, not next.

      Eric: I'm reminded, we did a podcast with the New York Times reporter Sheera Frankel, an author. She studied Facebook, and Facebook, now Meta, I believe, has a tremendous amount of information on its users. They were having difficulty understanding how they protected information, how they should protect your information, and what they were going to do. Things changed significantly over time, but they still retained all of that data, that PII, essentially on us.

      So, even if you do understand it, and you're not going to read the 200 pages, how they handle it could change. Then they send you another 200-page agreement saying, "Heads up, we made this one minor change. Read the 200 pages if you have questions." There's no one to go back to.

       

      A Data Owner versus a Consumer

      Rob: What you just said, I'm okay with. As a business, if you want to change your mind, great. You know what? As a data owner and a consumer, I want to change my mind too. But when we decide to change our minds, we have to let the parties involved know in a way that they can consume and take action on. So reading that agreement is not the answer, and assessing the organization or putting a legal team on that organization to hold them accountable is not the whole answer either.

      We need technical controls that map to that agreement, so that you have a beacon for when they make a change or when that change takes effect. That does not exist. By the way, we just talked about one company. You just talked about Meta. Your data is in hundreds of organizations. There are institutions that are collecting it at an aggregate level and understand you at a deeper level across all those data silos.

      Take the complexity of that one company times all the others; it's probably not linear, it's probably exponential. Trying to read and understand all of that is a backward approach. You're always going to be behind; you need to get ahead of it with technical controls so that you can see what they're doing with your data.

      Eric: If you even have any time. I can't even manage all my television subscriptions anymore, let alone track where the data is and where everything is. I've got to tell you, I'm not picking on Facebook just because I can.

       

      How Facebook Scrubs Your Data

      Eric: I'm not a Facebook user, but I have an account. I wouldn't know how to scrub my data if I wanted to just close that account out. There's nothing there. I wouldn't know how to scrub that data if I tried, and I don't know if they would do it.

      Rob: You just landed on it. What you can do today is go and say, "Delete my information. I no longer consent." Then you get notifications back that say this will be processed within X period of time, or whatever deadline they have to meet. Then it says it's done, and you're like, "Okay, I hope they did a good job." You have no idea. That's not appropriate, in my opinion, because we're now in a technical age where we have a technical answer to this.

      There are technical controls, there are technologies, that can allow you to say remove my data, revoke my consent, et cetera, and have it just happen inside the environment. We're choosing to ignore that because it's easier just to accept your data and treat it as low value, because you're only the human.

      They just treat it like a pool of data. It's easier for these companies to do that. And I think it's not appropriate for us to let them take the easier road because there are technical answers now. There weren't in the beginning.
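      As a concrete illustration of the kind of technical control Rob is describing, here is a minimal sketch, in Python, of per-user crypto-shredding: each person's records are encrypted under a key held for that person, so revoking consent means destroying the key, which makes every stored copy unreadable without anyone having to hunt down and delete the data itself. The class and method names are hypothetical, this is not Virtru's product or anything the speakers describe, and it assumes the third-party cryptography package is installed.

          # Hypothetical sketch: consent revocation via per-user crypto-shredding.
          from cryptography.fernet import Fernet

          class ConsentVault:
              def __init__(self):
                  self._keys = {}      # user_id -> encryption key (a real key manager in practice)
                  self._records = {}   # user_id -> list of encrypted blobs

              def store(self, user_id: str, plaintext: bytes) -> None:
                  # Encrypt each record under the user's own key before persisting it.
                  key = self._keys.setdefault(user_id, Fernet.generate_key())
                  self._records.setdefault(user_id, []).append(Fernet(key).encrypt(plaintext))

              def read(self, user_id: str) -> list:
                  # Decrypt the user's records; this fails once consent has been revoked.
                  f = Fernet(self._keys[user_id])
                  return [f.decrypt(blob) for blob in self._records[user_id]]

              def revoke_consent(self, user_id: str) -> None:
                  # Destroy the key: every stored copy becomes unreadable, wherever it lives.
                  self._keys.pop(user_id, None)

          vault = ConsentVault()
          vault.store("rachael", b"birthday, dog's name, favorite videos")
          print(vault.read("rachael"))      # works while consent stands
          vault.revoke_consent("rachael")   # the ciphertext remains, but it is now inert

      The point of the sketch is that revocation becomes something the data subject can rely on cryptographically, rather than a request you submit and simply hope was honored.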

      Eric: There's a technical answer. Let's assume it's relatively cost-effective and easy. As consumers, the three of us, even with this podcast, are not going to drive a behavior change. Are you talking about a regulatory change? Then you've got to deal with it country by country, or something else. What do you recommend?

       

      [20:50] Regulation Alone Can't Ensure Data and Privacy Protection


      Rob: I think if regulation alone was the answer, then we would not have healthcare companies doing bad things with data, because HIPAA has been around forever. So regulation alone can't get it done. The reality is, I don't have a silver-bullet answer for you, to be truthful. We're seeing an expansion of awareness around this surrender of sovereignty today. It has been going on for a long time; this is not new.

      I think consumers are becoming increasingly aware of the sovereignty they gave up. There has to be that rising tide of awareness, to the point where consumers effectively indicate that they will choose one brand over another because it positions that value pillar. That, plus regulation and legal protection, is probably what's going to be required. It can't be just one or the other, in my opinion. But that's where I stand on what it's going to take to get there.

      Eric: Where do we start, Rachael, as a prolific TikTok user?

      Rachael: I feel it's almost like Sisyphus. They make it so easy to do these things. To your point earlier, why would I decide to make my life harder? I've seen the benefits of multifactor authentication, so now I'm like, "Okay, I can take that extra step. It's inconvenient, but I see the benefits now." Until you start seeing that kind of alternate reward system, it's hard to want to make that change.

      Eric: But you see the benefits because you understand the risk to some level.

      Rob: Yes, alternate reward systems. That's a topic right there. People are in different stages with the activities and things they get benefits from.

       

      The Benefits of Social Platforms

      Rob: I think people are always going to get benefits from these social platforms. Don't get me wrong, I'm not asking for the end of social platforms. But think about going to a grocery store today.

      Eric: I don't use them so no consequence to me.

      Rob: It's nonparticipation for me as well, but that's just because I'm part of a very small sample set of privacy advocates. People use what they want to use. But if you go to a grocery store and there's a product that says it was grown with toxic pesticides, you wouldn't use it. We didn't think about that a long time ago. Now, we do.

      There are projects like the Digital Standard that help you assess a company. I do think education is critical, but education is a long, long haul. It's going to be a long time before education alone gets us there. What you typically end up having is educational change that results in evolution over time, and then you have these periodic black swan events that accelerate the mission. I hate to say it, but if you study anything related to the history of humanity, that's what motivates change in a lot of ways.

      At the point when that pain overcomes the subsidies, that's ultimately a black swan event where people will demand more change. They're doing it now, though, to be honest with you. The awareness around cybersecurity, the awareness around privacy and informed consent, is at an all-time high, and you have to ask yourself why. I think it's because the pain is higher, and the awareness is higher.


      Data and Privacy Protection Starts with Awareness

      Eric: I agree with awareness. I don't know what a black swan event looks like in this space. There are consequences for customers, potentially, and we'll just have to deal with that. So, take something like Equifax. A lot of people had information stolen. You might say that was a black swan event.

      Rob: It was a big event for sure.

      Eric: When did that happen?

      Rachael: Three years ago.

      Eric: So let's say it was 2019 and I don't have the exact date. My data is still with Equifax. It was stolen and now it's with Equifax.

      Rachael: But do you get a choice? I guess that's my question with Equifax. I don't get a choice. Why does it just go to them?

      Eric: Do you ever go to Home Depot? They had an event. Target, they had an event. Do you still shop there?

      Rob: Yes.

      Eric: The answer was no, Rob, but let's not go into that. My point is we have awareness, but what is a black swan event, or what I would call a catastrophic event, that would cause you to change buying behaviors?

      Rachael: How have we not hit it yet?

      Eric: I guess consumers weren't necessarily impacted directly in a negative manner with Sony.

      Rob: These are tough topics, because you're wading into territory where the question is, can it even change? If it does change, what's the economic impact on our country? This is where you see the need, the necessity, for industry to overwhelmingly suppress, protect, and subsidize, because it is directly attached to the economic outcome for the country. Some of this is just the way the world is. Maybe there's one institution, maybe only two or three processors of a certain type of activity.

       

      No Alternative

      Rob: There's just no alternative. Well, that's not great. But some of it is just more complicated than that. The country as a whole is better served if there's not a lot of buying behavior change, and you'll see some unnatural activities to protect that. But there are ultimately these black swan events that just overcome the ability to address that.

      If there's an event large enough or systemic enough or close enough to the livelihood of humans, think of the energy networks and things of this nature, or distribution networks where people don't get water, they don't eat, things like that. There's no way you can just magically produce spin in the market when the water's not showing up in my tap.

      But there's this longer-term thing that's happening already, which is the exfiltration of intelligence and intellectual property. This is leaving the country. It's resulting in increasing difficulty competing in some areas. Now, I feel like we are the producers of a great deal of innovation, and we are well ahead of that curve. That's, of course, true. But that doesn't mean this isn't decaying our livelihood already.

      We say this event isn't here yet, but I think a lot of that is simply the fact that we're, I don't want to say covering it up, but we're simply trying to make sure people don't get too excited about it. There are a lot of agencies and a lot of people with missions now that are actively protecting against this intellectual property exfiltration that is harming our economy. These events are happening; we are simply good enough at combating the blowback now that the average consumer doesn't have to feel it. That's the reality.

       

      Major Breaches to Data and Privacy

      Eric: I just did a quick search on major breaches and it's amazing. Yahoo, right? I talked about Equifax and Marriott. We had AdultFriendFinder. Let's not forget Ashley Madison. Remember that? People are still using those services. Facebook, Target, yes, skip OPM, MySpace, LinkedIn. We're talking about a lot of social media types of activities. I probably named four or five, and I don't think people have curtailed their usage for the most part.

      There are a couple of Erics out there, but even I use LinkedIn. I don't think people have curtailed their usage of social media platforms, nor restricted the type of content they put up. And I think at the corporate level, we are incredible innovators, creating a lot of new ideas and things. At the enterprise level, whether government or private sector, it doesn't matter: the focus is on innovation, much more so than on protecting the information, the data we have. That is not to take anything away from the millions of cyber defenders, professors, scientists, researchers, doctors, and everybody else working with data.

      Certainly, I don't think it's a black swan event. I think it's a catastrophic event. Call it what you will. I think it's more that there's a different level of awareness depending on who you are, where you are, and what you're doing. As a result, we are being mediocre. I had a professor once who used the word "submarginal." I thought it was the greatest derogatory comment ever. She was talking about someone's paper: "No, that performance was submarginal." I was like, "That's a slam."

       

      Understanding the Importance of Data and Privacy Protection

      Eric: But I think our performance is submarginal in the area of understanding the importance of data, understanding risk, and protecting that data, despite overwhelming evidence that people want to steal it, are stealing it, and will continue to steal it.

      Rob: Humans have this need to constantly produce more, newer, better. Innovation is just in our blood; it cannot be otherwise. That's how we've gotten to where we are, so let's applaud that for a moment. I'm thankful for antibiotics, for these innovations that allow us to live a better life. So, let's keep that up. But with that comes edge behavior.

      If I'm pushing the frontier on a mission, I'm going to forget about these other things, and I don't think anyone would advocate for us to change that pattern. But the truth is, when you have that type of edge innovation, you get a lot of increased surface area. You've got these tech companies engineering at a blistering pace, resulting in adoption by users who are not thinking about that. That results in you just giving up information. It's the equivalent of pouring buckets of water into the boat while you're trying to bail buckets of water out of the boat.

      The defenders are out there doing their best to defend, and now you're just pouring more water in, though you're not trying to. That's why you see breaches going up and to the right. The technical sophistication is increasing, the surface area is increasing, and the amount of information a tech service can collect on me is just phenomenally more sophisticated.

       

      [32:00] What the Defenders and Protectors are Doing

      Rob: We continue to take this approach as we should, as you pointed out, these defenders, these protectors, and these researchers are phenomenal. What they're doing is amazing. There are a lot of things that we don't know that they're doing. I'm thankful that they are, that's the reality, especially inside some of the agencies.

      But what we're not doing, I think, at the same time, is giving those individuals control. We're still depending on a third party to act on behalf of individuals who will never be aware. If I'm just a non-technical consumer, I don't want to need the technical sophistication to understand all of that. Why would I? That's not my job.

      We're not doing enough to say: when you agree to that EULA or that data-sharing agreement, okay, now we're going to marry it with a technical control, so that if you need to change your mind later, you can, on your own, without 30 intermediaries. We still need to cross that chasm.

      Eric: You can rest assured that the data we are managing and storing is safe with us.

      Rachael: It's a lot of trust.

      Eric: You're taking on that responsibility.

      Rob: It's complete trust. By the way, humans operate on trust. At the end of the day, you go into your dry cleaner and you give them the most precious thing, something handed down for three generations, and you say, "Please clean this and don't mess it up." That's a tremendous amount of trust. Humans are never going to change.

      Eric: It doesn't always work out so well.

       

      Stop Trusting Everybody

      Rob: But you still do it. You still do it because humans work on a level of trust. I am not advocating that we stop trusting everybody. That's why I think the term trust is misinterpreted in a lot of ways. We still have to trust parties, but you need the ability to independently affirm their activity and take technical control of it after the fact. Today, you do not have those two things. You have one of them.

      Rachael: It's a conundrum.

      Eric: I'm giving up on the consumer. I think there are too many Rachaels out there who stay on TikTok because they enjoy it.

      Rob: I'm going to keep fighting for them but that's okay.

      Eric: At the enterprise level, at the corporate level, businesses, agencies, and organizations, how do they do a better job on, let's call it, a data-centric approach to protecting our PII, our data? To me it seems we've got experts. In your example, we have doctors, nurses, and clinicians, you name it, who aren't experts in protecting data but have some level of awareness, and we have experts helping them. How, at the enterprise level, do we take that data-centric approach to security and protect the data of our customers, our employees, and everybody else?

      Rob: I don't want to say do a better job because I don't want to devalue the efforts that are underway. I think there are a couple of things we do really well.

      Eric: But we need to do a better job.

      Rob: We do need to do a better job. In some categories, though, once that data is within a location, we have a lot of technology and policy controls to protect it while it's in my container.

       

      The Common Denominator

      Rob: Here, I've got it. I've got your data. It's great. Let's describe it this way. You're using two, three, or four hundred SaaS applications inside your enterprise. This is the reality. The only common denominator to all those applications is the data. It's the only common denominator. It's the only thing that moves between them. It's the only thing that's valuable within them. The data is the proxy for the human that's fulfilling whatever business process that application is meant for. There's a connection.

      Eric: That's the value. That's really where all the value is trapped, too.

      Rob: That's exactly right. What I'm advocating for is that you understand that common denominator and you start down the road of understanding the data itself and protecting the data itself. Whether you're talking about zero trust or whatever other framework, fundamentally, data has only been looked at as a component of the stack. We have yet to really focus on it as a predominant priority pillar and a first mover, in terms of: "If that's the common denominator, can I move my intent and controls to the data itself, so that when it flows and wherever it flows, I can express my control how I need to?"

      The other category is this: to do business, that data has to leave your organization, full stop. There are no alternatives. If you only extend these legacy concepts, or you only extend zero trust as a means to contain SaaS applications, basically to put firewalls around SaaS applications, great; that works as long as the data stays in your environment. But your organization needs to participate with vendors, suppliers, and customers.

       

      The Intent and Control for Data and Privacy Protection

      Rob: If we can't move the intent and control to the data when it leaves, your answer is, "I did my best to this point. Good luck." So, everybody's doing great. Everybody's working really hard. The initiatives are great. All I'm advocating for is that we're in the era now where a data-centric approach and focus should get more elevated priority as the more future-proofed approach to data protection.
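      To make "moving intent and controls to the data itself" a bit more concrete, here is a minimal Python sketch of a data-centric wrapper: the payload is encrypted and travels together with a machine-readable policy, and any application that receives the object has to satisfy that policy before the plaintext is released, with an audit trail kept alongside the data. The object format, field names, and policy checks are illustrative assumptions, not Virtru's implementation or any particular standard, and the sketch again assumes the third-party cryptography package.

          # Hypothetical sketch: a policy bound to the data so control travels with it.
          from dataclasses import dataclass, field
          from cryptography.fernet import Fernet

          @dataclass
          class ProtectedObject:
              ciphertext: bytes                               # the encrypted payload
              policy: dict                                    # the owner's intent, machine-readable
              audit_log: list = field(default_factory=list)   # telemetry that travels with the data

          def protect(plaintext: bytes, policy: dict):
              # Encrypt the payload and bind the owner's intent to it.
              key = Fernet.generate_key()
              return ProtectedObject(Fernet(key).encrypt(plaintext), policy), key

          def access(obj: ProtectedObject, key: bytes, requester: str, purpose: str) -> bytes:
              # Record who asked and why, then enforce the attached policy before releasing data.
              obj.audit_log.append({"requester": requester, "purpose": purpose})
              allowed = (requester in obj.policy["allowed_parties"]
                         and purpose in obj.policy["allowed_purposes"])
              if not allowed:
                  raise PermissionError("request violates the policy bound to this data")
              return Fernet(key).decrypt(obj.ciphertext)

          # The same object can flow to any SaaS application; the control flows with it.
          record, key = protect(b"patient chart for Sarah",
                                {"allowed_parties": ["clinician"], "allowed_purposes": ["treatment"]})
          print(access(record, key, requester="clinician", purpose="treatment"))

      In a real deployment the key would sit with a policy-enforcement service rather than with the recipient, so the owner could later change the policy or withhold the key, which is the "change my mind after the fact" control Rob keeps coming back to.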

      Eric: Do you think it is these days?

      Rob: Increasingly so. By the way, these data pillars have made their way into the standards now for zero trust. You see some of the government cybersecurity initiatives call out data pillars. On that telemetry alone, I would say there is some increased awareness about the priority of that. I think it is getting better, and I think we are in the early days of adoption. It's easy for tech organizations to say, "Do this, do that." But let me put my CISO hat on: it's a far different beast to actually implement. I would say from a technical maturity perspective, we're farther along.

      From an awareness perspective, and this is important, we're farther along. Implementation is in its early days in a lot of organizations. That's going to take time, because these CIOs' and CISOs' organizations have difficult jobs. The majority of them are still in the phase of data discovery and mapping. They don't even know where their data is and what it is. We've got to have empathy about that.

      Eric: I think back to the government CDM program; they broke it into four phases. Initially, it was who's on your network, what's on your network, what are they doing on the network, and then there was this fourth phase.

       

      A Data-Centric Approach

      Eric: They really haven't gotten to that fourth phase, twelve-ish years later, which was how are they interacting with data, how are they protecting it and interacting with it, and everything else. You never get there, because you don't know who's there, what's there, and what's happening. It's hard.
      Let me connect the corporate side to TikTok, because there's connective tissue here. You said you accepted that 200-page agreement and you're getting value immediately. I want to be very clear here. That was Rachael Lyon, not Eric Trexler. I'm not a TikTok user. I do not have an account.

      Rob: If I find one dance video this week, I'm going to call you out, Eric. What a data-centric approach allows you to do is defer the risk and liability. Because if you protect up front, and you just said it yourself, the environment's so complex, you don't know what actor is where, you don't know what data is where. This is the reality. By the way, in what world does that change?

      Are we getting simpler or more complex? It's not like we're going to get into a world at one point where everybody just completely has their mind wrapped around where everything is. We don't live in that world. My junk drawer in the kitchen would confirm that humans are bad at that.

      So, if you protect the data upfront, what you're doing is you're going, "I may not know what hostile territory it's going to operate in, but I'm going to defer liability because I'm protecting it here. I've got audit telemetry around what's happening to it and I can change my mind if I see a hostile event."

       

      [40:51] Trust: A Better Pillar of Brand

      Rob: That's why the data-centric pillar is getting so much more elevation. Although today, if I'm with Equifax and I want to do that, I can't. If I'm with Marriott and I want to remove my account because of a breach, the breach has to happen first; my data's gone afterward as a consumer.

      That's right: today, we're earlier in the implementation journey. I think awareness is critical, because you've got to know these technical controls exist. You've got to have some type of environmental pressure, whether that be regulations, consumers, organizational reputation, or an increase in the elevation of trust.

      By the way, what better pillar of your brand than trust? It's what it's built on. I expect to see more organizations differentiate on this. But Marriott, as an example, they're just really in the implementation phase. You can't take advantage of that yet. You won't get any value from that, because they haven't implemented it, but they're on a journey, and let's hope we can help them accelerate.

      Rachael: Hope they get there soon. There are so many questions still; we could keep talking for hours, and I don't know if we've got any answers.

      Rob: Well, I think we have a lot of answers. I think as an industry, we desperately need to be more empathetic toward these protectors and defenders, because of where they're at in the implementation journey. Burnout is so high, and job satisfaction is low. Look, we're humans. We're not talking about machines. At the end of the day, we've got to stop talking about these implementers as though they're factories where you can crank the dial up and produce more widgets. It's just not the case.


      Why Are We Treating Data and Privacy Protection Differently?

      Rob: Management needs to do better. Leadership needs to do better. That needs to happen, and then we're going to get there along the way. We've got a couple of guides on this journey: we're seeing increased awareness around data protection and privacy. Those are our guides along this journey to make us the hero. Now we need to support our heroes along the journey and stop holding them to ridiculous KPIs, like no breach ever, and if you get a breach, you're fired.

      These kinds of silly concepts. If every CEO only had successful businesses, that'd be an interesting world. But I don't know one CEO that hasn't had failures in the past. So, why are we treating our CISOs differently? It doesn't make sense.

      Eric: Spot on.

      Rachael: Well, I know we're coming up on time, Rob. I could keep talking about this stuff forever. Especially more TikTok conversations. But we'll leave that for another time.

      Rob: I certainly appreciate the conversation. I hope it was of value to you and your audience.

      Rachael: Thank you, Rob. To all our listeners, thanks for joining us this week for another awesome episode.

      Eric: Smash the subscribe button. Be aware and protect your data better.

       

      About Our Guest

      Rob McDonald - SVP of Platform, Virtru

Rob McDonald is the SVP of Platform and an advocate of safeguarding data across new applications and data-sharing workflows. Prior to Virtru, Rob was the CIO for several Acute Care facilities and Denovo Healthcare development teams. His significant expertise in the healthcare industry earned him a spot in Becker's Review as a 2013 and 2014 Top 100 Healthcare CIO. Rob has also consulted with corporations to help them assess their current information security position and develop a plan to not only mitigate the discovered technical shortcomings but, more critically, to raise security awareness amongst their employees. Rob holds a Bachelor of Science degree in Computer Science from the University of Texas at Dallas and is a perpetual student of technology, information security, and privacy practices.