
Innovation, Convenience, Entertainment and Predicting Privacy with Jill Aitoro (Part 2)

About This Episode

We’re back with part two of our discussion with Jill Aitoro, SVP for Content Strategy at CyberRisk Alliance, where we pick up the conversation on privacy today and the efforts by big tech and government to protect sensitive information.

We also dive into the slippery slope of consumer apps and health information used for convenience and, for some, entertainment, and the realization of how that information could be shared and used by third parties in the coming decades (and how the checkboxes you might mindlessly click today could come back to haunt you).


      [01:13] The Overturning of Roe v. Wade

      Rachael: Eric and I pick up our conversation with Jill Aitoro and dive into the privacy topic that you can't escape on social media, in the news, and everywhere in between. I'm going to hand it off to Eric, and let's get to the point.

      Eric: Let's transition a little bit and talk about accessing personal information. I know this is sensitive to Rachael. I'll lay it out. We had, in the last two weeks or so, in June of 2022, the overturning of Roe v. Wade. As a result of the Supreme Court ruling, there are a lot of privacy implications that the decision will carry. I know very little here. As we were prepping and talking about this, I found it to be a very eye-opening topic. Tell us more.

      Jill: It is a sensitive topic. I will do my best to keep my own perspective quiet, under the circumstances, but it comes down to the potential risks tied to information about women after Roe v. Wade. There are a couple of things. Something that came up, for example, is that there are big asks coming from Democrats for Google to stop collecting and storing location information. That came almost immediately.

      Actually, that came ahead of the ruling. They were calling for that. The reason is, if location can be tracked, it could be utilized by law enforcement if abortion is being criminalized. It could be utilized by extremists, who could even track people in terms of where they are going.

      Eric: But you can do that today.

      Jill: That's true.

       

      The Big Concern

      Eric: An extremist, if they get access to your account information or, let's say, they gain access to an application that does track location data, they can do that today. Why the big concern?

      Jill: Now, it could be criminalized. Now, this could be utilized. It was a safety situation before, to some degree. Now it's a matter of this information being collected and provided to law enforcement so that these women can be criminalized. It's a degree of targeting that some are saying puts women at risk. It just has a lot of ramifications in what can or can't be used by law enforcement.

      Law enforcement using tracking information on people has always been a bit of a dicey situation, and if it can be used ahead of the crime, then that also is a relevant concern. That's what some of the situations are. Google has responded by, basically, turning off default tracking as a security feature.

      Eric: At the app level or the phone level? Because wouldn't you just go to the telco and subpoena the information, where did this phone check in, with which towers, at which times, beyond what's at the app level?

      Jill: I don't know. That's a good question. I know it's deleting the location of users. I'm not so sure it could be accessed necessarily at all if it's never turned on in the first place.

      Eric: But I think the telcos know. If you're on cellular, the telcos will know which towers your phone is connecting to. I don't know the margin of error, but within a couple of hundred feet, or a couple of hundred yards, I think, the telcos can track your phone at any given time.

       

      Jill Aitoro on How App and Phone Providers Respond to Investigations

      Eric: Even if the app or the provider, Android, or iOS in Apple's case, is disabling tracking, I would think that with a subpoena, a telco like AT&T, T-Mobile, or Verizon Wireless would be able to track the phone.

      Rachael: Do you get those emails from Google where they literally list out specific businesses you visited? "It pinged off a tower a mile away" could include a lot of different things, but this is specific, and I'm creeped out. Like, "I don't even remember going to this store."

      Eric: Well, the best thing is to leave your phone at home. Leave it on, and if you want to get paranoid, put a mannequin in your bed sleeping, put the phone next to it, and then do what you need to do. Jill, just so I understand this, and I'm certainly not an expert in this area, the concern is a young girl might live in Texas.

      She has a phone, she's pregnant, and she's seeking to go to Planned Parenthood, or she wants to go somewhere to seek guidance and counsel. The law enforcement authorities could, I guess, find out after the fact. But they could subpoena her phone records, her application records from Google, from the app provider, and understand where she went at a given time.

      Jill: Yes, and utilize it in investigations. As some of these app providers, phone providers, and device manufacturers look to respond by turning off certain functionality so that location can't be tracked, there is talk that law enforcement is going to object in general, because this is information they use in investigations for far more than just the Roe v. Wade scenario. So that's come up also.

       

      Putting Your Records on Subpoena

      Eric: If I'm accused of breaking and entering, they could subpoena my records. If I had my phone on me and it was tracking me, that would help in proving I was at the location at the time the event happened, and in prosecuting me.

      Jill: I think some of the fear is also that, in states that are seeking to criminalize abortion, it could empower citizens to provide information on women seeking those services to law enforcement. That gets into a slippery slope in terms of privacy for sure, so that's a bit of an issue.

      What re-emerges is the fact that the health apps that are out there, that many of us use, are notoriously lax when it comes to security, particularly around transparency about what they do with data sharing. There are lots of reports. Companies have gotten in trouble for vast collections of data that are routinely shared with third-party vendors without users being told.

      Depending on what women, and what everybody, have on those health applications, that also creates some risk here. There have been a lot of calls to refocus on the regulations, and the enforcement of regulations, for the health apps out there. That's another issue that's come up again because of all this.

      Eric: Unless those health apps connect to a licensed provider of some sort, I suspect they don't fall under any HIPAA requirements.

      Jill: I wish we had Jessica Davis on here to talk in-depth about it. She's our health editor. Device security is her baby in terms of coverage area. She knows this stuff inside and out, but it's limited. There are calls to change and better manage that.

       

       

      [09:24] Jill Aitoro Calls for Transparency on What's Being Sent Out to Data Collection Brokers

      Jill: Generally speaking, they send them out to data collection brokers, in the same way that all the other information is sent out from applications. You know how it is. You get the application, and you say, "Yes, it's okay," as you're loading this onto your phone, in terms of what they can do with your information. It's limited in terms of transparency.

      Eric: Well, you're granting permission.

      Jill: Which is always an issue, and this comes down to awareness. I think in the wake of Roe v. Wade, or the overturning, that's what we're hearing a lot. Everybody, but certainly women in this situation, should be cognizant of that. What is it that you're sharing, and what is it that you're maintaining on devices?

      These sorts of decisions just spotlight the need for greater security awareness in the general population. I don't think it's there now, whether it's tracking, and making the proactive decision yourself to turn that off, versus it being enforced. It's an interesting conversation that's been going on in the wake of this.

      Rachael: It's a big one. In some ways, I'm glad we're having the conversation now. I think about those DNA apps where you have to spit in a thing, and they get your DNA. But then you get, "Oh, do you want to know if you're going to get Alzheimer's?" Then there's this big disclaimer saying, "By the way, this could be shared with an insurance company." You're like, "I want to know, but I don't want to know, because what if I am? I can't get insurance. What do I do?"

       

      You Can't Predict the Future

      Rachael: "That could happen in 20 years. I can't plan 20 years ahead." It's a very heady topic. How do you navigate it? You can't predict the future and what those implications are going to be, and that's scary.

      Eric: There's so much data out there, and it's not only you. If you have a cousin or a sister who goes to 23andMe or one of those applications, your DNA is essentially out there. It's close enough that people can link things to you. It is a very scary world. When I think about that, I'm often reminded of something someone said a long time ago, I forget who: don't put anything on the internet that you don't want everybody to see. The same thing applies to your phone. With 23andMe, you put your data up there, you have to assume anybody can access it in some way, shape, or form.

      Jill: My family made fun of me because I did 23andMe to see my ancestry. There's one additional permission to then link to others and find out other people who may be related to you. I didn't want to do it. My family did it, so chances are my stuff is probably out there somewhere too, but I was like, I don't necessarily need everybody to know that information. They can know I'm Italian. I don't need them to know who I'm related to.

      Eric: Even though you didn't select that option, the data is out there. You sent your information. Your family sent your information. They can correlate, just like Rachael's getting all these advertisements from Google now.

       

      Taking It to a Whole Different Level with Jill Aitoro

      Eric: If 23andMe, we'll just pick on them for one second, has a data breach, doesn't protect the information, loses a backup disk, whatever it may be, the fact that you checked that box could be totally meaningless because your data is available. Even if you did nothing, if your sister did it, I bet the bulk of that data is available where they can correlate who Jill Aitoro is and what you do.

      Jill: We always heard about social engineering, and don't share your Social Security number or the names of your kids. This takes it to a whole different level. It does get to a point where they can put components together to truly have a definition of a person, for lack of a better way to put it.

      Eric: Based on your experience and your expertise, what would be the best outcome here? How would you like to see it resolved?

      Jill: Which are we talking about?

      Eric: With data privacy, whether we're talking about Roe v. Wade being overturned or any data privacy. I would think if you're into the Second Amendment, you don't want people tracking you going to the range. How would you see it resolving, let's say with Roe v. Wade, or with health apps and health data? What would be ideal?

      Jill: I think there is a serious lack of regulations and compliance around regulations for these things. I'm not one to necessarily say throw more rules and restrict development or restrict innovation by any stretch. But when it comes to the management of personal information, it's ridiculous that these apps are not held to the same standard that medical services facilities are.

       

      What Jill Aitoro Finds Ridiculous About the Security Standards In Place

      Jill: I do online or virtual doctor's office visits on occasion, especially during COVID, and there are a lot of standards, based on HIPAA requirements, that you have to go through. You can't maintain a single link, you need a fresh link, everything. Yet I can save sensitive information about myself on these apps, and they have no requirement, necessarily, to put those security standards in place. That's ridiculous to me. I think that is a loophole, basically, that needs to be managed.

      You could say, "Well, it's the responsibility of the users to manage this sort of thing," but they don't. The reality is they don't. So, it is the responsibility, when this information is so sensitive, of the vendors, and the developers to put this in place. They won't until there are standards in place, and they actually get in trouble for not adhering to those standards and complying. I think that's step one.

      It's always a nice, fluffy statement, but we need better cyber awareness, I think, in general. People need to know what they're putting out there, as you said, and be comfortable with it. If you want to, that's great, but you need to be aware, and I don't think there's enough awareness.

      Rachael: These apps are treated more like entertainment, but they have very serious information on them.

      Jill: I say the apps, but it extends to devices. Device security in healthcare is huge. Along with ransomware, it's probably one of the top security issues facing healthcare now. No one's cracked how to go about solving it. That's apps on your handheld, but it's even the devices that hospitals and medical providers manage.

       

      [16:46] Where Do We Draw the Line?

      Jill: Those are not secure either, for a variety of reasons. It's an issue in healthcare because of the sensitivity of the information we're dealing with.

      Eric: I always wonder, where do we draw the line? Is my heart rate protected? Is my workout protected? What does that look like, and what's the cost? Twenty years ago, I was doing medical work with the military. We had to display DICOM imagery, and it was so expensive to get a DICOM-compliant or certified monitor. It was basically the same display as a regular monitor, but it was, I don't know, five or ten times the price. How do you do it at the right price point and drive what I would call common-sense security? Should my heartbeat, my heart rhythm, and my run time be protected?

      Jill: The problem is you don't know. It's hard to know ahead of time how this stuff can be utilized. I mean, talking about the military, remember when they suddenly realized that fitness app was showcasing location data, and a lot of military guys and women were running around doing their exercise?

      Eric: Running around the base perimeter.

      Jill: Suddenly they're like, "People can track where people are around the world with this thing." Who would have thought that would be an issue or a security thing?

      Eric: Former military here, they should have thought. I think they were sharing on Strava or Garmin Connect, or something like that. The bottom line is, you're running the base perimeter. I don't know, maybe they figured, "The adversary built the base. They should know the perimeter." But I hear you. This accessibility to data is incredible.

       

      Jill Aitoro Tackles the Challenges in Innovation, Development, and Partnership

      Jill: It goes back to, what is protected and what isn't protected? It also goes back to the information-sharing aspect, because if you go too far in one direction and don't put the definitions in correctly, then we go back to a state where everything is locked down, and that's a challenge too. It's a challenge in information sharing for cybersecurity, and it's even a challenge in terms of innovation, development, and partnership. So it's hard. I think that's why we end up where we are in a lot of these situations.

      Eric: I see it in the government where regulatory compliance overrides capability, features, and functions. In fact, in some cases, the government drives product design because of just these intensely crazy, ridiculous restrictions for the worst-case scenario. That's not the answer either. I don't know that you want the government, if you get a bunch of policy wonks, to sit down and determine what the answer should be for the people. I'm not sure that's the best answer either.

      Jill: No, and they struggle with it, and they keep struggling with it. I remember under Carter, when he was secretary of defense, there was big talk about pulling in the Silicon Valley companies and enabling better innovation in the Defense Department. The Defense Innovation Unit stood up and still exists, which is great. But the struggle is, when these startup companies come in and the government turns around and says, "We can't actually bring it in like that. You've got to do A, B, C, D, and you have to adhere to the procurement standards of government," which are a hot mess, they walk away.

       

      Jill Aitoro Raises the Need for Balance

       

      Jill: There needs to be a balance and an understanding, and I think it's dual. The private sector and commercial companies need to be better about raising the bar of security to a reasonable level to protect the information, to protect networks, and to protect their critical infrastructure.

      The government also needs to loosen, to some degree, its standards to enable innovation and to enable partnership, and to enable better protection of data because you're working collaboratively with the private sector. Finding that happy medium will always be a struggle between the public and private sectors. It's gotten better, I suppose.

      Eric: It's a very difficult problem. I think we're getting better, but I think the problem is getting more complex. If you go back to 1938, you probably weren't worried about information sharing in the same way. Nobody was. They had to break into your doctor's office, I guess, or wherever he or she kept records back then, and they may not have even written anything down.

      Jill: I remember back when I was at Federal Times, we did a list of the 50 incidents in the United States that drove innovation in government. One of them was 9/11, of course, and the talk there was about what drove the realization that you need information sharing to enable national security, versus keeping everything locked down.

      Then you go to the opposite scenario with Snowden, saying, "You need information sharing, but you also need to do it in a secure way." So it's like this evolution, a lot of it having to do with how we develop and how technology advances. We're figuring it out as we go along, and it creates problems. It's interesting though.

       

      [23:09] Jill Aitoro Commends the Companies Pushing for Innovation

      Rachael: It is. I'm still excited for the day when maybe we just have the little chip thing and I could pay for everything with my wrist.

      Jill: Talk about Big Brother.

      Rachael: I don't want to have to carry anything anymore. I just want to, "Boop, boop," and I'm out.

      Eric: It's called Apple Pay. You have it today, Rachael. I'll work with you.

      Jill: I will say, if anyone's going to end up doing that eventually, giving you that chip, it's Apple.

      Eric: Maybe.

      Jill: I'm aware of that.

      Eric: In defense of Apple, they're pretty good with privacy. They're certainly not perfect, but I think they try harder, which might be Avis' slogan, but I think they do try on the privacy side.

      Jill: Which is good, because they're among those companies, like I mentioned, that are pushing the envelope in terms of innovation, obviously. You need those companies to be at the front in terms of these standards and hold themselves to a high standard.

      Eric: They have the scale and the ability. With their profit levels, they can do these things more easily, maybe, we'll say. Certainly not perfect.

      Jill: That's the problem though. I think it was Kevin Mandia I remember speaking to, years ago. This was when he was still at Mandiant, when no one knew who Mandiant was, and it came out with this big report on China. He described the small businesses and companies as the sieve, and they are. It's great when you see an Apple and a Microsoft doing what they're doing, but there are a whole lot of small companies that are not capable because of resources or expertise.

       

      Supply Chain Issues

      Jill: That will forever be the problem. We've seen it so many times in the last few years alone. The supply chain issues, for sure.

      Rachael: We could talk about this stuff for days. It's frustrating when there's not an easy answer. I'm lazy, Jill. It is what it is. I just want an easy answer, and to move on to the next thing.

      Eric: Okay, Rachael. Leave your phone at home. Don't wear an Apple Watch. It doesn't sound like you have one, but leave it at home, take a walk today, and you'll be relatively protected, information-wise.

      Jill: Go out into the wilderness.

      Rachael: What if I hurt myself and I need to call someone?

      Eric: Everything's a trade-off.

      Jill: Isn't that funny though? I have a 12-year-old, and we got him a phone this year. The rationale was, what would we do if he wants to go see his friends? We need the ability to reach him. Meanwhile, I was gone in the morning and didn't come back until it was dark, and my parents had no idea. Times change.

       

      After SolarWinds

      Rachael: It worked out okay. This is a great way to kick off a Friday. Thank you so much, Jill Aitoro, for joining us. These are the best conversations.

      Jill: It's super fun, anytime. I think the last time we talked was after SolarWinds. We were like, "This is the biggest thing we've seen in years. This is going to go down as historic." We've had like four or five more of those things since then, so there's always plenty to talk about.

      Rachael: I know. Imagine what the next six months will bring.

      Eric: We'll all be employed for a very long time, if we want to be, in this business.

      Rachael: Every day is something new, every day. To all our listeners out there, thanks again for joining us for another week. As always, please subscribe. It makes it easy; you get a fresh episode coming directly to your inbox every Tuesday. Until next time everybody, y'all stay safe.

       

      About Our Guest

      Jill Aitoro - SVP Content Strategy, CyberRisk Alliance

      Jill Aitoro has more than 20 years of experience editing and reporting on technology, business, and policy. Prior to joining CRA, she worked at Sightline Media as editor of Defense News and executive editor of the Business-to-Government Group. She previously worked at Washington Business Journal and Nextgov, covering federal technology, contracting, and policy, as well as CMP Media’s VARBusiness and CRN and Penton Media’s iSeries News.