
* Privacy Not Included, and What You Can Do About It with Zoë MacDonald


About This Episode

This week we deep dive into privacy with Mozilla Foundation’s *Privacy Not Included content creator Zoë MacDonald. She shares fascinating insights from the deep research the *Privacy Not Included team undertakes to assess just how private your data really is when you use popular apps, drive your connected car, and more. It was quite eye-opening just how little privacy there is for connected car owners, who give up all kinds of privacy in the name of modern convenience.

In fact, Zoë breaks down how and why all of the 26 car brands researched earned the *Privacy Not Included warning label. (Hint: that’s not a great thing.) She also shares some insights on how *Privacy Not Included got started in 2017 and the awesome buying guides the team has been putting out to help everyone learn more about protecting their privacy with the products and services they use every day. Check out http://privacynotincluded.org to learn more!


      TTP Ep. 259—Privacy Not Included, and What You Can Do About It with Zoë MacDonald

       

      [1:19] A Conversation with Zoe MacDonald, the Mind Behind Privacy Not Included

      Rachael: So we are going to have such a fun conversation today. I love this theme, so please welcome Zoe MacDonald. She's a writer and digital strategist with the Mozilla Foundation, where she helps shine a light on businesses that violate users' privacy by researching companies' privacy agreements. The culmination of that research becomes the *Privacy Not Included buyer's guide, which helps people better understand the safety and security behind products that connect to the internet. Welcome, Zoe.

      Zoe: Thank you for having me. Glad to be here.

      Audra: Excellent. So Zoe, can we jump off by you actually telling us more about the *Privacy Not Included guide, how it works, and the areas that you focus on? Just because there may be a lot of our listeners who've not heard of it before.

      Rachael: Yes. And so many different areas too. I can't imagine how you prioritize, as well.

      Zoe: Yes, absolutely. It is difficult to prioritize. *Privacy Not Included was founded in 2017, and it was basically developed by our leader at Mozilla, Jen Caltrider. She noticed that there was a big gap between the increasing interconnectedness of our devices and, at the same time, the awareness of what data these devices are collecting and what these companies are doing with that data. Whenever you see a big gap like that in the startup world, it's either because nobody cares or because nobody has really highlighted it yet. And the only way you can really find out which of those it is is by testing it. So we launched *Privacy Not Included at that time as a holiday guide.

       

      Zoe MacDonald Reveals Troubling Findings on Privacy Policies

      Zoe: It turned out to be the latter, or the former, whichever one means people cared. People cared a lot about what their connected devices were collecting and sharing about them. The purpose of the guide is to give people the information that they need to be able to make smart choices about their privacy when they purchase these products.

      Audra: So let's talk about some of the bigger offenders of our privacy. Let's talk about cars. You've actually found that various car companies, and that industry overall, have some of the worst privacy policies that your team has ever researched. What are some of the findings that brought you to this conclusion?

      Zoe: It's a big question, because they're doing basically all of the things that we don't like to see. One of the things I just want to underscore off the top is that it's not normal for us to take on an entire category for a buyer's guide and have every single product fail our criteria. I think that's partly why we've garnered so much attention with this report. We reviewed mental health apps for the second time earlier this year.

      And we would also consider them a poor-performing category, something that we're worried about. But even then we had a couple of best-ofs, and then a handful of middling products that get our sideways thumb. For the warning label to be applied to more than half the products is a huge red flag. So for every single one of the car brands to earn the *Privacy Not Included warning label was definitely a shock to us, and something that we're really worried about.

       

      Zoe MacDonald Sheds Light on Data Control and Security Standards

      Zoe: We look at a few things. We look at the data that is collected and what's done with it. We look at consumers' control over that data. And we look at whether the companies meet a very basic set of security standards, what we call our minimum security standards. Then we also look at the track records, because there are a lot of companies out there that say all the right things, but then in their history, and we just look at the past three years, what they do doesn't always match what they say. So it is a really important grounding factor to also look at, okay, you say you treat data X, Y, Z, but how has that worked out for your consumers and your clients in the past?

      Rachael: I was just going to say that I recently upgraded my 17-year-old Acura, which was amazing but not connected, to a connected car. I'm trying to remember, was there ever even a privacy agreement when I signed up for the app? I don't know. I mean, this thing follows me everywhere, but I didn't even think about it. How crazy is that? When I got this app it's like, oh, we can see where you're going, how much gas you have, all the things. Literally, it didn't even occur to me, Zoe. Isn't that crazy?

      Zoe: Yes, it is crazy. It's not right how obscure it is. And the fact that nobody reads privacy policies is like the oldest joke on the internet. And I think that car makers are really, really counting on that. Because all we're really doing is elevating things that are written in black and white on their websites as part of their so-called public privacy policies. 

       

      Unmasking the Secrets Hidden in Privacy Policies

      Zoe: But they know for a fact that people aren't actually aware of what's going on in the fine print. And in fact, the way that the policies are written, it's clear that they don't expect them to be read and understood, because there's a lot of bad stuff in there, for lack of a better word.

      Audra: So, a question on that. Normally if companies are collecting your data, whatever that data may be, and it could be where you're driving, how much gas you have, and all that sort of thing, businesses justify why they're collecting it and why they're holding that information. Have you guys gone back to the car industry, like individual manufacturers, and said, these are our findings, and had any response or justification for what they're doing?

      Zoe: There are certainly a lot of legitimate reasons for car companies to have your data. Especially your driving data and some technical data and probably some of that feeds into some safety features as well. But what we found is that there are a lot of reasons for which they're using this data that could not be reasonably tied to anything that's going to help you get from point A to point B more safely. 

      Like, for example, they're selling personal data. Most car companies are selling your personal data. So it's hard to imagine a scenario where that would work for your own benefit. And as far as engaging in a conversation with the car makers, another thing that has been unique about cars as a product and as a guide is that they pretty much fully ignored us. We always send a series of emails.

       

      [8:17] Zoe MacDonald Exposes Car Manufacturers' Privacy Policy Silence

      Zoe: We send them three emails and then a warning email that they're receiving our *Privacy Not Included warning label before we publish the research. And it was basically crickets across the board. We did get answers from three different companies, but they still didn't answer our questions. They at least acknowledged that we were writing to them. But aside from that, they didn't express a whole lot of interest in collaborating with us, providing us answers, or engaging in any kind of conversation with us.

      Audra: Because one of the things, I admit, I went down a rabbit hole on in my research was how Nissan collected information about sexual activity; that was in one of the articles. And is that even legal? I mean, are you raising this with car manufacturers? Because, one, how do they know someone's having sex in a car? That's one I'm really concerned about. And two, how would they use that data and sell it on to, I don't know, other businesses to market at you?

      Rachael: Maybe it's future design.

      Audra: More comfortable seats, just saying.

      Rachael: I don't know, just spitballing here.

      Zoe: Yes, I think one of the creepiest things for us is that there's just so much that we don't know. We don't know how they're collecting this information. We actually don't even know if they're collecting it. But what we know is that because they've said in the privacy policy that they may collect it, they legally can collect that information.

       

      Unveiling the Privacy Predicament

      Zoe: And it's interesting because, specifically, in one case it's information about your sex life and in another case it's sexual activity data. The car companies responded that they're not collecting that, that they don't have any intention of collecting that. But what they've done is basically just taken the broadest possible definition of personal data and kind of copied and pasted that into their privacy policy. But that's not a very good defense, because that means that they absolutely are casting, they're admitting to casting, the widest net possible to just collect whatever they may want to collect now or in the future.

      So that's kind of frightening. And as for what they're doing with it, as I said, they are selling it. We know also that they're sharing it widely. They use your personal information to also create inferences about you that they then sell. So that could be part of what's motivating them to have such a data-maximalist approach, because then, in theory, the more you can infer, the more you can sell and share and use to your own advantage.

      I mean even without sharing that data, it can be immensely valuable to these car companies and their affiliates. Because they're massive companies on their own. And some of them even have their own data processing startups as part of the organization as well.

      Audra: So can people opt out?

       

      Zoe MacDonald Unveils the Challenges of Opting Out in Auto Manufacturers' Privacy Policies

      Zoe: Aside from "how does Nissan know about my sex life?", that has been our number one question. We did a Reddit AMA, and so many people asked it in different formulations: how do I turn it off? Is there a chip that I can carve out from my dashboard? What if I don't use the app? Is it mostly about the app? And it's disheartening, because we don't really have much to offer in the way of advice about how to opt out. These auto manufacturers are using implicit or assumed consent for the most part, and we just don't really have any super solid recommendations about how you can mitigate this mass collection of data. And it's unfortunate.

      Rachael: That's crazy. I have to say, if they're going to sell my data, I want a little kickback. I think we should all get a little piece of that pie. Since you're capitalizing on my personal information, I want to sell my own information. I'd like to have that ability.

      Audra: You could rent it out.

      Rachael: Rent it, yes, lease it.

      Zoe: I think that's legit, a 12-, 24-month lease. Yes, I think it's completely legitimate. Absolutely. It sounds radical, but you should be in control of your own data and you should be the one who's benefiting from it. Especially when you are already paying for your car and its many services, there's no reason for your car maker to be exploiting you as their own personal side hustle.

       

      Car Manufacturers' Troubling Cybersecurity Shortcomings Unveiled by Zoe MacDonald

      Audra: So considering the amount of data that they're collecting, supposedly, based on their terms and conditions, do you guys look at anything around their cybersecurity and how they're protecting that data? If I've bought a new car, I'm going off and having a wild time in my car and all that sort of stuff, I want to make certain my data is protected, I'm driving to interesting places and all that. How are they actually protecting that? Do you look at that side of things?

      Zoe: Yes, great question. And you would think that the car companies could at least be counted on to protect the data, because it's their business asset. Unfortunately, 68% of the cars that we looked at earned our bad track record ding, which means they had a data breach, a hack, or a serious leak just in the past three years. That number would've been much higher if we had gone back further.

      And then, as I said, we also could not confirm whether any of the cars that we researched met our minimum security standard. So that also raised a lot of alarm bells for us as well. So yes, we don't necessarily trust the data in the hands of the car manufacturers. But there is also a concern that we have that the data could end up in even worse hands, like strangers, hackers, and cybercriminals.

      Rachael: Right.

      Zoe: People like that.

      Rachael: Yikes.

      Audra: So can I ask for a positive? Even though they're collecting loads of data, and people aren't necessarily certain of what, did anyone, any of these companies, come out well from a cybersecurity perspective? Anyone at all?

       

      [15:15] Raising the Bar for Automotive Cybersecurity: A Discussion with Zoe MacDonald

      Zoe: From a cybersecurity perspective? No. Well, what I can say about cybersecurity is the bad track record. As I said, that is most of them. So that's not good. The other thing that we looked at was the minimum security standard, and we couldn't confirm whether any of the car companies met our minimum security standard. And that's, I mean, not necessarily a low bar, but it is a minimum security standard. It's not the state-of-the-art security standard here. So I was actually pretty surprised that none of them met it.

      Audra: So the bar is down here, and they're still looking down below that, below the bar.

      Zoe: Yes, that's what I'm trying to say.

      Rachael: Wow.

      Zoe: Exactly. I mean, it's like a medium bar. Something as sophisticated and technologically advanced and well-designed as a car should be able to clear that bar with no problem. Especially because of the volume and the intimacy of the data that the car collects, and then how we trust these cars with our lives.

      Rachael: It seems like, if we're going to spin this into an opportunity though, perhaps even one or two or three of these car companies or car brands could make this a differentiator as people start to become more aware that my data's out there in the ether and it's not necessarily secure, and there's so much that you do in your car.

      Where you go, calls that you make through the interface. I think if Audi or whoever, let's say, and I'm making this up, were like, you know what, we take your privacy first, privacy by design. Do you think that could become a differentiator for car brands ahead?

       

      Zoe MacDonald Discusses the Future of Car Privacy and Consumer Awareness

      Zoe: It could be, and it probably is already for some of the more luxury car brands. But the reality is, for the average consumer, for the average driver, they're not going to be able to take privacy into account when they're shopping for a car. There are so many other factors that people have to look into, whether the car is even available in their area, or the safety features.

      The budget is huge, obviously. So for a select few, this could probably be a differentiator for a more specialty brand. But we'd still be concerned about the Fords and the Chevys and the Hyundais.

      Rachael: It seems like, and maybe we're still too early in this whole conversation about privacy, but there are certain things that seem like they should be table stakes, I guess. And I don't know when we're going to get to the point where privacy becomes just table stakes and there are other ways for them to make money that aren't off of my personal information. How far out do you think we might be from that, Zoe? Are we talking like 10, 15 years? Specifically as everything's going electric, that kicks up the whole digital aspect exponentially higher. And I'm reading that Audi's going to get rid of some of their cars so they can bring on 15, 20 new electric vehicles. Every manufacturer's looking into this, and it just seems like this could really run amok even more, very quickly.

       

      Navigating the Era of Connected Cars

      Zoe: Absolutely. And I think one thing that we realized in having these conversations is that people don't necessarily understand about modern cars that this distinction between a so-called smart car and a regular car doesn't really exist. That's just the way of the future now, and especially, as you said, in the next five or so years. Things like telematics used to be opt-in; it used to be a dongle. But now these are increasingly just built into the car systems, harder to opt out from, and that just means fewer consumer choices.

      Rachael: Wow.

      Audra: So are there any ways that people can actually protect themselves in terms of the information that they do share? Is there anything that's in control of the drivers themselves that they should not be sharing through their cars?

      Zoe: There are little things that you can do. I mean, you can choose not to connect your phone to your car. You could not use the car's app, because that of course unlocks access to your phone. And oftentimes both the app and the car share the same privacy policy, so we can't be sure which information is being drawn from which source. So in that sense, less is more. The connected services are also a big source of information, or data, that your car would create and extract about you. But compared to all the information that is collected by the car automatically, it feels like a drop in the bucket.

       

      Navigating Privacy and Convenience in the Age of Connected Cars: Insights from Zoe MacDonald

      Zoe: At the same time, I think that saying, hey, don't connect your phone, don't use the connected services, probably makes your experience with the car that you purchased a lot worse. It's hard to make that recommendation to people when it doesn't make a huge difference, but for your personal experience it might have a big negative impact.

      Rachael: And that seems to be, I think, the crux of this conversation, right? Because depending on people's generation, what have you, there are those who are like, whatever, man, my life's out in the open, what could they possibly exploit from me? And then there are others who are like, I'm locking it down, I'm not going to have social media accounts. So there's this spectrum of those willing to give up privacy for convenience. But at what cost, ultimately?

      Zoe: And I think it really bothers people in this case that they don't have the choice because a lot of people feel that they have to drive. A lot of people do have to drive to support the life that they want to live. It's not really fair to not be able to opt out of that kind of data collection and that sort of violation of your privacy.

      Rachael: Absolutely. I mean, unless you're going to ride a bicycle everywhere.

      Audra: Or buy an old car.

      Zoe: Well, so that's the other thing. I'm tempted to tell people to buy older cars, but cars are also getting much safer. Every five to 10 years, they make major improvements to car safety, and certainly some of the data that's being collected is being used to that end. And that's good, that's fine.

       

      [21:59] Exploring Solutions with Zoe MacDonald

      Zoe: But the information should be walled off in a sensible way so that it's used to help us. Even for infrastructure, driving data can be useful. It's just that it's not helpful for it to be packaged and sold, for example, or just collected for the sake of building a very robust personal profile about you that can then be shared with a bunch of people who want to sell you things.

      Audra: So there's always the option of this being a new business, someone coming up with a privacy solution that you can actually implement in your car and that limits what kind of information is sent out about you. Just putting it out there as a business idea.

      Rachael: Like a signal blocker so that none of the signals work and they can't track anything.

      Audra: That's an option.

      Zoe: I'm also afraid of, I'm wary of, those kinds of tools and those kinds of innovations, because a lot of the privacy policies, Tesla is just coming to mind, but there are others as well, basically said: sure, you can turn off the data collection in your car, but we can't promise, we can't say, what'll happen to your car as a result. Basically, it might be a little bit, or a lot, inoperable. And so things like that worry us. So we don't want to encourage people, just for the record, to start messing around with their cars to try to disconnect something or attach something or block the signal, because we're concerned that it would cause a safety issue. Those parts are there for a function.

      Audra: It could be an innovation, though, for the car manufacturers, an additional sell-on, an upsell.

      Zoe: Absolutely.

       

      Revolutionizing Car Privacy

      Audra: Like racing seats and then kind of a data blocker and that sort of thing.

      Zoe: Or how about that by default and then you can opt into sharing your data for some kickback that you get from the car company? That would be cool.

      Audra: That one sounds good too. That actually just simplifies the whole thing.

      Zoe: There you go. Yes, being paid for your data would be nice.

      Rachael: Right? It's like crowdsourcing, right? And we all get to benefit. I love that.

      Zoe: Exactly.

      Rachael: I think we got a call to action here, Audra.

      Audra: We do. I'm going to agree. I want to sign the petition.

      Zoe: Sign the petition at privacynotincluded.org. That's our attempt to ask the car companies to do the right thing. But failing that, what we're really hoping for is more robust privacy legislation in the US, because, to go back to what you were saying earlier about table stakes, I think that's really the only way to get there, to just kind of force these companies to fall in line.

      Rachael: And I'm wondering too, we talked a little bit about the White House executive order on AI that came out this morning, and I know it's fresh off the presses, so I haven't had a chance to dig into it, but privacy is a piece of that. And you wonder whether there is some kind of follow-through goodness as we look to AI, AI applications within cars. Maybe that could be a step to help start getting some of this privacy regulation in force, based on applications, by default, or, I don't know.

       

      A Conversation with Zoe MacDonald on the Risks of Sensitive Information in the Digital Age

      Zoe: Yes, absolutely. I think that all this conversation about AI and about AI regulation is also shining a light on the need for privacy regulation because of these privacy breaches that are happening in the context of AI and free AI tools.

      Rachael: Yes, that would be great. Can we talk about the 23andMe breach? This person, was his name Golem? Golem released what, 4 million records out there? I don't know, you guys probably track these kinds of applications, right? I know you had the mental health applications, and I don't know if this category has come up within your research. But that's very private information, and I can imagine if you were to sell it to insurance companies, for example, preexisting conditions.

      Even though it hasn't hit yet, you have a 78% chance of, I dunno, dementia by the time you hit 74. That could be very dangerous depending on whose hands it's in, and how that information was gotten by certain industries, let's say. I mean, that's a little concerning too. I don't know. What do you think, guys? I'm flabbergasted.

      Zoe: I totally agree. I think it's very scary. And something that we learned when we were researching mental health apps is that when people are disclosing very private information, like chat transcripts with someone who is a coach slash therapist, for example, sometimes there's an assumption that because this information is so sensitive, it will be treated with extra care, and that would be the same case for genetic or ancestry apps and websites. But that just unfortunately isn't always the case.

       

      [27:15] The Origin of Privacy Not Included with Zoe MacDonald

      Zoe: And then even when that information is treated with care from a cybersecurity standpoint, nothing is a hundred percent and nothing is totally bulletproof. So you can still have these data breaches, and the consequences in the case of your genetic information, which, by the way, is also a line item that the cars' privacy policies say they can collect, are really high stakes. That's another weird one to me.

      Audra: Absolutely, absolutely. So can I ask, what was the catalyst for the creation of *Privacy Not Included? What's the project backstory? What was the event or events that led to its creation, the tipping point?

      Zoe: Yes, I don't know that there was any single thing. I think it was just, like I said earlier, an acknowledgment that there was this huge gap: there are connected products, they are doing things with your data, and nobody really knows what, how, and when. That gap needed to be filled, and consumers needed to be empowered with that information.

      Audra: I was wondering whether it was kind of the Internet of things kicking it off when you have your coffee machine that wants your date of birth and your sexual preferences and things like that and you're kind of going, it's a coffee machine.

      Rachael: It's got the asterisk next to it where you have to answer it or you can't submit the form. Right?

      Audra: Exactly. My washing machine, I looked at what it would take to set it up. It had other tunes it could play, so I was going to have different music for my washing machine. And when I read what information it wanted from me in order to connect it, I'm like, that's not happening.

       

      Bridging Privacy Gaps

      Zoe: I think that's exactly it. I think that people do have a lot of questions like that. Why does my washing machine need to know my sexual orientation or something like that, right? And what are they doing with it? At the same time, people just do not have the time and resources to read thousands of privacy policies for all the things that they have to begrudgingly click accept on throughout the day. So we're trying to help bridge that gap and kind of put the fine print in bold print.

      Audra: So can you talk about any positives, positive effects that your project has had on getting people like businesses to change how they're handling privacy and data?

      Zoe: Yes, absolutely. I mean, aside from cars' general non-responsiveness and seeming not too keen on cooperating, we have had a lot of companies in the past that have been really eager to work with us and learn: how can I improve my privacy policy? What do you recommend that I do? I know that we spoke to Garmin specifically. Their privacy policies are already quite good, but we made a recommendation that they allow users to access and delete their personal information no matter where they live in the world.

      And they ultimately took that into consideration and said, yes, okay, we'll add it in there. When we were doing the mental health apps, I think it was Wobo and Modern Health and a few others, we helped them improve the language in their privacy policies, and improve either their cybersecurity or their data control for their users, their consumers.

       

      Empowering Consumers and Raising Awareness

      Zoe: So we have seen those small, incremental changes on a very direct level with companies. I think some of the biggest impact we're having, though, is more just bringing things to the forefront, bringing things into the public conversation, and making consumers aware of what the status quo is with privacy.

      Audra: I think awareness is key. I really do. There are so many people who don't realize, particularly with the connected devices that they have in their homes, what kind of information is being collected.

      Zoe: Exactly. And I think it's very clear based on the attention that this and other guides have gotten that people do care. They absolutely care about their privacy. It's just that they're not aware of what's going on and they have so many unanswered questions.

      Rachael: Well, particularly the language a lot of times. I mean, I know they have to use legalese or whatever, but kind of like what you were talking about before, they use it in the broadest possible way: we're going to use it in ways we haven't even figured out yet, but we're going to get it just in case.

      That just seems wrong. How are they able to do that? Well, I guess it's not opt-in, but because it's required to use the platform, you could either not use the car or you could have your data taken. That just seems, I don't know, it just seems wrong. I don't know how else to characterize it, but they're able to do it. And is it because there's no real oversight? I mean, there's no legal oversight, right? There are no repercussions for them doing this, correct?

       

      Zoe MacDonald Advocates Clarity and Reform Amidst Data Collection Concerns

      Zoe: Yes. I mean a lot of what they're doing, a lot of what's stated in their privacy policies isn't against the law in the United States. That's why we need stronger privacy legislation.

      Rachael: Yes, it's not like GDPR.

      Audra: And I appreciate GDPR. It was a pain to put in place, but I actually really appreciate what it is and that it does control the information. In the US, though, if you ever want to look anyone up and find out all sorts of stuff about them, you pay a couple of dollars and you can get a massive report. That's not the same over here, and I prefer it that way, not having all your information, or anything about you, out there.

      Rachael: How do we feel about cookie policies? No, sorry to cut you off, Zoe, but it makes me think of the cookie policies. Every website I go to, you have to deal with the cookie policy. And while I appreciate having options, generally, if you're doing that on the websites, it seems like there should be more opt-in, I guess, is what I'm getting at. I don't know.

      Zoe: Yes, I think that's true. And to your earlier point, they don't have to write these privacy policies in legalese. They don't have to be so vague. They could make it simple and easy for people to understand. As many brands and products that respect privacy do, they can have very clear privacy policies. And the same can be said for cookies. 

       

      [34:26] Zoe MacDonald Examines How Companies Nudge Consumer Choices

      Zoe: They can make it easier for you to opt out. But if that's not what they want you to do, then they are using deceptive design or fine print or confusing language, basically, to nudge you towards the choice that they prefer that you make. But it is possible for these companies to make it easier for consumers. That's just maybe not always in their best interest.

      Rachael: Yes, because convenience, I mean, sometimes you're just like, ah, I just accept it all because I just want to go look at this thing. You have to take that extra step. It's like multifactor authentication. It's kind of a hassle, but it works. So it's conflicting.

      Zoe: For sure, yes.

      Audra: I admit that recently, when it keeps asking, do you opt into everything, I keep rejecting it and then seeing whether websites still work. And so far I have not hit anything that stopped me from accessing and looking for the information I wanted on the website.

      Rachael: That's a good point. I haven't really thought to test it like that. I want all my suggestions. And I only shop online, so I want all the suggestions. I don't want them to be stingy with the suggestions.

      Zoe: You like the targeted advertising. Fair enough.

      Audra: A proactive consumer.

      Rachael: That's right. Well, that's what they say. It'll impact what we share with you as options. And I'm like, don't take it away from me. But I guess that's what's interesting. There are so many divides in how people feel about what they're willing to give up. I always wondered if there was a price. 

       

      Navigating the Murky Waters of Data Consent and Privacy Policies With Zoe MacDonald

      Rachael: We used to joke about this: how much would you be willing to sell your social security number, or your driver's license number, or your birthday for? I feel like the birth date ship has sailed, right? Everybody, your birthday's out there. Maybe it's the right one, maybe it's not, but it's out there. But what's it worth to you? I think that should be another thing, not just, I'll give it to you. Who was the guy that started that company where he put his social security number on the side of a truck and drove around and he's like, I dare you to hack me? It was one of those companies. Yes, I don't know if they're still around.

      Audra: He's probably been hacked and is now living somewhere else.

      Zoe: However, we distributed a consent survey. I think this was following the launch of our mental health apps research, because at that time we published an article about how mental health apps in particular, and cars do the same thing, kind of manipulate the meaning of consent. They're really stretching consent to the point where, as I said, it's implicit or it's assumed. Side note: I think it's Subaru that says in their privacy policy that just getting into a Subaru means that you've consented to their privacy policy. So it gets pretty wacky.

       

      Exploring User-Centric Data Consent and Compensation Models

      Zoe: But we distributed a survey asking people, what does consent mean to you? What would you like to see? And a lot of people said: ask me if you can use my data, and then pay me for it. I'll opt in if you give me the chance, but I also want to benefit from that data.

      Rachael: Yes, I mean, why not just a little popup or a message, like on Zoom when someone starts recording? It's like, hey, we're recording, just so you're aware. You could choose to drop, I mean, not attend your business meeting, or you could say, I'm okay with that. And why can't they do that in a car? Just a little voice: hey, just a reminder, we're using your privacy today.

      Audra: And we're going to pay you two pounds 50.

      Rachael: Exactly

      Audra: For the day.

      Rachael: As long as you drive X amount of time and drive this many miles or kilometers, we're going to give you two bucks 50. And I say, okay, I feel like that's an equitable exchange. I think we're onto something, Audra.

      Audra: I think another business idea.

       

      Championing Privacy Conversations in a Fun, Informative Way

      Rachael: So many ideas. Well, as I said, I think we could spend days and days talking to Zoe on this topic, just so much fun, but also really important that these conversations be had. Because I love the work that Mozilla Foundation is doing. It's critical. So many people just don't know and they don't know how to get access to information. And like you said, who's going to read a privacy policy, let alone understand it? It's like stereo instructions. So thanks for the great work that you guys are doing. I'm glad we've discovered it and I'm glad our listeners are having a chance to learn more about it as well.

      Audra: Yes, me too.

      Zoe: Thank you for having me.

      Rachael: Well, it's come to that time, Audra, another end of an episode. I always have so much fun with our conversations. So to all of our listeners, hope you guys enjoyed today. We loved catching up with Zoe and sharing all of these really hot topics with you. If you have any feedback, we'd love to hear it, or other topics you'd like to hear about. And don't forget to subscribe. As always, smash that subscription button and you get a fresh episode every Tuesday. So until next time, everybody, be safe.

       

      About Our Guest

      Zoë MacDonald - Content Creator, *Privacy Not Included at Mozilla

       

       Zoë MacDonald is a writer and digital strategist based in Toronto, Canada. Before her passion for digital rights led her to Mozilla and *Privacy Not Included, she wrote about cybersecurity and e-commerce. When she’s not being a privacy nerd at work, she’s side-eyeing smart devices at home.