AI transcript
0:00:03 If the police showed up at your front door and said, Guy, you know, we’re really worried
0:00:06 there’s a lot of break-ins in the neighborhood. So what we want you to do is to leave your back
0:00:11 door open so that if the burglars break in, we can catch them easily because we don’t want to
have to pound down your front door in order to do it. Like you’d look at them and tell them they
0:00:20 were crazy. But when that’s happening in the context of digital devices, where it’s a little
0:00:26 more abstract and they use language to obscure what they’re doing, like this gets put forward
0:00:31 as if it’s, oh, law enforcement absolutely needs this. It’s crazy. And it’s bad for all of us.
0:00:41 Good morning, everyone. It’s Guy Kawasaki. This is the Remarkable People podcast. And we’re on this
0:00:46 mission to make you remarkable. So we go all over the world looking for remarkable people. And we
0:00:52 found one really close to us in San Francisco, California. Her name is Cindy Cohn, and she’s
0:01:00 the executive director of the Electronic Frontier Foundation. Wow. And in my humble opinion, that’s
0:01:08 probably the leading defender of civil liberties in the digital world. She has led this great case of
0:01:15 Bernstein versus Department of Justice, which established that software programming is protected
speech under the First Amendment. And the National Law Journal named her one of the 100 most influential
0:01:29 lawyers in America. And I love this quote about Cindy, man. I hope somebody says something like this about
0:01:38 me someday. The quote is, if Big Brother is watching, he better watch out for Cindy Cohn. Oh, my God.
0:01:44 I got to go back in your history. I noticed something doing research about you. So you got
0:01:53 your law degree in 1989 or 1990, right? Yes. And then in a mere four years, you were lead counsel for
0:02:00 Bernstein versus Department of Justice. How did you get to that position in a mere four years?
0:02:07 Well, you know, I kind of fell into it, to be honest. You got to remember, 1989, 1990 to 1994,
0:02:14 there was no World Wide Web. Technology was being done mainly by people who had high technical skills
0:02:20 out of universities and research institutions. And I happened to meet some of them. And one of them was
0:02:26 EFF’s founder, John Gilmore. And I literally met him at a party in Haight-Ashbury. And we became
0:02:32 friends. And for a while, I dated one of his friends. And he was putting EFF together, the
0:02:38 Electronic Frontier Foundation. A couple years later, he called me up. And he said, do you know
how to do a lawsuit? And I, just a couple years out of law school, really not the right person for this,
0:02:49 said, sure, I know how to do a lawsuit. He said, good, because we’ve got this guy and he wrote a
0:02:55 computer program. And he wants to publish it on the internet. And he’s been told that if he does,
0:03:00 he could go to jail as an arms dealer. And I said, what does it do? Does it blow things up?
0:03:06 And he said, no, it keeps things secret. And I said, that sounds like a problem and a First Amendment
0:03:12 problem at that. And he said, I think so too. Will you take the case? And I said, yes. I had never been
0:03:17 online. I didn’t really know very much about what these guys were doing. They were my friends. But
0:03:23 I wasn’t even, as you pointed out, I’m kind of a baby lawyer, right? I’d never done a constitutional case
0:03:31 of that magnitude. But I got lucky. And between John and some of the other early internet people,
0:03:37 and then very kind people like cryptographers, computer science professors, Hal Abelson at MIT,
and a bunch of others, they actually taught me enough about how the internet works, how coding works,
0:03:50 and how cryptography works, that we were able to mount this challenge and do so successfully. But to me,
0:03:55 I just was in the right place at the right time and had the good fortune to think my friends would
0:04:00 think I was cool if I did this lawsuit. And then the patience and support of a lot of people to be
0:04:05 able to sit at the, I always call it driving the big truck, right? To be able to sit in the driver’s
0:04:09 seat of this big truck that we drove through the government’s cryptography regulations.
0:04:17 So Cindy, are you basically saying to me that in one of the most pivotal cases in intellectual
0:04:20 property, you are faking it until you make it?
0:04:26 Totally. Absolutely. Now, I had the good sense and the luck to have a lot of people around me and to be
able to pull in people. By the time we got deep enough in the case, we had a guy named Bob Corn-Revere join our case. He’d already argued a First Amendment case in the Supreme Court. And he saw what
0:04:42 we were doing and came in and was like, let me help ground you in this kind of long history of the
0:04:48 Constitution. It’s more like we were kind of this rolling tumbleweed that started with just me. But
0:04:54 as we went along, we picked up people with expertise and support so that by the time I was standing in
front of the Ninth Circuit Court of Appeals arguing this case, I was standing on the shoulders of lots
0:05:04 and lots of giants who had thrown in to help us. But yeah, the very start of it was quite literally,
0:05:07 I thought my friend John would think I was cool if I said yes.
0:05:13 Wow. Listen, when I saw that four years after your law degree, you were leading this case,
0:05:20 I did a search on ChatGPT and I asked for examples of lawyers who had huge cases early in their careers.
Wow. And it came up with this list of Neil Katyal, Gloria Allred, Sherrilyn Ifill, Preet Bharara,
0:05:35 whatever. Yeah, very famous lawyer, very good lawyer.
0:05:41 Yeah. And I looked at that list and you were faster than all of them. So I said, oh my God,
0:05:45 Cindy is the Caitlin Clark of civil liberties. My God.
0:05:52 Oh, as an Iowan, you could give me no higher compliment than comparing me to Caitlin Clark.
0:05:53 Yeah. She’s from my home state.
And Cindy, I will confess to you that I checked to see if you were related to Roy Cohn to see if there
0:06:07 was any nepotism involved, but there is not. No, no, no. I don’t want to be related to him,
0:06:12 but yeah, my family doesn’t come from that. Neither of my parents went to college and I was lucky enough
0:06:16 to get to go. I would not brag about being related to Roy Cohn.
No, I do not want to be related to him, but I did not come from a highly educated, lawyered-up family.
0:06:26 While we’re on the subject of Bernstein versus Department of Justice, if the Department of
0:06:30 Justice had prevailed, what would be different today?
0:06:35 I think that what would be different today is that we’d have even less security online than we do
now. Now we still have a lot to do. I don’t claim this fight is over; it’s an ongoing fight, and the attacks on encryption keep coming. The UK is horrible right now. Australia passed a bad law. But we have Signal.
0:06:54 We have HTTPS, right? The ability to go from your browser to a website without that information being
0:07:01 in the clear so you can access information without it being immediately tracked. We have basic security,
0:07:05 right? When you turn off your phone, it’s encrypted. So if you lose your phone, you don’t lose all your
0:07:11 data or access to all your data. That’s because Apple put encryption into the actual device so that
0:07:16 your data is encrypted. When you turn the device off, the same thing’s true for most computers and
0:07:24 phones now. Encryption is really baked into so many things we do. And I think it would be not baked
0:07:28 into nearly as many. I think the government would have ultimately had to let us have some security
0:07:35 online, but it wouldn’t have been nearly as pervasive and we wouldn’t be able to continue to
0:07:41 deploy it without government approval. And parts of governments really understand the need for strong
0:07:46 security. I would say NIST and some of the other parts of government. But when you get over on the
0:07:52 law enforcement side, they’re really, really hostile to it. And we’ve had an upper hand in that fight since
0:07:59 the 90s because it got taken off the munitions list and it wasn’t regulated. So we could go ahead and
0:08:05 innovate first and not have to go to the government on bended knee and beg for permission. And I think
0:08:10 that’s benefited all of us in terms of even having the security we have now online.
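To make the point about encryption on the device itself a little more concrete, here is a minimal sketch of the idea in Python, using the third-party `cryptography` package rather than anything Apple or Android actually ship. The key handling and data below are illustrative assumptions only; on a real phone the key is derived from your passcode plus hardware-backed secrets.

```python
# Minimal sketch of encryption at rest, using the third-party
# "cryptography" package (pip install cryptography). Illustrative only:
# real device encryption derives the key from your passcode plus
# hardware-backed secrets, not from generate_key().
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # stand-in for the device key
cipher = Fernet(key)

stored = cipher.encrypt(b"contacts, photos, messages ...")
print(stored)                    # ciphertext: whoever finds the lost phone sees only this

# Only someone holding the key can recover the plaintext.
print(cipher.decrypt(stored))    # b'contacts, photos, messages ...'
```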
0:08:17 Now, when you refer to the United Kingdom just now, is that the request they’re making of Apple so that
0:08:21 there’s a backdoor to the encrypted iCloud files?
0:08:27 Yes. It’s hard because the UK has even more secrecy around these things than we have in the US. So we
0:08:33 don’t know exactly what’s going on. What we know is that Apple offered something, I think it’s called ADP,
0:08:38 Advanced Data Protection, or it might be APT. I might have that wrong. But basically, you could turn on a
0:08:43 switch and have advanced protection. And that would mean that your iCloud backups were encrypted. Now,
0:08:48 we think that should be the default and not a switch you have to turn on. But that’s okay. At least they
0:08:52 offered it. And that’s really important for human rights defenders, for journalists, for people who find
0:09:01 themselves targeted for espionage as well as by governments. And we know that the UK government has
0:09:07 demanded that anybody who provides you with a service or a device have access to the plain text of
everything you do on your device. And this is the way they talk about it: they don’t say we’re going to ban
0:09:15 encryption. They always say, Oh, we love encryption. We’re not banning it. But we’re just going to make
0:09:20 sure that the people who provide you with services and devices always have access to the plain text. And
0:09:24 there’s no way to read that as anything other than denying you encryption, right? Real encryption.
So we know that the UK government issued something to Apple; we think it demanded that Apple provide access to the plain text, and re-engineer the device so that they could always have access to the plain text. And we know that what Apple did in response is say, well, we’re just not going to let
0:09:46 anybody turn on this extra protection in the UK, because I think they didn’t want to have to downgrade
0:09:52 it. It’s pretty hard to downgrade it just for UK people. And of course, if they downgrade it for
0:09:57 people in the UK, that’s anybody who’s talking to anybody in the UK. So that affects all of us,
0:10:03 or most of us. So we think that’s what’s going on. We haven’t seen the actual documents yet. But I
0:10:08 think it’s a safe bet. That’s what’s going on with Apple. And I really appreciate Apple because
0:10:12 they’ve been pretty public about it as public as they can be. We don’t know what kind of orders have
0:10:17 been issued to the other companies who have been very quiet. But I think it’s highly unlikely that
the UK government just picked Apple. They’re not even the biggest operating system; Android is bigger when you go global. So I suspect that something similar could be going on with Android and other devices that is not as visible to us. Again, I don’t know. It’s all secret. But I
0:10:37 think in time, we’re going to figure it out. And it’s problematic. People should be rising up like
0:10:43 we need strong security and privacy. And it’s not just because of law enforcement access. If the police
showed up at your front door and said, Guy, you know, we’re really worried. There’s a lot of break-ins
0:10:51 in the neighborhood. So what we want you to do is to leave your back door open so that if the burglars
break in, we can catch them easily because we don’t want to have to pound down your front door in order
0:11:01 to do it. Like you’d look at them and tell them they were crazy. But when that’s happening in the context
0:11:06 of digital devices, where it’s a little more abstract, and they use language to obscure what
0:11:12 they’re doing, like this gets put forward as if it’s, oh, law enforcement absolutely needs this. It’s
0:11:21 crazy. And it’s bad for all of us. Wow. So if Apple were to agree with the UK, then logically,
0:11:26 the FBI in America would say, well, you did it for the UK, you should do it for us. And then
none of us have encryption anymore. That’s correct. And even if you think the UK is okay because they’re a Western democracy, they have rule of law, and they have due process, it’s not just the US
0:11:41 that’s going to be following. It’s all the countries of the world, right? You’re going to have Nigeria,
0:11:46 which has a tremendous problem with corruption in their government and attacking political opponents.
0:11:51 All of the countries in the world are going to say, well, you did it for the UK, you should do it for
0:11:56 us. And I think it’s terrible in the United States. It gets even scarier as you go around the world.
0:12:00 Okay. So can we back up for a second? Sorry, I scared the pants off you, didn’t I?
No, no, no. I mean, it is a time to be scared. So could we just back up a little bit, and could
0:12:10 you explain for us what the EFF actually does?
0:12:16 Yep. The Electronic Frontier Foundation is the oldest and the biggest online digital rights
0:12:22 organization. We’re based in San Francisco. We’re now 125 people strong. And essentially,
we work to make sure that when you go online, your rights go with you. So we are a civil liberties
0:12:35 organization. We’re focused on law and rights as they relate to digital technologies. And we really
0:12:40 center ourselves on the users of technology. So making sure that the users of technologies have a
0:12:46 voice and are protected as we’re moving forward. Now, our tools are, we’re a lot of lawyers. I think
0:12:50 that’s still my biggest team. We do impact litigation. So we take cases like my Bernstein
case to try to set the law, especially constitutional law, because that’s the province of the courts and
0:13:02 the right place to protect users. But we also have an activism team. And we have a tech team. We call
them the pit crew, the public interest technology team. And we build some technologies. So we build a plugin for Firefox and Chrome called Privacy Badger that blocks tracking cookies,
0:13:19 those cookies that follow you all around the web and blocks other kinds of tracking. We build a thing
0:13:25 called Certbot, which is part of the process for certificate authorities for making sure that when
0:13:32 you go to visit a website, your traffic to that website is not trackable because it’s encrypted.
0:13:39 So we build technologies, we bring lawsuits, we do activism, all of these kind of towards this goal
0:13:43 of trying to make sure that your rights are respected when you’re using technologies.
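As a rough illustration of what HTTPS and certificates give you, here is a small Python sketch using only the standard library’s `ssl` module. The host is just an example (any HTTPS site behaves the same way), and this is not how Certbot itself works; it only shows the certificate check and the encrypted channel that certificates make possible.

```python
# Sketch of what HTTPS provides: an encrypted channel plus a
# certificate check, using only Python's standard library.
import socket
import ssl

HOST = "www.eff.org"   # example host; any HTTPS site works the same way

context = ssl.create_default_context()   # verifies the certificate against trusted CAs

with socket.create_connection((HOST, 443)) as raw_sock:
    # The TLS handshake happens here; everything sent afterwards is encrypted in transit.
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        cert = tls_sock.getpeercert()
        subject = dict(item[0] for item in cert["subject"])
        issuer = dict(item[0] for item in cert["issuer"])
        print("TLS version:", tls_sock.version())
        print("Certificate for:", subject.get("commonName"))
        print("Issued by:", issuer.get("commonName"))
```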
And how do you pay for all of this?
We are member supported. We get no government money. We get a little bit of foundation
0:13:56 money. We get a little bit of corporate money, but not from the big guys. That often comes with too
0:14:03 many strings about our advocacy. So companies that are a little smaller and that might be running a VPN
0:14:08 service or a privacy service, we get some support from them, but mainly it’s individuals. Over half
of our money comes from individuals. And a huge chunk of that comes from people who are ordinary members,
0:14:20 who give us 60 bucks or a hundred bucks or a thousand bucks and get t-shirts and hats and stuff. I think
EFF has been a marker for people that they love tech, but they also love rights, and that they’re trying
0:14:31 not to use tech to crunch on people. And we have a pretty good membership inside the big tech companies,
0:14:36 the Facebooks and the Googles and Alphabets and other companies, even inside the government,
0:14:41 we have a lot of members because I think it’s a way for people to show that they’re trying to be in
0:14:46 this for the right reasons and that they really want to build and support tools that support people
0:14:50 rather than oppress people. We’re member supported and always have been.
0:14:57 So member supported is like NPR where you’re a member of KQED or something like that.
0:15:03 Correct. Only we have much cooler t-shirts, hats and stickers, but yeah, it’s a lot like that. I kind
0:15:07 of joke sometimes at EFF, we work for tips, right? We’re going to go out there and do what we’re going
0:15:11 to do. And if people think that what we’re doing is important and it’s important to have kind of a
0:15:16 foothold and a voice out there to counterbalance the governmental voices or the corporate voices that
might, you know, be enshittifying your tools or not really on your side. We’re one of those people
who show up and try to fight for it. And there’s a lot of digital rights groups now, and I really love that. When I got into this, there was just us. And now there is a whole constellation of people doing
0:15:37 really good work. I think what makes us different is that we do have this tech team. We’re really grounded
0:15:44 in how the tech actually works. We don’t fly off and pretend or tell the scary stories about how tech’s
0:15:50 going to eat your children or any of those things. We’re really trying to stay very grounded in how
0:15:55 things actually work. And we’ve developed a reputation in the courts and in Congress and in various
0:15:59 administrations going all the way back to the nineties as the people who show up and will tell
0:16:04 you the truth about how technology works and how it doesn’t work.
0:16:11 So this is a dumb question. And I know the answer already, but I got to ask it just to make sure.
0:16:19 So theoretically, if Elon or Mark calls you up and says, we want to give you a $10 million donation,
0:16:25 the answer is it’s probably no, it’s probably no. And this has actually happened. Some of those people,
not the ones you’ve named, but some of those companies, have offered us a lot of money. And historically,
0:16:35 in the past, there was a time when we were more aligned with them, especially the early days with
0:16:40 Google. We were pretty aligned with them because they were trying to free things up, especially in
0:16:45 some of the copyright fights that we’ve done and IP fights where they were really trying to give users
0:16:50 access to information and stuff. Those days are kind of gone. There is a different leadership and
0:16:55 they’re much bigger and they have a different viewpoint. Right now, if one of those companies showed up and
0:17:00 said, let us shower you with money, I would take the call. But if there were any strings attached,
0:17:04 if there was anything that made it look like it, and honestly, for some of them, I think at this point,
0:17:09 I probably would just say no, because there’s no way it wouldn’t be perceived that way.
The answer is no. I really want our support to come from the people we’re standing up for.
0:17:19 And I’m not in this to stand up for Jeff Bezos. I’m not in this to stand up for Mark Zuckerberg.
0:17:23 I’m in this to stand up for all the people who are, you know, in some ways feel like they’re
0:17:28 hostages to these people. And you can’t really do both, right? You can’t stand up for the people who
0:17:33 are locking you in with a surveillance business model that tracks everything you do and ranks you,
0:17:38 and the people who are being tracked. You kind of have to be on one side or the other. At this point in
0:17:44 time. I’m sad about that. Those companies used to side with their users a lot more. And one of the
0:17:51 sad things that I’ve seen in the 35 years that the organization has been in existence is the sliding
0:17:59 away from those kind of tech and user roots towards a more adversarial position towards their users.
0:18:05 I would use a stronger verb than sliding away. But yeah, we agree.
0:18:10 I’m trying to be a little kind, but yeah, no. And I think it’s problematic, right? Because I worry.
0:18:15 It used to be people came to Silicon Valley because they had a cool idea that they really wanted to
0:18:19 make happen. I know he was a difficult guy, but even Steve Jobs, like he was a problem solver,
0:18:23 right? He was trying to solve interesting problems that would help people. And again,
0:18:27 didn’t know the man. I don’t claim to know everything. Now it just seems like it’s like,
0:18:32 how do we exploit people’s data to make as much money as possible? And that’s a very different
0:18:37 framing than what I lived through in the 90s and the 2000s.
0:18:42 From the outside looking in, because I’m not inside the tech bro community,
0:18:46 I think that all they care about is long-term capital gains and crypto.
0:18:52 Yeah. And it’s about money for them and power for them. And it’s not really about giving us
0:18:56 anything better anymore. It’s more about exploiting us so that they can maintain their positions.
0:19:02 And it’s so disheartening, right? Because again, I was in the Silicon Valley in the 90s and the 2000s,
0:19:08 and I know there was another vision. I know that there was another thing that a lot of people were
doing. And the good news is, there are people doing that now. We’re seeing it with decentralization. Signal exists, it’s strong, it’s powerful. It’s so powerful that the people in the federal government use it when they shouldn’t. The rest of the internet is still there.
0:19:31 It’s just been completely overshadowed and underfunded because of the rise of these tech giants and their
0:19:38 surveillance capitalism business model. But if you peel it back, you can still find people with those
ideals and those visions. And if you look at Mastodon in the decentralization space, or Wikipedia
0:19:50 is still here, it still exists. It’s under threat right now. But those places still exist. Just all the
air gets sucked out of the room by the tech giants. And some of what we try to do is to point out that the internet isn’t Facebook; there’s a whole set of other things that aren’t in the tech giants. And if
0:20:06 we turn our attention towards them, there are people there who could use a little support and coding and
0:20:09 lifting up to build a better version of our world.
0:20:30 Of all these things that are going on right now, what scares you the most?
I think it’s hard to be alive in America right now and not be worried about authoritarianism. I
0:20:43 think that’s the scariest thing. The scariest thing is we’re seeing the takeover of both our business
0:20:50 side and our civil liberties side by an idea that one guy gets to make all the rules for all of us and
0:20:55 that there’s no questioning that, this kind of king-like mentality. I think unless we fix that,
0:21:01 we can’t even get at most of the other problems. And we’re seeing it in kind of rule by executive
0:21:05 order. Executive orders have always existed, but they weren’t the law of the land. And they shouldn’t
0:21:11 be, right? We’re supposed to have checks and balances and due process. And for me, as a civil liberties
0:21:17 lawyer, these are our tools, right? Like we need tools to go into court or to have a Congress that’s
0:21:23 actually willing to pass a law that protects us as opposed to just doing the bidding of one guy.
I think until we get past the rule-of-kings mentality, it’s hard to deal with any of the
0:21:33 other problems. And that to me is the scariest thing that’s going on right now is watching these
0:21:39 institutions that we need in order to protect us not step up to the moment and do it.
You mentioned Signal several times now, so obviously you must use Signal. But
0:21:51 I have some really tactical questions to ask you about Signal from someone who is in the middle of
0:21:51 this, okay?
Yeah.
So first of all, what time period is your default disappearing messages set to?
It depends on the conversation. I try to set it for a week, but if it’s something where we’re
0:22:09 planning something over a longer time, I will sometimes keep it longer than that. But I have
0:22:14 occasionally used Signal to develop an expert witness in a case or something like that. And then I keep them
longer because I want to be able to go back and check things, since my memory isn’t so great. And different
0:22:20 things have different needs.
So what happens if a Department of Justice lawyer says to you that you have Signal set to automatic disappearing messages and that’s spoliation, you are destroying evidence?
It depends on the situation. If something isn’t privileged and is evidence in a case, then you have to turn it off,
0:22:48 just like anything else. If you’ve got auto deletion of your email or anything else. The law requires that if
0:22:54 something is at issue in a case, then you can’t get rid of it. But I don’t think you should live your life as
0:22:59 if you’re always under a litigation hold because I think that can end up being its own problem on its own side.
So certainly if something is looking like it’s going to be evidence in a case that’s actually
0:23:09 pending or threatened, then yes, you should put a litigation hold in place and you should not get
0:23:14 rid of it. But I think that it’s still better in the rest of your life, which shouldn’t be all your life,
0:23:20 to only keep things for as long as you need them and get rid of them. And this is our advice to companies
0:23:25 too, right? People shouldn’t be just gathering up data and keeping it in case it might be helpful
someday. That way lies a lot of problems. I used to joke at EFF that we had become an anti-logging
0:23:36 society, not in terms of trees, but in terms of your logs, that you should really think hard about
0:23:40 what you’re logging and why, because it can end up being a vector. And as people who’ve been through
0:23:46 litigation know, it’s really, really expensive if you’ve kept everything all the time and you have
0:23:51 to turn it over in litigation, whether it’s even remotely useful or not, because sorting through
0:23:57 what might be relevant to a litigation hold and what isn’t is its own huge burden. So you may not
0:24:02 be saving yourself money or hassle or time in the long run by defaulting to keeping everything.
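A minimal sketch of that "anti-logging" retention idea in Python; the directory, file pattern, and seven-day window below are hypothetical, not EFF’s actual tooling, and a real policy would also exempt anything subject to a litigation hold, as discussed next.

```python
# Minimal sketch of a data-retention ("anti-logging") policy:
# delete local log files older than a fixed window. Hypothetical
# names; a real deployment would pause deletion for litigation holds.
import time
from pathlib import Path

RETENTION_DAYS = 7          # keep roughly a week, like a disappearing-message timer
LOG_DIR = Path("logs")      # hypothetical directory of application logs

cutoff = time.time() - RETENTION_DAYS * 24 * 60 * 60

for log_file in LOG_DIR.glob("*.log"):
    if log_file.stat().st_mtime < cutoff:   # older than the retention window
        log_file.unlink()                    # permanently removed
        print(f"deleted {log_file}")
```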
I don’t want you to think I’m obsessed with the topic of spoliation, but I have one more spoliation question. Did something happen to you, Guy? What happened? If you set your default for every new chat to disappear after a week, can you not make the case that, as a course of routine use of Signal, I make everything disappear? I didn’t do it to destroy evidence in anticipation of litigation. So this is not legal advice. I am not your lawyer,
0:24:39 but once you have a clear indication that litigation is coming, whether that’s because you’ve gotten a
0:24:45 demand letter or you’re in negotiations with someone, somebody showed up, or you reasonably
0:24:49 know it’s coming. And that can be a little vague at times, but the courts will generally think very
0:24:53 specifically. If you’re going back and forth saying, we know we’re going to get sued for this,
0:24:57 but I think we can defend it. That’s the time you ought to turn your little light on. And certainly,
0:25:03 once you get a demand letter, then a good lawyer will send out what’s called a litigation hold
0:25:08 letter to you, your entire organization, and say anything that’s about this dispute,
0:25:15 we need to stop getting rid of it and we need to start keeping it. So yes, putting in an automatic
0:25:20 thing that gets rid of communications and stuff that you don’t need is useful and it can help protect
0:25:27 you that it is your automatic thing. But you can’t then pretend like you don’t know litigation is
0:25:31 coming. Once you know litigation is coming, you need to change course for stuff that’s related to that.
Okay. Two more tactical questions. Okay.
Because, you know, this is a rare opportunity to speak to an expert like this. I know that you must probably not use biometric authentication for your phone, not your fingerprint or your face, right? No, I do. I do. You do? Yeah. Okay. So explain that to me, because it seems to me, not that I am a lawyer, that under the Fifth Amendment they cannot compel you to give them your passcode, but they can compel your fingerprint or face. So isn’t it better to use a passcode instead of your face or fingerprint?
I think if you’re at risk of being arrested, then that’s important. I think if you’re going
0:26:24 through a border, if you may be going to a protest, if you’re engaged in something where you think law
0:26:29 enforcement is likely to stop you, then you’re right. You should turn off the biometrics and you’re exactly
right. The Fifth Amendment, for what I think are some pretty dumb reasons, actually distinguishes between putting in a password and showing your face. And honestly, I think that whole body of case law is pretty stupid, right? I think that the Constitution should reflect how people live and not have this, you know, “did it require the contents of your mind or not” analysis. But whatever, that’s where we are with the Fifth Amendment right now. So yes, if you think something’s
coming, then that’s a really good idea. But you know, for the rest of your life, people can’t follow
0:27:06 ridiculous instructions. I want technology that makes my life better, that makes it easier.
0:27:11 And so does everybody else. So what security people call this is threat modeling, right? You need to
0:27:15 figure out who you are, what are you doing in the world and what’s your threat model and make your
security based on that. EFF has something called Surveillance Self-Defense, at ssd.eff.org. So look for Surveillance Self-Defense. And we have playlists based on who you are and what you’re
0:27:33 thinking about. So if you’re a journalist or you’re a human rights defender, you’re attending a protest,
0:27:39 you’re helping people who might be seeking abortions in America today, you have to worry about that,
0:27:44 then you might have a different set of things you do to protect yourself than people who aren’t at risk.
0:27:50 And so I think everybody has to do their own analysis. For me, most of the time walking
0:27:57 around, I’m pretty unlikely to be picked up by the cops in the street and asked to have my phone
0:28:01 seized. If you’re really worried, you can do that. And there are times and places where I make sure my
0:28:06 phone is off or that I’ve turned those biometrics off. There are other times and places in my life where
0:28:12 I just want to be able to open it up and look at maps and make sure I’m not lost. And I really don’t
0:28:16 want to have to fumble with putting in a password. So everybody has to make those decisions for
0:28:20 themselves. And we have tools to help people make them intelligently.
But Cindy, okay, so what I find almost incredible is that you are the executive director of EFF and you’re
0:28:38 saying that you feel pretty comfortable walking around with your face or fingerprint opening up your phone.
As the executive director of the EFF, you’re saying that? I am astounded.
0:28:49 I think that everybody has to make these decisions for themselves. I love technology,
0:28:53 right? I mean, look, if I was the most paranoid person, I wouldn’t be carrying around a smartphone.
0:28:57 If you’re going to take this to the end of what makes you absolutely the safest
0:29:02 in every situation, I don’t know why you would carry around a beacon that’s tracking you all the
0:29:06 time in the first place. But we all have to make these trade-offs. And I would not say that my trade-offs
0:29:11 are the ones that other people should make. I have this interesting position where, you know,
right now we’re suing DOGE. EFF is suing DOGE under the Privacy Act for access to the Office of Personnel
0:29:22 Management records. Now, in some ways that may make me worried that at some point the Trump administration
0:29:27 doesn’t think lawyers are off limits for purposes of targeting them. On the other hand, there’s a
0:29:33 federal judge who knows that I’m counsel in the case. It’s not a good look for the government to be
0:29:40 attacking, harassing, and tossing in jail the people who are suing over the Privacy Act in the thing.
0:29:47 And I have always felt, and this is just my threat model, that being high profile and being somebody
who’s laboring in the courts to try to bring justice makes me probably not the first person they’re going to go after, if they go after anyone. Now, things are changing fast in this country,
0:30:04 and that might not be the right threat model today as it was 10 years ago or even 20 when we were doing
0:30:08 the Bernstein case. And believe me, the NSA and the national security people were not very psyched
0:30:12 about us attacking cryptography. I did not for a minute think that they were going to come after
0:30:17 me personally. That was a different time and it was off limits. And I think that it would have
0:30:22 completely backfired on them in the courts. I still think it would backfire on them in the courts
0:30:28 if they did this kind of direct attack. Now, other people should make their own evaluations. And again,
0:30:34 I wouldn’t say that this is my position everywhere all the time, but it is my position when I’m walking
0:30:39 out my front door and going to the grocery store or all the other things that I do. The other piece
0:30:43 of this, and I think it’s really important because you’re asking me personal questions about my own
0:30:49 decision-making, about my own security, and I think that’s useful for people. But we have to fix these
0:30:55 systems. This isn’t a set of individual decisions that anyone should have to make. We need to have a
0:31:00 comprehensive privacy law. We need to have strong encryption built into our tools so that we don’t
0:31:04 have to mess with settings or turn things off in order to have strong encryption. We need to have
laws that protect our ability to have security and privacy, and make it so the government just can’t do these kinds of things. So I think on the one hand, individual choices are really
0:31:21 important. And on the other hand, sometimes in privacy specifically, people get caught up in their
0:31:26 individual decisions as if it’s their responsibility to make sure that they’re as protected as possible.
0:31:30 And I think that makes no more sense than, you know, if you buy a car, you expect it to have
0:31:36 brakes and that those brakes work. And nobody expects you to go out and search for, find and install your
0:31:42 own brakes. I think basic security and privacy is like brakes on a car and all of our devices and tools
0:31:49 and laws need to have them baked in to protect us rather than the responsibility being foisted on us
0:31:56 to find all these tools, pick the right ones and use them in the right way. That’s broken. And a lot
0:32:01 of what we do at EFF is try to give you individual advice about how to do what you’re doing. But the vast
0:32:07 majority of what we do is to try to set the laws and the policies and pressure the companies to make this
0:32:15 not your responsibility anymore. Cindy, knock me over with a feather. If you want to use the brake analogy,
yes, a Porsche may brake from 60 to zero in 125 feet and a Ford F-150 may take 250 feet. You need to know that
0:32:30 not all brakes are created equal and you still put on a seat belt, right? Yeah, absolutely. All of those
0:32:35 things are important. You don’t have zero responsibility. We have a regulatory system
0:32:40 that says brakes must be within these normal tolerances, right? Same thing. We need a privacy
act. The privacy act isn’t going to be a one-size-fits-all thing. It shouldn’t
0:32:49 be. That would hurt innovation, but it should set the boundaries. You can’t put something out on the
0:32:56 marketplace that spies so dramatically on your customers that they can’t possibly turn it off.
0:33:02 They can’t possibly control it. They have no agency about that. And I think of it again, like the way
0:33:08 a good regulation will set the tolerances of what can go out there. So yeah, you might have much
0:33:15 better brakes on a car that has a much bigger engine, but there is an outer boundary, right? You can’t have
0:33:22 no brakes on a car and regulation does some of that. Consumers do some of that by the consumer reports or
0:33:26 other things telling people, watch out, this car doesn’t have very good brakes. You got to have a
0:33:33 mix of markets and smart regulation. I’m not a big fan of regulation. I think it can be very bad and it
can help prop up oligarchies and monopolies. But smart regulation, my classic example of this is, you know, the decision by the FCC: the phone companies were saying you could only plug their phones into the wall. And the FCC said, no, you have to let people plug modems into the wall. And
0:33:56 that’s how we got the home internet revolution. That’s smart regulation, right? That’s regulation
0:34:03 that is not only creating the outer tolerances of what we can accept, but also making sure that there’s
competition and other options for people within that space.
0:34:14 Are you trying to convince your friends and family to use Signal instead of WhatsApp or you think it’s
0:34:16 irrelevant for most people?
0:34:23 I think WhatsApp uses the same security, it’s the same encryption under the hood as Signal. So I don’t
0:34:28 think WhatsApp is a bad choice in terms of end-to-end encryption. What I don’t like about WhatsApp is
0:34:33 because it’s a Facebook property, they know who you’re talking to, even if they don’t know what you’re
0:34:38 saying. And so on that measure, Signal is better because Signal is designed not to know who you’re
talking to at the level that WhatsApp does, and WhatsApp is trying to monetize that. But as a matter of encryption,
0:34:50 WhatsApp is not a bad choice. But Facebook Messenger, for instance, is not end-to-end encrypted. I think
0:34:54 they’re fixing that, but it was not end-to-end encrypted. And let me tell you the consequences of
0:35:01 that. So there’s a woman and her daughter in Nebraska who are both in jail right now. And they’re in jail
because they used Facebook Messenger to talk to each other about the daughter needing an abortion. And that’s illegal in Nebraska. And as a result of Facebook having the plain text of that communication, because it was not end-to-end encrypted, Facebook got a warrant that required them to turn over the copy of the communications that it has, because it’s a centralized system. So Facebook has a copy of
0:35:27 all those communications. Both the mother and the daughter went to jail.
0:35:33 If that same communication had happened over Signal or probably even over WhatsApp,
0:35:38 the mother and daughter wouldn’t be in jail right now because the plain text of that conversation
0:35:43 wouldn’t have been available to law enforcement. Many more people are having to pay attention to that
fact, which might not seem to matter at all when you’re just using these technologies. You’re just using whatever’s
0:35:54 easiest for you. But now that we have a world in which some communications are illegal at a level
0:35:59 that I think was not true before, say, the Dobbs decision and all of these states started passing
things, there’s a whole new community of people who need to understand the differences in the security of their communication techniques more than they did before. Now, this was always
0:36:14 true for people who are human rights defenders, people who are working with immigrants, people around the
0:36:19 world who come from marginalized backgrounds have known this for a while, and now there’s a whole new
0:36:25 community of people who are starting to wake up to these differences. So, yeah, it’s important that people
0:36:31 move to end-to-end encrypted services, and it’s important to more people now than ever before.
Cindy, I would make the case that WhatsApp is end-to-end encrypted, but it doesn’t encrypt the metadata. And there’s a lot you can figure out from metadata: that the mother and the daughter communicated, that at this point they contacted this abortion service, and all that. You don’t know what they
0:36:54 said, but it’s little markers on the trail, right?
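A small illustrative sketch of the distinction Guy is drawing: even when the message body is end-to-end encrypted, a centralized provider can still observe metadata. The field names and values below are hypothetical, not any real provider’s schema.

```python
# Illustrative only: hypothetical field names, not any real provider's schema.
message_record = {
    # What end-to-end encryption protects: the body is ciphertext to the provider.
    "ciphertext": "gAAAAABf3k1...",
    # What a centralized provider can still observe (metadata):
    "sender": "+1-555-0100",
    "recipient": "+1-555-0199",
    "timestamp": "2022-06-07T21:14:03Z",
    "size_bytes": 2048,
}

# Without reading a single word, the pattern of who talked to whom and when
# ("little markers on the trail") can be assembled and handed over.
social_graph_edge = (message_record["sender"], message_record["recipient"])
print(social_graph_edge, message_record["timestamp"])
```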
0:36:59 Yeah, no, you’re right. And EFF fought the NSA over metadata. One of the things that we learned
0:37:05 is that the Patriot Act had a section in it called 215 that let the government demand everybody’s
0:37:11 telephone records from the telephone companies. And one of the things that we learned and that we learned
0:37:16 in 2006, but then everybody learned in 2013 with Mr. Snowden, is that this was actually happening,
0:37:20 that the phone companies were handing over the metadata of our phone records. And you’re
0:37:25 exactly right, that you can glean a lot from those. The reason I’m a little soft on WhatsApp,
but I think it’s a perfectly reasonable choice not to use them, is that it is what people around the world really use, at scale. And I’d rather not shame them over the differences between the two, but really kind of encourage them to come away from the things that
0:37:48 are entirely unencrypted or that are fake encrypted. Telegram, as we’ve learned, while it sells itself as
0:37:53 being encrypted, it really isn’t at the level that gives people protection. In the world of secure
0:37:59 messaging, I agree with you that Signal is more secure and a better option. I just want people
0:38:04 to pick something that’s a little more secure, even if they don’t go to the maximum secure. And on that
0:38:10 scale, especially, again, around the world, Signal is still so small compared to the reach of something
0:38:15 like WhatsApp. I don’t want to shame people who are using the one, even as we encourage them to
come to something a little more secure. So that’s more strategy than it is like hardcore security advice. But
0:38:27 it’s certainly better to use Signal, but it’s better to use WhatsApp and stick them with the metadata than
0:38:34 it is to use something that’s completely in the clear. You brought up the Nebraska case, and I am
0:38:42 familiar with the Nebraska case, and it opens up a whole nother can of worms that I never figured, which
0:38:49 is the narrative seems to be that if it wasn’t for Facebook and them using Messenger, they wouldn’t be in
0:38:57 jail. On the other hand, the facts show that she did have an abortion after the period permitted in
0:39:04 Nebraska, and they did try to burn the fetus and all that. So in a sense, they did do what they were charged
0:39:12 with. So it’s not like they were falsely imprisoned, or did I get this wrong? Well, it depends on your view of
0:39:19 the law. I think that this is a law in Nebraska that most people think is tremendously unfair and wrong.
0:39:27 And disconnected from the reality of people in America and women in America. I think that
0:39:32 in a world in which every law is perfect and wonderful and should be celebrated and supported,
0:39:36 you might be able to take the position that they broke the law, so therefore they got what they
0:39:42 deserved and how they got found is irrelevant. I don’t think we live in that world. And I think that
0:39:48 when the law is unjust, making sure that people can still live their lives and have protection and have
0:39:55 security is tremendously important. And we live in a world with a lot of laws that are not just right
0:40:00 now and a lot of things like executive orders and other sorts of things that are just ignoring the
0:40:05 law. They’re just snatching people off the street and sending them to El Salvador. This is one of the
0:40:11 reasons that we need privacy and security is because not all governments are just and not all laws are
0:40:17 just. The other reasons we might need it is just basic human dignity and having the space to be able to live
0:40:23 your life without being tracked all the time. But I would maintain that there’s a lot of people in
0:40:28 America who are very uncomfortable and unhappy with some of these laws. They were not passed
0:40:34 in ways that I think people feel very good about. And I think that giving people the ability to have the
0:40:40 level of privacy and security they need to live their lives and not making a world of perfect enforcement
of every single law regardless, is how the law has generally been. And that’s to stop things that
0:40:50 I think shock the conscience. And I think in this particular instance, this was a mother and daughter
0:40:55 who I believe were having conversation inside their own home. Traditionally, the Fourth Amendment would
0:41:00 say that what happens inside your home is completely not available to law enforcement, right? That’s why they
0:41:07 need a probable cause warrant to come into your house. But because technology meant that this third party
0:41:14 company had the plain text of the communication, suddenly what happens inside the home between a mother and
0:41:19 daughter is available to law enforcement. So you have to look at how is technology changing
everything? And this is a situation in which the founders of America would never have thought that the government would be able to prosecute you, even if you were violating the law, based on a mother-daughter conversation inside the home. And because of the way technology has happened here, that actually
0:41:42 was able to occur. You have to balance all of these things. It’s not just one thing that changes. And
technology has made changes to the way that we communicate in ways that the Constitution needs to catch up with.
0:41:56 Up next on Remarkable People. But we’re going to start to see weaponized takedowns at a level that I
0:42:00 think we haven’t seen before because this law facilitates it. And it creates the incentives for the
0:42:07 companies to take things down if they get a complaint. And again, I don’t think that those complaints are
0:42:11 going to be about non-consensual sexual imagery. They’re going to be about people saying things they don’t like.
0:42:21 Do you want to be more remarkable? One way to do it is to spend three days with the boldest
0:42:27 builders in business. I’m Jeff Berman, host of Masters of Scale, inviting you to join us at this
0:42:33 year’s Masters of Scale Summit, October 7th to 9th in San Francisco. You’ll hear from visionaries like
0:42:40 Waymo’s Takidra Mawakana, Chobani’s Hamdi Ulukaya, celebrity chef David Chang, Patagonia’s Ryan Gellert,
0:42:49 Promises’ Phaedra Ellis Lampkins, and many, many more. Apply to attend at mastersofscale.com/remarkable.
0:42:55 That’s mastersofscale.com/remarkable. And Guy Kawasaki will be there too.
0:43:01 Become a little more remarkable with each episode of Remarkable People.
0:43:05 It’s found on Apple Podcasts or wherever you listen to your favorite shows.
0:43:10 Welcome back to Remarkable People with Guy Kawasaki.
0:43:20 So it’s obviously 2025 and now we have someone in charge of homeland security who
0:43:27 cannot even define the writ of habeas corpus. So I’m asking you like if somebody says to you or
0:43:35 your family or friends says to you, “I’m not worried. I have nothing to hide.” Is the “nothing to hide”
0:43:41 statement true these days or does everybody have something to hide at this point?
0:43:45 I haven’t done a demographic survey, but I would suggest that most people,
0:43:48 even if they don’t have something to hide, talk to somebody who does.
Do you have somebody in your life whose papers have expired, who’s overstayed their visa?
0:44:00 Do you have someone in your life who’s a person of color, who’s trans, who’s LGBTQ of any kind,
0:44:06 not just trans? Do you have somebody in your life who’s a person of color who may think that diversity
0:44:11 and equity are important values and have said something about that? The line over who has
0:44:15 something to hide is really changing. And I would argue that by the time you go through all the lists,
0:44:22 just of the things we know, there aren’t very many people who wouldn’t be impacted by this. And again,
0:44:28 this is why security and privacy are so important. I also think they’re important regardless of whether you
0:44:33 individually need them. I think that one of the problems that we have in privacy is people think
0:44:38 about it in individual personal terms. And so they can come to the, “Well, I have nothing to hide”
0:44:44 kind of position. But privacy isn’t just important for each of us. It’s important for all of us.
0:44:48 And that’s an important distinction. Like most people don’t want to stand on a street corner
0:44:53 and shout out what they think ought to happen in this country. But I think all of us understand that
0:44:56 the First Amendment protects us all, even if we don’t want to speak.
0:45:03 The Fourth Amendment and privacy do work the same way. Giving everybody the shelter of privacy means
0:45:06 that even if you don’t personally need it, somebody who you love, somebody who you know,
0:45:12 or somebody who’s going to help change the world for the better does. And I’m going to give you an
0:45:19 example. In my lifetime, being gay in this country was very, very dangerous. Saying that gay people
0:45:24 ought to have the right to marry, they ought to have the equal right to love who they want to love,
0:45:28 that could get you killed. And in fact, it’s still pretty dangerous, right? We’re moving backwards.
0:45:34 But there was a time in which those conversations had to happen in private. This idea that maybe
0:45:38 loving who you want to love as opposed to the traditional heterosexual thing isn’t such a bad
0:45:43 thing. Maybe we should normalize that and make that okay. That was a very dangerous conversation.
0:45:48 Those conversations had to happen in private and in secret. And in my lifetime,
0:45:52 those have gone from conversations that had to happen in private and secret to something where
0:45:59 we’ve really changed the law. We’ve changed a lot of people’s minds about it. We’ve changed attitudes.
0:46:04 And that public part of the conversation couldn’t have happened unless there was a private part of
0:46:08 the conversation. And I think the same is true if you look at most social movements. If you look at
0:46:14 the anti-slavery movement in the United States, like way back, if you look at some of the anti-immigration
0:46:19 sentiments in this country, where we had the Chinese Exclusion Act and other kinds of things,
0:46:24 and we shifted into a world where we thought differently about differences in America,
0:46:28 the public part of those conversations couldn’t have happened without the private part. So it may
0:46:34 not be you. It may not even be the people you love, but it may be the people who are going to help us
0:46:39 make change for the better. And I think we need to stand up for the rights of all of us because this
0:46:44 is what a human right is. This is what a human value is. It’s not something that’s just dependent on you
0:46:48 and your everyday life, although I think most of us have an increasing need in our everyday life for
0:46:54 privacy and security. But these are values that we should stand up for, even if it’s not
0:47:01 right now visible individually to us, that we need them because this is how society self-governs.
0:47:07 This is how we make changes. This is how we have the space to decide that we don’t like the guy
0:47:11 who’s the president right now and we want to vote for someone else. Increasingly in this country,
0:47:16 those conversations can be pretty dangerous for people to start talking. We’re not all the way to
0:47:21 the kind of repression of other systems where they put the opposition candidate and anybody who’s friends
0:47:27 with them in jail. You can see that on our horizon right now. We need to stand up for privacy and
0:47:35 security even if we don’t need it right now because we may need it pretty soon. One of the big activities
of the EFF right now involves the Take It Down Act. And I would like it if you would please explain what that act is supposed to do. Yeah, we spend a lot of time with this particular problem
0:47:54 where there’s a harm online that people agree is a harm. In the instance of the Take It Down Act,
0:48:01 it’s non-consensual sexual images. Your ex posts your sex tape online or other kinds of situations in
0:48:08 which sexual imagery of people is posted online without their consent. It’s a real problem. So people will
0:48:15 take a real problem and then they’ll propose a legal solution that is not good. So the Take It Down Act
says that if somebody tells you, as a platform or a host of a site, that you have to
0:48:26 take something down, you have to take it down immediately or you’re liable. And it is not limited
0:48:33 to non-consensual sexual imagery. Even if we could agree what the definition of that is and that can get a
0:48:38 little fuzzy. It just means that people have to take it down if they get a complaint. And the worry
is that those complaints get weaponized. President Trump, in his big speech that he gave in January or early February, said he can’t wait to use this law, that we should pass it. He can’t wait to use it. I don’t
0:48:54 think that what President Trump is worried about is non-consensual sexual imagery. I don’t think that’s
what he meant. But it’s a classic example of how a law that is passed for one narrow purpose can be
0:49:07 used to create a censorship regime for far broader speech than just that. This is why we really
0:49:12 oppose it. This law, I don’t think it’s going to help for non-consensual sexual imagery. The problem
0:49:16 for most of that imagery isn’t that the platforms don’t take it down. They take it down pretty fast
0:49:22 all the time. It’s that they can’t keep up because there’s so much of it. So it’s not even responsive to
0:49:27 the problem because I don’t think the problem is that platforms don’t care about this. I mean,
0:49:31 some might, and that’s important, but we didn’t need a federal law for that piece of the problem.
0:49:37 But instead, it opens it up so that there could be a censorship machine from anybody with power
0:49:42 to take down anything they don’t like, or at least a wide, wide range of what they don’t like. And
0:49:45 again, when you’ve got the President of the United States saying he can’t wait to use this power,
0:49:51 it ought to be a pretty good sign that maybe this law is doing something different than the people who
proposed it intended. And to the point where, at the very end of it, some of the people who
0:50:00 originally proposed the law flipped and said, “This is a bad idea. This isn’t the right thing.” We had
0:50:05 been opposed to it all along because we worried that it could be misused. But by the end, some of the
very people who started proposing it, a couple of law professors who were big fans of this and who proposed it, issued blog posts saying, “Don’t support this. This is not what we meant. And this is a bad thing.”
0:50:21 Nonetheless, it passed and it got signed into law. It’s about a year or two before it really gets
implemented. So we won’t see it right away, but we’re going to start to see weaponized takedowns at a
0:50:32 level that I think we haven’t seen before because this law facilitates it. And it creates the incentives
0:50:38 for the companies to take things down if they get a complaint. And again, I don’t think that those
0:50:42 complaints are going to be about non-consensual sexual imagery. They’re going to be about people
0:50:49 saying things they don’t like. Cindy, I don’t know if you realize this, but I think you just said one of
0:50:58 the funniest things I’ve heard in five years of podcasting, which is I don’t think Trump is concerned
0:51:07 about non-consensual images. I would say that if that was said at the White House Correspondents’ Dinner,
0:51:19 it would be in the words of Barack Obama, a mic drop moment. But anyway, I’m still recovering from that.
0:51:28 And when you see something like that in a bill and the possible perversions of how the bill is used,
0:51:36 is that something that Mike Johnson snuck in with that intent, or is this an unintended consequence?
0:51:41 Am I being paranoid? Are they putting shit like that in purposely, or is it unintended?
0:51:47 I think that it’s a mix. I think for some of the people, it might be unintended. This is a bill that
0:51:54 was sponsored by Amy Klobuchar of Minnesota. And I suspect that she’s a very smart person. And it’s not
0:51:58 like people haven’t tried to tell her. I don’t want to give her too much credit, but I think that
0:52:02 people come in with a pretty honest intent to try to address the harms. They’re just more interested
0:52:08 in the harms than they are in the actual impact of how things are going to work in the real world once
0:52:13 they get passed. So some people are dishonest. I don’t think that Mr. Trump is honest in his support of
0:52:19 this, that he really wants to make a stand about non-consensual sexual imagery. Other people are cynical and
0:52:24 some people are well-meaning. There’s another law coming along that I want to flag that has a similar
0:52:31 problem. It’s a law called KOSA. It’s the Kids Online Safety Act. And again, this is trying to get at an online
0:52:37 harm, which is kids online and them having access to information that could be dangerous for them.
0:52:43 But what it’s going to do is it’s going to create a requirement that you provide credentials to get
0:52:51 access to most information online. It’s going to require you to show your ID at some level in order
0:52:56 to get access to things online. This is going to age gate everything online. It’s going to make it harder
0:53:01 for people online who don’t have credentials, and that’s a lot of people in this country, to actually
0:53:07 get access to the internet in any meaningful way. It’s not going to stop kids from having access to
0:53:12 stuff that we don’t want them to have, but it is going to age gate everything on the internet. It’s going to
0:53:16 require a lot of things. And then it’s going to create these huge companies that have everybody’s
0:53:22 identity information that are going to be sitting ducks for data breaches. It’s going to be the mother
0:53:27 lode for people who want to do spying, who want to do identity theft or other sorts of things,
0:53:33 because they’re going to create this. And so I think KOSA is another one where there is a real harm
0:53:39 of kids having access to stuff that they shouldn’t have online and that the solution that is being
0:53:44 proposed in this law is not going to solve the problem and is going to cause a whole other set of
0:53:49 problems. We know the things that work for kids online and having them not have access to harms,
0:53:55 but they’re a lot more expensive and require a lot more thought than simply just requiring companies
0:54:01 to put in an age gating thing, either on your device or otherwise. And this is something we live with a
0:54:08 lot, especially on the legislative side: good intention, bad idea. And it’s hard because I think
0:54:14 a lot of lawmakers really want to respond to this problem and they just don’t pay as much attention to
0:54:18 whether the thing that they’re championing is actually going to solve the problem and what the
0:54:19 collateral impacts are.
0:54:27 So you mean to say that the Speaker of the House and his son cannot maintain control of each
0:54:30 other and take care of this problem? We need other ways to do this.
0:54:36 I think there are other ways to do it. I was a kid. You were a kid. Is the idea that you had to show an
0:54:42 ID, was that a thing that actually kept you out of anything that you really wanted to have access to?
0:54:48 No. So why do we think that’s going to work online where it’s even harder, right? It’s not like fake
0:54:52 credentials were just made up yesterday, right? I just don’t think that’s really going to be the way
0:54:59 to do it. Again, if it caused no collateral problems at all, then okay, whatever, let’s give it a try. But
0:55:03 it’s going to cause a lot of collateral problems and those problems are going to fall on the people who
0:55:05 otherwise don’t have resources.
0:55:09 Isn’t Australia already doing this? Is it causing problems there?
0:55:13 Yeah. You know, I haven’t seen the research yet. Australia is doing a version of it. They’re also
0:55:19 doing a version of blocking encryption. And I haven’t seen the research yet, but I would be shocked if it
0:55:26 was actually having a significant impact. We know that it’s a mix for kids online, right? We know that
0:55:32 there’s a certain percentage of kids who have a hard time online and react badly. There’s another
0:55:36 percentage of kids, and I want to be clear about this. This is LGBTQ kids. It’s kids from marginalized
0:55:41 backgrounds, kids who don’t fit in where they’re growing up, for whom having access to information
0:55:48 on the internet is literally a lifeline. EFF did a survey, and it’s convenience data from the people who filled out our
0:55:53 survey, but we asked kids to tell us, you know, what’s their experience online and how has it helped
0:55:58 them? And we had so many, you should read these, they’re on the website, heartwarming and terrible
0:56:03 testimonials from kids who said, if it weren’t for my online community, I would have killed myself
0:56:10 by now. Because nobody in my house or in my community understands what it’s like to be LGBTQ, gender
0:56:10 queer. And it’s the online world that saved my life. And there are a lot of those kids. And I think
0:56:17 people are thinking only about one kind of harm and legislating based upon that one kind of harm, which
0:56:24 definitely impacts a segment of kids online, especially a certain segment of young girls online.
0:56:30 But legislating based only on that, and not seeing the other people that they’re going to harm with
0:56:36 it, which includes a lot of gender queer kids and other kids who don’t fit in, whether that’s religiously
0:56:41 or otherwise in the place where they grow up. Like, that’s just bad legislation. We have to save all the
0:56:48 kids, not just some of them.
0:56:52 But maybe they want to harm those kids.
0:56:56 Well, this is one of the things that the Republicans have been pretty clear about. Marsha Blackburn has
0:57:02 been very clear about it. Like, when a segment of the conservative side talks about online harms for kids,
0:57:06 they mean kids shouldn’t have access to information about this. Their view of the harm is
0:57:14 kids having access to information that isn’t the very narrow, Christian-infused
0:57:20 version of things; the harm is getting access to DEI information or other kinds
0:57:25 of information like that. And so when we talk about online harms, if we don’t specify which harms we’re
0:57:31 talking about, we’re talking about people who really just want to censor what other people’s children can
0:57:37 see. And I think it’s very vulnerable to that. Again, Marsha Blackburn
0:57:43 and the Heritage Foundation have both said that’s what they want to pass KOSA to do. And for the Democrats
0:57:49 and the other people who are really focused on this one area of online harms, which I think we could all
0:57:54 agree are not great, to empower those people as well, like, it’s wrong and it’s scary.
0:58:03 I have to ask you tactical questions because who better to ask tactical questions? So this is a
0:58:11 very tactical thread we’re going to go into now, which is, let’s say that you are a US citizen,
0:58:17 born and bred, you have no criminal record, you return to the United States from overseas,
0:58:26 and border patrol asks for your phone. Do you give it? Is it your regular phone or do you take
0:58:34 another phone overseas because you knew this might happen? Is it locked? Do you unlock it for them?
0:58:41 Or do you hand it to them and you say, have at it, boys, try to decrypt this phone? What’s your
0:58:42 attitude at the border?
0:58:48 Sadly, our border is largely a constitutional rights free zone. EFF did a case a few years ago
0:58:53 where we tried to get the Fourth Amendment to apply to the border and we were not successful. We’re not
0:58:59 done. We’re going to keep trying. But you’re pointing out something really true, which is you have many
0:59:05 fewer rights to protect your phone at the border than you do otherwise. You still have to do some
0:59:10 threat modeling and figure out your situation. If you’re an American citizen and you’re coming back
0:59:15 into the country, they can detain you for a while, but they can’t kick you out. You have a right to
0:59:20 come back, but they can make you sit in detention for four or five hours while they try to open your
0:59:25 phone if you don’t open it for them. And you have to decide for yourself, is that something I want to
0:59:30 do that can be very uncomfortable? Other people are like, sure, that’s fine. But I think, do you have
0:59:34 another plane to catch? Are you going to miss your connection? What is
0:59:39 your life like? Are you trying to make it to your daughter’s wedding? Even as an American citizen,
0:59:42 you still have to think about your threat model as you’re coming into the country. And that should
0:59:49 inform what you decide to do. I do recommend that if you’ve got stuff on your phone or accessible
0:59:54 through your phone that you really do need to keep private. Think about taking a second phone. Think about
1:00:00 getting a burner phone that you use for that or a device like an empty Chromebook so that when you get
1:00:07 overseas, you can use a lot of your services that are cloud-based so you can log back in. You don’t need all that
1:00:13 information on the computer you carry. And same for coming back into the country. Wipe this stuff off
1:00:18 of it and then just sign back on again once you get back safely home. And the cloud computing revolution
1:00:23 has made that a lot more accessible to a lot more people than it used to be. The other thing I recommend
1:00:30 is if you are going to carry your own device through the border, turn it off. Turn it off. Because all the
1:00:36 devices, when you turn them back on again, require that you put in a password; the biometrics are turned off
1:00:41 until you put in a password to open it up again. And it’s encrypted at that point. They can break into most
1:00:47 phones, but it takes a lot more effort and a lot more money. And so you put them in a position where they
1:00:52 have to decide how much work they want to put into entering into your phone. And I sometimes say,
1:00:57 make them fish with a line and a pole. Don’t let them drift net fish through everything. And I think
1:01:02 for a lot of people, unless you’re really the target, that will mean that it’s not worth it to them.
1:01:07 They’ll trawl through what’s easy, but they’re not going to deploy the thing that they have to buy
1:01:11 in order to actually collect information from your phone. Again, it depends on what kind of target you
1:01:15 think you are and how important it is. But I always maintain you should make it a little harder on them.
1:01:19 Make them have to go through every step, even if at the end they might be able to have access.
1:01:24 Make them go through every step because a lot of people just wash out of the process along the way.
1:01:27 And I think that’s important to put them through it.
1:01:32 But what about the logic that if you refuse to unlock your phone,
1:01:34 that’s an admission that there’s something you’re hiding?
1:01:38 I don’t think it is an admission. I mean, at the end of the day,
1:01:42 they got to convince a jury or a judge. And I think as long as enough of us do it,
1:01:47 and it’s not just the guilty, then we need to combat that. Like privacy is a human right.
1:01:51 It’s your right. It’s your right not to have the law enforcement go
1:01:56 rifling through your stuff, unless they’ve demonstrated that you’ve done something wrong.
1:02:01 What we lawyers call a probable cause finding in front of a judge, right? That’s why we have the
1:02:04 Fourth Amendment the way we have it, which is they have to go to a judge. They have to say there’s
1:02:09 probable cause that you’ve violated the law. And then the judge has to agree with them. That’s what a
1:02:15 warrant is. If they haven’t done all those steps, then it’s your right to say, no, I’m not going to
1:02:21 voluntarily let you do this. That’s why I have a doormat that one of my interns gave me a long time
1:02:25 ago that says, come back with a warrant. And the EFF actually has them as stickers for your phone.
1:02:32 That due process protection is important. And if you just decide that you don’t want that protection
1:02:36 anymore, of course, that’s your right. But that doesn’t mean that you’re doing something wrong if
1:02:41 you avail yourself of the protection of the law. And I think we all need to stand up for that. Not
1:02:47 letting police just blow past all the protections that people fought and died for us to
1:02:54 have, that is standing up for our rights as citizens. To me,
1:03:00 that’s a patriotic thing to do. The reason we fought a war against a king to have our own country was so
1:03:06 that we could set our own rules and have a government that abided by them. Holding the
1:03:10 government to the rules is, to me, the more patriotic thing to do, not less.
1:03:19 A few seconds ago, you used the phrase stuff that you might want to hide. But what is the definition
1:03:25 of that? I would think that on almost anybody’s phone, you could find a place where you said,
1:03:31 these tariffs are stupid. It’s going to ruin our economy. Are we at a point where, oh my God,
1:03:37 what if the border patrol saw me say that on social media because they opened up my phone? Am I going
1:03:42 to be deported or something? Yeah. Where are we on that? Well, I think it’s getting more and more
1:03:47 scary. And the Trump administration is trying to require people who want to come to the United States
1:03:52 and get visas to open up their social media to turn everything to public that used to be private. It’s
1:03:57 horrible. And we need to fight this proposal as best we can. I don’t think it’s constitutional,
1:04:02 but yeah, I think one of the things about the time we’re living in, which is really, really scary,
1:04:07 is that the needle is moving so fast and so unpredictably that I think when you ask me,
1:04:14 what if I have nothing to hide? I don’t think anybody can feel safe right now that their presumption
1:04:19 of what that means for them personally, much less for all the people they talk to. Remember,
1:04:23 what’s on your phone isn’t just what you say. It’s what other people say to you that you have.
1:04:28 Even if you might not implicate yourself, you might implicate your friends who got pissed off and wrote
1:04:34 a text about being angry about something that now law enforcement is looking at to try to decide
1:04:40 whether they get to stay in the country or whether they get detained. We often say privacy is a team
1:04:44 sport. The other thing people have to remember is it’s not just them. They have information about all
1:04:50 the people who they communicate with, who they love, who they follow. And so I do think it’s a time where
1:04:58 that story ought to be going away pretty fast. Everybody has reason to want to avail themselves
1:05:04 of their constitutional rights to privacy, to avail themselves of due process. Even if you can’t think
1:05:09 of what you have that might be at risk, that story is changing so fast that I don’t think anybody can give
1:05:16 you an accurate, up-to-date risk profile for yourself. And you ought to take that into consideration.
1:05:24 What does it mean if you are threatened, if you or Wikipedia or NPR is threatened with the loss
1:05:28 of not-for-profit status? What would it mean to you if you lost that?
1:05:35 Oh, it would be terrible. Again, EFF gets support from individuals and many of those individuals get
1:05:40 a tax deduction for supporting us. Now, lots of people don’t. And I think that there is a community
1:05:45 of support that ought not be dependent on nonprofits. And we ought to think about that a little hard,
1:05:51 but we’ve built up this system for civil society, for nonprofits like NPR and otherwise,
1:05:58 that is really based on the idea that there is a tax protected status for our donations. That if that
1:06:03 goes away, people are going to have to get funding in a way that isn’t tax protected. And that’s okay
1:06:08 for individual donations, though again, there are wealthy individuals who itemize for whom this is an important
1:06:14 thing. And that’s a big source of funding. But there’s a lot of poor people who support charities even if they
1:06:20 don’t get a tax deduction. So we need to think about that. But when it comes to foundations and
1:06:26 other kinds of money, many of those foundations can only give to organizations that have C3 status.
1:06:30 So if the MacArthur Foundation or the Ford Foundation, or even the foundations on the right
1:06:36 wanted to give money that wasn’t tax-free, they can’t. They have to change their whole charters and
1:06:42 ways of being in order to do that. So it’s a huge drain of money and support from these organizations
1:06:48 that do everything from soup kitchens and being in the courts to religious organizations. They’re all C3,
1:06:55 those kinds of things, as well as people like me who do civil liberties and civil society protections to
1:07:00 people like NPR and other things who provide us information. It’s a huge blow and a huge risk to
1:07:06 this entire sector. Again, because we built up a system that is all interlocking and is all based
1:07:12 on the idea that the IRS C3 protection means that something is in the nonprofit side.
1:07:20 So how do you think this all plays out? There’s like some possibilities, like we all wake up,
1:07:27 there’s a midterm slaughter, we all go, phew, we duck that bullet. That’s one possibility. Another
1:07:34 possibility is Margaret Atwood come to find out wasn’t a novelist. She was a historian. She got it all right.
1:07:41 Another possibility is we have this performative democracy with a constitution and separation of
1:07:47 powers and balance of power, but none of that is really true. What’s your prediction for what’s
1:07:52 going to happen? I’m so bad at predicting. I’m really not good at it. We’re going to work really
1:07:57 hard to make sure that we end up in a position where we are still a self-governing, constitutionally
1:08:02 protected society. We’re going to pull all the levers we can. I would say, look, I was a person who
1:08:09 told the founders of Wired Magazine that the country didn’t need another tech magazine. I am so bad at
1:08:14 predicting the future, but I can say that we won’t get to a better future unless people lean in and try.
1:08:19 We’re not going to be able to just sit back and have this magically fix itself. We didn’t get into this
1:08:24 problem overnight. I don’t think we’re going to get out of it overnight. And we need people to vote,
1:08:30 to lean in, to support the organizations that are working to do this. Of course, EFF is one of them,
1:08:35 but we’re not the only one. Whatever speaks to your heart. We need people to show up. We don’t have an
1:08:43 armchair democracy anymore. We need people to show up to make their voices heard because without that,
1:08:48 we will definitely lose. Sometimes people ask me, are we just going to lose no matter what? And I’m
1:08:54 like, we could lose today or we could fight and lose in the future. And those are the only two choices,
1:08:58 because if you just sit back, it’s not going to get better magically. So I think people need to lean
1:09:03 in. They need to engage. They need to find what speaks to their heart and really show up for it.
1:09:08 I hope for some people that’s EFF because I think we do show up and we have done it. We know how to push
1:09:14 right now. But if that’s not the thing, find the thing that works for you because we’re not going to get
1:09:19 out of this by just sitting back and magically thinking things are going to get better. Tech’s
1:09:25 not going to solve this all on its own. Tech needs people who are willing to step in and make sure that
1:09:29 our tech and our world support us rather than suppress us.
1:09:36 Wow, Cindy. So listen, I want to thank you. I want to thank you in two senses. The first sense is,
1:09:42 of course, for the simple act of coming on my podcast, because it’s been a very remarkable
1:09:49 podcast. But even bigger, I want to thank you and the EFF for the work that you’re doing to preserve
1:09:56 democracy. The work you’re doing is so important. And as soon as I hang up, I’m going to send you money.
1:10:00 Thank you so much. Oh, Guy, that’s wonderful.
1:10:05 Thank you very much, Cindy. And just let me thank the Remarkable People team. That would,
1:10:13 of course, be Madison Nismer, who is our producer. Jeff C is our co-producer. And we have a sound
1:10:19 design engineer named Shannon Hernandez and a researcher named Tessa Nismer. So, Cindy,
1:10:24 that’s all the people on the Remarkable People team. And we’re trying to make this a remarkable,
1:10:25 long-lasting democracy.
1:10:32 This is Remarkable People.

Four years out of law school, and she’s taking on the entire U.S. Department of Justice? Meet Cindy Cohn, the attorney who turned a Haight-Ashbury party connection into one of the most pivotal legal victories in internet history. As Executive Director of the Electronic Frontier Foundation—the world’s leading digital rights organization—Cindy commands a team of 125 lawyers, technologists, and activists fighting the surveillance state daily. She spills the brutal truth about encryption backdoors threatening global security, why the “nothing to hide” argument crumbles in 2025’s political reality, and how well-intentioned laws become authoritarian weapons. From tactical Signal advice to border crossing strategies, Cindy shares the security practices she actually uses while exposing how the UK’s encryption demands could destroy privacy worldwide. This conversation will shatter your assumptions about online privacy and arm you with the knowledge to fight back against the surveillance state while revealing EFF’s urgent mission to reclaim our digital democracy.

Guy Kawasaki is on a mission to make you remarkable. His Remarkable People podcast features interviews with remarkable people such as Jane Goodall, Marc Benioff, Woz, Kristi Yamaguchi, and Bob Cialdini. Every episode will make you more remarkable.

With his decades of experience in Silicon Valley as a Venture Capitalist and advisor to the top entrepreneurs in the world, Guy’s questions come from a place of curiosity and passion for technology, start-ups, entrepreneurship, and marketing. If you love society and culture, documentaries, and business podcasts, take a second to follow Remarkable People.

Listeners of the Remarkable People podcast will learn from some of the most successful people in the world with practical tips and inspiring stories that will help you be more remarkable.

Episodes of Remarkable People organized by topic: https://bit.ly/rptopology

Listen to Remarkable People here: **https://podcasts.apple.com/us/podcast/guy-kawasakis-remarkable-people/id1483081827**

Like this show? Please leave us a review — even one sentence helps! Consider including your Twitter handle so we can thank you personally!

Thank you for your support; it helps the show!

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
