AI transcript
0:00:02 There’s a lot nobody tells you about running a small business,
0:00:08 like the pricing, the marketing, the budgeting, the accidents,
0:00:13 the panicking, and the things, and the things, and the non-stop things.
0:00:17 But having the right insurance can help protect you from many things.
0:00:20 Customize your coverage to get the protection you need
0:00:22 with BCAA Small Business Insurance.
0:00:28 Use promo code PROTECT to receive $50 off at bcaa.com slash smallbusiness.
0:00:34 Using AI chatbots is pretty easy.
0:00:38 Knowing how to feel about them, that’s more complicated.
0:00:42 You know, and I don’t think that biologically we’re necessarily equipped
0:00:46 to be emotionally handling this type of relationship
0:00:48 with something that’s not human.
0:00:50 Our AI companions.
0:00:52 That’s this week on Explain It To Me.
0:00:56 New episodes every Sunday, wherever you get your podcasts.
0:01:06 Episode 357.
0:01:09 357 is the area code covering parts of Central California.
0:01:14 In 1957, the baby boom hit its peak with more than 4.3 million births,
0:01:17 and Sputnik launched, which kicked off the space race.
0:01:20 I remember when I was a younger man, a boy actually,
0:01:24 saying to my mom, Mom, someday I’m going to be shot into space.
0:01:25 To which my mom replied,
0:01:28 well, if your dad had done his job, that would have happened.
0:01:31 Go! Go! Go!
0:01:42 Welcome to the 357th episode of The Prof G Pod.
0:01:43 I am in Aspen.
0:01:44 Why am I here?
0:01:45 Because I can be.
0:01:46 I absolutely love it here.
0:01:48 Building a home here.
0:01:51 This is where I’m going to sit around and wait for the ass cancer,
0:01:54 meaning this is where I think I’m going to leave feet first,
0:01:58 where I plan to wind down and give up podcasting when I’m, I don’t know, 93.
0:01:59 I think that would be a good time.
0:02:02 I used to think when I was a younger man, when I was in my 40s,
0:02:06 that I was going to create space or room and go totally dark on social media
0:02:08 and stop podcasting by the time I was 50.
0:02:09 But here’s the thing.
0:02:12 I love the fame, the relevance, and the Benjamins.
0:02:20 But I am here, and our technical director, Drew, rented an apartment at the Aspen Alps
0:02:23 right on the mountain here and set up this giant studio.
0:02:25 So I hope you appreciate all the production values here.
0:02:30 They told me to take off my hat because they didn’t like the way it looked.
0:02:31 And I took it off and I thought, you know, fuck it.
0:02:32 It’s my image.
0:02:33 It’s me.
0:02:34 They’re AI.
0:02:34 I’m me.
0:02:35 I own me.
0:02:36 I own the digital Scott.
0:02:39 It’s been an emotional weekend for me.
0:02:43 I’ll get back to that later in the episode and I look like shit and I’m self-conscious
0:02:45 and, you know, all that good stuff.
0:02:47 But what are we doing here?
0:02:48 We’re very much enjoying ourselves.
0:02:53 We used to come to Aspen in the winter, put our kids on skis, and came here in the summer,
0:02:55 and now we just come here in the summer.
0:02:57 I think mountain towns in the summer are absolutely wonderful.
0:03:00 I went to this place called Woody Creek Tavern yesterday.
0:03:05 And a bunch of people rolled up on a horse, which I thought was ridiculously cool.
0:03:06 Okay, what’s going on?
0:03:08 The Epstein file.
0:03:10 I got this wrong.
0:03:11 I thought it was going to blow over.
0:03:12 I thought people were sick of hearing about it.
0:03:17 But it turns out that when you promote a conspiracy theory for a good, I don’t know, five or seven
0:03:21 years and won’t stop hammering on it, and keep talking about this file and this list, and then
0:03:24 you decide, oh no, I’m on the list and I’d rather it not come out,
0:03:28 so nothing to see here, folks, keep moving along, that everyone gets angry.
0:03:33 I did watch, I did enjoy watching Alex Jones cry in his car over the Epstein list.
0:03:39 But a lot of this comes down to sort of a major theme, I think, or a broader theme, and that
0:03:40 is one of identity.
0:03:46 And I think that under the auspices of being able to create bots, of not being subject to standards
0:03:52 around moderation, and of not taking responsibility for the comments they make,
0:03:59 identity, or specifically some sort of fidelity or irrational passion for the value of anonymity,
0:04:05 has really hurt our society. Look at the most depraved behavior on
0:04:06 behalf of our government right now.
0:04:11 I would argue, simply put: is it the administration cutting food stamps?
0:04:13 That’s right up there.
0:04:20 Or the world’s wealthiest man murdering or killing the world’s most vulnerable and poorest
0:04:20 children?
0:04:21 That’s right up there.
0:04:28 But right close, maybe a close third, would be a bunch of individuals who’ve been weaponized
0:04:34 to create a private army for the president, who rip families apart and are now, I
0:04:35 guess, rounding up citizens as well.
0:04:39 When you treat people differently based on identity, that is the definition of racism.
0:04:44 And these actions are, in fact, racist, where they’re targeting people based on their identity,
0:04:45 not on their behavior.
0:04:47 And what do we have?
0:04:51 We have individuals who realize how depraved this behavior is, so they wear masks.
0:04:53 They hide their identity.
0:04:56 And online, we have a lot of people with masks.
0:05:02 It’s somewhere between 20, 40, sometimes 50 percent of activity on a social media platform that are bots
0:05:06 who have been weaponized by someone who doesn’t want you to know their identity, because what
0:05:09 they’re saying is either slanderous or they’re too cowardly to live up to it, or they would
0:05:13 be embarrassed to say such aggressive, inaccurate things.
0:05:19 And so we tolerate it under some bullshit notion that a civil rights activist in the
0:05:20 Gulf needs anonymity.
0:05:24 Well, with the blockchain, you could probably allocate a certain number of anonymous
0:05:26 accounts if, in fact, they needed the anonymity.
0:05:30 But the 99.9 percent of people who are just acting like cowards or being aggressive or tearing
0:05:34 at the fabric of our society because of anonymity, I don’t buy that bullshit.
0:05:42 When some idiots at UCLA decide to pass out bands to non-Jews and then won’t let anyone without
0:05:46 a band, i.e. Jewish people, onto certain parts of the UCLA campus, and the UCLA leadership does not
0:05:49 show up to stop that shit right away.
0:05:51 What do those people do?
0:05:52 They wear masks.
0:05:58 So I think it’s pretty easy to spot depraved people who are about to engage in things that
0:06:01 they do not want to associate their identity with because they are wrong.
0:06:06 And whether it’s a stormtrooper for Star Wars, a member of the KKK, a member of ICE,
0:06:12 or all of these bots online, anonymity has become a real problem in our society.
0:06:18 And that is, just as an example, I get a lot of really nice messages online.
0:06:20 I also get some of the vilest shit I’ve ever seen in my life.
0:06:24 And if I were a woman, I would feel physically intimidated.
0:06:28 And I’ve been forwarded some messages that some of my female friends get online and it
0:06:29 is just totally unacceptable.
0:06:31 And it’d be pretty easy.
0:06:35 Find out who that motherfucker is on the other side of that keyboard and they will stop because
0:06:40 they will realize what they’re doing not only carries penalties, but just does not acquit
0:06:41 them very well.
0:06:45 But instead we’ve decided, oh, we need to sell more Nissan ads,
0:06:48 under this bullshit notion that anonymity is key to progress.
0:06:49 No, it’s not.
0:06:54 And you could have a certain amount of anonymity for people who have a legitimate reason to be
0:06:54 anonymous.
0:07:01 But there is an issue here around our love of letting people have no accountability for their
0:07:06 actions under the auspices of some sort of First Amendment or free speech or protection.
0:07:10 And it has gone too far and the snake is eating its tail.
0:07:14 I would like to see, I like the fact that there’s cameras everywhere in New York and London.
0:07:19 What you also need when you have this kind of surveillance technology is really strong laws
0:07:26 around its use. I don’t even think camera footage or online tracking should be usable to prosecute
0:07:27 someone for a misdemeanor.
0:07:31 I think it has to be a very serious crime and there has to be a lot of safeguards that err on
0:07:36 the side of not getting a search warrant for that data such that people feel comfortable being
0:07:39 their true selves, but at the same time have to represent their identity.
0:07:42 But where I was headed was some really vile shit online.
0:07:47 I’ve been recognized several times in Aspen and people couldn’t be nicer, or wherever I
0:07:50 am in the world, even when people disagree with me, they come up and say, I didn’t like your
0:07:50 take here.
0:07:51 This is what I think.
0:07:52 And they listen and they’re thoughtful.
0:07:58 And one of the really terrible things about AI and LLMs is LLMs are crawling the online
0:08:03 world, which is much harsher and much more cowardly and much more mendacious.
0:08:03 Why?
0:08:04 Because of anonymity.
0:08:09 Whereas if these AI LLMs were crawling the real world where people have to take responsibility
0:08:14 for what they say and you get to look them in the eye when they say something, I think the
0:08:19 world would be a better place because AI would be training people how to behave in person where
0:08:23 you have accountability as opposed to training the world to behave the way they behave online.
0:08:28 And it’s not only individuals putting out negative behavior.
0:08:32 There are people who want to create dissent and tear at the fabric of America, i.e.
0:08:39 the GRU and the CCP, and create millions of bots that manufacture content that doesn’t even reflect
0:08:44 how any individual feels, but gives you the impression that this is how millions of people
0:08:44 feel.
0:08:49 If you were Russia, and say there were a pro-Ukraine professor who was constantly talking
0:08:54 about Putin’s illegal invasion of Europe and how the U.S. should absolutely allocate the funds to
0:08:59 push back on a murderous autocrat, wouldn’t you be stupid not to create a troll farm in Albania
0:09:05 and then slowly but surely using AI, try to undermine that professor’s credibility with negative comments
0:09:10 all of the time about any of his or her content?
0:09:12 And I believe those lists have been assembled.
0:09:14 It would be stupid not to weaponize those lists.
0:09:15 And oh, great.
0:09:20 We have social media platforms that love the lies because the lies and the aggressive behavior
0:09:22 create more engagement.
0:09:24 The algorithms are like a Tyrannosaurus rex.
0:09:26 They’re attracted to movement and violence.
0:09:29 And it creates more clicks, more engagement, and more Nissan ads.
0:09:30 So where are we?
0:09:35 Should an individual have First Amendment rights and be able to say pretty much anything about
0:09:37 pretty much anybody at pretty much any time?
0:09:38 Yeah, I think so.
0:09:41 But should a bot have First Amendment protection?
0:09:42 I don’t think so.
0:09:44 Should we be creating this atmosphere,
0:09:47 and it has gotten much worse over the last 20 years,
0:09:53 where anonymity serves as a chaser and an incendiary to take the worst among us,
0:09:56 absolutely expand that behavior, forgive them for it,
0:09:59 encourage that behavior, and also let bad actors
0:10:04 pretend to be people they’re not and engage in some of the most uncivil conduct
0:10:07 experienced in our society?
0:10:12 So I’m a big fan of getting rid of this love of anonymity.
0:10:16 And if you look at what’s going on, whether it’s ICE, whether it’s troll farms,
0:10:21 whether it’s people spewing hate speech on campus, what’s the problem?
0:10:21 Anonymity.
0:10:24 You want to show up and protest?
0:10:24 Fine.
0:10:30 But if a movement is one where everyone on your side feels the need to wear a mask,
0:10:35 I think that says something about what you’re saying and something about your character.
0:10:39 Anonymity has been abused and it is tearing at the fabric of our society.
0:10:41 Okay, moving on.
0:10:45 In today’s episode, we speak with Greg Lukianoff, a free speech advocate,
0:10:49 First Amendment attorney, president of FIRE, the Foundation for Individual Rights and Expression,
0:10:55 and co-author of The Coddling of the American Mind, and most recently, The Cancelling of the American Mind.
0:10:57 I’m going to bring up some of these topics with Greg.
0:11:00 We discuss free speech in a divided country,
0:11:04 how cancel culture took off, and what today’s campus protests tell us about the state of open debate.
0:11:09 We also get into how schools are failing to build resilient students.
0:11:13 So with that, here’s our conversation with Greg Lukianoff.
0:11:27 Greg, where does this podcast find you?
0:11:32 Maine, actually. This is my first day up here since last year.
0:11:34 First day up there. Where are you usually?
0:11:35 D.C.
0:11:40 Oh, nice. So let’s bust right into it. Give us your thoughts on cancel culture.
0:11:47 How did it start? Brief history of it. And how does it differ from accountability, so to speak?
0:11:53 Sure. I mean, I wrote a book called The Canceling of the American Mind. When I started the project,
0:11:59 my co-author, Rikki Schlott, was 20 years old; she’s absolutely brilliant, and I feel very lucky to work with her.
0:12:07 And it definitely was one of those things that was a really striking discontinuity from the rest of my career.
0:12:11 I started working defending free speech on campus in 2001.
0:12:17 And back then, you were most likely to get in trouble on campus from administrators.
0:12:20 Professors were fairly good on freedom of speech.
0:12:25 Students were great on freedom of speech and freedom to differ, differing opinions.
0:12:36 But right around 2013, going into 2014, a cohort of students showed up that were much less, you know, just to be blunt, tolerant of difference.
0:12:41 But essentially, you started seeing a lot more demands that speakers be canceled.
0:12:44 You started seeing demands for new speech codes.
0:12:47 And this was a big shift from what I’d seen before.
0:12:50 But I also started seeing some of this happening off campus.
0:12:54 So I tried to define cancel culture as a historical period.
0:13:00 Because all moments in the history of censorship have commonalities.
0:13:03 But they also have things that make them distinct.
0:13:13 And I think one of the distinct characteristics of cancel culture is that it was essentially impossible to have it as we understood it without something like social media.
0:13:22 Something that allows you to create the reality or oftentimes appearance of a sudden mob that’s demanding you fire that one employee.
0:13:24 And this wasn’t a subtle shift.
0:13:29 You know, I’d been doing this job for a long time prior to 2014.
0:13:41 And from 2014 on, I’ve seen more professors lose their jobs, more tenured professors lose their jobs, than what I’d seen in the previous half of my career, times 20.
0:13:44 You know, it was really quite a shift.
0:13:59 So the way that Rikki and I define cancel culture is the uptick of campaigns to get people fired, penalized, expelled, or otherwise punished for speech that would be protected under the First Amendment.
0:14:12 There, I’m making an analogy to public employee law, which basically means that there’s some common sense injected in there, but that essentially you’re not supposed to fire people just for their outside speech as a citizen.
0:14:15 And the culture of fear that resulted from it.
0:14:19 And one thing you should notice about that definition is there’s no political valence to it.
0:14:22 So cancel culture is cancel culture, whether it comes from the left or the right.
0:14:35 Isn’t that one of the deltas, though? There’s always been shaming or criticism of people if their narrative doesn’t match yours, or an opportunity to kind of play into gotcha culture.
0:14:42 But what I see is the big difference over the last 10, 15 years is the discovery that you could go after people’s livelihood.
0:14:48 It used to be that that was somewhat isolated, like we’re, I mean, I don’t like what Greg says.
0:14:49 I’m going to publicly shame him.
0:14:50 I’m angry at him.
0:14:51 He’s a bad person.
0:14:52 You shouldn’t be his friend.
0:14:53 You shouldn’t be listening.
0:14:58 But it never jumped the shark to now go after his livelihood.
0:14:59 Wasn’t that the big difference here?
0:15:08 Well, that was part of the definition: campaigns to get people punished in some real material way, like getting them fired, expelled, etc.
0:15:10 It’s not cancel culture if you’re just criticizing someone.
0:15:16 There was a phenomenon called trashing in the 1960s that Musa Al-Gharbi likes to point to.
0:15:23 And it’s this really nasty, vitriolic way of going after your political enemies that was everything you’re talking about.
0:15:24 It’s like the person’s a bad person.
0:15:25 Don’t listen to the person anymore.
0:15:31 Or they’re not, you know, they’re not doctrinaire, as I would like to be doctrinaire.
0:15:34 But it generally didn’t get to the point of, and this professor has to be fired.
0:15:41 And the biggest shift, the one that kind of shocked me, was the uptick around 2017.
0:15:46 Because at first, students were focusing on each other and outside speakers.
0:15:52 But 2017 really marks the moment when they started going after professors in large number.
0:15:56 Talk a little bit about, quote unquote, de-platforming.
0:16:15 The way we define de-platforming in our research department at FIRE is essentially getting a speaker disinvited, or making it so difficult to hear them, or otherwise chasing them off campus.
0:16:23 The ones that scare me, the kind of de-platforming that actually scares me the most are the ones that involve violence or the threat of violence, for obvious reasons.
0:16:26 It’s primarily targeted at speakers.
0:16:31 We also consider it de-platforming if you do the same thing to, like, say, playing a movie.
0:16:34 That essentially you’re showing a documentary that’s not very popular on campus.
0:16:36 Students show up and shout it down.
0:16:40 But this is actually one of the areas where a lot of it actually comes from the right as well.
0:16:49 Because for a lot of speakers on campus, there’ll be off-campus pressure to get that person disinvited.
0:16:57 And this is particularly true of, say, speakers that could be painted as, like, pro-choice; sometimes Catholic groups,
0:16:59 I think the Cardinal Newman Society is big on this,
0:17:03 Pressure schools to disinvite that person.
0:17:12 So generally, and this is for fairly obvious reasons, if the threat to free speech and de-platforming comes from the left, it tends to come from on campus.
0:17:15 If it comes from the right, it tends to come from off campus.
0:17:19 I saw some of this as a faculty member at NYU.
0:17:31 I remember about 10, 15 years ago, it became sort of in vogue for department chairs to put out very long emails about how certain microaggressions would not be tolerated.
0:17:38 And it was, OK, we’re charging kids $280,000 to come here.
0:17:41 Some of them leave riddled with student debt.
0:17:44 Two-thirds of the faculty probably isn’t pulling their weight.
0:17:47 So there was an opportunity to step into this virtue circle.
0:17:50 And no one could ever criticize them
0:17:51 or do anything but applaud.
0:17:52 Otherwise, you were a racist.
0:17:59 And there was this, almost this sort of self-appointed police.
0:18:12 That, and it was always, quite frankly, and I consider myself a progressive, but people never got canceled for being too progressive.
0:18:15 And it felt very unhealthy.
0:18:27 And then, well, comment on that, and then I’m going to play identity politics and just make some anecdotal observations in the classroom and see if there’s any actual data that supports my thesis.
0:18:43 But talk about how all of a sudden, do you think some of it is, I just saw it as people who weren’t adding any actual value and were trying to find some merit, grab virtue or some sort of relevance, and saw this as an easy way to try and grab status, so to speak.
0:18:47 Yeah, I think it’s a lot of things going on at once.
0:19:12 Well, one thing from my work with Jonathan Haidt, one chapter we ended up leaving out because I just didn’t have enough research to back it up, was my intuition that a lot of the phenomena we were seeing seemed to be playing out some of the values of the circa 2010 anti-bullying movement.
0:19:17 If you give me a little time to develop this, I can explain it.
0:19:31 So in coddling, we talk about there being three great untruths, which are basically terrible advice to give someone that’s inconsistent with either modern psychology or ancient wisdom, and that will make you more miserable if you believe them.
0:19:37 And so the first of these untruths is: what doesn’t kill you makes you weaker, you know, kind of like the opposite of Nietzsche.
0:19:43 The second one is always trust your feelings, which sounds nice, but it’s just awful advice.
0:19:54 And three, life is a battle between good people and evil people, which is contrary to a more sophisticated understanding that everybody has some aspect of good and evil within them, which is more how I was raised.
0:20:10 And there was a critic who pointed out after coddling came out in 2018 that these were more or less kind of like the way anti-bullying was being taught after sort of like a moral panic about it.
0:20:14 Not that bullying isn’t real and should be addressed, but things manifest in their own ways.
0:20:23 And this was primarily due to parents being aware of more of this stuff because they could see it on their cell phones, they could see it on their screens.
0:20:27 And this did have an emphasis of, you know, human fragility.
0:20:31 If you feel that you’ve been wronged, you’ve been wronged.
0:20:39 And that there’s basically only two types of people, good people and evil people, you know, good people, victims and bullies.
0:20:43 I think that this wasn’t the only cause by any any stretch of the imagination.
0:20:51 But to explain why I think young people had such sympathy for this movement, I think they were framing it partially in that way.
0:20:55 But unfortunately, the rule of human behavior is that all motives are mixed.
0:21:07 It also met with something that my very young colleague and co-author on The Canceling of the American Mind was able to point out: she was part of the first generation of people to grow up with cell phones in their pockets.
0:21:09 She had one since she was 10.
0:21:16 And in junior high school, you know, it took on very much the nature of what you would expect.
0:21:32 A mass communication device given to kids in junior high school, it became a way of showing aggression against your perceived enemies, but doing it within the rules of the time, which you don’t call out someone for being unpopular or ugly.
0:21:37 You call them out for being something more that makes you allows you to feel more moral.
0:21:49 So I think these two mixed motives kind of came together for the students. When it comes to the utterly crucial role, though, of the administrators: if administrators had looked at this and gone, no way.
0:21:49 No, no, no, no, no.
0:21:53 You’re not getting a professor fired because you don’t like what they said.
0:21:55 This would have died in the crib.
0:22:01 But they met those same administrators we’ve been fighting against at FIRE forever.
0:22:20 And together it created this kind of calamity for freedom of speech, where these people who already believed it was their job to say what shall be orthodox on this particular campus met a cohort of students that were more willing to play along with that, too.
0:22:22 And again, as with all things, with mixed motives.
0:22:27 You said something that I thought was so, I don’t know, puncturing.
0:22:35 You said that students are being taught the mental habits of anxious and depressed people, which really struck me.
0:22:42 And I thought we’re teaching the kids to be fragile and actually make them less resilient.
0:22:45 It’s more than just word.
0:22:48 These I have seen this evolution where kids in my class.
0:22:50 They feel weaker.
0:22:56 It’s not just it’s not just a cool virtue thing and trendy or fashionable.
0:23:01 They appear to me to be less resilient.
0:23:03 Talk about that.
0:23:29 Yeah, I mean, the whole project with me and John came out of my observation when I was dealing with my own anxiety and depression, and cognitive behavioral therapy is what saved me and utterly transformed my life. It’s a process by which you develop all these tools for talking back to the exaggerated voices in your head that tell you things like you’re doomed, or you’re a failure, or nobody loves you.
0:23:34 All of these kinds of voices that to a degree, to be clear, everybody except sociopaths have.
0:23:39 But, you know, when you’re anxious and depressed, they’re louder and they’re harder to ignore.
0:23:53 The amazing thing about CBT is it teaches you that if you rationally interrogate these voices, not power-of-positive-thinking stuff, but just rationally, you realize
0:23:59 you’re overgeneralizing, or engaging in fortune telling, or mind reading, all things that logically don’t really stand up to scrutiny.
0:24:11 And the observation that really brought me to talk to John about a potential collaboration, although actually I just told him the idea that I thought was cool, I didn’t actually think we’d collaborate on it,
0:24:26 and that was a dream come true, was just that it was like we were teaching kids reverse CBT: teaching them do overgeneralize, do catastrophize, do engage in binary thinking, do believe the worst about the future.
0:24:45 And it comes from, I believe, two different places. One, a very well-meaning idea from both parents and administrators, all through K through PhD, an instinct to sort of protect young people and to insulate them from harm.
0:24:57 But then a less admirable quality is that essentially, if you make people feel guilty or frightened, it in theory will motivate them towards political action that you prefer.
0:25:01 And to me, that’s the one that makes me pretty angry.
0:25:06 The first one makes me kind of sad because it’s like, yeah, it’s an understandable instinct, but it’s still terrible advice, and you should have known that.
0:25:22 The second one is the idea that we can guilt, shame, anger, and upset people by telling them that they are more fragile than they are, that they are in greater danger than they are, and that that will somehow result in a better world.
0:25:36 And I always make the point, listen, this is a bad calculation, even just rationally, because people who are filled with despair and anxiety don’t always choose, to say the least, the best course of action to get from point A to point B.
0:25:41 Yeah, I want to make some observations anecdotal, and you tell me if there’s any data to back it up.
0:25:56 I’m trying to think of that product, was it called JotForm, where all of a sudden someone would get upset by something, spin up an online petition, and within a certain amount of time everybody thought it was cool to join in, and the dean had to deal with this bullshit.
0:26:17 And I do think a lot of it was bullshit. But a couple observations, and you tell me if the data nullify or validate them. I never got in the way, or rather was never subject to this sort of scrutiny or blowback, because, one, I’m known as being provocative and, quite frankly, a little bit aggressive and obnoxious, so the expectation is there.
0:26:32 And the first thing I say is, if you think there’s a non-zero probability, something I say is going to trigger you, I curse, I have certain unconscious biases I’m still working on, but if you think something’s really going to emotionally trigger you, you should call your parents and tell them to come get you, because you’re not ready for college.
0:26:41 So there’s a certain expectation that I’m going to be a little bit out there, and no one ever has come after me; I’ve never gotten run over by this.
0:26:53 I have some colleagues who are much more thoughtful and considerate than me, 99.9% of the time, and then they make an error.
0:27:01 They’re inarticulate around something, and it’s shocking because they’re known as these nice, benign people.
0:27:15 And then they get taken, you know, they get taken out and shot, because it’s almost like those of us who are a little bit more aggressive and provocative regularly are not subject to the same scrutiny as someone who makes one false move.
0:27:33 And then the second observation I would make, and this is identity politics, but I’m going to do it anyways: the people I have observed in class who get really upset, I mean physically upset, they’re not faking it, tend to be women, tend to be white women.
0:27:35 Upper-class white women, yes.
0:27:37 What’s the data there?
0:27:48 The data on women, particularly white women, and particularly upper-class white women being more free-speech skeptical is just very apparent.
0:27:55 And that’s one of those things that, you know, it makes me a little uncomfortable to say it, but it’s a consistent finding.
0:28:00 And it’s something that we found among the professorate, among students as well.
0:28:11 And, of course, I think the free-speech skepticism comes from a good place, but so does all of censorship, you know, so I don’t think that that’s a strange observation.
0:28:25 The observation that the professors who start out more sympathetic tend to be more vulnerable definitely accords with my professional experience, that it certainly seems that way.
0:28:37 But then there’s also a category of sort of outspoken conservative or outspoken, you know, iconoclastic professors that I’ve definitely seen get targeted quite aggressively over the years.
0:28:48 So, in some ways, I will say that some of your observations come a little bit from luck, because, like, you get one student or one administrator who decides, I’m getting Scott Galloway, and the whole dynamic changes.
0:28:52 We’ll be right back after a quick break.
0:29:09 Whether you’re a startup founder navigating your first audit or a seasoned security professional scaling your GRC program, proving your commitment to security has never been more critical or more complex.
0:29:11 That’s where Vanta comes in.
0:29:22 Businesses use Vanta to build trust by automating compliance for in-demand frameworks like SOC 2, ISO 27001, HIPAA, GDPR, and more.
0:29:33 And with automation and AI throughout the platform, you can proactively manage vendor risk and complete security questionnaires up to five times faster, getting valuable time back.
0:29:36 Vanta not only saves you time, it can also save you money.
0:29:47 A new IDC white paper found that Vanta customers achieve $535,000 per year in benefits, and the platform pays for itself in just three months.
0:29:50 For any business, establishing trust is essential.
0:29:53 Vanta can help your business with exactly that.
0:29:59 Go to Vanta.com slash Vox to meet with a Vanta expert about your business needs.
0:30:03 That’s Vanta.com slash Vox.
0:30:11 President Trump met with the leaders of five African nations at the White House yesterday.
0:30:16 One oops got all the attention when Trump paid Liberia’s president a compliment.
0:30:18 Well, thank you.
0:30:19 It’s such good English.
0:30:20 Such beautiful.
0:30:23 Where did you learn to speak so beautifully?
0:30:26 English is Liberia’s official language.
0:30:28 Were you educated where?
0:30:29 Yes, sir.
0:30:31 In Liberia.
0:30:32 Yes, sir.
0:30:33 Well, that’s very interesting.
0:30:37 Anyway, you know what happened behind closed doors right before that meeting?
0:30:43 President Trump pushed those African leaders to accept people who are being deported from the U.S.
0:30:46 That’s according to a Wall Street Journal exclusive.
0:30:50 In fact, it’s trying all kinds of ideas to increase the pace of deportations.
0:30:53 And we’re going to tell you about some of them on Today Explained.
0:30:56 Today Explained is in your feeds every weekday.
0:31:06 This week on Net Worth and Chill, we’re diving deep into Trump’s one big, beautiful bill,
0:31:11 the sweeping legislation that promises to reshape America’s economic landscape.
0:31:15 From tax cuts to student loans, I’m breaking down what this massive piece of legislation
0:31:19 actually means for your wallet, your investments, and your financial future.
0:31:22 We’re going to find out who wins and loses in this economic overhaul,
0:31:25 analyze the market reactions that have investors buzzing,
0:31:30 and discuss whether this bill will deliver on its promises or create unexpected consequences.
0:31:34 Just because you’re not on Medicaid doesn’t mean this doesn’t impact you.
0:31:37 Poor people don’t stop having medical emergencies.
0:31:39 They just stop being able to afford them.
0:31:44 Listen wherever you get your podcasts or watch on youtube.com slash yourrichbff.
0:31:53 What are your thoughts on how the presidents of Harvard, MIT, and Penn handled that situation
0:32:01 and generally assess their response, and the Congressional Committee’s viewpoint on this?
0:32:03 And this is a difficult one.
0:32:10 And what is the line between free speech and hate speech from people in masks that creates an environment
0:32:12 that’s unhealthy for the community?
0:32:13 Your thoughts?
0:32:22 Now, there have been critics who came down hard on those presidents when they went to the anti-Semitism hearing in December of 2023.
0:32:33 There have been people who have been primarily critical of the fact that when asked if calling for genocide was protected or not on their campus,
0:32:35 they said it depends on context.
0:32:37 Now, here’s the truth.
0:32:39 It does depend on context.
0:32:46 In First Amendment law, if you’re saying something academically or theoretically,
0:32:52 in the course of a philosophical discussion, that is different than being like, I’m going to kill you.
0:32:53 So they were right about context.
0:32:58 But the reason why I nonetheless have zero sympathy, actually, to be fair,
0:33:05 I have sympathy for the president of MIT, because MIT has not been the best, but it sure as hell has not been the worst.
0:33:17 Penn and Harvard, Claudine Gay, I had no sympathy for them at all, because they’d been utterly terrible on freedom of speech prior to that point.
0:33:26 And definitely their critics, including people like Bari Weiss, have a good point in saying that these people who claim to be exquisitely sensitive about fat phobia,
0:33:34 which is the example that she usually uses, were suddenly not caring if someone said something that sounded an awful lot like
0:33:36 your country should be wiped off the map.
0:33:37 So what is the line?
0:33:42 The line, as far as FIRE is concerned, and we think to a large degree the law as well,
0:33:47 is something that actually crosses over into anti-Semitic or racial or sexual harassment.
0:33:51 And that’s not as simple as just saying something offensive.
0:33:56 Actually, it can’t just be saying something offensive, which, by the way, I think is absolutely the right rule.
0:34:04 I think the situation for free speech would be even worse than it currently is if we didn’t have that bedrock principle,
0:34:11 which in the law is that it has to be a pattern of discriminatory behavior directed at somebody for it to be harassment.
0:34:16 But a lot of what we saw on campus after October 7th, you don’t even have to get to that question.
0:34:24 A lot of what we saw was, you know, violent attempts to intimidate, actual threats, in some cases, physical assault,
0:34:29 taking over buildings, all of these things that are just not protected, nor should they be.
0:34:32 And particularly, this is something I just did a TED Talk on.
0:34:36 And I probably angered some people by opening with this example,
0:34:38 but I want to be really clear here.
0:34:41 There was a speech, for example, at Berkeley.
0:34:50 There was an Israel Defense Forces speaker there, and students organized to shut it down,
0:34:54 and it’s nice to actually have this in a screenshot of a text message sent to everybody:
0:34:56 shut it down.
0:35:05 And 200 students stormed where the guy was supposed to speak, you know, broke down a door, broke down a window, and chased the guy off.
0:35:09 And I always have to explain, okay, that’s mob censorship.
0:35:16 That is an attempt by 200 people to say to anybody who would want to hear this person,
0:35:19 you’re not allowed to hear this person because I don’t approve of them.
0:35:20 That’s not okay.
0:35:26 That’s the kind of thing that, in my opinion, should get you kicked out of a university because it means you’re not actually understanding the point of a university.
0:35:32 And one of the reasons why this angered some people is because I also tend to point out that from October 7th on,
0:35:38 in the two worst years for deplatforming involving violence and involving shout-downs, as they’re called,
0:35:44 all but about three were carried out by pro-Palestinian activists.
0:35:51 So it was one of these things where we were spending plenty of time defending the free speech rights of pro-Palestinian students,
0:35:57 pro-Palestinian professors, but nonetheless, we were also seeing these same students,
0:36:02 who expected to be protected by freedom of speech, showing no respect whatsoever for the free speech of others.
0:36:05 So, a lot of these issues are actually not that hard.
0:36:13 The hardest issue you get into is essentially if it’s not a threat, if it’s not blocking someone from getting from point A to point B,
0:36:16 it’s just really, really offensive.
0:36:22 At what point does that actually become anti-Semitic or racial or ethnic harassment?
0:36:27 And the answer is essentially that Davis test that we always refer to at FIRE,
0:36:32 which is, is it severe, persistent, and pervasive such that it causes a reasonable person
0:36:36 to be effectively denied an education.
0:36:43 So that’s a high standard, but it should be if you’re dealing with something that has an offense aspect to it.
0:36:47 But again, for a lot of these situations, you didn’t even have to get to that analysis
0:36:50 because what the students were doing wasn’t protected in the first place.
0:37:00 So, I’m asking this to learn, not to make a statement, but I struggle with this: I feel like anonymity has been conflated with free speech to our detriment.
0:37:10 That some of the really vile things you see online, you know, I understand how important it is for individuals to have First Amendment free speech rights,
0:37:12 but I don’t think that applies to bots.
0:37:20 And I think that an understandable protection for anonymity has morphed into a total lack of accountability
0:37:22 and a real coarsening of our discourse online.
0:37:27 And I think it extends into letting government agencies wear masks and things like that.
0:37:35 But I’m curious, I’d love to just get your thoughts on the fulcrum between the importance of people having the right to say things anonymously
0:37:39 because what they’re saying could trigger danger or self-harm or harm for them.
0:37:45 And at the same time, how this reverence for anonymity may have gone too far and resulted in a lack of accountability
0:37:48 and some really ugly shit spreading online.
0:37:50 Your thoughts, Greg?
0:37:54 Well, in terms of First Amendment law, anonymous speech is protected.
0:37:57 But I don’t think that’s a sufficient answer.
0:38:03 And I tend to think of the justification for anonymity as like a seesaw.
0:38:12 That essentially, if we lived in a free and enlightened society in which people welcomed dissent and welcomed disagreement
0:38:16 and there was no imaginable idea that you’d be punished for it,
0:38:22 then the justification for anonymity would kind of ring hollow to people.
0:38:24 They’d be like, who cares?
0:38:25 But we don’t live in that world.
0:38:27 And we live less in that world than we used to.
0:38:37 Because even, I’d say, 10 or 12 years ago, before cancel culture, saying something that was your genuinely held opinion
0:38:46 had a much lower likelihood of ruining your career, let alone of getting you actually punished in some way.
0:38:56 Now, that certainly applies, to a much larger degree than I ever thought I’d see, to a lot of countries in Western Europe at this point, a lot of countries in the Anglosphere.
0:39:03 I mean, you know, by different estimates, they’re arresting something like 30 people a day for offensive speech in Britain.
0:39:11 I’ve heard different accounts, but generally they go between 7 and 40 people being arrested a day for that.
0:39:14 Germany, you know, will brag about the fact that they do morning raids.
0:39:20 They did this on 60 Minutes as well, bragging about doing morning raids on someone who called a politician a penis.
0:39:25 Under that situation, the justification for anonymous speech goes way up.
0:39:27 Can it and is it abused?
0:39:29 Absolutely.
0:39:35 But, you know, I think actually I’m going to quote Milton Friedman here, but it’s just a really good quote.
0:39:37 Something isn’t a right unless it can be abused.
0:39:39 I like that.
0:39:52 What about Section 230, the idea that these nascent platforms aren’t subject to the same kind of libel, slander, and disparagement laws as traditional media platforms?
0:39:54 What are your thoughts on that?
0:39:59 I think we toy with Section 230 to our great peril.
0:40:07 I think that, you know, like democracy, it’s the worst system except for all the others.
0:40:13 Now, to be clear, there might be some other system that I haven’t thought of that could be better.
0:40:23 But I do find it almost amusing that conservatives are going after 230, or were going after 230, with such gusto.
0:40:47 Because let’s just take the defamation protections that 230 gives to ISPs, to Internet service providers: if suddenly those were to vanish, it would lead to Internet service providers censoring a lot more, like a lot, lot more, because they could be held liable for defamation.
0:40:57 And I think that given the biases in a lot of social media companies, that would wildly disproportionately affect what conservatives say.
0:41:02 So I think that overall we benefit so much from 230.
0:41:04 Of course, it’s going to have downsides.
0:41:25 But you don’t see an issue that traditional media platforms are struggling to stay viable and raise the funds to do fact-checking and put out, I don’t want to say the truth, but a greater attempt to do the good work of journalism, fact-check, and do their research?
0:41:33 You don’t see a problem with holding traditional media to an entirely different standard, a higher standard, than these online platforms?
0:41:35 Yeah, and that’s generally the way they’re held.
0:41:40 Essentially, traditional media is responsible for the content that it produces.
0:41:49 I think it makes more sense to hold them liable for defamation for not doing sufficient fact-checking, as opposed to something that hosts everything.
0:42:01 I mean, something that hosts the Wall Street Journal, the New York Times, YouTube, and everything else under the sun is quite distinct from just the New York Times by itself.
0:42:04 Do you think there’s opportunity for nuance or gray area?
0:42:08 And I’ll propose a solution or what I think we should think about.
0:42:09 I think about 230 a lot.
0:42:24 The idea that people can break through and say things and post something and that a company that creates a lot of economic value, lets a lot of interesting opinions, sometimes the conspiracy view ends up being actually more true than you think.
0:42:36 There’s been some just wonderful things about these platforms and the ability for viewpoints and consumers or content producers to kind of go direct to consumer and kind of have at it.
0:42:43 At the same time, I worry that the protection is not consistent in the sense that, well, let me propose a solution.
0:42:56 So my co-host on one of my podcasts, Raging Moderates, is kind of the sole Democrat on The Five, which is actually the most watched show on cable news.
0:43:15 Someone got upset that she called out Ken Paxton or something, mocked up a picture of her with her previous boyfriend, and has gone into this tried and true misogynistic, slut-shaming misinformation about her having an affair during her first marriage.
0:43:16 I mean, just total nonsense, right?
0:43:39 And the algorithm on Twitter loves that because it creates a lot of comments, a lot of engagement, people weighing in, conspiracy theorists, and also people protecting her, and the algorithm is trained to elevate that content and give it broader and further reach than it would get organically because it creates more Nissan ads.
0:43:42 In other words, there’s an economic incentive to spread this information.
0:43:49 Do you think there’s a solution or some gray area where maybe we say, okay, you’re a bulletin board and you can’t be responsible?
0:43:54 It’s just unrealistic, a suppression of speech, an economic impairment,
0:43:59 if you were responsible for policing everything someone pins up on the board.
0:44:08 But if you as a social media company decide to elevate algorithmically content and give it more spread than it might organically,
0:44:17 at that point, are you really different than an editor at CNBC or MSNBC or at Fox who’s subject to a different set of standards?
0:44:25 Shouldn’t they be subject to the same standards if they make the conscious decision to algorithmically elevate content?
0:44:37 I’m always worried about the distortive impact of government, and sometimes of regulation and liability.
0:45:00 And so I’m very hesitant to change anything; my job is to make the argument to err on the side of free speech as much as possible, and on the side of as few things being banned and as few things being government-regulated as possible.
0:45:13 So my fear is that essentially if you started having government entanglement with algorithmic choices, you really have to decide, one, which government do you trust?
0:45:15 Do you trust Biden to do that?
0:45:16 Do you trust Trump to do that?
0:45:26 But also, particularly from a liability standpoint, how distortive can that actually be to what gets reported in the first place?
0:45:32 Because this is, in a sense, kind of like why everybody sues for libel in Britain.
0:45:46 Even though they’ve made slight improvements around the edges, it’s still much easier to find people guilty of libel in Britain than it is in the United States.
0:46:02 We actually have a shield in this country, basically providing some modicum of protection for people in the U.S. from libel tourism that takes place in the U.K.
0:46:11 So, you know, I’m always going to be fairly skeptical of that kind of stuff, but it’s also my societal role to be skeptical of that kind of stuff.
0:46:14 We’ll be right back.
0:46:26 Foldable phones have been around for a while now, but maybe you’ve never used one.
0:46:31 This week on The Verge Cast, we take a look at Samsung’s new lineup of foldables.
0:46:37 This could be a big moment where foldable phones become a lot more interesting to a lot more people.
0:46:45 Plus, we look at executive shakeups at Apple, Meta, and X, where Grok is going absolutely off the rails.
0:46:54 Plus, we do our signature microphone test with the latest over-ear headphones, and we get into why it’s so hard to make a great strength training app.
0:46:56 That’s this week on The Verge Cast.
0:47:06 We’re back with more from Greg Lukianoff.
0:47:12 So I want to throw kind of the most difficult stuff at you and get your thoughts.
0:47:25 So if I were Putin and I’d lost a million men to a war and was spending, you know, $70 to $100 billion a year on a losing war,
0:47:31 and at some point, if this war continued to wreak the kind of economic and human damage,
0:47:36 if it were to continue to do that, at some point I might find myself falling out of a window.
0:47:43 So I think he would be stupid not to weaponize and spin up troll farms in Albania
0:47:48 and then create a list of the 10,000 most influential people online who are pro-Ukraine
0:47:54 and start attacking their reputation with millions of bots in a very thoughtful way.
0:47:56 Is he already doing that, though?
0:47:57 Well, yeah, I think he is.
0:48:01 And that’s my question, and that is, do bots have First Amendment rights,
0:48:04 and do these platforms have some sort of obligation,
0:48:09 which I think would only be registered or adhered to through some sort of regulation,
0:48:14 to protect us against bad actors that might be, quite frankly,
0:48:18 raising a generation of military, civic, and nonprofit leaders who don’t like America
0:48:20 or begin to have their views shaped,
0:48:27 and ultimately our votes and our military decisions shaped by outside forces
0:48:34 that are taking advantage of a very porous and lightly regulated tech ecosystem and platforms.
0:48:35 Your thoughts?
0:48:38 The question of whether or not bots have First Amendment rights,
0:48:42 of course they don’t, but do bot creators have First Amendment rights?
0:48:46 At least when they’re in the United States, absolutely, certainly they do.
0:48:50 When you’re talking about the kind of propaganda, kind of warfare,
0:48:55 and targeting that is possible in the age of the Internet, in the age of social media,
0:49:00 when you kind of fall down this rabbit hole of how you actually address it
0:49:05 without having huge government encroachment,
0:49:08 which will end in bad places as well,
0:49:12 or without creating massive unintended consequences,
0:49:18 the best way to do this historically has simply been to have authorities that people actually trust.
0:49:23 And we have blown giant holes in the only thing that really can protect you
0:49:25 from disinformation and misinformation.
0:49:32 And you have to start figuring out ways to get authorities that people essentially trust.
0:49:36 Because one of the ways we could potentially address some of this stuff
0:49:43 is by having institutions point out what is a troll farm and what isn’t.
0:49:46 But under the current environment, the response is going to be a lot of,
0:49:51 sure they are, you just don’t like what they’re saying.
0:49:55 And in a situation where these institutions had better societal trust,
0:49:56 it’d be like, oh my God, you’re right.
0:49:59 So you’re a First Amendment attorney.
0:50:02 What are we not paying attention to in the courts?
0:50:05 Have there been any legal decisions that you think are
0:50:11 especially important to the future of First Amendment or speech or its regulation or lack thereof?
0:50:15 What have you seen come down the pike that you think has not gotten enough attention?
0:50:17 Yeah, I mean, I think it got good attention,
0:50:20 but I think people haven’t thought through all the ramifications of it.
0:50:26 And this gets to your point on anonymity where we may disagree to some degree.
0:50:39 But the change in the law to say that you actually can require verification for kids,
0:50:44 and really for anybody, to use porn sites in Texas
0:50:50 is a case that could really have some serious bad ramifications
0:50:54 unless it stays relatively cabined.
0:50:57 Now, I was definitely among the First Amendment people saying,
0:51:01 listen, there’s a case called Ginsberg v. New York from the late 1960s
0:51:06 that says you can require store owners to put the, you know,
0:51:09 put the nudie mags, you know, on the back shelves
0:51:12 and to make sure that minors don’t get them.
0:51:19 But then we had decade after decade of the Supreme Court and other courts basically saying,
0:51:23 but online that can’t possibly apply for all sorts of,
0:51:25 and to be clear, very serious reasons.
0:51:28 I knew that wouldn’t last forever.
0:51:33 And then eventually Texas passed a verification regime
0:51:36 that was actually more complex than I originally understood,
0:51:41 but was first marketed as something where you had to basically show a driver’s license
0:51:46 if you wanted to see porn, and would also require disclaimers
0:51:51 saying that, you know, porn is harmful to your mental health and all this kind of stuff,
0:51:53 which raises compelled speech issues.
0:51:59 I think they made some efforts to sort of improve the law and make it clear that there’s other ways to verify.
0:52:04 Anyway, so that fight was something that I predicted we were going to lose in the Paxton case.
0:52:05 Now, here’s the question.
0:52:09 Are we then going to, with the best of intentions,
0:52:15 create an environment where you essentially can’t use the Internet without identifying yourself in some way?
0:52:20 And that scares me, because I do actually think that the situation for free speech,
0:52:25 even in the so-called free world, is dodgier than it’s ever been in my lifetime.
0:52:33 And the idea that at this precise moment we’d also make it harder for people to hide what they’re looking at
0:52:35 or what they’re reading scares me.
0:52:40 So our efforts at FIRE are definitely going to be to make sure that that decision,
0:52:46 as much as possible, stays cabined to kids’ access to adult materials.
0:52:48 Don’t you think the platforms are already doing that?
0:52:52 Don’t the platforms already know exactly what we’re doing, saying, and when?
0:52:58 But as long as they use it to monetize advertising, it seems to be there’s a tolerance for it.
0:53:01 I think the cat’s already out of the bag.
0:53:02 I think they already know everything we do and what we say.
0:53:04 We don’t seem that worried about that.
0:53:11 But then we have this, do we have this tremendous fidelity for protecting them
0:53:17 when it comes to any, I don’t know, forward-facing viewpoints that might result in more,
0:53:22 I don’t know, just more, it seems like we’re just protecting them in the wrong areas
0:53:24 and not looking at them in others.
0:53:26 I apologize for the word salad there, Greg.
0:53:28 Do you see any inconsistency?
0:53:30 Yeah, no, I definitely get the concern.
0:53:37 But I do think that there are tools that people badly underutilize that can actually protect your privacy.
0:53:38 Well, they’re purposely made complicated to utilize.
0:53:42 Have you tried to regulate your kid’s content on Face, on Meta?
0:53:48 I mean, I would argue they are purposefully not readily accessible,
0:53:49 or not easily used.
0:53:55 Yeah, no, and I definitely ask for help with my kid’s stuff.
0:53:59 But now we have, you know, we have Signal, you know, for example.
0:54:02 We use DuckDuckGo, you know, things like that.
0:54:05 I have Qustodio installed on my kids’ phones.
0:54:09 Yeah, there are some basic steps you can do to somewhat protect your privacy.
0:54:13 And of course, when it comes to private corporations doing bad things,
0:54:18 and this is something that I feel like we have an entire generation of young Americans
0:54:22 sort of brainwashed to believe that you should be more afraid of corporations
0:54:23 than you should be of governments.
0:54:30 And I just think that’s absolute nonsense, particularly foreign governments,
0:54:33 but also, frankly, the U.S. government.
0:54:39 And that corporations, you know, people talk about that evil profit motive.
0:54:43 And I’m kind of like, I prefer the profit motive to a lot of the other motives you can have
0:54:45 for finding this stuff out.
0:54:50 And the profit motive often lends itself to, by the way, we protect our users’ privacy,
0:54:56 whereas with, you know, the Chinese, the CCP, or Russia, or even our own government,
0:55:00 it’s like, no, we want this information for other reasons.
0:55:06 So just so you know, I’m very good at turning this podcast into, it’s really just an excuse
0:55:07 for me to talk about me.
0:55:14 Steve Bannon suggested that the president, that the administration sue me for some of the things
0:55:14 I’ve said about him.
0:55:16 I called him a rapist.
0:55:23 And do you feel that the president is, in different ways, trying to suppress free speech?
0:55:26 And if so, what laws apply, or what do you think should be done about it?
0:55:32 It feels to me like free speech is being chilled by the administration.
0:55:34 And I’m just curious to get your thoughts on it.
0:55:35 Sounds like you agree with it.
0:55:38 But what can and should be done to push back on that?
0:55:44 Well, it’s tough because, okay, the ways in which it’s being chilled, just really
0:55:48 quickly: there have been, you know, attacks on mainstream media.
0:55:52 People can argue that it’s deserved, but that doesn’t mean you get to violate the First
0:55:52 Amendment.
0:55:54 There’s attacks on higher education.
0:55:58 Again, you can feel like it’s deserved, but it doesn’t mean you get to violate the First
0:56:00 Amendment or existing laws.
0:56:05 And then there’s the attack on the law firms, which probably is the one that I think gets
0:56:07 the least attention, but probably scares me the most.
0:56:11 When it comes to the media, for example, we’re the group defending Ann Selzer.
0:56:18 Ann Selzer is the pollster in Iowa who got the poll really wrong right before the election,
0:56:21 having Kamala up by two points in Iowa.
0:56:23 And of course, that was way off.
0:56:25 She was like 11 points off.
0:56:28 But when it came out, she apologized.
0:56:33 She explained how she got it wrong, saying that she was using methodology that was really
0:56:39 effective maybe 10 years ago, but has gotten increasingly ineffective as fewer people have
0:56:40 landlines and that kind of stuff.
0:56:43 Because she used to be considered like the gold standard of pollsters.
0:56:47 But she was nonetheless sued by Trump himself, actually.
0:56:49 This was before Inauguration Day,
0:56:55 under a Consumer Protection Act in Iowa.
0:57:03 And the Consumer Protection Act was really designed to prevent false advertising, as in
0:57:07 commercial speech, saying that, you know, these pills will help you lose 40 pounds a day
0:57:12 type things, not getting a poll wrong, which is, you know, good-faith reporting.
0:57:14 So we’re defending her in that case.
0:57:19 Then there’s also like the 60 Minutes situation, the ABC News.
0:57:25 The 60 Minutes one, I think of as particularly bad because it really seemed like the administration
0:57:36 was dangling a proposed merger with Skydance in front of CBS, kind of implying, we’re not going to agree
0:57:39 to this unless you play ball, which is not good.
0:57:45 The university stuff, nobody’s been a bigger critic of Harvard, for example, than I’ve been.
0:57:51 They have finished dead last in our campus free speech rankings, one of the best and most data intensive
0:57:54 things that FIRE does, or the most data intensive thing that FIRE does.
0:57:57 And Harvard was dead last two years in a row.
0:58:02 But we’re currently defending Harvard because the letter the administration sent to Harvard
0:58:07 was basically saying, because you’re probably in violation of Title VI, which they may be,
0:58:17 and Title VII, which, when it comes to admissions, they probably are, we essentially have to
0:58:19 nationalize Harvard.
0:58:24 Like basically, the government gets to decide all of the key things about what Harvard would
0:58:27 decide on its own, which is not a power the government’s been granted.
0:58:32 And when it comes to the law firms, that’s the one that I really, I wrote
0:58:38 about this on my Substack, The Eternally Radical Idea, about all of these cases.
0:58:45 And it started with them just going after attorneys who had opposed the Trump administration, even
0:58:50 people like Robert Mueller where they had law firms, and saying that they
0:58:55 would be denied their Secret Service protection, and not Secret Service, their-
0:58:56 Yeah, their security details.
0:58:57 Repackage violence.
0:59:03 If you’re someone who ordered a strike on Soleimani, the head of Iran’s Quds Force, and
0:59:07 you take away a general’s security detail, you’re putting that person in harm’s way, in my view.
0:59:12 And they get rid of their security clearances, and then also deny them access to federal buildings,
0:59:15 which of course include courtrooms.
0:59:19 And that, to me, is some of the most chilling stuff.
0:59:20 Now, what can be done about it?
0:59:25 The thing that’s happening consistently is that Trump is losing in court.
0:59:30 And so far, he’s mostly been abiding by those rulings.
0:59:35 I’m a little bit concerned, given how fast and loose this administration sometimes
0:59:40 plays with the rules, that that might not hold up when push really comes to shove.
0:59:42 But, you know, fingers crossed.
0:59:47 In terms of what else people can do about it, I think it really is a question of what
0:59:51 happens in the midterms, and then, of course, in the presidential election.
0:59:56 But it’s troubling, but not unexpected.
0:59:58 Yeah, shocking, but not surprising.
1:00:04 Just as we wrap up here, Greg, a lot of young men listen to this podcast, based
1:00:08 on some of the many challenges all young people, but especially, I would argue, some young men
1:00:10 in our society are facing right now.
1:00:14 A lot of them are struggling with their own mental health, and I appreciate how transparent
1:00:18 and vulnerable you were at the beginning of the podcast, talking about your own struggles,
1:00:22 and you had said that cognitive behavioral therapy really helped you.
1:00:28 Can you share some thoughts on your struggles with your own anxiety and depression and any
1:00:31 advice you might have for young people who are facing their own challenges?
1:00:37 Sure. And it’s tough because, you know, everyone struggles
1:00:43 kind of differently, and I really understand people’s concern about, well, one, of
1:00:48 course, the expense of getting a therapist, but also the fear, given that therapy
1:00:54 has to some degree become politicized, that they don’t want to end up with a therapist
1:00:58 who’s going to judge them, in some cases just for being male
1:01:01 or for having non-conforming political views.
1:01:08 So I get all of that, but I did hear from one friend about
1:01:15 their kid, who basically said he didn’t need therapy because he watches a lot of podcasts.
1:01:22 He’s on YouTube a lot getting advice, and it’s just not the same thing.
1:01:23 Yeah, that’s not the fix.
1:01:30 Yeah, so there are people out there like Camilo Ortiz who are trying
1:01:37 to put together apolitical therapists, ones who won’t judge you,
1:01:41 who won’t let their political opinion interfere with their therapy, which is amazing that you
1:01:42 have to do that, but you do, unfortunately.
1:01:49 So look for people who are recommended that way for CBT.
1:01:54 There are also some approaches to CBT that actually lend themselves fairly well
1:01:59 to apps, which I don’t think is sufficient, but it can help.
1:02:01 Uh, but here’s the thing.
1:02:06 It may be simple, but it’s not easy because you have to do it several times a day.
1:02:11 You have to actually do it when those, um, self-hating voices come up in your head,
1:02:12 those catastrophizing voices.
1:02:17 Otherwise your brain will never get in the habit of talking back to them. You have
1:02:23 to do it every time they come up, and you have to do it for, I would say, probably
1:02:27 six months before you really see much change.
1:02:31 But I remember about nine months in suddenly being like, wait a second, all these things that
1:02:34 used to pop up in my head, they’re not, they don’t sound convincing anymore.
1:02:36 And it was really dramatic after that.
1:02:39 So I, I definitely believe looking to CBT.
1:02:44 One thing that I do a lot when I’m having a hard time is I
1:02:46 go reread Seneca’s letters to Lucilius.
1:02:52 I find that they’re really approachable. Meditation is very helpful to people, but
1:02:54 don’t forget the basics.
1:02:55 Exercise is really key.
1:02:59 And if you’re into reading, there is something there.
1:03:02 My favorite book is The Upward Spiral by Alex Korb.
1:03:07 I highly recommend it. I have a whole Substack post
1:03:09 on this very issue,
1:03:11 because I get asked about it so much, and I give kind of all of my advice there.
1:03:13 And what’s that Substack called, or what’s that post called?
1:03:18 Well, I don’t remember what that post is called, but my Substack is The Eternally Radical Idea.
1:03:21 But also, you know, talk to people about it. Talk to friends.
1:03:26 When you’re in a really bad way, there’s a sense that nobody’s going to want to hear
1:03:29 your whining about it, but that’s just not true.
1:03:34 Because as hard as it may
1:03:38 be to believe sometimes, when you’re really deep down and dark, there are people out
1:03:39 there who love you.
1:03:44 Greg Lukianoff is a free speech advocate, First Amendment attorney, president of FIRE, the
1:03:49 Foundation for Individual Rights and Expression, and co-author of The Coddling of the American
1:03:55 Mind, which has probably had more impact on my parenting than any book I have read,
1:04:01 and also The Canceling of the American Mind. His latest book, War on Words: 10 Arguments
1:04:05 Against Free Speech and Why They Fail, is out next week.
1:04:07 He joins us from Maine.
1:04:11 Greg, I’ve wanted to meet you and speak to you for a couple of years, because one
1:04:18 of my role models, Jonathan Haidt, whenever he talks about you, speaks about you with such
1:04:19 reverence and such respect.
1:04:23 So I was really excited to have this conversation.
1:04:24 Very much appreciate your time.
1:04:30 Also very much appreciate what you said at the end about cognitive behavioral therapy.
1:04:37 The takeaway I have, and that I hope people take away from this podcast, is that when
1:04:41 you’re really down and you think everyone’s sick of you and sick of hearing from you and doesn’t
1:04:43 have time for you, that’s just not true.
1:04:45 So anyways, thank you for sharing that, Greg.
1:05:14 My father passed away last week and it’s been a rough few days for me as it is for anybody
1:05:15 who loses a parent.
1:05:20 Our species’ competitive advantage is our brain.
1:05:24 It’s so big that we’re expelled from the womb prematurely, and our brain is exactly the
1:05:25 wrong size.
1:05:30 It’s big enough to ask very complicated questions, but not big enough to answer them.
1:05:35 And death is something our brain still hasn’t come to grips with,
1:05:39 especially with a parent. This is someone who is your first protector.
1:05:44 And when you lose that person, the idea that all of a sudden that protector isn’t
1:05:45 around is devastating.
1:05:47 It’s a mirror.
1:05:52 You see a lot of yourself in this person, and you immediately think about all the different
1:05:55 things in your life that developed, good and bad, with this person.
1:05:59 And you have to attempt to come to grips with them, which
1:06:00 sometimes can be painful.
1:06:03 Our brains are used to continuity and patterns.
1:06:07 We’re used to having that person in our life and we assume they’re going to be around
1:06:09 forever and it’s impossible to believe they’re not going to be around forever.
1:06:14 So the finality of death is just very shocking and very difficult to wrap your head around.
1:06:20 The biggest or most profound moments in my life have involved birth and death.
1:06:26 My mother passed away when I was 39 after what was a pretty ugly
1:06:32 battle with a smoking-related illness, specifically cancer, breast cancer, twice.
1:06:33 And then it metastasized in her stomach.
1:06:40 And it was just the finality and the harshness of it and the brutality of the way she died.
1:06:46 These things change you.
1:06:50 I think for most people, you’re sort of never the same.
1:06:54 I was much lighter and funnier before that happened.
1:06:59 And I think something kind of died in me, but at the same time, I developed a wonderful sense
1:07:00 of the finite nature of life.
1:07:03 And then when my kids were born, that changed everything for me.
1:07:09 I became much more responsible, uh, much more anxious, but, uh, started for the first time
1:07:12 in my life thinking about other people, which was an enormous unlock.
1:07:13 And I’ll come back to that.
1:06:19 And then the death of my father is a different sort of feeling; I was not nearly as close
1:06:21 to my father as I was to my mother.
1:07:26 So my dad, George Thomas Galloway was born in 1930 in Sydney, Australia to a woman who was
1:07:28 a domestic servant for a wealthy family.
1:07:30 He was born out of wedlock.
1:07:36 The family found out my grandmother was pregnant, and they had a daughter
1:07:39 who did not have any children of her own and was in her thirties, which was
1:07:40 considered, you know, a tragedy.
1:07:46 And they agreed to adopt her child, her unborn baby, and would give her enough money.
1:07:48 But the deal was she had to leave.
1:07:52 And I think they even gave her some money, because they didn’t want the biological mother
1:07:52 around.
1:07:54 And my, my grandmother agreed.
1:07:58 And then, uh, my grandmother gave birth to my father in Sydney, Australia.
1:08:03 And I don’t know the full story, but she convinced her boyfriend, the father of the child, who
1:08:05 was obviously very upset, to meet her at the docks.
1:08:09 And they got on a ship for, for Scotland.
1:08:14 I can’t even imagine what the ship route was like from Sydney to Glasgow.
1:08:17 And so my father always jokingly said, I could have been a McVicar.
1:08:20 It was the McVicar family that built battleships or something.
1:08:21 And he says that he’s pissed off.
1:08:25 He would have much rather stayed in Sydney, Australia, as the son of a rich family.
1:08:32 Anyways, he was raised in Depression-era and World War II Scotland. He says his first
1:08:34 memory is watching the Clydebank raid,
1:08:41 I think it’s called, where Heinkels and Messerschmitts dropped bombs on munitions
1:08:45 factories or shipbuilding yards just outside, I believe, of Glasgow.
1:08:49 And he jokes that, uh, they were obviously very patriotic.
1:08:55 He was nine when the war broke out and 15 when it ended, and anyone with an accent in
1:08:59 Glasgow, he and his 10-year-old buddies would follow around and take notes on,
1:09:00 because they assumed they were spies.
1:09:07 And then, after the war ended, he was 15, but at the age of 17 he lied about
1:09:11 his age and went to a recruitment office, wanting to be a pilot for the RAF.
1:09:14 And the recruiter said, you’re too tall to be a pilot.
1:09:17 So he went across the street, or somewhere, to where they were recruiting for the Royal Navy
1:09:20 and joined the Navy at a very young age.
1:09:26 And before he knew it, he was on, I believe, an aircraft carrier.
1:09:29 They do an assessment, a skills assessment test,
1:09:32 and he disclosed that he could repair things,
1:09:35 that he repaired motorcycles, and that he was a good swimmer.
1:09:41 And so the next day he was no joke in a helicopter, in a wetsuit, in the North Atlantic,
1:09:44 practicing what he found out later was pilot rescue.
1:09:49 They kind of informed him what he was going to do while he was in the helicopter in a poorly
1:09:50 fashioned wetsuit.
1:09:52 And they said, okay, this is the deal.
1:09:53 You’re going to jump out into the water.
1:09:56 We’re going to throw out a 150-pound dummy, not in that order.
1:09:58 Then we’re going to lower a basket.
1:10:04 And your job is to get this 150 pound dummy into a basket as if it was a pilot in the North
1:10:04 Atlantic.
1:10:08 Oh, and by the way, even with your wetsuit, in about 14 minutes, you’re going to die of
1:10:08 exposure.
1:10:09 So there’s some motivation.
1:10:16 So my dad jumps into the wavy North Atlantic when it was dark out and tries to get
1:10:18 this dummy into this basket.
1:10:23 And then he said the scariest moment was after he got the dummy into the basket, the most exhausting
1:10:23 thing he’s ever done.
1:10:24 They pull it up.
1:10:28 And he said the current started taking him away from the helicopter.
1:10:32 And he was worried they were no longer even going to be able to see him and get him out.
1:10:33 And they drop a winch.
1:10:35 He connects it and they pull him up.
1:10:38 His first week, he got his pay.
1:10:42 He put it in a locker at the foot of his cot.
1:10:44 And the entire locker was stolen.
1:10:49 I guess this was sort of a thing: the freshman recruits were stupid and would put
1:10:51 money in their lockers thinking it would be secure.
1:10:54 So there was a service where he could send money home.
1:10:58 So he would send all of his money from the Navy, all of his pay home to his mother.
1:11:02 And after two years, he calculated he had enough money to get to America.
1:11:07 And he came home to find out that his mother had spent his money on whiskey and cigarettes.
1:11:12 And in her defense, she said, what did you expect me to do?
1:11:13 I was bored.
1:11:18 So my father always had a very unhealthy relationship with money.
1:11:22 It really scarred him, growing up in Depression-era Scotland, and I think acts
1:11:23 like that did too.
1:11:30 But he did get some money together and got to America and led what could arguably be called
1:11:31 the American dream.
1:11:36 My favorite story about him first arriving in America was he and my mom met in Canada.
1:11:37 They got pregnant.
1:11:38 They hated the weather.
1:11:39 They bought a newspaper.
1:11:43 And there was an article saying that the nicest weather in North America was in San Diego.
1:11:49 So they loaded up my seven-month pregnant mom into an Austin mini-metro and drove from
1:11:51 Toronto to San Diego.
1:11:54 My dad’s first job interview was to be a salesman for a candle company.
1:11:58 And the head of HR there
1:11:59 asked him how long he’d been in the country.
1:12:00 And he said, just two weeks.
1:12:01 And she said, that’s incredible.
1:12:03 Just wait right here.
1:12:07 And she went and got her boss and said, you need to meet this guy, Tom Galloway.
1:12:11 He’s only been in the country for two weeks from Scotland, and he can already speak the language.
1:12:12 I love that.
1:12:19 Anyways, my dad was aggressive, smart, and charming, and got jobs.
1:12:23 And at his peak, we had a home in Laguna Niguel.
1:12:28 He was thriving, and we were living what could best be described as an upper-middle-class life.
1:12:33 And unfortunately, my dad was not a high-character person.
1:12:34 He was married and divorced four times.
1:12:39 A handsome man with a strong jaw and a Glaswegian accent in ’70s California,
1:12:42 he not only thought with his dick, he could listen to it.
1:12:49 And my dad rifled through four marriages and four divorces, including his last one, where he decided to leave his fourth wife when she had late-stage Parkinson’s.
1:13:01 So he was never really able to connect with people, or to develop, outside of kind of a survival instinct, a sense of investing in other people and in their relationships.
1:13:03 He was broken that way.
1:13:06 And there’s just no not acknowledging that.
1:13:17 However, one of the learnings here, what has helped me process this, is that I love that Dr. Seuss saying: don’t cry because it’s over.
1:13:18 Laugh or smile because it happened.
1:13:27 I’m thinking about all the things I’m grateful for, being the son of George Galloway, starting with very basic things.
1:13:27 I am tall.
1:13:29 I have broad shoulders.
1:13:30 I have a good voice.
1:13:32 And I have made an exceptional living communicating.
1:13:34 None of those things are my fault.
1:13:36 I got all of those things from my father.
1:13:41 And just because maybe he didn’t purposely work to give me those things, there’s no reason I can’t be grateful for them.
1:13:50 I think the ultimate test of evolution and the most basic box you need to check as a man is the following.
1:13:56 And that is, are you a better father to your son than your father was to you?
1:13:59 And my dad checked that box in indelible ink.
1:14:03 His father, I found out later in life, used to physically abuse him.
1:14:05 I can’t even imagine what that would be like.
1:14:10 The person you’re supposed to trust most in the world, the person who’s supposed to protect you, physically abuses you.
1:14:12 And he never did that for me.
1:14:17 And he did try. After my mom and dad got divorced, he would try and meet me in places and take me to museums.
1:14:22 And there’s a lot for me to be grateful for.
1:14:35 What has helped me in terms of my relationship with my father, and this has been one of the biggest unlocks in my life, hands down: I really struggled with my relationship with my father.
1:14:47 Every time I thought I was being a good son, I would remember that he kind of left me and my mom and wasn’t very kind to us, and I would get resentful and angry and kind of cut him out of my life for small periods of time.
1:14:55 And then an enormous unlock, and my biggest piece of advice if you’ve made it this far: I decided, okay, what kind of son do I want to be?
1:15:05 Don’t think about the relationship as a transaction, what he owes you or what you owe him, or base your behavior on what he did or did not do for you.
1:15:08 But just simply put: what kind of son do I want to be?
1:15:13 And the reality is I wanted to be a loving, generous son, and I wanted to have a great relationship with my dad.
1:15:17 And I had all of the qualities and resources to do that.
1:15:22 And from that moment on, I put away the scorecard, I put away the bullshit, and I was a great son.
1:15:25 And it was not only wonderful for him, it was wonderful for me.
1:15:29 I’ve loved these last 20 years of just having a great relationship with my father.
1:15:30 He’s charming.
1:15:30 He’s funny.
1:15:31 He does love me.
1:15:36 And it’s just been a huge lesson for me in life.
1:15:48 You know, not, is my partner good to my parents or nice and generous to me, and how should I behave around her based on that, but what kind of partner do I want to be?
1:15:49 What kind of business person do I want to be?
1:15:52 What kind of employer do I want to be?
1:15:55 I used to look at every employee as, are they adding as much value as I’m paying them?
1:15:57 And if they’re not, I’m going to fire them.
1:15:59 Now I think, how can I be just an amazing employer?
1:16:01 How can I be an amazing friend?
1:16:05 What kind of investor do I want to be known as?
1:16:08 And then live to that standard and put away the scorecard.
1:16:13 Don’t base your behavior on what you did or didn’t get from that person, but on the person you want to be.
1:16:16 And that’s just been such an enormous unlock for me.
1:16:25 So as I sit here and I think about my dad and I try to process his death, you know, I think this guy lived the American dream.
1:16:27 He came to America.
1:16:32 The biggest gift he gave me was that he took this enormous risk and got to America.
1:16:45 And so much of my success, so much of the ability to have a wonderful family is based on something that was not my fault, specifically my dad having the courage and taking the risk to get to America.
1:16:47 And I’m very appreciative of that.
1:16:52 George Thomas Galloway was 95 when he died.
1:16:53 He was very much a man.
1:16:55 He was a protector.
1:16:58 He wanted to protect his country.
1:16:59 He was a provider.
1:17:01 He provided for two families.
1:17:04 And he was a procreator.
1:17:07 He had two kids and four grandkids.
1:17:26 His son and his daughter will miss him terribly.
1:17:52 This episode was produced by Jennifer Sanchez.
1:17:55 Our assistant producer is Laura Janair.
1:17:57 Drew Burrows is our technical director.
1:18:01 Thank you for listening to the Prop G Pod from the Vox Media Podcast Network.
1:18:47 Thank you.
0:02:02 I used to think when I was a younger man, when I was in my 40s,
0:02:06 that I was going to create space or room and go totally dark on social media
0:02:08 and stop podcasting by the time I was 50.
0:02:09 But here’s the thing.
0:02:12 I love the fame, the relevance, and the Benjamins.
0:02:20 But I am here, and our technical director, Drew, rented an apartment at the Aspen Alps
0:02:23 right on the mountain here and set up this giant studio.
0:02:25 So I hope you appreciate all the production values here.
0:02:30 They told me to take off my hat because they didn’t like the way it looked.
0:02:31 And I took it off and I thought, you know, fuck it.
0:02:32 It’s my image.
0:02:33 It’s me.
0:02:34 They’re AI.
0:02:34 I’m me.
0:02:35 I own me.
0:02:36 I own the digital Scott.
0:02:39 It’s been an emotional weekend for me.
0:02:43 I’ll get back to that later in the episode. And I look like shit and I’m self-conscious
0:02:45 and, you know, all that good stuff.
0:02:47 But what are we doing here?
0:02:48 We’re very much enjoying ourselves.
0:02:53 I used to come to Aspen in the winter, put our kids on skis, and came here in the summer;
0:02:55 now we just come here in the summer.
0:02:57 I think mountain towns in the summer are absolutely wonderful.
0:03:00 I went to this place called Woody Creek Tavern yesterday.
0:03:05 And a bunch of people rolled up on a horse, which I thought was ridiculously cool.
0:03:06 Okay, what’s going on?
0:03:08 The Epstein files.
0:03:10 I got this wrong.
0:03:11 I thought it was going to blow over.
0:03:12 I thought people were sick of hearing about it.
0:03:17 But it turns out that when you promote a conspiracy theory for a good, I don’t know, five or seven
0:03:21 years, won’t stop hammering on it, keep talking about this file and this list, and then
0:03:24 decide, oh no, I’m on the list and I’d rather it not come out,
0:03:28 so nothing to see here, folks, keep moving along, everyone gets angry.
0:03:33 I did watch, I did enjoy watching Alex Jones cry in his car over the Epstein list.
0:03:39 But a lot of this comes down to a major theme, I think, or a broader theme, and that
0:03:40 is one of identity.
0:03:46 And I think that, under the auspices of being able to create bots, not being subject to standards
0:03:52 around moderation, and not taking responsibility for the comments they make,
0:03:59 some sort of fidelity to, or irrational passion for, the value of anonymity
0:04:05 has really hurt our society.
0:04:06 Look at the most depraved behavior on behalf of our government right now.
0:04:11 I would argue it is, simply put: is it the administration cutting food stamps?
0:04:13 That’s right up there.
0:04:20 Or the world’s wealthiest man murdering or killing the world’s most vulnerable and poorest
0:04:20 children?
0:04:21 That’s right up there.
0:04:28 But close, maybe a close third, would be a bunch of individuals who’ve been weaponized
0:04:34 to create a private army for the president, who separate and rip families apart, and are now, I
0:04:35 guess, rounding up citizens as well.
0:04:39 When you treat people differently based on identity, that is the definition of racism.
0:04:44 And these actions are, in fact, racist, where they’re targeting people based on their identity,
0:04:45 not on their behavior.
0:04:47 And what do we have?
0:04:51 We have individuals who realize how depraved this behavior is, so they wear masks.
0:04:53 They hide their identity.
0:04:56 And online, we have a lot of people with masks.
0:05:02 Somewhere between 20 and 50 percent of activity on a social media platform is bots
0:05:06 that have been weaponized by someone who doesn’t want you to know their identity, because what
0:05:09 they’re saying is either slanderous, or they’re too cowardly to own it, or they would
0:05:13 be embarrassed to say such aggressive, inaccurate things.
0:05:19 And so we tolerate it under some bullshit notion that a civil rights activist in the
0:05:20 Gulf needs anonymity.
0:05:24 Well, with the blockchain, you could probably allocate a certain number of anonymous
0:05:26 accounts for people who, in fact, need the anonymity.
0:05:30 But for the 99.9 percent of people who are just acting like cowards, being aggressive, or tearing
0:05:34 at the fabric of our society because of anonymity, I don’t buy that bullshit.
0:05:42 When some idiots at UCLA decide to pass out bands to non-Jews, and then won’t let anyone without
0:05:46 a band, i.e. Jewish people, onto certain parts of the UCLA campus, and the UCLA leadership does not
0:05:49 show up to stop that shit right away.
0:05:51 What do those people do?
0:05:52 They wear masks.
0:05:58 So I think it’s pretty easy to spot depraved people who are about to engage in things that
0:06:01 they do not want to associate their identity with, because they are wrong.
0:06:06 And whether it’s a stormtrooper from Star Wars, a member of the KKK, a member of ICE,
0:06:12 or all of these bots online, anonymity has become a real problem in our society.
0:06:18 Just as an example, I get a lot of really nice messages online.
0:06:20 I also get some of the vilest shit I’ve ever seen in my life.
0:06:24 And if I were a woman, I would feel physically intimidated.
0:06:28 And I’ve been forwarded some messages that some of my female friends get online and it
0:06:29 is just totally unacceptable.
0:06:31 And it’d be pretty easy:
0:06:35 find out who that motherfucker is on the other side of that keyboard, and they will stop, because
0:06:40 they will realize what they’re doing not only carries penalties but just does not acquit
0:06:41 them very well.
0:06:45 But instead we’ve decided, oh, we need to sell more Nissan ads,
0:06:48 under this bullshit notion that anonymity is key to progress.
0:06:49 No, it’s not.
0:06:54 And you could have a certain amount of anonymity for people who have a legitimate reason to be
0:06:54 anonymous.
0:07:01 But there is an issue here around our love of letting people have no accountability for their
0:07:06 actions under the auspices of some sort of First Amendment or free speech or protection.
0:07:10 And it has gone too far and the snake is eating its tail.
0:07:14 I like the fact that there are cameras everywhere in New York and London.
0:07:19 But what you also need when you have this kind of surveillance technology is really strong laws.
0:07:26 I don’t even think any camera footage or online tracking should be used to prosecute
0:07:27 someone for a misdemeanor.
0:07:31 I think it has to be a very serious crime, and there have to be a lot of safeguards that err on
0:07:36 the side of not granting a search warrant for that data, such that people feel comfortable being
0:07:39 their true selves, but at the same time have to represent their identity.
0:07:42 But where I was headed was some really vile shit online.
0:07:47 I’ve been recognized several times in Aspen, and people couldn’t be nicer. Wherever I
0:07:50 am in the world, even when people disagree with me, they come up and say, I didn’t like your
0:07:50 take here.
0:07:51 This is what I think.
0:07:52 And they listen and they’re thoughtful.
0:07:58 And one of the really terrible things about AI and LLMs is LLMs are crawling the online
0:08:03 world, which is much harsher and much more cowardly and much more mendacious.
0:08:03 Why?
0:08:04 Because of anonymity.
0:08:09 Whereas if these AI LLMs were crawling the real world where people have to take responsibility
0:08:14 for what they say and you get to look them in the eye when they say something, I think the
0:08:19 world would be a better place because AI would be training people how to behave in person where
0:08:23 you have accountability as opposed to training the world to behave the way they behave online.
0:08:28 And it’s not only individuals putting out negative behavior.
0:08:32 There are people who want to create dissent and tear at the fabric of America, i.e.
0:08:39 the GRU and the CCP, and create millions of bots that manufacture content that doesn’t even reflect
0:08:44 how any individual feels, but gives you the impression that this is how millions of people
0:08:44 feel.
0:08:49 Say there was a professor who was constantly talking
0:08:54 about Putin’s illegal invasion of Ukraine and how the U.S. should absolutely allocate the funds to
0:08:59 push back on a murderous autocrat. If you were the Kremlin, wouldn’t you be stupid not to create a troll farm in Albania
0:09:05 and then slowly but surely, using AI, try to undermine that professor’s credibility with negative comments
0:09:10 all of the time about any of his or her content?
0:09:12 And I believe those lists have been assembled.
0:09:14 It would be stupid not to weaponize those lists.
0:09:15 And oh, great.
0:09:20 We have social media platforms that love the lies because the lies and the aggressive behavior
0:09:22 create more engagement.
0:09:24 The algorithms are like a Tyrannosaurus rex:
0:09:26 they’re attracted to movement and violence.
0:09:29 And it creates more clicks, more engagement, and more Nissan ads.
0:09:30 So where are we?
0:09:35 Should an individual have First Amendment rights and be able to say pretty much anything about
0:09:37 pretty much anybody at pretty much any time?
0:09:38 Yeah, I think so.
0:09:41 But should a bot have First Amendment protection?
0:09:42 I don’t think so.
0:09:44 Should we be creating this atmosphere,
0:09:47 which has gotten much worse over the last 20 years,
0:09:53 where anonymity serves as a chaser and an incendiary that takes the worst among us,
0:09:56 absolutely expands that behavior, forgives them for it,
0:09:59 encourages that behavior, and also lets bad actors
0:10:04 pretend to be real people while engaging in some of the most uncivil conduct
0:10:07 experienced in our society?
0:10:12 So I’m a big fan of getting rid of this love of anonymity.
0:10:16 And if you look at what’s going on, whether it’s ICE, whether it’s troll farms,
0:10:21 whether it’s people spewing hate speech on campus, what’s the problem?
0:10:21 Anonymity.
0:10:24 You want to show up and protest?
0:10:24 Fine.
0:10:30 But if everyone in your movement feels the need to wear a mask,
0:10:35 I think that says something about what you’re saying, and it says something about your character.
0:10:39 Anonymity has been abused and it is tearing at the fabric of our society.
0:10:41 Okay, moving on.
0:10:45 In today’s episode, we speak with Greg Lukianoff, a free speech advocate,
0:10:49 First Amendment attorney, president of FIRE, the Foundation for Individual Rights and Expression,
0:10:55 and co-author of The Coddling of the American Mind, and most recently, The Cancelling of the American Mind.
0:10:57 I’m going to bring up some of these topics with Greg.
0:11:00 We discuss with Greg free speech in a divided country,
0:11:04 how cancel culture took off, and what today’s campus protests tell us about the state of open debate.
0:11:09 We also get into how schools are failing to build resilient students.
0:11:13 So with that, here’s our conversation with Greg Lukianoff.
0:11:27 Greg, where does this podcast find you?
0:11:32 Maine, actually. This is my first day up here since last year.
0:11:34 First day up there. Where are you usually?
0:11:35 D.C.
0:11:40 Oh, nice. So let’s bust right into it. Give us your thoughts on cancel culture.
0:11:47 How did it start? Brief history of it. And how does it differ from accountability, so to speak?
0:11:53 Sure. I mean, I wrote a book called The Canceling of the American Mind with, when I started the project,
0:11:59 20-year-old Rikki Schlott, who’s absolutely brilliant, and I feel very lucky to work with her.
0:12:07 And it definitely was one of those things that was a really striking discontinuity from the rest of my career.
0:12:11 I started working defending free speech on campus in 2001.
0:12:17 And back then, you were most likely to get in trouble on campus from administrators.
0:12:20 Professors were fairly good on freedom of speech.
0:12:25 Students were great on freedom of speech and freedom to differ, differing opinions.
0:12:36 But right around 2013, going into 2014, a cohort of students showed up that were much less, you know, just to be blunt, tolerant of difference.
0:12:41 But essentially, you started seeing a lot more demands that speakers be canceled.
0:12:44 You started seeing demands for new speech codes.
0:12:47 And this was a big shift from what I’d seen before.
0:12:50 But I also started seeing some of this happening off campus.
0:12:54 So I tried to define cancel culture as a historical period.
0:13:00 Because all moments in the history of censorship have commonalities.
0:13:03 But they also have things that make them distinct.
0:13:13 And I think one of the distinct characteristics of cancel culture is that it was essentially impossible to have it as we understood it without something like social media.
0:13:22 Something that allows you to create the reality or oftentimes appearance of a sudden mob that’s demanding you fire that one employee.
0:13:24 And this wasn’t a subtle shift.
0:13:29 You know, I’d been doing this job for a long time prior to 2014.
0:13:41 And from 2014 on, you know, I’ve seen more professors lose their job, more tenured professors lose their job than, you know, what I’d seen in my previous half of my career, you know, times 20.
0:13:44 You know, it was really quite a shift.
0:13:59 So the way that Ricky and I define cancel culture is the uptick of campaigns to get people fired, punished, penalized, expelled, or otherwise punished for speech that would be protected under the First Amendment.
0:14:12 There, I’m making an analogy to public employee law, which basically means that there’s some common sense injected in there, but that essentially you’re not supposed to fire people just for their outside speech as a citizen.
0:14:15 And the culture of fear that resulted from it.
0:14:19 And one thing you should notice about that definition is there’s no political valence to it.
0:14:22 So cancel culture is cancel culture, whether it comes from the left or the right.
0:14:35 Isn’t that one of the deltas, though, that there’s always been shaming or criticism of people if they, you know, if their narrative doesn’t match yours or an opportunity to kind of play into gotcha culture.
0:14:42 But what I see is the big difference over the last 10, 15 years is the discovery that you could go after people’s livelihood.
0:14:48 It used to be that that was somewhat isolated. Like, I mean, I don’t like what Greg says.
0:14:49 I’m going to publicly shame him.
0:14:50 I’m angry at him.
0:14:51 He’s a bad person.
0:14:52 You shouldn’t be his friend.
0:14:53 You shouldn’t be listening.
0:14:58 But it never jumped the shark to now go after his livelihood.
0:14:59 Wasn’t that the big difference here?
0:15:08 Well, that was part of the definition, our campaigns to get people punished in some real material way, like get them fired, expelled, etc.
0:15:10 It’s not cancel culture if you’re just criticizing them.
0:15:16 There was a phenomenon called trashing in the 1960s that Musa Al-Gharbi likes to point to.
0:15:23 And it’s this really nasty, vitriolic way of going after your political enemies that was everything you’re talking about.
0:15:24 It’s like the person’s a bad person.
0:15:25 Don’t listen to the person anymore.
0:15:31 Or they’re not, you know, they’re not doctrinaire, as I would like to be doctrinaire.
0:15:34 But it generally didn’t get to the point of, and this professor has to be fired.
0:15:41 And the biggest shift, the one that kind of shocked me, was the uptick around 2017.
0:15:46 Because at first, students were focusing on each other and outside speakers.
0:15:52 But 2017 really marks the moment when they started going after professors in large number.
0:15:56 Talk a little bit about, quote unquote, de-platforming.
0:16:15 The way we define de-platforming in our research department at FIRE is essentially getting a speaker disinvited, or making it so difficult to hear them, or otherwise chasing them off campus.
0:16:23 The ones that scare me, the kind of de-platforming that actually scares me the most are the ones that involve violence or the threat of violence, for obvious reasons.
0:16:26 It’s primarily targeted at speakers.
0:16:31 We also consider it de-platforming if you do the same thing to, like, say, playing a movie.
0:16:34 That essentially you’re showing a documentary that’s not very popular on campus.
0:16:36 Students show up and shout it down.
0:16:40 But this is actually one of the areas where a lot of it actually comes from the right as well.
0:16:49 Because for a lot of speakers on campus, there’ll be off-campus pressure to get that person disinvited.
0:16:57 And this is particularly true of, say, speakers that could be painted as, like, pro-choice, and sometimes Catholic groups,
0:16:59 I think the Cardinal Newman Society is big on this,
0:17:03 will pressure schools to disinvite that person.
0:17:12 So generally, and this is for fairly obvious reasons, if the threat to free speech and de-platforming comes from the left, it tends to come from on campus.
0:17:15 If it comes from the right, it tends to come from off campus.
0:17:19 I saw some of this as a faculty member at NYU.
0:17:31 I remember about 10, 15 years ago, it became sort of in vogue for department chairs to put out very long emails about how certain microaggressions would not be tolerated.
0:17:38 And it was, OK, we’re charging kids $280,000 to come here.
0:17:41 Some of them leave riddled in student debt.
0:17:44 Two-thirds of the faculty probably isn’t pulling their weight.
0:17:47 So there was an opportunity to step into this virtue circle.
0:17:50 And no one could ever criticize them
0:17:51 or do anything but applaud.
0:17:52 Otherwise, you are a racist.
0:17:59 And there was this, almost this sort of self-appointed police.
0:18:12 That, and it was always, quite frankly, and I consider myself a progressive, but people never got canceled for being too progressive.
0:18:15 And it felt very unhealthy.
0:18:27 And then, well, comment on that, and then I’m going to play identity politics and just make some anecdotal observations in the classroom and see if there’s any actual data that supports my thesis.
0:18:43 But talk about how all of a sudden, do you think some of it is, I just saw it as people who weren’t adding any actual value and were trying to find some merit and grab virtue or some sort of relevance and saw this as an easy way to try and grab status, so to speak.
0:18:47 Yeah, I think it’s a lot of things going on at once.
0:19:12 Well, one thing from my work with Jonathan Haidt, one chapter we ended up leaving out because I just didn’t have enough research to back it up, was my intuition that a lot of the phenomena we were seeing seemed to be playing out some of the values of the circa 2010 anti-bullying movement.
0:19:17 If you can get me to have a little time to develop this, I can explain it.
0:19:31 So in coddling, we talk about there being three great untruths, which are basically terrible advice to give someone that’s inconsistent with either modern psychology or ancient wisdom, and that will make you more miserable if you believe them.
0:19:37 And so we give this negative advice as what doesn’t kill you makes you weaker, you know, kind of like the opposite of Nietzsche.
0:19:43 The second one is always trust your feelings, which sounds nice, but it’s just awful advice.
0:19:54 And three, life is a battle between good people and evil people, which is contrary to a more sophisticated understanding of that everybody has some aspect of good and evil within them, which is more how I was raised.
0:20:10 And there was a critic who pointed out after coddling came out in 2018 that these were more or less kind of like the way anti-bullying was being taught after sort of like a moral panic about it.
0:20:14 Not that bullying isn’t real or shouldn’t be addressed, but things manifest in their own ways.
0:20:23 And this was primarily due to parents being aware of more of this stuff due to the fact that they could see it on their cell phones, they could see it on their screens.
0:20:27 And this did have an emphasis of, you know, human fragility.
0:20:31 If you feel that you’ve been wronged, you’ve been wronged.
0:20:39 And that there’s basically only two types of people, good people and evil people, you know, victims and bullies.
0:20:43 I think that this wasn’t the only cause by any stretch of the imagination.
0:20:51 But to explain why I think young people had such sympathy for this movement, I think they were framing it partially in that way.
0:20:55 But unfortunately, because the rule of human behavior is that all motives are mixed.
0:21:07 It also met with something that my very young colleague and co-author on Canceling the American Mind was able to point out that she was part of the first generation of people to grow up with cell phones in her pocket.
0:21:09 She had them since she was 10.
0:21:16 And in junior high school, you know, it took on very much the nature of what you would expect.
0:21:32 A mass communication device given to kids in junior high school, it became a way of showing aggression against your perceived enemies, but doing it within the rules of the time, which you don’t call out someone for being unpopular or ugly.
0:21:37 You call them out for being something more that allows you to feel more moral.
0:21:49 So I think these two mixed motives kind of came together for the students. But then comes the utterly crucial role of the administrators, because if administrators had looked at this and gone, no way.
0:21:49 No, no, no, no, no.
0:21:53 You’re not getting a professor fired because you don’t like what they said.
0:21:55 This would have died in the crib.
0:22:01 But they met those same administrators we’ve been fighting against at FIRE forever.
0:22:20 And together it created this kind of calamity for freedom of speech, where these people who already believed it was their job to say what shall be orthodox on this particular campus met a cohort of students that were more willing to play along with that, too.
0:22:22 And again, as with all things, with mixed motives.
0:22:27 You said something that I thought was so, I don’t know, puncturing.
0:22:35 You said that students are being taught the mental habits of anxious and depressed people, which really struck me.
0:22:42 And I thought we’re teaching the kids to be fragile and actually make them less resilient.
0:22:45 It’s more than just words.
0:22:48 I have seen this evolution where kids in my class,
0:22:50 they feel weaker.
0:22:56 It’s not just a cool virtue thing and trendy or fashionable.
0:23:01 They appear to me to be less resilient.
0:23:03 Talk about that.
0:23:29 Yeah, I mean, the whole project with me and John came out of my observation when I was dealing with my own anxiety and depression, and cognitive behavioral therapy is what saved me and utterly transformed my life, which is a process by which you develop all these tools for sort of talking back to the exaggerated voices in your head that tell you things like you’re doomed or you’re a failure or nobody loves you.
0:23:34 All of these kinds of voices that, to a degree, to be clear, everybody except sociopaths has.
0:23:39 But, you know, when you’re anxious and depressed, they’re louder and they’re harder to ignore.
0:23:53 The amazing thing about CBT is it teaches you that if you actually rationally, not power of positive thinking stuff, but just rationally interrogate these,
0:23:59 you realize you’re overgeneralizing or engaging in fortune telling or mind reading, all these things that logically don’t really stand up to scrutiny.
0:24:11 And the observation that really brought me to talk to John about a potential collaboration, although actually I just told him the idea that I thought was cool, I didn’t actually think we’d collaborate on it.
0:24:26 That was a dream come true. The observation was just that it was like we were teaching kids reverse CBT, that we’re teaching them do overgeneralize, do catastrophize, do engage in binary thinking, do engage in fortune telling about the future.
0:24:45 And it comes from, I believe, two different places. One, a very well-meaning idea from both parents and administrators, all through K through PhD, an instinct to sort of protect young people and to insulate them from harm.
0:24:57 But then a less admirable quality is that essentially if you make people feel guilty or frightened, it’s in theory will motivate them towards political action that you prefer.
0:25:01 And to me, that’s the one that makes me pretty angry.
0:25:06 The first one makes me kind of sad because it’s like, yeah, no, it’s an understandable instinct, but it’s still terrible advice, and you should have known that.
0:25:22 The second one is the idea that we can sort of guilt, shame, anger, and upset people by telling them that they are more fragile than they are, that they are in greater danger than they are, and that that will somehow result in a better world.
0:25:36 And I always make the point, listen, this is a bad calculation, even just rationally, because people who are filled with despair and anxiety don’t always choose, to say the least, the best course of action to get from point A to point B.
0:25:41 Yeah, I want to make some observations anecdotal, and you tell me if there’s any data to back it up.
0:25:56 In terms of, I think, I’m trying to think of that product, was it called JotForm, where all of a sudden someone would get upset by something and spin up an online petition and within a certain amount of time, everybody thought it was cool to join in and the dean had to deal with this bullshit.
0:26:17 And I do think a lot of it was bullshit, but a couple observations, and you tell me if they nullify or validate them. I never got in the way of this, or rather, I was never subject to this sort of scrutiny or blowback because, one, I’m known as being provocative and, quite frankly, a little bit aggressive and obnoxious, so the expectation is there.
0:26:32 And the first thing I say is, if you think there’s a non-zero probability, something I say is going to trigger you, I curse, I have certain unconscious biases I’m still working on, but if you think something’s really going to emotionally trigger you, you should call your parents and tell them to come get you, because you’re not ready for college.
0:26:41 So there’s a certain expectation that I’m going to be a little bit out there, and no one ever has, I’ve never gotten run over by this.
0:26:53 I have some colleagues who are much more thoughtful and considerate than me, 99.9% of the time, and then they make an error.
0:27:01 They’re inarticulate around something, and it’s shocking because they’re known as these nice, benign people.
0:27:15 And then they get taken, you know, they get taken out and shot, because it’s almost like those of us who are a little bit more aggressive and provocative regularly are not subject to the same scrutiny as someone who makes one false move.
0:27:33 And then the second observation I would make, and this is identity politics, but I’m going to do it anyways: the people I have observed in class who get really upset, I mean physically upset, they’re not faking it, tend to be women, tend to be white women.
0:27:35 Upper-class white women, yes.
0:27:37 What’s the data there?
0:27:48 The data on women, particularly white women, and particularly upper-class white women being more free-speech skeptical is just very apparent.
0:27:55 And that’s one of those things that, you know, it makes me a little uncomfortable to say it, but it’s a consistent finding.
0:28:00 And it’s something that we found among the professorate, among students as well.
0:28:11 And, of course, I think the free-speech skepticism comes from a good place, but so does all of censorship, you know, so I don’t think that that’s a strange observation.
0:28:25 The observation that the professors who start out more sympathetic tend to be more vulnerable definitely accords with my professional experience, that it certainly seems that way.
0:28:37 But then there’s also a category of sort of outspoken conservative or outspoken, you know, iconoclastic professors that I’ve definitely seen get targeted quite aggressively over the years.
0:28:48 So, in some ways, I will say that some of your observations come a little bit from luck, because, like, you get one student or one administrator who decides, I’m getting Scott Galloway, and the whole dynamic changes.
0:28:52 We’ll be right back after a quick break.
0:29:09 Whether you’re a startup founder navigating your first audit or a seasoned security professional scaling your GRC program, proving your commitment to security has never been more critical or more complex.
0:29:11 That’s where Vanta comes in.
0:29:22 Businesses use Vanta to build trust by automating compliance for in-demand frameworks like SOC 2, ISO 27001, HIPAA, GDPR, and more.
0:29:33 And with automation and AI throughout the platform, you can proactively manage vendor risk and complete security questionnaires up to five times faster, getting valuable time back.
0:29:36 Vanta not only saves you time, it can also save you money.
0:29:47 A new IDC white paper found that Vanta customers achieve $535,000 per year in benefits, and the platform pays for itself in just three months.
0:29:50 For any business, establishing trust is essential.
0:29:53 Vanta can help your business with exactly that.
0:29:59 Go to Vanta.com slash Vox to meet with a Vanta expert about your business needs.
0:30:03 That’s Vanta.com slash Vox.
0:30:11 President Trump met with the leaders of five African nations at the White House yesterday.
0:30:16 One oops got all the attention when Trump paid Liberia’s president a compliment.
0:30:18 Well, thank you.
0:30:19 It’s such good English.
0:30:20 Such beautiful.
0:30:23 Where did you learn to speak so beautifully?
0:30:26 English is Liberia’s official language.
0:30:28 Were you educated where?
0:30:29 Yes, sir.
0:30:31 In Liberia.
0:30:32 Yes, sir.
0:30:33 Well, that’s very interesting.
0:30:37 Anyway, you know what happened behind closed doors right before that meeting?
0:30:43 President Trump pushed those African leaders to accept people who are being deported from the U.S.
0:30:46 That’s according to a Wall Street Journal exclusive.
0:30:50 In fact, it’s trying all kinds of ideas to increase the pace of deportations.
0:30:53 And we’re going to tell you about some of them on Today Explained.
0:30:56 Today Explained is in your feeds every weekday.
0:31:06 This week on Net Worth and Chill, we’re diving deep into Trump’s one big, beautiful bill,
0:31:11 the sweeping legislation that promises to reshape America’s economic landscape.
0:31:15 From tax cuts to student loans, I’m breaking down what this massive piece of legislation
0:31:19 actually means for your wallet, your investments, and your financial future.
0:31:22 We’re going to find out who wins and loses in this economic overhaul,
0:31:25 analyze the market reactions that have investors buzzing,
0:31:30 and discuss whether this bill will deliver on its promises or create unexpected consequences.
0:31:34 Just because you’re not on Medicaid doesn’t mean this doesn’t impact you.
0:31:37 Poor people don’t stop having medical emergencies.
0:31:39 They just stop being able to afford them.
0:31:44 Listen wherever you get your podcasts or watch on youtube.com slash yourrichbff.
0:31:53 What are your thoughts on how the presidents of Harvard, MIT, and Penn handled that situation
0:32:01 and generally assess their response, Congress’s or the Congressional Committee’s viewpoint on this?
0:32:03 And this is a difficult one.
0:32:10 And what is the line between free speech and hate speech from people in masks that creates an environment
0:32:12 that’s unhealthy for the community?
0:32:13 Your thoughts?
0:32:22 Now, there have been people who have been really critical of those presidents when they went to the anti-Semitism hearing in December of 2023.
0:32:33 There have been people who have been primarily critical of the fact that when asked if calling for genocide was protected or not on their campus,
0:32:35 they said it depends on context.
0:32:37 Now, here’s the truth.
0:32:39 It does depend on context.
0:32:46 In First Amendment law, if you’re saying something, and particularly academically, if you’re saying something theoretically,
0:32:52 if you’re saying something in the course of a philosophical discussion, that is different than being like, I’m going to kill you.
0:32:53 So context is right.
0:32:58 But the reason why I nonetheless have zero sympathy, actually, to be fair,
0:33:05 I have sympathy for the president of MIT, because MIT has not been the best, but it sure as hell has not been the worst.
0:33:17 Penn and Harvard, Claudine Gay, I had no sympathy for them at all, because they’d been utterly terrible on freedom of speech prior to that point.
0:33:26 And definitely their critics, including people like Bari Weiss, have a good point that these people who claim to be exquisitely sensitive about fat phobia,
0:33:34 you know, is the example that she usually uses, were suddenly not caring if someone said something that sounded an awful lot like you,
0:33:36 your country should be wiped off the map.
0:33:37 So what is the line?
0:33:42 The line, as far as fire is concerned, and we think to a large degree the law is,
0:33:47 is something that actually crosses the line into anti-Semitic or racial or sexual harassment.
0:33:51 And that’s not as simple as just saying something offensive.
0:33:56 Actually, it can’t just be saying something offensive, which, by the way, I think is absolutely the right rule.
0:34:04 I think the situation for free speech would be even worse than it currently is if we didn’t have that bedrock principle,
0:34:05 which is well established in the law.
0:34:11 It has to be a pattern of discriminatory behavior directed at somebody for it to be harassment.
0:34:16 But a lot of what we saw on campus after October 7th, you don’t even have to get to that question.
0:34:24 A lot of what we saw was, you know, violent attempts to intimidate, actual threats, in some cases, physical assault,
0:34:29 taking over buildings, all of these things that are just not protected, nor should they be.
0:34:32 And particularly, this is something that I just did a TED Talk.
0:34:36 And the thing that I, and I probably angered some people by opening up with this example,
0:34:38 but I want to be really clear here.
0:34:41 There was a speech, for example, at Berkeley.
0:34:50 There was an Israeli Defense Force speaker there, and students organized, and it’s nice to have this actually
0:34:54 in a screenshot of a text message to everybody:
0:34:56 Shut it down.
0:35:05 And 200 students stormed where the guy was supposed to speak, you know, broke down a door, broke down a window, and chased the guy off.
0:35:09 And it’s, and I always have to explain, okay, that’s mob censorship.
0:35:16 Like, that is an attempt by 200 people to say to anybody who would want to hear this person,
0:35:19 you’re not allowed to hear this person because I don’t approve of them.
0:35:20 That’s not okay.
0:35:26 That’s the kind of thing that, in my opinion, should get you kicked out of a university because it means you’re not actually understanding the point of a university.
0:35:32 And one of the reasons why this angered some people is because I also tend to point out that from October 7th on,
0:35:38 all but about three, in the two worst years for deplatforming involving violence
0:35:44 and involving shout downs, as they’re called, were pro-Palestinian activists.
0:35:51 So, like, it was one of these things, we’re spending plenty of time defending the free speech rights of pro-Palestinian students,
0:35:57 pro-Palestinian professors, but nonetheless, we were also seeing these same students,
0:36:02 who expected to be protected by freedom of speech, showing no respect whatsoever for the free speech of others.
0:36:05 So, a lot of these issues are actually not that hard.
0:36:13 The hardest issue you get into is essentially if it’s not a threat, if it’s not blocking someone from getting from point A to point B,
0:36:16 it’s just really, really offensive.
0:36:22 At what point does that actually become anti-Semitic or racial or ethnic harassment?
0:36:27 And the answer is essentially that Davis test that we always refer to at FIRE,
0:36:32 which is: is it severe, persistent, and pervasive such that it causes a reasonable person
0:36:36 to be effectively denied an education.
0:36:43 So, that’s, that’s a high standard, but it should be if you’re dealing with something that has an, an offense aspect in it.
0:36:47 But again, for a lot of these situations, you didn’t even have to get to that analysis
0:36:50 because what the students were doing wasn’t protected in the first place.
0:37:00 So, I’m asking this to learn, not to make a statement, but I struggle with, I feel like anonymity has been conflated with free speech to our detriment.
0:37:10 That some of the really vile things you see online, you know, I understand how important it is for individuals to have First Amendment free speech rights,
0:37:12 but I don’t think that applies to bots.
0:37:20 And I think that an understandable protection for anonymity has morphed into a total lack of accountability
0:37:22 and a real coarsening of our discourse online.
0:37:27 And I think it extends into letting government agencies wear masks and things like that.
0:37:35 But I’m curious, I’d love to just get your thoughts on the fulcrum between the importance of people having the right to say things anonymously
0:37:39 because what they’re saying could trigger danger or self-harm or harm for them.
0:37:45 And at the same time, how this reverence for anonymity may have gone too far and resulted in a lack of accountability
0:37:48 and some really ugly shit spreading online.
0:37:50 Your thoughts, Greg?
0:37:54 Well, in terms of First Amendment law, anonymous speech is protected.
0:37:57 But I don’t think that’s a sufficient answer.
0:38:03 And I think, I tend to think of the justification for anonymity as like a seesaw.
0:38:12 That essentially, if we lived in a free and enlightened society in which people welcomed dissent and welcomed disagreement
0:38:16 and there was no imaginable idea that you’d be punished for it,
0:38:22 then the justification for anonymity would kind of ring hollow to people.
0:38:24 They’d be like, who cares?
0:38:25 But we don’t live in that world.
0:38:27 And we live less in that world than we used to.
0:38:37 Because even, I’d say, 10 years ago or 12 years ago, before cancel culture, the idea of saying something that was your genuinely held opinion
0:38:46 had a much lower likelihood of ruining your career, add to it the possibility of being actually punished in some way.
0:38:56 Now, that certainly applies now, to a much larger degree than I ever thought I’d see, to, say, a lot of countries in Western Europe at this point, a lot of countries in the Anglosphere.
0:39:03 I mean, you know, by different estimates, they’re arresting something like 30 people a day for offense speech in Britain.
0:39:11 I’ve heard different accounts, but generally they go between 7 and 40 people being arrested a day for that.
0:39:14 Germany, you know, will brag about the fact that they do morning raids.
0:39:20 They did this on 60 Minutes as well, brag about doing morning raids on someone who called a politician a penis.
0:39:25 Under that situation, the justification for having an anonymous speech goes way up.
0:39:27 Can it and is it abused?
0:39:29 Absolutely.
0:39:35 But, you know, I think actually I’m going to quote Milton Friedman here, but it’s just a really good quote.
0:39:37 Something isn’t a right unless it can be abused.
0:39:39 I like that.
0:39:52 What about, so, Section 230, the idea that these nascent platforms aren’t subject to the same kind of libel, slander, and disparagement laws as traditional media platforms.
0:39:54 What are your thoughts on that?
0:39:59 I think we toy with Section 230 to our great peril.
0:40:07 I think that it’s, you know, like democracy, it’s the worst of any system except all the others.
0:40:13 Now, to be clear, there might be some other system that I haven’t thought of that could be better.
0:40:23 But when it comes to things and but I do find it particularly almost amusing that conservatives are going after 230 or were going after 230 with such gusto.
0:40:47 Because if you actually, even let’s just take it to the defamation protections that 230 gives to ISPs, to Internet service providers: if suddenly that were to vanish, it would lead to Internet service providers censoring a lot more, like a lot, lot more, because they could be held liable for defamation.
0:40:57 And I think that given the biases in a lot of social media companies, that would wildly disproportionately affect what conservatives say.
0:41:02 So I think that overall we benefit so much from 230.
0:41:04 Of course, it’s going to have downsides.
0:41:25 But traditional media platforms are struggling to stay viable and raise the funds to do fact checking and put out, I don’t want to say the truth, but a greater attempt to do the good work of journalism and fact check and do their research.
0:41:33 You don’t see a problem with holding traditional media to an entirely different standard, a higher standard, than these online platforms?
0:41:35 Yeah, and that’s generally the way they’re held.
0:41:40 Essentially, traditional media that’s responsible for the content that they produce.
0:41:49 I think it makes more sense to hold them liable for not doing the sufficient fact checking for defamation as opposed to something that hosts everything.
0:42:01 I mean, something that hosts the Wall Street Journal, the New York Times and YouTube and everything else under the sun, you know, is something that’s quite distinct than just, you know, just the New York Times by itself.
0:42:04 Do you think there’s opportunity for nuance or gray area?
0:42:08 And I’ll propose a solution or what I think we should think about.
0:42:09 I think about 230 a lot.
0:42:24 The idea that people can break through and say things and post something, that a company creates a lot of economic value, lets through a lot of interesting opinions, and sometimes the conspiracy view ends up being actually more true than you think.
0:42:36 There’s been some just wonderful things about these platforms and the ability for viewpoints and consumers or content producers to kind of go direct to consumer and kind of have at it.
0:42:43 At the same time, I worry that the protection is not consistent in the sense that, well, let me propose a solution.
0:42:56 So my co-host on one of my podcasts, Raging Moderates, is also kind of the sole Democrat on The Five, which is actually the most watched show on cable news.
0:43:15 Someone got upset that she called out Ken Paxton or something, and mocked up a picture of her with her previous boyfriend, and has, you know, gone into this tried and true misogynistic, slut-shaming misinformation, that she was having an affair, that he was her first husband, a man she’s never been married to.
0:43:16 I mean, just total nonsense, right?
0:43:39 And the algorithm on Twitter loves that, because it creates a lot of comments, a lot of engagement, people weighing in, conspiracy theorists, and also people protecting her, and that creates more Nissan ads; the algorithm on Twitter is trained to elevate that content and give it broader and further reach than it would get organically, because it creates Nissan ads.
0:43:42 In other words, there’s an economic incentive to spread this information.
0:43:49 Do you think there’s a solution or some gray area where maybe we say, okay, you’re a bulletin board and you can’t be responsible?
0:43:54 It’s just unrealistic, suppression of speech, economic impairment,
0:43:59 if you were responsible for policing everything someone pins up on the board.
0:44:08 But if you as a social media company decide to elevate algorithmically content and give it more spread than it might organically,
0:44:17 at that point, are you really different than an editor at CNBC or MSNBC or at Fox who’s subject to a different set of standards?
0:44:25 Shouldn’t they be subject to the same standards if they make the conscious decision to algorithmically elevate content?
0:44:37 I’m always worried about the distortative impact of government, also regulation sometimes, and liability.
0:45:00 And so I’m very hesitant to change anything, and my job is to make the argument to, you know, err on the side of free speech as much as possible and err on the side of as few things being banned and as few things being government regulated as possible.
0:45:13 So my fear is that essentially if you started having government entanglement with algorithmic choices, you know, you really got to decide, one, you know, which government do you trust?
0:45:15 Do you trust Biden to do that?
0:45:16 Do you trust Trump to do that?
0:45:26 But also, particularly when it’s a liability standpoint, how distortative that actually can be to what gets reported in the first place?
0:45:32 Because this is, in a sense, kind of like why everybody sues for libel in Britain.
0:45:46 Even though they’ve made slight improvements around the edges, it’s still much easier to find people liable for libel in Britain than it is in the United States.
0:46:02 We actually have a shield law in this country, basically, you know, providing some modicum of protection for people in the U.S. from libel tourism that takes place in the U.K.
0:46:11 So, you know, I’m always going to be fairly skeptical of that kind of stuff, but it’s also my societal role to be skeptical of that kind of stuff.
0:46:14 We’ll be right back.
0:46:26 Foldable phones have been around for a while now, but maybe you’ve never used one.
0:46:31 This week on The Verge Cast, we take a look at Samsung’s new lineup of foldables.
0:46:37 This could be a big moment where foldable phones become a lot more interesting to a lot more people.
0:46:45 Plus, we look at executive shakeups at Apple, Meta, and X, where Grok is going absolutely off the rails.
0:46:54 Plus, we do our signature microphone test with the latest over-ear headphones, and we get into why it’s so hard to make a great strength training app.
0:46:56 That’s this week on The Verge Cast.
0:47:06 We’re back with more from Greg Lukianoff.
0:47:12 So I want to throw kind of the most difficult stuff at you and get your thoughts.
0:47:25 So if I were Putin and I’d lost a million men to a war and was spending, you know, $70 to $100 billion a year on a losing war,
0:47:31 and at some point, if this war continued to wreak the kind of economic and human damage,
0:47:36 if it were to continue to do that, at some point I might find myself falling out of a window.
0:47:43 So I think he would be stupid not to weaponize and spin up troll farms in Albania
0:47:48 and then create a list of the 10,000 most influential people online who are pro-Ukraine
0:47:54 and start attacking their reputation with millions of bots in a very thoughtful way.
0:47:56 Is he already doing that, though?
0:47:57 Well, yeah, I think he is.
0:48:01 And that’s my question, and that is, do bots have First Amendment rights,
0:48:04 and do these platforms have some sort of obligation,
0:48:09 which I think would only be registered or adhered to through some sort of regulation,
0:48:14 to protect us against bad actors that might be, quite frankly,
0:48:18 raising a generation of military, civic, and nonprofit leaders who don’t like America
0:48:20 or begin to have their views shaped,
0:48:27 and ultimately our votes and our military decisions shaped on outside forces
0:48:34 that are taking advantage of a very porous and lightly regulated tech ecosystem and platforms.
0:48:35 Your thoughts?
0:48:38 The question of whether or not bots have First Amendment rights,
0:48:42 of course they don’t, but do bot creators have First Amendment rights?
0:48:46 At least when they’re in the United States, absolutely, certainly they do.
0:48:50 When you’re talking about the kind of propaganda, kind of warfare,
0:48:55 and targeting that is possible in the age of the Internet, in the age of social media,
0:49:00 when you kind of fall down this rabbit hole of how you actually address it
0:49:05 without actually devastating, without having huge government encroachment,
0:49:08 which will end in bad places as well,
0:49:12 or without creating massive unintended consequences,
0:49:18 the best way to do this historically has simply been to have authorities that people actually trust.
0:49:23 And we have blown giant holes in the only thing that really can protect you
0:49:25 from disinformation and misinformation.
0:49:32 And you have to start figuring out ways to get authorities that people essentially trust.
0:49:36 Because one of the ways we could potentially address some of this stuff
0:49:43 is by having institutions pointing out what are troll farms and what aren’t.
0:49:46 But under the current environment, there’s going to be a lot of, like,
0:49:47 sure they are,
0:49:51 you just don’t like what they’re saying, as the response there.
0:49:55 And in a situation where these institutions had better societal trust,
0:49:56 it’d be like, oh my God, you’re right.
0:49:59 So you’re a First Amendment attorney.
0:50:02 What are we not paying attention to in the courts?
0:50:05 Have there been any legal decisions that you think are
0:50:11 especially important to the future of First Amendment or speech or its regulation or lack thereof?
0:50:15 What have you seen come down the pike that you think has not gotten enough attention?
0:50:17 Yeah, I mean, I think it got good attention,
0:50:20 but I think people haven’t thought through all the ramifications of it.
0:50:26 And this gets to your point on anonymity where we may disagree to some degree.
0:50:39 But the change in the law to say that you can actually require verification for kids,
0:50:44 and really for anybody, to use porn sites in Texas,
0:50:50 is a case that could really have some serious bad ramifications
0:50:54 unless it stays relatively cabined.
0:50:57 Now, I was definitely among the First Amendment people saying,
0:51:01 listen, there’s a case called Ginsberg v. New York from the late 1960s
0:51:06 that says you can require store owners to, you know,
0:51:09 put the nudie mags on the back shelves
0:51:12 and to make sure that minors don’t get them.
0:51:19 But then we had decade after decade of the Supreme Court and other courts basically saying,
0:51:23 but online that can’t possibly apply for all sorts of,
0:51:25 and to be clear, very serious reasons.
0:51:28 I knew that wouldn’t last forever.
0:51:33 And then eventually Texas passed a verification regime
0:51:36 that was actually more complex than I originally understood,
0:51:41 but was first marketed as something where you had to basically show like a driver’s license
0:51:46 if you want to see porn that would also include that they had to have disclaimers
0:51:51 saying that, you know, porn is harmful to your mental health and all this kind of stuff,
0:51:53 which has compelled speech issues.
0:51:59 I think they made some efforts to sort of improve the law and make it clear that there’s other ways to verify.
0:52:04 Anyway, so that fight was something that I predicted we were going to lose in the Paxton case.
0:52:05 Now, here’s the question.
0:52:09 Are we then going to, with the best of intentions,
0:52:15 create an environment where you essentially can’t use the Internet without identifying yourself in some way?
0:52:20 And that scares me, because I do actually think that the situation for free speech,
0:52:25 even in the so-called free world, is dodgier than it’s ever been in my lifetime.
0:52:33 And the idea that at this precise moment we’d also make it harder for people to hide what they’re looking at
0:52:35 or what they’re reading scares me.
0:52:40 So our efforts at FIRE are definitely going to be to make sure that that decision,
0:52:46 as much as possible, stays cabined to kids’ access to adult materials.
0:52:48 Don’t you think the platforms are already doing that?
0:52:52 Don’t the platforms already know exactly what we’re doing, saying, and when?
0:52:58 But as long as they use it to monetize advertising, it seems to be there’s a tolerance for it.
0:53:01 I think the cat’s already out of the bag.
0:53:02 I think they already know everything we do and what we say.
0:53:04 We don’t seem that worried about that.
0:53:11 But then we have this, do we have this tremendous fidelity for protecting them
0:53:17 when it comes to any, I don’t know, forward-facing viewpoints that might result in more,
0:53:22 I don’t know, just more, it seems like we’re just protecting them in the wrong areas
0:53:24 and not looking at them in others.
0:53:26 I apologize for the word salad there, Greg.
0:53:28 Do you see any inconsistency?
0:53:30 Yeah, no, I definitely get the concern.
0:53:37 But I do think that there are tools that people badly underutilize that can actually protect your privacy.
0:53:38 Well, they’re purposely made complicated to utilize.
0:53:42 Have you tried to regulate your kid’s content on Facebook, on Meta?
0:53:48 I mean, they are not, I would argue, they are not readily accessible purposefully,
0:53:49 or they’re not easily used.
0:53:55 Yeah, no, and I definitely ask for help with my kid’s stuff.
0:53:59 But now we have, you know, we have Signal, you know, for example.
0:54:02 We use DuckDuckGo, you know, things like that.
0:54:05 I have Qustodio installed on my kid’s phones.
0:54:09 Yeah, there are some basic steps you can do to somewhat protect your privacy.
0:54:13 And of course, when it comes to private corporations doing bad things,
0:54:18 and this is something that I feel like we have an entire generation of young Americans
0:54:22 sort of brainwashed to believe that you should be more afraid of corporations
0:54:23 than you should be of governments.
0:54:30 And I just think that’s absolute nonsense, particularly foreign governments,
0:54:33 but also, frankly, the U.S. government.
0:54:39 And that corporations, you know, people talk about that evil profit motive.
0:54:43 And I’m kind of like, I prefer the profit motive to a lot of the other motives you can have
0:54:45 for finding this stuff out.
0:54:50 And profit motive often lends itself to, and by the way, we protect our users’ privacy
0:54:56 in a way that, you know, the Chinese, the CCP, or Russia, or even our own government,
0:55:00 it’s like, no, we want this information for other reasons.
0:55:06 So just so you know, I’m very good at turning this podcast into, it’s really just an excuse
0:55:07 for me to talk about me.
0:55:14 Steve Bannon suggested that the president, that the administration sue me for some of the things
0:55:14 I’ve said about him.
0:55:16 I called him a rapist.
0:55:23 And do you feel that the president is, in different ways, trying to suppress free speech?
0:55:26 And if and what laws, or what do you think should be done about it?
0:55:32 What are your views on it? It feels to me like free speech has been chilled by the administration.
0:55:34 And I’m just curious to get your thoughts on it.
0:55:35 Sounds like you agree with it.
0:55:38 But what can and should be done to push back on that?
0:55:44 Well, it’s tough because, okay, so the ways in which it’s being chilled, just really
0:55:48 quickly, there’s been, you know, attacks on mainstream media.
0:55:52 People can argue that it’s deserved, but that doesn’t mean you get to violate the First
0:55:52 Amendment.
0:55:54 There’s attacks on higher education.
0:55:58 Again, you can feel like it’s deserved, but it doesn’t mean you get to violate the First
0:56:00 Amendment or existing laws.
0:56:05 And then there’s the attack on the law firms, which probably is the one that I think gets
0:56:07 the least attention, but probably scares me the most.
0:56:11 When it comes to the media, for example, we’re the group defending Ann Selzer.
0:56:18 Ann Selzer is the pollster in Iowa who got the poll really wrong right before the election,
0:56:21 having Kamala up by two points in Iowa.
0:56:23 And of course, that was way off.
0:56:25 She was like 11 points off.
0:56:28 But when it came out, she apologized.
0:56:33 She explained how she got it wrong, saying that she was using methodology that was really
0:56:39 effective maybe 10 years ago, but has gotten increasingly ineffective as fewer people have
0:56:40 landlines and that kind of stuff.
0:56:43 Because she used to be considered like the gold standard of pollsters.
0:56:47 But she was nonetheless sued by Trump himself, actually.
0:56:49 This is before Inauguration Day.
0:56:55 Under a Consumer Protection Act in Iowa.
0:57:03 And the Consumer Protection Act was really designed for preventing false advertising, as in
0:57:07 commercial speech, saying that, you know, these pills will help you lose 40 pounds a day
0:57:12 type things, not getting a poll wrong, which is, you know, good faith reporting.
0:57:14 So we’re defending her in that case.
0:57:19 Then there’s also like the 60 Minutes situation, the ABC News.
0:57:25 The 60 Minutes one, I think of as particularly bad because it really seemed like the administration
0:57:36 was dangling a proposed merger with Skydance in front of CBS, saying, kind of implying, we’re not going to agree
0:57:39 to this unless you play ball, which is not good.
0:57:45 The university stuff, nobody’s been a bigger critic of Harvard, for example, than I’ve been.
0:57:51 They have finished dead last in our campus free speech rankings, one of the best and most data intensive
0:57:54 things that FIRE does, or the most data intensive thing that FIRE does.
0:57:57 And Harvard was dead last two years in a row.
0:58:02 But we’re currently defending Harvard because the letter the administration sent to Harvard
0:58:07 was basically saying, because you’re probably in violation of Title VI, which they may be,
0:58:17 and Title VII, which, when it comes to admissions, they probably are, that we essentially have to
0:58:19 nationalize Harvard.
0:58:24 Like basically, the government gets to decide all of the key things about what Harvard would
0:58:27 decide on its own, which is not a power the government’s been granted.
0:58:32 And when it comes to the law firms, I mean, that’s the one that I really, I wrote
0:58:38 about on my Substack, The Eternally Radical Idea, about all of these cases.
0:58:45 And it started with them just going after attorneys who had opposed the Trump administration,
0:58:50 even people like Robert Mueller, and the law firms where they worked, and saying that they
0:58:55 would be denied their Secret Service protection, and not Secret Service, their...
0:58:56 Yeah, their security details.
0:58:57 That’s repackaged violence.
0:59:03 If you’re someone who ordered a strike on Soleimani, the head of the Iranian security forces, and
0:59:07 you take away a general’s security detail, you’re putting that person in harm’s way, in my view.
0:59:12 And get rid of their security clearance, and then also deny them access to federal buildings,
0:59:15 which of course include, uh, courtrooms.
0:59:19 And that’s, that’s to me like some of the most chilling stuff.
0:59:20 Now, what can be done about it?
0:59:25 Uh, the thing that’s happening consistently is that Trump is losing in court.
0:59:30 And so far, he’s mostly been abiding by those rulings.
0:59:35 Um, I’m a little bit concerned, given how fast and loose sometimes this administration
0:59:40 plays with the rules, that that might not hold up when push really comes to shove.
0:59:42 Um, but, you know, fingers crossed.
0:59:47 In terms of, like, what else people can do about it, I think it really is a question of what
0:59:51 happens in the next, uh, in the midterms, and then, of course, in the presidential election.
0:59:56 Um, but it’s troubling, but not unexpected.
0:59:58 Yeah, shocking, but not surprising.
1:00:04 Uh, just as we wrap up here, Greg, a lot of young men listen to this podcast, um, based
1:00:08 on some of the many challenges, all young people, but especially, I would argue, some young men
1:00:10 in our society are facing right now.
1:00:14 A lot of them are struggling with their own mental health, and I appreciate how transparent
1:00:18 and vulnerable you were at the beginning of the podcast, talking about your own struggles,
1:00:22 and you had said that cognitive behavioral therapy really helped you.
1:00:28 Can you share, uh, some thoughts on your struggles with your own anxiety and depression and any
1:00:31 advice you might have for young people who are facing their own challenges?
1:00:37 Sure, um, and that’s, you know, and, and it’s tough because, like, you know, everyone struggles
1:00:43 kind of differently, and I, I really understand people’s kind of concern about, well, one, of
1:00:48 course, the expense of getting a therapist, uh, but also the fear that’s given that therapy
1:00:54 has to some degree become politicized, that they don’t want to end up someone with a therapist
1:00:58 who’s going to judge them, you know, for, for, for, in some cases, just for being a male
1:01:01 for having, um, and, you know, non-conforming political views.
1:01:08 Um, so I, I get all of that, but I, I did hear from one, uh, friend about their, about
1:01:15 their kid who basically said he didn’t need therapy because he watches a lot of podcasts.
1:01:22 Um, and he, he’s on YouTube a lot getting advice and it’s, and it’s just not the same thing.
1:01:23 Yeah, that’s not the fix.
1:01:30 Yeah, so there’s people out there like Camilo Ortiz, um, who is trying
1:01:37 to bring together apolitical therapists, um, ones who won’t judge you, you know,
1:01:41 who won’t let their political opinion interfere with their therapy, which is amazing that you
1:01:42 have to do that, but you do, unfortunately.
1:01:49 Um, so look for people who are recommended that way for CBT.
1:01:54 There’s also some, you know, some approaches to CBT that actually lend themselves fairly well
1:01:59 to even apps, um, which is, I don’t think it’s sufficient, uh, but it can help.
1:02:01 Uh, but here’s the thing.
1:02:06 It may be simple, but it’s not easy because you have to do it several times a day.
1:02:11 You have to actually do it when those, um, self-hating voices come up in your head,
1:02:12 those catastrophizing voices.
1:02:17 Otherwise your brain will never get in the habit of talking back to those and you have
1:02:23 to do it every time they come up and you have to do it for pretty much, like I would say probably
1:02:27 you’re not really going to see much change, uh, for the first six months even.
1:02:31 But I remember about nine months in suddenly being like, wait a second, all these things that
1:02:34 used to pop up in my head, they’re not, they don’t sound convincing anymore.
1:02:36 And it was really dramatic after that.
1:02:39 So I, I definitely believe looking to CBT.
1:02:44 I think that, you know, um, one thing that I do a lot when I’m having a hard time is I
1:02:46 go reread Seneca’s letters to a young man.
1:02:52 I, I, I find that they’re really approachable, uh, meditation is very helpful to people, but
1:02:54 don’t forget the simple things.
1:02:55 Exercise is really key.
1:02:59 And if you’re into reading, um, there is something there.
1:03:02 My favorite book is The Upward Spiral by Alex Korb.
1:03:07 I highly recommend it. I have, like, a whole Substack post, um,
1:03:09 on this very issue,
1:03:11 because I get asked about it so much, and I give kind of all of my advice.
1:03:13 And what’s that sub stack called or what’s that post called?
1:03:18 Well, I don’t remember what that post is called, but my Substack is The Eternally Radical Idea.
1:03:21 Uh, but also, you know, talk to people about it, talk to friends.
1:03:26 Um, when you’re in a really bad way, there’s a sense that nobody’s going to want to hear
1:03:29 your whining about it, but that’s just not true.
1:03:34 Um, and in most cases, because, you know, as hard as it may
1:03:38 be to believe sometimes when you’re really deep down and dark, um, there are people out
1:03:39 there who love you.
1:03:44 Greg Lukianoff is a free speech advocate, First Amendment attorney, president of FIRE, the
1:03:49 Foundation for Individual Rights and Expression, and coauthor of The Coddling of the American
1:03:55 Mind, which has probably had more impact on my parenting than any book I have read.
1:04:01 Uh, and also The Canceling of the American Mind. His latest book, War on Words: 10 Arguments
1:04:05 Against Free Speech and Why They Fail, is out next week.
1:04:07 He joins us from Maine.
1:04:11 Greg, I’ve wanted to meet you and speak to you for a couple of years, because one
1:04:18 of my role models, uh, Jonathan Haidt, whenever he talks about you, he speaks about you with such
1:04:19 reverence and with such respect.
1:04:23 Uh, so I was really excited to, uh, have this conversation.
1:04:24 Very much appreciate your time.
1:04:30 Also very much appreciate, uh, what you said, um, at the end about cognitive behavioral therapy.
1:04:37 Uh, and the takeaway I have, and that I hope people take away from this podcast, is when
1:04:41 you’re really down and you think everyone’s sick of you and sick of hearing from you and doesn’t
1:04:43 have time for you, that that’s just not true.
1:04:45 Uh, so anyways, thank you for sharing that, Greg.
1:05:14 My father passed away last week and it’s been a rough few days for me as it is for anybody
1:05:15 who loses a parent.
1:05:20 Uh, our species’ competitive advantage is our brain.
1:05:24 It’s so big that we’re expelled from the womb prematurely, and our brain is exactly the
1:05:25 wrong size.
1:05:30 It’s big enough to ask very complicated questions, but not big enough to answer them.
1:05:35 And death is something our brain still hasn’t come, uh, to grips with.
1:05:39 And that is, um, especially with a parent, this is someone who is your first protector.
1:05:44 And then when you lose that person, the idea that all of a sudden that protector isn’t
1:05:45 around is devastating.
1:05:47 It’s a mirror.
1:05:52 You see a lot of yourself in this person and, uh, you immediately think about all the different
1:05:55 things in your life that developed good and bad with this person.
1:05:59 And you have to deal with those and come to attempt to come to grips with them, which
1:06:00 sometimes can be painful.
1:06:03 Our brains are used to continuity and patterns.
1:06:07 We’re used to having that person in our life and we assume they’re going to be around
1:06:09 forever and it’s impossible to believe they’re not going to be around forever.
1:06:14 So the finality of death is just very shocking and very difficult to wrap your head around.
1:06:20 The biggest or most profound moments in my life have involved, uh, birth and death.
1:06:26 Uh, the death of my mother: my mother passed away when I was 39 after what was a pretty ugly
1:06:32 battle with a smoking-related illness, um, specifically cancer, breast cancer, uh, twice.
1:06:33 And then it metastasized in her stomach.
1:06:40 Uh, and it was just the finality and the harshness of it and the brutality of the way she died kind
1:06:46 of really, um, it’s sort of, you know, these things change you.
1:06:50 They really, I think for most people, you’re sort of never the same.
1:06:54 I was much lighter and funnier before that happened.
1:06:59 And I think something kind of died in me, but at the same time, I developed a wonderful sense
1:07:00 of the finite nature of life.
1:07:03 And then when my kids were born, that changed everything for me.
1:07:09 I became much more responsible, uh, much more anxious, but, uh, started for the first time
1:07:12 in my life thinking about other people, which was an enormous unlock.
1:07:13 And I’ll come back to that.
1:06:19 And then the death of my father is a different sort of feeling; I was not nearly as close
1:07:21 to my father as I was to my mother.
1:07:26 So my dad, George Thomas Galloway was born in 1930 in Sydney, Australia to a woman who was
1:07:28 a domestic servant for a wealthy family.
1:07:30 He was born out of wedlock.
1:07:36 And the deal was that the family found out my grandmother was pregnant and they had a daughter
1:07:39 who did not have any children of her own and was in her thirties, which was
1:07:40 considered a, you know, a tragedy.
1:07:46 And they agreed to adopt her child, her unborn baby and would give her enough money.
1:07:48 But the deal was she had to leave.
1:07:52 And I think they even gave her some money, uh, cause they didn’t want the biological mother
1:07:52 around.
1:07:54 And my, my grandmother agreed.
1:07:58 And then, uh, my grandmother gave birth to my father in Sydney, Australia.
1:08:03 And I don’t know the full story, but she convinced her boyfriend, the father of the child, who
1:08:05 was obviously very upset, to meet her at the docks.
1:08:09 And they got on a ship for, for Scotland.
1:08:14 I can’t even imagine what the ship route was like from Sydney to Glasgow.
1:08:17 And so my father always jokingly said, I could have been a McVicar.
1:08:20 It was the McVicar family that built like battleships or something.
1:08:21 And he says that he’s pissed off.
1:08:25 He would have much rather stayed in Sydney, Australia, as the son of a rich family.
1:08:32 Anyways, uh, raised in Depression-era Scotland and World War II Scotland, he says his first
1:08:34 memory is watching the Clydebank raid,
1:08:41 I think it’s called, where Heinkels and Messerschmitts dropped bombs on, uh, munitions
1:08:45 factories or shipbuilding factories, uh, just outside, I believe, of Glasgow.
1:08:49 And he jokes that, uh, they were obviously very patriotic.
1:08:55 He was nine when the war broke out, 15 when it ended, and anyone with an accent in
1:08:59 Glasgow, he and his 10-year-old buddies would follow around and take notes on them,
1:09:00 because they assumed they were spies.
1:09:07 And then, uh, after the war ended, he was 15, but at the age of 17, he lied about
1:09:11 his age and went to a recruitment office and wanted to be a pilot for the RAF.
1:09:14 And the recruiter said, you’re too tall to be a pilot.
1:09:17 So he went across the street or somewhere to where they were recruiting for the Royal Navy
1:09:20 and joined the Navy at a very young age.
1:09:26 And before he knew it, he was on, uh, I believe it was an aircraft carrier.
1:09:29 They do an assessment, a skills assessment test.
1:09:32 And he disclosed that he could repair things,
1:09:35 that he repaired motorcycles, and that he was a good swimmer.
1:09:41 And so the next day he was no joke in a helicopter, in a wetsuit, in the North Atlantic,
1:09:44 practicing what he found out later was pilot rescue.
1:09:49 They kind of informed him what he was going to do while he was in the helicopter in a poorly
1:09:50 fashioned wetsuit.
1:09:52 And they said, okay, this is the deal.
1:09:53 You’re going to jump out into the water.
1:09:56 We’re going to throw out a 150-pound dummy, not in that order.
1:09:58 Then we’re going to lower a basket.
1:10:04 And your job is to get this 150 pound dummy into a basket as if it was a pilot in the North
1:10:04 Atlantic.
1:10:08 Oh, and by the way, even with your wetsuit, in about 14 minutes, you’re going to die of
1:10:08 exposure.
1:10:09 So there’s some motivation.
1:10:16 So my dad jumps into the wavy North Atlantic when it was dark out and tries to get
1:10:18 this dummy into this basket.
1:10:23 And then he said the scariest moment was he got the dummy into the basket, most exhausting
1:10:23 thing he’s ever done.
1:10:24 They pull it up.
1:10:28 And he said the current started taking him away from the helicopter.
1:10:32 And he was worried they were no longer even going to be able to see him and get him out.
1:10:33 And they drop a winch.
1:10:35 He connects it and they pull him up.
1:10:38 His first week, he got his pay.
1:10:42 He put it in a locker at the foot of his cot.
1:10:44 And the entire locker was stolen.
1:10:49 I guess this was sort of something that the freshman recruits were stupid and would put
1:10:51 money in their locker thinking it would be secure.
1:10:54 So there was a service where he could send money home.
1:10:58 So he would send all of his money from the Navy, all of his pay home to his mother.
1:11:02 And after two years, he calculated he had enough money to get to America.
1:11:07 And he came home to find out that his mother had spent his money on whiskey and cigarettes.
1:11:12 And in her defense, she claimed that, what did you expect me to do?
1:11:13 I was bored.
1:11:18 So my father has always had a very unhealthy relationship with money.
1:11:22 I mean, it really scarred him, growing up in Depression-era Scotland, and I think acts
1:11:23 like that.
1:11:30 But he did get some money together and got to America and led what could arguably be called
1:11:31 the American dream.
1:11:36 My favorite story about him first arriving in America was he and my mom met in Canada.
1:11:37 They got pregnant.
1:11:38 They hated the weather.
1:11:39 They bought a newspaper.
1:11:43 And there was an article saying that the nicest weather in North America was in San Diego.
1:11:49 So they loaded up my seven-month-pregnant mom into an Austin Mini Metro and drove from
1:11:51 Toronto to San Diego.
1:11:54 My dad’s first job interview was to be a salesman for a candle company.
1:11:58 And the head of HR there said, you’ve got to stay here.
1:11:59 She asked him how long he’d been in the country.
1:12:00 And he said, just two weeks.
1:12:01 And she said, that’s incredible.
1:12:03 She said, just wait right here.
1:12:07 And she went and got her boss and said, you need to meet this guy, Tom Galloway.
1:12:11 He’s only been in the country for two weeks from Scotland and he can already speak the language.
1:12:12 I love that.
1:12:19 Anyways, my dad was aggressive, smart, and charming and got jobs.
1:12:23 And at his peak, we had a home in Laguna Niguel.
1:12:28 I was thriving and they were living what could best be described as an upper-middle-class life.
1:12:33 And unfortunately, my dad was not a high-character person.
1:12:34 Married and divorced four times.
1:12:39 A handsome man with a strong jaw and a Glaswegian accent in 70s California.
1:12:42 Did not only think with his dick, he could listen to it.
1:12:49 And my dad rifled through four marriages and four divorces, including his last one where he decided to leave his fourth wife when she had late-stage Parkinson’s.
1:13:01 So he was never able to really connect with people or ever really develop a sense outside of kind of the survival instinct of investing in other people and other people’s relationships with other people.
1:13:03 He was broken that way.
1:13:06 And there’s just no not acknowledging that.
1:14:17 However, what has helped me, or some of the learnings here, what has helped me process this is that I love that Dr. Seuss saying: don’t cry because it’s over.
1:14:18 Smile because it happened.
1:13:27 I’m thinking about all the things I’m grateful for, being the son of George Galloway, on very basic things.
1:13:27 I am tall.
1:13:29 I have broad shoulders.
1:13:30 I have a good voice.
1:13:32 And I have made an exceptional living communicating.
1:13:34 None of those things are my fault.
1:13:36 I got all of those things from my father.
1:13:41 And just because maybe he didn’t purposely work to give me those things, there’s no reason I can’t be grateful for them.
1:13:50 I think the ultimate test of evolution and the most basic box you need to check as a man is the following.
1:13:56 And that is, are you a better father to your son than your father was to you?
1:13:59 And my dad checked that box in indelible ink.
1:14:03 His father, I found out later in life, used to physically abuse him.
1:14:05 I can’t even imagine what that would be like.
1:14:10 The person you’re supposed to trust most in the world, the person who’s supposed to protect you, physically abuses you.
1:14:12 And he never did that for me.
1:14:17 And he did try: after my mom and dad got divorced, he would meet me in places and take me to museums.
1:14:22 And there’s a lot for me to be grateful for.
1:14:35 What has helped me in terms of my relationship with my father that has been one of the biggest unlocks in my life, hands down, was I really struggled with my relationship with my father.
1:14:47 Because every time I thought I was being a good son, I would remember that he kind of left me and my mom and wasn’t very kind to us, and I would get resentful and angry and kind of cut him out of my life for small periods of time.
1:14:55 And then an enormous unlock, and my biggest piece of advice, if you’ve made it this far, is I decided, okay, what kind of son do I want to be?
1:15:05 Don’t think about the relationship as a transaction, what he owes you or what you owe him, or base your behavior on what he did or did not do for you.
1:15:08 Just simply put: what kind of son do I want to be?
1:15:13 And the reality is I wanted to be a loving, generous son, and I wanted to have a great relationship with my dad.
1:15:17 And I had all of the qualities and resources to do that.
1:15:22 And from that moment on, I put away the scorecard, I put away the bullshit, and I was a great son.
1:15:25 And it was not only wonderful for him, it was wonderful for me.
1:15:29 I’ve loved these last 20 years of just having a great relationship with my father.
1:15:30 He’s charming.
1:15:30 He’s funny.
1:15:31 He does love me.
1:15:36 And it’s just been a huge lesson for me in life.
1:15:48 You know, not basing how I behave around my partner on whether she’s good to my parents or nice and generous to me, but asking, what kind of partner do I want to be?
1:15:49 What kind of business person do I want to be?
1:15:52 What kind of employer do I want to be?
1:15:55 I used to look at every employee as, are they adding as much value as I’m paying them?
1:15:57 And if they’re not, I’m going to fire them.
1:15:59 Now I think, how can I be just an amazing employer?
1:16:01 How can I be an amazing friend?
1:16:05 What kind of investor do I want to be known as?
1:16:08 And then live to that standard and put away the scorecard.
1:16:13 Don’t base your behavior on what you did or didn’t get from that person, but on the person you want to be.
1:16:16 And that’s just been such an enormous unlock for me.
1:16:25 So as I sit here and I think about my dad and I try to process his death, you know, I think this guy lived the American dream.
1:16:27 He came to America.
1:16:32 The biggest gift he gave me was that he took this enormous risk and got to America.
1:16:45 And so much of my success, so much of the ability to have a wonderful family is based on something that was not my fault, specifically my dad having the courage and taking the risk to get to America.
1:16:47 And I’m very appreciative of that.
1:16:52 George Thomas Galloway was 95 when he died.
1:16:53 He was very much a man.
1:16:55 He was a protector.
1:16:58 He wanted to protect his country.
1:16:59 He was a provider.
1:17:01 He provided for two families.
1:17:04 And he was a procreator.
1:17:07 He had two kids and four grandkids.
1:17:26 His son and his daughter will miss him terribly.
1:17:52 This episode was produced by Jennifer Sanchez.
1:17:55 Our assistant producer is Laura Janair.
1:17:57 Drew Burrows is our technical director.
1:18:01 Thank you for listening to the Prop G Pod from the Vox Media Podcast Network.
Greg Lukianoff, a free speech advocate, First Amendment attorney, and president of FIRE, joins Scott to break down the rise of cancel culture and its chilling effect on free speech.
They discuss why social media supercharged censorship, how college campuses became ground zero for speech suppression, and why younger generations may be more fragile and less free. Greg also opens up about his own struggles with anxiety and how cognitive behavioral therapy helped rewire his thinking.
Follow Greg, @glukianoff.
Algebra of Happiness: in memory of George Thomas Galloway (1930 – 2025)
Learn more about your ad choices. Visit podcastchoices.com/adchoices