AI transcript
0:00:03 Support for the show comes from Atlassian.
0:00:05 Atlassian software like Jira, Confluence, and Loom
0:00:07 help power the collaboration needed for teams
0:00:10 to accomplish what would otherwise be impossible alone.
0:00:11 Because individually, we’re great,
0:00:13 but together, we’re so much better.
0:00:15 That’s why millions of teams around the world,
0:00:17 including 75% of the Fortune 500,
0:00:18 trust Atlassian software for everything
0:00:21 from space exploration and green energy
0:00:22 to delivering pizzas and podcasts.
0:00:25 Whether you’re a team of two, 200, or two million,
0:00:28 Atlassian software is built to help keep you connected
0:00:29 and moving together as one.
0:00:31 Learn how to unleash the potential of your team
0:00:32 at Atlassian.com.
0:00:36 That’s A-T-L-A-S-S-I-A-N.com.
0:00:37 Atlassian.
0:00:45 – So you’ve arrived.
0:00:47 You head to the Brasserie, then the terrace.
0:00:48 Cocktail?
0:00:52 Don’t mind if I do.
0:00:54 You raise your glass to another guest
0:00:57 because you both know the holiday’s just beginning.
0:00:58 – Passengers, please proceed to gate four.
0:01:01 – And you’re only in terminal three.
0:01:03 Welcome to Virgin Atlantic’s
0:01:06 unique Upper Class Clubhouse experience,
0:01:07 where you’ll feel like you’ve arrived
0:01:09 before you’ve taken off.
0:01:13 Virgin Atlantic, see the world differently.
0:01:21 – I’m Scott Galloway, and this is No Mercy, No Malice.
0:01:24 We’ve decided that children should not have access
0:01:29 to the military, alcohol, driving, pornography,
0:01:31 and many other things.
0:01:35 Social media, come one, come all.
0:01:37 Age-gating, as read by George Hahn.
0:01:44 – I’m in a dark place.
0:01:46 I just watched democracy collapsing
0:01:50 as a con man abused an old man.
0:01:52 I haven’t hit rock bottom yet,
0:01:55 so let’s discuss social media and age-gating.
0:01:59 Social media is unprecedented
0:02:02 in its reach and addictive potential,
0:02:06 a bottomless dopa bag that fits in your pocket.
0:02:09 For kids, it poses heightened risks.
0:02:13 The evidence is overwhelming and has been for a while.
0:02:17 It just took a beat to absorb how brazen the lies
0:02:19 (“We’re proud of our progress”) were.
0:02:22 Social media can be dangerous.
0:02:24 That doesn’t make it net bad.
0:02:27 There’s plenty of good things about it.
0:02:31 But similar to automobiles, alcohol, and AK-47s,
0:02:34 it has a mixed impact on our lives.
0:02:36 It presents dangers.
0:02:39 And one of the things a healthy society does
0:02:42 is limit the availability of dangerous products
0:02:46 to children who lack the capacity to use them safely.
0:02:51 Yet two decades into the social media era,
0:02:54 we permit unlimited all-ages access
0:02:57 to this dangerous addictive product.
0:02:58 The reason?
0:03:00 Incentives.
0:03:04 Specifically, the platforms, disincentivized
0:03:07 to age-gate their products, throw sand in the gears
0:03:10 of any effort to limit access.
0:03:15 To change the outcome, we must change the incentives.
0:03:20 I’m a better person when I drink.
0:03:23 More interesting and more interested.
0:03:25 One of the reasons I work out so much
0:03:28 is so I can continue to drink.
0:03:31 Muscle absorbs alcohol better than fat does.
0:03:33 Kids are different.
0:03:35 And we’ve long been comfortable
0:03:37 treating them differently.
0:03:41 In 1838, Wisconsin forbade the sale of liquor to minors
0:03:43 without parental consent.
0:03:47 And by 1900, state laws setting minimum drinking ages
0:03:48 were common.
0:03:50 There’s a good case to be made
0:03:54 that the U.S. alcohol limit of 21 is too high.
0:03:56 But nobody would argue we should dispense
0:03:58 with age-gating booze altogether.
0:04:02 That trend has paralleled laws
0:04:05 restricting childhood access to other things.
0:04:08 The right to bear arms is enshrined in the Constitution,
0:04:10 yet courts don’t blink at keeping guns
0:04:12 out of the hands of children,
0:04:14 even as they dismantle every other limitation
0:04:16 on gun ownership.
0:04:17 If there’s a lobbying group trying
0:04:20 to give driver’s licenses to 13-year-olds,
0:04:21 I can’t find it.
0:04:25 Age of consent laws make sex with children a crime.
0:04:27 Minors are not permitted to enter into contracts.
0:04:30 We limit the hours and conditions in which they can work.
0:04:33 They cannot serve in the military or on juries,
0:04:36 nor are they allowed to vote.
0:04:38 That last one we may want to reconsider.
0:04:41 These are not trivial things.
0:04:44 On the contrary, we exclude children
0:04:47 from or substantially limit their participation
0:04:51 in many core activities of society.
0:04:55 The only time I have appeared on late-night TV
0:04:58 was when Jimmy Fallon mocked me,
0:05:00 showing a CNN video clip where I said,
0:05:03 “I would rather give my 14-year-old son
0:05:06 “a bottle of Jack Daniels and marijuana
0:05:08 “than an Instagram and a Snap account.”
0:05:12 4,000 likes and 265,000 views later,
0:05:14 it appears America agrees.
0:05:17 My now almost 17-year-old son has engaged
0:05:20 with all three substances.
0:05:23 Alcohol and Instagram make him feel worse afterward.
0:05:25 Not sure about weed.
0:05:27 However, he is restricted from carrying
0:05:29 a bottle of Jack in his pocket,
0:05:31 and his parents would ask for a word
0:05:34 if his face was hermetically sealed to a bong.
0:05:37 Note, spare me any bullshit parenting advice
0:05:40 from non-parents or therapists
0:05:42 whose kids don’t come home for the holidays.
0:05:47 He, we, and society restrict his access
0:05:49 to these substances,
0:05:51 and when he abstains from drinking or smoking,
0:05:54 he isn’t sequestered from all social contact
0:05:57 and the connective tissue of his peer group.
0:06:01 We freaked out when we found, as you will if you have boys,
0:06:04 porn on one of his devices.
0:06:06 But the research is clear.
0:06:07 We should be more alarmed
0:06:12 when we find Instagram, Snap, or TikTok on his phone.
0:06:14 Mark Zuckerberg and Sheryl Sandberg
0:06:17 are the pornographers of our global economy.
0:06:21 Actually, that’s unfair to pornographers.
0:06:25 Age-gating social media is hugely popular.
0:06:28 Over 80% of adults believe parental consent
0:06:30 should be required for social media
0:06:34 and almost 70% want platforms
0:06:37 to limit the time minors spend on them.
0:06:40 Those numbers are from last fall
0:06:43 before my NYU colleague, Jonathan Haidt,
0:06:46 published The Anxious Generation,
0:06:49 which builds on the work of Jean Twenge and others,
0:06:51 making the most forceful case yet
0:06:54 that social media is hurting our children.
0:06:59 Reviewing the shocking increase in depression, self-harm,
0:07:03 and general suffering our children are experiencing,
0:07:07 and the explanations offered by the platform apologists,
0:07:10 Professor Haidt highlights the twin specters
0:07:13 of social media and mobile devices
0:07:17 and the lasting damage they’re doing to a generation.
0:07:21 Unconstrained smartphone use, Haidt observes,
0:07:25 has been, quote, “The largest uncontrolled experiment
0:07:29 “humanity has ever performed on its own children,”
0:07:33 unquote, and the results are in.
0:07:36 Legislatures are responding.
0:07:39 States from California to Utah to Louisiana
0:07:41 have passed laws that limit access
0:07:43 to social media based on age.
0:07:45 If you haven’t noticed any change
0:07:47 in the behavior of the platforms, however,
0:07:51 that’s because courts have blocked nearly all of them.
0:07:54 A social media and digital commerce trade group
0:07:58 called NetChoice is quick to sue any state
0:08:00 that interferes with its members’ ability
0:08:03 to exploit children for maximum profit.
0:08:07 Judges are siding with the platforms,
0:08:10 and probably not because they enjoy seeing depressed teenagers
0:08:13 fed content glorifying self-harm
0:08:18 or teenage boys committing suicide after being sextorted.
0:08:21 The platforms and other opponents of these laws,
0:08:25 such as the ACLU, make two main points.
0:08:29 First, they claim that verifying age online
0:08:32 is too complicated, requires the collection
0:08:34 of all sorts of information about users,
0:08:37 and won’t work in all cases.
0:08:41 Second, requiring platforms to collect this information
0:08:45 creates free speech, privacy, and security concerns.
0:08:48 The platforms also deny their products
0:08:50 are harmful to children.
0:08:54 On their face, these points are valid.
0:08:57 It is more difficult to confirm age online,
0:08:58 where there’s no clerk at the counter
0:09:00 who can ask to see your driver’s license
0:09:02 and reference your face.
0:09:05 And these platforms have proven reckless
0:09:06 with personal data.
0:09:09 It’s sort of a “they’re so irresponsible,
0:09:11 but we can’t take action” dilemma.
0:09:15 But these objections are not about age verification,
0:09:19 children’s rights, free speech, or privacy.
0:09:24 They are concerns about the platform companies’ capabilities.
0:09:27 Their arguments boil down to the assertion
0:09:30 that these multibillion-dollar organizations,
0:09:33 which have assembled vast pools of human capital
0:09:35 and wield god-like technology,
0:09:38 can’t figure out how to build effective, efficient,
0:09:41 constitutionally compliant age verification systems
0:09:43 to protect children.
0:09:49 If this sounds like bullshit, trust your instincts.
0:09:52 This isn’t a conversation regarding the realm
0:09:55 of the possible, but the profitable.
0:10:00 When you pay an industry not to understand something,
0:10:03 it will never figure it out.
0:10:05 Just look at the tobacco industry’s inability
0:10:07 to see a link with cancer.
0:10:10 What’s more challenging?
0:10:13 Figuring out if someone is younger than 16,
0:10:16 or building a global real-time communication network
0:10:19 that stores a near-infinite amount of text, video,
0:10:23 and audio retrievable by billions of simultaneous users
0:10:26 in milliseconds with 24/7 uptime?
0:10:30 The social media giants know where you are,
0:10:33 what you’re doing, how you’re feeling,
0:10:36 and if you’re experiencing suicidal ideation.
0:10:39 But they can’t figure out your age.
0:10:42 You can’t make this shit up.
0:10:45 The platforms could design technology
0:10:47 that reliably collects sufficient information
0:10:49 to confirm a user’s age,
0:10:53 then wipes the information from its servers.
0:10:56 They could create a private or public entity
0:11:00 that processes age verification anonymously.
0:11:02 Remember the blockchain?
0:11:05 Isn’t this exactly the kind of problem
0:11:06 it was supposed to solve?
0:11:11 They could deploy AI to estimate when a user
0:11:14 is likely underage based on their online behaviors
0:11:18 and seek age verification from at-risk people.
0:11:23 If device manufacturers, or just the device OS duopoly
0:11:28 of Apple and Alphabet, were properly incentivized,
0:11:30 they could implement age verification
0:11:32 on the device itself.
0:11:34 This is what Meta says it wants
0:11:38 when it isn’t fighting age verification requirements.
0:11:42 Or, crazy idea, they could stop glorifying suicide
0:11:45 and pushing pornography to everyone.
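To make one of those ideas concrete, the “private or public entity that processes age verification anonymously,” here is a minimal sketch in Python. Everything in it is illustrative and assumed, not any platform’s actual system: a hypothetical standalone verifier checks a birthdate, issues a signed over-the-threshold token, keeps no record of the birthdate, and the platform later checks only the token.

```python
# Hypothetical sketch of anonymous age verification (assumed design, not a
# real platform API): a separate verifier checks a birthdate, issues a signed
# "over the threshold" token, stores nothing about the user, and the platform
# later confirms only the token's signature.
import hashlib
import hmac
import json
import secrets
from datetime import date

AGE_THRESHOLD = 16  # illustrative threshold, not a legal standard


class AgeVerifier:
    """Stand-in for a private or public verification entity."""

    def __init__(self):
        self._key = secrets.token_bytes(32)  # signing key held only by the verifier

    def issue_token(self, birthdate: date):
        today = date.today()
        age = today.year - birthdate.year - (
            (today.month, today.day) < (birthdate.month, birthdate.day)
        )
        if age < AGE_THRESHOLD:
            return None  # no token for under-age users
        # The token carries only a random ID and the claim, never the birthdate.
        payload = {"token_id": secrets.token_hex(16), "claim": f"age>={AGE_THRESHOLD}"}
        sig = hmac.new(
            self._key, json.dumps(payload, sort_keys=True).encode(), hashlib.sha256
        ).hexdigest()
        return {"payload": payload, "sig": sig}  # the birthdate is not stored anywhere

    def check_token(self, token) -> bool:
        expected = hmac.new(
            self._key, json.dumps(token["payload"], sort_keys=True).encode(), hashlib.sha256
        ).hexdigest()
        return hmac.compare_digest(expected, token["sig"])


# Usage: the platform asks the verifier whether a presented token is genuine,
# learning only that the holder cleared the age threshold.
verifier = AgeVerifier()
token = verifier.issue_token(date(2005, 3, 1))
print("admitted" if token and verifier.check_token(token) else "blocked")
```

A production system would presumably use public-key signatures so platforms could check tokens without contacting the issuer, but the shape of the exchange is the same: verify once, prove the claim, retain nothing.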
0:11:50 The reason Zuck and the other Axis powers
0:11:53 haven’t built age verification into their platforms
0:11:56 is that it will reduce their profits,
0:11:59 because they will serve fewer ads to kids,
0:12:02 which will suppress their stock prices,
0:12:05 and the job of a public company CEO
0:12:09 is to increase the stock price.
0:12:14 Period, full stop, end of strategic plan.
0:12:18 So long as the negative impact to the stock price
0:12:21 caused by the bad PR of teen suicide and depression
0:12:25 is less than the positive impact
0:12:27 of the incremental ad revenue obtained
0:12:30 through unrestricted algorithmic manipulation
0:12:32 of those teens,
0:12:36 the rational, shareholder-driven thing to do
0:12:40 is fight age verification requirements.
0:12:42 If we want the platforms
0:12:45 to make their products safe for children,
0:12:47 we need to change the incentives,
0:12:51 force them to bear the cost of their damage,
0:12:56 internalize the externalities, in economist-speak.
0:13:00 There are three forces powerful enough to do this:
0:13:05 the market, plaintiff lawyers, and the government.
0:13:09 The market solution would be to let consumers decide
0:13:12 if they want to be exploited and manipulated.
0:13:16 And by consumers, I mean teenagers.
0:13:18 One big shortcoming of this approach
0:13:22 is that teenagers are idiots.
0:13:26 I have proof here, as I’m raising two and I used to be one.
0:13:30 My job as their dad is to be their prefrontal cortex
0:13:32 until it shows up.
0:13:36 I told my son on a Thursday it was Thursday
0:13:39 and he disagreed.
0:13:44 The next approach is to let the platforms do whatever they want,
0:13:46 but if they harm someone,
0:13:49 let that person sue them for damages.
0:13:51 This is how we police product safety
0:13:54 in almost all contexts.
0:13:57 Did your car’s airbag explode shrapnel into your neck?
0:14:00 Sue Takata.
0:14:02 Did talcum powder give you cancer?
0:14:04 Sue J&J.
0:14:07 Did your phone burn the skin off your leg?
0:14:10 Sue Samsung.
0:14:12 People don’t like plaintiff lawyers,
0:14:14 but lawsuits are a big part of the reason
0:14:18 that more products don’t give you cancer or scald you.
0:14:22 Nobody can successfully sue social media platforms,
0:14:27 however, because of a 28-year-old law known as Section 230,
0:14:32 which gives them blanket protection against litigation.
0:14:35 I’ve written about the need to limit Section 230 before
0:14:36 and whenever I do,
0:14:41 a zombie apocalypse of free speech absolutists is unleashed.
0:14:45 The proposition remains unchanged, however.
0:14:47 If social media platforms believe
0:14:49 they’ve done everything reasonable
0:14:53 to protect children from the dangers of their product,
0:14:55 then let them prove it in court.
0:14:58 Or better yet, let the fear of tobacco-
0:15:01 or asbestos-shaped litigation gorging on their profits
0:15:06 motivate them to age-gate their products.
0:15:09 Finally, the government can go after companies
0:15:12 whose products harm consumers.
0:15:15 The Federal Trade Commission has fined Meta $5 billion
0:15:19 over privacy violations to no apparent effect.
0:15:23 This was perfect, except it was missing a zero.
0:15:29 For these firms, $5 billion is a nuisance, not a deterrent.
0:15:32 There’s a bill in the Senate right now,
0:15:34 the Kids Online Safety Act,
0:15:39 which would give the FTC new authority to go after platforms
0:15:42 which fail to build guardrails for kids.
0:15:44 It’s not without risk.
0:15:46 Some right-wing groups are supporting it
0:15:49 because they believe it can be used to suppress LGBT content
0:15:53 or anything else the patriarchy deems undesirable.
0:15:57 But I have more faith in Congress’s ability to refine a law
0:16:00 than I do in the social platforms’ willingness
0:16:02 to change without one.
0:16:06 Until we change the incentives
0:16:09 and put the costs of these platforms where they belong,
0:16:13 on the platforms themselves, they will not change.
0:16:18 Legislators trying to design age-gating systems
0:16:21 or craft detailed policies for platforms
0:16:24 are playing a fool’s game.
0:16:27 The social media companies can just shoot holes
0:16:30 in every piece of legislation, fund endless lawsuits,
0:16:34 and deploy their armies of lobbyists and faux heat shields
0:16:38 (Lean In), all the while making their systems
0:16:41 ever more addictive and exploitative.
0:16:44 Or maybe we have it wrong
0:16:47 and we should let our kids drink, drive,
0:16:50 and join the military at 12.
0:16:52 After slitting their wrists,
0:16:55 survivors often get tattoos to cover the scars.
0:17:00 Maybe teens should skip social media and just get tattoos.
0:17:04 I warned you, dark.
0:17:09 Life is so rich.
0:17:13 (gentle music)