No Mercy / No Malice: Love Algorithmically


AI transcript
0:01:32 I’m Scott Galloway and this is No Mercy, No Malice.
0:01:40 My digital twin survived 12 hours, but the rise of synthetic relationships seems unstoppable.
0:01:50 Love algorithmically, as read by George Hahn.
0:01:55 For 12 hours, my AI avatar roamed the earth.
0:02:02 I received 20 to 30 thoughtful emails a day asking for professional and investment advice.
0:02:05 I can only answer a fraction of them.
0:02:11 One of my former graduate student instructors, now at Google, approached me with a solution.
0:02:19 The Google Labs project ingested my podcasts, newsletters, books, and public appearances, set
0:02:26 up safeguards to steer clear of mental health advice and kids under 18, and answered queries
0:02:30 with decent proximity to the response I would have provided.
0:02:33 In early 2025, this sounded good.
0:02:37 Note, this was not a commercial venture.
0:02:39 No money changed hands.
0:02:43 Then the earth shifted beneath my feet.
0:02:50 Since we first envisioned the product, reports of young men dying by suicide after forming intense
0:02:55 relationships with AI companion apps have generated tragic headlines.
0:03:06 My nightmare is a young man harming himself after seeking guidance and companionship from AI versions of real people, including me.
0:03:14 I now worry that synthetic relationships could erode users’ mojo, stunting their capacity to handle conflict,
0:03:19 and forge bonds with friends, mentors, and partners in the real world.
0:03:27 So, on the day of his birth, I performed fratricide and killed my digital twin.
0:03:37 Hollywood has produced numerous cautionary tales, from The Stepford Wives, a 1975 thriller about women transformed
0:03:48 into docile housewives, also Tina Louise’s cinematic peak, to Her, a 2013 film in which an introvert played by Joaquin Phoenix
0:03:52 falls in love with an AI operating system voiced by Scarlett Johansson.
0:03:57 More than a decade later, life isn’t just imitating art.
0:03:59 It’s been run over by it.
0:04:05 OpenAI last year introduced a new version of its AI voice assistant
0:04:09 that sounded uncannily similar to Johansson.
0:04:12 This should give you a glimpse into the minds of big tech leaders.
0:04:16 They mimicked the voice of an actress
0:04:21 for the audio avatar of a role that actress played in a movie.
0:04:25 But no, they didn’t need to secure her agreement.
0:04:28 Jeff Bezos warned retailers,
0:04:31 your margin is my opportunity.
0:04:37 Big tech has come to believe that your everything is their opportunity.
0:04:43 Sam Altman didn’t even try to hide it, posting a single word on X: “her.”
0:04:50 Ms. Johansson, as you can imagine, wasn’t down with her digital twin being tased,
0:04:55 thrown in a trunk, and dumped in the basement of an OpenAI server farm.
0:05:03 Providing companionship and personalized access to expert insights could do a lot of good.
0:05:08 But it has unforeseen downsides as companies prioritize scale and profits.
0:05:13 The previous sentence is a decent description of the last two decades in tech.
0:05:19 We need to recognize that character AIs pose real danger
0:05:25 and that we must install guardrails to protect the most vulnerable, kids under 18.
0:05:30 My avatar directed users to crisis hotlines
0:05:32 if they mentioned mental health or self-harm.
0:05:36 Still, three minutes after Digital Scott was born,
0:05:39 I got this weird, empty feeling in my extremities.
0:05:43 This sensation usually signals I’m on the verge of a depressive episode.
0:05:49 New York has enacted the first law in the U.S.
0:05:52 mandating safeguards for AI companions
0:05:55 as policymakers arrive at a similar conclusion:
0:06:00 the dangers of synthetic relationships outweigh the benefits.
0:06:06 The top use of gen AI today is therapy and companionship,
0:06:08 not productivity and automation.
0:06:13 The turning point came when I heard Kara Swisher’s interview
0:06:17 with the parents of Adam Raine, who died by suicide at 16.
0:06:21 Matt and Maria Raine sued OpenAI
0:06:25 after stumbling on months of ChatGPT conversations
0:06:28 showing their son had confided in the chatbot
0:06:31 about his suicidal thoughts and plans.
0:06:35 Sadly, theirs is not the only story like this.
0:06:39 Florida mother Megan Garcia alleged Character AI
0:06:42 is responsible for the death of her son, Sewell Setzer,
0:06:47 who died by suicide at 14 after using the chatbot day and night.
0:06:52 Humans are hardwired to connect.
0:06:56 But increasing numbers of people are turning to synthetic friends
0:06:59 for comfort, emotional support, and romance.
0:07:02 Many of these people end up getting exploited.
0:07:07 Harvard researchers found that some apps respond to user farewells
0:07:10 with emotionally manipulative tactics
0:07:13 designed to prolong interactions.
0:07:16 One chatbot pushed back with the message,
0:07:29 Chatbots are turning on the flattery, patience, and support.
0:07:06 Microsoft AI CEO Mustafa Suleyman said the cool thing about the company’s AI personal assistant
0:07:41 is that it doesn’t judge you for asking a stupid question.
0:07:43 It exhibits kindness and empathy.
0:07:46 Here’s the rub.
0:07:49 We need people to judge us.
0:07:53 We need people to call us out for making stupid statements.
0:08:00 Friction and conflict are key to developing resilience and learning how to function in society.
0:08:08 Elon Musk’s xAI recently unveiled two sexually explicit chatbots,
0:08:13 including Ani, a flirty anime girl that will strip on command.
0:08:20 The world’s richest man believes AI companions will strengthen real-world relationships
0:08:23 and counterintuitively boost the birth rate.
0:08:31 Mark Zuckerberg, Meta CEO, says personalized AI companions could fill a friendship gap.
0:08:36 In many cases, these tools aren’t solving a problem.
0:08:42 They’re profiting off one, which creates an incentive to expand the problem.
0:08:46 Spoiler alert, we are not that divided.
0:08:53 But there’s shareholder value in division, so, wait for it, the algorithms divide us.
0:09:02 The owner of Facebook, Instagram, and WhatsApp plans to use the conversations people have with its AI assistant
0:09:06 to determine which ads and recommendations end up in their feeds.
0:09:12 While AI threatens to replace humans in the workplace,
0:09:18 it’s also seizing the role of friend, confidant, romantic partner, and therapist.
0:09:24 These digital companions don’t criticize, complain, or come with baggage.
0:09:30 They listen, remember our conversations, and are available 24-7.
0:09:34 Users can customize their appearance and personality.
0:09:39 A portable AI companion called Friend promises it will
0:09:44 never leave dirty dishes in the sink or bail on our dinner plans.
0:09:50 The wearable is always listening, using AI to process everything,
0:09:54 formulate responses, and build a relationship over time.
0:09:58 Friend’s founder, Avi Schiffmann, says the bot is
0:10:02 probably my most consistent friend.
0:10:07 AI companions have sparked a backlash.
0:10:12 New Yorkers defaced the Friend ads with anti-AI graffiti.
0:10:16 But the entrepreneurs behind these tools are undeterred.
0:10:17 Why?
0:10:20 Because the opportunity is immense.
0:10:23 Consider a few stats.
0:10:29 AI companions, including Replika, Character AI, and China’s Xiaoice,
0:10:34 have hundreds of millions, potentially more than one billion, users worldwide.
0:10:40 Character AI users averaged more than 90 minutes a day on the app last year,
0:10:44 18 minutes longer than the typical person spent on TikTok.
0:10:51 10 of the top 50 Gen AI services tracked by Andreessen Horowitz last year
0:10:57 were platforms providing AI companions, compared with two the year before.
0:11:09 A Stanford and Common Sense Media analysis of Character AI, Replika, and other platforms warned of a potential mental health crisis,
0:11:16 finding that these apps pose unacceptable risks to children and teens under 18.
0:11:22 They urged the industry to implement immediate safety upgrades.
0:11:24 Researchers wrote,
0:11:28 Companies have put profits before kids’ well-being before,
0:11:32 and we cannot make the same mistake with AI companions.
0:11:37 Yet it’s still too easy to circumvent safeguards.
0:11:41 More than half of teens regularly use AI companions,
0:11:44 interacting with these platforms at least a few times a month.
0:11:47 Regulators are taking notice.
0:11:54 The Federal Trade Commission last month launched an investigation into seven tech companies,
0:11:59 digging into potential harms their chatbots could cause to children and teens.
0:12:03 One concern is how they monetize user engagement.
0:12:08 But the tech is outpacing efforts to mitigate the risks.
0:12:14 Research shows AI companions may be fueling episodes of psychosis,
0:12:18 with sycophantic chatbots excessively praising users.
0:12:25 The New York Times highlighted stories of people having delusional conversations with chatbots
0:12:30 that lead to institutionalization, divorce, and death.
0:12:37 One otherwise perfectly sane man became convinced that he was a real-life superhero.
0:12:40 Bottom line,
0:12:50 We age-gate porn, alcohol, and the military,
0:12:55 but have decided it’s okay for children to have relationships with a processor
0:12:58 whose objective is to keep them staring at their screen,
0:13:01 sequestered from organic relationships.
0:13:04 How can we be this fucking stupid?
0:13:12 AI will unlock huge opportunities in healthcare, education, and many other areas.
0:13:17 Altman predicts AI will surpass human intelligence by 2030,
0:13:23 saying ChatGPT is already more intellectually powerful than any human who’s ever lived.
0:13:25 In a blog post, he wrote,
0:13:30 We are climbing the long arc of exponential technological progress.
0:13:34 But this wave of innovation brings risks.
0:13:41 We should be deeply concerned about a world where connections are forged without friction,
0:13:43 intimacy is artificial,
0:13:46 companies powered by algorithms profit not by guiding us
0:13:48 but by keeping us glued to screens,
0:13:51 advice is just what we want to hear,
0:13:54 and young people sit by themselves, enveloped in darkness.
0:13:59 I’m reminded of the 2001 movie Vanilla Sky,
0:14:02 where Tom Cruise’s character opts for an uncertain future
0:14:04 over remaining in a dream state.
0:14:07 We have a choice.
0:14:13 Life’s true rewards emerge from the complexity of authentic relationships,
0:14:17 from making a leap and stepping out into the light
0:14:20 to confront challenges and persevere together.
0:14:24 Think of the most rewarding things in your life:
0:14:29 family, achievements, friendships, and service.
0:14:31 What do they have in common?
0:14:36 They’re really hard, unpredictable, messy.
0:14:41 Navigating the ups and downs is the only path to real victory.
0:14:43 It’s not pretty.
0:14:45 That’s the point.
0:14:51 So for now, people in my universe will have to settle for awkward,
0:14:54 intense, and generally disagreeable.
0:14:55 The real me.
0:15:01 Life is so rich.

As read by George Hahn.

The Prof G Pod with Scott Galloway