AI transcript
0:00:03 Support for the show comes from ServiceNow,
0:00:05 the AI platform for business transformation.
0:00:07 You’ve heard the big hype around AI.
0:00:09 The truth is AI is only as powerful
0:00:11 as the platform it’s built into.
0:00:13 ServiceNow is the platform that puts AI
0:00:15 to work for people across your business,
0:00:18 removing friction and frustration for your employees,
0:00:20 supercharging productivity for your developers,
0:00:22 providing intelligent tools for your service agents
0:00:24 to make customers happier.
0:00:27 All built into a single platform you can use right now.
0:00:29 That’s why the world works with ServiceNow.
0:00:33 Visit servicenow.com/aiforpeople to learn more.
0:00:39 Support for this show comes from Constant Contact.
0:00:42 If you struggle just to get your customers to notice you,
0:00:46 Constant Contact has what you need to grab their attention.
0:00:49 Constant Contact’s award-winning marketing platform
0:00:53 offers all the automation, integration and reporting tools
0:00:55 that get your marketing running seamlessly,
0:00:59 all backed by their expert live customer support.
0:01:01 It’s time to get going and growing
0:01:03 with Constant Contact today.
0:01:06 Ready, set, grow.
0:01:10 Go to ConstantContact.ca and start your free trial today.
0:01:14 Go to ConstantContact.ca for your free trial,
0:01:17 ConstantContact.ca.
0:01:23 Thumbtack presents the ins and outs of caring for your home.
0:01:26 Out, procrastination,
0:01:30 putting it off, kicking the can down the road.
0:01:33 In, plans and guides that make it easy
0:01:35 to get home projects done.
0:01:39 Out, carpet in the bathroom, like why.
0:01:45 In, knowing what to do, when to do it, and who to hire.
0:01:49 Start caring for your home with confidence.
0:01:50 Download Thumbtack today.
0:01:57 Episode 326. 326 is the area code serving southwestern Ohio.
0:02:00 In 1926, the first SATs took place.
0:02:03 Latest exam for me, a prostate exam.
0:02:05 My doctor told me it’s perfectly normal
0:02:08 to become aroused and even ejaculate.
0:02:12 That being said, I still wish he hadn’t.
0:02:18 Go, go, go.
0:02:21 (upbeat music)
0:02:26 – Welcome to the 326th episode of The Prof G Pod.
0:02:28 In today’s episode, we speak with Eric Schmidt,
0:02:30 a technologist, entrepreneur, and philanthropist.
0:02:32 He also previously served as Google’s
0:02:33 chief executive officer.
0:02:34 I don’t know if you’ve heard of him.
0:02:35 It’s a tech company.
0:02:37 You can actually go there and type in your own name
0:02:39 and you see what the world thinks of you.
0:02:42 Later, he was the executive chairman and technical advisor.
0:02:44 We discussed with Eric the dangers
0:02:47 and opportunities AI presents, the subject of his latest book,
0:02:49 “Genesis: Artificial Intelligence,
0:02:53 Hope, and the Human Spirit.”
0:02:55 Well, that sounds like a show
0:02:58 on the Hallmark channel in hell.
0:02:59 Okay, what’s happening?
0:03:02 Off to Vegas this week, I’ve been at Summit.
0:03:03 It’s beautiful here.
0:03:04 It’s lovely.
0:03:07 I love kind of the western Baja sky or light.
0:03:09 I think I may retire here.
0:03:11 When I retire in Mexico, I think the food’s amazing.
0:03:13 The people are just incredibly cool.
0:03:14 The service is amazing. No joke,
0:03:18 I think that Mexico’s the best vacation deal in the world.
0:03:20 Anyways, where am I headed to next?
0:03:21 I go to Vegas tonight.
0:03:24 Not that you asked, I’m doing a talk there tomorrow.
0:03:26 Vegas during the week, not so much fun.
0:03:27 Not so much fun.
0:03:30 There’s definitely kind of an unusual vibe there.
0:03:31 And then I go to LA for a couple of days.
0:03:33 Daddy will be at the Beverly Hills Hotel.
0:03:34 Swing by, say hi.
0:03:37 I’ll be the guy alone at the bar.
0:03:39 Oh, I love eating alone at the Polo Lounge.
0:03:40 How do you know if I like you?
0:03:42 I stare at your shoes instead of mine.
0:03:45 Anyways, then I’m back to Vegas for Formula One,
0:03:47 which I am so excited about.
0:03:48 I love it.
0:03:49 The city comes alive.
0:03:51 And then, just ’cause I know you like to keep up
0:03:53 with my travels, I head to Sao Paulo,
0:03:54 where the nicest hotel in the world is right now.
0:03:56 I think the Rosewood in Sao Paulo.
0:03:59 I think Rosewood is actually the best brand
0:04:01 in high-end hospitality.
0:04:02 Isn’t that good to know?
0:04:04 A lot of insight here.
0:04:05 A lot of insight.
0:04:07 All right, let’s move on.
0:04:10 Some news in the media and entertainment space.
0:04:13 Netflix said that a record 60 million households worldwide
0:04:14 tuned in to watch the boxing match
0:04:15 between Jake Paul and Mike Tyson.
0:04:17 I’m sorry.
0:04:18 I’m sorry.
0:04:19 Just a quick announcement.
0:04:21 This is very exciting.
0:04:24 I just struck a deal as I told you I’m going to LA.
0:04:26 And you’re the first to know that Hulu has announced
0:04:28 it’ll be live streaming a fight between me and Jimmy Carter.
0:04:29 (bell dings)
0:04:32 By the way, if you get paid $20 million,
0:04:33 I don’t know what Tyson was paid.
0:04:34 I think it was $20 million.
0:04:37 You have an obligation to either kick the shit out
0:04:39 of someone or have the shit kicked out of you.
0:04:42 This kind of jabbing, snorting through your nose,
0:04:43 and just staying away from the guy?
0:04:44 I don’t buy it.
0:04:47 I want my $12 back Netflix.
0:04:50 Despite the disappointment in the fight,
0:04:53 Jake Paul did in fact defeat Mike Tyson in eight rounds.
0:04:54 Can you even call it a win?
0:04:56 Can you?
0:04:58 The fight was shown in over 6,000 bars and restaurants
0:04:59 across the U.S., breaking the record
0:05:03 for the biggest commercial distribution in the sport.
0:05:05 But the record numbers came with a few hiccups.
0:05:06 Viewers reported various tech issues,
0:05:10 including slow loading times, pixelated screens,
0:05:13 and a malfunctioning earpiece from one of the commentators.
0:05:14 That’s a weird one.
0:05:16 A malfunctioning earpiece from one of the commentators.
0:05:20 Data from Down Detector revealed that user-reported outages
0:05:21 peaked at more than 95,000
0:05:23 around 11 p.m. Eastern time.
0:05:25 Frustrated fans flooded social media,
0:05:28 criticizing Netflix for the poor streaming quality.
0:05:33 Netflix CTO Elizabeth Stone, soon to be probably former CTO,
0:05:34 wrote to employees, “I’m sure many of you
0:05:35 “have seen the chatter in the press
0:05:37 “and on social media about the quality issues.
0:05:39 “We don’t want to dismiss the poor experience
0:05:41 “of some members and know we have room for improvement,
0:05:43 “but still consider this event a huge success.”
0:05:47 No, that was a pretty big fuck-up for you, Ms. Stone.
0:05:50 Specifically, Netflix tries to garner the valuation,
0:05:52 not of a media company, but of a tech company,
0:05:53 which means you’re actually supposed
0:05:54 to be pretty good at this shit.
0:05:56 And didn’t you know exactly how many people
0:05:57 were gonna show up for this?
0:06:00 Didn’t you kind of, weren’t you able to sort of estimate
0:06:03 pretty accurately just exactly how many people
0:06:05 would be dialing in at exactly the same time
0:06:06 and then test the shit out of this?
0:06:09 You’re beginning to smell a little bit like Twitter
0:06:10 during a presidential announcement.
0:06:13 That just is unforgivable for a fucking tech company.
0:06:15 Come on, guys, this is what you do.
0:06:16 This isn’t the first time Netflix
0:06:17 has fumbled with a live event.
0:06:20 Last year, their Love Is Blind reunion show
0:06:21 faced a similar situation,
0:06:22 leaving viewers waiting over an hour
0:06:25 before a recorded version was made available.
0:06:27 And this brings up a bigger question.
0:06:29 With Netflix’s push into live sports,
0:06:31 including NFL games scheduled for Christmas
0:06:34 and a major deal with WWE starting next year,
0:06:36 can they deliver the kind of quality viewers expect,
0:06:38 the quality they get from broadcast and cable?
0:06:40 It looks like what’s old is new again.
0:06:41 We have taken for granted
0:06:44 the production quality of live TV
0:06:45 and how difficult it is.
0:06:49 That’s one thing I’ll say about Morning Joe or The View,
0:06:51 or even Fox, I think they do a great job.
0:06:53 They’re great at delivering live TV.
0:06:56 I think CNN also does a fantastic job.
0:06:57 Netflix isn’t alone.
0:07:00 Other streaming platforms including Comcast’s Peacock
0:07:02 have also been getting into live sports.
0:07:04 Earlier this year, Peacock’s January playoff game
0:07:06 between the Kansas City Chiefs and Miami Dolphins
0:07:08 drew 23 million viewers,
0:07:10 which broke records for internet usage in the US.
0:07:14 Get this, the game was responsible for 30%
0:07:15 of internet traffic that night.
0:07:16 That’s like Squid Game.
0:07:19 This is all proof that the market for live sports
0:07:21 on streaming platforms is a massive opportunity
0:07:23 and companies are willing to spend big.
0:07:24 According to the Wall Street Journal,
0:07:27 Netflix is paying around $75 million
0:07:28 per NFL game this season.
0:07:30 They also recently signed a 10-year,
0:07:31 $5 billion deal with WWE.
0:07:35 It used to be that live events and sports
0:07:38 were sort of the last walls to be breached
0:07:39 in broadcast and cable.
0:07:40 Like we’ll always have sports.
0:07:42 And then the people with the cheapest capital
0:07:43 in the deepest pockets showed up and said,
0:07:45 “Hey, we’ll take Thursday night football.
0:07:48 Hey, we’ll take the Logan Paul or Jake Paul,
0:07:49 is it Jake or Logan?”
0:07:51 And I can’t remember, anyways.
0:07:54 I mean, literally, broadcast and cable television right now,
0:07:56 it’s like what Mark Twain said about going bankrupt:
0:07:58 slowly, then suddenly.
0:08:01 We’re in the suddenly stage of the decline
0:08:02 of linear ad supported TV.
0:08:06 It has gotten really bad in the last few months.
0:08:11 I had breakfast with the former CEO of CNN,
0:08:12 who’s a lovely guy.
0:08:15 And he said that CNN’s viewership
0:08:17 versus the last election has been cut in half.
0:08:19 Can you imagine trying to explain to advertisers,
0:08:22 our viewership is off 50% since the last time
0:08:24 we were talking about election advertising.
0:08:29 My theory is that the unnatural, unearned torrent of cash
0:08:31 that local news stations have been earning
0:08:33 for the last 20 years is about to go away.
0:08:34 And what are we talking about?
0:08:35 Scott, tell us more.
0:08:36 What are you saying?
0:08:38 Effectively, a lot of smart companies,
0:08:39 including I think Hearst and others
0:08:41 have gone around and bought up these local news stations.
0:08:42 And why?
0:08:43 ‘Cause they’re dying, aren’t they?
0:08:45 Well, yeah, they are.
0:08:46 But old people watch local news,
0:08:48 mostly to get the weather and local sports
0:08:51 because that Jerry Dunphy is just so likable,
0:08:52 and that hot little number.
0:08:55 They always have some old guy with good hair
0:08:58 and broad shoulders who makes you feel comfortable and safe
0:09:00 and some hot woman in her 30s
0:09:04 who’s still waiting for the call up to do daytime TV.
0:09:08 And everybody, old people love this and old people vote.
0:09:09 Now what’s happening?
0:09:11 Okay, so the numbers are in.
0:09:13 A million people watch the best shows on MSNBC,
0:09:16 the average age is 70, it’s mostly white
0:09:17 and it’s mostly women.
0:09:18 So a 70 year old white woman.
0:09:20 Podcasts, 34 year old male.
0:09:21 Think about that.
0:09:23 Also the zeitgeist is different.
0:09:26 People go to cable news to sanctify their religion
0:09:27 or specifically their politics.
0:09:29 People come to podcasts to learn.
0:09:30 The zeitgeist is different.
0:09:34 We try to present our guests in a more aspirational light.
0:09:35 We’re not looking for a gotcha moment
0:09:36 to go live on TikTok.
0:09:38 It’s not, say a clever turn of phrase,
0:09:39 and be done in six minutes
0:09:41 ’cause we gotta break for an opioid-induced-constipation
0:09:43 commercial or Life Alert.
0:09:44 I’ve fallen!
0:09:45 We don’t do that shit.
0:09:48 We sell ZipRecruiter and Athletic Greens
0:09:51 and Fundrise and different kinds of modern, cool stuff
0:09:52 like that.
0:09:53 Also, Vuori.
0:09:54 I’m wearing Vuori shorts right now.
0:09:57 By the way, I fucking love this athleisure.
0:10:00 Oh my God, I look so good in this shit.
0:10:02 Actually, no one really looks good.
0:10:04 No man looks good in athleisure,
0:10:06 but I look less bad than I look in most athleisure.
0:10:08 I love the fabrics.
0:10:09 Not even getting paid to say this.
0:10:11 Wearing it right now.
0:10:13 So let’s talk a little bit about Netflix.
0:10:14 It’s up 81% year to date.
0:10:16 True story, I bought Netflix at 10 bucks a share.
0:10:17 That’s the good news.
0:10:20 The bad news is I sold it at eight bucks a share
0:10:22 and now it’s at $840.
0:10:23 Daddy would be live broadcasting
0:10:25 from his own fucking Gulfstream right now.
0:10:26 Had I not been such a shit head,
0:10:29 I’m gonna find a time machine, get in it, go back,
0:10:31 find me, kill me and then kill myself.
0:10:34 Eight, Jesus, God.
0:10:37 Anyways, Amazon is up 34%.
0:10:38 I do own that stock.
0:10:39 Disney is up 22%.
0:10:40 My stock pick for 2024.
0:10:44 Warner Brothers Discovery down 22%.
0:10:47 Jesus Christ, Malone, you fired the wrong guy, Paramount.
0:10:49 By the way, Zaslav, the guy who was overseeing
0:10:52 a destruction of about 60 or 70% of shareholder value
0:10:53 since he talked a bunch of stupid people
0:10:55 into why this merger made any fucking sense
0:10:57 and took on way too much debt.
0:11:00 He’s managed to pull out about a third of a billion dollars
0:11:01 despite destroying a massive amount
0:11:02 of shareholder value.
0:11:05 Paramount is down 28% year to date.
0:11:07 Comcast is down 2.3%.
0:11:12 Comcast I think is arguably the best run of the cable folks.
0:11:13 Obviously not including Netflix,
0:11:16 which is just a gangster run company.
0:11:19 So Netflix has about 250 million users.
0:11:21 Amazon Prime Video has 200 million.
0:11:22 Is that fair though?
0:11:24 ‘Cause you just automatically get it with Prime.
0:11:27 Disney Plus 150 million.
0:11:29 Max 95.
0:11:33 I love Max, we sold our series into Netflix.
0:11:34 Our big tech drama.
0:11:38 I think most of us would have liked HBO
0:11:40 just ’cause HBO has a certain culture
0:11:42 that feeds kind of the water cooler.
0:11:45 When you’re talking about something in streaming media,
0:11:47 you’re usually talking about something on Max,
0:11:49 but Netflix has also got bigger reach.
0:11:51 These are good problems.
0:11:54 Paramount Plus is at 63 million.
0:11:57 Hulu 49, Peacock 28.
0:12:02 ESPN Plus at 26, Apple TV at 25.
0:12:05 And then Starz, remember them? At 16 million.
0:12:07 Effectively these guys have cheaper capital.
0:12:09 They’re absolutely killing linear TV.
0:12:11 Does that mean it’s a bad business now?
0:12:13 Someone’s gonna come in and roll up all of these assets
0:12:18 between the old Viacom assets, CNN, Turner,
0:12:22 all the Disney shit, ABC.
0:12:23 They’re gonna roll them all up.
0:12:26 Milk ’em for their cash flow, cut costs faster
0:12:28 than the revenue declines.
0:12:29 These businesses,
0:12:30 while they seem to be going out of business
0:12:33 pretty fast right now, it’ll probably level out.
0:12:35 AOL’s still a small but great business.
0:12:37 I think it does something like four or $500 million
0:12:38 in EBITDA ’cause there’s still a lot of people
0:12:40 that depend on AOL in rural areas
0:12:42 for their dial-up internet.
0:12:46 And some people will kind of hang in there, if you will.
0:12:48 But this is gonna be a distress play.
0:12:50 They’re gonna stop this consensual hallucination
0:12:52 that these things are gonna ever grow again.
0:12:54 They’ll consolidate them to start cutting costs.
0:12:57 One of the best investments I’ve ever made, yellow pages.
0:12:58 We bought a yellow pages company
0:13:01 for about two or two and a half times cash flow.
0:13:05 Yeah, it’s going down by eight to 12% a year.
0:13:06 But if you cut costs faster than that
0:13:09 by going and buying the other shitty yellow pages companies
0:13:10 and then consolidating the staff,
0:13:12 which is Latin for laying off people,
0:13:15 and you can cut costs faster than 8%,
0:13:17 you have an increase in EBITDA every year.
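To make the arithmetic concrete, here is a minimal sketch of that cut-costs-faster-than-revenue-declines math. The revenue, cost, and percentage figures below are illustrative assumptions, not numbers from the actual yellow pages deal.

```python
# Illustrative sketch: EBITDA grows when costs are cut faster than revenue falls.
# All figures are made-up assumptions for demonstration only.

def ebitda_path(revenue, costs, revenue_decline, cost_cut, years):
    """Yield EBITDA each year as revenue shrinks and costs are cut faster."""
    for _ in range(years):
        yield revenue - costs
        revenue *= (1 - revenue_decline)   # top line falls, e.g. ~10% a year
        costs *= (1 - cost_cut)            # costs cut faster, e.g. ~15% a year

# Assume $100M revenue and $80M costs, i.e. $20M of starting EBITDA.
for year, ebitda in enumerate(ebitda_path(100.0, 80.0, 0.10, 0.15, 5)):
    print(f"Year {year}: EBITDA ${ebitda:.1f}M")
# EBITDA rises every year even though revenue is declining,
# because the cost base shrinks faster than the revenue base.
```

Under those assumed rates, EBITDA climbs from about $20M to roughly $24M over five years while revenue falls by a third, which is the whole logic of the rollup.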
0:13:19 I still find, across the entire asset class,
0:13:21 and this is where I’ll wrap up,
0:13:25 that a basic axiom that holds water
0:13:26 through the test of time around investing
0:13:29 is the sexier it is, the lower the ROI.
0:13:31 And if you look at asset classes
0:13:33 in terms of their sex appeal,
0:13:35 venture investing or angel investing is fun, right?
0:13:37 It’s for what I call FIPS,
0:13:39 formerly important people that wanna stay involved
0:13:40 and wanna help entrepreneurs.
0:13:42 But be clear, the only return you get is psychic.
0:13:45 It is a terrible asset class, even if something works.
0:13:48 And at that stage, it is very hard to predict.
0:13:51 You’re talking about one in seven, maybe, doing well.
0:13:52 And even at that one company,
0:13:54 likely you’ll get washed out along the way:
0:13:56 at a little bump, the VCs show up
0:13:56 and they’ll wash you out.
0:13:59 It is a very tough asset class to make money in.
0:14:02 Venture does better, but the majority of the returns
0:14:04 are not only crowded into a small number of brands
0:14:05 that get all the deal flow,
0:14:06 but a small number of partners
0:14:09 within that small number of firms.
0:14:11 And then you have growth, I think that’s better.
0:14:14 Then you have IPOs, unfortunately IPOs.
0:14:16 That window is really ugly right now.
0:14:20 The IPO market’s basically been in a pretty big deep freeze
0:14:21 for several years now.
0:14:22 And people keep thinking it’s gonna come back.
0:14:25 We got excited about Reddit, but not a lot followed.
0:14:27 And then you go into public company stocks.
0:14:30 It’s impossible to pick stocks by an index fund.
0:14:31 Then you get into distressed
0:14:33 or mature companies, dividend plays.
0:14:35 And then what I love is distressed.
0:14:37 I find that distressed is the best asset class.
0:14:38 Why?
0:14:43 What business has the greatest likelihood of succeeding?
0:14:44 Anything in senior care.
0:14:45 Why?
0:14:46 Again, see the above: the less sexy it is.
0:14:48 People don’t wanna be around old people.
0:14:49 It reminds them of death.
0:14:50 They’re generally pretty boring.
0:14:51 I know I’m supposed to say
0:14:55 they just have so much experience and wisdom sometimes.
0:14:56 And people wanna avoid them.
0:14:58 People wanna hang out with hot young people, right?
0:15:00 And people wanna hang out with hot young companies,
0:15:03 specifically capital wants to hang out
0:15:05 with hot young growing companies.
0:15:06 And they don’t like the way
0:15:09 that old companies smell, so to speak.
0:15:10 So they avoid them.
0:15:12 And that’s why there’s a greater return
0:15:13 on investment in distressed.
0:15:15 What’s the learning here?
0:15:18 Sex appeal and ROI are inversely correlated.
0:15:20 So yeah, if you wanna invest
0:15:21 in a members’ club downtown
0:15:24 for the fashion industry and the music industry, have at it,
0:15:28 but keep in mind ROI and sex appeal are inversely correlated.
0:15:33 We’ll be right back for our conversation with Eric Schmidt.
0:15:38 Support for PropG comes from Mint Mobile.
0:15:40 You’re probably paying too much for your cell phone plan.
0:15:42 It’s one of those budgetary line items
0:15:43 that always looks pretty ugly.
0:15:44 And it might feel like
0:15:46 there’s nothing you can do about it.
0:15:48 That’s where Mint Mobile has something to say.
0:15:49 Mint Mobile’s latest deal might challenge your idea
0:15:51 of what a phone plan costs.
0:15:52 If you make the switch now,
0:15:54 you’ll pay just $15 a month
0:15:56 when you purchase a new three-month phone plan.
0:15:58 All Mint Mobile plans come with high-speed data
0:15:59 and unlimited talk and text delivered
0:16:01 on the nation’s largest 5G network.
0:16:03 You can even keep your phone, your contacts,
0:16:04 and your number.
0:16:06 It doesn’t get much easier than that.
0:16:07 To get this new customer offer
0:16:09 and your new three-month premium wireless plan
0:16:10 for just 15 bucks a month,
0:16:13 you can go to mintmobile.com/profg.
0:16:15 That’s mintmobile.com/profg.
0:16:16 You can cut your wireless bill
0:16:18 to 15 bucks a month at mintmobile.com/profg.
0:16:20 $45 upfront payment required,
0:16:21 equivalent to $15 a month.
0:16:23 New customers on a first three-month plan only.
0:16:25 Speeds slower above 40 gigabytes on the unlimited plan,
0:16:28 additional taxes, fees, and restrictions apply.
0:16:30 See Mint Mobile for details.
0:16:38 Your business is ready for launch.
0:16:40 But what’s the most important thing to do
0:16:42 before those doors open?
0:16:45 Is it getting more social media followers?
0:16:47 Or is it actually legitimizing
0:16:50 and protecting the business you’ve been busy building?
0:16:53 Make it official with LegalZoom.
0:16:55 LegalZoom has everything you need to launch,
0:16:58 run, and protect your business all in one place.
0:17:00 Setting up your business properly
0:17:01 and remaining compliant
0:17:04 are the things you want to get right from the get-go.
0:17:06 And LegalZoom saves you from wasting hours
0:17:09 making sense of the legal stuff.
0:17:11 And if you need some hands-on help,
0:17:12 their network of experienced attorneys
0:17:15 from around the country has your back.
0:17:18 Launch, run, and protect your business
0:17:21 to make it official today at LegalZoom.com.
0:17:24 And use promo code VOXBiz to get 10% off
0:17:26 any LegalZoom business formation product
0:17:29 excluding subscriptions and renewals.
0:17:32 Expires December 31st, 2024.
0:17:34 Get everything you need from setup to success
0:17:36 at LegalZoom.com.
0:17:38 And use promo code VOXBiz.
0:17:40 LegalZoom.com.
0:17:43 And use promo code VOXBiz.
0:17:46 LegalZoom provides access to independent attorneys
0:17:47 and self-service tools.
0:17:48 LegalZoom is not a law firm
0:17:50 and does not provide legal advice
0:17:52 except where authorized through its subsidiary law firm,
0:17:54 LZ Legal Services LLC.
0:17:59 Support for the show comes from 1Password.
0:18:01 How do you make a password that’s strong enough
0:18:04 so no one will guess it and impossible to forget?
0:18:07 And now how can you do it for over 100 different sites
0:18:08 and make it so everyone in your company
0:18:10 can do the exact same thing
0:18:11 without ever needing to reset them?
0:18:13 It’s not impossible.
0:18:15 1Password makes it simple.
0:18:17 1Password combines industry-leading security
0:18:19 with award-winning design to bring private, secure,
0:18:22 and user-friendly password management to everyone.
0:18:25 1Password makes strong security easy for your people
0:18:27 and gives you the visibility you need to take action
0:18:28 when you need to.
0:18:30 A single data breach can cost millions of dollars
0:18:32 while 1Password secures every sign-in
0:18:34 to save you time and money.
0:18:36 And it lets you securely switch
0:18:38 between iPhone, Android, Mac, and PC.
0:18:41 All you have to remember is the one strong account password
0:18:42 that protects everything else.
0:18:44 Your logins, your credit cards, secure notes,
0:18:46 or the office Wi-Fi password.
0:18:49 Right now, our listeners get a free two-week trial
0:18:53 at onepassword.com/prof for your growing business.
0:18:57 That’s two weeks free at onepassword.com/prof.
0:18:59 Don’t let security slow your business down.
0:19:02 Go to onepassword.com/prof.
0:19:05 (upbeat music)
0:19:14 – Welcome back.
0:19:15 Here’s our conversation with Eric Schmidt,
0:19:17 a technologist, entrepreneur, philanthropist,
0:19:19 and Google’s former CEO.
0:19:24 Eric, where does this podcast find you?
0:19:25 – I’m in Boston.
0:19:28 I’m at Harvard and giving a speech to students later today.
0:19:30 – Oh, nice.
0:19:31 So let’s bust right into it.
0:19:33 You have a new book out that you co-authored
0:19:35 with the late Henry Kissinger titled
0:19:37 “Genesis, Artificial Intelligence,
0:19:40 “Hope, and the Human Spirit.”
0:19:41 What is it about this book?
0:19:44 Or give us what you would call the pillars of insight here
0:19:48 that’ll help people understand the evolution of AI.
0:19:51 – Well, the world is full of stories about what AI can do.
0:19:54 And we generally agree with those.
0:19:58 What we believe, however, is that the world is not ready for this.
0:20:00 And there are so many examples,
0:20:04 whether it’s trust, military power, deception,
0:20:07 economic power, the effect on humans,
0:20:12 the effect on children that are relatively poorly explored.
0:20:16 So the reader of this book doesn’t need to understand AI,
0:20:18 but they need to be worried
0:20:20 that this stuff is going to be unmanaged.
0:20:23 Dr. Kissinger was very concerned
0:20:26 that the future should not be left to people like myself.
0:20:30 He believed very strongly that these tools are so powerful
0:20:33 in terms of their effect on human society.
0:20:35 It was important that the decisions be made
0:20:37 by more than just the tech people.
0:20:40 And the book is really a discussion
0:20:43 about what happens to the structure of organizations,
0:20:46 the structure of jobs, the structure of power,
0:20:49 and all the things that people worry about.
0:20:54 I personally believe that this will happen much, much more
0:20:57 quickly than societies are ready for,
0:20:59 including in the United States and China.
0:21:02 It’s happening very fast.
0:21:04 – And what do you see as the real existential threats here?
0:21:07 Is it that it becomes sentient?
0:21:11 Is it misinformation, income inequality, loneliness?
0:21:14 What do you think are the kind of first and foremost
0:21:17 biggest concerns you have about this rapid evolution of AI?
0:21:20 – There are many things to worry about.
0:21:22 Before we say the bad things,
0:21:25 let me remind you enormous improvements
0:21:28 in drug capability for healthcare,
0:21:31 solutions to climate change, better vehicles,
0:21:33 huge discoveries in science,
0:21:36 greater productivity for kind of everyone,
0:21:38 a universal doctor, a universal educator,
0:21:40 all of these things are coming.
0:21:42 And those are fantastic.
0:21:47 Along with that comes the fact that, because these are very powerful,
0:21:49 especially in the hands of an evil person
0:21:51 and we know evil exists,
0:21:55 these systems can be used to harm large numbers of people.
0:21:58 The most obvious one is their use in biology.
0:22:00 Can these systems at some point in the future
0:22:02 generate biological pathogens
0:22:06 that could harm many, many, many, many humans?
0:22:08 Today, we’re quite sure they can’t,
0:22:09 but there’s a lot of people who think
0:22:12 that they will be able to unless we take some action.
0:22:15 Those actions are being worked on now.
0:22:16 What about cyber attacks?
0:22:18 You have a lone actor, a terrorist group,
0:22:22 North Korea, whomever, whatever your evil person or group is,
0:22:25 and they decide to take down the financial system
0:22:28 using a previously unknown attack vector,
0:22:30 so-called zero-day exploits.
0:22:33 So the systems are so powerful
0:22:36 that we are quite concerned
0:22:39 that in addition to democracies using them for gains,
0:22:42 dictators will use them to aggregate power
0:22:45 and they’ll be used in a harmful and military context.
0:22:49 So I’m freaked out about these AI girlfriends.
0:22:52 I feel as if the biggest threat in the U.S. right now
0:22:55 is loneliness that leads to extremism,
0:22:59 and I see these AI girlfriends and AI searches popping up,
0:23:02 and I see a lot of young men who have a lack of romantic
0:23:06 or economic opportunities turning to AI girlfriends
0:23:09 and beginning to sequester from real relationships,
0:23:11 and they become less likely to believe in climate change,
0:23:14 more likely to engage in misogynistic content,
0:23:17 sequester from school, their parents, work,
0:23:20 and some of them become really shitty citizens.
0:23:23 And I think men, young men are having so much trouble
0:23:28 that this low risk entry into these faux relationships
0:23:30 is just gonna speedball loneliness
0:23:32 and the externalities of loneliness.
0:23:33 Your thoughts?
0:23:35 – I completely agree.
0:23:38 There’s lots of evidence that there’s now a problem
0:23:39 with young men.
0:23:42 In many cases, the path to success for young men
0:23:46 has been, shall we say, made more difficult
0:23:48 because they’re not as educated as the women are now.
0:23:50 Remember, there are more women in college than men,
0:23:55 and many of the traditional paths are no longer as available.
0:23:58 And so they turn to the online world
0:24:01 for enjoyment and sustenance,
0:24:03 but also because of the social media algorithms,
0:24:07 they find like-minded people who ultimately radicalize them
0:24:10 either in a horrific way like terrorism
0:24:12 or in the kind of way that you’re describing,
0:24:13 where they’re just maladjusted.
0:24:18 This is a good example of an unexpected problem
0:24:20 of existing technology.
0:24:24 So now imagine that the AI girlfriend or boyfriend,
0:24:26 although we’ll use the AI girlfriend as the example,
0:24:31 is perfect, perfect visually, perfect emotionally.
0:24:35 And the AI girlfriend in this case captures your mind
0:24:39 as a man to the point where she or whatever it is
0:24:41 takes over the way you’re thinking.
0:24:44 You’re obsessed with her.
0:24:46 That kind of obsession is possible,
0:24:49 especially for people who are not fully formed.
0:24:51 Parents are going to have to be more involved
0:24:52 for all the obvious reasons,
0:24:53 but at the end of the day,
0:24:56 parents can only control what their sons and daughters
0:24:58 are doing within reason.
0:25:01 We’ve ended up, again, using teenagers as an example.
0:25:05 We have all sorts of rules about age of maturity, 16, 18,
0:25:07 what have you, 21 in some cases,
0:25:10 and yet you put a 12 or 13 year old
0:25:11 in front of one of these things
0:25:13 and they have access to every evil
0:25:14 as well as every good in the world
0:25:16 and they’re not ready to take it.
0:25:17 So I think the general question of,
0:25:21 are you mature enough to handle it?
0:25:24 Sort of the general version of your AI girlfriend example
0:25:26 is unresolved.
0:25:28 – So I think people, most people would agree
0:25:30 that the pace of AI is scary
0:25:35 and that our institutions and our ability to regulate
0:25:37 are not keeping up with the pace of evolution here.
0:25:40 And we saw exactly what happened
0:25:41 with social around this.
0:25:43 What can be done?
0:25:47 How, what’s an example or a construct or framework
0:25:50 that you can point to where we get the good stuff,
0:25:54 the drug discovery, the help with climate change,
0:25:57 but attempt to screen out or at least put in check
0:26:00 or put in some guardrails around the bad stuff.
0:26:03 What are you advocating for?
0:26:06 – I think it starts with having an honest conversation
0:26:08 of where the problems come from.
0:26:11 So you have people who are absolutist on free speech,
0:26:14 which I happen to agree with,
0:26:17 but they confuse free speech of an individual
0:26:19 versus free speech for a computer.
0:26:23 I am strongly in favor of free speech for every human.
0:26:26 I am not in favor of free speech for computers.
0:26:30 And the algorithms are not necessarily optimizing
0:26:32 the best thing for humanity.
0:26:35 So as a general point, specifically,
0:26:37 we’re going to have to have some conversations
0:26:40 about at what age things are appropriate.
0:26:42 And we’re also going to have to change some of the laws,
0:26:44 for example, section 230,
0:26:48 to allow for liability in the worst possible cases.
0:26:52 So when someone is harmed from this technology,
0:26:54 we need to have a solution to prevent further harm.
0:26:57 Every new invention has created harm.
0:26:58 Think about cars, right?
0:27:02 So cars used to hit everything and they were very unsafe.
0:27:04 Now cars are really quite safe.
0:27:08 Certainly by comparison to anything in history.
0:27:10 So the history of these inventions
0:27:13 is that you allow for the greatness
0:27:16 and you police the guardrails.
0:27:19 You put limits on what they can do.
0:27:21 And it’s an appropriate debate,
0:27:23 but it’s one that we have to have now for this technology.
0:27:26 I’m particularly concerned about the issue
0:27:28 that you mentioned earlier
0:27:31 about the effect on the human psyche.
0:27:34 Dr. Kissinger, who studied Kant,
0:27:37 was very concerned, and we write in the book at some length,
0:27:40 about what happens when your worldview
0:27:45 is taken over by a computer as opposed to your friends, right?
0:27:49 Isolated, the computer is feeding you stuff.
0:27:53 It’s not optimized around human values, good or bad.
0:27:55 God knows what it’s trying to do.
0:27:57 It’s trying to make money or something.
0:27:59 That’s not a good answer.
0:28:01 – So I think most reasonable people would say,
0:28:04 “Okay, fossil fuels are sort of a net good.”
0:28:06 I would argue pesticides are a net good,
0:28:10 but we have emission standards and an FDA.
0:28:12 Most people would, I think, loosely agree
0:28:15 or mostly agree that some sort of regulation
0:28:18 that keeps these things in check makes sense.
0:28:20 Now, let’s talk about big tech,
0:28:22 which you were an instrumental player in.
0:28:24 You guys figured out a way, quite frankly,
0:28:27 to overrun Washington with lobbyists
0:28:30 and avoid all reasonable regulation.
0:28:31 Why are things gonna be different now
0:28:33 than what they were in your industry
0:28:35 when you were involved in it?
0:28:37 – Well, President Trump has indicated
0:28:41 that he is likely to repeal the executive order
0:28:43 that came out of President Biden,
0:28:45 which was an attempt at this.
0:28:49 So I think a fair prediction is that for the next four years,
0:28:51 there’ll be very little regulation in this area
0:28:54 as the president will be focused on other things.
0:28:57 So what will happen in those companies
0:29:00 is if there is real harm, there’s liability,
0:29:02 there’s lawsuits and things.
0:29:04 So the companies are not completely scot-free.
0:29:07 Our companies, remember, are economic agents
0:29:09 and they have lawyers whose jobs are to protect
0:29:12 their intellectual property and their goals.
0:29:14 So it’s gonna take, I’m sorry to say,
0:29:18 it’s likely to take some kind of a calamity
0:29:21 to cause a change in regulation.
0:29:23 And I remember when I was in California,
0:29:27 when I was younger, California driver’s licenses,
0:29:30 the address on your driver’s license was public
0:29:31 and there was a horrific crime
0:29:33 where a woman was followed to her home
0:29:36 and then she was murdered based on that information.
0:29:38 And then they changed the law.
0:29:42 And my reaction was, didn’t you foresee this, right?
0:29:46 You put millions and millions of license information
0:29:49 to the public and you don’t think that some idiot
0:29:51 who’s horrific is gonna harm somebody.
0:29:54 So my frustration is not that it will occur
0:29:55 because I’m sure it will,
0:29:58 but why did we not anticipate that as an example?
0:30:03 We should anticipate, make a list of the biggest harms.
0:30:05 I’ll give you another example.
0:30:08 These systems should not be allowed access to weapons.
0:30:10 Very simple.
0:30:14 You don’t want the AI deciding when to launch a missile.
0:30:17 You want the human to be responsible.
0:30:20 And these kinds of sensible regulations
0:30:22 are not complicated to state.
0:30:25 – Are you familiar with Character.AI?
0:30:26 – I am.
0:30:30 – Really, just a horrific incident
0:30:33 where a 14-year-old thinks he establishes a relationship
0:30:37 with an AI agent that he thinks is a character
0:30:38 from Game of Thrones.
0:30:40 He’s obviously unwell,
0:30:42 although, my understanding from his mother,
0:30:45 who’s taken this on as an issue, understandably,
0:30:49 is that he did not qualify as someone who was mentally ill.
0:30:52 Establishes this very deep relationship
0:30:55 with obviously a very nuanced character.
0:30:59 And the net effect is he contemplates suicide
0:31:02 and she invites him to do that.
0:31:05 And the story does not end well.
0:31:07 And my view, Eric, is that if we’re waiting
0:31:09 for people’s critical thinking to show up
0:31:11 or for the better angels of CEOs of companies
0:31:13 that are there to make a profit,
0:31:14 that’s what they’re supposed to do.
0:31:15 They’re doing their job.
0:31:19 We’re just gonna have tragedy after tragedy after tragedy.
0:31:22 My sense is someone needs to go to jail.
0:31:23 And in order to do that,
0:31:26 we need to pass laws showing that if you’re reckless
0:31:29 with technology and we can reverse engineer it
0:31:31 to the death of a 14-year-old,
0:31:33 that you are criminally liable.
0:31:35 But I don’t see that happening.
0:31:37 So I would push back on the notion
0:31:39 that people need to think more critically.
0:31:40 That would be lovely.
0:31:42 I don’t see that happening.
0:31:45 I have no evidence that any CEO of a tech company
0:31:47 is gonna do anything but increase the value
0:31:49 of their shares, which I understand
0:31:52 and is a key component of capitalism.
0:31:54 It feels like we need laws
0:31:57 that either remove this liability shield.
0:31:58 I mean, does any of this change
0:32:01 until someone shows up in an orange jumpsuit?
0:32:04 – I can tell you how we dealt with this at Google.
0:32:07 We had a rule that in the morning we would look at things.
0:32:10 And if there was something that looked like real harm,
0:32:12 we would resolve it by noon.
0:32:15 And we would make the necessary adjustments.
0:32:19 The example that you gave is horrific,
0:32:21 but it’s all too common.
0:32:24 And it’s gonna get worse for the following reason.
0:32:26 So now imagine you have a two-year-old
0:32:28 and you have the equivalent of a bear
0:32:30 that is the two-year-old’s best friend.
0:32:32 And every year the bear gets smarter
0:32:34 and the two-year-old gets smarter too,
0:32:37 becomes three, four, five, and so forth.
0:32:40 That now 15-year-old’s best friend
0:32:43 will not be a boy or a girl of the same age.
0:32:45 It’ll be a digital device.
0:32:50 And such people, as highlighted in your terrible example,
0:32:52 are highly suggestible.
0:32:55 So either the people who are building
0:32:58 the equivalent of that bear 10 years from now
0:33:01 are gonna be smart enough to never suggest harm,
0:33:05 or they’re gonna get regulated and criminalized.
0:33:06 Those are the choices.
0:33:09 The technology, I used to say that the internet
0:33:12 is really wonderful, but it’s full of misinformation
0:33:15 and there’s an off button for a reason, turn it off.
0:33:17 I can’t do that anymore.
0:33:20 The internet is so intertwined in our daily lives.
0:33:23 All of us, every one of us, for good and bad,
0:33:25 we can’t get out of the cesspool
0:33:27 if we think it’s a cesspool and we can’t make it better
0:33:29 ’cause it keeps coming at us.
0:33:32 The industry, to answer your question,
0:33:36 the industry is optimized to maximize your attention
0:33:37 and monetize it.
0:33:40 So that behavior is gonna continue.
0:33:43 The question is how do you manage the extreme cases?
0:33:46 Anything involving personal harm of the nature
0:33:49 that you’re describing will be regulated
0:33:50 one way or the other.
0:33:53 – Yeah, at some point, it’s just the damage
0:33:54 we incur until then, right?
0:33:58 We’ve had 40 congressional hearings on child safety
0:34:00 and social media and we’ve had zero laws.
0:34:05 – In fairness to that, there is a very, very extensive set
0:34:08 of laws around child sexual abuse,
0:34:10 which is obviously horrific as well.
0:34:14 And those laws are universally implemented
0:34:16 and well adhered to.
0:34:19 So we do have examples where everyone agrees
0:34:20 what the harm is.
0:34:22 I think all of us would agree that a suicide
0:34:25 of a teenager is not okay.
0:34:27 And so regulating the industry,
0:34:29 so it doesn’t generate that message,
0:34:31 strikes me as a no-brainer.
0:34:34 The ones which will be much harder are where
0:34:38 the system has essentially captured the emotions
0:34:41 of the person and is feeding them back to the person
0:34:43 as opposed to making suggestions.
0:34:46 And that’s, and we talk about this in the book,
0:34:48 when the system is shaping your thinking,
0:34:50 you are being shaped by a computer,
0:34:52 you’re not shaping it.
0:34:54 And because these systems are so powerful,
0:34:57 we worry, and again, we talk about this in the book,
0:35:01 about the impact on the perception of truth and of society.
0:35:02 Who am I?
0:35:03 What do I do?
0:35:06 And ultimately, one of the risks here,
0:35:08 if we don’t get this under control,
0:35:11 is that we will be the dogs to the powerful AI
0:35:14 as opposed to us telling the AI what to do.
0:35:17 A simple answer to the question of when
0:35:20 is this: the industry believes that within five to 10 years,
0:35:22 these systems will be so powerful
0:35:25 that they might be able to do self-learning.
0:35:27 And this is the point where the system begins
0:35:29 to have its own actions, its own evolution.
0:35:32 It’s called general intelligence,
0:35:34 AGI as it’s called.
0:35:37 And the arrival of AGI will need to be regulated.
0:35:39 We’ll be right back.
0:35:42 (upbeat music)
0:35:44 Support for PropG comes from Miro.
0:35:46 While a lot of CEOs believe that innovation
0:35:48 is the lifeblood of business,
0:35:50 very few of them actually see their team unlock
0:35:52 the creativity needed to innovate.
0:35:54 A lot of times that’s because once you’ve moved
0:35:56 from discovery and ideation of product development,
0:35:58 outdated process management tools,
0:35:59 context switching, team alignment
0:36:03 and constant updates massively slow the process.
0:36:06 But now you can take a big step to solving these problems
0:36:09 with the innovation workspace from Miro.
0:36:11 Miro is a workspace where teams can work together
0:36:13 from initial stages of project or product design
0:36:16 all the way to designing and delivering the finished product.
0:36:18 Powered by AI, Miro can help teams increase the speed
0:36:20 of their work by generating AI-powered summaries,
0:36:22 product briefs and research insights
0:36:24 in the early stages of development.
0:36:28 Then move to prototypes, process flows and diagrams.
0:36:30 And once there, execute those tasks with timelines
0:36:33 and project trackers all in a single shared space.
0:36:36 Whether you work in product design, engineering, UX, agile
0:36:39 or marketing, bring your team together on Miro.
0:36:40 Your first three Miro boards are free
0:36:43 when you sign up today at Miro.com.
0:36:46 That’s three free boards at M-I-R-O.com.
0:36:54 (gentle music)
0:36:57 – Autograph collection hotels offer over 300
0:36:59 independent hotels around the world,
0:37:02 each exactly like nothing else.
0:37:04 Hand selected for their inherent craft,
0:37:07 each hotel tells its own unique story
0:37:10 through distinctive design and immersive experiences
0:37:13 from medieval falconry to volcanic wine tasting.
0:37:16 Autograph collection is part of the Marriott Bonvoy portfolio
0:37:20 of over 30 hotel brands around the world.
0:37:23 Find the unforgettable at autographcollection.com.
0:37:28 Support for PropG comes from Fundrise.
0:37:29 Artificial Intelligence is poised to be
0:37:31 one of the biggest wealth creation events in history.
0:37:34 Some experts expect AI to add more than $15 trillion
0:37:37 to the global economy by 2030.
0:37:38 Unfortunately, your portfolio
0:37:40 probably doesn’t own the biggest names in AI.
0:37:42 That’s because most of the AI revolution
0:37:45 is largely being built and funded in private markets.
0:37:47 That means the vast majority of AI startups
0:37:49 are going to be backed and owned by venture capitalists,
0:37:50 not public investors.
0:37:53 But with the launch of the Fundrise Innovation Fund last year,
0:37:55 you can get in on it now.
0:37:56 The Innovation Fund pairs a $100 million
0:37:59 plus venture portfolio of some of the biggest names in AI
0:38:01 with one of the lowest investment minimums
0:38:03 the venture industry has ever seen.
0:38:06 Get in early at fundrise.com/propg.
0:38:09 Carefully consider the investment material before investing,
0:38:11 including objectives, risks, charges, and expenses.
0:38:13 This and other information can be found
0:38:15 at the Innovation Fund’s prospectus
0:38:17 at fundrise.com/innovation.
0:38:20 This is a paid advertisement.
0:38:29 – We know that social media
0:38:31 and a lot of these platforms and apps
0:38:34 and time on the phone is just not a good idea.
0:38:36 I’m curious what you think of my colleague’s work,
0:38:37 Jonathan Haidt, and that is,
0:38:40 is there any reason for anyone under the age of 14
0:38:41 to have a smartphone?
0:38:44 And is there any reason for anyone under the age of 16
0:38:45 to be on social media?
0:38:48 Shouldn’t we age-gate pornography, alcohol, the military?
0:38:52 Shouldn’t we, specifically the device makers
0:38:55 and the operating systems, including your old firm,
0:38:58 shouldn’t they get in the business of age-gating?
0:38:59 – They should.
0:39:02 Indeed, Jonathan’s work is incredible.
0:39:05 He and I wrote an article together two years ago,
0:39:07 which called for a number of things
0:39:09 in the area of regulating social media.
0:39:12 And we start with changing a law called COPPA
0:39:15 from 13 to 16.
0:39:17 And we are quite convinced
0:39:19 that using various techniques,
0:39:21 we can determine the age of the person
0:39:23 with a little bit of work.
0:39:25 And so people say, well, you can’t implement it.
0:39:27 Well, that doesn’t mean you shouldn’t try.
0:39:30 And so we believe that at least the pernicious effects
0:39:34 of this technology on below 16 can be addressed.
0:39:36 When I think about all of this,
0:39:39 to me, we want children to be able to grow up
0:39:42 and grow up with humans as friends.
0:39:46 And I’m sure with the power of AI arrival,
0:39:48 that you’re gonna see a lot of regulation
0:39:51 about child content.
0:39:54 What can a child below 16 see?
0:39:56 This does not answer the question of what do you do
0:39:58 with the 20 year old, right?
0:40:00 Who is also still being shaped.
0:40:03 And as we know, men develop a little bit later than women.
0:40:05 And so let’s focus on the underdeveloped man
0:40:08 who’s having trouble in college or what have you.
0:40:09 What do we do with them?
0:40:11 And that question remains open.
0:40:16 – In terms of the idea that the genie is out of the bottle
0:40:19 here, we face a very real tension.
0:40:22 And that is we wanna regulate it.
0:40:23 We wanna put in guardrails.
0:40:28 At the same time, we wanna let our sprinters and our IP
0:40:29 and our minds and our universities
0:40:31 and our incredible for profit machine,
0:40:34 we wanna let it run, right?
0:40:37 And the fear is that if you regulate it too much,
0:40:41 the Chinese or the Islamic Republic
0:40:43 isn’t quite as concerned
0:40:46 and gets ahead of us on this technology.
0:40:48 How do you balance that tension?
0:40:51 – So there are quite a few people in the industry,
0:40:53 along with myself who are working on this.
0:40:58 And the general idea is relatively light regulation
0:41:01 looking for the extreme cases.
0:41:03 So the worst of the extreme events
0:41:06 would be a biological attack, a cyber attack,
0:41:08 something that harmed a lot of people
0:41:10 as opposed to a single individual,
0:41:11 which is always a tragedy.
0:41:14 Any misuse of these in war,
0:41:17 any of those kinds of things we worry a lot about.
0:41:19 And there’s a lot of questions here.
0:41:24 One of them is, do you think that if we had an AGI system
0:41:30 that developed a way to kill all of the soldiers
0:41:33 from the opposition in one day that it would be used?
0:41:36 And I think the answer from a military general perspective
0:41:37 would be yes.
0:41:40 The next question is, do you think that the North Koreans,
0:41:44 for example, or the Chinese would obey the same rules
0:41:45 about when to apply that?
0:41:47 And the answer is no one believes
0:41:50 that they would do it safely and carefully
0:41:52 under the way the US law would require.
0:41:56 The US has a law called person in the loop,
0:41:58 or meaningful human control,
0:42:01 that tries to keep these things from going out of hand.
0:42:06 So what I actually think is that we don’t have a theory
0:42:09 of deterrence with these new tools.
0:42:13 We don’t know how to deal with the spread of them.
0:42:16 And the simple example,
0:42:17 and sorry for the diversion for a sec,
0:42:19 but there’s closed source and open source.
0:42:22 Closed is like you can use it,
0:42:25 but the software and the numbers are not available.
0:42:27 There are other systems called open source
0:42:29 where everything is published.
0:42:32 China now has two of what appear to be
0:42:35 the most powerful models ever made
0:42:37 and they’re completely open.
0:42:39 And we’re obviously, you and I are not in China
0:42:42 and I don’t know why China made a decision to release them,
0:42:45 but surely evil groups and so forth
0:42:46 will start to use those.
0:42:49 Now maybe they don’t speak Chinese or what have you,
0:42:52 or maybe the Chinese just discount the risk,
0:42:55 but there’s a real risk of proliferation of systems
0:42:57 in the hands of terrorism.
0:43:00 And proliferation is not gonna occur
0:43:03 by misusing Microsoft or Google or what have you.
0:43:05 It’s going to be by making their own servers
0:43:06 in the dark web.
0:43:08 And an example, a worry that we all have
0:43:10 is exfiltration of the models.
0:43:14 I’ll give an example, Google or Microsoft or OpenAI
0:43:16 spends $200 million or something
0:43:18 to build one of these models, they’re very powerful.
0:43:22 And then some evil actor manages to exfiltrate it
0:43:25 out of those companies and put it on the dark web.
0:43:29 We have no theory of what to do when that occurs.
0:43:31 Because we don’t control the dark web,
0:43:34 we don’t know how to detect it and so forth.
0:43:38 In the book we talk about this and say that eventually
0:43:40 the network systems globally will have
0:43:43 fairly sophisticated supervision systems
0:43:45 that will watch for this.
0:43:47 Because it’s another example of proliferation.
0:43:51 It’s analogous to the spread of enriched uranium.
0:43:53 If anyone tried to do that, there’s an awful lot
0:43:55 of monitoring systems that would say
0:43:58 you have to stop right now or we’re gonna shoot you.
0:43:59 – So you make a really cogent argument
0:44:02 for the kind of existential threat here,
0:44:05 the weaponization of AI by bad actors.
0:44:07 And we have faced similar issues before.
0:44:09 My understanding is there are multilateral treaties
0:44:13 around bioweapons or we have nuclear arms treaties.
0:44:16 So is this the point in time where people such as yourself
0:44:21 and our defense infrastructure should be thinking about
0:44:25 or trying to figure out multilateral agreements?
0:44:27 And again, the hard part there is my understanding
0:44:30 is it’s very hard to monitor things like this.
0:44:33 And should we have something along the lines of Interpol
0:44:36 that’s basically policing this and then fighting fire
0:44:41 with AI to go out and find scenarios
0:44:43 where things look very ugly and move in
0:44:44 with some sort of international force.
0:44:46 It feels like the time for some sort
0:44:51 of multinational cooperation is upon us. Your thoughts?
0:44:51 – We agree with you.
0:44:54 And in the book, we specifically talk about this
0:44:58 in a historical context of the nuclear weapons regime,
0:45:01 which Dr. Kissinger, as you know, invented largely.
0:45:04 What’s interesting is working with him,
0:45:08 you realize how long it took for the full solution to occur.
0:45:11 America used the bomb in 1945.
0:45:15 Russia, or the Soviet Union, demonstrated theirs in 1949.
0:45:17 So that’s roughly, that was a four year gap.
0:45:20 And then there was sort of a real arms race.
0:45:23 And after that, it took roughly 15 years
0:45:27 for an agreement to come for limitations on these things,
0:45:30 during which time we were busy making an enormous number
0:45:33 of weapons, which ultimately were a mistake,
0:45:35 including, you know, these enormous bombs
0:45:37 that were unnecessary.
0:45:40 And so things got out of hand.
0:45:44 In our case, I think what you’re saying is very important
0:45:47 that we start now, and here’s where I would start.
0:45:50 I would start with a treaty that says,
0:45:53 we’re not going to allow anyone who’s a signatory
0:45:56 of the treaty to have automatic weapon systems.
0:46:00 And by automatic weapons, I don’t mean automated.
0:46:03 I mean, ones that make the decision on their own.
0:46:07 So an agreement that any use of AI, of any kind
0:46:11 in a conflict sense, has to be owned and authorized
0:46:15 by a human being who is authorized to make that decision.
0:46:17 That would be a simple example.
0:46:20 Another thing that you could do as part of that
0:46:23 is say that you have a duty to inform
0:46:26 when you’re testing one of these systems
0:46:28 in case it gets out of hand.
0:46:31 Now, whether these treaties can be agreed to,
0:46:34 I don’t know, remember that it was the horror
0:46:37 of nuclear war that got people to the table
0:46:39 and it still took 15 years.
0:46:43 I don’t want us to go through an analogous bad incident
0:46:45 involving an evil actor in North Korea.
0:46:48 Again, I’m just using them as bad examples
0:46:50 or even Russia today,
0:46:52 whom we obviously don’t trust.
0:46:54 I don’t want to run that experiment and have all that harm
0:46:58 and then say, hey, we should have foreseen this.
0:47:01 – Well, my sense is when we are better at a technology,
0:47:03 we’re not in a hurry for a multilateral treaty, right?
0:47:06 When we’re under the impression that our nuclear scientists
0:47:07 are better than yours, remember,
0:47:09 the our-Nazis-are-smarter-than-your-Nazis kind of thing,
0:47:11 we don’t want a multilateral treaty
0:47:13 ’cause we see advantage.
0:47:15 And curious if you agree with this,
0:47:20 we have better AIs than anyone else.
0:47:21 Does that get in the way of a treaty
0:47:23 or should we be doing this from a position of strength?
0:47:25 And also, if there’s a number two,
0:47:27 and maybe you think we’re not the number one,
0:47:30 but assuming you think that the US is number one in this,
0:47:31 who is the number two?
0:47:33 Who do you think poses the biggest threat?
0:47:36 Is it their technology or their intentions or both?
0:47:39 If you were to hear that one of these
0:47:41 really awful things took place,
0:47:43 who would you think most likely
0:47:45 are the most likely actors behind it?
0:47:46 Is it a rogue state?
0:47:47 Is it a terrorist group?
0:47:49 Is it a nation state?
0:47:51 – In the first place, I think that the short-term threats
0:47:54 are from rogue states and from terrorism.
0:47:56 And because as we know, there’s plenty of groups
0:48:01 that seek harm against the elites in any country.
0:48:04 Today, the competitive environment is very clear:
0:48:08 it’s the US with a partner in the UK. I’ll give you an example.
0:48:12 This week, there were two libraries from China
0:48:14 that were released, open source.
0:48:17 One is a problem solver that’s very powerful
0:48:21 and another one is a large language model that’s equal.
0:48:24 And in some cases, it exceeds the one from Meta
0:48:28 which people use every day; it’s called Llama 3, 400 billion.
0:48:32 I was shocked when I read this ’cause I had assumed,
0:48:35 in my conversations with the Chinese,
0:48:39 that they were two to three years late.
0:48:41 It looks to me like it’s within a year now.
0:48:43 So it’d be fair to say it’s the US
0:48:46 and then China within a year’s time.
0:48:49 Everyone else is well behind.
0:48:51 Now, I’m not suggesting that China
0:48:54 will launch a rogue attack against an American city.
0:48:57 I am alleging that it’s possible
0:49:01 that a third party could steal it from China,
0:49:03 ’cause it’s open source, or from the US,
0:49:05 if they’re malevolent and do that.
0:49:09 So the threat escalation matrix goes up
0:49:11 with every improvement.
0:49:14 And today, the primary use of these tools
0:49:18 is to sow misinformation, which is what you talked about.
0:49:21 But remember that there’s a transition to agents
0:49:23 and the agents do things.
0:49:26 So it’s a travel agent or it’s whatever.
0:49:29 And the agents speak English, you give them English
0:49:33 and they respond in English so you can concatenate them.
0:49:36 You can literally have agent one talk to agent two,
0:49:39 which talks to agent three, which talks to agent four.
0:49:43 And there’s a scheduler that makes them all work together.
0:49:46 And so for example, you could say to these agents,
0:49:49 design me the most beautiful building in the world,
0:49:53 go ahead and file all the permits,
0:49:55 negotiate the fees of the builders
0:49:57 and tell me how much it’s gonna cost
0:50:00 and tell my accountant that I need that amount of money.
0:50:01 That’s the command.
0:50:03 So think about that.
0:50:06 Think about the agency, the ability to put together
0:50:09 an integrated solution that today takes 100 people
0:50:13 who are very talented, and you can do it with one command.
0:50:17 So that acceleration of power could also be misused.
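As a rough illustration of the chaining just described, agents that take plain language in and hand plain language out, composed by a scheduler, the sketch below treats each agent as a simple text-in, text-out function. It is a toy under that assumption; none of the agent names or the scheduler come from the conversation.

```python
from typing import Callable, List

# Toy sketch: because each "agent" speaks plain language (text in, text out),
# agents can be concatenated, and a simple scheduler runs them in sequence.

Agent = Callable[[str], str]

def design_agent(instruction: str) -> str:
    return f"Design brief drafted for: {instruction}"

def permits_agent(instruction: str) -> str:
    return f"Permit checklist prepared from: {instruction}"

def budget_agent(instruction: str) -> str:
    return f"Cost estimate attached to: {instruction}"

def scheduler(command: str, pipeline: List[Agent]) -> str:
    """Pass one plain-language command through a chain of agents in order."""
    message = command
    for agent in pipeline:
        message = agent(message)  # agent N's output becomes agent N+1's input
    return message

if __name__ == "__main__":
    print(scheduler("design me a beautiful building",
                    [design_agent, permits_agent, budget_agent]))
```

The composition itself is trivial once every component communicates in natural language; the power, and the potential for misuse being pointed to here, lies in what each real agent is actually allowed to do.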
0:50:19 I’ll give you another example.
0:50:21 You were talking earlier about the impact on social media.
0:50:26 I saw a demonstration in England, in fact.
0:50:31 The first command was build a profile of a woman who’s 25,
0:50:36 she has two kids and she has the following strange beliefs.
0:50:40 And the system wrote the code and created a fake persona
0:50:44 that existed on that particular social media site.
0:50:47 Then the next command was take that person
0:50:51 and modify that person to every possible stereotype,
0:50:55 every race, sex, so forth and so on, age, demographic thing
0:50:58 with similar views and populate that
0:51:02 and 10,000 people popped up just like that.
0:51:06 So if you wanted, for example, today, this is true today,
0:51:07 if you wanted to create a community
0:51:10 of 10,000 fake influencers to say, for example,
0:51:12 that smoking doesn’t cause cancer,
0:51:15 which as we know is not true, you could do it.
0:51:17 And one person with a PC can do this.
0:51:21 Imagine when the AIs are far more powerful
0:51:22 than they are today.
0:51:26 – So one of the things that Dr. Kissinger was known for
0:51:27 and quite frankly I appreciate
0:51:29 was this notion of realpolitik.
0:51:31 Obviously we have aspirations around
0:51:33 the way the world should be,
0:51:34 but as it relates to decision-making,
0:51:37 we’re also gonna be very cognizant of the way the world is
0:51:41 and make some, I mean, he’s credited with a lot of very
0:51:43 controversial/difficult decisions
0:51:46 depending on how you look at it.
0:51:49 What I’m hearing you say leads,
0:51:51 all these roads lead to one place
0:51:54 in my kind of quote unquote critical thinking,
0:51:56 or lack thereof, brain, and that is,
0:52:00 there’s a lot of incentive to kiss and make up with China
0:52:02 and partner around this stuff.
0:52:06 That if China and the US came to an agreement
0:52:08 around what they were gonna do or not do
0:52:11 and bilaterally created a security force
0:52:15 and agreed not to sponsor proxy agents against the West
0:52:18 or each other,
0:52:20 that would be a lot of progress.
0:52:22 That might be 50, 60, 80% of the whole shooting match
0:52:24 if the two of us could say,
0:52:27 we’re gonna figure out a way to trust each other
0:52:29 on this issue and we’re gonna fight the bad guys
0:52:31 together on this stuff.
0:52:32 Your thoughts?
0:52:34 – So Dr. Kissinger of course was the world’s expert
0:52:36 on China; he opened up China,
0:52:38 which is one of his greatest achievements.
0:52:42 But he was also a proud American
0:52:46 and he understood that China could go one way or the other.
0:52:49 His view on China,
0:52:50 and he wrote a whole book on this,
0:52:52 was that China wanted to be the Middle Kingdom
0:52:54 as part of their history,
0:52:57 where they’d sort of dominated all the other countries,
0:52:58 but it’s not like America.
0:53:02 His view was they wanted to make sure the other countries
0:53:04 would show fealty to China,
0:53:07 in other words, do what they wanted.
0:53:10 And occasionally, if they didn’t do something,
0:53:12 China would then extract some payment,
0:53:14 such as invading the country.
0:53:16 That’s roughly what Henry would say.
0:53:20 So he was very much a realist about China as well.
0:53:24 His view would be at odds today
0:53:27 with Trump’s view and the US government’s.
0:53:30 The US government is completely organized today
0:53:35 around decoupling, that is literally separating.
0:53:38 And his view, which I can report accurately
0:53:39 ’cause I went to China with him,
0:53:43 was that we’re never going to be great friends,
0:53:46 but we have to learn how to coexist.
0:53:51 And that means detailed discussions on every issue
0:53:54 at great length to make sure
0:53:57 that we don’t alarm each other or frighten each other.
0:54:01 His further concern was not that President Xi
0:54:04 would wake up tomorrow and invade Taiwan,
0:54:06 but that you would start with an accident
0:54:08 and then there would be an escalatory ladder.
0:54:11 And that, because of the emotions on both sides,
0:54:14 you would end up just like in World War I,
0:54:17 which started with a shooting in Sarajevo,
0:54:20 and ultimately people found in a few months
0:54:21 that they were in a world war
0:54:23 that they did not want and did not expect.
0:54:26 And once you’re in the war, you have to fight.
0:54:30 So the concern with China would be roughly that
0:54:35 we are codependent and we’re not best friends.
0:54:40 Being codependent is probably better
0:54:44 than being completely independent, that is, non-dependent,
0:54:46 because it forces some level of understanding
0:54:47 and communication.
0:54:50 – Eric Schmidt is a technologist,
0:54:51 entrepreneur and philanthropist.
0:54:55 In 2021, he founded the Special Competitive Studies Project,
0:54:57 a non-profit initiative to strengthen America’s
0:55:00 long-term competitiveness in AI
0:55:01 and technology more broadly.
0:55:03 Before that, Eric served as Google’s
0:55:05 chief executive officer and chairman,
0:55:08 and later as executive chairman and technical advisor.
0:55:10 He joins us from Boston. Eric,
0:55:12 in addition to your intelligence,
0:55:14 I get the sense your heart’s in the right place
0:55:17 and you’re using your human and financial capital
0:55:18 to try and make the world a better place.
0:55:20 Really appreciate you and your work.
0:55:26 (upbeat music)
0:55:29 (upbeat music)
0:55:32 – A little algebra of happiness.
0:55:36 I’m at this gathering called Summit
0:55:39 and I’ve been struck by how many people are successful
0:55:41 or at least the appearance of being successful.
0:55:44 Some of them, I know, are rich kids, but they do seem to be,
0:55:48 I don’t know, economically secure or overeducated.
0:55:50 Interesting, some of them started and sold businesses,
0:55:53 but what I see is a lot of people searching
0:55:54 and they’ll say shit like,
0:55:56 well, I’m just taking a year to really focus
0:55:57 on improving my sleep.
0:56:02 Okay, no, sleep is supposed to be part of your arsenal.
0:56:03 It’s not why you’re fighting this war.
0:56:05 You need good sleep, but I don’t think
0:56:07 you should take a year to focus on it.
0:56:09 Anyways, does that sound boomer of me?
0:56:11 But this notion of finding a purpose
0:56:12 and what I have found is,
0:56:14 and this is probably one of the accoutrements
0:56:18 of a prosperous society, is ask yourself,
0:56:20 do you have the wrong amount of money?
0:56:21 Do you have just the wrong amount of money?
0:56:22 What do I mean by that?
0:56:24 Obviously, the worst amount of money is not enough,
0:56:26 but a lot of my friends and a lot of people,
0:56:28 I think at this summit,
0:56:30 suffer from just having the wrong amount of money.
0:56:31 What do I mean by that?
0:56:32 They have enough money
0:56:34 so they don’t have to do something right away,
0:56:36 but they don’t have enough money to retire
0:56:39 or go into philanthropy or really pursue something creative
0:56:41 and not make money.
0:56:42 That’s exactly the wrong amount of money.
0:56:45 And I would say a good 50% of my friends
0:56:47 who kind of hit a wall, got stuck,
0:56:49 experienced their first failure,
0:56:53 sit around and wait for the perfect thing
0:56:55 and wake up one, two, three years later
0:56:57 and really don’t have a professional purpose
0:56:59 or a professional source of gravity.
0:57:03 And you know, the kind of basic stuff, right?
0:57:05 Do something in the agency of others,
0:57:08 be in service to others, but more than anything,
0:57:11 I think the call sign is just now.
0:57:15 And that is, don’t let perfect be the enemy of good
0:57:18 and give yourself a certain amount of time to find something.
0:57:22 And within that amount of time, when it elapses,
0:57:24 take the best thing that you have.
0:57:28 And it might not foot to the expectations
0:57:29 that you have for yourself
0:57:32 or be really exciting or dramatic or really lucrative.
0:57:33 But the thing about working
0:57:35 is it leads to other opportunities.
0:57:37 And what I see is a lot of people
0:57:39 who kind of are cast into the wilderness
0:57:41 and then come out of the wilderness with no fucking skills.
0:57:43 And that is, you’ll be surprised
0:57:47 how much your Rolodex and your skills atrophy.
0:57:47 And so what is the key?
0:57:49 Do you want to write a book?
0:57:50 Do you want to start a podcast?
0:57:52 Do you want to try and raise a fund?
0:57:53 Do you want to start a company?
0:57:54 What is the key?
0:57:56 What is the critical success factor?
0:57:57 Is it finding the right people?
0:57:58 Is it finding capital?
0:57:59 Is it thinking through?
0:58:01 Is it positioning the concept?
0:58:02 Is it doing more research?
0:58:05 No, the key is now.
0:58:06 You want to write a book,
0:58:09 open your fucking laptop and start writing.
0:58:10 And it’s going to be shit.
0:58:11 But then when you go back and edit it,
0:58:12 it’ll be less shitty.
0:58:14 And then if you find someone to help you review it
0:58:15 and you find some people,
0:58:18 it’ll get dramatically even less shitty.
0:58:20 All right, you want to start a business?
0:58:20 Nobody knows.
0:58:22 The only way you have a successful business
0:58:24 is you start a bad one and you start iterating.
0:58:26 But here’s the key, starting.
0:58:27 You want to be in a nonprofit.
0:58:29 You want to start helping other people.
0:58:32 Well, start with one person and see if, in fact,
0:58:34 your infrastructure, your skills, your expertise,
0:58:37 tangibly change the community, the environment,
0:58:38 or their life.
0:58:39 What is key to all of this?
0:58:42 Three letters, first N, second O, third W.
0:58:44 I have so many people I run across
0:58:47 who are searching, not because they’re not talented,
0:58:49 not because there’s not opportunity,
0:58:51 but they’re thinking they’re going to find the perfect thing.
0:58:56 No, find the best thing that is now and get started.
0:58:59 (upbeat music)
0:59:01 This episode was produced by Jennifer Sanchez
0:59:02 and Caroline Shagren.
0:59:04 Drew Burroughs is our technical director.
0:59:05 Thank you for listening to the Prop G Pod
0:59:07 from the Vox Media Podcast Network.
0:59:08 We will catch you on Saturday
0:59:11 for No Mercy, No Malice as read by George Hahn.
0:59:13 And please follow our Prop G Markets pod
0:59:15 wherever you get your pods for new episodes
0:59:16 every Monday and Thursday.
0:59:21 – Do you feel like your leads never lead anywhere?
0:59:24 And you’re making content that no one sees
0:59:27 and it takes forever to build a campaign?
0:59:29 Well, that’s why we built HubSpot.
0:59:31 It’s an AI-powered customer platform
0:59:33 that builds campaigns for you,
0:59:35 tells you which leads are worth knowing,
0:59:38 and makes writing blogs, creating videos,
0:59:40 and posting on social a breeze.
0:59:43 So now, it’s easier than ever to be a marketer.
0:59:46 Get started at hubspot.com/marketers.
0:03:26 Vegas turned the week, not so much fun.
0:03:27 Not so much fun.
0:03:30 That definitely kind of an unusual vibe there.
0:03:31 And then I go to LA for a couple of days.
0:03:33 Daddy will be at the Beverly Hills Hotel.
0:03:34 Swing by, stay high.
0:03:37 I’ll be the guy alone at the bar.
0:03:39 Oh, I love eating alone at the Polo Lounge.
0:03:40 How do you know if I like you?
0:03:42 I stare at your shoes and I’m mine.
0:03:45 Anyways, then I’m back to Vegas for Formula One,
0:03:47 which I am so excited about.
0:03:48 I love it.
0:03:49 The city comes alive.
0:03:51 And then, just ’cause I know you like to keep up
0:03:53 with my travels, I head to Sao Paulo,
0:03:54 where the nicest hotel in the world is right now.
0:03:56 I think the Rosewood and Sao Paulo.
0:03:59 I think Rosewood is actually the best brand
0:04:01 in high-end hospitality.
0:04:02 Isn’t that good to know?
0:04:04 A lot of insight here.
0:04:05 A lot of insight.
0:04:07 All right, let’s move on.
0:04:10 Some news in the media and entertainment space.
0:04:13 Netflix said that a record 60 million households worldwide
0:04:14 tuned in to watch the boxing match
0:04:15 between Jake Paul and Mike Todd.
0:04:17 I’m sorry.
0:04:18 I’m sorry.
0:04:19 Just a quick announcement.
0:04:21 This is very exciting.
0:04:24 I just struck a deal as I told you I’m going to LA.
0:04:26 And you’re the first to know that Hulu is announced
0:04:28 it’ll be live streaming a fight between me and Jimmy Carter.
0:04:29 (bell dings)
0:04:32 By the way, if you get paid $20 million,
0:04:33 I don’t know what Tyson was paid.
0:04:34 I think it was $20 million.
0:04:37 You have an obligation to either kick the shit out
0:04:39 of someone or have the shit kicked out of you.
0:04:42 This kind of jab snort through your nose
0:04:43 and just stay away from the guy.
0:04:44 I don’t buy it.
0:04:47 I want my $12 back Netflix.
0:04:50 Despite the disappointment in the fight,
0:04:53 Jake Paul did in fact defeat Mike Tyson in eight rounds.
0:04:54 Can you even call it a win?
0:04:56 Can you?
0:04:58 The fight was shown in over 6,000 bars and restaurants
0:04:59 across the U.S., breaking the record
0:05:03 for the biggest commercial distribution in the sport.
0:05:05 But the record numbers came with a few hiccups.
0:05:06 Viewers reported various tech issues,
0:05:10 including slow loading times, pixelated screens,
0:05:13 and a malfunctioning earpiece from one of the commentators.
0:05:14 That’s a weird one.
0:05:16 A malfunctioning earpiece from one of the commentators.
0:05:20 Data from Down Detector revealed that user reported out
0:05:21 it just peaked at more than 95,000
0:05:23 around 11 p.m. Eastern time.
0:05:25 Frustrated fans flooded social media,
0:05:28 criticizing Netflix for the poor streaming quality.
0:05:33 Netflix CTO Elizabeth Stone, soon to be probably former CTO,
0:05:34 wrote to employees, “I’m sure many of you
0:05:35 “have seen the chatter and the press
0:05:37 “and the social media about the quality issues.
0:05:39 “We don’t want to dismiss the poor experience
0:05:41 “of some members and know we have room for improvement,
0:05:43 “but still consider this event a huge success.”
0:05:47 No, there was a pretty big fuck up for you, Ms. Stone.
0:05:50 Specifically, Netflix tries to garner evaluation,
0:05:52 not of a media company, but of a tech company,
0:05:53 which means you’re actually supposed
0:05:54 to be pretty good at this shit.
0:05:56 And didn’t you know exactly how many people
0:05:57 were gonna show up for this?
0:06:00 Didn’t you kind of, weren’t you able to sort of estimate
0:06:03 pretty accurately just exactly how many people
0:06:05 would be dialing at exactly the same time
0:06:06 and then test the shit out of this?
0:06:09 You’re beginning to smell a little bit like Twitter
0:06:10 in a presidential announcement.
0:06:13 That just is unforgivable for a fucking tech company.
0:06:15 Come on, guys, this is what you do.
0:06:16 This isn’t the first time Netflix
0:06:17 has fumbled with a live event.
0:06:20 Last year, their Love Is Blind reunion show
0:06:21 faced a similar situation,
0:06:22 leaving viewers waiting over an hour
0:06:25 before a recorded version was made available.
0:06:27 And this brings up a bigger question.
0:06:29 With Netflix’s pushing to live sports,
0:06:31 including NFL game scheduled for Christmas
0:06:34 and a major deal with WWE starting next year,
0:06:36 can they deliver the kind of quality viewers expect
0:06:38 that they get from broadcast cable?
0:06:40 It looks like what’s old is new again
0:06:41 that we have taken for granted,
0:06:44 kind of the production quality of live TV
0:06:45 and how difficult it is.
0:06:49 That’s one thing I’ll say about Morning Joe or The View
0:06:51 or even I think Fox does a great,
0:06:53 they’re great at delivering TV live.
0:06:56 I think CNN also does a fantastic job.
0:06:57 Netflix isn’t alone.
0:07:00 Other streaming platforms including Comcast’s Peacock
0:07:02 have also been getting into live sports.
0:07:04 Earlier this year, Peacock’s January playoff game
0:07:06 between the Kansas City Chiefs and Miami Dolphins
0:07:08 drew 23 million viewers,
0:07:10 which broke records for internet usage in the US.
0:07:14 Get this, the game was responsible for 30%
0:07:15 of internet traffic that night.
0:07:16 That’s like squid games.
0:07:19 This is all proof that the market for live sports
0:07:21 on streaming platforms is a massive opportunity
0:07:23 and companies are willing to spend big.
0:07:24 According to the Wall Street Journal,
0:07:27 Netflix is paying around $75 million
0:07:28 for NFL game this season.
0:07:30 They also recently signed a 10 year,
0:07:31 $5 million deal with WWE.
0:07:35 It used to be that live in sports
0:07:38 were sort of the last walls to be breached
0:07:39 in broadcast cable.
0:07:40 Like we’ll always have sports.
0:07:42 And then the people with the cheapest capital
0:07:43 in the deepest pockets showed up and said,
0:07:45 “Hey, we’ll take Thursday night football.
0:07:48 Hey, we’ll take the Logan Paul or Jake Paul,
0:07:49 is it Jake or Logan?”
0:07:51 And I can’t remember, anyways.
0:07:54 I mean, literally broadcast cable television right now,
0:07:56 it’s like Mark Twain said about going bankrupt.
0:07:58 It was slowly then suddenly.
0:08:01 We’re in the suddenly stage of the decline
0:08:02 of linear ad supported TV.
0:08:06 It has gotten really bad in the last few months.
0:08:11 I had breakfast with the former CEO of CNN.
0:08:12 Who’s a lovely guy?
0:08:15 And he said that CNN’s viewership
0:08:17 versus the last election has been cut in half.
0:08:19 Can you imagine trying to explain to advertisers,
0:08:22 our viewership is off 50% since the last time
0:08:24 we were talking about election advertising.
0:08:29 My theory is that the unnatural unearned torrent of cash
0:08:31 at local news stations have been earning
0:08:33 for the last 20 years is about to go away.
0:08:34 And what are we talking about?
0:08:35 Scott, tell us more.
0:08:36 What are you saying?
0:08:38 Effectively, a lot of smart companies,
0:08:39 including I think Hearst and others
0:08:41 have gone around and bought up these local news stations.
0:08:42 And why?
0:08:43 ‘Cause they’re dying, aren’t they?
0:08:45 Well, yeah, they are.
0:08:46 But old people watch local news,
0:08:48 mostly to get the weather and local sports
0:08:51 because that Jerry Dumphy is just so likable
0:08:52 in that hot little number.
0:08:55 They always have some old guy with good hair
0:08:58 and broad shoulders who makes you feel comfortable and safe
0:09:00 and some hot woman in her 30s
0:09:04 who’s still waiting for the call up to do daytime TV.
0:09:08 And everybody, old people love this and old people vote.
0:09:09 Now what’s happening?
0:09:11 Okay, so the numbers are in.
0:09:13 A million people watch the best shows on MSNBC,
0:09:16 the average age is 70, it’s mostly white
0:09:17 and it’s mostly women.
0:09:18 So a 70 year old white woman.
0:09:20 Podcasts, 34 year old male.
0:09:21 Think about that.
0:09:23 Also the zeitgeist is different.
0:09:26 People go to cable news to sanctify their religion
0:09:27 or specifically their politics.
0:09:29 People come to podcasts to learn.
0:09:30 The zeitgeist is different.
0:09:34 We try to present our guests in a more aspirational light.
0:09:35 We’re not looking for a gotcha moment
0:09:36 to go live on TikTok.
0:09:38 It’s not, say a twist of phrase,
0:09:39 dead at done in six minutes
0:09:41 ’cause we got a break for an opioid induced constipation
0:09:43 commercial or life alert.
0:09:44 I’m following.
0:09:45 We don’t do that shit.
0:09:48 We sell zipper cruder and athletic greens
0:09:51 and thunder eyes and different kind of modern cool stuff
0:09:52 like that.
0:09:53 Also, Viori.
0:09:54 I’m wearing Viori shorts right now.
0:09:57 By the way, I fucking love this athleisure.
0:10:00 Oh my God, I look so good in this shit.
0:10:02 Actually, no one really looks good.
0:10:04 No man looks good in athleisure,
0:10:06 but I look less bad than I look in most athleisure.
0:10:08 I love the fabrics.
0:10:09 Not even getting paid to say this.
0:10:11 Wearing it right now.
0:10:13 So let’s talk a little bit about Netflix.
0:10:14 It’s up 81% year today.
0:10:16 True story, I bought Netflix at 10 bucks a share.
0:10:17 That’s a good news.
0:10:20 The bad news is I sold it at eight bucks a share
0:10:22 and now it’s at $840.
0:10:23 Daddy would be live broadcasting
0:10:25 from his own fucking Gulfstream right now.
0:10:26 Had I not been such a shit head,
0:10:29 I’m gonna find a time machine, get in it, go back,
0:10:31 find me, kill me and then kill myself.
0:10:34 Eight, Jesus, God.
0:10:37 Anyways, Amazon is up 34%.
0:10:38 I do own that stock.
0:10:39 Disney is up 22%.
0:10:40 My stock pick for 2024.
0:10:44 Warner Brothers Discovery down 22%.
0:10:47 Jesus Christ Malone, you fired the wrong guy, Paramount.
0:10:49 By the way, Zazloff, the guy who was overseeing
0:10:52 a destruction of about 60 or 70% of shareholder value
0:10:53 since he talked to much of stupid people
0:10:55 into why this merger made any fucking sense
0:10:57 and took on way too much debt.
0:11:00 He’s managed to pull out about a third of a billion dollars
0:11:01 despite destroying a massive amount
0:11:02 of shareholder value.
0:11:05 Paramount is down 28% year today.
0:11:07 Comcast is down 2.3%.
0:11:12 Comcast I think is arguably the best run of the cable folks.
0:11:13 Obviously not including Netflix,
0:11:16 which is just a gangster run company.
0:11:19 So Netflix has about 250 million users.
0:11:21 Amazon Prime Video has 200 million.
0:11:22 Is that fair though?
0:11:24 ‘Cause you just automatically get it with Prime.
0:11:27 Disney Plus 150 million.
0:11:29 Max 95.
0:11:33 I love Max, we sold our series into Netflix.
0:11:34 Our big tech drama.
0:11:38 I think most of us would have liked HBO
0:11:40 just ’cause HBO has a certain culture
0:11:42 that feeds kind of the water cooler.
0:11:45 You’re talking about something in streaming media.
0:11:47 You’re usually talking about something on Max
0:11:49 but Netflix has also got bigger reach.
0:11:51 These are good problems.
0:11:54 Hulu’s Paramount is at 63 million.
0:11:57 Hulu 49, Peacock 28.
0:12:02 ESPN Plus at 26, Apple TV at 25.
0:12:05 And then Stars, remember them at 16 million.
0:12:07 Effectively these guys have cheaper capital.
0:12:09 They’re absolutely killing linear TV.
0:12:11 Does that mean it’s a bad business now?
0:12:13 Someone’s gonna come in and roll up all of these assets
0:12:18 between the old Viacom assets, CNN, Turner,
0:12:22 all the Disney shit, ABC.
0:12:23 They’re gonna roll them all up.
0:12:26 Milk ’em for their cash flow, cut costs faster
0:12:28 than the revenue declines.
0:12:29 These businesses,
0:12:30 while they seem to be going out of business
0:12:33 pretty fast right now, it’ll probably level out.
0:12:35 AOL’s still a small but great business.
0:12:37 I think it does something like four or $500 million
0:12:38 in EBITDA ’cause there’s still a lot of people
0:12:40 that depend on AOL and rural areas
0:12:42 for their dial-up for their internet.
0:12:46 And some people will kind of hang in there, if you will.
0:12:48 But this is gonna be a distress play.
0:12:50 They’re gonna stop this consensual hallucination
0:12:52 that these things are gonna ever grow again.
0:12:54 They’ll consolidate them to start cutting costs.
0:12:57 One of the best investments I’ve ever made, yellow pages.
0:12:58 We bought a yellow pages company
0:13:01 for about two or two and a half times cash flow.
0:13:05 Yeah, it’s going down by eight to 12% a year.
0:13:06 But if you cut costs faster than that
0:13:09 by going and buying the other shitty yellow pages companies
0:13:10 and then consolidating the staff,
0:13:12 which is Latin for layoff people,
0:13:15 and you can cut costs faster than 8%,
0:13:17 you have an increase in EBITDA every year.
0:13:19 I still find across the entire asset class,
0:13:21 and this is where I’ll wrap up.
0:13:25 In general, a basic axiom that I have found holds water
0:13:26 through the test of time around investing,
0:13:29 is the sexier it is, the lower the ROI.
0:13:31 And if you look at asset classes
0:13:33 in terms of their sex appeal,
0:13:35 venture investing or angel investing is fun, right?
0:13:37 It’s for what I call FIPS,
0:13:39 formerly important people that wanna stay involved
0:13:40 and wanna help entrepreneurs.
0:13:42 But be clear, the only return you get is psychic.
0:13:45 It is a terrible asset class, even if something works.
0:13:48 And at that stage, it is very hard to predict.
0:13:51 You’re talking about one in seven, maybe, do well.
0:13:52 And even at one company,
0:13:54 likely you’ll get washed out along the way
0:13:56 at a little bump and the VCs have showed up
0:13:56 and they’ll wash you out.
0:13:59 It is a very tough asset class to make money.
0:14:02 Venture does better, but the majority of the returns
0:14:04 are not only crowded to a small number of brands
0:14:05 that get all the deal flow,
0:14:06 but a small number of partners
0:14:09 within that small number of firms.
0:14:11 And then you have growth, I think that’s better.
0:14:14 Then you have IPOs, unfortunately IPOs.
0:14:16 That winter is really ugly right now.
0:14:20 The IPO market’s basically been in a pretty big deep freeze
0:14:21 for several years now.
0:14:22 And people keep thinking it’s gonna come back.
0:14:25 We got excited about Reddit, but not a lot followed.
0:14:27 And then you go into public company stocks.
0:14:30 It’s impossible to pick stocks by an index fund.
0:14:31 Then you get into distressed
0:14:33 or mature companies, dividend plays.
0:14:35 And then what I love is distressed.
0:14:37 I find that distressed is the best asset class.
0:14:38 Why?
0:14:43 What business has the greatest likelihood of succeeding?
0:14:44 Anything in senior care.
0:14:45 Why?
0:14:46 Against the above, the less sexy it is.
0:14:48 People don’t wanna be around old people.
0:14:49 It reminds them of death.
0:14:50 They’re generally pretty boring.
0:14:51 I know I’m supposed to say
0:14:55 they just have so much experience and wisdom sometimes.
0:14:56 And people wanna avoid them.
0:14:58 People wanna hang out with hot young people, right?
0:15:00 And people wanna hang out with hot young companies,
0:15:03 specifically capital wants to hang out
0:15:05 with hot young growing companies.
0:15:06 And they don’t like the way
0:15:09 that old companies smell, so to speak.
0:15:10 So they avoid them.
0:15:12 And that’s why there’s a greater return
0:15:13 on investment in distressed.
0:15:15 What’s the learning here?
0:15:18 Sex appeal and ROI are inversely correlated.
0:15:20 So yeah, if you wanna invest
0:15:21 in a member’s club downtown
0:15:24 for the fashion industry and the music industry have at it,
0:15:28 but keep in mind ROI and sex appeal inversely correlated.
0:15:33 We’ll be right back for our conversation with Eric Schmidt.
0:15:38 Support for property comes from Mint Mobile.
0:15:40 You’re probably paying too much for your cell phone plan.
0:15:42 It’s one of those budgetary line items
0:15:43 that always looks pretty ugly.
0:15:44 And it might feel like
0:15:46 there’s nothing you can do about it.
0:15:48 That’s where Mint Mobile has something to say.
0:15:49 Mint Mobile’s latest deal might challenge your idea
0:15:51 of what a phone plan costs.
0:15:52 If you make the switch now,
0:15:54 you’ll pay just $15 a month
0:15:56 when you purchase a new three-month phone plan.
0:15:58 All Mint Mobile plans come with high-speed data
0:15:59 on a limit-talked index delivered
0:16:01 on the nation’s largest 5D network.
0:16:03 You can even keep your phone, your contacts,
0:16:04 and your number.
0:16:06 It doesn’t get much easier than that.
0:16:07 To get this new customer offer
0:16:09 and your new three-month premium wireless plan
0:16:10 for just 15 bucks a month,
0:16:13 you can go to mintmobile.com/profg.
0:16:15 That’s mintmobile.com/profg.
0:16:16 You can cut your wireless bill
0:16:18 to 15 bucks a month at mintmobile.com/profg.
0:16:20 $45 upfront payment required,
0:16:21 equivalent to $15 a month.
0:16:23 New customers on a first three-month plan
0:16:25 only speeds slower, about 40 gigabytes on a limited plan,
0:16:28 additional taxes, fees, and restrictions apply.
0:16:30 See Mint Mobile for details.
0:16:38 Your business is ready for launch.
0:16:40 But what’s the most important thing to do
0:16:42 before those doors open?
0:16:45 Is it getting more social media followers?
0:16:47 Or is it actually legitimizing
0:16:50 and protecting the business you’ve been busy building?
0:16:53 Make it official with LegalZoom.
0:16:55 LegalZoom has everything you need to launch,
0:16:58 run, and protect your business all in one place.
0:17:00 Setting up your business properly
0:17:01 and remaining compliant
0:17:04 are the things you want to get right from the get-go.
0:17:06 And LegalZoom saves you from wasting hours
0:17:09 making sense of the legal stuff.
0:17:11 And if you need some hands-on help,
0:17:12 their network of experienced attorneys
0:17:15 from around the country has your back.
0:17:18 Launch, run, and protect your business
0:17:21 to make it official today at LegalZoom.com.
0:17:24 And use promo code VOXBiz to get 10% off
0:17:26 any LegalZoom business formation product
0:17:29 excluding subscriptions and renewals.
0:17:32 Expires December 31st, 2024.
0:17:34 Get everything you need from setup to success
0:17:36 at LegalZoom.com.
0:17:38 And use promo code VOXBiz.
0:17:40 LegalZoom.com.
0:17:43 And use promo code VOXBiz.
0:17:46 LegalZoom provides access to independent attorneys
0:17:47 and self-service tools.
0:17:48 LegalZoom is not a law firm
0:17:50 and does not provide legal advice
0:17:52 except for authorized through its subsidiary law firm
0:17:54 LZ Legal Services LLC.
0:17:59 Support for the show comes from one password.
0:18:01 How do you make a password that’s strong enough
0:18:04 so no one will guess it and impossible to forget?
0:18:07 And now how can you do it for over 100 different sites
0:18:08 and make it so everyone in your company
0:18:10 can do the exact same thing
0:18:11 without ever needing to reset them?
0:18:13 It’s not impossible.
0:18:15 One password makes it simple.
0:18:17 One password combines industry-leading security
0:18:19 with award-winning design to bring private secure
0:18:22 and user-friendly password management to everyone.
0:18:25 One password makes strong security easy for your people
0:18:27 and gives you the visibility you need to take action
0:18:28 when you need to.
0:18:30 A single data breach can cost millions of dollars
0:18:32 while one password secures every sign-in
0:18:34 to save you time and money.
0:18:36 And it lets you securely switch
0:18:38 between iPhone, Android, Mac, and PC.
0:18:41 All you have to remember is the one strong account password
0:18:42 that protects everything else.
0:18:44 Your logins, your credit cards, secure notes,
0:18:46 or the office Wi-Fi password.
0:18:49 Right now, our listeners get a free two-week trial
0:18:53 at onepassword.com/prof for your growing business.
0:18:57 That’s two weeks free at onepassword.com/prof.
0:18:59 Don’t let security slow your business down.
0:19:02 Go to onepassword.com/prof.
0:19:05 (upbeat music)
0:19:14 – Welcome back.
0:19:15 Here’s our conversation with Eric Schmidt,
0:19:17 a technologist, entrepreneur, philanthropist,
0:19:19 and Google former CEO.
0:19:24 Eric, where does this podcast find you?
0:19:25 – I’m in Boston.
0:19:28 I’m at Harvard and giving a speech to students later today.
0:19:30 – Oh, nice.
0:19:31 So let’s bust right into it.
0:19:33 You have a new book out that you co-authored
0:19:35 with the late Henry Kissinger titled
0:19:37 “Genesis, Artificial Intelligence,
0:19:40 “Hope, and the Human Spirit.”
0:19:41 What is it about this book?
0:19:44 Or give us what you would call the Pillars of Insight here
0:19:48 around that’ll help people understand the evolution of AI.
0:19:51 – Well, the world is full of stories about what AI can do.
0:19:54 And we generally agree with those.
0:19:58 What we believe, however, is the world is not ready for this.
0:20:00 And there are so many examples,
0:20:04 whether it’s trust, military power, deception,
0:20:07 economic power, the effect on humans,
0:20:12 the effect on children that are relatively poorly explored.
0:20:16 So the reader of this book doesn’t need to understand AI,
0:20:18 but they need to be worried
0:20:20 that this stuff is going to be unmanaged.
0:20:23 Dr. Kissinger was very concerned
0:20:26 that the future should not be left to people like myself.
0:20:30 He believed very strongly that these tools are so powerful
0:20:33 in terms of their effect on human society.
0:20:35 It was important that the decisions be made
0:20:37 by more than just the tech people.
0:20:40 And the book is really a discussion
0:20:43 about what happens to the structure of organizations,
0:20:46 the structure of jobs, the structure of power,
0:20:49 and all the things that people worry about.
0:20:54 I personally believe that this will happen much, much more
0:20:57 quickly than societies are ready for,
0:20:59 including in the United States and China.
0:21:02 It’s happening very fast.
0:21:04 – And what do you see as the real existential threats here?
0:21:07 Is it that it becomes sentient?
0:21:11 Is it misinformation, income inequality, loneliness?
0:21:14 What do you think are the kind of first and foremost
0:21:17 biggest concerns you have about this rapid evolution of AI?
0:21:20 – There are many things to worry about.
0:21:22 Before we say the bad things,
0:21:25 let me remind you enormous improvements
0:21:28 in drug capability for healthcare,
0:21:31 solutions to climate change, better vehicles,
0:21:33 huge discoveries in science,
0:21:36 greater productivity for kind of everyone,
0:21:38 a universal doctor, a universal educator,
0:21:40 all of these things are coming.
0:21:42 And those are fantastic.
0:21:47 A long way you come with, because these are very powerful,
0:21:49 especially in the hands of an evil person
0:21:51 and we know evil exists,
0:21:55 these systems can be used to harm large numbers of people.
0:21:58 The most obvious one is their use in biology.
0:22:00 Can these systems at some point in the future
0:22:02 generate biological pathogens
0:22:06 that could harm many, many, many, many humans?
0:22:08 Today, we’re quite sure they can’t,
0:22:09 but there’s a lot of people who think
0:22:12 that they will be able to unless we take some action.
0:22:15 Those actions are being worked on now.
0:22:16 What about cyber attacks?
0:22:18 You have a lone actor, a terrorist group,
0:22:22 North Korea, whomever, whatever your evil person or group is,
0:22:25 and they decide to take down the financial system
0:22:28 using a previously unknown attack vector,
0:22:30 so-called zero-day exploits.
0:22:33 So the systems are so powerful
0:22:36 that we are quite concerned
0:22:39 that in addition to democracies using them for gains,
0:22:42 dictators will use them to aggregate power
0:22:45 and they’ll be used in a harmful and military context.
0:22:49 So I’m freaked out about these AI girlfriends.
0:22:52 I feel as if the biggest threat in the U.S. right now
0:22:55 is loneliness that leads to extremism,
0:22:59 and I see these AI girlfriends and AI searches popping up,
0:23:02 and I see a lot of young men who have a lack of romantic
0:23:06 or economic opportunities turning to AI girlfriends
0:23:09 and begin to sequester from real relationships
0:23:11 and they become less likely to believe in climate change,
0:23:14 more likely to engage in misogynistic content,
0:23:17 sequester from school, their parents’ work,
0:23:20 and some they become really shitty citizens.
0:23:23 And I think men, young men are having so much trouble
0:23:28 that this low risk entry into these faux relationships
0:23:30 is just gonna speedball loneliness
0:23:32 and the externalities of loneliness.
0:23:33 Your thoughts?
0:23:35 – I completely agree.
0:23:38 There’s lots of evidence that there’s now a problem
0:23:39 with young men.
0:23:42 In many cases, the path to success for young men
0:23:46 has been, shall we say, been made more difficult
0:23:48 because they’re not as educated as the women are now.
0:23:50 Remember, there are more women in college than men,
0:23:55 and many of the traditional paths are no longer as available.
0:23:58 And so they turn to the online world
0:24:01 for enjoyment and sustenance,
0:24:03 but also because of the social media algorithms,
0:24:07 they find like-minded people who ultimately radicalize them
0:24:10 either in a horrific way like terrorism
0:24:12 or in the kind of way that you’re describing,
0:24:13 where they’re just maladjusted.
0:24:18 This is a good example of an unexpected problem
0:24:20 of existing technology.
0:24:24 So now imagine that the AI girlfriend or boyfriend,
0:24:26 although she was AI girlfriend as an example,
0:24:31 is perfect, perfect visually, perfect emotionally.
0:24:35 And the AI girlfriend in this case captures your mind
0:24:39 as a man to the point where she or whatever it is
0:24:41 takes over the way you thinking.
0:24:44 You’re obsessed with her.
0:24:46 That kind of obsession is possible,
0:24:49 especially for people who are not fully formed.
0:24:51 Parents are going to have to be more involved
0:24:52 for all the obvious reasons,
0:24:53 but at the end of the day,
0:24:56 parents can only control what their sons and daughters
0:24:58 are doing within reason.
0:25:01 We’ve ended up, again, using teenagers as an example.
0:25:05 We have all sorts of rules about age of maturity, 16, 18,
0:25:07 what have you, 21 in some cases,
0:25:10 and yet you put a 12 or 13 year old
0:25:11 in front of one of these things
0:25:13 and they have access to every evil
0:25:14 as well as every good in the world
0:25:16 and they’re not ready to take it.
0:25:17 So I think the general question of,
0:25:21 are you mature enough to handle it?
0:25:24 Sort of the general version of your AI girlfriend example
0:25:26 is unresolved.
0:25:28 – So I think people, most people would agree
0:25:30 that the pace of AI is scary
0:25:35 and that our institutions and our ability to regulate
0:25:37 are not keeping up with the pace of evolution here.
0:25:40 And we see what perfectly what happened
0:25:41 with social around this.
0:25:43 What can be done?
0:25:47 How, what’s an example or a construct or framework
0:25:50 that you can point to where we get the good stuff,
0:25:54 the drug discovery, the help with climate change,
0:25:57 but attempt to screen out or at least put in check
0:26:00 or put in some guardrails around the bad stuff.
0:26:03 What’s the, what are you advocating for?
0:26:06 – I think it starts with having an honest conversation
0:26:08 of where the problems come from.
0:26:11 So you have people who are absolutist on free speech,
0:26:14 which I happen to agree with,
0:26:17 but they confuse free speech of an individual
0:26:19 versus free speech for a computer.
0:26:23 I am strongly in favor of free speech for every human.
0:26:26 I am not in favor of free speech for computers.
0:26:30 And the algorithms are not necessarily optimizing
0:26:32 the best thing for humanity.
0:26:35 So as a general point, specifically,
0:26:37 we’re going to have to have some conversations
0:26:40 about what is, at what age are things appropriate?
0:26:42 And we’re also going to have to change some of the laws,
0:26:44 for example, section 230,
0:26:48 to allow for liability in the worst possible cases.
0:26:52 So when someone is harmed from this technology,
0:26:54 we need to have a solution to prevent further harm.
0:26:57 Every new invention has created harm.
0:26:58 Think about cars, right?
0:27:02 So cars used to hit everything and they were very unsafe.
0:27:04 Now cars are really quite safe.
0:27:08 Certainly by comparison to anything in history.
0:27:10 So the history of these inventions
0:27:13 is that you allow for the greatness
0:27:16 and you police the guard, technically the guardrails.
0:27:19 You put limits on what they can do.
0:27:21 And it’s an appropriate debate,
0:27:23 but it’s one that we have to have now for this technology.
0:27:26 I’m particularly concerned about the issue
0:27:28 that you mentioned earlier
0:27:31 about the effect of on human psyche.
0:27:34 Dr. Kissinger, who studied Kant,
0:27:37 was very concerned, and we write in the book at some length,
0:27:40 about what happens when your worldview
0:27:45 is taken over by a computer as opposed to your friends, right?
0:27:49 Isolated, the computer is feeding you stuff.
0:27:53 It’s not optimized around human values, good or bad.
0:27:55 God knows what it’s trying to do.
0:27:57 It’s trying to make money or something.
0:27:59 That’s not a good answer.
0:28:01 – So I think most reasonable people would say,
0:28:04 “Okay, some sort of fossil fuels are a net good.”
0:28:06 I would argue pesticides are a net good,
0:28:10 but we have emission standards and an FDA.
0:28:12 Most people would, I think, loosely agree
0:28:15 or mostly agree that some sort of regulation
0:28:18 that keeps these things in check makes sense.
0:28:20 Now, let’s talk about big tech,
0:28:22 which you were an instrumental player in.
0:28:24 You guys figured out a way, quite frankly,
0:28:27 to overrun Washington with lobbyists
0:28:30 and avoid all reasonable regulation.
0:28:31 Why are things gonna be different now
0:28:33 than what they were in your industry
0:28:35 when you were involved in it?
0:28:37 – Well, President Trump has indicated
0:28:41 that he is likely to repeal the executive order
0:28:43 that came out of President Biden,
0:28:45 which was an attempt at this.
0:28:49 So I think a fair prediction is that for the next four years,
0:28:51 there’ll be very little regulation in this area
0:28:54 as the president will be focused on the things.
0:28:57 So what will happen in those companies
0:29:00 is if there is real harm, there’s liability,
0:29:02 there’s lawsuits and things.
0:29:04 So the companies are not completely scot-free.
0:29:07 Our companies, remember, are economic agents
0:29:09 and they have lawyers whose jobs are to protect
0:29:12 their intellectual property and their goals.
0:29:14 So it’s gonna take, I’m sorry to say,
0:29:18 it’s likely to take some kind of a calamity
0:29:21 to cause a change in regulation.
0:29:23 And I remember when I was in California,
0:29:27 when I was younger, California driver’s licenses,
0:29:30 the address on your driver’s license was public
0:29:31 and there was a horrific crime
0:29:33 where a woman was followed to her home
0:29:36 and then she was murdered based on that information.
0:29:38 And then they changed the law.
0:29:42 And my reaction was, didn’t you foresee this, right?
0:29:46 You put millions and millions of license information
0:29:49 to the public and you don’t think that some idiot
0:29:51 who’s horrific is gonna harm somebody.
0:29:54 So my frustration is not that it will occur
0:29:55 because I’m sure it will,
0:29:58 but why did we not anticipate that as an example?
0:30:03 We should anticipate, make a list of the biggest harms.
0:30:05 I’ll give you another example.
0:30:08 These systems should not be allowed access to weapons.
0:30:10 Very simple.
0:30:14 You don’t want the AI deciding when to launch a missile.
0:30:17 You want the human to be responsible.
0:30:20 And these kinds of sensible regulations
0:30:22 are not complicated to state.
0:30:25 – Are you familiar with character AI?
0:30:26 – I am.
0:30:30 – Really, just a horrific incident
0:30:33 where a 14-year-old thinks he establishes a relationship
0:30:37 with an AI agent that he thinks is a character
0:30:38 from Game of Thrones.
0:30:40 He’s obviously unwell,
0:30:42 although he, my understanding is from his mother
0:30:45 who’s taken this on as an issue, understandably.
0:30:49 He did not qualify as someone who was mentally ill.
0:30:52 Establishes this very deep relationship
0:30:55 with obviously a very nuanced character.
0:30:59 And the net effect is he contemplates suicide
0:31:02 and she invites him to do that.
0:31:05 And the story does not end well.
0:31:07 And my view, Eric, is that if we’re waiting
0:31:09 for people’s critical thinking to show up
0:31:11 or for the better angels of CEOs of companies
0:31:13 that are there to make a profit,
0:31:14 that’s what they’re supposed to do.
0:31:15 They’re doing their job.
0:31:19 We’re just gonna have tragedy after tragedy after tragedy.
0:31:22 My sense is someone needs to go to jail.
0:31:23 And in order to do that,
0:31:26 we need to pass laws showing that if you’re reckless
0:31:29 with technology and we can reverse engineer it
0:31:31 to the death of a 14-year-old,
0:31:33 that you are criminally liable.
0:31:35 But I don’t see that happening.
0:31:37 So I would push back on the notion
0:31:39 that people need to think more critically.
0:31:40 That would be lovely.
0:31:42 I don’t see that happening.
0:31:45 I have no evidence that any CEO of a tech company
0:31:47 is gonna do anything but increase the value
0:31:49 of their shares, which I understand
0:31:52 and is a key component of capitalism.
0:31:54 It feels like we need laws
0:31:57 that either remove this liability shield.
0:31:58 I mean, does any of this change
0:32:01 until someone shows up in an orange jumpsuit?
0:32:04 – I can tell you how we dealt with this at Google.
0:32:07 We had a rule that in the morning we would look at things.
0:32:10 And if there was something that looked like real harm,
0:32:12 we would resolve it by noon.
0:32:15 And we would make the necessary adjustments.
0:32:19 The example that you gave is horrific,
0:32:21 but it’s all too common.
0:32:24 And it’s gonna get worse for the following reason.
0:32:26 So now imagine you have a two-year-old
0:32:28 and you have the equivalent of a bear
0:32:30 that is the two-year-old’s best friend.
0:32:32 And every year the bear gets smarter
0:32:34 and the two-year-old gets smarter too,
0:32:37 becomes three, four, five, and so forth.
0:32:40 That now 15-year-old’s best friend
0:32:43 will not be a boy or a girl of the same age.
0:32:45 It’ll be a digital device.
0:32:50 And such people highlighted in your terrible example
0:32:52 are highly suggestible.
0:32:55 So either the people who are building
0:32:58 the equivalent of that bear 10 years from now
0:33:01 are gonna be smart enough to never suggest harm,
0:33:05 or they’re gonna get regulated and criminalized.
0:33:06 Those are the choices.
0:33:09 The technology, I used to say that the internet
0:33:12 is really wonderful, but it’s full of misinformation
0:33:15 and there’s an off button for a reason, turn it off.
0:33:17 I can’t do that anymore.
0:33:20 The internet is so intertwined in our daily lives.
0:33:23 All of us, every one of us, for the good and bad,
0:33:25 that we can’t get out of the cesspool
0:33:27 if we think it’s a cesspool and we can’t make it better
0:33:29 ’cause it keeps coming at us.
0:33:32 The industry, to answer your question,
0:33:36 the industry is optimized to maximize your attention
0:33:37 and monetize it.
0:33:40 So that behavior is gonna continue.
0:33:43 The question is how do you manage the extreme cases?
0:33:46 Anything involving personal harm of the nature
0:33:49 that you’re describing will be regulated
0:33:50 one way or the other.
0:33:53 – Yeah, at some point, it’s just a damage
0:33:54 we incur until then, right?
0:33:58 We’ve had 40 congressional hearings on child safety
0:34:00 and social media and we’ve had zero laws.
0:34:05 – In fairness to that, there is a very, very extensive set
0:34:08 of laws around child sexual abuse,
0:34:10 which is obviously horrific as well.
0:34:14 And those laws are universally implemented
0:34:16 and well adhered to.
0:34:19 So we do have examples where everyone agrees
0:34:20 what the harm is.
0:34:22 I think all of us would agree that a suicide
0:34:25 of a teenager is not okay.
0:34:27 And so regulating the industry,
0:34:29 so it doesn’t generate that message,
0:34:31 strikes me as a brainer.
0:34:34 The ones which will be much harder are where
0:34:38 the system has essentially captured the emotions
0:34:41 of the person and is feeding them back to the person
0:34:43 as opposed to making suggestions.
0:34:46 And that’s, and we talk about this in the book,
0:34:48 when the system is shaping your thinking,
0:34:50 you are being shaped by a computer,
0:34:52 you’re not shaping it.
0:34:54 And because these systems are so powerful,
0:34:57 we worry and again, we talk about this in the book,
0:35:01 of the impact on the perception of truth and of society.
0:35:02 Who am I?
0:35:03 What do I do?
0:35:06 And ultimately, one of the risks here,
0:35:08 if we don’t get this under control,
0:35:11 is that we will be the dogs to the powerful AI
0:35:14 as opposed to us telling the AI what to do.
0:35:17 A simple answer to the question of when
0:35:20 is the industry believes that within five to 10 years,
0:35:22 these systems will be so powerful
0:35:25 that they might be able to do self-learning.
0:35:27 And this is a point where the system begins
0:35:29 to have its own actions, its own religion,
0:35:32 it’s called evolution, it’s called general intelligence,
0:35:34 AGI as it’s called.
0:35:37 And the arrival of AGI will need to be regulated.
0:35:39 We’ll be right back.
0:35:42 (upbeat music)
0:35:44 Support for PropG comes from Miro.
0:35:46 While a lot of CEOs believe that innovation
0:35:48 is the lifeblood of business,
0:35:50 very few of them actually see their team unlock
0:35:52 the creativity needed to innovate.
0:35:54 A lot of times that’s because once you’ve moved
0:35:56 from discovery and ideation of product development,
0:35:58 outdated process management tools,
0:35:59 context switching, team alignment
0:36:03 and constant updates massively slow, the process.
0:36:06 But now you can take a big step to solving these problems
0:36:09 with the innovation workspace from Miro.
0:36:11 Miro is a workspace where teams can work together
0:36:13 from initial stages of project or product design
0:36:16 all the way to designing and delivering the finished product.
0:36:18 Powered by AI, Miro can help teams increase the speed
0:36:20 of their work by generating AI-powered summaries,
0:36:22 product briefs and research insights
0:36:24 in the early stages of development.
0:36:28 Then move to prototypes, process flows and diagrams.
0:36:30 And once there, execute those tasks with timelines
0:36:33 and project trackers all in a single shared space.
0:36:36 Whether you work in product design, engineering, UX, agile
0:36:39 or marketing, bring your team together on Miro.
0:36:40 Your first three Miro boards are free
0:36:43 when you sign up today at Miro.com.
0:36:46 That’s three free boards at M-I-R-O.com.
0:36:54 (gentle music)
0:36:57 – Autograph collection hotels offer over 300
0:36:59 independent hotels around the world,
0:37:02 each exactly like nothing else.
0:37:04 Hand selected for their inherent craft,
0:37:07 each hotel tells its own unique story
0:37:10 through distinctive design and immersive experiences
0:37:13 from medieval falconry to volcanic wine tasting.
0:37:16 Autograph collection is part of the Marriott Bonvoy portfolio
0:37:20 of over 30 hotel brands around the world.
0:37:23 Find the unforgettable at autographcollection.com.
0:37:28 Support for PropG comes from Fundrise.
0:37:29 Artificial Intelligence is poised to be
0:37:31 one of the biggest wealth creation events in history.
0:37:34 Some experts expect AI to add more than $15 trillion
0:37:37 to the global economy by 2030.
0:37:38 Unfortunately, your portfolio
0:37:40 probably doesn’t own the biggest names in AI.
0:37:42 That’s because most of the AI revolution
0:37:45 is largely being built and funded in private markets.
0:37:47 That means the vast majority of AI startups
0:37:49 are going to be backed and owned by venture capitalists,
0:37:50 not public investors.
0:37:53 But with the launch of the Fundrise Innovation Fund last year,
0:37:55 you can get in on it now.
0:37:56 The Innovation Fund pairs a $100 million
0:37:59 plus venture portfolio of some of the biggest names in AI
0:38:01 with one of the lowest investment minimums
0:38:03 the venture industry has ever seen.
0:38:06 Get in early at fundrise.com/propg.
0:38:09 Carefully consider the investment material before investing,
0:38:11 including objectives, risks, charges, and expenses.
0:38:13 This and other information can be found
0:38:15 at the Innovation Fund’s prospectus
0:38:17 at fundrise.com/innovation.
0:38:20 This is a paid advertisement.
0:38:29 – We know that social media
0:38:31 and a lot of these platforms and apps
0:38:34 and time on the phone is not a good idea.
0:38:36 I’m curious what you think of my colleague’s work,
0:38:37 Jonathan Haidt, and that is,
0:38:40 is there any reason for anyone under the age of 14
0:38:41 to have a smartphone?
0:38:44 And is there any reason for anyone under the age of 16
0:38:45 to be on social media?
0:38:48 Shouldn’t we age-gate pornography, alcohol, the military?
0:38:52 Shouldn’t we, specifically the device makers
0:38:55 and the operating systems, including your old firm,
0:38:58 shouldn’t they get in the business of age-gating?
0:38:59 – They should.
0:39:02 Indeed, Jonathan’s work is incredible.
0:39:05 He and I wrote an article together two years ago,
0:39:07 which called for a number of things
0:39:09 in the area of regulating social media.
0:39:12 And we start with changing a law called COPPA
0:39:15 from 13 to 16.
0:39:17 And we are quite convinced
0:39:19 that using various techniques,
0:39:21 we can determine the age of the person
0:39:23 with a little bit of work.
0:39:25 And so people say, well, you can’t implement it.
0:39:27 Well, that doesn’t mean you shouldn’t try.
0:39:30 And so we believe that at least the pernicious effects
0:39:34 of this technology on those below 16 can be addressed.
0:39:36 When I think about all of this,
0:39:39 to me, we want children to be able to grow up
0:39:42 and grow up with humans as friends.
0:39:46 And I’m sure with the arrival of powerful AI,
0:39:48 that you’re gonna see a lot of regulation
0:39:51 about child content.
0:39:54 What can a child below 16 see?
0:39:56 This does not answer the question of what do you do
0:39:58 with the 20 year old, right?
0:40:00 Who is also still being shaped.
0:40:03 And as we know, men develop a little bit later than women.
0:40:05 And so let’s focus on the underdeveloped man
0:40:08 who’s having trouble in college or what have you.
0:40:09 What do we do with them?
0:40:11 And that question remains open.
0:40:16 – In terms of the idea that the genie is out of the bottle
0:40:19 here, and we face a very real issue or tension.
0:40:22 And that is we wanna regulate it.
0:40:23 We wanna put in guardrails.
0:40:28 At the same time, we wanna let our sprinters and our IP
0:40:29 and our minds and our universities
0:40:31 and our incredible for profit machine,
0:40:34 we wanna let it run, right?
0:40:37 And the fear is that if you regulate it too much,
0:40:41 the Chinese or the Islamic Republic
0:40:43 isn’t quite as concerned
0:40:46 and gets ahead of us on this technology.
0:40:48 How do you balance that tension?
0:40:51 – So there are quite a few people in the industry,
0:40:53 along with myself who are working on this.
0:40:58 And the general idea is relatively light regulation
0:41:01 looking for the extreme cases.
0:41:03 So the worst of the extreme events
0:41:06 would be a biological attack, a cyber attack,
0:41:08 something that harmed a lot of people
0:41:10 as opposed to a single individual,
0:41:11 which is always a tragedy.
0:41:14 Any misuse of these in war,
0:41:17 any of those kinds of things we worry a lot about.
0:41:19 And there’s a lot of questions here.
0:41:24 One of them is, do you think that if we had an AGI system
0:41:30 that developed a way to kill all of the soldiers
0:41:33 from the opposition in one day that it would be used?
0:41:36 And I think the answer from a military general perspective
0:41:37 would be yes.
0:41:40 The next question is, do you think that the North Koreans,
0:41:44 for example, or the Chinese would obey the same rules
0:41:45 about when to apply that?
0:41:47 And the answer is no one believes
0:41:50 that they would do it safely and carefully
0:41:52 under the way the US law would require.
0:41:56 The US has a legal requirement called person in the loop,
0:41:58 or meaningful human control,
0:42:01 that tries to keep these things from getting out of hand.
0:42:06 So what I actually think is that we don’t have a theory
0:42:09 of deterrence with these new tools.
0:42:13 We don’t know how to deal with the spread of them.
0:42:16 And the simple example,
0:42:17 and sorry for the diversion for a sec,
0:42:19 but there’s closed source and open source.
0:42:22 Closed is like you can use it,
0:42:25 but the software and the numbers are not available.
0:42:27 There are other systems called open source
0:42:29 where everything is published.
0:42:32 China now has two of what appear to be
0:42:35 the most powerful models ever made
0:42:37 and they’re completely open.
0:42:39 And we’re obviously, you and I are not in China
0:42:42 and I don’t know why China made a decision to release them,
0:42:45 but surely evil groups and so forth
0:42:46 will start to use those.
0:42:49 Now maybe they don’t speak Chinese or what have you,
0:42:52 or maybe the Chinese just discount the risk,
0:42:55 but there’s a real risk of proliferation of systems
0:42:57 in the hands of terrorism.
0:43:00 And proliferation is not gonna occur
0:43:03 by misusing Microsoft or Google or what have you.
0:43:05 It’s going to be by making their own servers
0:43:06 in the dark web.
0:43:08 And an example, a worry that we all have
0:43:10 is exfiltration of the models.
0:43:14 I’ll give an example, Google or Microsoft or OpenAI
0:43:16 spends $200 million or something
0:43:18 to build one of these models, they’re very powerful.
0:43:22 And then some evil actor manages to exfiltrate it
0:43:25 out of those companies and put it on the dark web.
0:43:29 We have no theory of what to do when that occurs.
0:43:31 Because we don’t control the dark web,
0:43:34 we don’t know how to detect it and so forth.
0:43:38 In the book we talk about this and say that eventually
0:43:40 the network systems globally will have
0:43:43 fairly sophisticated supervision systems
0:43:45 that will watch for this.
0:43:47 Because it’s another example of proliferation.
0:43:51 It’s analogous to the spread of enriched uranium.
0:43:53 If anyone tried to do that, there’s an awful lot
0:43:55 of monitoring systems that would say
0:43:58 you have to stop right now or we’re gonna shoot you.
0:43:59 – So you make a really cogent argument
0:44:02 for the kind of existential threat here,
0:44:05 the weaponization of AI by bad actors.
0:44:07 And we have faced similar issues before.
0:44:09 My understanding is there are multilateral treaties
0:44:13 around bioweapons or we have nuclear arms treaties.
0:44:16 So is this the point in time where people such as yourself
0:44:21 and our defense infrastructure should be thinking about
0:44:25 or trying to figure out multilateral agreements?
0:44:27 And again, the hard part there is my understanding
0:44:30 is it’s very hard to monitor things like this.
0:44:33 And should we have something along the lines of Interpol
0:44:36 that’s basically policing this and then fighting fire
0:44:41 with AI to go out and find scenarios
0:44:43 where things look very ugly and move in
0:44:44 with some sort of international force?
0:44:46 It feels like the time for some sort
0:44:51 of multinational cooperation is upon us. Your thoughts?
0:44:51 – We agree with you.
0:44:54 And in the book, we specifically talk about this
0:44:58 in a historical context of the nuclear weapons regime,
0:45:01 which Dr. Kissinger, as you know, invented largely.
0:45:04 What’s interesting is working with him,
0:45:08 you realize how long it took for the full solution to occur.
0:45:11 America used the bomb in 1945.
0:45:15 Russia, or the Soviet Union, demonstrated theirs in 1949.
0:45:17 So that was roughly a four-year gap.
0:45:20 And then there was sort of a real arms race.
0:45:23 And after that it took roughly 15 years
0:45:27 for an agreement to come for limitations on these things,
0:45:30 during which time we were busy making an enormous number
0:45:33 of weapons, which ultimately were a mistake,
0:45:35 including, you know, these enormous bombs
0:45:37 that were unnecessary.
0:45:40 And so things got out of hand.
0:45:44 In our case, I think what you’re saying is very important
0:45:47 that we start now, and here’s where I would start.
0:45:50 I would start with a treaty that says,
0:45:53 we’re not going to allow anyone who’s a signatory
0:45:56 to the treaty to have automatic weapon systems.
0:46:00 And by automatic weapons, I don’t mean automated.
0:46:03 I mean, ones that make the decision on their own.
0:46:07 So an agreement that any use of AI, of any kind
0:46:11 in a conflict sense, has to be owned and authorized
0:46:15 by a human being who is authorized to make that decision.
0:46:17 That would be a simple example.
0:46:20 Another thing that you could do as part of that
0:46:23 is say that you have a duty to inform
0:46:26 when you’re testing one of these systems
0:46:28 in case it gets out of hand.
0:46:31 Now, whether these treaties can be agreed to,
0:46:34 I don’t know, remember that it was the horror
0:46:37 of nuclear war that got people to the table
0:46:39 and it still took 15 years.
0:46:43 I don’t want us to go through an analogous bad incident
0:46:45 involving an evil actor in North Korea.
0:46:48 Again, I’m just using them as bad examples
0:46:50 or even Russia today,
0:46:52 whom we obviously don’t trust.
0:46:54 I don’t want to run that experiment and have all that harm
0:46:58 and then say, hey, we should have foreseen this.
0:47:01 – Well, my sense is when we have the better technology,
0:47:03 we’re not in a hurry for a multilateral treaty, right?
0:47:06 When we’re under the impression that our nuclear scientists
0:47:07 are better than theirs, remember,
0:47:09 our Nazis are smarter than their Nazis kind of thing,
0:47:11 we’re like, we don’t want a multilateral treaty
0:47:13 ’cause we see advantage.
0:47:15 And curious if you agree with this,
0:47:20 we have better AIs than anyone else.
0:47:21 Does that get in the way of a treaty
0:47:23 or should we be doing this from a position of strength?
0:47:25 And also, if there’s a number two,
0:47:27 and maybe you think we’re not the number one,
0:47:30 but assuming you think that the US is number one in this,
0:47:31 who is the number two?
0:47:33 Who do you think poses the biggest threat?
0:47:36 Is it their technology or their intentions or both?
0:47:39 If you were to hear that one of these
0:47:41 really awful things took place,
0:47:43 who would you think most likely
0:47:45 are the most likely actors behind it?
0:47:46 Is it a rogue state?
0:47:47 Is it a terrorist group?
0:47:49 Is it a nation state?
0:47:51 – In the first place, I think that the short-term threats
0:47:54 are from rogue states and from terrorism.
0:47:56 And because as we know, there’s plenty of groups
0:48:01 that seek harm against the elites in any country.
0:48:04 Today, the competitive environment is very clear:
0:48:08 the US, with the UK as a partner, leads. I’ll give you an example.
0:48:12 This week, there were two libraries from China
0:48:14 that were released, open source.
0:48:17 One is a problem solver that’s very powerful
0:48:21 and another one is a large language model that’s equal.
0:48:24 And in some cases, it exceeds the one from Meta
0:48:28 which they use every day; it’s called Llama 3, 400 billion.
0:48:32 I was shocked when I read this ’cause I had assumed
0:48:35 that, in my conversations with the Chinese,
0:48:39 they were two to three years behind.
0:48:41 It looks to me like it’s within a year now.
0:48:43 So it’d be fair to say it’s the US
0:48:46 and then China within a year’s time.
0:48:49 Everyone else is well behind.
0:48:51 Now, I’m not suggesting that China
0:48:54 will launch a rogue attack against us in an American city.
0:48:57 I am alleging that it’s possible
0:49:01 that a third party could steal from China
0:49:03 ’cause it’s open source or from the US
0:49:05 if they’re malevolent and do that.
0:49:09 So the threat escalation matrix goes up
0:49:11 with every improvement.
0:49:14 Today, the primary use of these tools
0:49:18 is to sow misinformation, which is what you talked about.
0:49:21 But remember that there’s a transition to agents
0:49:23 and the agents do things.
0:49:26 So it’s a travel agent or it’s whatever.
0:49:29 And the agents speak English, you give them English
0:49:33 and they respond in English so you can concatenate them.
0:49:36 You can literally put agent one talks to agent two,
0:49:39 talks to agent three, talks to agent four.
0:49:43 And there’s a scheduler that makes them all work together.
0:49:46 And so for example, you could say to these agents,
0:49:49 design me the most beautiful building in the world,
0:49:53 go ahead and file all the permits,
0:49:55 negotiate the fees of the builders
0:49:57 and tell me how much it’s gonna cost
0:50:00 and tell my accountant that I need that amount of money.
0:50:01 That’s the command.
0:50:03 So think about that.
0:50:06 Think about the agency, the ability to put together
0:50:09 an integrated solution that today takes 100 people
0:50:13 who are very talented, and you can do it with one command.
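To make the agent-chaining idea above concrete, here is a minimal sketch in Python, assuming only that each agent is something that takes plain-English text in and returns plain-English text out, and that a simple scheduler feeds one agent's output to the next. The agent functions and names are hypothetical placeholders for illustration, not any real product's API.

```python
# Minimal sketch of chaining agents that communicate in plain English.
# Each "agent" below is a hypothetical placeholder (a real one would call a model);
# the scheduler simply pipes one agent's English output into the next agent.

from typing import Callable, List

Agent = Callable[[str], str]  # English text in -> English text out

def design_agent(request: str) -> str:
    # Placeholder: would produce a design brief for the request.
    return f"Design brief for: {request}"

def permits_agent(design: str) -> str:
    # Placeholder: would prepare the permit filings for the design.
    return f"Permit filings prepared for: {design}"

def negotiation_agent(permits: str) -> str:
    # Placeholder: would negotiate builder fees given the filings.
    return f"Builder fees negotiated given: {permits}"

def budget_agent(plan: str) -> str:
    # Placeholder: would estimate total cost and notify the accountant.
    return f"Cost estimate and accountant note for: {plan}"

def scheduler(agents: List[Agent], command: str) -> str:
    """Run the agents in order, feeding each one's output to the next."""
    message = command
    for agent in agents:
        message = agent(message)
    return message

if __name__ == "__main__":
    pipeline = [design_agent, permits_agent, negotiation_agent, budget_agent]
    print(scheduler(pipeline, "Design me the most beautiful building in the world"))
```

The point of the sketch is only the shape: because every step speaks the same medium, plain English, agents can be composed like a pipeline, which is why one command can stand in for work that used to take many people.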
0:50:17 So that acceleration of power could also be misused.
0:50:19 I’ll give you another example.
0:50:21 You were talking earlier about the impact on social media.
0:50:26 I saw a demonstration in England, in fact.
0:50:31 The first command was build a profile of a woman who’s 25,
0:50:36 she has two kids and she has the following strange beliefs.
0:50:40 And the system wrote the code and created a fake persona
0:50:44 that existed on that particular social media platform.
0:50:47 Then the next command was take that person
0:50:51 and modify that person to every possible stereotype,
0:50:55 every race, sex, so forth and so on, age, demographic thing
0:50:58 with similar views and populate that
0:51:02 and 10,000 people popped up just like that.
0:51:06 So if you wanted, for example, today, this is true today,
0:51:07 if you wanted to create a community
0:51:10 of 10,000 fake influencers to say, for example,
0:51:12 that smoking doesn’t cause cancer,
0:51:15 which as we know is not true, you could do it.
0:51:17 And one person with a PC can do this.
0:51:21 Imagine when the AI’s are far more powerful
0:51:22 than they are today.
0:51:26 – So one of the things that Dr. Kissinger was known for
0:51:27 and quite frankly I appreciate
0:51:29 was this notion of real politic.
0:51:31 Obviously we have aspirations around
0:51:33 the way the world should be,
0:51:34 but as it relates to decision-making,
0:51:37 we’re also gonna be very cognizant of the way the world is
0:51:41 and make some, I mean, he’s credited with a lot of very
0:51:43 controversial/difficult decisions
0:51:46 depending on how you look at it.
0:51:49 What I’m hearing you say leads,
0:51:51 all these roads lead to one place
0:51:54 in my kind of quote unquote critical thinking
0:51:56 or lack thereof, brain, and that is,
0:52:00 there’s a lot of incentive to kiss and make up with China
0:52:02 and partner around this stuff.
0:52:06 That if China and the US came to an agreement
0:52:08 around what they were gonna do or not do
0:52:11 and bilaterally created a security force
0:52:15 and agreed not to sponsor proxy agents against the West
0:52:18 or each other that we’d have a lot,
0:52:20 that would be a lot of progress.
0:52:22 That might be 50, 60, 80% of the whole shooting match
0:52:24 if the two of us could say,
0:52:27 we’re gonna figure out a way to trust each other
0:52:29 on this issue and we’re gonna fight the bad guys
0:52:31 together on this stuff.
0:52:32 Your thoughts?
0:52:34 – So Dr. Kissinger of course was the world’s expert
0:52:36 in China, he opened up China,
0:52:38 which is one of his greatest achievements.
0:52:42 But he was also a proud American
0:52:46 and he understood that China could go one way or the other.
0:52:49 His view on China,
0:52:50 and he wrote a whole book on this,
0:52:52 was that China wanted to be the Middle Kingdom,
0:52:54 as in their history,
0:52:57 where they’d sort of dominated all the other countries,
0:52:58 but it’s not like America.
0:53:02 His view was they wanted to make sure the other countries
0:53:04 would show fealty to China,
0:53:07 in other words, do what they wanted.
0:53:10 And occasionally, if they didn’t do something,
0:53:12 China would then extract some payment,
0:53:14 such as invading the country.
0:53:16 That’s roughly what Henry would say.
0:53:20 So he was very much a realist about China as well.
0:53:24 His view would be at odds today
0:53:27 with Trump’s view and the US government’s.
0:53:30 The US government is completely organized today
0:53:35 around decoupling, that is literally separating.
0:53:38 And his view, which I can report accurately
0:53:39 ’cause I went to China with him,
0:53:43 was that we’re never going to be great friends,
0:53:46 but we have to learn how to coexist.
0:53:51 And that means detailed discussions on every issue
0:53:54 at great length to make sure
0:53:57 that we don’t alarm each other or frighten each other.
0:54:01 His further concern was not that President Xi
0:54:04 would wake up tomorrow and invade Taiwan,
0:54:06 but that you would start with an accident
0:54:08 and then there would be an escalatory ladder.
0:54:11 And that because the emotions on both sides,
0:54:14 you would end up just like in World War I,
0:54:17 which started with a shooting in Sarajevo,
0:54:20 that ultimately people found in a few months
0:54:21 that they were in a world war
0:54:23 that they did not want and did not expect.
0:54:26 And once you’re in the war, you have to fight.
0:54:30 So the concern with China would be roughly that
0:54:35 we are codependent and we’re not best friends.
0:54:40 Being codependent is probably better
0:54:44 than being completely independent, that is, non-dependent,
0:54:46 because it forces some level of understanding
0:54:47 and communication.
0:54:50 – Eric Schmidt is a technologist,
0:54:51 entrepreneur and philanthropist.
0:54:55 In 2021, he founded the Special Competitive Studies Project,
0:54:57 a non-profit initiative to strengthen America’s
0:55:00 long-term competitiveness in AI
0:55:01 and technology more broadly.
0:55:03 Before that, Eric served as Google’s
0:55:05 chief executive officer and chairman,
0:55:08 and later as executive chairman and technical advisor.
0:55:10 He joins us from Boston. Eric,
0:55:12 in addition to your intelligence,
0:55:14 I get the sense your heart’s in the right place
0:55:17 and you’re using your human and financial capital
0:55:18 to try and make the world a better place.
0:55:20 Really appreciate you and your work.
0:55:26 (upbeat music)
0:55:29 (upbeat music)
0:55:32 – Time for a little algebra of happiness.
0:55:36 I’m at this gathering called Summit
0:55:39 and I’ve been struck by how many people are successful
0:55:41 or at least the appearance of being successful.
0:55:44 So sure, I know, the rich kids, but they do seem to be,
0:55:48 I don’t know, economically secure or overeducated.
0:55:50 Interesting, some of them started and sold businesses,
0:55:53 but what I see is a lot of people searching
0:55:54 and they’ll say shit like,
0:55:56 well, I’m just taking a year to really focus
0:55:57 on improving my sleep.
0:56:02 Okay, no, sleep is supposed to be part of your arsenal.
0:56:03 It’s not why you’re fighting this war.
0:56:05 You need good sleep, but I don’t think
0:56:07 you should take a year to focus on it.
0:56:09 Anyways, does that sound boomer of me?
0:56:11 But this notion of finding a purpose
0:56:12 and what I have found is,
0:56:14 and this is probably one of the accoutrements
0:56:18 of a prosperous society, is ask yourself,
0:56:20 do you have the wrong amount of money?
0:56:21 Do you have just the wrong amount of money?
0:56:22 What do I mean by that?
0:56:24 Obviously, the worst amount of money is not enough,
0:56:26 but a lot of my friends and a lot of people,
0:56:28 I think at this summit,
0:56:30 suffer from just having the wrong amount of money.
0:56:31 What do I mean by that?
0:56:32 They have enough money
0:56:34 so they don’t have to do something right away,
0:56:36 but they don’t have enough money to retire
0:56:39 or go into philanthropy or really pursue something creative
0:56:41 and not make money.
0:56:42 That’s exactly the wrong amount of money.
0:56:45 And I would say a good 50% of my friends
0:56:47 who kind of hit a wall, got stuck,
0:56:49 experienced their first failure,
0:56:53 sit around and wait for the perfect thing
0:56:55 and wake up one, two, three years later
0:56:57 and really don’t have a professional purpose
0:56:59 or a professional source of gravity.
0:57:03 And you know, the kind of basic stuff, right?
0:57:05 Do something in the agency of others,
0:57:08 be in service to others, but more than anything,
0:57:11 I think the call sign is just now.
0:57:15 And that is, don’t let perfect be the enemy of good
0:57:18 and give yourself a certain amount of time to find something.
0:57:22 And within that amount of time, when it elapses,
0:57:24 take the best thing that you have.
0:57:26 And it might not be the,
0:57:28 it might not foot to the expectations
0:57:29 that you have for yourself
0:57:32 or be really exciting or dramatic or really lucrative.
0:57:33 But the thing about working
0:57:35 is it leads to other opportunities.
0:57:37 And what I see is a lot of people
0:57:39 who kind of are cast into the wilderness
0:57:41 and then come out of the wilderness with no fucking skills.
0:57:43 And that is, you’ll be surprised
0:57:47 how much your Rolodex and your skills atrophy.
0:57:47 And so what is the key?
0:57:49 Do you want to write a book?
0:57:50 Do you want to start a podcast?
0:57:52 Do you want to try and raise a fund?
0:57:53 Do you want to start a company?
0:57:54 What is the key?
0:57:56 What is the critical success factor?
0:57:57 Is it finding the right people?
0:57:58 Is it finding capital?
0:57:59 Is it thinking through?
0:58:01 Is it positioning the concept?
0:58:02 Is it doing more research?
0:58:05 No, the key is now.
0:58:06 You want to write a book,
0:58:09 open your fucking laptop and start writing.
0:58:10 And it’s going to be shit.
0:58:11 But then when you go back and edit it,
0:58:12 it’ll be less shitty.
0:58:14 And then if you find someone to help you review it
0:58:15 and you find some people,
0:58:18 it’ll get dramatically even less shittier.
0:58:20 All right, you want to start a business?
0:58:20 Nobody knows.
0:58:22 The only way you have a successful business
0:58:24 is you start a bad one and you start iterating.
0:58:26 But here’s the key, starting.
0:58:27 You want to be in a nonprofit.
0:58:29 You want to start helping other people.
0:58:32 Well, start with one person and see if, in fact,
0:58:34 your infrastructure, your skills, your expertise,
0:58:37 tangibly change the community, the environment,
0:58:38 or their life.
0:58:39 What is key to all of this?
0:58:42 Three letters, first N, second O, third W.
0:58:44 I have so many people I run across
0:58:47 who are searching, not because they’re not talented,
0:58:49 not because there’s not opportunity,
0:58:51 but they’re thinking they’re going to find the perfect thing.
0:58:56 No, find the best thing that is now and get started.
0:58:59 (upbeat music)
0:59:01 This episode was produced by Jennifer Sanchez
0:59:02 and Caroline Shagren.
0:59:04 Drew Burroughs is our technical director.
0:59:05 Thank you for listening to the Prop G Pod
0:59:07 from the Vox Media Podcast Network.
0:59:08 We will catch you on Saturday
0:59:11 for No Mercy, No Malice as read by George Hahn.
0:59:13 And please follow our Prop G Markets pod
0:59:15 wherever you get your pods for new episodes
0:59:16 every Monday and Thursday.
0:59:21 – Do you feel like your leads never lead anywhere?
0:59:24 And you’re making content that no one sees
0:59:27 and it takes forever to build a campaign?
0:59:29 Well, that’s why we built HubSpot.
0:59:31 It’s an AI-powered customer platform
0:59:33 that builds campaigns for you,
0:59:35 tells you which leads are worth knowing,
0:59:38 and makes writing blogs, creating videos,
0:59:40 and posting on social a breeze.
0:59:43 So now, it’s easier than ever to be a marketer.
0:59:46 Get started at hubspot.com/marketers.
Eric Schmidt, a technologist, entrepreneur, philanthropist, and Google’s former CEO, joins Scott to discuss the dangers and opportunities AI presents and his latest book, Genesis: Artificial Intelligence, Hope, and the Human Spirit.
Follow Eric, @ericschmidt.
Scott opens with his thoughts on Netflix’s bet on live sports.
Algebra of happiness: don’t let perfect be the enemy of good.
Subscribe to No Mercy / No Malice
Buy “The Algebra of Wealth,” out now.