AI transcript
0:00:11 Does AI so ruthlessly optimize for the former, what you will pay attention to,
0:00:15 that it totally alienates you from the latter, what you want to pay attention to?
0:00:18 Acquisition and retention are different things, right?
0:00:24 It’s amazing how sort of bad, schlocky, and ineffective most internet advertising is.
0:00:28 I would literally bet large portions of my net worth in time, that’s the future.
0:00:29 That’s going to be the gig.
0:00:31 AI is going to upstream everything about the consumer experience.
0:00:36 We’re in the midst of an epochal transformation in how we get energy.
0:00:39 And it’s just not that sexy.
0:00:43 We’re not all going to be famous for 15 minutes, we’re all going to be famous to 15 people.
0:00:44 15 people, yeah.
0:00:51 In a world where everyone is famous to 15 people, what happens when AI starts generating content for all of them?
0:00:56 In this episode, I’m joined by author and MSNBC host, Chris Hayes,
0:01:02 and longtime ad tech operator and writer, Antonio Garcia Martinez, to talk about the shifting economics of attention.
0:01:06 How we got here, what’s breaking, and what AI might make worse.
0:01:12 Chris’s new book, The Sirens’ Call, is about the way our attention has been bought, sold, and overwhelmed.
0:01:17 Together, we explore the rise of AI slop, whether platforms can contain it,
0:01:22 and what comes next as digital content, fame, and identity get increasingly fragmented.
0:01:23 Let’s get into it.
0:01:30 As a reminder, the content here is for informational purposes only.
0:01:33 Should not be taken as legal, business, tax, or investment advice,
0:01:35 or be used to evaluate any investment or security,
0:01:40 and is not directed at any investors or potential investors in any A16Z fund.
0:01:45 Please note that A16Z and its affiliates may also maintain investments in the companies discussed in this podcast.
0:01:48 For more details, including a link to our investments,
0:01:53 please see a16z.com/disclosures.
0:01:58 So excited to have you on, Chris.
0:02:02 Antonio is a longtime friend and collaborator of mine.
0:02:02 We have a show.
0:02:04 Antonio, why don’t you briefly introduce yourself?
0:02:08 Yeah, I started in ad tech, I guess, in 2008, in attention capitalism, as Chris would call it,
0:02:10 and worked in a number of startups.
0:02:15 I was an early member of the Facebook ads team back before the IPO when the ad system was horrible.
0:02:18 For those old enough to remember, I suspect Chris is probably in the same bucket.
0:02:21 Facebook ads used to be these little postage stamps on the right-hand side,
0:02:24 and then they became this huge money machine.
0:02:27 Wrote a book about that time called Chaos Monkeys that came out in 2016.
0:02:28 I’ve read it.
0:02:28 I’ve read it as well.
0:02:29 Yeah, yeah.
0:02:30 I quote it in the book.
0:02:32 I was a media figure writer guy for a while.
0:02:33 I don’t know how you do it, Chris.
0:02:34 It drove me basically crazy.
0:02:35 And so I went back to tech.
0:02:37 And I’m head of ads.
0:02:44 Antonio famously says the prize for beating Matt Yglesias in the Substack rankings is you
0:02:46 then become Matt Yglesias, and that wasn’t inspiring enough.
0:02:54 I’m not anti-Matt, but we were briefly among the top 10 or 15 Substackers on a revenue basis
0:02:58 circa like 2021, and he was definitely a comp, I guess.
0:03:00 Him and Andrew Sullivan were definitely the winners of that race.
0:03:03 I feel like I would probably have more to learn from you guys than you have from me,
0:03:06 so I’m happy to talk and maybe I’ll ask you some questions.
0:03:10 I sort of think like the whole thing is about to collapse.
0:03:12 I think there’s just a huge pollution problem.
0:03:18 And I think that spam, as I treat it in the book, is the sort of signature pollution of attention
0:03:20 capitalism, of the attention age.
0:03:28 If it’s lucrative to aggregate lots of attention, where attention pools and collects, there will
0:03:28 be money.
0:03:36 And to the extent that you can cheaply automate the extraction of that attention, even if it’s
0:03:41 done in a sort of like brute force wasteful way, there’s going to be spam, right?
0:03:46 So it’s happened on junk mail, it’s happened on phone calls, it’s happened on text, it’s
0:03:47 happened on email.
0:03:54 But I think the idea of AI content creation now puts that at scale for all the social media
0:03:55 platforms.
0:04:02 And I guess the question becomes like, is the AI-generated social media content actually
0:04:07 compelling enough that it doesn’t feel like spam and it just dislocates all human creators?
0:04:09 Do people not like it?
0:04:11 So the algorithm stops serving it up?
0:04:16 Or does it overwhelm the experience in a way that starts to feel like the way that our inboxes
0:04:17 feel?
0:04:18 Is AI slop going to win?
0:04:21 Yeah, that’s one way of saying like, is AI slop going to win?
0:04:22 I mean, yes and no.
0:04:24 I mean, you’re seeing it in the For You tab on Twitter already, right?
0:04:28 And maybe I would think this because I was a creator paid for my creations, I guess, to some
0:04:28 degree.
0:04:30 But I think a lot of it is empty and sterile.
0:04:35 And I think humans who are able to be creative and use AI tools correctly will have superpowers.
0:04:39 Yeah, I mean, I’m pretty obsessed right now with AI slop and particularly like these sort
0:04:44 of sub genres that feel like what’s funny about a lot of AI is that it doesn’t actually
0:04:45 feel that sophisticated.
0:04:48 It just feels like reverse engineered in a way that you can trace.
0:04:50 I mean, not all of it, right?
0:04:51 But some of it does.
0:04:53 There are moments we all have with this technology where you’re like, oh, my God.
0:04:59 But like, for instance, it’s clear that like religious themes and babies like do well, right?
0:05:05 And so there’s this whole like universe of AI babies singing, bless the Lord, oh, my soul.
0:05:09 And there was one I saw yesterday that’s like Jesus with a baby.
0:05:11 They’re both singing, bless the Lord, oh, my soul.
0:05:16 And Jesus is holding a disembodied foot, which is just like signature AI slop.
0:05:22 To me, the sort of question about all this stuff is one of the theses of the book is that there’s
0:05:27 this kind of disconnect between what we will pay attention to and in some volitional sense what
0:05:28 we want to pay attention to.
0:05:35 And the question is, does AI from the generative perspective and AI from the machine learning
0:05:41 algorithmic perspective so ruthlessly optimize for the former, what you will pay attention to,
0:05:46 that it totally alienates you from the latter, what you want to pay attention to?
0:05:51 And if it does do that, do people stick with it or do they reach some point where they feel
0:05:53 too alienated from what they want to be spending their time doing?
0:05:57 Well, I think the challenge there is how you put it, Chris, which, by the way, I think it’s
0:05:58 time to just plug Chris’s book.
0:06:00 I think you wrote a phenomenal book, actually.
0:06:00 Well, thanks.
0:06:03 And I would recommend it to all my tech people.
0:06:07 And one of the things you cite is, to use the marketing speak, acquisition and retention
0:06:08 are different things, right?
0:06:10 It’s one thing to acquire attention.
0:06:13 I think you cite the example of someone going into a room and firing a gun in the air.
0:06:15 Well, you’ve acquired attention, but then how do you maintain it?
0:06:18 And even with the guy with a gun in his hand, it might be a little hard to maintain it,
0:06:18 right?
0:06:19 And that’s the challenge.
0:06:22 And if you’re a sophisticated marketer, you realize it’s not just about acquisition.
0:06:23 It’s about the retention side of it.
0:06:24 I don’t know.
0:06:28 A positive side of me thinks humans aren’t going to sit there and stare at the slop all
0:06:28 day.
0:06:32 That said, I’ve used TikTok like a meth head, like getting into a fenty fold on the
0:06:33 street.
0:06:36 Like I emerged two hours later, I come to, and what the hell just happened?
0:06:39 And then I delete the app from my phone because I never want to do that again.
0:06:39 Yeah.
0:06:41 So that’s maybe the counter argument.
0:06:43 Well, first of all, thanks for the kind words.
0:06:47 I appreciate it, especially because I’m not a practitioner in ad tech and so much of the
0:06:48 story is sort of an ad tech story.
0:06:53 I mean, I think deeper than ad tech, ad tech is sort of epiphenomenal to the story.
0:06:56 So it’s really gratifying to hear that from someone who’s worked in it.
0:06:58 And yeah, like I’ve had the exact same experience.
0:07:04 Again, the sort of defining metaphor of the book, The Sirens’ Call: Odysseus on the
0:07:09 mast, sort of trying to avoid the sirens with your sort of volitional conscious self against
0:07:11 the kind of lulling that happens.
0:07:11 Right.
0:07:16 And all of us are in that position. But yeah, I think I have the same
0:07:25 faith as you that fundamentally there’s something irreducibly human and that people are going
0:07:25 to just rebel.
0:07:29 Basically, we’ll rebel against slop, which I think brings us back to this first question,
0:07:32 which I do think is really interesting, which is if the slop overruns it, what does that mean
0:07:33 for the platforms?
0:07:34 Right.
0:07:39 And is it something that they have to start actively managing in the same way that at scale
0:07:44 somewhere like Twitter, right, they started to have to deal with a bunch of really difficult
0:07:51 content moderation questions that were tough calls where they’re trying to create a user
0:07:54 experience that is not going to repel people and is going to attract advertisers.
0:07:59 And I do wonder if AI generated content becomes a set of questions like that for TikTok and
0:08:01 for reels and stuff like that at a certain point.
0:08:03 We’ve had a slop problem before AI, right?
0:08:08 I mentioned LinkedIn earlier in terms of how fake that network feels, but also on Instagram,
0:08:08 right?
0:08:11 Like people have Finstas, fake Instagrams for a reason.
0:08:15 So this idea of who you present yourself as and who you actually are, what you actually think
0:08:17 has been a problem with social networks for a while.
0:08:25 And it’s not obvious to me that AI will make the problem worse in terms of accelerating or
0:08:32 emphasizing either fake stuff or stuff that people don’t want to be engaging with.
0:08:36 Why are you so convinced that it will expand that gap between what we want to be engaging
0:08:37 with and what we actually engage with?
0:08:40 I just think to me, it’s sort of a quantity and automation question, right?
0:08:45 Like one of the things about online success is that it’s not a batting average.
0:08:46 It’s like your total number of hits.
0:08:49 I mean, it’s a batting average if you’re like trying to make a living, right?
0:08:50 And you only have so much time.
0:08:51 But Mr. Beast talks about this.
0:08:52 Other people talk about it too.
0:08:56 Like always be posting and see what works.
0:09:00 And sometimes you could post the same things 15 times and one time it takes off and other
0:09:01 times it doesn’t.
0:09:03 If you’re thinking about it as this quantity game, right?
0:09:06 And this is where the sort of brute force spam analogy comes in.
0:09:11 If you’re automating a thousand videos every day and just throwing them at the algorithm
0:09:16 for sort of content farming purposes, does that start to overwhelm everything?
0:09:23 Because you can just do, I can’t make a thousand videos a day, but you know, you can now, right?
0:09:27 And Antonio, to your point, like you could even imagine this not as like a slop spam thing.
0:09:30 You could imagine it as someone who’s actually pretty good, right?
0:09:32 Using that automation.
0:09:34 It’s just to me, the kind of brute force scale issue.
0:09:38 Maybe it never comes up, but it just seems to me like I’m starting to see it crop up
0:09:43 and just wonder what it’s going to do to the existing models for these platforms.
0:09:47 One thing I find interesting about the impact of social media, I mean, you cite a lot of the
0:09:50 negative impacts of social media and you cite one book, The Frenzy of Renown, that I actually
0:09:52 started reading based on your recommendation.
0:09:53 It’s a great book, isn’t it?
0:09:55 I’m going to be that guy so quick.
0:09:58 But I think I have a line in Chaos Monkeys that says, Andy Warhol was wrong.
0:10:00 In the future, we’re not all going to be famous for 15 minutes.
0:10:02 We’re all going to be famous to 15 people.
0:10:03 15 people, yeah.
0:10:04 It’s a great line.
0:10:04 Yeah, yeah.
0:10:08 And the fact that everyone carries themselves like they are a Kardashian everywhere they
0:10:09 go.
0:10:13 I mean, just as like a tourist in Europe, the fact that literally you have a line of people
0:10:18 taking selfies in front of the thing, right, is already like people are imagining themselves
0:10:19 already under the spotlight.
0:10:22 And as a side thing, like yesterday, I work at a tech company.
0:10:23 It’s very young.
0:10:26 And I think the one thing the Zoomers themselves cited is that like, well, you feel like you’re
0:10:27 being observed all the time, right?
0:10:29 Everyone is like a celebrity who goes out in the wild.
0:10:33 And if you yell at a waiter or you do something stupid, which we’ve all done, by the way,
0:10:36 right, suddenly you can become infamous for it and nobody wants that.
0:10:40 And I just, and again, Chris, you and I are probably old enough to remember that, for us,
0:10:44 when we were young, dumb kids, if you didn’t go to jail and didn’t go to the hospital, it
0:10:45 didn’t happen, right?
0:10:46 That’s it.
0:10:46 It was gone.
0:10:50 I mean, maybe it remains as like local lore, but it never got bigger than that.
0:10:52 You weren’t really worried about anything, right?
0:10:57 And somehow that’s totally changed the way that people think about themselves in the world.
0:10:57 Yeah.
0:11:00 And I think one of the things I write about in the book a lot is sort of the effects of social
0:11:02 attention, what social attention does to you,
0:11:08 and that democratization of fame as a genuinely new phenomenon that’s
0:11:10 sort of different from past eras.
0:11:11 And I think you’re right.
0:11:12 Like it is behavior modifying.
0:11:17 And for me, I have the experience of actually, again, fame is always totally relative.
0:11:19 Different people are famous in different places.
0:11:21 But like, you know, people recognize me on the street.
0:11:26 I do have that awareness that, for instance, if I like screamed at a car that cut me off,
0:11:30 you know, or like almost hit me as it was running the corner, like people might know that was
0:11:31 me and associate with me.
0:11:34 So there is that kind of behavioral modification part of it, too.
0:11:38 But there’s also something profoundly warping about it psychologically.
0:11:42 And I think it is true that that point that you just made, though, is a really interesting
0:11:42 one.
0:11:45 There is some kind of pro-social effect, right?
0:11:49 Of like not wanting to be like a complete maniac in public because that might go viral
0:11:50 or something.
0:11:51 It’s an interesting pro-social aspect of it.
0:11:54 But I also think there’s something so warping about it.
0:11:58 And one of the things that, and I quote this in the book, the move from public posting
0:12:03 to group chat is, I think, 100% driven by that, right?
0:12:07 So more and more of all the traffic, and Instagram talks about this all the time, is happening in
0:12:07 private messaging.
0:12:12 Things are happening in private Facebook groups, Signal chats, WhatsApp chats, like sort of trying
0:12:19 to basically reclaim a private space that’s not public space is one of the sort
0:12:20 of bigger trends happening.
0:12:26 I think precisely because of how warping and disquieting it is to be in this kind of digital
0:12:31 panopticon and the subject of near constant total surveillance and social attention.
0:12:33 You’re totally correct.
0:12:35 And I want to get back to the group chat thing because I think towards the end of your book,
0:12:37 you become a little bit of a doomer, Chris.
0:12:40 But you start trying to cite solutions to the problem.
0:12:41 And the group chat one is an interesting one.
0:12:44 But I want to focus on one thing you said, which is, is it pro-social, right?
0:12:48 And one of the things I noticed right after I wrote Chaos Monkeys, I lived, I kid you
0:12:52 not on a small island in the Northwest in what’s called the San Juans, which is these
0:12:52 beautiful islands.
0:12:52 Yeah, I’ve been there.
0:12:53 It’s beautiful.
0:12:56 Yeah, it’s a cool place, very small town.
0:13:00 And so the reality is everyone’s famous in that neighborhood, in that everyone knows who
0:13:01 everyone is and their family.
0:13:06 And if you did something totally crazy and like off the bend, you’d still catch hell for it,
0:13:06 get called out.
0:13:07 Like you wouldn’t get away with it.
0:13:09 And so a lot of our focus on privacy.
0:13:11 In fact, this is historically, I know this is a bit of a rabbit hole.
0:13:13 I don’t know if you want to go down here, but you look at the history of privacy and
0:13:15 again, Chris, I’m sure you’ve studied these topics.
0:13:20 Louis Brandeis basically created it in 1890 with this sort of disquisition called The Right
0:13:21 to Privacy that he more or less invented, by the way.
0:13:23 The word privacy doesn’t appear once in the Constitution.
0:13:27 And the reason why is because at the time it wasn’t used in the sense that we mean it, which
0:13:30 is the right to live as a stranger among strangers, which again, was a right that was more or less
0:13:35 invented to cope with like urban living where you were losing this ability to actually know
0:13:35 everybody.
0:13:40 And anyhow, long wind-up to getting to, I think, the group chat phenomenon: not everybody
0:13:40 can go in the group chat.
0:13:42 There’s an admin, right?
0:13:44 If someone gets like completely out of control, they get booted.
0:13:45 And then everyone kind of knows everyone.
0:13:49 So like, you don’t really want to piss people off because there’s typically some theme, at
0:13:51 least in my group chats, often professional, right?
0:13:53 Like I’m in a group chat with Eric and a lot of people that have worked together and
0:13:54 invested through those companies.
0:13:57 And you can’t be that much of a dick to be blunt because you’ll take a hit for it, right?
0:13:58 So it’s self-regulating.
0:14:03 And I think to me, if I was to propose a solution to our current bind, it’s not less technology
0:14:06 like a Butlerian jihad against this thing.
0:14:09 I think it’s using technology in some way to recreate.
0:14:12 I totally agree with that.
0:14:14 I mean, I do think the scale ends up being really the problem.
0:14:17 I mean, the question to me is like, where is the revenue in that model?
0:14:19 And this is sort of the interesting question, right?
0:14:20 This is something I’m really obsessed with.
0:14:27 And it’s the germ of something that I explored in the book and I’m writing about now in relation
0:14:28 to AI.
0:14:34 But like, we tend to conflate, particularly in this era, like useful tech and lucrative
0:14:34 tech.
0:14:41 And those are not like, there’s technologies that are incredibly useful that are not particularly
0:14:44 lucrative, like penicillin right now, like antibiotics.
0:14:46 No one’s making enormous fortunes off them.
0:14:52 Even solar power, which is arguably the most useful technology of our age, is a perfectly
0:14:57 profitable enterprise, but no one’s making like Rockefeller fortunes off of solar right
0:14:57 now.
0:15:02 And then there’s technologies that are incredibly lucrative, but not particularly socially useful.
0:15:09 Like the FanDuel DraftKings tech is like really good tech, like it works really well, but it’s
0:15:12 not like particularly useful, particularly compared to penicillin.
0:15:18 And one of the things I think that ends up happening in a digital space that is so dominated only by
0:15:26 commercial options is you don’t get as much useful tech that might not have a great business
0:15:28 revenue or a business model.
0:15:30 And so this question of, well,
0:15:34 what’s the sort of sustaining business model of the group chat is an interesting one.
0:15:39 Now, obviously, WhatsApp, which Meta owns, is an incredibly valuable company.
0:15:41 I mean, an incredibly valuable division of Meta.
0:15:45 There’s Signal, which is fascinating because it is a nonprofit and I think a really interesting
0:15:51 model in that respect, but to your point, I totally agree about this sort of reclaiming
0:15:56 kind of community or reclaiming like Dunbar numbers or IRL interactions.
0:16:02 Or what I say in the book is that normal relationships are born of bilateral exchanges of social
0:16:03 attention, right?
0:16:04 As opposed to unilateral, right?
0:16:07 That’s the way we’re all conditioned to work.
0:16:11 And the question is, it’s not a particularly lucrative tech to reclaim.
0:16:16 And that to me is the question of, is there some fundamental tension here between what
0:16:21 the sort of revenue model incentives are and what the tech is best for us as people or whatever?
0:16:24 I mean, one take on that, Chris.
0:16:26 I mean, just to give you the view from the crypto trenches, because I work at a crypto
0:16:29 company, group chats are actually huge in the crypto space.
0:16:33 It’s usually on Telegram, which is an app that most normal people don’t use or most Americans
0:16:34 don’t use, certainly.
0:16:39 And there you actually either pay to enter the group chat or some apps have actually literally
0:16:41 tokenized being able to go into the group chat.
0:16:43 So you have to buy so many of the token.
0:16:47 And then the owner of the group chat was the guy who’s leaking the alpha, because this is
0:16:48 all very speculative, right?
0:16:50 The token represents his value.
0:16:51 There’s a company called Frontech that was short-lived.
0:16:56 I was part of it for a while and I had a channel and you could buy the DAGM token and you could
0:16:57 join the group chat as part of it.
0:17:01 But I think the lesson there is that if you over-financialize things, it loses the social
0:17:03 side of it, which crypto does a lot, right?
0:17:06 With AI, everything is computer, and with crypto, everything is money, like totally everything.
0:17:08 But yeah, I agree with you.
0:17:11 I mean, one place I’d maybe quibble with you a little bit, or I guess, I don’t know,
0:17:11 I’m confused myself.
0:17:16 I think the world we’re heading to, the digital version of things is like the mainstream premium
0:17:17 economy version of the world.
0:17:22 And then I think our friend, Mark Andreessen, that you might know, Eric, he’s the A in
0:17:26 the sign behind you, famously said that some people are reality privileged, right?
0:17:29 Some people are wealthy enough or live in a certain context in which they don’t need the
0:17:30 VR headset, right?
0:17:32 Because like their life is good enough, they don’t need it.
0:17:33 But everyone else gets the VR headset.
0:17:37 And so how do you strike that balance between the digital versus the real versions of it?
0:17:41 But I think what happens is, and again, I think it’s the subtext to your entire book, Chris,
0:17:45 who we interact with, who we agree with, is completely uncoupled from our actual physical
0:17:47 place in the world.
0:17:51 And I think that tension between the physical and I wouldn’t say spiritual necessarily, but
0:17:53 the intellectual, is at the heart of it all.
0:17:56 I think the nation state is cracking up in a way, just to go doomer for a hot second.
0:17:59 And I think the internet’s definitely behind part of it, because you don’t have the sense
0:18:03 of collective consensus that the media and TV, for example, used to create during the
0:18:08 Cronkite era, used to have some level of synchrony or synergy that would happen with Cronkite saying,
0:18:09 and that’s the way it is.
0:18:14 And somehow now I scroll my feed versus Bluesky, and it’s like, no, I don’t know what the
0:18:14 way it is, actually.
0:18:16 It’s not quite clear the way it is.
0:18:19 What’s really interesting, too, is to think about it from it being the opposite problem,
0:18:24 because so much of what happens in sort of the 20s and 30s, particularly when you have
0:18:31 these sort of huge mass broadcast mediums that can sort of capture attention at scale,
0:18:35 right, is that you have sort of this opposite problem, right, which is like the problem of
0:18:36 the mass media.
0:18:42 And this is something that Lippmann encounters when he has a job basically propagandizing for
0:18:47 entrance into World War I, which is that, whoa, this is an insanely powerful tool, and
0:18:53 you can blast out messages at scale to all these people and get them all thinking the same
0:18:55 way or moving the needle in some way.
0:19:01 And that was a problem that Lippmann wrestles with and all sorts of different folks wrestle
0:19:01 with.
0:19:06 And now we have the opposite problem, right, which is there is no getting anything like
0:19:07 that.
0:19:12 I mean, I’ve even come to feel one way to think about a public, right, is paying attention
0:19:13 to the same thing together.
0:19:19 And I’ve even come to feel like weirdly old man nostalgic for the Super Bowl or when
0:19:24 everyone’s talking about Love Island because it feels like, oh, here’s a thing that everyone
0:19:25 is paying.
0:19:27 I’m not watching Love Island, but I know that everyone is.
0:19:29 And I find some weird comfort in that.
0:19:34 I mean, I think it’s so funny because I think 20 or 30 years ago, particularly if
0:19:40 you’re Gen X and you thought of yourself as sort of alternative in whatever way, like
0:19:43 that was stultifying mass culture and middlebrow.
0:19:47 But I’ve come to miss that because I think it’s the thing we’re sort of missing, which
0:19:51 is, football is really the only thing left
0:19:53 that sort of everyone pays attention to together.
0:19:54 The elections.
0:19:55 Yeah.
0:19:57 And elections, although we pay attention in such different ways.
0:19:59 But I do think there’s something.
0:20:01 Yeah, there’s something lost in that.
0:20:03 And I don’t know how to get it back.
0:20:08 And I’m also fully aware it might just be like I’m 46 and I’m just succumbing to the
0:20:09 nostalgia of middle age.
0:20:14 What was the last sort of like literary event that like every American paid attention to?
0:20:17 And my vote is for like Jonathan Franzen’s The Corrections that came out.
0:20:21 I was literally going to say Franzen The Corrections and then like the whole thing.
0:20:22 Yeah, right.
0:20:24 Everyone had it’s a great book, by the way.
0:20:25 I like Franzen as a writer.
0:20:27 Everyone had to read that damn book.
0:20:30 And since then, it’s never been the case that everyone’s read the same book at all.
0:20:30 Yeah.
0:20:33 I mean, with the possible exception of David Foster Wallace and Infinite Jest.
0:20:36 But it’s one of those books everyone claims to read, but no one’s actually read.
0:20:37 I think people actually did read The Corrections.
0:20:42 And yeah, it’s weird because if you talk to like Gen Xers, and particularly I think people
0:20:47 like, again, Mark, who came from, maybe, like pure mainstream America, what they always
0:20:49 say about the internet is like, oh, I finally found my people.
0:20:52 Like I was interested in whatever, some weird little niche obsession that in your little
0:20:54 Midwestern town, there’s nobody doing it.
0:20:57 Although what’s also interesting to me too, and I’m always curious about the tech.
0:21:01 I feel like I don’t have a great sense actually, and maybe you guys have a better sense of this.
0:21:03 I think A, it’s very funny.
0:21:08 We refer to the algorithm with, like, a definite article, like it’s sort of totemic or almost
0:21:12 like a deity, like the algorithm served me this.
0:21:16 And it’s always just unclear to me like how sophisticated the algorithm actually is.
0:21:20 Sometimes it just seems very dumb to me where I look at one thing and then it serves me
0:21:22 15 of that.
0:21:24 And it’s like, I probably could have coded that.
0:21:25 It’s not that impressive.
0:21:26 It’s not like that sophisticated.
0:21:31 Sometimes it does find things where I’m like, oh, wow, there’s something interesting happening
0:21:31 here.
0:21:35 I write about in the book, like taking a gummy and like realizing that it’s just showing me
0:21:37 pictures of sandwiches.
0:21:40 And after like 30 minutes of just like, it’s like, oh, it knows that I’m high.
0:21:41 Like that’s a kind of magic.
0:21:48 One of the things I think is interesting is there will be these kinds of things that go viral
0:21:54 or have this kind of like algorithmic resonance that everyone is then talking about or everyone
0:22:01 does, particularly in Gen Z, like reference or even the way that people talk like this
0:22:06 sort of brain rot Gen Z way of talking is really interesting to me because that does seem
0:22:08 recognizably mass, right?
0:22:13 In the same way, the 1960s, it was like far out and groovy and all these things that like
0:22:20 there is this really distinct argot of Gen Z, but they’re getting it all not from mass
0:22:23 culture, but from basically algorithmic social media.
0:22:28 That to me is super fascinating because there’s something that’s enduring beneath the kind of
0:22:33 technological or institutional layers of how culture is being mediated and how attention
0:22:35 is being captured and bought and sold.
0:22:36 Yeah, I forget who it was.
0:22:44 I was in Ireland recently to visit family, and the kids use that same argot as in
0:22:48 the U.S., versus if you had visited Ireland 20 years ago, you wouldn’t be able to understand
0:22:51 them because they spoke some dialect that would be unintelligible to American ears.
0:22:53 And you see that actually in the U.S.
0:22:55 I mean, this is the thing that I’m really obsessed with, too, in the U.S.
0:22:59 Like one of the things that’s really fascinating is the lack of sort of geographical distinction
0:23:00 in geographical cultures.
0:23:03 You’re starting to see this like this shows up in politics, right?
0:23:07 Which is that like it used to be the case that rural Mississippi and rural Minnesota just
0:23:08 voted very, very differently.
0:23:15 And that’s because the people that occupied those places had different ethnic backgrounds
0:23:20 in Minnesota, like Scandinavian socialists, but also different religious traditions, Southern
0:23:21 Baptist versus Lutheran.
0:23:27 And there were all these like defining sort of geographic features of different places that
0:23:28 would show up in politics.
0:23:37 And increasingly, there is this kind of homogenization happening where all sort of rural folks of
0:23:41 certain demographics more and more vote like each other and also listen to the same music
0:23:44 and have the same kind of omniculture.
0:23:51 And that’s also true of sort of big city, college educated, upper middle class folks where like
0:23:58 you could go to the cool neighborhood in Columbus and there’s going to be some restaurant called
0:24:04 like Stern and Stem that serves whatever like farm to table, like gastropub food.
0:24:11 Like it’s interesting that these sort of omnicultures supplant localism in this moment when we don’t
0:24:14 have massness as the sort of central cultural feature.
0:24:17 The place is called Pine and Spruce in Seattle, but it’s exactly what you’re saying.
0:24:20 And it’s exactly right.
0:24:22 It’s all called stem to stern or pig to snout.
0:24:27 It’s always this kind of folksy and you can imagine the artificially distressed wood and the
0:24:33 artificially distressed people and the exposed iron and the bare filament bulbs and the hoppy beers.
0:24:34 You know, it’s all the same.
0:24:35 It’s all the same.
0:24:38 But I think the person who first called this, by the way, was Dan Savage, who was like the
0:24:43 sex columnist or whatever for, like, the Seattle local rag, after the Bush election, to be a
0:24:45 total Gen Xer or millennial about it.
0:24:47 He said it’s not blue states and red states.
0:24:50 It’s like a blue urban archipelago in a sea of red and that the culture was actually going
0:24:52 two directions and you see it everywhere.
0:24:56 But to quote our friend Balaji, the analogy he cites, if you remember back when we had
0:25:00 Windows machines, you would do something called defragmenting the disk.
0:25:04 And what that means is Windows is such an inefficient operating system that a single application
0:25:07 scatters its memory throughout the hard drive, and that slows it down because you’re
0:25:07 reading from many places.
0:25:09 So you have to do this thing called defragmenting.
0:25:13 You know, it basically writes the memory back to contiguous places so that the app starts running
0:25:13 fast again.
0:25:17 And so what he feels is happening, and of course, he believes in this thing called the
0:25:20 network state, is that we’re basically defragmenting reality, right?
0:25:23 And what we need to solve this is take all the people that think like each other and get them
0:25:26 back in the same state and in some sense defrag the disk.
0:25:29 I think the realities of that are a little challenging, but it’s an interesting thought experiment to imagine
0:25:33 that if you were to do that, right, in some magical way, then you would end up with a
0:25:35 holy red and a holy blue country or something.
0:25:41 Yeah, I mean, this question of like how the sort of individuation of the current attention
0:25:47 marketplace and how it relates to what we share and what we don’t share is like there’s just
0:25:49 a bunch of really interesting questions in there.
0:25:52 Some of which, again, like the point you were making before about like I found my people on
0:25:57 the Internet, like for a trans kid, right, who’s in a place where there’s no one else like
0:26:02 them to find on the Internet people that are going through this.
0:26:03 It’s incredible, right?
0:26:09 And that’s true for a million different ways of being that you might be as a teenager or as
0:26:10 a young adult or as an old adult.
0:26:10 Right.
0:26:18 So there is a level of individuation built into the technology and the market technology
0:26:21 of it now, which drives towards ever more individuation.
0:26:23 But that individuation fails in really interesting ways.
0:26:25 Can I ask you a question, Antonio?
0:26:26 Because I’m really curious about this.
0:26:32 So to me, one of the big unsolved questions that I never quite wrestle to the ground, I talk
0:26:36 a little bit about this notion of like subprime attention is like there’s sort of two different
0:26:39 ways of looking at ad tech to oversimplify.
0:26:45 One is that it has all this incredibly personalized data that no advertiser has ever had access
0:26:45 to.
0:26:50 It’s not just talking to a generic male head of household in the ’50s.
0:26:52 It’s talking to like you, Chris Hayes.
0:26:56 I know where you live and what you do and how much you make and, like, what your medications
0:26:57 are, right?
0:27:03 That allows it a level of sophistication and optimization that’s never been possible before.
0:27:08 And then the other side of this is like, for all of that being true, it’s amazing how
0:27:12 sort of bad, schlocky and ineffective most internet advertising is.
0:27:15 And I think there’s some truth to both of them.
0:27:19 And I kind of wrestle with them a bit in the book, but never feel like resolved on the question.
0:27:21 I’m curious how you think about that.
0:27:23 Yeah, no, it’s a good point.
0:27:23 I mean, it’s funny.
0:27:26 This is going to take us way back to the last time I was on your show, because
0:27:27 it’s probably what I was on for:
0:27:29 the whole Cambridge Analytica scandal.
0:27:29 Exactly.
0:27:29 That’s right.
0:27:30 We don’t need to go down the rabbit hole.
0:27:31 It’s a whole story.
0:27:33 Everyone’s probably heard as much as they want to hear about it.
0:27:34 There’s no way to do that, right?
0:27:37 And any media buyer will tell you that a lot of ads are still crap.
0:27:37 And it’s true.
0:27:39 It’s still kind of a statistical phenomenon.
0:27:43 Just to put numbers on it, the CTR, the click-through rate, like how often a user engages
0:27:48 with a thing on Instagram, means at least 97% of people who looked at it were just completely
0:27:50 indifferent to the thing, right?
0:27:51 And just didn’t engage.
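(Editor’s aside: a back-of-the-envelope sketch of the click-through-rate arithmetic behind that 97% figure. The 2.5% CTR below is an assumed, illustrative number, not a real platform statistic.)

```python
# Back-of-the-envelope CTR arithmetic. The 2.5% click-through rate below
# is an assumed, illustrative number, not a real platform statistic.

def click_through_rate(clicks: int, impressions: int) -> float:
    """Fraction of impressions that produced any engagement."""
    return clicks / impressions

impressions = 100_000
clicks = 2_500  # assumed: a feed ad with a ~2.5% CTR

ctr = click_through_rate(clicks, impressions)
print(f"CTR: {ctr:.1%}, indifferent: {1 - ctr:.1%}")  # → CTR: 2.5%, indifferent: 97.5%
```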
0:27:52 Yeah.
0:27:54 I mean, look, you can really do a technical deep dive there.
0:27:57 I mean, yes, that data is out there, but a lot of it’s super fragmented.
0:27:59 And it often exists in varying silos.
0:28:02 And then there’s grift and, to be blunt, bullshit in advertising.
0:28:05 People will sell you data that’s not nearly as accurate as you think it is.
0:28:08 I mean, companies like Facebook that actually do have a first-party relationship with the
0:28:09 user often have better data for that reason.
0:28:12 And if nothing else, they have a stable notion of identity.
0:28:15 It’s the same Chris Hayes or the same Antonio that shows up on the app every time.
0:28:16 So they kind of know.
0:28:18 Whereas if you just show up on a website, it’s based on a cookie.
0:28:21 And that’s so transient these days that it’s basically meaningless.
0:28:23 So there’s just a lot of practical problems.
0:28:25 And again, it’s funny.
0:28:27 I actually did a story for Wired about it because it pissed me off so much.
0:28:30 You know, the whole conspiracy theory that Zuck is listening to you through your phone
0:28:31 or whatever.
0:28:32 Well, so here’s my question.
0:28:33 This strikes me a lot.
0:28:35 So when you’re on TikTok, right?
0:28:37 TikTok is pretty ad heavy these days.
0:28:42 And I’m always struck that like every ad feels like something from like the Home Shopping
0:28:46 Network or like infomercial I would see at like 5.30 a.m.
0:28:49 before cartoons came on on a Saturday morning in my youth.
0:28:54 And I just keep thinking to myself, I’m like, why isn’t Progressive auto insurance or
0:28:57 GM serving me ads here if this ad tech works?
0:29:03 Like, why are the people that spend like the most amount of money on advertising and for whom
0:29:07 this is particularly true for brands like car brands and insurance brands where, you know,
0:29:09 there’s unbelievable continuity, right?
0:29:11 People just keep the same thing.
0:29:16 And that’s why like traditionally, right, the younger demographics in advertising were so
0:29:16 valuable.
0:29:21 You want to sell someone their first Toyota because they’re just going to buy Toyotas after
0:29:21 that.
0:29:23 So you’re really purchasing a lifetime of purchases.
0:29:27 So if that’s the case and young people are on TikTok, it’s like, if this is such effective
0:29:31 advertising and Detroit spends like crazy, why do they never get an ad there?
0:29:31 I don’t know.
0:29:36 Maybe they’re just dinosaurs and they’re wrong or maybe it’s not as good, but I just don’t
0:29:37 know the answer to it.
0:29:38 It’s totally true.
0:29:39 I mean, there’s TikTok.
0:29:40 There’s an app called Whatnot.
0:29:43 I mean, it’s basically people stumping for a product and then buying from it.
0:29:45 I think it works for D2C brands, right?
0:29:49 It’s very niche brands that people associate with and buy very easily.
0:29:50 I mean, you’re absolutely correct.
0:29:52 That’s the whole point of brand advertising, right?
0:29:55 Again, to use more ad-tech-ese: high LTV, high lifetime value, right?
0:30:00 If you become a Ford buyer for the continuity of your life, you buy a Mustang and you buy an SUV when
0:30:03 you have a family, that’s literally, kind of, thousands of dollars of lifetime value.
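(Editor’s aside: a minimal sketch of the lifetime-value arithmetic behind the “sell them their first Toyota” logic. Every number here, margins, purchase rates, retention, is a made-up assumption for illustration, not industry data.)

```python
# Toy lifetime-value (LTV) model for the "buy their first car young" logic.
# Every number here is an assumption for illustration, not industry data.

def lifetime_value(margin_per_purchase: float,
                   purchases_per_decade: float,
                   decades: int,
                   retention_per_decade: float) -> float:
    """Expected total margin, decaying each decade by the retention rate."""
    total, keep = 0.0, 1.0
    for _ in range(decades):
        total += margin_per_purchase * purchases_per_decade * keep
        keep *= retention_per_decade
    return total

# A buyer acquired young and kept for ~4 decades at 80% per-decade loyalty:
ltv = lifetime_value(margin_per_purchase=3_000,
                     purchases_per_decade=1.5,
                     decades=4,
                     retention_per_decade=0.8)
print(round(ltv))  # → 13284, versus ~3,000 for a single one-off sale
```

The point the sketch makes is the one in the conversation: under even modest retention assumptions, the customer is worth several times the first sale, which is why young demographics command a premium.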
0:30:04 I’m not sure to be honest.
0:30:05 I don’t know what the answer is.
0:30:07 Yeah, no, I don’t know if it’s like a thing to answer.
0:30:12 Just like, it’s so striking to me that that’s the case and particularly how much that still
0:30:19 dominates these sort of older media that’s still when you watch a big sports game on a
0:30:21 broadcast if you’re watching Sunday football, right?
0:30:27 And there’s also a scale there that is very hard to replicate on TikTok.
0:30:31 To your point about how scalable the TikTok stuff is, like, if you make some, like, nifty
0:30:36 little gizmo that you can drop ship from China and you have low overhead, you can sell it to
0:30:38 a small number of people on TikTok.
0:30:40 That’s not the way you’re going to sell a Ford F-150.
0:30:43 So there’s a sort of interesting scale question there.
0:30:48 But I also just wonder, like, what the next…
0:30:53 Basically, we’ve seen these sort of movements from text-based to video.
0:30:59 And if there’s something after that, like, basically, what the next big thing is, what
0:31:04 is the next ByteDance that does to TikTok what TikTok sort of did to the incumbents and
0:31:05 Facebook and stuff like that?
0:31:07 I mean, my vote would be for AI.
0:31:08 To frame it as a bigger thing, right?
0:31:10 We’re going from sort of a textual culture.
0:31:12 I think you said somewhere in the book that, like, the thought that we’re going to be reading
0:31:17 a 6,000-word thought piece in The New Yorker, I doubt most young people can even do that.
0:31:20 I can barely do it, actually, even though I used to do it regularly.
0:31:22 I’ve forced myself to do it, which is weird.
0:31:26 But it doesn’t mean we’re going back to an oral phase in which we’re going to be citing
0:31:28 Homer in iambic pentameter or something, right?
0:31:32 The oral phase, which Neil Postman writes about a lot, that was very mnemonic, right?
0:31:33 It was very memory intensive.
0:31:38 People had these sort of recognizable tropes and forms, but they also remembered a lot.
0:31:40 They knew they could recite poetry.
0:31:41 They could recite biblical verses.
0:31:44 There was an enormous amount stored in the brain.
0:31:46 Yeah, I mean, I think we were at the tail end of that.
0:31:50 I still had to memorize Shakespeare’s sonnets in high school, which I don’t even really have
0:31:51 to do anymore.
0:31:54 But I think the oral-textual divide wasn’t even about the format.
0:31:55 It’s the form of thought, right?
0:31:58 And when I talk to the AI, I don’t think they’re human, by the way.
0:32:00 I think everyone who thinks they’re human is crazy.
0:32:03 But I talk to it as if it were a human, because that’s actually the best way to interface with
0:32:03 it.
0:32:07 And you engage in this sort of Socratic dialogue with it, trying to get an answer.
0:32:09 Sometimes the AI bot gets crazy and hallucinates.
0:32:11 I mean, so do humans, and you kind of have to put it back on the right course.
0:32:13 But I think that’s the thing, right?
0:32:17 And talking to it like a human, when you manage to map that to e-commerce or travel, when I
0:32:22 can say, I want to go, which I do, to France for a week in August to see my daughter
0:32:23 and spend a week in Brittany.
0:32:28 So book me an Airbnb, book me a flight, and make it not cost more than $3,000, or tell me if that’s
0:32:28 impossible.
0:32:31 And just give me a button that says go and do it.
0:32:33 I mean, I would literally bet large portions of my net worth in time.
0:32:34 That’s the future.
0:32:35 That’s going to be the gateway.
0:32:38 AI is going to upstream everything about the consumer experience.
0:32:40 That I totally agree with.
0:32:44 My bigger question is what it does to, like, dicking around on your phone.
0:32:52 Like, I agree that there are these very obvious, to me, use cases, incredibly useful cases of
0:32:57 if it gets good enough, particularly the sort of agentic idea that you’re describing there.
0:32:58 There’s lots of stuff.
0:33:03 And the idea of it essentially becoming, in the way the browser was, like, the sort of
0:33:05 interface being the thing that you’re talking to, right?
0:33:08 But that’s to do all the useful stuff, right?
0:33:11 Like, my question is, like, what does it do?
0:33:15 Like, when you get that screen time notification on your phone, right, what you’re doing is you’re
0:33:19 looking, a lot of it is not booking flights and doing stuff.
0:33:20 Some of it is.
0:33:26 But what it does to that part of it is sort of the big, I don’t have the answer.
0:33:27 I don’t think anyone has the answer.
0:33:30 I think if you had the answer, you could probably make a fortune with the right bet.
0:33:33 But I just don’t know what it’s going to do there.
0:33:39 One is the place we started, which is my wondering if it produces a kind of brute-force level of
0:33:42 pollution that then actually becomes hard for the platforms to manage.
0:33:53 Two, it gets so good that it creates this kind of surreal new genre of, like, pure attentional
0:34:00 drug, like the slot machine that, like, people just watch and doesn’t have any, like, real
0:34:06 meaning because it’s just this complete, like, neural network produced hive mind creation.
0:34:10 Or people just opt for humans and it doesn’t really change it that much anyway.
0:34:12 Those seem like the three big options.
0:34:16 There’s a paper from Yahoo Research in 2001 or 2002 or something in which they use the
0:34:18 multi-armed bandit problem, which predates advertising, obviously.
0:34:20 But then they adapt it.
0:34:23 Because if you look at a page, there’s several bandits there, right?
0:34:24 Which are different ads.
0:34:28 And if you click on one, there’s a chance of it leading to a conversion or not.
0:34:30 And a lot of the math behind this, which I’ve forgotten, is that.
0:34:31 Oh, that’s cool.
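(Editor’s aside: for the curious, a minimal epsilon-greedy sketch of the multi-armed bandit framing Antonio gestures at, where each ad is an “arm” with an unknown conversion probability. The conversion rates and parameters are invented for illustration; this is the generic textbook heuristic, not the specific Yahoo Research algorithm.)

```python
# Toy epsilon-greedy multi-armed bandit for ad selection: each ad is an
# "arm" with an unknown conversion rate; the policy balances exploring
# random ads against exploiting the best-performing one seen so far.
# All rates and parameters below are invented for illustration.

import random

def run_bandit(true_rates, steps=50_000, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    n_arms = len(true_rates)
    pulls = [0] * n_arms   # how often each ad was shown
    wins = [0] * n_arms    # how often each showing converted
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)  # explore: show a random ad
        else:
            # exploit: show the ad with the best observed conversion rate
            # (unpulled arms get an optimistic 1.0 so they get tried once)
            arm = max(range(n_arms),
                      key=lambda a: wins[a] / pulls[a] if pulls[a] else 1.0)
        pulls[arm] += 1
        if rng.random() < true_rates[arm]:  # simulate the user converting
            wins[arm] += 1
    return pulls

pulls = run_bandit([0.01, 0.02, 0.05])  # three ads; the third converts best
print(pulls)  # traffic should concentrate on the best-converting ad
```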
0:34:34 As a side footnote, I think look at the Apple Vision as an example, right?
0:34:35 It’s, like, way too bulky.
0:34:36 It’s way too expensive.
0:34:37 It’s not quite good enough yet.
0:34:41 No, and I think the thing about the Apple Vision and VR to me is, and I write some, like,
0:34:45 in Siren’s Call about this, like, at a certain point, if you’re thinking about this commodity
0:34:47 and you come up against these hard limits, right?
0:34:50 So at a certain point, it’s like, okay, we have people looking at their phones as much
0:34:50 as they possibly can.
0:34:53 If we keep them awake a little longer, we can get more out of it.
0:34:54 We start going to children.
0:34:57 We can go, we sold the phone to everyone in the world.
0:35:01 Like, how do you keep expanding the frontier of the commodity to mine?
0:35:03 And that’s why VR is so important from that.
0:35:10 Because if you have it literally every second, you’ve just unlocked this enormous new landscape.
0:35:16 You’ve discovered these, like, new deep sea reserves, right, that no one had before.
0:35:19 Now, literally every second, right?
0:35:24 If it’s on every waking second, you can now mine that attention and extract and commodify it
0:35:25 in a way that wasn’t before.
0:35:30 And because Apple did this once before with the smartphone as the crucial piece of hardware
0:35:34 to massively expand that supply, like, that’s, I think, the kind of commercial
0:35:37 logic of the device, right?
0:35:39 Whatever it does after that.
0:35:41 But it’s only going to be waking hours, Chris, 16 hours.
0:35:44 We’re going to need full Neuralink to actually run ads inside your dreams, actually.
0:35:46 No, literally, but…
0:35:47 We’re going to need the full 24.
0:35:48 That’s the point.
0:35:52 It’s the frontiers, like, there’s a certain rapaciousness here.
0:35:53 No, well, of course there is.
0:35:55 I can see that the ad tech industry is completely rapacious.
0:35:59 So at the end of Chaos Monkeys, I think in the, like, afterword that I wrote in the second
0:36:03 edition or whatever it was, I mentioned that Facebook is running out of people, right?
0:36:04 Exactly, right, yeah.
0:36:09 Like, I made the joke that they’d start breeding new humans just to become Facebook users to
0:36:10 sell them ads or something, which obviously is ridiculous.
0:36:13 But I think it’s a similar concept that there’s actually only so many attention hours in the
0:36:14 day, right?
0:36:18 Yeah, and you do come up, you know, I’m definitely not like a degrowth leftist at all.
0:36:21 I would call myself like a pro-growth left liberal.
0:36:23 Abundance, I think is what we call them these days.
0:36:24 Abundance, Chris, abundance.
0:36:26 Yes, distinct from that in some ways.
0:36:34 But I guess my point would be, there are these fundamental tensions between the need for
0:36:38 endless growth and what boundaries you hit up against and what it does to people.
0:36:45 And you could create a company that was just provided a useful service and printed money
0:36:45 for a long time.
0:36:48 And in some ways, Google had that for a long time.
0:36:51 Like, they had a genuinely useful service.
0:36:53 It genuinely transformed how we got information.
0:36:55 It was insanely profitable.
0:36:57 It was a money printing machine.
0:37:00 And it wasn’t enough, right?
0:37:02 Like, you have to grow it past that.
0:37:09 And in some ways, with the use of AI, like, the entire open web that was the basis for the
0:37:11 first iteration of Google is collapsing underneath it.
0:37:16 And what comes after when you talk about this sort of interface we have with the AI chatbot,
0:37:23 I find that for my work, the only way that I can use AI is with web search on and telling
0:37:28 it to cite everything because I can’t risk a hallucination.
0:37:33 And so what I need to do is I need to check everything against, like, did a person report
0:37:33 this?
0:37:35 Did a human being go and call this person?
0:37:36 Is that name right?
0:37:44 And I do wonder, like, the surreality that we’re all entering into once the kind of growth
0:37:48 revenue model collapses the open web, like, what is left?
0:37:53 The stuff that’s actually being used to create all that useful information, who’s going to
0:37:55 maintain that if the whole thing collapses?
0:37:59 Yeah, I think I cite Edward Abbey in the Chaos Monkeys epigraph in there somewhere, that
0:38:01 growth for the sake of growth is the ideology of the cancer cell.
0:38:02 Right, exactly.
0:38:03 Yes, exactly.
0:38:04 Kind of true.
0:38:05 I mean, yeah, it’s funny.
0:38:09 I cite The Abundance Agenda, which, you know, is Ezra’s book that basically is telling the
0:38:12 left, OK, don’t be dumb about capitalism and growth.
0:38:13 Like, it could actually be a good thing, right?
0:38:18 And I think there’s another book by Alex Karp of Palantir called The Technological Republic
0:38:21 that’s basically telling the right, look, I mean, the market isn’t actually the solution
0:38:24 to everything and doesn’t actually necessarily tell us what is good.
0:38:27 Like, what are the ultimate good ends of social life?
0:38:29 The market isn’t necessarily there to answer that question.
0:38:33 I mean, I think Peter Thiel famously asked the question, we were promised flying cars and we
0:38:34 got 140 characters.
0:38:36 I’ll say this on that last point.
0:38:44 It’s interesting to me to compare the amount of capex and media attention to AI versus the
0:38:50 kind of insane solar power revolution happening right now, because it’s a little like the flying
0:38:51 car 140 character thing.
0:38:58 we’ve been burning carbon for energy since before the Industrial Revolution, right?
0:39:03 Like we’ve been burning trees for wood stoves and then we got the Industrial Revolution and
0:39:05 we’ve been burning fossil fuels specifically.
0:39:11 But a world where we get to, which is totally possible now, I think from an engineering standpoint
0:39:18 of like the marginal cost of energy basically dropping to essentially zero, where we just get the
0:39:25 solar tech good enough that we kind of reclaim the abundance, to use a word, of the sun.
0:39:26 They write about that in the book a bit.
0:39:31 Like that is such a wildly revolutionary thing to happen.
0:39:35 And partly because it’s not that lucrative, I think in the end.
0:39:39 But also it’s not that attentionally salient in the way that AI is.
0:39:44 And so you do have this weird thing happening, which is like all the capex, all the attention
0:39:47 is like the future technology, the epoch-defining technology is going to be AI.
0:39:49 And I don’t know if I agree with that or not.
0:39:50 I understand why people, some people think that’s true.
0:39:52 I understand some of the skepticism.
0:39:58 But it’s interesting to contrast that with the fact that like right now we’re in the midst
0:40:01 of an epochal transformation in how we get energy.
0:40:04 And it’s just not that sexy.
0:40:06 You didn’t mention nuclear, Chris.
0:40:07 Nuclear might have to be part of the portfolio.
0:40:13 But I agree with you that having basically free intelligence at the margin and
0:40:14 free energy at the margin is going to be nuts.
0:40:15 Yeah, that’s right.
0:40:22 Yes, those two things combined might be, like, pretty close.
0:40:23 And obviously they’re related, right?
0:40:25 Because the energy consumption on the compute is enormous.
0:40:26 Yeah.
0:40:27 And you’re going to need the build out.
0:40:30 But if you got to that point, Jesus, what does that even look like?
0:40:30 I don’t know.
0:40:32 Chris, I know you got to run.
0:40:34 So thank you so much for coming on the podcast.
0:40:38 The book is The Sirens’ Call: How Attention Became the World’s Most Endangered Resource.
0:40:39 Thanks so much for coming on the show.
0:40:40 I really enjoyed that.
0:40:41 Thanks so much.
0:40:46 Thanks for listening to the A16Z podcast.
0:40:51 If you enjoyed the episode, let us know by leaving a review at ratethispodcast.com slash A16Z.
0:40:54 We’ve got more great conversations coming your way.
0:40:55 See you next time.
What happens when AI starts generating content for everyone—and no one wants to watch it?
In this episode, MSNBC’s Chris Hayes and ad tech veteran Antonio García Martínez join a16z General Partner Erik Torenberg to unpack the shifting economics of attention: from the rise of “AI slop” and spammy feeds to the difference between what we want to pay attention to and what platforms push on us.
They explore:
- How AI changes what gets created and what gets seen
- Why internet ads still mostly suck
- The return of group chats—and the slow death of mass culture
Based on Chris’s new book The Sirens’ Call, this is a candid look at what AI might amplify or break in our online lives.
Timecodes:
0:00 Introduction
1:47 Meet the Guests: Chris Hayes & Antonio Garcia Martinez
3:01 The Economics of Attention & AI Slop
6:38 Acquisition vs. Retention: The Attention Challenge
10:01 Fame, Identity, and Social Media Fragmentation
13:21 The Group Chat Solution & Privacy
16:01 Business Models, Community, and Technology
19:01 Mass Culture, Fragmentation, and the Algorithm
23:01 Ad Tech, Personalization, and Advertising Effectiveness
29:01 The Future: AI, Growth, and Abundance
Resources:
Find Chris on X: https://x.com/chrislhayes
Find Antonio on X: https://x.com/antoniogm
Learn more about Chris’ book ‘The Sirens’ Call’: https://sirenscallbook.com/
Learn more about Antonio’s book ‘Chaos Monkeys’: https://www.harpercollins.com/products/chaos-monkeys-antonio-garcia-martinez?variant=32207601532962
Stay Updated:
Let us know what you think: https://ratethispodcast.com/a16z
Find a16z on Twitter: https://twitter.com/a16z
Find a16z on LinkedIn: https://www.linkedin.com/company/a16z
Subscribe on your favorite podcast app: https://a16z.simplecast.com/
Follow our host: https://x.com/eriktorenberg
Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.