Why TikTok matters

AI transcript
0:00:01 Megan Rapinoe here.
0:00:05 This week on A Touch More, it is sports with a capital S.
0:00:07 We break down the WNBA semifinals,
0:00:10 A’ja’s historic fourth MVP,
0:00:12 and all the other post-season awards.
0:00:16 Plus, we put our biggest bicker of the week to bed.
0:00:18 Check out the latest episode of A Touch More
0:00:19 wherever you get your podcasts and on YouTube.
0:00:24 Hey, it’s Sean here.
0:00:26 If you’re a fan of the show,
0:00:29 do me a favor and consider becoming a Vox member.
0:00:32 If you do, you can get this podcast without ads.
0:00:34 That’s right, no ads.
0:00:37 There’s also a member-exclusive newsletter.
0:00:39 And you’ll help Vox and the show,
0:00:42 which is good news for all of you philanthropists out there.
0:00:45 If you sign up now, you’ll save 20 bucks
0:00:46 on an annual membership.
0:00:48 That’s more than 30% off.
0:00:51 Go to vox.com slash members to join.
0:00:52 Now here’s the show.
0:00:58 I’ve always thought of TikTok as digital.
0:01:02 And maybe at one point, that’s all it was.
0:01:06 But it has grown into something stranger
0:01:09 and more consequential than that.
0:01:15 TikTok is a hugely influential global speech platform.
0:01:19 It’s a billion-user machine for shaping culture.
0:01:26 And it’s a corporate monster caught in the crossfire of two superpowers.
0:01:32 It’s also, importantly, a black box
0:01:38 that runs on an algorithm built to know you better than you know yourself.
0:01:45 Which is why it’s quietly rewired how billions of people experience the internet.
0:01:52 And behind that addictive feed lies a geopolitical fight
0:01:54 that could reshape the landscape of social media
0:01:58 and have a massive impact on American politics.
0:02:05 I’m Sean Illing, and this is The Gray Area.
0:02:21 My guest today is Emily Baker White,
0:02:25 author of Every Screen on the Planet: The War Over TikTok.
0:02:28 Emily has spent years covering TikTok,
0:02:31 interviewing whistleblowers,
0:02:33 combing through leaked documents,
0:02:36 tracing the battles within the company,
0:02:40 and being targeted by the company for her reporting.
0:02:43 Her book is a definitive account of how TikTok became both
0:02:47 a cultural superpower and a geopolitical flashpoint.
0:02:50 We originally spoke a few weeks back.
0:02:54 At that time, the U.S. and China had not reached an agreement
0:02:57 to transfer TikTok ownership to an American company.
0:02:59 That is still the case,
0:03:02 but it appears a framework is now in place.
0:03:04 And so a couple of days ago,
0:03:08 I called Emily back to ask her about everything that’s happening.
0:03:11 We’ve added that to the end of this episode.
0:03:21 Emily Baker White, welcome to the show.
0:03:22 So happy to be here.
0:03:26 You’ve been reporting on TikTok for a while now.
0:03:31 It is clearly not just another social media platform.
0:03:33 For many reasons, we will get into,
0:03:35 but I just want to start with something very simple.
0:03:39 Why is this app so freaking addictive?
0:03:45 So the way that TikTok’s parent company’s founder,
0:03:48 Zhang Yiming, described it,
0:03:53 he thought that information could do a better job of finding people
0:03:55 than people could do of finding information.
0:03:59 And so what he meant by that was,
0:04:01 especially back in the day,
0:04:02 when you went to a Facebook,
0:04:03 when you went to a Twitter,
0:04:05 you would go,
0:04:06 you would follow people,
0:04:08 you would opt into things,
0:04:09 you would search for things,
0:04:10 you would try to find the thing you’re looking for.
0:04:12 On TikTok,
0:04:13 when you open TikTok,
0:04:14 it just goes.
0:04:15 You don’t do anything.
0:04:16 It just goes.
0:04:18 You can swipe.
0:04:19 You cannot swipe.
0:04:22 It’s going to figure out how long you spend on each video,
0:04:26 how much you’re engaging with the various topics.
0:04:30 And the experience is so frictionless,
0:04:32 so seamless that you’re not doing anything
0:04:34 and it’s figuring you out,
0:04:35 right?
0:04:40 You essentially cede your agency over figuring out what you want online
0:04:44 to someone or something else that’s going to do that for you.
0:04:47 And it’s going to do it so beautifully and so seamlessly
0:04:50 that you are better entertained than you would be
0:04:51 if you were trying to entertain yourself.
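A minimal sketch in Python of the ranking idea described here: scoring purely from revealed preferences like watch time, rather than anything the user declares or follows. Every name, weight, and signal below is a hypothetical illustration of the general technique, not TikTok’s actual system.

```python
from dataclasses import dataclass, field

@dataclass
class TopicProfile:
    # Per-topic affinity, learned only from behavior -- never from
    # follows, searches, or stated interests.
    affinity: dict[str, float] = field(default_factory=dict)

    def observe(self, topic: str, watch_fraction: float, liked: bool) -> None:
        # Watch time is the dominant implicit signal; a like adds a bit more.
        signal = 0.7 * watch_fraction + 0.3 * (1.0 if liked else 0.0)
        old = self.affinity.get(topic, 0.0)
        # Exponential moving average: recent behavior dominates older behavior.
        self.affinity[topic] = 0.8 * old + 0.2 * signal

    def score(self, topic: str) -> float:
        # Small default so unseen topics still get a chance (exploration).
        return self.affinity.get(topic, 0.1)

profile = TopicProfile()
# The user never searched or followed anything; they just watched.
profile.observe("cooking", watch_fraction=0.95, liked=False)
profile.observe("politics", watch_fraction=0.10, liked=False)

candidates = ["cooking", "politics", "gardening"]
feed = sorted(candidates, key=profile.score, reverse=True)
print(feed)  # ['cooking', 'gardening', 'politics'] -- inferred, never asked
```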
0:04:53 Yeah,
0:04:56 I just want to stress that because it’s that,
0:04:59 like what TikTok seems to do the best
0:05:01 is exactly what you just said.
0:05:03 Take away your agency.
0:05:05 It feeds you what it thinks you want
0:05:07 without you having to ask for it.
0:05:08 And to me,
0:05:12 that really does exemplify where all this digital tech has brought us.
0:05:13 You know,
0:05:13 we just,
0:05:17 we do less and less thinking and hand over more and more
0:05:20 of our attentional bandwidth to these devices
0:05:22 and the companies that own them.
0:05:25 And TikTok just seems to be the best at that.
0:05:26 And it’s sneaky,
0:05:27 right?
0:05:27 Because we like it.
0:05:28 If we didn’t like it,
0:05:29 we wouldn’t use it.
0:05:32 And so we,
0:05:34 we’re sort of ceding our agency,
0:05:35 right?
0:05:36 We’re thinking less and less.
0:05:38 We’re turning over more and more agency
0:05:42 without even really knowing that that’s what we’re doing.
0:05:47 And I think that is the genius of a product like TikTok.
0:05:54 It makes us want to give up that agency because something about it is so pleasant.
0:05:56 Why?
0:05:57 Why do you think it’s so pleasant?
0:05:59 It’s just not thinking,
0:06:00 the joy of not thinking?
0:06:01 Yeah.
0:06:03 Decision fatigue is a real thing.
0:06:05 I think the,
0:06:09 also the moments that we now turn to our phones,
0:06:12 that we didn’t turn to our phones before they existed in the way that they do today.
0:06:13 Like,
0:06:15 you didn’t used to have to do anything in the checkout line.
0:06:19 You could just stand there in the checkout line and like be a person waiting for your turn.
0:06:20 Now,
0:06:21 you can’t just like,
0:06:23 you can’t,
0:06:24 the kids would say,
0:06:26 you can’t raw dog the checkout line,
0:06:26 right?
0:06:28 And like that.
0:06:29 It’s intolerable now.
0:06:29 Right.
0:06:30 But like,
0:06:30 why,
0:06:32 when did that become intolerable?
0:06:36 When did we have to be doing something in like a tiny little moment like that?
0:06:37 The bus stop,
0:06:37 right?
0:06:40 These are moments when we used to just exist in the world.
0:06:45 And now we exist in the world and immediately turn to our phones for some sort of comfort.
0:06:51 We’ve been conditioned to expect stimuli all the time.
0:06:52 And the absence of it is like,
0:06:54 completely intolerable.
0:06:56 Probably not great.
0:06:57 Not great.
0:06:58 So,
0:06:58 okay.
0:07:00 TikTok has this For You feed,
0:07:01 right?
0:07:04 And it is fundamentally a prediction machine.
0:07:07 It is based on your revealed preferences,
0:07:10 not what you tell it you like or what you think you like.
0:07:12 So when you go on TikTok,
0:07:15 you don’t really choose what you see,
0:07:15 right?
0:07:18 You open the thing up and it just happens to you.
0:07:22 How do you think this changes the user psychology?
0:07:28 And really the kind of content that thrives there compared to something like Facebook or
0:07:31 Instagram or X or whatever,
0:07:32 which are similar in lots of ways,
0:07:34 but not quite TikTok.
0:07:42 So this is another place that I think TikTok started a trend that went beyond just TikTok itself,
0:07:50 which is that compared to five or seven years ago on the sort of internet platforms that we exist on and spend time on,
0:07:56 we spend a lot less time looking at posts created by people we know.
0:07:59 So when Facebook became a big thing,
0:08:01 even when Instagram became a big thing,
0:08:04 we’re mostly seeing our friends’ vacation photos.
0:08:05 We’re seeing our cousins’ babies,
0:08:06 right?
0:08:08 That’s mostly what the thing was for.
0:08:12 And then also maybe you followed some news and whatnot.
0:08:16 But that has been largely replaced not only on TikTok,
0:08:19 but also on Facebook and Instagram with a sort of professionalization.
0:08:24 Most of what you see now on any of those platforms is influencers,
0:08:24 right?
0:08:28 It’s people who are producing content to entertain you,
0:08:30 to hold your interest,
0:08:32 and to make money for the people who are producing the content.
0:08:36 So it’s not a social network in any meaningful sense,
0:08:36 right?
0:08:39 It is just a pure diversion entertainment app.
0:08:43 It’s as much like Netflix as it is like OG Facebook.
0:08:46 It’s just that’s not what TikTok is.
0:08:49 You don’t go to TikTok to see your friends.
0:08:49 Right.
0:08:54 I have proudly held the line and I’ve stayed off TikTok.
0:09:01 But at the urging of my dear editor, Jorge,
0:09:05 I signed up and started experimenting with it
0:09:07 in preparation for this conversation.
0:09:10 I’ve used it probably seven, eight times.
0:09:21 And my God, Emily, I mean, it really is pure uncut social media heroin blasted right into
0:09:22 your eyeballs.
0:09:26 I log on there and I feel like I’m thrown into a trance.
0:09:28 I cannot look away.
0:09:31 And then it just builds out your profile the more you use it.
0:09:34 And again, I’ve only been on there a handful of times,
0:09:43 but I can see it learning my mind and crafting in real time the perfect digital drug to just
0:09:48 keep me coming back, which is why I’m getting the hell off immediately after this taping.
0:09:49 But I get the appeal.
0:09:50 I really do.
0:09:57 And my God, is it exquisitely good at keeping your attention from the minute you log on.
0:10:01 To the envy of its would-be competitors.
0:10:05 The folks at Instagram Reels are really threatened by the fact that you said that,
0:10:06 but they’ve heard it before.
0:10:14 So as far as the content goes, you do write a good bit about the moderation policies at TikTok.
0:10:16 We know about the algo.
0:10:18 It’s kind of what we’re talking about now.
0:10:25 But what role do actual humans at TikTok play in moderating and shaping the content?
0:10:30 And I mean, how much has this evolved as the company has gotten bigger and bigger?
0:10:31 Yeah.
0:10:34 So there are like three or four questions in there.
0:10:35 I’m going to take them in turn.
0:10:35 Sorry.
0:10:36 I do that sometimes.
0:10:37 No, that’s okay.
0:10:48 Today at TikTok, I would say that moderation is done largely like it is at other big platforms,
0:10:51 other big user-generated content platforms.
0:10:59 There are algorithms that detect things that violate the content rules that TikTok has set.
0:11:07 Those algorithms are tuned based on the decisions of many, many, many humans enforcing those rules.
0:11:08 The algorithms are also imperfect.
0:11:14 And so humans are constantly checking the algorithms to try to get it right, but they’re not always going to get it right.
0:11:27 Today, TikTok’s content policies, their decisions about what they’re going to allow on the platform are pretty similar, at least in the United States, to the content policies of their main competitors.
0:11:46 And so one thing that the book talks a lot about is sort of how TikTok began as a social media platform that came out of China, was, you know, sort of directed from China, and had much more Chinese content policies.
0:11:55 And then sort of how they tried to Westernize those policies and make them sort of appropriate for other markets in the U.S., in Europe, and in other places around the world.
0:12:02 And one thing that does sort of distinguish TikTok from other platforms is this thing called the heating button.
0:12:09 There was, and is, a button that some people inside TikTok could press, called the heating button.
0:12:13 It used to be a much larger number of people back in the day.
0:12:15 They’ve sort of clamped down on it since.
0:12:22 And when you press the heating button, you get to decide how many impressions you’re about to give a video.
0:12:25 And you can give it 5,000 impressions.
0:12:27 You can give it 50,000 impressions.
0:12:29 You can give it 5 million impressions, right?
0:12:36 And if that heating button is pressed internally, that’s sort of an override of the basic recommendations algorithm.
0:12:41 And that many people are going to see that video almost instantaneously.
0:12:44 And then probably some of those people are going to like it.
0:12:45 They’re going to share it.
0:12:47 And then the algorithm actually is going to say, wow, this thing’s getting a lot of attention.
0:12:48 We should show it to more people.
0:12:52 And it’ll get even more than whatever you sort of designate within the heating system.
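A hedged sketch of how a manual heating override could interact with an engagement-driven recommender, as described above: the operator forces a designated number of impressions, and organic engagement then compounds past it. The function names and rates below are invented for illustration, not TikTok’s internal tool.

```python
import random

def heat(video: dict, guaranteed_impressions: int) -> None:
    # Manual override: bypass ranking and force distribution.
    video["impressions"] += guaranteed_impressions
    # Some fraction of the forced viewers engage organically.
    video["engagements"] += int(guaranteed_impressions * random.uniform(0.01, 0.05))

def organic_boost(video: dict) -> int:
    # The normal recommender now sees high engagement and shows the
    # video to even more people than the heating operator designated.
    rate = video["engagements"] / max(video["impressions"], 1)
    return int(video["impressions"] * rate * 10)

video = {"impressions": 200, "engagements": 5}
heat(video, 5_000_000)                  # operator designates 5M, per the interview
video["impressions"] += organic_boost(video)
print(video["impressions"])             # ends well above the designated 5M
```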
0:12:56 You say some people can pull that lever.
0:12:57 What people?
0:12:59 How many people do you know?
0:13:01 It used to be a whole lot of people.
0:13:09 And so especially when TikTok was first getting off the ground, the algorithm was not always as beautifully sticky as it is now.
0:13:11 It wasn’t always that persuasive.
0:13:15 And it used to recommend not great stuff.
0:13:20 A short two-second blurry video that doesn’t say anything.
0:13:22 Or a video with really low resolution.
0:13:27 Or a video with like glitching music.
0:13:28 Right?
0:13:34 And TikTok, the company, wanted to teach the algorithm that that stuff wasn’t going to fly.
0:13:40 And so especially in the early years, there was a lot of human curation on TikTok.
0:13:48 There was a lot of human intervention trying to teach the machine what people were actually going to like.
0:13:53 And so the humans would use this heating feature to do that.
0:13:54 They would say, this is good content.
0:13:55 This is good content.
0:13:57 This we want to show a lot of people.
0:13:57 Right?
0:14:00 And that’s just editorial discretion.
0:14:00 Right?
0:14:03 That’s just some people saying, I think people are going to like X.
0:14:07 And that was sort of originally what taught the algorithm what to do.
0:14:12 And you ask how many people have control over the heating button today.
0:14:15 Eventually, marketing teams got their hands on this.
0:14:18 And they were trying to woo a creator to come over.
0:14:21 And they were trying to woo a potential partner, a potential advertiser.
0:14:25 And show those people, hey, TikTok’s really worth your time.
0:14:31 And one great way to convince someone that TikTok’s worth their time is to have their first couple of videos do really, really well.
0:14:36 And so they would artificially heat those people’s videos to make them think, wow, I could do really well on TikTok.
0:14:38 I could make money on TikTok.
0:14:40 And then they would, you know, sign a partnership agreement or something.
0:14:42 And their videos might not do so well after that.
0:14:48 And so the heating tool not only became a sort of editorial boost, it also became a marketing tool.
0:14:50 I reported the existence of this tool.
0:15:01 Partially before that, but certainly after that, there were editorial teams at TikTok that did say, okay, we’re going to way rein in how many people have access to this tool.
0:15:07 We’re going to write a lot of policies on exactly how it’s allowed to be used and how it isn’t allowed to be used.
0:15:10 Like, not just anyone can push this button willy-nilly.
0:15:17 I mean, it seems like Twitter has always kind of done something like this, too, in terms of, like, boosting certain things and not boosting others.
0:15:20 Is there something fundamentally different about what TikTok does?
0:15:25 Do they have more power over content and trends than, say, Twitter?
0:15:26 Yeah.
0:15:28 So a couple things on this.
0:15:32 I would say Facebook, too, obviously promotes some things over other things.
0:15:33 So does Instagram.
0:15:34 So does YouTube.
0:15:41 First of all, if you work at another social media company and you have access to the heating button, please go ahead and reach out to me on Signal.
0:15:42 I would love to speak to you.
0:15:51 That aside, what made the TikTok heating button so striking is that it’s literally a big red button that says heating.
0:15:54 Like, I don’t know if it actually says heating, but it’s called the heating button.
0:16:02 And it’s sort of more explicit than what I have at least heard or become aware of at other companies.
0:16:09 Now, Facebook and Twitter, they still have systems of promotions and boosts.
0:16:14 And, oh, we’re trying to, you know, boost content about X topic and not Y.
0:16:22 Like, you know, Facebook famously sort of started demoting hard news content after the platform became very divisive and everyone was saying, I’m seeing too much politics.
0:16:27 And so all platforms, when they rate content, they make choices about how much to boost things.
0:16:36 The heating button does seem a little different just in that it’s, like, very explicit and particular in a way that I haven’t heard of at other social media companies.
0:16:38 But, again, if you know about this, please tell me.
0:16:39 I would love to write about it.
0:16:41 Do it.
0:16:49 I really wonder what you think about TikTok as a cultural and political force.
0:16:56 We have seen how influential Facebook and Twitter have been, not just here, but all over the world.
0:17:00 Do you think TikTok is different in some fundamental way?
0:17:06 Like, not just as a platform, but as a mechanism for social change?
0:17:13 Do I think it’s different than Facebook and Twitter is kind of an interesting question.
0:17:17 Because I think Facebook and Twitter are really big engines of social and political conversation.
0:17:18 Agreed.
0:17:25 And so I guess I think Twitter is a lot smaller than we sometimes give it credit for.
0:17:28 TikTok is huge.
0:17:34 And on the question of how is it different when it comes to its sort of cultural and political force.
0:17:37 First of all, you talk about just how sticky it is, right?
0:17:42 You don’t get that feeling when you’re on its competitor apps necessarily.
0:17:44 And I think that’s worth noting.
0:17:47 People spend a huge amount of time there.
0:17:55 And it is really, really, really big on the order of a 2019, 2020 Facebook, for sure, if not bigger.
0:18:01 And the other thing to remember is that at least when you were on Facebook, you were making a bunch of choices about what to see.
0:18:02 Right.
0:18:05 On TikTok, you’re making very few choices about what to see.
0:18:17 And so you’re ceding more agency over the political, cultural, sociological discourse that you’re seeing on that platform to a faceless machine.
0:18:18 Right.
0:18:26 And I do think, especially when we’re talking about our civic discourse, our cultural discourse, right?
0:18:36 Ceding that much control, it just ultimately gives the thing more power over you and over culture and society.
0:18:52 Support for the show comes from Shopify.
0:18:56 When you’re starting a new business, it can feel like you’re expected to do it all.
0:18:59 Marketing, design, and everything in between.
0:19:01 Even if you’ve never done half of it before.
0:19:07 What you really need is a tool that helps you reach your goals without having to master every skill yourself.
0:19:11 For millions of businesses, that tool is Shopify.
0:19:19 Their design studio lets you build a beautiful online store that matches your brand style, letting you choose from hundreds of ready-to-use templates.
0:19:24 You can also set up your content creation by using their helpful host of AI tools.
0:19:32 And you can even create email and social media campaigns with ease and meet your customers wherever they’re scrolling or strolling.
0:19:37 See why Shopify is the commerce platform behind millions of businesses around the world.
0:19:40 If you’re ready to sell, you can be ready with Shopify.
0:19:45 You can turn your big business idea into a reality with Shopify on your side.
0:19:52 You can sign up for your $1 per month trial period and start selling today at Shopify.com slash Vox.
0:19:55 Go to Shopify.com slash Vox.
0:19:57 Shopify.com slash Vox.
0:20:05 Support for the show comes from Bombas.
0:20:07 You know those songs that make you just want to run?
0:20:11 Maybe you’re in the middle of a workout and you’re totally spent.
0:20:17 But then that one track hits, maybe it’s Tool, maybe it’s a little Wu-Tang, and boom, you’re back.
0:20:19 That’s kind of like what Bombas socks do.
0:20:21 You put them on and you’re ready to go.
0:20:28 This summer, you can get moving with Bombas socks, specially designed to wick sweat, keep you cool, and prevent blisters.
0:20:31 You probably know I’ve tried out Bombas for myself.
0:20:33 They are my favorite socks.
0:20:40 In the winter and fall, I rock their wool socks because I wear boots a lot, and they’re super thick but still breathable.
0:20:48 Now that it’s 500 degrees down here on the Gulf Coast, my go-to socks are their lightweight athletic socks.
0:20:52 They are comfortable, and they really do keep you cool and dry.
0:20:58 You can head over to Bombas.com and use code GRAYAREA for 20% off your first purchase.
0:21:03 That’s B-O-M-B-A-S dot com code GRAYAREA at checkout.
0:21:06 That’s Bombas.com code GRAYAREA at checkout.
0:21:13 Support for the Gray Area comes from Found.
0:21:18 For small business owners, bookkeeping and taxes don’t just drain your wallet.
0:21:20 They steal your time.
0:21:25 Think of all the hours you’ve already spent buried in spreadsheets instead of building your business.
0:21:31 Hours you could have spent playing catch with little Tina or playing house with tiny Frank.
0:21:36 If it’s hit a little too close to home, it’s probably time to give Found a look.
0:21:40 Found is business banking designed for small business owners just like you.
0:21:46 Found lets you manage your financial tasks effortlessly, all in one, easy-to-use app.
0:21:52 They say the platform can help you manage your money, track your spending, invoice your clients, and even handle your taxes.
0:21:55 So you can focus on what matters most, your customers.
0:22:02 You can open a Found account for free at found.com, F-O-U-N-D dot com.
0:22:05 Found is a financial technology company, not a bank.
0:22:08 Banking services are provided by Piermont Bank, member FDIC.
0:22:10 You don’t have to put this one off.
0:22:16 You can join thousands of small business owners who have streamlined their finances with Found.
0:22:33 Let’s get into the political fight, which is what interests me maybe the most.
0:22:36 People have a lot of assumptions about this.
0:22:39 I’m just going to ask the question so that it’s clear.
0:22:45 How much control does Beijing actually have over TikTok?
0:22:47 Do we even know the answer to that?
0:22:50 You use the word control.
0:22:54 I’m going to use the word leverage, which is a little bit different.
0:23:08 The way that the Chinese state operates, members of the government, the military, the police can come knock on your door and say, you’re going to do what I ask or else.
0:23:15 And or else is like, they can just cart you off to some, you know, undisclosed location.
0:23:22 They can cart your grandmother off to some undisclosed location until you comply with their request.
0:23:35 And so the concern about TikTok is that if the Chinese government were so inclined, it could go knock on the doors of ByteDance employees who might not be fans of the CCP.
0:23:36 Doesn’t matter.
0:23:37 Doesn’t matter what their politics are.
0:23:41 Basically hold a gun to their heads and say, you’re going to do what I say or else.
0:23:59 And so if those people have access to data about Americans that the CCP wants to surveil, or if that person has access to the heating button, or access to make a tweak to the algorithm, the CCP could make them do it.
0:24:06 And so you ask, how much control does Beijing, does the Chinese government have over TikTok?
0:24:18 It has a lot of leverage over TikTok, so long as there are Chinese people working at TikTok who are living in China who can be sort of shaken down by their own government.
0:24:26 There is not extensive evidence, though, that the Chinese government has actually tried to do that very much.
0:24:36 Yeah, I think it’s fair to at least note that having the ability to do something isn’t the same thing as actually doing it, but the ability alone is still noteworthy.
0:24:37 I’ll put it that way.
0:24:42 But there’s a scene in the book about this new, I don’t remember his name, maybe Rob, I don’t know.
0:24:44 He’s a new TikTok hire, right?
0:24:45 He’s an American guy.
0:24:54 And his job was to map all of this internal data and figure out who could access what.
0:25:01 And he’s in a meeting with a tech consultant who was brought in to audit all the data, all the data flows.
0:25:09 And the consultant tells him in a meeting that he’s looking at all these tools TikTok uses internally.
0:25:13 And he says that, you know, it’s kind of weird.
0:25:18 They all seem to have a backdoor to access user data, all of them.
0:25:20 Tell me if I’m misunderstanding what that means.
0:25:29 But it seems to me that none of the user data is secure and it can be accessed either by people within TikTok or, by extension, the Chinese government at any time.
0:25:30 Do I have that right?
0:25:32 I think mostly yes.
0:25:37 Certainly at the time that that meeting took place, that that seemed to be the case.
0:25:38 How long ago?
0:25:39 Do you remember?
0:25:40 That meeting was in 2021.
0:25:41 Okay.
0:25:54 The idea of there being backdoors, when that person said that in the meeting, the way I interpreted it, the way I understood it, was not that those backdoors were, like, maliciously placed.
0:25:58 It’s that the way, like, this is an app that was built by people in China.
0:26:01 They were building it to be sticky.
0:26:04 They were building it to be, you know, addictive.
0:26:06 They were building it to serve people content that they wanted to see.
0:26:08 They weren’t necessarily building it for geopolitical purposes.
0:26:11 There’s not evidence that that was happening.
0:26:13 But they were Chinese people building an app.
0:26:21 And so it was efficient for the purposes of the thing that they were building to have user data flow many, many, many, many places.
0:26:32 And if you’ve ever worked inside a big tech company, you’ll know just how many internal tools there are and how much they talk to each other, which is just, like, a huge amount.
0:26:38 The app that you use, TikTok, in your house, there are, like, 500 internal apps propping that thing up.
0:26:43 And it’s not that they were built maliciously.
0:26:51 It’s that they were built without concern for the idea that this information should not be accessed from China.
0:26:56 I did think it was kind of wild to learn.
0:27:08 And it does seem relevant to what we’re talking about right now, that ByteDance employees used TikTok data to track journalists, including you.
0:27:10 Why did they do that?
0:27:12 And how did you find out about it?
0:27:14 Yeah, that was definitely wild.
0:27:27 So the meeting that I was just describing was one of many meetings whose recordings were leaked to me by a whistleblower who had worked at TikTok.
0:27:30 And there were a lot of them.
0:27:34 There were, like, 80 recorded meetings, 80-plus recorded meetings.
0:27:38 I listened to all these meetings many, many times.
0:27:39 I took notes on the meetings.
0:27:40 I tried to understand the meetings.
0:27:43 I then went to a bunch of people inside TikTok and said,
0:27:44 Huh, tell me about this thing.
0:27:45 Tell me about that thing.
0:27:49 Sort of confirmed that they were what the person said they were.
0:27:51 And then I wrote a big story about it.
0:27:57 Because what those meetings demonstrated was, and this was, the story came out in 2022.
0:28:00 The meetings were in 2021 and 2022.
0:28:10 At least at that time, functionally, all U.S. user data was accessible to people in China.
0:28:15 And they were using that data, I don’t know, to target ads, to make the app stickier.
0:28:19 Like, not necessarily for geopolitical purposes, but it was accessible to them there.
0:28:25 And if it’s accessible to them, and if the CCP can come to your door with the gun at your head and say, you’re going to give me this data,
0:28:29 then there is a risk that the CCP could seize that data.
0:28:31 Could have already, and we just don’t know about it.
0:28:31 Right?
0:28:36 And so I wrote this big story in 2022 based on all these tapes.
0:28:48 And TikTok and ByteDance had an internal audit and risk control department that was in charge, among other things, of employee misconduct and investigating employee misconduct, including leaks.
0:28:55 They’re pretty freaked out to learn that a whole bunch of the company’s internal business was recorded and leaked to a journalist.
0:28:58 And so they start investigating that leak.
0:28:59 Not terribly surprising.
0:29:00 That’s kind of their job.
0:29:05 And in the course of investigating that leak, they come up with a strange scheme.
0:29:15 They say, okay, well, let’s pull the data from the journalist’s TikTok account to get her IP addresses.
0:29:23 And then let’s compare that to the IP addresses of all of our employees and see if there’s a match.
0:29:37 And so if I had been at a cafe or a library or a public park, and my IP address matched the IP address of any TikTok or ByteDance employee’s device, that could be a sign that that person had met me and was talking to me.
0:29:38 Wow.
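Mechanically, the scheme described here is a join between two access logs. A minimal sketch with invented data and names, just to show how ordinary app IP logs can double as location surveillance:

```python
from datetime import datetime, timedelta

# Hypothetical logs: (public IP, timestamp) pairs pulled from the
# journalist's account and from employee devices.
journalist_log = [
    ("203.0.113.7", datetime(2022, 6, 1, 14, 5)),   # cafe wifi
    ("198.51.100.2", datetime(2022, 6, 3, 9, 30)),  # home
]
employee_logs = {
    "employee_a": [("203.0.113.7", datetime(2022, 6, 1, 14, 20))],
    "employee_b": [("192.0.2.99", datetime(2022, 6, 1, 14, 20))],
}

def possible_meetings(window: timedelta = timedelta(hours=1)) -> list:
    hits = []
    for ip_j, t_j in journalist_log:
        for emp, log in employee_logs.items():
            for ip_e, t_e in log:
                # The same public IP in the same window suggests the same
                # physical network -- e.g., the same cafe or library.
                if ip_j == ip_e and abs(t_j - t_e) <= window:
                    hits.append((emp, ip_j, t_j))
    return hits

print(possible_meetings())  # flags employee_a as a potential source
```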
0:29:45 Now, there’s been a lot of conversation about how TikTok and ByteDance use TikTok to surveil journalists, and that was bad.
0:29:46 And it was.
0:29:59 But also, we forget that this also means that they’re monitoring the movements of all of their staffers to try to see where they’re going, which is also, you know, you sign up to work somewhere, but still, that’s a lot.
0:30:01 I guess, man.
0:30:03 I wouldn’t want that.
0:30:09 I am made aware of this scheme by whistleblowers inside the company.
0:30:14 I write about it, and I hold certain details back.
0:30:20 So I write this story in October 2022 about how TikTok had planned to surveil U.S. persons.
0:30:30 And I knew at the time that I was one of the U.S. persons that they were planning to surveil, but I didn’t reveal that because I was really being careful to keep my sources on this protected.
0:30:34 And TikTok and ByteDance come out and say, that’s ludicrous.
0:30:35 That’s not true.
0:30:36 We don’t know what you’re talking about.
0:30:37 That’s ridiculous.
0:30:39 And I’m like, I don’t know.
0:30:40 I heard what I heard.
0:30:41 I saw what I saw.
0:30:42 I think you’re doing it.
0:30:52 And so after they come out and call me a liar, which they did, they hire a white shoe law firm to investigate why I reported that this was the case.
0:30:59 And that white shoe law firm, Covington, very reputable law firm, finds essentially what I found.
0:31:09 They also found more than I knew, which is that ByteDance had also done this to another reporter who was reporting out of London on problems in their London office.
0:31:10 That person’s name is Christina Criddle.
0:31:17 She was also surveilled by the same sort of ByteDance team under the same theory that they could figure out who was leaking to her.
0:31:29 Do you buy the idea that TikTok and ByteDance, for reasons you explain in the book, aren’t really separable?
0:31:35 Do you buy that they pose a special kind of national security threat?
0:31:41 Because, as you know, that is sort of the question around which a lot of the discourse orbits.
0:31:48 Um, yes, I think they do pose a national security threat.
0:31:58 I also think that X and Meta and YouTube might, too, just because they are so large and so powerful.
0:32:08 I am somewhat skeptical of the idea that most people’s TikTok videos are sort of of national security importance.
0:32:13 I don’t think the CCP cares very much about what most of us are posting online.
0:32:24 But I definitely think the CCP cares about Chinese dissidents who have left China and are sort of criticizing Chinese government policy from afar.
0:32:26 I think it definitely wants to know what those people are doing.
0:32:29 It could definitely use TikTok to surveil those people.
0:32:33 It could use TikTok to surveil, you know, active members of the U.S. military.
0:32:36 I do think those concerns are legitimate.
0:32:49 And I also think that we should all be worried about all big social media platforms being weaponized by foreign governments that are trying to fuck with our discourse.
0:32:53 Because we know they’ve done that before and they’ll try to do it again in any way they can.
0:32:54 Am I allowed to swear?
0:32:55 Oh, fuck yeah.
0:32:56 Okay.
0:32:56 I do.
0:32:58 Because I just did.
0:33:04 I don’t, probably to the chagrin of some of my editors, I have a bit of a potty mouth.
0:33:05 I’m working on it.
0:33:18 A question I had rolling around in my head when I started your book was, even if China does have access to all this user data, what’s the real tangible danger?
0:33:21 I mean, so what if they know who likes what or who watches what?
0:33:30 What can they really do with that apart from, you know, boosting their algo and making more money because their app becomes more addictive?
0:33:40 But there is a hypothetical in your book from that same American TikTok employee we were just talking about a second ago.
0:33:46 And if you’ll, you’ll bear with me, there was a passage that I wanted to just read.
0:33:55 You write, imagine a person commits a mass shooting and we learn that he expressed hateful views on TikTok before committing his crime.
0:34:05 Many people within ByteDance could easily pull up a list of all the people who had liked and engaged with the shooter’s video or exchanged messages with him.
0:34:13 They could pull up a list of people who were interested in content from other accounts similar to the shooter’s or those that had blocked him or reported him.
0:34:21 They might use these lists for good, perhaps targeting those people with anti-extremism messaging or anti-bullying and anti-harassment resources.
0:34:29 But they could also do the opposite, targeting them with further radicalizing or demoralizing messages.
0:34:32 That seems really plausible to me.
0:34:33 Does it seem plausible to you?
0:34:46 So before I became a reporter, I worked in what we call trust and safety, the sort of content policy area that tries to prevent people from seeing terrible things on the Internet that are going to hurt them for whatever reason.
0:34:56 And trust and safety employees spend much of their lives thinking about how these tools could be misused to hurt people.
0:35:02 And, you know, they think about how to get suicide-inducing content off the platform.
0:35:20 They try to, you know, they’re thinking about all of the ways that either bad actors can use these platforms to manipulate and harm people, or that even if a bad actor isn’t posting something, something could really harm someone, you know, induce them into an eating disorder, that kind of thing.
0:35:27 The person who was engaging in this hypothetical, like, that’s his job.
0:35:29 That’s what he’s supposed to be thinking about, right?
0:35:35 And I think it is wise for us all to spend some time thinking about that type of thing, too.
0:35:51 Now, the idea that someone within ByteDance or someone with a gun to the head of someone within ByteDance would actually try to radicalize mass shooters, that’s a pretty extreme, malevolent thing for someone to do.
0:35:56 And so when you talk about plausibility, like, I don’t know, that’s a pretty dark one, man.
0:36:03 I can think of more plausible, also problematic things that someone might do.
0:36:06 But that one is really poignant, right?
0:36:11 And it shows the sort of harm that could be done with a tool like this.
0:36:16 And it’s worth noting that, like, these systems exist on other platforms, too.
0:36:20 They don’t have the same employee shakedown problem, right?
0:36:22 But they could have malevolent employees, right?
0:36:28 Twitter at one point was infiltrated by several, you know, agents of the Saudi government, which then used it to exfiltrate data.
0:36:37 And so any platform that puts a person who is ultimately untrustworthy in one of those positions could be dealing with a similar sort of threat.
0:36:48 Yeah, I mean, I should say by plausible, I really just mean not necessarily that this has happened or will happen or is happening, but a very plausible capability.
0:36:52 It just sort of makes the point that how this can be weaponized.
0:36:53 Absolutely.
0:37:00 And, you know, to point out some other ways that are easier, right?
0:37:07 Maybe, I mean, we know that there are whole sub-communities on TikTok of people who believe conspiracy theories.
0:37:08 It’s huge.
0:37:11 It’s a huge, you know, part of TikTok.
0:37:27 If the CCP put a gun to someone’s head and said, I want a list of all of the accounts and the personal data behind the accounts of people who are susceptible to conspiracy theories, they could seed another QAnon if they wanted to.
0:37:37 I’m not saying that they’re doing this, but like when you’re trying to think about the possibilities of what a bad actor could do with this type of data, they go on and on and on.
0:37:55 As a BMO Eclipse Visa Infinite cardholder, you don’t just earn points.
0:37:57 You earn five times the points.
0:38:02 On the must-haves like groceries and gas and little extras like takeout and rideshare.
0:38:03 So you build your points faster.
0:38:07 And then you can redeem your points on things like travel and more.
0:38:09 And we could all use a vacation.
0:38:12 Apply now and get up to 60,000 points.
0:38:13 So many points.
0:38:16 For more info, visit BMO.com slash Eclipse.
0:38:18 Visit us today.
0:38:20 Terms and conditions apply.
0:38:24 Support for this show comes from Wealthfront.
0:38:27 It’s always hard to figure out where to put your money.
0:38:31 One option is a Wealthfront cash account with no minimum balance or account fees.
0:38:34 Right now, they say you can earn a 4% APY.
0:38:39 Plus, you get free instant withdrawals to eligible accounts every day.
0:38:42 So your money is always accessible when you need it.
0:38:46 No matter your goals, Wealthfront can give you flexibility and security.
0:38:54 Right now, you can open your first cash account with a $500 deposit and get a $50 bonus at Wealthfront.com slash gray area.
0:38:57 That’s Wealthfront.com slash gray area.
0:38:59 Bonus terms and conditions apply.
0:39:04 Cash account offered by Wealthfront Brokerage, LLC, member FINRA/SIPC, not a bank.
0:39:12 Annual percentage yield on deposits as of September 26, 2025 is representative, subject to change, and requires no minimum.
0:39:16 Funds are swept to program banks where they earn variable APY.
0:39:25 Lock the front door.
0:39:25 Check.
0:39:27 Close the garage door.
0:39:27 Yep.
0:39:31 Installed window sensors, smoke sensors, and HD cameras with night vision.
0:39:31 No.
0:39:37 And you set up credit card transaction alerts, a secure VPN for a private connection, and continuous monitoring for our personal info on the dark web.
0:39:40 I’m looking into it.
0:39:42 Stress less about security.
0:39:46 Choose security solutions from Telus for peace of mind at home and online.
0:39:50 Visit telus.com slash total security to learn more.
0:39:51 Conditions apply.
0:40:11 Is there any evidence that China is using TikTok, has used TikTok, as an ideological weapon?
0:40:22 Any evidence, for example, that they have turned up the dial on certain kinds of content in order to destabilize other countries, other populations?
0:40:24 Hmm.
0:40:26 You made it complicated at the very end there.
0:40:29 Sorry, not sorry.
0:40:30 Yeah.
0:40:40 So, in the United States, I know of no evidence that the Chinese government has used TikTok to try to interfere with discourse in any way.
0:40:50 They did have content policies that restricted discourse about China years ago, but those content policies were changed years ago.
0:41:12 There is classified information that led the U.S. government to believe that the Chinese government has ordered ByteDance to manipulate discourse in some way in at least one other country around the world, and that is as much as I know about that.
0:41:30 But in the litigation, in the court filings, where ByteDance contested the law that would force it to sell TikTok or see it banned in the United States, there was this oblique mention of some sort of manipulation overseas somewhere.
0:41:33 And so, I don’t have that information.
0:41:43 I’m not going to vouch for it, but there is this little classified Easter egg that suggests that there’s something we don’t know that is pertinent to your question.
0:41:48 Well, I mean, I should say, there’s nothing terribly unusual.
0:41:56 Like, every country is interested in influencing the populations in their geopolitical rivals, right?
0:41:58 There’s nothing uniquely mendacious.
0:41:59 Including the United States, by the way.
0:42:00 Of course, right?
0:42:02 So, there’s nothing exceptional about that.
0:42:13 It’s just, we’re just talking about maybe the most powerful, persuasive tool in the world in the hands of what happens to be our greatest geopolitical rival at the moment, which makes it worth talking about.
0:42:21 Well, look, ByteDance did try to ease U.S. fears.
0:42:35 And the way they did that was they launched this thing called Project Texas, which was a plan to wall off American data, like we’ve been talking about, and put it strictly under American oversight.
0:42:40 For people who don’t know anything about this, how did that experiment go?
0:42:50 So, Project Texas was ByteDance and TikTok’s big bet to solve the employee shakedown problem, right?
0:42:54 This wasn’t a thing that they had thought about when they were building TikTok.
0:42:56 It didn’t cross their mind.
0:42:57 It wasn’t an issue.
0:43:04 And they realized after people in the United States started yelling about it being a national security risk, they were like, oh, okay.
0:43:07 We see what the problem is.
0:43:08 Like, we see the risk here.
0:43:20 And so, the best way to try to make sure that your employees don’t get shaken down is to put a “driver carries no cash” sticker on their car, right?
0:43:27 If there’s no cash to get, the likelihood of you being held up goes way down.
0:43:29 It’s not zero, but it goes way down.
0:43:46 And so, the thought was that ByteDance and TikTok would cut off access to sensitive user information, and later on also to ways to tweak the algorithm, though that’s more complicated.
0:43:48 This was mostly a data project.
0:43:57 That they were going to cut off access to, like, private U.S. users’ data to people who were in China.
0:44:02 And if people in China can’t get it, then the CCP can’t go to their house and make them turn it over.
0:44:07 In theory, that’s a reasonably, you know, that’s a reasonable plan.
0:44:08 That makes some sense.
0:44:18 And by all accounts, they sunk a huge amount of money, multiple billions of dollars, into trying to complete this bifurcation.
0:44:26 Now, I’m going to go back to those 500 internal apps, propping up the app that you have as a, like, normal consumer.
0:44:29 There are so many pipes.
0:44:44 And all the pipes have so much data running through them that actually cutting off access to user data became this sort of maze-like, Sisyphean, horrible task.
0:44:50 And that’s why it was so incredibly expensive.
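The engineering idea behind the wall, and its failure mode, can be sketched in a few lines. This is a hypothetical illustration of region-gated data access, not Project Texas’s actual design; the point is that the gate only helps if every one of those hundreds of internal pipes goes through it, and a single ungated legacy path defeats the whole wall.

```python
class AccessDenied(Exception):
    pass

USER_DATA = {"us_user_42": {"region": "US", "email": "x@example.com"}}

def fetch_user(user_id: str, caller_region: str) -> dict:
    record = USER_DATA[user_id]
    # The gate: U.S. user data never flows to callers outside
    # U.S.-controlled infrastructure.
    if record["region"] == "US" and caller_region != "US":
        raise AccessDenied("US user data is walled off")
    return record

def legacy_analytics_export(user_id: str) -> dict:
    # An old internal tool that predates the wall and was never
    # retrofitted with the gate -- the "last mile" problem.
    return USER_DATA[user_id]

try:
    fetch_user("us_user_42", caller_region="CN")
except AccessDenied as e:
    print("gated path:", e)

print("ungated path:", legacy_analytics_export("us_user_42"))
```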
0:44:58 And ultimately, the U.S. government thought, there’s no way that they can do this all the way.
0:45:15 When we think about security, when we think about, like, hacks, right, we think about leaks of data, it’s common in the cybersecurity industry to hear people say, it’s not if, but when, right?
0:45:25 And that if you think that your security is 100%, you are wrong and you are foolish and you have to be prepared for when there are breaches.
0:45:39 And that, I think, is what tripped up TikTok and ByteDance because they did a pretty good job of closing off most of the data pathways.
0:45:46 But, like, that last mile problem in cybersecurity is basically insurmountable.
0:45:53 And so, inevitably, there were still people who still had access to things they shouldn’t have access to.
0:45:56 Lots of them in lots of different scenarios.
0:45:59 The wall wasn’t complete.
0:46:09 And there was always a fear that it wouldn’t be complete unless TikTok was fully spun off.
0:46:14 And so, TikTok and the government negotiated about Project Texas for years, right?
0:46:20 And eventually, the government was just like, we know you’re trying.
0:46:26 We have doubts about the idea that this will ever be actually foolproof.
0:46:38 Well, the way they were trying to deal with that, I guess, but the Trump White House was trying to force TikTok to sell itself to an American company.
0:46:44 Yeah, so the first attempt to regulate TikTok was in the first Trump administration.
0:46:48 And Trump decided that TikTok was a national security risk.
0:46:52 He was surrounded by somewhat hawkish folks at the time.
0:47:02 Plus, a bunch of teenagers had just made him look silly on the Internet by punking ticket registration for his comeback rally in Tulsa.
0:47:03 So, he didn’t like TikTok very much.
0:47:05 He said, yeah, okay, we’re going to ban TikTok.
0:47:06 That’s what we’re going to do.
0:47:16 People around him eventually persuaded him that what was better than banning TikTok is just forcing ByteDance to sell it to a U.S. company so that U.S. companies can make some money there, too.
0:47:17 So, he said, okay, fine.
0:47:20 He was clumsy in his attempts to do that.
0:47:23 He used a law that absolutely doesn’t allow him to do that.
0:47:28 And so, TikTok and ByteDance took him to court, said, you don’t have the authority to do that under this law.
0:47:29 What are you talking about?
0:47:32 Then he lost the election, left office.
0:47:34 TikTok and ByteDance lived on.
0:47:41 The Biden administration comes into office, says, yeah, yeah, yeah, we’re not going to ban TikTok.
0:47:50 We’re going to negotiate with TikTok and we’re going to try to find a way that we can mitigate these legitimate issues, but keep TikTok around because we don’t want to ban it.
0:47:56 They negotiate about Project Texas for the better part of two years.
0:47:59 Eventually, they’re just like, yeah, I don’t know, man.
0:48:02 I don’t think this is going to be good enough.
0:48:07 I don’t think there’s a way to actually meaningfully separate ownership and control this much.
0:48:12 And so, eventually, the Biden administration says, yeah, no, you’re going to have to sell it or we’re going to ban it.
0:48:23 And the Biden administration knows that the best way of actually achieving a ban is not through executive action, but through Congress.
0:48:27 And so, the Biden administration works with Congress and they say, let’s pass a law.
0:48:34 We think we need to pass a law here to actually require ByteDance to sell TikTok or see it banned.
0:48:39 And Congress passes the law with huge bipartisan margins.
0:48:42 ByteDance challenges that law.
0:48:44 Case goes all the way to the Supreme Court.
0:48:48 Supreme Court says, yeah, this law is constitutional.
0:49:01 And then, right as Trump is coming back to power, by sheer coincidence, the law was actually set to go into effect one day before the inauguration.
0:49:06 And Trump says, oh, no, I’m going to save TikTok.
0:49:09 I’m going to make sure that TikTok doesn’t go dark.
0:49:14 TikTok sort of flickers itself off for dramatic impact.
0:49:15 Very dramatic.
0:49:25 Right as Trump comes into power and is inaugurated, there is a binding U.S. law saying TikTok should be banned right now.
0:49:30 And Trump says, DOJ, do not enforce this law.
0:49:36 And TikTok has lived in that strange legal purgatory ever since.
0:49:45 I just want to ask, the very last line of the book stopped me in my tracks a little bit.
0:49:53 Because I am someone who has very publicly been sounding the alarm on AI and the dangers of AI.
0:49:56 I’m very concerned about where we’re going.
0:50:14 And you just sort of say that while all these battles over TikTok are going to keep raging on, the creator of TikTok, Yiming, he’s already moved on or he’s already moving on to the next thing, which is artificial general intelligence.
0:50:18 I just wanted to ask what you meant by that.
0:50:24 What role do you think AI might play in whatever he does next or whatever TikTok does next?
0:50:30 So the way I understand Yeming, he’s a guy who likes to think about hard tech problems.
0:50:36 And TikTok’s hard tech problems are basically solved.
0:50:38 TikTok is a pretty smooth machine right now.
0:50:40 It does what it is supposed to do.
0:50:45 There’s lots of, you know, technological innovations still to be happening on TikTok.
0:50:49 But as a sort of basic product, it’s cooked and it tastes great.
0:51:04 And so I think what Yiming wants to think about and what really gets him excited is figuring out how to create that kind of product and figuring out what that looks like.
0:51:06 I think he likes the earlier stages of it.
0:51:21 By the time your main job is, like, either on the sort of marketing and sales side or on the sort of political and lobbying side, I think he’s a real tech guy’s tech guy.
0:51:23 And he wants to think about building cool stuff.
0:51:29 And so that’s why I think, you know, he, like many other Silicon Valley founders right now, is obsessed with AI.
0:51:47 Well, I just bring it up because, you know, look, if we’re able to create a digital product this addictive without the help of AI or AGI, I shudder to think how addictive or how potent a product supercharged by that kind of tech might be.
0:51:48 Yeah, absolutely.
0:52:07 And I think, you know, the story of TikTok is not a story about AI or AGI, but thinking about why it shows us what it shows us and the amount of control we give it in shaping our realities is really important.
0:52:18 So the questions of the TikTok discourse, in my mind, and I’m biased because I just wrote a book about TikTok, but I really think they are the same questions that we need to be asking about AI, too.
0:52:28 A former FTC commissioner famously said, when you think about an algorithm, you should just instead replace algorithm in your brain with a guy named Bob.
0:52:36 If a guy named Bob shouldn’t be, you know, price fixing across these industries, an algorithm shouldn’t be doing it either.
0:52:43 And if a guy named Bob shouldn’t have access to all these social security numbers, an algorithm shouldn’t either.
0:53:07 And I think when we forget that algorithms are made by people to serve those people and their interests, we sometimes get ourselves into trouble and allow those algorithms too much power over our lives, our information diet.
0:53:21 And just remember that when you give something to an algorithm, you’re giving it to a group of people and you have to think about whether you actually want to give that to that group of people.
0:53:23 That’s a perfect place to end.
0:53:29 Once again, the book is called Every Screen on the Planet: The War Over TikTok.
0:53:35 It’s a great book and you are a fabulous reporter and I enjoy this conversation.
0:53:36 So thanks for having it.
0:53:37 Thank you so much.
0:53:52 After I spoke to Emily, the White House announced a plan to transfer TikTok’s U.S. operations to a joint venture that will be controlled by a majority of American investors and owners.
0:54:02 Although details of the plan have yet to be finalized, I reached out to Emily to find out what this might mean for TikTok and its millions of American users.
0:54:07 Emily Baker White, welcome back.
0:54:08 Thank you.
0:54:09 Happy to be here.
0:54:19 When we originally spoke a few weeks back, the legal future of TikTok in America was still very unclear.
0:54:24 We’re talking now because there have been some developments since then.
0:54:25 What do we know right now?
0:54:27 What do we not know?
0:54:28 What do you make of it?
0:54:36 We have both sides, the Chinese government negotiators and the U.S. government negotiators, saying that they have made progress toward a deal.
0:54:40 The Trump administration is more willing to call it a deal now.
0:54:49 We have Trump saying that there basically is a deal and he has provided an extension to non-enforcement of the law that would ban TikTok if it is not sold.
0:54:51 So that much has actually happened.
0:55:01 We have some reporting saying that Trump will soon sign an executive order saying that the deal meets the requirements of last year’s law, that banner sale law.
0:55:02 That’s interesting.
0:55:07 Not everyone will necessarily agree with the president that the requirements of that law have been met.
0:55:15 But the law does give the president sort of pretty wide latitude in deciding whether those criteria have been met.
0:55:19 So whether anyone can do anything about it if they disagree with the president is uncertain.
0:55:40 We know that the people who are set to buy and control a new U.S. TikTok include Oracle, the corporation controlled by Larry Ellison, a famous Trump ally, and Andreessen Horowitz, which of course is largely controlled by Marc Andreessen, who is another important Trump ally.
0:55:45 Ben Horowitz has had his own sort of flirtations with Trump, though those are less clear.
0:55:50 And possibly now Trump is saying the Murdochs of Fox News may also get in on the action.
0:55:57 So we don’t know what level of involvement any of those parties would have in the deal.
0:55:59 It sounds like Oracle’s involvement is large.
0:56:12 Oracle, of course, already has a huge contract with TikTok and has been talking to the U.S. government for years about how their participation could help ameliorate the national security concerns about the app.
0:56:20 All that said, we still don’t know the terms of the deal and we don’t know what rights would be conferred to whom if this thing actually closes.
0:56:23 And this deal has not closed.
0:56:31 There are a lot more details to be hashed out and certainly a lot more details for the public to find out about for us to really know what we’re grappling with here.
0:56:34 Do we know much of anything about the contours of the deal?
0:56:48 So one of the biggest things we know is that we’ve got now both the Chinese negotiators and the Trump administration saying that ByteDance will continue to own the recommendations algorithm that has made TikTok, TikTok.
0:56:51 This is obviously a huge deal.
0:56:54 It’s been a sticking point in negotiations for years about the app.
0:56:59 And the idea is that the new U.S. TikTok will license the algorithm from ByteDance.
0:57:03 Some people will immediately say, that sounds like a violation of the law.
0:57:06 The law requires a clean break between TikTok and ByteDance.
0:57:07 They’re not supposed to be working together.
0:57:10 ByteDance isn’t supposed to be involved at all anymore.
0:57:15 The problem is, especially in software, license can mean a lot of things.
0:57:24 And so it’s possible if the license is pretty open and ByteDance says, okay, we’re licensing this thing.
0:57:25 We’re giving you this thing.
0:57:26 You can do what you want with it.
0:57:27 We’re not involved anymore.
0:57:33 That that would honor like what the law wanted to have happen and there would be meaningful separation.
0:57:42 It’s also possible, if it’s a different kind of license, that ByteDance still essentially has the ability to control or warp or change what U.S. TikTok users are seeing.
0:57:48 And we just need more clarity about what kind of license we’re talking about to be able to answer that question.
0:57:56 The language I read this morning was that Oracle is going to, quote, retrain the algorithm from the ground up, end quote.
0:57:58 I don’t know what that means.
0:57:59 Do you know what that means?
0:58:04 This is an algorithm that’s been trained over many years, over many, like, different sources of data.
0:58:13 And if Oracle is going to retrain the algorithm from the ground up, I have a couple questions about deal terms that will determine how they’re able to do that.
0:58:24 Is ByteDance going to send over, as part of this deal, every piece of content that they fed the algorithm in the first place in order to make it do what it does?
0:58:27 Do they even have all that stuff still?
0:58:28 This is a huge corpus of material.
0:58:33 And algorithms are only as good as their inputs, right?
0:58:40 When you have a model that’s predicting behavior, the more you know about behavior, the better it’s going to be at predicting it.
0:58:46 And so the For You algorithm has a huge number of inputs.
0:58:50 And a big question is whether they’re going to turn over all those inputs as part of this license or lease.
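To make that point about inputs concrete: the sketch below is a minimal toy in Python, not anything from TikTok’s or ByteDance’s actual systems, and every name in it is hypothetical. It shows how a simple engagement-prediction model gets measurably better as it is trained on more behavioral data, which is why the volume of historical inputs that changes hands in a deal like this matters so much.

import random

random.seed(0)

# Hidden "true" user preferences over five content topics:
# the probability the user engages with a video on each topic.
TRUE_PREFS = [0.9, 0.7, 0.2, 0.1, 0.05]

def simulate_interactions(n):
    """Generate n (topic, engaged) pairs from the hidden preferences."""
    logs = []
    for _ in range(n):
        topic = random.randrange(len(TRUE_PREFS))
        engaged = random.random() < TRUE_PREFS[topic]
        logs.append((topic, engaged))
    return logs

def train(logs):
    """Estimate per-topic engagement rates from observed behavior."""
    counts = [0] * len(TRUE_PREFS)
    engages = [0] * len(TRUE_PREFS)
    for topic, engaged in logs:
        counts[topic] += 1
        engages[topic] += engaged
    # With no data on a topic, fall back to a neutral 0.5 guess.
    return [engages[t] / counts[t] if counts[t] else 0.5
            for t in range(len(TRUE_PREFS))]

def error(model):
    """Mean absolute error of the model versus the hidden preferences."""
    return sum(abs(m - p) for m, p in zip(model, TRUE_PREFS)) / len(model)

# More behavioral inputs -> better predictions of future behavior.
for n in (10, 100, 10_000):
    model = train(simulate_interactions(n))
    print(f"{n:>6} interactions -> mean error {error(model):.3f}")

The real For You model is vastly more complicated, of course, but the scaling logic is the same: a behavior-prediction model cut off from its historical inputs has to relearn those preferences from scratch.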
0:58:59 So the actors on the U.S. side that make up this joint venture group or consortium, what are they going to get out of this, apart from, I guess, a lot of money?
0:59:02 A lot of money is the reason a lot of people do deals.
0:59:05 So I don’t want to pooh-pooh the money.
0:59:10 Clearly, they think that this is good for their business or presumably they wouldn’t be in it.
0:59:30 A lot of people, especially a lot of people on the left, are looking at this and saying Trump is delivering a huge organ of speech to a bunch of his political allies.
0:59:31 That’s right.
0:59:33 That seems to be true.
0:59:42 Now, it’s not clear that his allies will immediately say, we’re going to MAGA-fy this, because that’s not really good for business.
0:59:44 And so it’s not clear that they would want to do that.
0:59:51 Like, X hasn’t been terribly successful financially since it sort of went anti-woke.
0:59:56 And so I don’t think we know that they will immediately change the sort of political valence of TikTok.
1:00:08 But certainly, ultimately, the owners of the platform would be able to make decisions and guide decisions about content policy, what speech is allowed, what the platform considers bullying or hate speech and whatnot.
1:00:20 But all these guys, Andreessen, Ellison, the Murdochs, it’s not a coincidence, right, that they’re all political allies of Trump.
1:00:31 I think if the Soros group wanted to get in on this, if Warren Buffett wanted to get in on this, I’m not at all sure that Trump would be interested in helping make that happen, right?
1:00:41 And I think you’re clearly looking at a president who has involved himself way more in the private sector and private sector deals than any president in recent history.
1:00:51 And when I think about the law that Congress passed, right, in a way, Congress was trying to curtail presidential authority in the way they passed this law.
1:00:57 But the way the law works, it still gave an immense amount of power to the president at the end of the day.
1:01:06 And I think a number of people who passed that law weren’t envisioning a president who was so willing to engage in naked self-dealing.
1:01:21 I think a lot of the people who were passing laws over the past number of years might have done some things differently had they imagined an executive as unusual and unprecedented as the one we now find ourselves having.
1:01:23 And I think that’s just true.
1:01:27 I don’t think a lot of people would have done this if they had thought it was going to go this way.
1:01:39 I want people to know, I mean, at least as of this morning, Ellison, I believe, is now the second richest person in the world.
1:01:49 And if this goes through, he will unquestionably be the most powerful media mogul in the country and one of the most powerful on the planet, right?
1:01:54 Yeah, I mean, he’s right up there with, you know, I would say Murdoch.
1:02:08 We’re watching a massive consolidation of power in the media and in the news and also an unprecedented moment of speech suppression by the executive.
1:02:27 I mean, the idea that media owners will do things to placate the president, that they will change the mix of information being distributed to people because the president threatens them if they don’t, is now not a theoretical problem.
1:02:37 This is a thing that is happening, and the idea of Trump delivering a TikTok to his allies cannot be understood without that context.
1:02:40 It’s a really scary, dangerous moment.
1:02:42 Yeah, it is.
1:02:58 What do you think it will mean to have these people, these new owners, should this go through, in control of what content gets elevated on TikTok, the most popular social media platform in the country and maybe the world?
1:03:11 So I wrote a whole book about this sort of idea of an authoritarian state that could just lean on companies and make them change the mix of what people are seeing.
1:03:24 And the foil to that throughout the book, throughout most of TikTok’s history until, like, the last few weeks or months, has been a state where the government doesn’t and can’t do that.
1:03:33 And we are now entering a moment where the U.S. government is going to try to do that or has started to try to do that.
1:03:51 We’re going to find out how that goes, but what we are seeing from the U.S. government, in its efforts to change what information people consume and what commentary people consume, is certainly CCP-like.
1:04:03 All right, people, I hope you enjoyed this episode.
1:04:07 It was an education for me, as I said in the conversation.
1:04:18 I never really understood the appeal or the power of TikTok because I am not a user, but I certainly do now.
1:04:22 And I learned a lot in this conversation and from Emily’s book.
1:04:23 I hope you did, too.
1:04:33 And quite frankly, I still don’t know what I think about this sale, or potential sale, to this American group of investors and business people.
1:04:38 I’d love to know what you think.
1:04:42 So send me your thoughts about that or any other part of the episode.
1:04:53 You can drop us a line at thegrayarea@vox.com or you can leave us a message on our new voicemail line at 1-800-214-5749.
1:04:58 And once you’re done with that, please go ahead and rate and review and subscribe to the pod.
1:05:12 Also, The Gray Area is a finalist for a Signal Listener’s Choice Award in the category of thought leadership.
1:05:19 If you’re a fan of The Gray Area, we would love for you to go and vote for us so we can take home that top prize.
1:05:21 We’ll drop a link in the show notes.
1:05:32 This episode was produced by Beth Morrissey, edited by Jorge Just, engineered by Erica Wong, fact-checked by Melissa Hirsch, and Alex Overington wrote our theme music.
1:05:36 New episodes of The Gray Area drop on Mondays.
1:05:37 Listen and subscribe.
1:05:39 The show is part of Vox.
1:05:43 Support Vox’s journalism by joining our membership program today.
1:05:46 Go to vox.com slash members to sign up.
1:05:50 And if you decide to sign up because of this show, let us know.

This week, Sean talks with Emily Baker-White, author of Every Screen on the Planet, about why TikTok feels uniquely addictive, how it turned social media into a push-not-pull entertainment feed, and what happens when human editors inside the company can override the algorithm.

A few days after they spoke, TikTok was in the headlines again. So they jumped on a follow-up call to unpack the latest twists in the saga of who will ultimately control the app’s US operations.

Host: Sean Illing (@SeanIlling)

Guest: Emily Baker-White, reporter and author of Every Screen on the Planet: The War Over TikTok

The Gray Area has been nominated for a Signal Listener’s Choice Award. Vote for The Gray Area here: https://vote.signalaward.com/PublicVoting#/2025/shows/genre/thought-leadership

We’d love to hear from you. Email us at tga@voxmail.com or leave a voicemail at 1-800-214-5749. Your questions and feedback help us make a better show.

Watch full episodes of The Gray Area on YouTube.

Listen ad-free by becoming a Vox Member: vox.com/members

Learn more about your ad choices. Visit podcastchoices.com/adchoices
