Summary & Insights
0:00:07 You’ve got your core holdings, some high conviction picks, and maybe even a few strategic options
0:00:07 at play.
0:00:11 So why not switch to the investing platform built for those who take it seriously?
0:00:17 Go to Public.com slash ProfG and earn an uncapped 1% bonus when you transfer your portfolio.
0:00:20 Go to Public.com slash ProfG.
0:00:24 Paid for by Public Investing. All investing involves the risk of loss, including loss
0:00:29 of principal. Brokerage services for U.S.-listed registered securities, options, and bonds
0:00:31 in a self-directed account are offered by Public Investing, Inc.
0:00:34 Member FINRA and SIPC.
0:00:38 Complete disclosures available at public.com slash disclosures.
0:00:43 Support for the show comes from Blueair air purifiers.
0:00:48 In markets and in life, the fundamentals matter, and taking care of your health is a big one.
0:00:53 The Blue Signature air purifier by Blueair is the most powerful yet compact air purifier
0:00:54 you can get.
0:00:57 It quietly removes pollutants that affect focus, sleep, and longevity.
0:01:02 Blueair is one of the most awarded air care brands in the U.S. and U.K.
0:01:07 Use promo code PROFG25 to save 25% at BlueAir.com.
0:01:14 Did you lock the front door?
0:01:14 Check.
0:01:16 Closed the garage door?
0:01:16 Yep.
0:01:19 Installed window sensors, smoke sensors, and HD cameras with night vision?
0:01:20 No.
0:01:24 And you set up credit card transaction alerts, a secure VPN for a private connection, and
0:01:26 continuous monitoring for our personal info on the dark web?
0:01:29 Uh, I’m looking into it.
0:01:31 Stress less about security.
0:01:35 Choose security solutions from Telus for peace of mind at home and online.
0:01:39 Visit telus.com slash total security to learn more.
0:01:40 Conditions apply.
0:01:42 Today’s number 68.
0:01:46 That’s the percentage of the world’s population that is lactose intolerant.
0:01:47 True story, Ed.
0:01:50 I spent most of my college years as a waiter, and I remember one time someone came in and
0:01:50 said,
0:01:54 I am allergic to shellfish, lactose, and peanuts.
0:01:55 What should I get?
0:01:56 You know what I said, Ed?
0:01:57 What?
0:01:58 The fuck out!
0:02:02 Listen to me.
0:02:03 Markets are bigger than us.
0:02:06 What you have here is a structural change in the wealth distribution.
0:02:07 Cash is trash.
0:02:08 Stocks look pretty attractive.
0:02:10 Something’s going to break.
0:02:10 Forget about it.
0:02:13 Yeah, I think the dirty jokes are better.
0:02:15 I think I’ve got to go dirtier again.
0:02:17 You know, I’m not exaggerating, Ed.
0:02:23 I have been fired from some of the most interesting food establishments in Los Angeles.
0:02:28 I interview really well, which is no surprise. What also is no surprise is that I’m a terrible
0:02:28 employee.
0:02:35 I got fired from Shakey’s Pizza, Island’s Burgers, The Chart House, Monty’s Steakhouse.
0:02:38 I’ve been fired from some of the best restaurants.
0:02:41 LA Sports Club had a restaurant.
0:02:43 I got fired from there, and I became a trainer.
0:02:44 I was much better at that.
0:02:45 You became a trainer?
0:02:47 Yeah, I used to train all these old white dudes.
0:02:48 When were you a personal trainer?
0:02:51 My senior year in college.
0:02:52 Oh, that’s right.
0:02:56 And you kind of did that as well when you were starting out in banking.
0:02:56 Is that right?
0:03:00 You were offering training sessions to the MDs?
0:03:02 Am I remembering that right?
0:03:06 Yeah, every once in a while, when I was in New York, they had a great gym at
0:03:08 Morgan Stanley, and I used to take guys down.
0:03:12 Because if you’re an investment banker, by the time you’re 45 or 55, you’re just a fucking
0:03:13 hot mess.
0:03:14 I mean, you just look like shit.
0:03:17 And then they start freaking out, and they get diagnosed.
0:03:19 Anyways, yeah, I used to, it’s fun.
0:03:20 I kind of enjoy it.
0:03:26 One of the nicest parts of my day: I FaceTime my son, and I take
0:03:26 him through a workout.
0:03:29 It’s a ton of fun.
0:03:30 I really enjoy it.
0:03:30 Where are you?
0:03:31 Are you in London?
0:03:32 Back in London?
0:03:33 I’m in New York.
0:03:35 I’m here for two more days.
0:03:39 I go to this fancy conference in Aspen, which I’m excited about.
0:03:42 And I’ll be there two days, and then I go home to London.
0:03:43 What have you been doing in New York?
0:03:44 Drinking.
0:03:47 What have I been doing in New York?
0:03:52 I got, like, oh, I’m going on the Today Show this afternoon.
0:03:53 Oh, wow.
0:03:56 They called me and said, do you want to come on this morning with Meredith Vieira?
0:03:56 You know?
0:03:58 And I’m like, yeah, oh yeah, great.
0:03:59 And I’m like, how long is it going to be?
0:03:59 They’re like, six minutes.
0:04:00 I’m like, no.
0:04:02 I’m not coming on for six minutes.
0:04:06 You can ask me one question, and then sell pharmaceutical ads, like convince people they
0:04:07 have restless legs or whatever.
0:04:09 And I’m hauling my ass up to Rockefeller.
0:04:11 Anyway, so they call back.
0:04:13 This comes out after I’m on.
0:04:14 Okay, I can say this.
0:04:17 And they said, we’ll do a 30-minute piece or whatever.
0:04:22 So I’m going up for a sit-down with, gosh, I don’t know her name.
0:04:29 I’m sure some woman who was the most talented, attractive woman and then went to McGill College
0:04:34 of Journalism or the one at Northwestern and is now hoping to be the next Barbara Walters
0:04:37 and is finding out that all of her friends are making more money on Substack and podcasts.
0:04:42 So I’m about to go find someone who’s about to make a career change.
0:04:45 So when you say no to these things, how do you word that response?
0:04:46 It’s nuanced.
0:04:46 It’s sophisticated.
0:04:47 No.
0:04:50 I’m very honest with them.
0:04:54 I was supposed to be on with Gayle King on, what’s that show?
0:04:55 Is that the Today Show?
0:04:56 No, CBS Sunday Morning.
0:04:57 I don’t know.
0:04:59 What do they play at the rest homes?
0:05:00 I couldn’t tell you.
0:05:02 I haven’t watched these shows.
0:05:03 Well, that’s because you’re not 90 yet.
0:05:06 Although you little MSNBC man whore.
0:05:08 Oh my God.
0:05:09 By the way, you are out.
0:05:10 I got to give it to you.
0:05:12 You were outstanding.
0:05:14 Your last appearance.
0:05:16 You were so good.
0:05:19 I think you got more play on social.
0:05:21 You were, you were really strong.
0:05:23 Well, this is the interesting thing about this TV stuff.
0:05:28 You go on TV, you think that’s the performance, but it actually turns out it’s the clip
0:05:31 that goes on social media after. That’s where all the action’s really happening.
0:05:35 A hundred percent, which they don’t monetize, or at least I don’t think they do. Do they monetize?
0:05:36 I don’t know.
0:05:40 They post on YouTube, but you know, I don’t know how they’re monetizing that.
0:05:45 I guess they’re just doing the AdSense, but that really is the future for these TV companies.
0:05:53 The format is amazing for social media, actually, because there are these quick, very passionate
0:05:53 hits.
0:05:57 When you clip them into like one or two minute segments, that plays really well.
0:06:02 So we are seeing sort of a resurgence of the cable model, except it’s just not happening
0:06:02 on TVs.
0:06:04 It’s happening on people’s phones.
0:06:05 I see it as organ donation.
0:06:10 I see it as they’re this giant corpse that has lived a long life, taking a ton of energy
0:06:12 and love and resources to keep this person alive.
0:06:19 And then when they’re next to death, they donate their hearts and lungs to social, where
0:06:24 social basically clips the only 90 seconds that were any good in the last hour on,
0:06:25 you know, name your broadcast network.
0:06:26 Why do you care?
0:06:31 Can I just, in terms of the six minute versus 30 minute thing, is it the idea that they’re
0:06:34 not giving you, is it more of a respect thing?
0:06:37 Like I’m Scott Galloway, you should be giving me more of your time.
0:06:44 Or is it that you just don’t want to have to put in the work for
0:06:46 such a small performance?
0:06:48 It’s total fucking game.
0:06:51 I mean, Noam Chomsky and Sam Harris aren’t doing fucking six minute hits.
0:06:57 I mean, if Nicolle Wallace or Anderson Cooper invite me on for like 20 or 30 minutes or 10
0:06:59 minutes or a longer hit, I’ll do it.
0:07:02 But I’m not going to go up there, have them ask one question and have them thoughtfully
0:07:04 look into my eyes and go, this is an issue.
0:07:05 We definitely want to keep on.
0:07:06 Thank you so much for coming in.
0:07:10 I’m like, oh God, get me to fucking Dunkin’ Donuts.
0:07:15 But see, you’re in the stage that I like to call, that I was in for 30 years, called being
0:07:17 a total fucking media whore.
0:07:19 You’re saying yes to everything.
0:07:22 And then you go on and you’re awesome.
0:07:28 And this is the thing, you’re under the illusion that you’re making progress.
0:07:33 And to a certain extent, people seeing the brand, like, I’ve got to believe a ton of your friends
0:07:37 have emailed you like, oh my God, Ed, you’re on MSNBC?
0:07:44 The guy that we did Rails Academy with in our junior year and the guy who dressed up as
0:07:48 a Tiger for the Princeton Tigers game and who cheated off of me on the classics.
0:07:50 The dudes on it.
0:07:55 I can’t imagine the group text ripping around PrincetonTigerDouchebag.com right now.
0:08:03 The first time I went on TV, I got a text from my college group chat, and it was
0:08:06 a picture of me on the screen at the gym.
0:08:09 And my friend just said, I’m just trying to work out, man.
0:08:11 He’s like, dude, stop it.
0:08:12 Stop it.
0:08:14 No, you have been, I mean this sincerely.
0:08:20 And I call out, go to TikTok or wherever you go and type in Ed Elson, MSNBC.
0:08:23 You are really good.
0:08:29 I mean, you’re just okay on the pod, but you are really outstanding on MSNBC.
0:08:30 You are really good.
0:08:32 I think it’s because, I’m not sure.
0:08:35 I think you got the hots for Katie Tur, Ari Melber.
0:08:38 I think you really bring it when Ari or Katie are on.
0:08:41 You’re like, oh, yeah.
0:08:41 I get it.
0:08:42 I get it, Ed.
0:08:44 Yeah, you understand.
0:08:44 I get it.
0:08:45 All right, enough of that.
0:08:47 Enough of you dominating this conversation with banter.
0:08:50 Ed, let’s get to the headlines.
0:08:51 Let’s get into it.
0:08:53 We have a conversation, a great conversation coming up.
0:08:54 We’re speaking with-
0:08:56 Don’t correct me, bitch, just because you’re on MSNBC.
0:08:58 I’m sorry, go ahead.
0:09:00 We’re speaking with Mark Cuban.
0:09:06 Mark Cuban, serial entrepreneur, investor, a total legend.
0:09:09 Mark, so good to have you on the show.
0:09:10 Thank you for joining us.
0:09:11 Thanks for having me on, Ed.
0:09:12 Thanks, Scott.
0:09:19 So, you’ve started many companies, serial entrepreneur.
0:09:22 You’ve also started a lot of media companies.
0:09:24 You have a lot of experience in media.
0:09:28 This is a very interesting time for the media space.
0:09:30 A ton of action in the past few weeks.
0:09:36 Let’s just start with the thing that everyone’s talking about, and that is Jimmy Kimmel.
0:09:42 Disney’s ABC, and Nexstar, cancel Jimmy Kimmel, then they uncancel him.
0:09:45 Just a very open-ended question.
0:09:48 Your reactions to what’s happened with Jimmy Kimmel thus far?
0:09:55 I mean, it’s not the first time a major media company has put a star on hiatus for any number
0:09:56 of reasons.
0:10:03 You know, go back to Disney kicking out Gina Carano from a film, you know, and I know when
0:10:09 I was on Shark Tank, the guys in charge of standards and practices were very clear about
0:10:11 what we could say or not say.
0:10:17 And when my name kept on coming up as a vice presidential candidate or a presidential candidate,
0:10:22 they told me in no uncertain terms that I would have to leave the show if I took that
0:10:22 path.
0:10:27 So, there’s a lot of precedent for a lot of this stuff.
0:10:29 So, I wasn’t surprised at all to answer your question.
0:10:31 Just to interject here, this is different, Mark.
0:10:36 This looks like there’s a direct connection between the government, specifically the FCC chairman,
0:10:38 threatening to revoke their license.
0:10:40 This is more than just standards.
0:10:42 It is and it isn’t, right?
0:10:46 On one hand, we haven’t heard an FCC commissioner talk like this ever.
0:10:49 And so, it’s natural to connect it.
0:10:55 On the flip side, there was just, you know, a different set of pressures with identity politics
0:11:00 that, you know, put people under a microscope if they did or didn’t make certain decisions.
0:11:08 And so, where before it was coming, you know, from social media and, you know, people with
0:11:10 influence, now it’s coming from the top down.
0:11:12 So, it’s different, but it’s the same.
0:11:14 The players are different, but the impact’s the same.
0:11:20 One is cultural pressure from progressives who, quite frankly, seem more concerned with virtue
0:11:25 signaling than the material or psychological well-being of Americans.
0:11:31 And what happened with Gina at The Mandalorian was unacceptable.
0:11:32 They’re allowed to do that.
0:11:33 They’re allowed to be stupid.
0:11:36 This, to me, seems much more frightening.
0:11:37 Two different things, right?
0:11:39 You’re alluding to Jimmy Kimmel, right?
0:11:42 And what ABC did, putting his show on hiatus.
0:11:49 Now, if we want to talk about an FCC commissioner or anybody in a position of power in the administration
0:11:51 making comments like that, yeah, it’s terrifying.
0:11:56 But I don’t think there’s, you know, there may or may not be a direct connection between the
0:11:56 two.
0:12:01 It seems like there are, but I think they’re two different issues because it’s just a question
0:12:08 of what kind of pressures would impact Disney or ABC to make a change in a brand name host
0:12:13 versus what would lead an FCC commissioner or president or anybody, for that matter, to
0:12:15 make the type of comments that they’re making.
0:12:21 It sounds like you’re less concerned about what happened with Jimmy Kimmel than, I would
0:12:26 say, some other people, perhaps Scott and I are about this.
0:12:27 Is that right?
0:12:27 Yeah.
0:12:29 I mean, again, two different topics.
0:12:33 Does it terrify me when the head of the FCC makes the comments he does?
0:12:34 Yes.
0:12:35 Right?
0:12:41 Does it surprise me that a network would virtue signal one way or the other, depending on who
0:12:43 they think has the leverage or not?
0:12:43 No.
0:12:47 Yeah, there’s some other interesting things that we’re seeing going on in media right now,
0:12:53 and that is the TikTok deal, that it appears that Trump is really quarterbacking.
0:12:59 It will likely go into the hands of Oracle and perhaps the Ellison family.
0:13:04 And more recently, he’s been talking about how the Murdoch family could get involved in this
0:13:04 deal.
0:13:09 I also want to just get your reactions to what’s happening with TikTok, putting that algorithm
0:13:11 in the hands of these families.
0:13:13 Well, there’s two parts to it.
0:13:16 One, it’s scary because there was no open auction.
0:13:23 There was only a discussion about someone should be interested in buying TikTok and let’s see
0:13:24 what happens.
0:13:30 But then the second part is when you bring in new cooks, the soup might just get destroyed,
0:13:31 you know?
0:13:33 And so there’s no assumption.
0:13:38 You can’t make an assumption that its success will continue because if they do take over the
0:13:40 algorithm, that’s part of it.
0:13:47 Remember, when we were looking at TikTok closing, they came out with a new product called Red and
0:13:51 they came out with other products that were out there and, you know, a material number of people
0:13:52 just switched.
0:13:59 What would be insane to see is if TikTok came out with their own Red again and competed in
0:14:01 the United States with the U.S.
0:14:02 version of TikTok.
0:14:08 But I guess my point is there’s no assurances that this new version of TikTok with Larry Ellison,
0:14:15 his son David, the Murdochs, whoever it may be, are, you know, that they will be successful.
0:14:21 I mean, kids are very persnickety and those algorithms, if they take control, I mean, if
0:14:27 what is it about the TikTok algorithm that differentiated it from the meta algorithms for
0:14:28 Facebook?
0:14:35 You know, they both have their own goals, and we don’t yet know how those goals will influence
0:14:41 the uptake if we have new TikTok management, or how they’ll even influence what
0:14:46 Meta is doing with Instagram and Facebook.
0:14:49 So there’s a lot of interconnected pieces there.
0:14:54 Are you concerned at all about, I mean, you mentioned there that it wasn’t really an open
0:14:54 auction.
0:15:02 Are you concerned about the extent to which the president is really setting up these deals?
0:15:03 Is this something that we’ve seen before?
0:15:05 Is there precedent for this?
0:15:06 Or is this new?
0:15:13 The idea that the president would figure out a way to get the seller to sell and then
0:15:14 choose the buyers?
0:15:16 Well, it’s not new under Trump, for sure, right?
0:15:18 We’ve seen it with MP Materials.
0:15:19 We’ve seen it with Intel.
0:15:21 We’ve seen it in his efforts with others.
0:15:22 You know, he’s a dealmaker.
0:15:26 He wants to be the figurehead for every and any deal.
0:15:29 Now, what are the consequences of that happening?
0:15:34 That’s to be determined because interjecting himself doesn’t mean it’s going to work.
0:15:39 You know, then we saw Intel when they took the 10%.
0:15:42 Then you saw NVIDIA take another 5%.
0:15:45 You know, it doesn’t change Intel’s business.
0:15:47 Intel still has the same challenges.
0:15:50 The fact that he’s involved, do I like it?
0:15:50 No.
0:15:51 Do I think it’s smart?
0:15:52 No.
0:15:56 Do I think it optimizes the opportunities for the companies involved?
0:15:56 No.
0:16:00 But at the same time, am I surprised?
0:16:00 No.
0:16:07 Just speaking more broadly about the media ecosystem, you know, it feels like it’s concentrating.
0:16:12 It feels like, obviously, certain channels and platforms are losing relevance.
0:16:13 Some are gaining relevance.
0:16:16 Do you have any general thoughts about the media ecosystem?
0:16:24 And if and where, if you were going to invest in media, where you would invest, where you would go long and where you would go short, so to speak?
0:16:28 There’s no chance I’d invest in the media ecosystem at all, anywhere.
0:16:29 Yeah.
0:16:30 Just stay away from it.
0:16:30 Yeah.
0:16:31 It’s brutal.
0:16:33 Because it’s hits-driven.
0:16:38 You know, creating a hit is hard, no matter what the platform is.
0:16:41 And going viral is hard, no matter what the platform is.
0:16:45 If you look at what Mr. Beast does, I think he’s figured out the best.
0:16:49 He spends a lot of time reverse engineering the algorithms.
0:16:51 Because that’s what it all comes down to.
0:16:59 What makes the media ecosystem so difficult is we all spend so much time on social media, and we all have our own unique feed.
0:17:03 Ed’s feed on whichever platform is different than Scott’s, different than Mark’s.
0:17:12 And so that customization that really allows people to really go down all kinds of rabbit holes is unique.
0:17:17 You can’t do that on traditional television, whether it’s broadcast or cable.
0:17:25 You can’t do that necessarily with YouTube, even though they do use algorithms and they feed you a lot of stuff with shorts.
0:17:30 And they’re truly the largest distributor, but there’s just so much, right?
0:17:31 You can’t get to it all.
0:17:36 Whereas short-form content, they can make you think anything they want you to think.
0:17:44 And that is the underlying challenge that this country has, that whoever controls the algorithm controls your thoughts.
0:17:51 Well, do you think along those lines, I promised myself during this entire segment, I was not going to ask you if you’re running for president.
0:17:54 So I’ll do something different.
0:17:57 I’ll say, let’s cosplay president.
0:18:07 If it was a president Cuban, and to your point, the algorithms decide what we see.
0:18:14 And sometimes those algorithms don’t have, most of the time, all of the time, those algorithms don’t have our best interests at heart.
0:18:16 They have shareholders’ best interests at heart.
0:18:24 And sometimes incendiary, rage-filled content gets elevated beyond its organic reach.
0:18:33 So as a president Cuban, what would be your approach, if any, to regulating big tech and algorithmically elevated content?
0:18:38 So one, I’d say you have to be 16 or more to use social media.
0:18:43 You can be 13 years old and go on X, and there’s tons of porn.
0:18:45 There’s no shortage of porn at all.
0:18:52 And it’s just amazing to me that, you know, people outside the Republican Party don’t bring that up as a cause of action.
0:19:07 Because you can’t be so strict on, I want to protect schools, I want to protect kids, et cetera, and not recognize that this is, you know, worse than the Playboys I grew up with underneath the bed, underneath my dad’s bed that I used to grab, right?
0:19:09 It just goes to no end.
0:19:10 So that’s part one.
0:19:16 Part two is, and I actually said this to TikTok a couple years ago when they brought me in to just have a discussion.
0:19:31 I think if you are going to let kids under 18 on, you have to have an HTML file that shows a link to all of the videos that they’ve watched so that parents can get a feel for what’s going on with their children.
0:19:42 I mean, literally, as a parent, my son, who’s just turned 16, gets so mad at me because I’ll look at what he’s watching on Instagram and TikTok and whatever it may be because it tells me who he is.
0:19:45 That algorithm knows more about him than I do.
0:19:50 And so that would be part two, having, you know, a way to communicate with parents.
0:20:04 But beyond that, I wouldn’t do anything, because that would make me a hypocrite, you know, criticizing Trump or the FCC chairman for trying to insert themselves into business and then working that same way myself.
0:20:09 But I would also encourage businesses at that point in time to flood the zone, right?
0:20:16 Whatever the message, let’s just say as a president, I was trying to get a particular message, unity among the American people.
0:20:18 I want people to come together.
0:20:33 I would use AI to create nonstop billions of 30-second videos that communicate that message to get to the point where people are so sick and tired of seeing it all, they stop using those platforms.
0:20:38 So just along those lines, not intervening.
0:20:43 I mean, I like the idea of capitalism just being competition that’s refereed.
0:20:55 But are you comfortable with the concept of Larry Ellison having a lot of influence over CBS, TikTok, potentially Warner Brothers, CNN?
0:21:01 Do you think there’s a place for intervention around antitrust and ensuring these media ecosystems don’t get too concentrated?
0:21:02 No.
0:21:14 And just like when George Soros bought all the Audacy stations and there were 200 of them and there was a big to-do about how he’s going to change them over, you know, the American people will speak with their attention.
0:21:25 And I don’t think somebody putting in cash to effectively dying industries is something the government should get involved with.
0:21:28 You know, most of those are end-of-life type situations.
0:21:30 And there’s still a lot of uncertainty.
0:21:34 And I think, you know, with AI, we still haven’t seen the next generation of media developed yet.
0:21:37 Do you think Hollywood is dead?
0:21:39 You talked about Mr. Beast there.
0:21:40 And I totally agree.
0:21:41 He’s absolutely nailed it.
0:21:44 He’s now getting more viewing time than the top series on Netflix.
0:21:47 Is this the end of Hollywood?
0:21:50 No, because AI, again, I keep on going back to it.
0:21:54 AI makes creative people more creative and more efficient.
0:22:03 So, whereas before, you know, with Shark Tank, we would shoot in June and nothing would appear on TV until September at the earliest.
0:22:08 And it just took time for the editors, the storytellers to put together stories.
0:22:17 With AI, you can iterate so quickly so that somebody who’s very skilled in storytelling and graphics, it doesn’t replace them.
0:22:21 It amplifies their skill set so you can do more quicker.
0:22:29 And so I think you’ll see more great content created by great content creators and more junk, a lot more junk.
0:22:33 But the great content creators, like they always have, will stand out.
0:22:37 And those will, you know, I’m not saying movies will come back.
0:22:39 They won’t because people aren’t leaving the house.
0:22:48 But long-form content will, I think, be better because AI allows people to, you know, turn it around far more quickly and less expensively.
0:22:51 We’ll be right back after the break.
0:22:56 If you’re enjoying the show so far, be sure to give Prof G Markets a follow wherever you get your podcasts.
0:23:14 The Twisted Tale of Amanda Knox is an eight-episode Hulu original limited series that blends gripping pacing with emotional complexity,
0:23:24 offering a dramatized look as it revisits the wrongful conviction of Amanda Knox for the tragic murder of Meredith Kercher and the relentless media storm that followed.
0:23:30 The Twisted Tale of Amanda Knox is now streaming only on Disney+.
0:23:33 Hey, so what did you want to talk about?
0:23:35 Well, I want to tell you about Wegovy.
0:23:36 Wegovy?
0:23:37 Yeah, Wegovy.
0:23:38 What about it?
0:23:41 On second thought, I might not be the right person to tell you.
0:23:42 Oh, you’re not?
0:23:44 No, just ask your doctor.
0:23:45 About Wegovy?
0:23:47 Yeah, ask for it by name.
0:23:50 Okay, so why did you bring me to this circus?
0:23:53 Oh, I’m really into lion tamers.
0:23:55 You know, with the chair and everything.
0:23:57 Ask your doctor for Wegovy by name.
0:23:59 Visit Wegovy.ca for savings.
0:24:00 Exclusions may apply.
0:24:05 Hey, Alex Heath here, founder of Sources.news and a contributor at The Verge.
0:24:12 And I’m Ellis Hamburger, tech reporter turned industry insider, working closely with today’s hottest AI startups.
0:24:17 We’re excited to announce the launch of our new show, Access, with the Vox Media Podcast Network.
0:24:23 Access is the tech industry’s inside conversation with Silicon Valley’s most influential leaders.
0:24:28 From the tech titans of today to tomorrow’s most visionary builders.
0:24:32 It’s a show made by insiders for everyone who wants a glimpse into the future.
0:24:41 In our first episode, Alex interviewed Mark Zuckerberg about Meta’s latest smart glasses, the AI race, and what’s next for the social media giant.
0:24:43 I mean, didn’t you just tell Trump you were going to spend like $600 billion?
0:24:44 I mean, that’s…
0:24:44 I did.
0:24:46 Yeah, through 2028, which is…
0:24:47 That’s a lot of money.
0:24:48 It is.
0:24:56 And if we end up misspending a couple of hundred billion dollars, I think that that is going to be very unfortunate, obviously.
0:24:59 But what I’d say is I actually think the risk is higher on the other side.
0:25:05 You can find the Access pod now on YouTube, Spotify, or wherever you listen to podcasts.
0:25:14 We’re back with Prof G Markets.
0:25:21 Just on AI, I mean, this is the massive investing trend of the year, of the past two years.
0:25:25 What are you thinking about when it comes to your investment strategy?
0:25:30 How are you incorporating AI into your investment strategy?
0:25:33 What do you think the future of AI is really going to look like?
0:25:38 Well, there’s two types of companies, those who are great at AI and those who used to be in business, you know?
0:25:41 And so you’ve got to incorporate it into everything you do.
0:25:45 Like with Cost Plus Drugs, we have a manufacturing plant in Dallas, Texas.
0:25:53 That’s cheaper than what they can manufacture in India and China because it’s all robotics- and AI-driven.
0:25:58 It doesn’t take many people, but more importantly, we can turn from one drug to the next in hours.
0:26:03 And so the same type of concept applies to all businesses.
0:26:08 Now, what’s interesting, I know, Scott, you’ve talked a lot about this in terms of jobs and kids coming out of school.
0:26:17 I think the big adjustment is that, like my daughter goes to Vanderbilt and she’s getting ready to graduate and they all want to go work for a big company.
0:26:27 And I keep on telling her that, no, what’s changing is big companies have the money and resources to train and implement AI and become more and more efficient.
0:26:29 Small to medium-sized companies don’t.
0:26:45 They need AI natives, not necessarily just to be prompters, that’s not going to get you there, but to understand how agentic AI works, to understand how to integrate, to understand how to look at processes from the consumer’s perspective and reinvent how you do things.
0:26:53 And then the great unknown beyond that, which will open up more doors for kids coming out over the next few years, is robotics.
0:27:05 Because right now, everything we know about AI is pretty much, it’s multimodal where you have pictures and text, but it doesn’t really incorporate video at all.
0:27:09 It’s not training on video to understand what’s happening in the world.
0:27:17 You have to tell it very discretely, though: if somebody gets hit by this, it rolls, et cetera, et cetera.
0:27:20 Whereas with robotics, they’ve got to capture video.
0:27:30 I think this is where Elon is smart and ahead of the curve, and where robotics companies might supersede what we’re seeing with the big AI companies.
0:27:47 But think of it this way, put aside humanoid robots, I don’t think that’s the future, but being able to tell a robot, clean the house, without giving discrete instructions, is what’s going to happen, right?
0:27:55 It’s going to know what socks go together, it’s going to know how long to wash, it’s going to know to look under the bed for dust for the kids.
0:28:00 It’s going to have to understand these things inherently because it’s been trained on those things.
0:28:12 Being trained on video and understanding the rules of physics without having to be told and having some common sense and context because you’re seeing it all reduces latency, makes it smarter.
0:28:19 And imagine that approach relative to a ChatGPT. Which are you going to use, right?
0:28:25 It’s great to get text-based information back and pictures-based, right, and allow it to create fun videos and all that.
0:28:39 But in terms of helping your business or changing your life, I mean, I think we’ll take it to the point where homes will be redesigned, you know, to become more efficient because now it’s designed for people to do all the tasks.
0:28:55 But if your washing machine is designed to be run by, you know, a robot that looks more like a spider or whatever they come up with because it’s been optimized to fit, the game has changed and we can talk differently about building homes and connecting homes and all kinds of stuff.
0:29:02 Now, parallel to that, we’ve got the ultimate war between all the AI companies.
0:29:13 You’ve got Meta, you’ve got Google, you’ve got ChatGPT, you’ve got xAI with Grok, you’ve got Perplexity, you’ve got Claude from Anthropic.
0:29:15 All of them want to be the winner.
0:29:21 Not all of them will be the primary destination when people download their first AI app.
0:29:29 And so, how deep will the ability to be financially successful and have that network effect go?
0:29:30 Will it just be one?
0:29:33 Will it be two, three, four, five?
0:29:35 We don’t know.
0:29:43 But as we start to find out, you know, that could be the delineator in the stock market with the MAG-7, right?
0:29:45 You know, will they acquire each other?
0:29:47 Will they go down swinging?
0:29:53 You know, will they be the next IBMs from the 60s to the, you know, the 2010s?
0:29:56 It’s not preordained the way we see it today.
0:29:57 That’s the way I believe.
0:30:09 And you just look at these companies spending tens of billions of dollars a year borrowing money beyond their cash flow because they believe it’s a zero-sum game.
0:30:13 Kissing the ass of the administration because they have to.
0:30:15 It’s their fiduciary responsibility.
0:30:17 So, that’s generally how I see everything.
0:30:20 I think we’re still, it reminds me of the early days of streaming.
0:30:24 When we first started streaming, you had to download a TCP/IP client.
0:30:27 You had to have an ISP subscription.
0:30:30 You had to get, you know, a piece of software.
0:30:32 And then it all changed.
0:30:34 And people don’t even think twice about streaming.
0:30:35 It’s just there.
0:30:37 We’ll get into it and it becomes comprehensive.
0:30:47 Just as streaming went from audio to video to 3D to AR, whatever, we’ll see that type of generational change with AI as well.
0:30:51 Yeah, I mean, you bring up the comparison to streaming.
0:30:57 I think a lot of people are thinking of the comparison really just to the dot-com boom at large.
0:31:03 And a lot of the dynamics you described there do feel at least similar.
0:31:06 The idea that we all know that something’s going to happen.
0:31:13 You’ve got Zuckerberg saying, you know, we might piss away hundreds of billions of dollars, but ultimately, super intelligence.
0:31:16 It’s too big of a fish to just not really go for it.
0:31:28 So, yeah, but it also portends what we saw when the dot-com bubble imploded, when a lot of people got massively burned.
0:31:30 Yeah, we don’t know who the AOL is.
0:31:33 We don’t know who the Yahoo is, you know, to create analogies.
0:31:38 But we know a couple of these big AI companies are going to end up as one or the other.
0:31:41 And so how do you as an investor deal with that?
0:31:48 I mean, we’re seeing these massive valuations, OpenAI, Anthropic, everyone’s trying to get in on these companies.
0:31:55 And yet there’s this little thing in the back of everyone’s mind, which is maybe it’s only going to be one or two winners in this whole game.
0:32:02 And yet we’re still willing to invest tens, in some cases, hundreds of billions of dollars into these companies.
0:32:04 I still think you have to, right?
0:32:06 You don’t know who the winner is going to be.
0:32:10 I mean, ChatGPT is in the lead, but Gemini’s catching up with Nano Banana.
0:32:12 I mean, it just came out of left field, right?
0:32:19 And now all of a sudden it’s just changed the dynamics of users using AI products.
0:32:22 And so I still think you’ve got to go all in.
0:32:24 I don’t think there’s any issue there.
0:32:28 And part of the issue is that there’s just fewer places to put your money.
0:32:37 You know, particularly in the public markets, when we took Broadcast.com public in 1998, there were 8,500 public companies plus all the pink sheets and everything else.
0:32:46 Now there’s, you know, 5,000 maybe, you know, and it’s easier to deal with private equity than it is to go public.
0:32:51 But to me, that’s unfortunate because it doesn’t create the liquidity for your employees or early investors.
0:32:57 I mean, you can with early investors, but it doesn’t, you know, I always looked at going public as a way to reward our employees.
0:33:19 And I think going back to, if I was the president again, I would create tax incentives for companies that rewarded all of their employees with a pari passu-type ratio of stock rewards, whether it’s warrants, restricted stock, you know, absolute stock, whatever it may be, to income.
0:33:32 So if the CEO gets $100,000 worth of stock and makes a million dollars, that’s 10%, the person working the front desk at the office who makes $40,000 gets 10%, $4,000.
0:33:42 Because I think that’s a path to income equality because you have to increase people’s asset base in order for them to have assets that appreciate and keep up.
0:33:46 Because if you don’t have assets that appreciate, you can’t ever keep up.
0:33:51 So, you know, it’s kind of jumping around again, but I think AI is still where you invest.
0:33:56 I think part of the reason is there’s so much money coming in and there’s no place else to put it.
0:34:06 You know, you don’t see the mid-caps, you don’t see the Russell, you don’t see them breaking out at the same rate, because of tariffs and the investment in AI.
0:34:10 But at some point, we’ll figure out who the America Online is.
0:34:12 We’ll figure out who the Yahoo is.
0:34:14 And then, bam, it all changes.
0:34:16 I just don’t know when that point is.
0:34:24 I’m really glad you bring up this idea of the fact that we are seeing so many fewer public companies today than we were 20, 30 years ago.
0:34:31 This has been a big theme on our show, this idea that AI is happening, and yet all of the action is happening in the private markets.
0:34:41 The only place you can really invest, as you say, is you’ve got NVIDIA, you can invest in Google, and we believe that Google’s pretty undervalued right now, Oracle.
0:34:51 But aside from a handful of big tech companies, which everyone already knew about, all of the consumer AI investments are still sort of siphoned off from the public.
0:34:52 It’s been a big concern of ours.
0:34:57 I’m wondering if it’s a concern of yours, the idea that regular people can’t get in on this stuff.
0:35:01 I’ve been saying it for two decades, that there aren’t enough public companies.
0:35:09 Whatever the latest technology is, people, you know, employees and investors should be able to participate.
0:35:14 I used to jump all over Gary Gensler because he made it so hard to go public.
0:35:17 It should be easy to go public.
0:35:21 Now, there should be strict reporting and auditing standards to make sure the numbers are real.
0:35:25 But beyond that, it needs to be easier than it is now.
0:35:31 And look, you give President Trump credit for moving to a six-month instead of three-month reporting cycle.
0:35:32 That’s fine.
0:35:45 It doesn’t change the economics of the business, but it makes it a little bit easier because, you know, that week or two before you report earnings is always just a stressful time when you have to write and rewrite the same nonsense over and over again.
0:35:59 Let me ask you this, though, because you said, and I think I agree with you, but I just want to steel man the other side here. I went and saw the Fantastic Four: 3,400 people working on the film.
0:36:02 And I think sooner rather than later, it’s going to take 300.
0:36:05 And I don’t think we need 10 superhero films.
0:36:08 I don’t think people are begging for more content right now.
0:36:19 And when you reference Kimmel, the problem with Kimmel, the bigger issue other than censorship, is that it’s 160 people to produce that show.
0:36:20 It’s just too damn expensive.
0:36:22 It’s not that he’s not talented.
0:36:23 It’s not that he doesn’t have an audience.
0:36:25 It’s just the juice isn’t worth the squeeze.
0:36:26 It’s too goddamn expensive.
0:36:37 And it strikes me that there’s just certain industries that are going to see a fairly severe reduction in employment in the short and the medium term.
0:36:38 Oh, for sure.
0:36:40 That always – creative destruction.
0:36:44 I mean, they don’t make records in Terre Haute, Indiana anymore.
0:36:47 RCA is gone, right?
0:36:49 Or there may be some semblance of it left.
0:36:52 So there will be destruction at various points.
0:37:04 But I think as an entrepreneur, oh, my God, there’s going to be people creating businesses and having access to, you know, a large language model, Perplexity, whatever.
0:37:18 As long as you understand when it starts hallucinating, when you start getting 8, 10, you know, layers deep, right, you have every business professor in the history of business professors at your disposal.
0:37:31 So to put together your business plan, to review, to use Gemini’s deep research, that, like, when I started my companies when I was 16 and 18, I had to go sit in the library and read books to try to figure out all this stuff.
0:37:34 You know, now it’s like that.
0:37:37 And just – it’ll democratize business education.
0:37:40 It’ll democratize education in general.
0:37:46 It’s like when Andrew Carnegie built libraries around the country to try to democratize education.
0:37:49 This is that to the millionth level.
0:37:52 There’s nothing a kid can’t learn right now.
0:38:08 And when you tell me that there’s a tool that allows any child to learn anything they want within the constraints of self-harm and all that kind of stuff, right, but allowing a child to learn anything they want, nothing makes me more excited.
0:38:17 Because if you believe that, you know, the curiosity of a child can lead to, you know, something we haven’t even imagined yet.
0:38:27 Like, when I go talk to kids at school, one of the things I always do inevitably, you know, when I say kids, I mean anywhere from 10 to 18.
0:38:31 I point – I point at the screen on the wall.
0:38:35 I said, one day that didn’t exist and someone came up with the idea.
0:38:41 That chair you’re sitting on, one day a chair looking like that didn’t exist and someone came up with the idea.
0:38:46 Everything you’re wearing, everything you’ve seen that’s not natural, someone came up with that idea.
0:38:59 Now, with AI and the access to everybody, literally, then that kid with that idea can get direction in exactly what to do.
0:39:06 One thing that’s been a concern for young people, though, is, I mean, this creative destruction, which is absolutely true.
0:39:11 And what we always see with technology is it comes in, it tears up a bunch of jobs, and then some new jobs show up.
0:39:14 But I think one of the big questions is, what do you do in the interim?
0:39:28 And what we have found, and we’ve been discussing this study that came out recently, is that there’s been a 13% reduction in job openings for AI-exposed roles, specifically for young people.
0:39:34 So AI is kind of taking your job, but more specifically, it’s taking young people’s jobs.
0:39:41 And I think one of the big questions that we’ve been trying to get to the bottom of is, and you mentioned your daughter there at Vanderbilt.
0:39:46 As a young person, how do you not get replaced by AI?
0:39:50 If AI is so capable at all of these things, what are the kinds of skills that you double down on?
0:39:56 Well, in order for AI to replace you, somebody has to know how to implement AI in order to replace you, right?
0:39:59 And that happens in big businesses, like I alluded to earlier.
0:40:06 But there are only 20,000 businesses in this country with more than 500 employees.
0:40:07 That’s it.
0:40:13 And there are millions and millions of businesses that create 62% of new jobs every year.
0:40:23 The challenge is you’ve just got to look in different places, because the big companies are already going to have somebody, because they’re under pressure to get all their numbers down.
0:40:31 The little, the small to medium-sized companies are under far more pressure to compete with the bigger companies.
0:40:37 And we saw that to a certain extent, you know, with HTML and the internet in the mid and late 90s.
0:40:40 And kids were considered geniuses if they could do web pages.
0:40:43 But I’m not talking about prompting, right?
0:40:57 Small companies don’t have the resources or the knowledge or the ability to just say, okay, this process, we’re going to just turn it into an agent.
0:41:02 At Cost Plus Drugs, we have one person that’s young in her 20s.
0:41:11 All she does is look at processes that are manual right now, just go through the list and try to automate them and then track them and manage them.
0:41:21 That is the job that every small to medium-sized company needs but may not know that they need it because the CEO may not be literate enough.
0:41:31 And a kid coming in there saying, I just spent the last two years living and breathing and cheating with ChatGPT and Perplexity to get through my classes.
0:41:34 And I built an agent that went out and did it.
0:41:41 As long as you can do agents or learn agents and if you have a tech, you know, a programming background at all, you’re good.
0:41:43 You’re going to be able to get a job.
0:41:54 It’s just, you know, the tipping point really is when those small companies realize they’re falling behind because they’re not able to implement AI to get the cost savings that their bigger competitors are.
0:41:56 Then they’ll have no choice but to hire those kids.
0:42:04 If I were to sort of summarize your approach to this, I mean, it’s really, you got to lean into AI.
0:42:08 You got to use as many agents that are at your disposal and you got to learn how to use it.
0:42:24 But if I were to sort of summarize your philosophy, it seems as though one of your biggest priorities in business, and I think this has been true across your career, is you need to be curious and willing to adopt technology.
0:42:28 It sounds like that is the big, that is the real alpha that you see in your career.
0:42:32 Whenever a technology comes along, you cannot write it off.
0:42:35 You cannot think that it’s too confusing.
0:42:43 You have to be willing to go through the hard work of understanding how do I actually use this thing and how do I figure out a way to be more productive?
0:42:44 It’s actually a little bit backwards.
0:42:53 You look at the industry and you look at the processes and you look at the interrelations and you say, what’s right, what’s optimal, and what’s a mess?
0:43:05 Like in healthcare, how can we start costplusdrugs.com and literally change the pharmacy industry so that the $100 billion plus competitors are telling their customers they’re going to emulate what we do?
0:43:20 And they haven’t been able to do it because of their cost structure because I went in there and I looked at the processes and realized just how opaque they are and was able to simplify it to something as simple as all we have to do is be transparent because nobody trusts anything.
0:43:28 If you look at what’s happening now on the healthcare side, right, it is so ass-backwards.
0:43:30 I’ll give you a simple example.
0:43:33 You’ve got, and we’ll go back to politics, if Mark Cuban was president, right?
0:43:39 You’ve got the Democrats talking about the continuing resolution to keep the government open.
0:43:46 And they’re saying, what are the things that we want is to have the premium subsidies for the ACA continued?
0:43:49 Now, I’ll take a little step to the side here.
0:43:53 What type of company is the most hated company in all of America?
0:43:54 Health insurance.
0:43:55 Yep.
0:44:10 Now, when you have those premium subsidies that you’re trying to keep alive, right, tens of billions of dollars or more for the 12.5 million people receiving subsidies on the ACA.
0:44:13 Who are those taxpayer checks written directly to?
0:44:14 Health insurance.
0:44:14 Yes.
0:44:20 Now, why do health insurance companies love it and everybody hates them?
0:44:30 Because you have premiums, but you can’t get access and benefit of your insurance until you’ve paid your deductible and you’re out of pocket.
0:44:36 And so, you know, and we can take a little side trip again.
0:44:39 You want to talk about the economy, right?
0:44:42 Everybody’s paying more and more premiums, but even worse.
0:44:49 And what everybody’s missing is the deductibles and out of pockets are going up even faster.
0:44:55 And if you can’t afford to pay your deductible, you can’t afford to get health care.
0:45:02 So, President Cuban would say, look, we already guarantee mortgages.
0:45:07 We, in some states, they’ll even put the down payment down for you for $25,000 worth.
0:45:09 We’ll guarantee student loans.
0:45:10 We’ll guarantee SBA loans.
0:45:12 Maybe not 100%, but a big chunk of them.
0:45:26 But if you’re about to die, right, you can go into an ER to try to get stabilized, but anything else, if you can’t afford your deductible, you’re shit out of luck.
0:45:44 Why don’t we guarantee the deductibles? Instead of sending those premium subsidies to the insurance companies, use them to guarantee the amount of money that it would take to go and get the care you need so that your health insurance kicks in.
0:45:56 Because the other thing that happens as part of this, think of it: you’re a hospital, and God forbid somebody gets hurt and they go to the hospital, and they look up your insurance from your insurance card.
0:45:58 And you have an ACA silver plan.
0:46:11 You have a family of five, your deductible is $5,000, and you don’t have $400. Forty percent of people in this country, not counting those on Medicaid, don’t have $400, right?
0:46:17 So there’s a really good probability that people going in to need care can’t afford their deductible.
0:46:18 So you know what the hospitals do?
0:46:22 The hospitals work with companies or they do it themselves.
0:46:35 They loan money to these people just so they can get access to the insurance company’s premiums, which in turn turns those providers, the hospitals, et cetera, into subprime lenders.
0:46:46 And they collect maybe 50% and they’re the ones that are creating all the bills that everybody’s talking about, the healthcare bills that caused bankruptcy or did cause bankruptcy, right?
0:46:54 Not because they want to, not because, but they were put in this position because of the way we allow health insurance to work.
0:47:02 And those people, if you can’t afford your deductible, do you think you’re going to be willing to take a chance on a house or moving, right?
0:47:03 Or changing jobs?
0:47:03 Hell no.
0:47:13 And so all these things, we have to start understanding the real pain points that every single American that’s not wealthy has.
0:47:24 And that starts with your healthcare deductible because there is nothing more important to any individual or their family than if something horrific happens or even little things.
0:47:36 You know, if your deductible is higher than what you have access to in credit and cash, and you can’t make up the gap to be able to use your insurance, you already know what we call people like that.
0:47:37 F.
0:47:40 Stay with us.
0:48:08 A day of sunshine?
0:48:09 No.
0:48:10 A box of fine wines?
0:48:11 Yes.
0:48:12 Uber Eats can definitely get you that.
0:48:15 Get almost, almost anything delivered with Uber Eats.
0:48:16 Order now.
0:48:17 Alcohol in select markets.
0:48:20 Product availability may vary by region. See app for details.
0:48:21 Hi, everyone.
0:48:22 This is Kara Swisher.
0:48:31 This week on my podcast, On with Kara Swisher, I’ve got a conversation with former Transportation Secretary and 2020 Presidential Candidate Pete Buttigieg.
0:48:32 This was a good one.
0:48:42 We recorded it live and talked about everything from political violence and the Democratic Party’s attempts to fix their credibility gap to the Trump administration’s authoritarian tendencies.
0:48:45 We also talked about train daddies.
0:48:46 Have a listen.
0:48:51 Nobody’s going to just come down and tell us, hey, you just had an authoritarian breakthrough.
0:49:00 And the real question is, does it get consolidated or does it get redirected and disrupted?
0:49:05 And of course, I asked him if he was planning to run for president in 2028.
0:49:06 Of course he is.
0:49:09 You can listen wherever you get your podcasts and on YouTube.
0:49:13 Be sure to subscribe to On with Kara Swisher for more.
0:49:21 In an age of media consolidation, family dynasties are having a moment.
0:49:25 The Murdochs, the Sulzbergers, the Roys, the Hursts, and the new kids on the block.
0:49:42 The Ellison family, Pater Larry, hijo David, and their money just bought CBS Paramount, will
0:49:47 soon take a big stake in TikTok, and are reportedly going to bid for Warner Brothers Discovery, which
0:49:48 owns CNN.
0:49:50 It’s been said that nothing bad can happen.
0:49:52 It can only good happen.
0:49:59 But is yet another Trump-aligned family having control of your grandpa’s TV shows and your
0:50:01 TikTok algo something to worry about?
0:50:05 Answers on Today Explained every weekday.
0:50:22 We’re back with Prof G Markets.
0:50:26 So, Mark, we think a lot about the struggles that young men face.
0:50:28 You have two daughters and a son.
0:50:33 And the other thing that I thought you could offer unique perspective on is that we see, as
0:50:37 the owner of the Mavericks, you get to know very personally and intimately these young men.
0:50:45 And we, I think just by their physical stature and just their excellence, we forget that a lot of
0:50:47 these men are really more boys.
0:50:49 They’re so young.
0:50:56 And one, I’d just be curious to get your thoughts on the struggles that young men face in America and what some potential solutions and
0:51:06 approaches to it and how your views as a father, having both daughters and sons, and being around so many young men has informed your view on this.
0:51:11 First, on the players, it’s actually a lot easier than it used to be 10 years ago because of NIL.
0:51:16 Back in the day, they’d never opened up a, you know, a credit card account or a checking account.
0:51:17 What’s NIL, Mark?
0:51:21 That’s the money that kids in college can be paid for playing sports.
0:51:22 Oh, I see.
0:51:22 Thank you.
0:51:26 So now they understand how economics work, right?
0:51:26 And they have agents.
0:51:28 And that’s been great for athletes.
0:51:38 For my 16-year-old son, his friends, it’s terrifying, you know, because it’s hard for 16-year-olds and parents to connect, male or female, for that matter, boys or girls.
0:51:42 Because when I want to talk to him about Andrew Tate, he’s like, shut up, Dad.
0:51:50 But, you know, so I’ll just go on a little soliloquy and say, you’re going to listen to this, and here’s what’s important to me, and here’s what’s not, and if you have questions.
0:51:52 But it is terrifying.
0:51:59 It really, really is because the influence is there, and there’s no independence whatsoever.
0:52:01 It’s either far left or far right.
0:52:07 And I think there’s a lot more far right winning right now than we’d like to see.
0:52:12 Like Nick Fuentes, you know, he has a following that goes younger and younger.
0:52:15 And that’s terrifying.
0:52:18 Any thoughts or ideas regarding potential solutions?
0:52:20 Yeah, it goes back to what I said.
0:52:22 This is all algorithmically driven, you know.
0:52:31 And so if you didn’t allow kids, you know, or maybe you just start with not allowing them to have their phones at school as a starting point, which I think is a big plus.
0:52:37 But you can’t let kids under 18 or under 16 at least get on social media.
0:52:51 You just cannot, unless you make sure that the algorithms are fully published and available all day, every day, for every change, and/or there’s a link for parents to see everything that they’ve watched.
0:52:54 So you know what to have a conversation about.
0:52:58 Because as I said before, those algorithms know our kids better than we do.
0:53:05 And the easiest way for me to know what my son in particular is into is by watching him scroll through his phones.
0:53:12 You know, as long as it’s basketball, basketball, fantasy, fantasy, football, fantasy, football, hot girl, basketball, basketball, I’m okay.
0:53:21 But when I, you know, every now and then I’ll see, you know, not an Andrew Tate, but somebody, you know, that’s selling, you know, the macho approach, right?
0:53:24 And that scares me.
0:53:27 You know, you’ve built so many different businesses.
0:53:29 You’ve had successes in just so many different areas.
0:53:32 I mean, you’ve built businesses, internet businesses.
0:53:34 You’ve had success in entertainment.
0:53:35 You’ve been a TV personality.
0:53:40 You’ve owned an NBA team that went on to win a championship.
0:53:41 Now you’re in the healthcare business.
0:53:50 I feel like, you know, a big thing that we’re trying to do on this show is, we’re talking about markets, but we’re also just trying to figure out how do we get rich and how do we become successful?
0:53:52 And what is that all about?
0:54:01 I would be interested to know, just given the diversity of success that you’ve had, what really drives you?
0:54:03 Like, you’re building this healthcare company right now.
0:54:06 What is driving Mark Cuban?
0:54:07 And what are you trying to do?
0:54:09 What kind of legacy are you trying to leave?
0:54:11 To me, business is the ultimate sport.
0:54:13 Period, end of story.
0:54:20 And I may not be able to play basketball as well as I used to, but I want to compete still.
0:54:28 And there’s no better way to compete than in business because it’s just a matter of my taking the time to learn what I need to learn.
0:54:30 And Steve Jobs said it best.
0:54:31 He said, everything’s a remix.
0:54:34 And so I start with a base of technological information.
0:54:38 So when AI comes along, you know, I read books about machine learning.
0:54:42 Then I get into, you know, training models and all that kind of stuff.
0:54:43 But I’m willing to put in the time.
0:54:47 And what I tell kids when they ask me, how do you get there?
0:54:47 Right?
0:54:52 You find something you love to do, and then you just commit to time to be great at it.
0:54:56 And I’m not, you know, if you happen to be a special athlete and playing tennis, basketball, whatever, great.
0:54:57 Do what you do.
0:54:59 But in business, most people don’t take the time.
0:55:05 And especially now with AI, you can ask AI any question on the planet.
0:55:07 There’s nothing you can’t ask.
0:55:08 And so I do it all the time.
0:55:10 Like I was giving a speech.
0:55:13 And that’s why I won’t blow my cover here.
0:55:21 But right before the speech, they asked me to cover a topic that I was going to be asked about, and I didn’t know shit about it.
0:55:21 Right?
0:55:24 So I just pulled out Perplexity and ChatGPT.
0:55:28 So I had two different answers to compare and said, this is what I’m going to be asked to talk about.
0:55:30 What are the 10 top things that I should mention?
0:55:32 And they all made perfect sense to me.
0:55:33 So I went in and they thought I was smart.
0:55:34 You know?
0:55:35 That’s what I should do.
0:55:45 For every kid, the hardest part about AI is not being embarrassed about asking the questions you normally would only ask a human.
0:55:48 You know, that’s the hardest part about AI.
0:55:52 And so as it pertains, again, there is no better time.
0:55:56 I’ve said this forever, but there’s no better time to be an entrepreneur than now.
0:56:02 I mean, I remember buying books, how to build a business plan or your business plan for this.
0:56:05 I opened up a bar in college, your business plan for your bar.
0:56:07 And it has all the fill in the blank stuff, right?
0:56:11 Now, there’s no industry that I can’t ask about.
0:56:16 Like, there was a product today that I had Gemini do deep research on for me.
0:56:18 And it took them eight minutes.
0:56:23 And I knew everything that I would have learned if I would have gone and done all the searching online, you know?
0:56:25 And I checked the sources and confirmed and all that.
0:56:34 But anybody, if they’re curious enough and they’re excited enough and they want to learn, can dig in and learn.
0:56:41 If you have the capacity just to start building a base of knowledge, then for all businesses, right?
0:56:44 You start with that base and then you might fail at a business, but you learn.
0:56:46 Then you go to the next one and you learn and you go to the next one.
0:56:49 And I say it all the time, these kids, you only have to be right one time.
0:56:52 It doesn’t matter how many times you fail.
0:56:54 You don’t know about my powdered milk company.
0:56:55 You don’t know about my bar.
0:56:57 You don’t know about all these businesses that failed.
0:57:06 But I just kept on compounding my knowledge and kept on always learning, always reading, and just always being curious, always being competitive.
0:57:08 That’s what it takes.
0:57:15 You know, and how you define success may be different than me, but however you define it, you can get there.
0:57:22 That moment where you’re asked to speak on this topic and you have two options there.
0:57:27 Either you say, I’m actually not that knowledgeable about that topic, but I’m happy to talk about these topics here.
0:57:28 I can do that.
0:57:30 Or you say, you know what?
0:57:30 Fuck it.
0:57:38 I’m just going to figure out what this is and I’m going to talk about it anyway and I’m going to make sure that I say something that I really believe and that I think is correct.
0:57:40 Those are two choices.
0:57:44 I’m wondering, one of them has a sense of caution.
0:57:47 The other has a sense of risk, danger, and bravery.
0:57:56 Has that always existed in you, this desire to go above and beyond to push yourself no matter what the situation is?
0:57:59 Or is this something that you’ve trained over the years?
0:58:00 I mean, it’s all imposter syndrome.
0:58:01 Right?
0:58:03 And I just don’t want to get busted.
0:58:05 You know?
0:58:08 And I’m not afraid to say I don’t know.
0:58:15 You know, even like that speech I alluded to, they really zigged off into other areas I knew, so this is just a starting point.
0:58:16 But I don’t have a problem.
0:58:24 You know, and going back to AI, the one thing that humans do that AI can’t do: they say, I don’t know.
0:58:37 You know, with an AI model, the programmers behind it will set thresholds for the probability that they’re right, above which it’ll give the answer.
0:58:39 But they have no context at all.
0:58:44 Humans, inherently, we understand what’s going on around us.
0:58:47 And if there’s something we don’t know, we have the capability.
0:58:48 We might not do it, right?
0:58:50 And maybe that gets you elected president.
0:58:54 But most of us are able to say, I don’t know.
0:58:58 And that’s always going to make us different than technology.
0:59:00 I mean, that’s the difference, right?
0:59:01 Right there.
0:59:09 You mentioned this idea of achieving success, and everyone has a different version of success, and yours might be different from others.
0:59:10 What is your version of success?
0:59:13 What does success mean to you at this point?
0:59:15 You’ve kind of conquered the money game.
0:59:20 Everyone wants you to run for some form of elected office.
0:59:24 People keep on doing this, and you keep on saying, no, I’m not going to do it.
0:59:29 But that’s an area of life that you could go on to conquer.
0:59:32 What does success look like for you at this point?
0:59:33 Just waking up every day excited.
0:59:36 I was happy when I was broke.
0:59:42 When I was sleeping on the floor with five roommates in a three-bedroom apartment, I was enjoying my life.
0:59:44 It was stressful in a lot of different ways.
0:59:48 You know, getting the credit cards cut up, the phone ringing all the time with bill collectors, et cetera, et cetera.
0:59:57 But I knew I was going to find a way, maybe not financially, but I was going to find a way to take care of myself and enjoy myself.
1:00:01 And it’s the same way now, but that’s one of the reasons I sold the Mavs.
1:00:06 That’s one of the reasons I got off of Shark Tank, because that joy for me comes from my kids and comes from my family.
1:00:18 And Shark Tank was always in June and September, which is my wife’s birthday, and effectively all but one of my kids is in August, and the other two are in September.
1:00:22 My anniversary is in September, and I was always missing it, you know?
1:00:24 And then the season started, and so I was gone.
1:00:37 I was gone a lot, and I was like, no. You know, of all the things I want to say on my deathbed, being president, being rich, being called dad? Yeah.
1:00:40 Mark Cuban is a serial entrepreneur and investor.
1:00:47 He’s founded several companies, including Microsolutions, Broadcast.com, and most recently, Cost Plus Drugs, a company focused on reforming U.S. drug pricing.
1:00:55 He bought the NBA’s Dallas Mavericks in 2000, led them to a championship in 2011, and sold majority control in 2023.
1:01:01 His ventures in media include HDNet, Magnolia Pictures, 2929 Productions, and Shark Tank.
1:01:03 He lives in Dallas with his wife and three children.
1:01:06 Mark, this was excellent.
1:01:08 So good to have you on the show, and we really appreciate your time.
1:01:09 Thanks, Mark.
1:01:10 Yeah, thanks, Ed.
1:01:10 Thanks, Scott.
1:01:11 It was always a lot of fun.
1:01:12 Thank you, guys.
1:01:23 Scott, your thoughts?
1:01:25 Look, I like Mark.
1:01:27 He’s very likable.
1:01:32 I don’t know many people who don’t like him, and I like the fact that he tries to be centrist.
1:01:41 I like the fact that he’s very much a capitalist, that he tries to give both sides sort of equal airtime.
1:01:46 I think it’s refreshing to have someone who really is sort of, I’d call him sort of center-right.
1:01:49 I like what he’s doing with Cost Plus.
1:01:50 He’s always entertaining.
1:01:55 I think he’s definitely, personally, I think he’s leaving the door open to run for presidency,
1:02:07 because I’m just not entirely sure why he would come on shows like this unless, I mean, there just aren’t a lot of people looking to buy, you know, Zolopram on Cost Plus who listen to the show.
1:02:08 Never know.
1:02:10 You never know.
1:02:18 I do think he’s probably thinking that he wants to leave the door open around president.
1:02:20 But in general, I think he’s a good role model for young men.
1:02:22 I think he’s smart, thoughtful.
1:02:24 I think he’d actually be a really interesting voice.
1:02:27 I’m not sure I want him to be president, but I’d like him to run.
1:02:37 I think he’d add a lot of really interesting dialogue and also make sure the race kept its eye on capitalism, if you will.
1:02:37 Your thoughts, Ed?
1:02:51 I was very interested to hear his thoughts on those last few questions on, especially around imposter syndrome, this idea that, you know, he really has a love of the game, the games being business.
1:03:02 He’s never down to not be playing in the game, whether that’s in sports or in media or in entertainment or in technology or in healthcare.
1:03:08 I mean, he just has this absolute love of business and this love of competing.
1:03:14 And I feel like that’s a big part of why he’s been so successful is that he basically doesn’t seem to say no.
1:03:20 And I just wanted to get your reactions to that philosophy.
1:03:21 What do you make of that?
1:03:25 My sense of Mark, and I don’t know him well, but I know him a little bit, is that he’s good at life.
1:03:28 He’s just really, he’s taken risks, but he also got really fucking lucky.
1:03:31 I mean, broadcast.com got bought for $6 billion.
1:03:33 Do you use broadcast.com?
1:03:33 I don’t.
1:03:37 It was internet radio, I think, and he sold it for $6 billion.
1:03:41 So, back in the kind of go-go days of ’98 or ’99.
1:03:44 So, but then again, he kept trying until he got lucky.
1:03:46 So I don’t begrudge his billions.
1:03:49 Well, let me ask you this then.
1:03:51 Do you have the same approach?
1:04:00 I mean, when he was describing his approach, I feel like you have a similar philosophy, which is you’re always going to be in the game.
1:04:07 You’re going to be competing, whether that’s in e-commerce or podcasting or investing.
1:04:11 I feel like that’s a big part of your philosophy too, no?
1:04:21 These, for me, are the salad days, because when I was your age, I was working just so hard and doing so many things I didn’t want to do to try and establish an economic base.
1:04:37 And now, you know, that I’ve gotten some economic security, I can do exactly what I want and only what I want.
1:04:44 And I think it’s similar to Mark: do fun stuff where you’re in the game, where you’re trying to build shit, trying to make money.
1:04:46 And if you don’t like it, you can leave.
1:04:50 And, you know, if he was still trying to make money and build awareness, he wouldn’t have left Shark Tank.
1:04:53 But now, the biggest luxury in the world is no.
1:04:59 And one of my role models, Barry Rosenstein, he said there are three buckets in life.
1:05:03 There’s things you have to do, there’s things you want to do, and there’s things you should do.
1:05:06 If your biggest investor’s in town, you have to do it.
1:05:08 There’s things you want to do.
1:05:12 Yeah, you want to take your son to see, you know, Arsenal play Wolves or whatever.
1:05:14 And then there’s things you should do.
1:05:18 And that is you think, well, Scott’s got this event.
1:05:21 I don’t really want to go, but I should go because Scott’s my boss.
1:05:26 Or my friend’s daughter, my friend’s sister is having her bat mitzvah.
1:05:29 Last fucking thing I want to do, but I should do it.
1:05:31 There’s a networking event.
1:05:35 I’m exhausted, but I should go in case I, you know, meet good people.
1:05:42 The key to getting to a point of economic security and a certain level of self-actualization is you can eliminate the should bucket.
1:05:45 And, like Mark, I have that luxury.
1:05:52 I get the sense Mark is doing a few things he has to do, but mostly doing things he wants to do.
1:05:53 But you want to stay in the game.
1:05:54 You want to stay mentally active.
1:05:56 It’s fun to make money.
1:05:57 You want to be relevant.
1:05:58 A lot of it’s ego-driven.
1:06:02 I think, quite frankly, what I probably share with Mark, I think we both have pretty healthy egos.
1:06:07 And you want to be relevant and in the game and get the affirmation of others.
1:06:15 But also, as you get older, if you have a certain amount of economic security, you do get some perspective.
1:06:18 And you’re like, you start recognizing the finite nature of life.
1:06:25 And that is, you’re like, okay, I got to start spending more time with people that I care about and they care about me, especially your kids, because your kids are leaving.
1:06:27 So, you do change.
1:06:38 You absolutely do change your perspective. You know, there’s nothing like impending death to create perspective.
1:06:45 And it’s getting increasingly impending when you’re 60 and 67, which I am and which Mark Cuban is.
1:06:50 It just changes your view on things because when you’re your age, you can’t imagine it ending.
1:06:51 So, you don’t think that way.
1:06:54 And that’s healthy because you should be taking risks and doing things.
1:07:04 But in terms of imposter syndrome, what I would suggest, Ed, is that when you feel imposter syndrome, it’s common sense.
1:07:06 You are in over your head, Ed.
1:07:09 You are out over your skis.
1:07:11 Little humor.
1:07:12 Little humor.
1:07:22 This episode was produced by Claire Miller and Alison Weiss and engineered by Benjamin Spencer.
1:07:27 Our research team is Dan Shallan, Isabella Kinsel, Kristen O’Donoghue, and Mia Silverio.
1:07:32 Drew Burrows is our technical director, and Catherine Dillon is our executive producer.
1:07:35 Thank you for listening to Prof G Markets from the Vox Media Podcast Network.
1:07:41 If you liked what you heard, give us a follow and join us for a fresh take on markets on Monday.
1:08:06 We’ll see you next time.
1:08:18 Prof G Markets from the Vox Media Podcast Network.
This podcast episode, hosted by Scott Galloway with guest Andrew Ross Sorkin, connects several seemingly disparate but closely related topics, beginning with an analysis of tipping culture and expanding into discussions of corporate governance, climate change, and the future of markets and media.
The conversation opens with an analysis of the modern phenomenon of “tip inflation.” Galloway and Sorkin dissect the social backlash against ubiquitous digital tipping prompts, arguing that technology designed for convenience has instead bred resentment, driving average tip amounts down. They connect this to broader economic anxiety and a cultural shift in which tipping feels less like a voluntary gratuity and more like a mandatory tax. This leads to a critique of platforms like GoFundMe, which the hosts argue have perversely inserted a “tipping” model into charitable disaster relief.
A significant portion of the episode addresses the political and economic ramifications of nepotism and influence. Using the news of Donald Trump Jr. joining the board of prediction-market company Kalshi as a springboard, Galloway argues passionately against normalizing powerful families monetizing their political connections, drawing a direct comparison to the criticism aimed at Hunter Biden. He warns that this behavior turns the economy into a “kleptocracy” and challenges corporate boards on their fiduciary duty to avoid such conflicts of interest.
The final major topic is a deep discussion of current economic themes with Sorkin. He identifies the bond market as a crucial “governor” on any new administration’s fiscal policy, highlighting the risk of rising interest rates. The discussion then turns to the AI boom, where Sorkin is skeptical of the sky-high valuations of many AI software companies, suggesting that the underlying large language models may become commoditized. He is more optimistic about the enabling “picks and shovels” infrastructure, such as chips and energy. The episode closes with a meta-conversation about the crisis of trust in media, with Sorkin emphasizing that the future success of news organizations lies in cultivating trustworthy, authentic personalities and reporters whom audiences believe are acting in good faith.
Surprising Insights
- Tipping technology backfired: The digitization of tipping, designed to make it easier and more seamless, has ironically led to widespread consumer resentment and a decline in average tips, because the constant prompts feel coercive.
- The bond market as a political check: Sorkin argues that the real constraint on a new administration’s ambitious (and potentially inflationary) fiscal policies may not be the opposition party but the global bond market, which can punish profligacy by driving up borrowing costs.
- Private equity’s liquidity crisis: The alternative-investment ecosystem, particularly private equity, is described as “broken,” with firms unable to exit investments at expected valuations, creating a vicious cycle of marking assets to “make-believe” and struggling to raise new funds.
- AI’s moat may be shallow: Contrary to the hype, there is a serious argument that the core technology of large language models (LLMs) lacks a deep moat and is rapidly being commoditized, implying that the enormous valuations of many AI software companies rest on shaky foundations.
- Mainstream media’s asymmetric battle: The podcast argues that traditional news outlets, bound by fact-checking and legal liability, are fighting a losing battle against social media platforms that face no such constraints, creating a profound asymmetry in the information ecosystem.
Practical Takeaways
- To counter tip-inflation resentment, consider using cash where possible. It removes the awkward digital prompt, makes the gratuity feel more personal and voluntary, and ensures the money goes directly to the worker.
- When evaluating corporate news, scrutinize board appointments and executive hires for signs of nepotism or the monetization of political access. Ask whether a hire’s primary qualification is merit or connections.
- For investors, look past the hyped AI software plays and consider the essential enabling infrastructure, such as companies making chips, building data centers, or supplying energy, which may have more durable business models.
- In wildfire-prone areas, factor the true cost of risk into real estate decisions. Relying on state-backed insurance plans or expecting taxpayer bailouts is a risky strategy; the true cost of living in high-risk zones is becoming clearer and more expensive for individuals.
- Support journalism by following individually trustworthy reporters, not just brands. In an era of declining institutional trust, the credibility and good-faith effort of specific journalists is becoming the most valuable currency.
Ed and Scott are joined by Mark Cuban, serial entrepreneur and investor, to discuss the challenges facing today’s media ecosystem. Mark shares his take on what happened with Jimmy Kimmel, his thoughts on the TikTok deal, and how he believes AI will ultimately impact employment. He also explains why he thinks social media should be age-gated — and why, despite the concerns, AI could end up helping Hollywood rather than hurting it.
Subscribe to the Prof G Markets newsletter
Order “The Algebra of Wealth” out now
Subscribe to No Mercy / No Malice
Follow Prof G Markets on Instagram
Follow Scott on Instagram
Learn more about your ad choices. Visit podcastchoices.com/adchoices
