AI transcript
0:00:02 Support for the show comes from Public.com.
0:00:08 You’ve got your core holdings, some high conviction picks, maybe even a few strategic options at play.
0:00:11 So why not switch to the investment platform built for those who take it seriously?
0:00:17 Go to Public.com slash PropG and earn an uncapped 1% bonus when you transfer your portfolio.
0:00:19 That’s Public.com slash PropG.
0:00:21 Paid for by public investing.
0:00:24 All investing involves the risk of loss, including loss of principal,
0:00:27 Brokerage services for U.S.-listed registered securities,
0:00:31 options and bonds in a self-directed account are offered by Public Investing, Inc.,
0:00:33 member FINRA and SIPC.
0:00:37 Complete disclosures available at Public.com slash Disclosures.
0:00:40 Listen closely.
0:00:44 That’s not just paint rolling on a wall.
0:00:47 It’s artistry.
0:00:55 A master painter carefully applying Benjamin Moore Regal Select eggshell with deftly executed strokes.
0:01:02 The roller, lightly cradled in his hands, applying just the right amount of paint.
0:01:03 Hmm.
0:01:05 It’s like hearing poetry in motion.
0:01:07 Benjamin Moore.
0:01:08 See the love.
0:01:13 Did you lock the front door?
0:01:13 Check.
0:01:14 Closed the garage door?
0:01:15 Yep.
0:01:18 Installed window sensors, smoke sensors, and HD cameras with night vision?
0:01:19 No.
0:01:25 And you set up credit card transaction alerts, a secure VPN for a private connection, and continuous monitoring for our personal info on the dark web?
0:01:28 Uh, I’m looking into it?
0:01:30 Stress less about security.
0:01:34 Choose security solutions from Telus for peace of mind at home and online.
0:01:37 Visit telus.com slash total security to learn more.
0:01:38 Conditions apply.
0:01:40 Today's number is 10.
0:01:45 That’s the percentage of marriages worldwide that are between first or second cousins.
0:01:47 Ed, genuine question here.
0:01:54 Is finding out that your partner has sucked more than a hundred dicks upsetting, or is my wife overreacting?
0:02:10 You know, I was annoyed that you’re 20 minutes late, but you’ve suddenly just papered over that, and now I’m happy.
0:02:11 I’ve redeemed myself.
0:02:11 I was upstairs.
0:02:13 Now I’m happy.
0:02:17 I was upstairs with the dogs, and then finally Drew called, and I have no sense of the calendar.
0:02:19 I apologize for being late.
0:02:24 I was either going to go with that one, or I had a joke on Pippa, which I really like, and that is the—
0:02:25 Did you ever see the movie The Exorcist?
0:02:26 No.
0:02:27 Oh, my God.
0:02:30 It’s an incredible horror film.
0:02:39 I made the mistake of sneaking in to see it when I was 13 with my friend Adam Markman, and when I got up in the morning, I would have to go into the corner and put my socks on in the corner.
0:02:43 I was so frightened of a demon possessing me.
0:02:57 It’s about a girl, Linda Blair, who gets possessed by the devil, and about all these attempts to exorcise her, and it’s called The Exorcist because this priest, played by Max von Sydow, and I forget who played the other guy,
0:03:02 come in to try and, you know, exorcise the devil from this poor little girl.
0:03:04 Ellen Burstyn plays the mother way before your time.
0:03:14 Anyways, there’s a rumor that they’re planning a sequel to The Exorcist, but this time it’s about getting the priest out of the little boy.
0:03:24 Oh, how are you, Ed?
0:03:27 What’s going on with NVIDIA?
0:03:29 What’s going on with NVIDIA?
0:03:32 Look at Ed.
0:03:33 Ed loves that joke.
0:03:36 It’s that Princeton pedo humor.
0:03:49 The fact that every joke has to be punctuated with a groan is another piece of this podcast I’ll never quite wrap my head around.
0:03:50 That’s right.
0:03:51 That’s right.
0:03:54 Guess who I’m having dinner with tonight?
0:03:55 I’m totally name dropping.
0:03:59 I probably shouldn’t say that in the same sentence as a pedo joke.
0:04:01 I’m having dinner with the chancellor of UCLA.
0:04:03 To discuss your vocational programming?
0:04:05 Well, Ed, I’m glad you asked.
0:04:07 You know, I don’t know.
0:04:10 I think it’s to discuss how I can give more money.
0:04:11 Probably that, yeah.
0:04:15 The reward for giving money to a university is that you get called and they want your insight into academia.
0:04:20 And they pretend to care for about 10 minutes.
0:04:21 And then comes this opportunity.
0:04:23 It was presented as an opportunity.
0:04:27 And to match some other gift or do something.
0:04:31 But, yeah, I have something called, let’s talk about the accelerator program.
0:04:32 So I’m a big fan.
0:04:35 Public school changed my life.
0:04:36 This isn’t philanthropy.
0:04:38 This is an overdue payback.
0:04:43 And I give money to public universities and I always say, find me the program no one else will give money to.
0:04:46 So I don’t want sexy shit.
0:04:52 I don’t want an engineering school or a school of international relations or a gym or none of that bullshit.
0:04:59 So at UW-Madison, I’m the second donor to a wonderful program where the professors go to local penitentiaries.
0:05:06 And if you’re near your release or like less than five years from your release, you can take courses to work towards a BA.
0:05:07 That’s cool.
0:05:09 And at UCLA, I’m doing adult education.
0:05:12 Adult education, it’s like vocational programming.
0:05:13 No one wants to fund it.
0:05:16 And I love it because it’s very low cost.
0:05:17 No admissions policy.
0:05:24 And I’m doing, I love that this is the, we should call this section of the program Scott's Virtue-Signaling.
0:05:25 I did-
0:05:26 Prof G. Philanthropy.
0:05:27 University of Georgia.
0:05:30 I’m doing tenant rights at Berkeley.
0:05:34 I did scholarships for children of immigrants as I am one of them.
0:05:36 The big question is, do you get your name on any of these things?
0:05:38 No, they’ve asked.
0:05:42 They wanted to name one of the programs, and I don’t want to have my name on anything.
0:05:44 Because you don’t want anyone to know that you’re doing this.
0:05:46 Well, yeah, I’m so shy.
0:05:48 You know me.
0:05:48 I’m very subtle.
0:05:50 I don’t like to do that.
0:05:52 I don’t like to talk about this sort of thing.
0:05:54 No, I’m being very honest now.
0:05:57 They wanted to call this the, whatever, the Galway program.
0:06:02 And I’m like, no, because eventually they’re going to decide that my jokes or something I did was totally inappropriate.
0:06:08 And my kids are going to have to fight to keep my name on this thing as I’m 90 and reheating soup.
0:06:11 And everyone decides I’m a, you know, name your favorite -ist.
0:06:17 If they can come up with reasons why they need to tear down Winston Churchill’s statue in London,
0:06:22 they can figure out a million things why they’re going to need to take my name off a program.
0:06:23 So I’m like, no, I don’t.
0:06:26 It’s going to be a giant compilation of every intro to Prof G. Markets.
0:06:27 That’ll be your downfall.
0:06:28 Yeah, that’s it.
0:06:30 They’ll be like, oh, my God, I can’t believe you said that.
0:06:31 You know, national.
0:06:34 Anyway, so, no, I don’t have my name on.
0:06:36 I’m not going to put my name on any of this shit.
0:06:36 Fair enough.
0:06:39 Well, I’m glad that we talked about it and people know anyway.
0:06:40 That’s the most important thing, right?
0:06:41 There you go.
0:06:42 There you go.
0:06:56 Tech companies are racing to secure data centers to handle the surge in AI computing.
0:07:03 We’ve discussed this trend at length on the show, but more data centers means more demand
0:07:04 for energy.
0:07:10 In fact, S&P Global projects grid demand from data centers will rise 22% in 2025, and it will
0:07:12 nearly triple by 2030.
0:07:18 But the part that is talked about less is who is footing the bill.
0:07:20 Right now, it looks like consumers might be paying.
0:07:28 Bloomberg found electricity costs near major data center hubs are up as much as 267% compared
0:07:30 to five years ago.
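A quick note on the arithmetic here, since percent increases are easy to misread: "up 267%" means the new price is 3.67 times the old one, which is why the hosts later describe electricity costs near data center hubs as having "more than tripled." A minimal sanity check (the $100 starting price is an arbitrary illustration, not a figure from the episode):

```python
# A 267% *increase* means the new price is (1 + 267/100) times the old price.
old_price = 100.0          # arbitrary baseline for illustration
increase_pct = 267         # Bloomberg's figure quoted above
new_price = old_price * (1 + increase_pct / 100)
multiplier = new_price / old_price
print(multiplier)          # ~3.67, i.e. "more than tripled"
```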
0:07:33 So, Scott, everyone’s very excited about data centers.
0:07:35 Everyone’s very excited about chips.
0:07:38 We’ve seen a ton of announcements about these new data centers.
0:07:43 OpenAI is building trillions of dollars worth of data center capacity.
0:07:46 They want to build 250 gigawatts of data center capacity.
0:07:50 But the part that’s getting, again, less attention is the energy side.
0:07:52 Two big questions for me.
0:07:55 One, how are we actually going to power these things?
0:07:59 And two, if we can power them, what is it going to do to the cost of energy?
0:08:05 Because as that Bloomberg article points out, in areas where data centers have been built,
0:08:08 energy costs have more than tripled in just five years.
0:08:12 And meanwhile, the plan is to build more data centers.
0:08:13 So, a lot to get into here.
0:08:17 I will start or I will pause there and get your reactions.
0:08:24 The greatest arbitrage in history is the arbitrage of fossil fuels and, in general, energy.
0:08:30 The more energy you produce, I mean, there’s just, energy is like broadband.
0:08:33 No matter how much of it you have, people will find a use for it.
0:08:39 Now, that’s not to say the prices don’t fluctuate, but the spike in this incremental demand and
0:08:42 the kind of price discovery happens at the frontier, right?
0:08:47 So, even though I think it’s only increased demand by 6% or 7%, that’s enough to
0:08:50 take prices up 25%, 50% or 100%.
0:08:52 So, what do we do about that?
0:08:58 Do we end up doing what we do with a lot of places and that is invest overseas?
0:09:03 Should we be building the data centers in the Gulf?
0:09:08 And the reason why that probably will happen or that might happen is that for the last 20
0:09:14 years, the dynamic in terms of capital flows has been that a bunch of institutional money
0:09:23 managers fly to Riyadh, Dubai or Doha or Abu Dhabi and say, I run a biotech fund in the U.S.
0:09:28 Please give me $100 million, a billion dollars to invest it and I will return back more capital.
0:09:35 The shift is that now the Gulf investors are, yeah, they’re still looking for alpha.
0:09:36 They’re still looking for good managers.
0:09:42 But even more than that, they’re looking to fund projects that create an infrastructure
0:09:45 around manufacturing, services, education.
0:09:51 The Electronic Arts take-private was part of that because they realize that while they have
0:09:57 what feels like near infinite capital, the sea of oil beneath them at some point will run out.
0:10:02 Now, I don’t know if that’s 20 or 30 years or 60 or 80, but they know they do have a fuse
0:10:03 around trying to transition this economy.
0:10:08 So, when you go down there, they’re open for business, but they want to invest in companies
0:10:14 that will build some sort of local demand, local infrastructure, local business operations.
0:10:19 So, I can see a situation where these companies make massive investments in a company like OpenAI
0:10:26 in exchange for building those data centers with those processors in somewhere in the Gulf.
0:10:27 What are your thoughts?
0:10:32 I’m just really struck by how everyone is so excited about these data centers and we’re just
0:10:34 seeing this record investment into data centers.
0:10:38 And meanwhile, people are not talking enough about energy.
0:10:40 And the fact that we need to power these things.
0:10:44 And, you know, people were talking about it maybe a year ago.
0:10:50 I saw, you know, op-eds and newspapers and people talking about it publicly on podcasts saying,
0:10:54 oh, you know, this AI thing, it’s going to be a lot of power demand.
0:10:55 So, we’re going to need a lot more power.
0:10:58 I know Bill Gates was talking a lot about that.
0:11:04 But the reality is, right now, there is not nearly enough investment going into energy
0:11:08 compared to the amount of investment that’s going into data centers.
0:11:14 And by the way, that is why all of these construction plans for these data centers keep getting
0:11:15 delayed.
0:11:20 Because what happens is they announce these plans, these multi-billion dollar deals.
0:11:21 They say, we’re going to build this thing.
0:11:26 And then they realize that the waiting list to actually connect the data center to the source
0:11:32 of power, the thing that’s going to keep the lights on, is like five, six, seven years long.
0:11:38 And so, essentially, what we have here is a very simple and classic problem.
0:11:44 The demand for energy, because of AI, is way outstripping the supply.
0:11:48 And so, then your mind goes to Econ 101.
0:11:50 What does that mean?
0:11:52 It means energy costs are only going to go way up.
0:11:58 And I just want to sort of back up and highlight some of the numbers that we’re mentioning there.
0:12:05 So, as I mentioned, OpenAI, they want to build out 250 gigawatts worth of compute
0:12:07 over the next few years.
0:12:10 I think it would be by 2033, 2032.
0:12:14 So, let’s just talk about how realistic that is.
0:12:21 Last year, the entire U.S. added about 56 gigawatts worth of energy capacity.
0:12:23 And that was the most ever.
0:12:28 OpenAI wants to build out a chip network that would consume 250 gigawatts.
0:12:36 So, that is going to be equivalent to a quarter of the entire U.S. electric grid capacity.
0:12:43 It would also mean that OpenAI alone would consume more than half of all of the energy capacity
0:12:46 that we add over the next eight years.
0:12:52 If you wanted to supply 250 gigawatts of power, you would need to build 250 nuclear power plants.
0:12:54 It would cost you $12.5 trillion.
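The figures quoted above hang together arithmetically. A sketch using only the numbers stated in the conversation (the ~1,000 GW grid size and the ~$50 billion per plant are implied by dividing the quoted figures, not stated directly):

```python
# All inputs are figures quoted in the conversation above.
openai_target_gw = 250        # OpenAI's stated build-out target
record_annual_add_gw = 56     # most capacity the U.S. has ever added in one year
total_cost_trillions = 12.5   # quoted cost of 250 nuclear plants
plants = 250                  # nuclear plants needed, implying ~1 GW each

# 250 GW is roughly 4.5 record years' worth of new capacity...
print(openai_target_gw / record_annual_add_gw)        # ~4.46

# ...and more than half of eight years of additions at the record pace.
print(openai_target_gw / (record_annual_add_gw * 8))  # ~0.56

# "A quarter of the entire U.S. electric grid capacity" implies ~1,000 GW total.
implied_grid_gw = openai_target_gw / 0.25
print(implied_grid_gw)                                # 1000.0

# The quoted nuclear total implies about $50B per plant.
print(total_cost_trillions * 1000 / plants)           # 50.0 ($B per plant)
```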
0:13:01 And I think what is striking is that we’re all just assuming that this data center build-out is a given.
0:13:09 And yet, there’s this very practical logistical problem, which is how the fuck are you going to power these things?
0:13:15 And, you know, I think then the conversation goes to, okay, well, what do we do about it?
0:13:19 You know, do we just say no to AI?
0:13:22 I don’t think anyone, well, I’m sure some people want to do that.
0:13:24 In fact, actually, a lot of Americans do want that.
0:13:26 But I don’t think the market’s going to let you do that.
0:13:31 So, you either, like, pull back on AI or build out the grid.
0:13:35 And then it’s a question of how do you actually build out the grid?
0:13:41 And I think it’s a very—I mean, I just—this energy question becomes more and more interesting
0:13:47 because, you know, the administration, there’s been a lot of emphasis on how we need to drill baby drill.
0:13:49 We need to get our energy back.
0:13:54 All this focus on gas-fueled energy, fossil fuel energy.
0:14:01 But I would also like to point out that even if you use gas-fueled energy,
0:14:07 you’re not going to—you’re not even going to come close to the demand that these data centers are projecting.
0:14:11 It’s not even a question.
0:14:12 Maybe you get it with nuclear.
0:14:16 But again, as we’ve discussed, this takes decades and decades to build.
0:14:21 So, increasingly, I’m starting to believe that the route that you need to go is you need to go with solar energy.
0:14:28 But, of course, the administration is gutting all of the investments and the tax credits that fuel the solar industry.
0:14:30 So, I will pause again there.
0:14:36 But, I mean, I think the question is becoming,
0:14:41 clearly, we need more energy in America if we’re going to do this AI thing.
0:14:46 And if we don’t, the consequences are quite simple.
0:14:49 AI is going to absolutely skyrocket your energy bill.
0:14:55 This all feels to me like it’s desperate for some sort of technical breakthrough that reduces the energy demands.
0:15:01 Because the whole notion of—it’s sort of this circular philosophical question.
0:15:08 At what point do you not conduct an AI query for the best place to eat Indian food in London,
0:15:14 when the cost of that query inhibits you from being able to afford to go eat Indian food?
0:15:20 Or that AI becomes something that is sequestered to wealthy people and corporations,
0:15:27 and then we outsource the expense of AI to households that see their electric bills go up 20%, 50%.
0:15:29 I’ve heard in some areas it’s gone up 200%.
0:15:33 Or at some point, does the government come in or weigh in and call it an infrastructure investment?
0:15:40 Or do we come up with some sort of tiered pricing system where we charge—instead of having a bulk discount,
0:15:43 if you’re a B2B, if you’re a business that’s buying this capacity,
0:15:47 at some point and you’re purchasing over a certain amount,
0:15:50 which probably means you’re an AI or an AI-related business,
0:15:54 you pay a surcharge, which they then pass on to their consumers,
0:15:55 which would hit their stock price.
0:16:00 But at this point, you would argue, if you need investment or deep pockets,
0:16:05 you’d probably go after the pools of shareholder capital of these organizations,
0:16:11 which are greater than the discretionary income of middle-class households in these areas.
0:16:17 And then it gets very political because every congressperson wants to—
0:16:23 I mean, so it’s no accident that Musk announced a huge infrastructure project
0:16:26 in Speaker Johnson’s district, right?
0:16:31 And so being able to cut the ribbon on a big data center or a colossus or whatever it is
0:16:32 makes that politician look very good
0:16:35 until the electric bills start coming for those consumers.
0:16:41 So I wonder if there’s going to be some sort of legislation that—I don’t know—
0:16:45 either the government weighs in and provides subsidies for people.
0:16:50 But if you think about, okay, we talk constantly about feeding and growing the middle class.
0:16:53 And one way to do that is through tax cuts.
0:16:55 But a better way to do it, I would argue,
0:17:00 because the reality is the middle class in America pay a lot of usage and consumption taxes,
0:17:03 but, distinct from the class warfare rhetoric,
0:17:05 the middle class doesn’t pay, as a percentage of their income,
0:17:07 a ton of federal income tax.
0:17:12 So I think about 70 or 80 percent of federal income tax is paid by the top 10 percent income-earning households.
0:17:17 But how do you—the best investment, I would argue,
0:17:20 or a great way to support the middle class is through infrastructure investments,
0:17:29 whether it’s rail, or power grid, or clean water, highways that help you get to and from your house faster,
0:17:37 so you can spend more time with your kids, or tax credits for housing such that people can have more affordable housing.
0:17:41 But it feels as if there needs to be a rethink around national energy policy.
0:17:51 The problem is that there’s such regulatory capture that, if I had to predict what’s going to happen here, my guess
0:17:57 is that there will be a huge investment in the grid, the majority of which will be absorbed by AI.
0:18:04 So effectively, all that is, is a transfer from all homes and all taxpayers to the shareholders of AI.
0:18:10 Yes. I think this is such an important—the amount that energy costs are going up is so important
0:18:13 because it’s highlighting how this stuff is actually expensive.
0:18:17 Like, it actually costs something, and someone has to pay for it.
0:18:23 And up to this point, it has seemed to most people that it’s just these, like, you know,
0:18:27 Qatari billionaires or these Silicon Valley billionaires who are just funding these things,
0:18:32 and there’s this abundance mindset, and it’s like, oh, a trillion dollars here, a trillion dollars there.
0:18:34 We’re just going to keep investing.
0:18:41 But this is sort of highlighting, actually, we’re not in a place of abundance, really,
0:18:45 or at least not in a place of abundance in the way that Sam Altman seems to think.
0:18:53 We actually have a scarcity of energy to the point that the costs are going to be felt by regular Americans.
0:19:02 And what has been so interesting recently is the local community pushback that has been growing against data center buildouts.
0:19:08 And this is very recent, but in the past few months, basically,
0:19:14 $63 billion worth of data centers have been blocked because local communities have gotten together,
0:19:18 and they’ve said, no, you’re not going to build this data center in our neighborhood.
0:19:22 It’s happened in Ohio, in Texas, in Indiana.
0:19:26 Google just withdrew one of their data center applications because they got all of this pushback.
0:19:33 And the argument from the people saying no is, basically, this actually doesn’t help us.
0:19:37 It doesn’t help our communities because it sends energy prices through the roof.
0:19:40 And then the other point, which I think is also really important that they’ve made,
0:19:44 is that it actually doesn’t even create jobs, really.
0:19:48 Because one thing about data centers, it’s not like a Walmart.
0:19:50 The data center is basically a giant robot.
0:19:55 And you build the thing, and there are construction jobs when you’re building it,
0:19:58 but then once it’s built, you basically just let it sit there,
0:20:00 and it’s only a few people who are servicing the thing.
0:20:04 You look at the data center that Apple built in North Carolina, as an example.
0:20:07 They spent a billion dollars on this data center.
0:20:10 It created less than 100 permanent jobs.
0:20:15 So that’s equivalent to, like, a Series B startup, like, showing up in North Carolina.
0:20:24 So huge costs on energy and temporary job creation through the construction.
0:20:29 But once it is built, actually, the local community doesn’t really benefit from it.
0:20:37 And as you say, it’s a robot that’s building shareholder value for the people who own AI stocks.
0:20:43 And so it’s becoming a really interesting, and I think this is going to become more prominent in the political sphere,
0:20:51 but it’s this NIMBY versus YIMBY debate, where people are saying, you know, not in my backyard.
0:20:54 God, and you’re YIMBY when it comes to housing.
0:20:56 I’m YIMBY when it comes to housing.
0:21:06 But I’ve got to say, I’m starting to find this argument against the data center build-out somewhat compelling,
0:21:09 especially if it’s going to triple people’s energy bills.
0:21:16 I mean, right now in Virginia, I guess about a quarter of the energy is already going to data centers.
0:21:21 And these things sort of typify people’s fears around where the world is going,
0:21:25 and that is these economic centers powering stocks don’t require anybody.
0:21:30 You don’t even need to turn on the lights during the day because there’s nobody working there.
0:21:38 So this is going to be – it is very difficult to spin up new energy sources, right?
0:21:40 It takes 10 years to build a nuclear power plant.
0:21:42 And the cost overruns, it’s just not easy.
0:21:49 So electricity companies or power generation companies have been some of the best-performing stocks.
0:21:52 Some of them have even outperformed AI stocks.
0:21:58 Constellation, the largest nuclear power operator in the U.S., is up 78% this year, more than any of the Magnificent Seven.
0:22:05 Brookfield Renewables, one of the largest public renewable energy companies, is up 47%, also outpacing the MAG-7.
0:22:08 American Electric Power, up 30%.
0:22:13 Entergy, another utility that focuses on the southern U.S., is up 20% year-to-date.
0:22:21 Because what you want as a business is moats: you want friction in supply.
0:22:24 You want a lack of friction in demand.
0:22:25 So what do we have?
0:22:29 We have demand that is just, like, accelerating with very little friction, right?
0:22:33 But at the same time, there’s a lot of friction around bringing on supplies.
0:22:36 So if you own the supply, it’s champagne and cocaine.
0:22:45 And the example I would use is that as someone who travels a shit ton and stays in nice places, you can’t spin up a five-star hotel.
0:22:50 The top 1% are garnering so much money, and there’s a, you know, YOLO attitude, live for today.
0:22:52 They’re traveling more.
0:22:53 There’s a lot of pent-up demand to travel.
0:23:01 So the demand at five-star resorts has vastly outstripped – five-star resorts are like nuclear power plants.
0:23:11 To find a place you can build a five-star hotel, get their permits, the local residents, plan the thing, raise the capital, and build a world-class hotel, it takes you 10 years.
0:23:21 So in between that lag, as consumption or demand goes up, you have seen hotel prices at nice hotels basically double since pre-COVID.
0:23:28 I think that’s kind of a relatively, you know, a reasonable analogy for what’s about to happen to power consumption.
0:23:36 And that is there is no kind of easy spin-it-up, large-scale, multi-gigawatt power supply available.
0:23:45 And it all feels to me, again, like wherever we head, based on the regulatory capture, based on who the president is dining with,
0:23:54 that Doug Burgum is going to figure out a way to come up with some sort of big, beautiful bill that taxes every American
0:24:07 and transfers wealth, in the form of subsidized energy, to these AI companies the same way we’re subsidizing all sorts of other things for, you know, different politically connected sectors.
0:24:12 I think that the energy build-out is the most important thing.
0:24:18 And just some – I mean, when we talk about like what do we do about it, just some data I would point you to.
0:24:23 I mean, again, there’s been this emphasis on we need gas to fuel this energy.
0:24:33 The most amount of gas-fired energy that has been deployed in a year, the most, has been 12.5 gigawatts.
0:24:35 OpenAI wants 250 gigawatts.
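Taking the two figures just quoted at face value, the gap is easy to see: even repeating the best gas-deployment year ever, every single year, would take two decades to reach the target.

```python
# Both inputs are the figures quoted above.
record_gas_add_gw_per_year = 12.5  # best year ever for new gas-fired capacity
target_gw = 250                    # OpenAI's stated target
years_needed = target_gw / record_gas_add_gw_per_year
print(years_needed)                # 20.0 years of record-pace gas build-out
```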
0:24:39 So, you’re not going to do it with gas.
0:24:44 As you point out, maybe you get it with nuclear, but it takes decades to build that out.
0:24:49 And as one of our research analysts on the team, Chris Nodonis, said,
0:24:53 the problem isn’t that we need energy supply, it’s that we need energy supply now.
0:24:56 That’s exactly what AI companies are asking for.
0:24:58 We need the energy to come online right now.
0:25:01 So, I think you’ve got to do it with solar.
0:25:06 Solar made up more than half of all of the new energy capacity that was added last year.
0:25:14 But for whatever reason, probably for anti-woke reasons, the administration thinks that solar is hippy-dippy or woke.
0:25:16 Trump has this hatred of solar.
0:25:19 And so, there is this attack on the solar build-out.
0:25:23 And because you mentioned the Big Beautiful Bill, because of the Big Beautiful Bill,
0:25:27 we’re seeing less investment in solar.
0:25:29 We’re seeing elimination of tax credits.
0:25:34 And as a result, it’s going to reduce solar production capacity by about 20%.
0:25:35 Those are the projections right now.
0:25:42 So, all this comes down to, if we want to do AI, we need more focus on the energy side of things.
0:25:45 And we need more attention on how do we actually build out the energy.
0:25:48 And it’s not just a question of drill, baby, drill.
0:25:49 It’s a question of everything.
0:25:51 It’s a question of wind.
0:25:52 It’s a question of solar.
0:25:54 It’s a question of nuclear.
0:26:03 And I just, it seems as if the conversation has moved away from that because people are so excited about the AI thing
0:26:06 that they’ve gotten way out over their skis on this.
0:26:14 And there needs to be more of a rational conversation of like, okay, sounds good, but let’s keep the lights on.
0:26:16 How do we actually power these things?
0:26:22 Problems like this, when they’re this obvious and this big, oftentimes don’t end up happening.
0:26:26 And that is, the shit you’re worried about is the shit that, generally speaking, doesn’t happen.
0:26:30 And I can see a scenario along the lines of the following.
0:26:36 Sam Altman, in order to justify this valuation, is making these just kind of enormous outlandish projections,
0:26:42 including this is how much power I will need because I’m basically saying to the market,
0:26:48 this is how much AI is going to permeate all corners of society.
0:26:51 This is how big my company is going to be.
0:26:53 I need 250 gigawatts, right?
0:26:59 It’s sort of like announcing to the world, we’re going to hire 3 million employees based on our current growth.
0:27:00 Buy my stock.
0:27:01 This is how confident I am.
0:27:12 So one, there’s still a scenario where AI is an interesting technology, but it’s not powering everything in our lives.
0:27:13 We’re not glued to it every day.
0:27:20 It’s not operating every car that picks us up and doing all our finances and in charge of all of our workflow.
0:27:27 That it has some uses, but it’s more niche, and it never really gets the traction that the current valuations justify, too.
0:27:34 That there’s some sort of breakthrough where the energy consumption demands just dramatically diminish.
0:27:41 I mean, to a certain extent, the Civic, the Honda Civic came in, and it was a better car.
0:27:52 But the reason it was a better car was because it cost you $12 a week to power the thing instead of $22 because it just had a more efficient engine and a smaller car wrapped around the engine that just had better mileage.
0:27:57 So I think there’s a lot of things here that could change the calculus.
0:27:59 What’s clear is we’re going to need more energy.
0:28:08 I also think that, like, I’m a drill baby drill person, but I also think we should have kept some of those subsidies for renewables.
0:28:10 Economics ultimately win out.
0:28:19 The number one producer of wind is Texas, and renewables are still right now probably the most cost-efficient incremental capacity to bring on.
0:28:27 Specifically, I believe wind and solar are the least expensive and quickest power generation to deploy, even without government subsidies.
0:28:33 I would also argue there’s a really strong geopolitical reason to keep energy prices low.
0:28:41 I think the war in Ukraine, and how long it goes on, is a function of oil prices.
0:28:49 And that is, if the price goes below $50 a barrel—I think it’s in the low $60s right now—I think Putin has to come to the table because 50% of his economy is based on the price of that.
0:28:56 And at some point, the cost of production is greater than that number, I think, if it gets to, like, $40 or $45.
0:29:06 So I think we have a geopolitical—a lot of reasons to massively invest in energy production across, you know, all the different dimensions.
0:29:09 So is it the old dirty stuff?
0:29:15 You know, I would argue that stuff is not going away as quickly as it should, but it’s also cost-ineffective.
0:29:16 Is it solar?
0:29:17 Is it wind?
0:29:20 Is it, you know, LNG?
0:29:25 It’s like—I think it’s, like, door number four, all of it, right?
0:29:28 And we’re the largest energy producer in the world.
0:29:30 We’re the largest oil producer.
0:29:32 I mean, people just don’t realize how strong we are economically.
0:29:33 Oil independent, food independent.
0:29:37 And we’re the number one producer of oil in the world.
0:29:47 And China, to their credit (and you pointed this out last week), is going to add, I believe, more solar power generation this year
0:29:54 than we have in total currently in the United States. Or at least that’s a stat you had a couple days ago, and it blew my mind.
0:29:55 But solar is woke.
0:29:58 I mean, China understands the issues.
0:30:00 I don’t know how these things got so politicized.
0:30:03 I’m building—by the way, on a ground level, I’m building a home.
0:30:08 And one of the things we’re trying to figure out is how to buy the solar panels now to qualify for the tax credits.
0:30:13 And all of these solar panels, from my understanding, come in from China.
0:30:17 Like, China’s made just a huge bet on renewables.
0:30:24 But my gut tells me that it’s so obvious that there’s just going to be such a demand for energy consumption.
0:30:31 And then Sam Altman is out there saying, I’m going to need the energy of the EU to power my unbelievable business.
0:30:33 I don’t know.
0:30:38 It’s all making me very skeptical that we’re going to need as much energy as Sam Altman is saying, or as much incremental energy.
0:30:51 Whatever the solution is, whether it’s less AI, or more efficient AI, or more energy, or more efficient energy, or cost-effective energy, the point is, something has to give.
0:30:53 Because you look at the numbers.
0:30:56 As you often say, the math ain’t mathin’.
0:31:14 Just to wrap it up, at some point, middle-class households are going to say, look, I’m not willing to not eat and not be able to buy groceries because my electricity bill is so high, such that Scott and Ed can edit the notes for their newsletter using Anthropic.
0:31:19 Or as we’re about to discuss, so that my son can, like, jerk off to AI porn on ChatGPT.
0:31:20 That’s a good segue.
0:31:21 Well done.
0:31:22 Very elegant.
0:31:25 There is a reason you’re not on CNBC.
0:31:26 Okay.
0:31:28 Back to you, Joe.
0:31:33 We’ll be right back after the break.
0:31:39 And if you’re enjoying the show so far, be sure to give Prof G Markets a follow wherever you get your podcasts.
0:31:49 Support for the show comes from Brex.
0:31:52 These days, every business leader is under pressure to save money.
0:31:55 But you can’t crush the competition just by cutting costs.
0:31:58 To win, you need to spend smarter and move faster.
0:31:59 You need Brex.
0:32:08 Brex is the intelligent finance platform that breaks the tradeoff between control and speed with smart corporate cards, high-yield banking, and AI-powered expense management.
0:32:12 Join the 30,000 companies that spend smarter and move faster with Brex.
0:32:15 Learn more at Brex.com slash grow.
0:32:20 Support for this show comes from Vanguard.
0:32:23 The lineup includes over 80 bond funds.
0:32:26 To all the financial advisors listening, let’s talk about bonds for a minute.
0:32:28 Capturing value and fixed income is not easy.
0:32:31 Bond markets can be massive, murky, and let’s be real.
0:32:35 A lot of firms throw a couple of flashy funds your way and call it a day.
0:32:36 But not Vanguard.
0:32:39 Vanguard bonds are institutional quality.
0:32:43 They’re actively managed by a 200-person global squad of sector specialists, analysts, and traders.
0:32:46 Lots of firms love to highlight their star portfolio managers.
0:32:50 Like, it’s all about that one brilliant mind making the magic happen.
0:32:52 Vanguard’s philosophy is a little different.
0:32:55 They believe the best active strategy should be shared across the team.
0:33:00 That way, every client benefits from the collective brainpower, not just one individual’s take.
0:33:04 So, if you’re looking to give your clients consistent results year in and year out,
0:33:09 go see the record for yourself at Vanguard.com slash markets.
0:33:12 That’s Vanguard.com slash markets.
0:33:14 All investing is subject to risk.
0:33:16 Vanguard Marketing Corporation Distributor.
0:33:22 Hi, this is Kara Swisher, host of On With Kara Swisher.
0:33:25 This week on my podcast, On With Kara Swisher,
0:33:28 I talk with Vice President Kamala Harris about her new book,
0:33:32 107 Days, as well as The Weaponization of the Department of Justice,
0:33:34 the tech industry’s rightward lurch,
0:33:38 and how she’s planning to fight against Trump’s march toward autocracy.
0:33:40 We take it live at the Warner Theater,
0:33:44 which is just a stone’s throw away from the White House,
0:33:45 where Donald Trump now lives.
0:33:46 Have a listen.
0:33:51 I don’t believe that the people who are flattering Donald Trump,
0:33:54 who are offering him favors,
0:33:59 necessarily believe in what he is doing or believe it is right,
0:34:01 but they want power.
0:34:05 And that’s part of what we are dealing with right now.
0:34:07 They want power,
0:34:11 and they’re not willing to compromise their power or access to power
0:34:14 for the sake of the Constitution and our democracy.
0:34:18 Listen to the whole conversation on On With Kara Swisher on YouTube
0:34:20 or wherever you get your podcasts.
0:44:31 We’re back with Prof G Markets.
0:34:34 The conversation around technology and mental health
0:34:36 has taken center stage in recent months,
0:34:38 and some tech giants say they are stepping up.
0:34:42 Meta added new parental controls on Instagram last week
0:34:45 using the PG-13 rating system to limit what teens see.
0:34:46 And back in September,
0:34:49 Sam Altman announced a 120-day plan
0:34:52 to roll out parental controls for ChatGPT users.
0:34:56 That move came after the parents of a teen who died by suicide
0:34:57 sued OpenAI,
0:35:00 accusing it of responsibility for their child’s death.
0:35:01 Last week, however,
0:35:04 Altman said that the problems with the chat bot
0:35:05 were, quote,
0:35:06 fully mitigated,
0:35:10 and then rolled many of those changes back.
0:35:12 So, Scott,
0:35:13 I think the question here is,
0:35:14 you know,
0:35:17 these companies are recognizing the problem,
0:35:18 and the question is,
0:35:20 are they really doing anything about it?
0:35:22 I will just first say,
0:35:24 let’s talk about Meta
0:35:26 and this PG-13 policy.
0:35:27 You know,
0:35:29 it sounds like progress
0:35:31 until you kind of remember
0:35:32 that they’ve had,
0:35:33 like,
0:35:35 many child safety announcements
0:35:37 that have happened over the past few years,
0:35:40 and the research has shown
0:35:43 that basically none of these features really work.
0:35:46 They’re really weak at protecting against self-harm content,
0:35:48 bullying content,
0:35:49 sexual content,
0:35:49 et cetera.
0:35:50 So, you know,
0:35:52 whatever efforts they have been making,
0:35:54 they’re not really good enough,
0:35:55 and it’s starting to appear to me
0:35:56 as if they’re more interested
0:35:59 in announcing these safety features
0:36:01 than they are in actually implementing them.
0:36:02 We’ll see,
0:36:04 with this PG-13 rating,
0:36:06 that track record isn’t great.
0:36:06 And then,
0:36:08 with Sam Altman,
0:36:11 he announced that ChatGPT
0:36:12 was, quote,
0:36:13 originally pretty restrictive
0:36:15 to be careful with mental health issues,
0:36:15 but he says
0:36:17 that they are now going to,
0:36:17 quote,
0:36:18 relax the restrictions
0:36:20 in most cases.
0:36:20 So,
0:36:23 some varying responses
0:36:24 to the mental health thing,
0:36:25 but what is clear
0:36:26 is that these companies
0:36:27 have to say something about it,
0:36:29 and they are saying things.
0:36:30 I think the question is,
0:36:32 are they going to really do anything about it?
0:36:33 If I didn’t know better,
0:36:35 I would think there’s a possibility here
0:36:36 that these firms are more concerned
0:36:37 with shareholder value
0:36:38 than the well-being of our children.
0:36:39 It’s just,
0:36:41 I’m starting to get that sense
0:36:42 or that feeling.
0:36:43 So, just as an example,
0:36:47 we downloaded an app called Qustodio,
0:36:48 I think.
0:36:51 And essentially,
0:36:51 it’s an app,
0:36:53 and I can control my 15-year-old’s phone,
0:36:55 and I can control
0:36:58 what apps he’s allowed to use.
0:36:59 It alerts me
0:37:01 to sensitive content
0:37:02 he might be searching for,
0:37:02 and I had sort of
0:37:03 a philosophical argument
0:37:05 with myself around,
0:37:05 should I really be
0:37:07 the East German Stasi
0:37:08 following my son?
0:37:09 But I am comfortable
0:37:10 being able to turn off his phone
0:37:12 and say,
0:37:12 okay,
0:37:13 it’s study time,
0:37:14 or you need to wind down,
0:37:15 or beyond a certain time of night,
0:37:17 all your apps are shut off.
0:37:21 And this is a nice little app.
0:37:21 I imagine it’s,
0:37:22 I don’t know if it’s
0:37:23 three engineers,
0:37:23 10 engineers,
0:37:24 20 engineers,
0:37:26 but you know every company
0:37:27 could do a much better job
0:37:28 of this app,
0:37:28 and they choose not to.
0:37:30 So, the fact that
0:37:31 Meta,
0:37:31 Alphabet,
0:37:33 and Apple
0:37:34 don’t have this app
0:37:35 readily available
0:37:36 or this feature
0:37:38 means they don’t want
0:37:39 to make it easy for you.
0:37:41 So, they do these parental controls,
0:37:42 which are mostly hand-waving.
0:37:43 I’ve tried to use
0:37:44 the parental controls on Meta,
0:37:45 and I just,
0:37:46 granted,
0:37:47 I’m in tech,
0:37:48 but I’m not good at this stuff,
0:37:49 but I just give up.
0:37:50 I find it confusing.
0:37:53 And they will create
0:37:54 the illusion of complexity here.
0:37:55 It’s very straightforward.
0:37:57 There should be no social media
0:37:59 for kids under the age of 16.
0:38:01 We just need to age-gate it.
0:38:02 There should be no smartphones
0:38:02 in schools.
0:38:04 And I’m increasingly believing
0:38:06 you cannot have
0:38:07 synthetic relationships
0:38:08 available to anyone
0:38:09 under the age of 18.
0:38:11 Because the collision,
0:38:12 I don’t know if you saw this,
0:38:14 but Sam Altman has,
0:38:15 you know,
0:38:16 announced that
0:38:17 we don’t want to be
0:38:17 in the business
0:38:18 of making moral judgments,
0:38:20 or some such bullshit.
0:38:22 We’re going to allow erotica.
0:38:23 I love that name.
0:38:24 It’s porn.
0:38:25 All right?
0:38:25 It’s not,
0:38:26 this isn’t,
0:38:27 this isn’t,
0:38:28 you know,
0:38:30 nibbling on someone’s ears
0:38:31 and poetry,
0:38:32 and all that, right?
0:38:33 You know,
0:38:34 I like
0:38:36 bukkake and Asian casting porn.
0:38:37 I mean,
0:38:37 it’s porn.
0:38:38 All right?
0:38:40 By the way,
0:38:40 by the way,
0:38:41 this is not my preferences,
0:38:42 just so you know.
0:38:45 My shit gets much darker
0:38:46 than that.
0:38:47 Anyways,
0:38:50 25% of search queries
0:38:50 on Google
0:38:51 are related to porn.
0:38:53 This is an enormous business.
0:38:54 And one of the reasons
0:38:56 this is a difficult sector
0:38:57 to have peer-reviewed research on
0:38:58 is there is no,
0:39:00 very little academic research on it
0:39:01 because no one wants
0:39:01 to be known
0:39:02 as the porn professor.
0:39:04 So what do we know?
0:39:05 We know that teens
0:39:06 are spending about
0:39:07 five hours a day
0:39:07 on social media
0:39:08 or about a third
0:39:09 of their waking hours
0:39:10 outside of school.
0:39:12 According to the CDC,
0:39:13 adolescent depression
0:39:14 is up 60%
0:39:16 since the introduction
0:39:17 of social media
0:39:19 on mobile.
0:39:20 And then from internal
0:39:21 meta research,
0:39:22 this is research they did,
0:39:23 one in three teenage girls
0:39:24 who experienced
0:39:25 body image issues
0:39:26 reported that Instagram
0:39:27 made them feel worse.
0:39:28 And so
0:39:30 you have
0:39:31 this collision
0:39:32 of very unfortunate things
0:39:32 and that is
0:39:33 this is a really,
0:39:34 having kids
0:39:35 on social media
0:39:36 and on these platforms
0:39:38 is a really profitable business.
0:39:39 Platforms earned
0:39:40 $11 billion
0:39:41 from kids
0:39:42 under the age of 18
0:39:42 in 2022
0:39:43 on pace
0:39:44 for $13 billion
0:39:45 in 2025.
0:39:45 Instagram’s
0:39:46 $4 billion
0:39:46 from teens.
0:39:48 YouTube gets
0:39:48 about a billion,
0:39:49 get this,
0:39:50 a billion in revenues
0:39:51 from viewership
0:39:52 from kids
0:39:52 that are under
0:39:53 the age of 12.
0:39:54 TikTok gets
0:39:55 $2 billion
0:39:56 from teens.
0:39:57 And ad spending
0:39:58 targeting minors
0:39:59 rose 24%
0:40:00 year-on-year
0:40:00 in 2025
0:40:01 reaching $8 billion,
0:40:03 up from $6.6 billion.
0:40:05 So what do you have?
0:40:06 We know that
0:40:06 the more time
0:40:07 we get kids
0:40:09 on phones,
0:40:09 the more depressed
0:40:10 they are
0:40:11 and the more money
0:40:12 these guys make,
0:40:12 right?
0:40:13 So we have linked
0:40:14 shareholder value
0:40:16 to teen depression.
0:40:17 That is not good.
0:40:19 And then where it’s
0:40:19 going to get
0:40:20 really fucking scary
0:40:22 is you have
0:40:23 a cohort
0:40:25 of young men
0:40:26 and teen boys
0:40:27 who get mixed messages
0:40:28 around what it means
0:40:29 to approach a woman
0:40:32 and are told,
0:40:33 and I think
0:40:34 some of this information
0:40:34 is good,
0:40:35 I think some of it
0:40:35 is not good,
0:40:37 that there’s,
0:40:38 you know,
0:40:39 no young man
0:40:40 in high school
0:40:41 or anywhere else
0:40:41 or, say,
0:40:42 in college
0:40:43 wants to be that guy,
0:40:44 the guy who makes
0:40:45 an unwelcome advance
0:40:46 on a woman
0:40:47 no matter how respectful
0:40:48 and then gets a reputation
0:40:49 as that guy,
0:40:49 as the creep,
0:40:50 right?
0:40:52 Or is not confident
0:40:53 or not in shape
0:40:54 or whatever it is
0:40:55 or hasn’t taken risks,
0:40:57 has not developed resilience,
0:40:59 is used to frictionless
0:40:59 online dating
0:41:01 where you just swipe right
0:41:02 and maybe get a date,
0:41:02 usually not.
0:41:04 And the result is,
0:41:04 okay,
0:41:05 if I’m 14,
0:41:06 if I’m a 14-year-old male
0:41:10 and I am not comfortable
0:41:11 around women
0:41:13 and I just haven’t learned
0:41:14 those skills yet,
0:41:15 which describes
0:41:16 most 14- and 15-year-olds,
0:41:17 that would be
0:41:18 a fairly apt description
0:41:20 for most adolescent males.
0:41:21 And then I can go home
0:41:23 and I’m getting
0:41:24 these synthetic,
0:41:27 visually near-perfect
0:41:29 images of a girl
0:41:30 that learns everything
0:41:30 about me
0:41:32 and says the right things
0:41:34 and will perform erotic,
0:41:36 i.e. pornographic acts
0:41:36 for me.
0:41:38 And you’re going to have,
0:41:39 essentially,
0:41:41 why wouldn’t,
0:41:43 how can these kids,
0:41:44 these kids can’t compete.
0:41:45 There’s no way
0:41:46 they can compete
0:41:47 against these things.
0:41:48 And then the thing
0:41:48 that I find
0:41:50 so upsetting about this
0:41:51 is we’ve been talking
0:41:52 a lot about synthetic
0:41:53 or character AI relationships
0:41:55 is that,
0:41:56 and I’ve been writing
0:41:57 about this,
0:41:58 sexual desire
0:41:59 from young men
0:42:00 is not a bug,
0:42:01 it’s a feature.
0:42:02 And that is,
0:42:02 I think of it
0:42:03 as like fire.
0:42:04 And that is,
0:42:05 fire can be bad.
0:42:05 You can start
0:42:06 objectifying women,
0:42:08 you can believe that,
0:42:09 you can have unreasonable
0:42:10 expectations around
0:42:11 what a relationship is,
0:42:13 you can think of women
0:42:14 in a negative light
0:42:15 from this shit.
0:42:16 That’s when fire
0:42:17 burns a forest down.
0:42:18 For the most part,
0:42:19 sexual desire
0:42:20 from young men
0:42:22 is fire that, if it’s
0:42:23 captured in an engine,
0:42:24 can fuel cylinders
0:42:25 and move progress forward.
0:42:26 How does it move
0:42:26 progress forward?
0:42:28 You’re a young man,
0:42:29 you would really like
0:42:29 to have a girlfriend
0:42:30 at some point
0:42:31 to be physical with,
0:42:32 romantic with,
0:42:33 and have sex with.
0:42:33 So,
0:42:35 you start working out,
0:42:36 you have a plan,
0:42:38 you practice kindness,
0:42:39 you show resilience,
0:42:40 you show a willingness
0:42:41 to endure rejection,
0:42:43 you learn how to be funny,
0:42:44 you learn how to open,
0:42:45 you learn how to approach,
0:42:46 you learn how to maintain
0:42:46 a rapport,
0:42:48 you learn how to have a plan
0:42:49 and how to articulate
0:42:50 your plan to people.
0:42:52 And you basically learn
0:42:53 all the features
0:42:55 of what it is to be human
0:42:56 and successful
0:42:56 in that environment
0:42:57 and in other environments.
0:42:59 And I wonder
0:43:00 how many of these kids
0:43:01 are going to take
0:43:02 these risks
0:43:03 and feel the need
0:43:04 to take these risks
0:43:05 when they have
0:43:07 the most ridiculously
0:43:08 lifelike,
0:43:08 easy,
0:43:09 frictionless
0:43:11 sexual experiences
0:43:12 available on their phone
0:43:13 and their computer
0:43:13 at home.
0:43:15 And we connected
0:43:17 that attention,
0:43:17 we connected
0:43:19 to massive shareholder value.
0:43:21 And just to,
0:43:22 I mean,
0:43:23 just sort of bring it home
0:43:24 or personalize it,
0:43:26 I barely graduated
0:43:27 from UCLA.
0:43:28 Barely.
0:43:29 Graduated a 2.27 GPA.
0:43:30 And one of the reasons
0:43:31 I graduated
0:43:33 was I loved going to class.
0:43:34 Specifically,
0:43:36 I loved going to campus.
0:43:37 Why did I love
0:43:38 going to campus?
0:43:39 All my fraternity brothers
0:43:40 were there.
0:43:42 UCLA was a ton of fun.
0:43:42 There were people
0:43:43 playing football
0:43:44 and Frisbee
0:43:44 and the quad
0:43:46 and it was something
0:43:46 out of a fucking
0:43:47 Cinemax film.
0:43:48 There were so many
0:43:49 hot women everywhere
0:43:51 and there was a non-zero,
0:43:52 granted near zero,
0:43:54 but a non-zero probability
0:43:55 that I would be able
0:43:56 to meet
0:43:58 an attractive woman,
0:43:59 say,
0:43:59 hey,
0:43:59 we’re having a party
0:44:00 back at the fraternity
0:44:00 come by
0:44:03 and lightning might strike
0:44:03 and,
0:44:03 you know,
0:44:04 at some point later
0:44:05 I might have the opportunity
0:44:06 to be physical
0:44:07 with this person.
0:44:08 That was really
0:44:10 motivating for me.
0:44:11 My first girlfriend
0:44:12 was in college
0:44:12 for a serious
0:44:13 relationship.
0:44:15 If I’d had
0:44:16 near-lifelike porn,
0:44:18 tested a million
0:44:19 times a minute
0:44:20 to tickle my sensors,
0:44:21 my,
0:44:22 you know,
0:44:23 my mental triggers
0:44:24 around what sexual
0:44:26 predilections I had,
0:44:27 I just don’t think
0:44:28 I would have gone
0:44:29 on campus as much
0:44:30 and I don’t think
0:44:31 young men are going
0:44:32 to go on campus
0:44:32 as much.
0:44:32 I don’t think
0:44:33 they’re going to
0:44:33 leave their house
0:44:34 as much.
0:44:34 I don’t think
0:44:34 they’re going to
0:44:36 want to go have
0:44:36 drinks with friends.
0:44:37 I don’t think
0:44:37 they’re going to
0:44:38 want to go
0:44:39 to football practice.
0:44:39 I don’t think
0:44:40 they’re going to
0:44:41 volunteer at non-profits
0:44:42 or go to church.
0:44:45 So I see this
0:44:45 as,
0:44:46 again,
0:44:46 another attempt.
0:44:47 We have connected
0:44:49 to shareholder value
0:44:50 things that make us
0:45:51 less mammalian,
0:44:54 that quite frankly,
0:44:55 attack our manhood,
0:44:57 attack our risk
0:44:58 aggressiveness,
0:44:59 attack what it means
0:45:00 to experience
0:45:01 real victory
0:45:02 because the thing
0:45:03 I hate about
0:45:04 these fucking
0:45:05 synthetic relationships
0:45:06 is they give
0:45:07 people the sense
0:45:09 that relationships
0:45:09 are supposed to be
0:45:10 easy.
0:45:10 They’re not.
0:45:11 The most wonderful
0:45:11 things,
0:45:12 you’ll see this,
0:45:13 the most wonderful
0:45:14 thing in your life
0:45:16 will be forming
0:45:18 a romantic partnership
0:45:18 with someone,
0:45:19 figuring out how
0:45:20 to develop economic
0:45:21 security such that
0:45:22 you can have kids
0:45:23 and then raising
0:45:24 those children.
0:45:24 And the only thing
0:45:25 I can guarantee you
0:45:26 about all that shit
0:45:27 is it’s really
0:45:28 fucking hard.
0:45:30 And if you don’t
0:45:31 develop the skills
0:45:32 to navigate a
0:45:32 partnership,
0:45:33 navigate romantic
0:45:34 interests,
0:45:35 navigate raising
0:45:35 kids,
0:45:37 navigate the
0:45:37 corporate environment
0:45:39 because you become
0:45:40 used to synthetic
0:45:41 relationships that are
0:45:42 just so fucking
0:45:43 vanilla and easy
0:45:43 and always tell you
0:45:44 you’re great and
0:45:45 laugh at every joke
0:45:45 you make,
0:45:46 you’re never going
0:45:47 to develop those
0:45:48 skills and you’re
0:45:48 going to wake up.
0:45:49 I think we’re going
0:45:50 to just raise a
0:45:51 generation of young
0:45:51 people who wake up
0:45:52 and are like,
0:45:53 I have no ability
0:45:54 to deal with other
0:45:54 people.
0:45:55 I don’t know what
0:45:56 real victory feels
0:45:58 like and I’m anxious,
0:45:59 obese and depressed.
0:46:00 I don’t think many
0:46:01 people would disagree
0:46:02 with anything that
0:46:03 you just said.
0:46:04 And the trouble is
0:46:05 there is so much
0:46:08 money in this.
0:46:09 There is so much
0:46:10 money in children
0:46:12 and advertising to
0:46:12 children and
0:46:13 entertaining children
0:46:14 and keeping them
0:46:14 glued to their
0:46:15 devices.
0:46:16 And there’s also so
0:46:17 much money in
0:46:18 porn.
0:46:19 Porn makes up
0:46:22 25% of global
0:46:23 internet traffic.
0:46:24 People spend
0:46:27 $3,000 on porn
0:46:28 every second.
0:46:29 So this is an
0:46:31 extremely profitable
0:46:33 business, the
0:46:34 business of porn.
0:46:36 And it is
0:46:37 therefore kind of a
0:46:37 question.
0:46:38 I mean, the
0:46:39 markets are going to
0:46:40 roll on and they
0:46:41 are going to go full
0:46:43 steam ahead on
0:46:45 porn and more
0:46:46 specifically AI porn
0:46:47 because AI porn is a
0:46:48 more cost-effective
0:46:49 way of delivering porn
0:46:50 to people.
0:46:51 So it’s a choice
0:46:53 for these CEOs and
0:46:55 for these AI leaders.
0:46:57 And it’s a choice for
0:46:57 Sam Altman.
0:46:59 And I want to point
0:47:02 you to a quote that
0:47:03 he said on a
0:47:05 podcast, I believe this
0:47:06 was a few months ago,
0:47:07 maybe a year ago,
0:47:09 where he addressed
0:47:10 this choice.
0:47:11 There’s a lot of
0:47:11 short-term stuff we
0:47:12 could do that would
0:47:15 like really like juice
0:47:15 growth or revenue or
0:47:17 whatever and be very
0:47:17 misaligned with that
0:47:18 long-term goal.
0:47:20 And I’m proud of the
0:47:21 company and how little
0:47:22 we get distracted by
0:47:23 that, but sometimes we
0:47:23 do get tempted.
0:47:25 Are there specific
0:47:26 examples that come to
0:47:26 mind?
0:47:27 Any like decisions
0:47:27 that you’ve made?
0:47:32 Well, we haven’t put a
0:47:33 sex bot avatar in
0:47:34 ChatGPT yet.
0:47:36 That does seem like it
0:47:38 would get time spent.
0:47:39 Apparently it does.
0:47:40 And now they do.
0:47:41 Yeah, I mean, I hear
0:47:42 that and I kind of
0:47:43 have just one general
0:47:43 feeling.
0:47:45 Fuck you.
0:47:48 You and your hush
0:47:49 tones and your faux
0:47:49 concern.
0:47:51 If we’re waiting on
0:47:52 the better angels of
0:47:53 Sam Altman or Satya
0:47:55 Nadella or Tim Cook
0:47:57 or Mark Zuckerberg or
0:47:58 Elon Musk to show
0:47:59 up, don’t hold your
0:48:00 breath.
0:48:01 Stop being such
0:48:01 fucking idiots.
0:48:03 There have to be
0:48:04 laws.
0:48:05 We live in a
0:48:07 capitalist society where
0:48:08 your power, your
0:48:09 selection set of mates,
0:48:10 your influence, how
0:48:11 much people laugh at
0:48:12 your jokes, your
0:48:13 ability to take care of
0:48:14 your children is all
0:48:16 correlated to wealth.
0:48:17 So what we know with
0:48:20 100% certainty or 99.9%
0:48:22 certainty is the CEOs of
0:48:23 these companies will make
0:48:24 decisions.
0:48:25 They will rationalize
0:48:26 decisions incrementally
0:48:27 regardless of how many
0:48:28 teens start cutting
0:48:30 themselves or how it
0:48:31 separates them from their
0:48:32 parents and key
0:48:33 relationships.
0:48:34 So to a certain extent it’s
0:48:36 not Sam Altman’s fault.
0:48:36 It’s ours.
0:48:38 It’s pretty simple.
0:48:42 Age-gate social media, no
0:48:43 phones in schools, no
0:48:45 synthetic relationships under
0:48:46 the age of 18, no
0:48:47 pornographic material for
0:48:49 kids under the age of 18,
0:48:50 removal of Section 230
0:48:52 protections for
0:48:53 algorithmically elevated
0:48:54 content, and start
0:48:55 fucking fining these
0:48:56 companies a percentage of
0:48:57 their revenues, not a
0:48:59 parking ticket, and here’s
0:48:59 an idea:
0:49:01 someone does a perp
0:49:03 walk, but this notion
0:49:06 that we keep hoping that
0:49:07 these guys are going to
0:49:08 get it and their better
0:49:09 angels are going to show
0:49:10 up, come on.
0:49:13 Sam’s doing what he’s
0:49:14 supposed to be doing.
0:49:16 He is adding a crazy
0:49:16 amount of shareholder
0:49:18 value to justify the
0:49:19 $500 billion valuation he
0:49:20 just raised money at.
0:49:21 He’s got to figure out a
0:49:23 way to create the GDP of
0:49:24 Finland in the next two
0:49:27 years, and if it means
0:49:28 increasing attention by
0:49:28 10, 20, 30 percent among
0:49:30 young people under the
0:49:32 age of 25, with
0:49:34 lifelike porn, he’s
0:49:36 going to rationalize why
0:49:37 they should do it, and
0:49:38 he’ll put in place some
0:49:39 faux controls that kids
0:49:40 can get around.
0:49:42 Folks, this is up to
0:49:44 us, and the illusion of
0:49:46 complexity that gets
0:49:47 weaponized here in these
0:49:48 false or hollow
0:49:49 arguments is just
0:49:49 that.
0:49:50 They’re false
0:49:51 misdirects.
0:49:52 Do you realize how
0:49:53 much trouble a bar gets
0:49:54 in if they let in kids
0:49:55 under the age of 21?
0:49:56 They get in real
0:49:57 trouble.
0:49:58 You can lose your
0:49:59 liquor license.
0:50:02 So why would it be any
0:50:02 different here?
0:50:04 And the reason
0:50:05 it’s different here is
0:50:06 because we have become so
0:50:08 co-opted by money that if
0:50:09 a company can add tens,
0:50:10 hundreds of billions of
0:50:11 dollars in shareholder
0:50:13 value, well, child safety
0:50:15 kind of takes a backseat.
0:50:17 We’re going to nod our
0:50:18 head, and Sam’s going to
0:50:20 talk in hushed tones
0:50:21 about, you know, we don’t
0:50:22 do these things because
0:50:23 we’re more concerned with
0:50:24 the commonwealth than
0:50:24 shareholders.
0:50:25 No.
0:50:26 Sam is going to continue
0:50:28 to do whatever will
0:50:29 increase shareholder value,
0:50:32 by a dollar every day.
0:50:33 That is his first, his
0:50:34 second, and his third
0:50:34 priority.
0:50:35 And quite frankly, that’s
0:50:36 his job.
0:50:38 It’s our job to elect
0:50:39 people who say, okay,
0:50:41 our kids should not be
0:50:41 engaging in these
0:50:42 relationships.
0:50:43 Our kids should not be
0:50:44 consuming content that
0:50:46 results in a 60% increase
0:50:47 in self-harm.
0:50:49 And we have to have elected
0:50:50 leaders that aren’t total
0:50:52 whores, and by the way, this is on
0:50:54 both sides of the aisle,
0:50:56 hands down, who have
0:50:57 thoughtful questions about
0:50:59 privacy and these issues and
0:51:00 want to hear more about it
0:51:02 because they just got money
0:51:05 from, you know, from
0:51:06 Meta or from Alphabet or
0:51:08 for whatever PAC is the, is
0:51:09 the false front for these
0:51:11 things or the veneer, and
0:51:12 that we don’t have a
0:51:13 president who basically
0:51:15 wants to hang out with
0:51:15 these guys.
0:51:17 So this is, you know,
0:51:20 this is really, I have, I
0:51:21 mean, I’m always freaked
0:51:22 out and I catastrophize
0:51:23 because I’m angry and
0:51:24 depressed, but I think
0:51:24 there is a legitimate
0:51:26 reason to worry about the
0:51:28 collision of adult
0:51:29 content and synthetic
0:51:30 relationships and
0:51:31 struggling young men.
0:51:32 I think that is a
0:51:33 fucking disaster waiting
0:51:34 to happen.
0:51:34 I think you’re going to
0:51:36 find, we talk about
0:51:37 these, I think they’re
0:51:38 called NEETs, not
0:51:39 employed.
0:51:39 Yeah, not employed,
0:51:40 enrolled, or in training.
0:51:42 Yeah, nothing, right?
0:51:43 I think that could
0:51:44 quadruple in the next
0:51:45 five or eight years
0:51:47 because it’s like, well,
0:51:48 if my parents will let
0:51:48 me live in the
0:51:50 basement and I can find
0:51:51 enough cheap calories to
0:51:53 survive, why wouldn’t I
0:51:54 stay at home and have
0:51:56 these interesting, exotic,
0:51:58 erotic, pornographic,
0:51:59 intellectually somewhat
0:52:00 stimulating relationships
0:52:02 with friends, mentors, and
0:52:03 girlfriends on an
0:52:04 algorithm and a
0:52:05 screen, and by the
0:52:06 time I emerge from my
0:52:07 fucking cave, I’m
0:52:08 going to have no
0:52:09 skills at all, none
0:52:11 whatsoever, right?
0:52:12 It’s almost like these,
0:52:14 they call them
0:52:15 sexpats, these guys
0:52:16 who’ve just given up on
0:52:17 their local society and
0:52:18 they moved to Thailand or
0:52:20 some other place where
0:52:21 they can basically have a
0:52:22 relationship for a much
0:52:23 lower cost, much lower
0:52:24 effort, much more
0:52:24 frictionless.
0:52:26 Take that times a
0:52:28 hundred, times a
0:52:29 hundred, and it’s not going
0:52:31 to involve any
0:52:31 humans at all.
0:52:34 But this is, and the
0:52:35 thing that pisses me off so
0:52:36 much about this is that we
0:52:37 don’t want to admit and
0:52:38 acknowledge the
0:52:39 solutions are a lot
0:52:41 simpler than these guys
0:52:42 would have you believe.
0:52:43 Just to go over some of
0:52:44 the solutions that other
0:52:45 countries have had,
0:52:46 Norway, they just
0:52:48 implemented a complete
0:52:49 ban on social media for
0:52:50 people under 13, the
0:52:51 plan is to raise it to
0:52:52 15, and Australia just
0:52:52 passed a law which is
0:52:54 going to ban social media
0:52:56 for children under 16, and
0:52:57 that’s going to be going
0:52:58 into effect later this
0:52:58 year.
0:53:00 But I think the point
0:53:01 being, like, as you
0:53:04 always say, money wins,
0:53:05 and money will always
0:53:05 win.
0:53:07 And we can’t keep
0:53:09 expecting and hoping that
0:53:10 people and tech leaders
0:53:11 and business leaders are
0:53:11 going to regulate
0:53:12 themselves.
0:53:13 They might talk about it
0:53:15 for a time, but as we
0:53:17 see here, no, they’re
0:53:17 never going to regulate
0:53:18 themselves.
0:53:19 It’s the government’s job
0:53:21 to do the regulation.
0:53:22 It’s the government’s job
0:53:24 to figure out what the
0:53:25 rules of the road are.
0:53:26 They’re supposed to be
0:53:26 the referee.
0:53:28 They figure out what the
0:53:28 boundaries are.
0:53:29 And so you need the
0:53:30 government to set the
0:53:32 rules here and say what
0:53:33 is okay and what isn’t
0:53:34 okay, make it punishable
0:53:35 by law, and then let
0:53:36 capitalism do its
0:53:37 thing.
0:53:38 I mean, we talk about
0:53:38 this a lot.
0:53:39 We like the
0:53:40 competition, but this
0:53:41 expectation that these
0:53:43 tech leaders are going to
0:53:45 have sort of the moral
0:53:46 bandwidth to regulate
0:53:47 themselves and to do it
0:53:49 in a thoughtful way, it’s
0:53:50 just never going to
0:53:50 happen.
0:53:51 As you say, it’s not
0:53:52 their job.
0:53:56 We’ll be right back.
0:53:58 For even more markets
0:53:59 content, sign up for our
0:54:00 newsletter at
0:54:02 ProfGMarkets.com slash
0:54:03 subscribe.
0:54:14 Scott, we’re hitting the
0:54:16 road, bringing Pivot Live to
0:54:17 the people.
0:54:19 Seven cities, Toronto, Boston,
0:54:21 New York, D.C., Chicago, San
0:54:23 Francisco, and L.A., of course.
0:54:25 You went to Oasis, you
0:54:27 went to Beyoncé, you saw the
0:55:28 remake of Wizard of Oz at
0:55:29 the Sphere.
0:54:32 All those suck compared to
0:54:32 the Pivot Tour.
0:54:35 This is the biggest tour.
0:54:37 Same people that are
0:54:38 organizing our tour that
0:54:40 organized Taylor Swift’s
0:54:40 tour.
0:54:41 They are much more excited
0:54:43 about our tour.
0:54:44 All right, that’s enough,
0:54:44 Grandpa.
0:54:46 It’s going to be so good, and
0:54:48 we’re bringing our brand of
0:54:50 whatever we do to the people,
0:54:51 and we’re excited to meet our
0:54:51 fans.
0:54:52 We love our fans.
0:54:53 For tickets, head to
0:54:55 PivotTour.com.
0:54:56 See you there.
0:55:00 Megan Rapinoe here.
0:55:02 This week on A Touch More,
0:55:04 we’ve got WNBA champion
0:55:07 Jackie Young, a.k.a.
0:55:09 IA Jack, on the show.
0:55:10 We’re so excited.
0:55:12 We’ll find out how she’s been
0:55:13 celebrating her third
0:55:15 championship, how the Aces
0:55:16 turned their season around,
0:55:18 and whether they’re the
0:55:19 greatest dynasty ever.
0:55:21 Plus, we’re handing out the
0:55:23 most prestigious awards of the
0:55:25 WNBA season, the Meggies.
0:55:27 From best dressed to gayest
0:55:28 moments of the season, you do
0:55:29 not want to miss it.
0:55:31 Check out the latest episode of
0:55:32 A Touch More wherever you get
0:55:33 your podcasts and on YouTube.
0:55:40 Ever gone on vacation outside the country
0:55:42 and felt like the food was just
0:55:43 better?
0:55:45 When I was in Japan recently,
0:55:47 the produce and meat were amazing,
0:55:49 and even a minor skin issue that I
0:55:51 usually have cleared up by the end
0:55:51 of the two weeks.
0:55:53 What’s up with what we eat here?
0:55:56 Tomatoes are renowned for looking
0:55:58 delicious, but not tasting delicious.
0:56:00 Is some food actually better in other
0:56:02 countries, and what are they doing
0:56:03 differently?
0:56:05 A law passed in 1993
0:56:08 that said, if you’re buying bread
0:56:11 in a boulangerie, it must be made
0:56:13 with four ingredients, essentially.
0:56:15 Find out more this week on
0:56:16 Explain It To Me.
0:56:18 New episodes every Sunday,
0:56:19 wherever you get your podcasts.
0:56:34 We’re back with Prof G Markets.
0:56:37 Back in July, we said SpaceX was the
0:56:38 most important monopoly that no one
0:56:39 is talking about.
0:56:41 We argued that it owns the space
0:56:42 economy and that that’s what makes it
0:56:43 such a powerful investment.
0:56:47 But that monopoly might be under
0:56:48 threat because there are two companies
0:56:51 that are looking to shake up this
0:56:51 space race.
0:56:55 AST Space Mobile is building a space-based
0:56:57 cellular broadband network, and
0:56:59 Rocket Lab is working to offer
0:57:01 reliable launches for hundreds of
0:57:03 small satellites, and those stocks
0:57:07 are up 230% and 500%
0:57:10 respectively in the past year.
0:57:12 That is really why we’re paying
0:57:14 attention to these companies.
0:57:15 The stocks are absolutely
0:57:17 ripping right now, and the idea
0:57:20 could be that they are going to
0:57:22 compete with SpaceX, perhaps one day
0:57:24 dethrone SpaceX.
0:57:26 They’re certainly a way off from that
0:57:26 at the moment.
0:57:28 But Scott, your reactions to the
0:57:30 absolute explosions in the stocks
0:57:33 of these two fledgling space
0:57:33 companies.
0:57:35 I think it’s really exciting because I
0:57:37 think it’ll get everyone’s greed
0:57:40 glands going and people will go into
0:57:40 the space.
0:57:42 And we said earlier in the year that the
0:57:44 most powerful and possibly dangerous
0:57:48 monopoly wasn’t even, you know, YouTube,
0:57:59 Amazon at 50% of e-commerce, Meta at
0:58:02 two-thirds of all social media, and
0:58:04 Alphabet with 90-plus percent of search.
0:58:08 The fact that Elon Musk or SpaceX is
0:58:10 responsible, I think, for about two-thirds of
0:58:13 all launches, I mean, the way to think of
0:58:15 it is we’re one of nine
0:58:17 unremarkable rocks with a little bit of
0:58:19 moisture and gas, and one of them
0:58:22 appears to sustain life as far as we know.
0:58:24 It’s one of 100,000 galaxies in a million
0:58:25 different universes.
0:58:28 I mean, the potential of space is pretty
0:58:29 striking, right?
0:58:32 And one company appears to be, or was
0:58:34 developing a monopoly on, well, there’s
0:58:36 Earth and there’s everything else.
0:58:38 And it’s like, well, okay, if you have a
0:58:41 monopoly on everything else, there might
0:58:43 be incredible upside there, but also do
0:58:45 you really want, you want a lot of
0:58:45 competition here.
0:58:49 So, I think it’s super exciting.
0:58:52 Right now, of the 10,200 active
0:58:54 satellites, 8,500 are Starlink.
0:58:56 By the end of 2025, 6 million subscribers
0:58:59 and 62% of global satellite broadband
0:59:02 revenue is going to go to Starlink.
0:59:05 And there are three pure-play public space
0:59:06 companies with market caps larger than
0:59:07 10 billion.
0:59:10 Rocket Lab almost tripled last
0:59:14 year, up 178%, and it’s up six-fold, 500%,
0:59:15 over the past year.
0:59:17 EchoStar, a similar satellite communications
0:59:19 company, up 231% this year.
0:59:23 AST Space Mobile, a company building a
0:59:25 Starlink-like satellite broadband network, is
0:59:30 up 356% year to date, up over 2,500% in
0:59:30 the last two years.
0:59:33 Stock’s up 26-fold in the last two years.
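[Editor's note: the figures above mix percentage gains and multiples ("2,500%" and "26-fold" are the same move). A quick back-of-envelope sketch of the conversion, using the percentages quoted in this segment; the function name is ours:]

```python
def gain_to_multiple(pct_gain: float) -> float:
    """Convert a percentage gain to a price multiple: a +2,500% gain means the stock is at 26x."""
    return 1 + pct_gain / 100

# Figures quoted above
print(gain_to_multiple(2500))           # AST SpaceMobile, two years: 26.0 (26-fold)
print(gain_to_multiple(500))            # Rocket Lab, past year: 6.0 (six-fold)
print(round(gain_to_multiple(178), 2))  # Rocket Lab, prior year: 2.78 (almost tripled)
```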
0:59:36 They only have six satellites in orbit, but
0:59:37 they’re aiming to get to 45 to 60.
0:59:41 There’s Blue Origin and Kuiper.
0:59:43 What would be just super interesting, and
0:59:45 Mia did this research, is just as we were
0:59:48 excited about GLP-1, you always feel like
0:59:48 we’re late.
0:59:49 No.
0:59:50 You don’t feel like we’re late.
0:59:51 So, the question is, how do we go further
0:59:55 downstream and find the suppliers?
0:59:57 These guys are going to raise so much
0:59:58 capital.
0:59:58 They’re not dumb.
1:00:01 They realize that these stocks have gone
1:00:02 crazy, so they’re going to issue a ton of
1:00:03 stock, and they’re going to start buying.
1:00:07 What I want to know is, what are the O-rings
1:00:11 or the type of metal or the type of plastics,
1:00:15 milled products, specialty components that go
1:00:16 into these things?
1:00:19 And are any of them publicly traded, or could
1:00:20 we buy any of them?
1:00:23 Because the amount of CapEx that’s about to
1:00:27 go into space, it’s not
1:00:30 going to rival what’s going into AI, but it
1:00:32 strikes me that this is going to be, just as
1:00:34 we’ve been tracking AI and talking about the
1:00:35 extraordinary amounts of money, I wonder if
1:00:38 we’ll be talking about space and launch
1:00:42 capability with the same type of valuations and
1:00:44 growth over the next couple of years that we’ve
1:00:46 been talking about AI in the last 24 months.
1:00:50 Let’s go over some of the numbers, like AST Space
1:00:53 Mobile, as you say, up 2,500% in the past two years.
1:00:54 It’s just unbelievable.
1:00:58 You look at what they’ve done, they’ve got six
1:00:59 satellites in orbit.
1:01:03 They’re planning to launch another 45 satellites into
1:01:05 orbit by the end of next year.
1:01:10 So they’re getting there, building out the
1:01:11 satellite network.
1:01:15 But remember, you’ve got to keep in mind, 8,500
1:01:17 satellites out there are Starlink.
1:01:22 So, you know, they’re beginning to compete, but not
1:01:22 really.
1:01:26 Then there’s Rocket Lab, which Morgan Stanley
1:01:27 calls the alternative to SpaceX.
1:01:30 They’re not really building a network, but they’re
1:01:33 launching stuff into space.
1:01:37 And, you know, they are becoming sort of the
1:01:40 second alternative to SpaceX if you want to launch
1:01:41 payloads into space.
1:01:43 You might go with Rocket Lab.
1:01:45 They have this Rocket Lab Neutron rocket.
1:01:48 That’s their flagship rocket, which is sort of trying
1:01:50 to compete with the Falcon Heavy.
1:01:53 So that’s another option.
1:01:57 But something I have been thinking about, you know,
1:02:00 this stock performance is absolutely crazy.
1:02:04 I mean, 2,500% in the past two years, Rocket Lab up
1:02:05 600%.
1:02:08 Like, this is crazy town.
1:02:11 And you look at, like, the valuations.
1:02:15 I mean, you’ve got Space Mobile that is trading at 500
1:02:18 times expected sales for 2025.
1:02:22 So these aren’t fully rational valuations.
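[Editor's note: a minimal sketch of what "X times expected sales" means, for context. The 500x figure is the one quoted above; the dollar amounts are hypothetical placeholders for illustration, not AST SpaceMobile's actual financials:]

```python
def forward_price_to_sales(market_cap: float, expected_sales: float) -> float:
    """Forward price-to-sales multiple: market cap divided by expected annual revenue."""
    return market_cap / expected_sales

# Hypothetical illustration: a $50B market cap against $100M of expected
# 2025 revenue would imply the 500x forward sales multiple mentioned above.
print(forward_price_to_sales(50e9, 100e6))  # -> 500.0
```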
1:02:26 And it does remind me of this dynamic that we’ve discussed
1:02:28 about private versus public.
1:02:32 And that is, there’s only one space company that people are
1:02:34 really excited about, and it’s SpaceX.
1:02:37 And you can’t invest in it because it’s a private company,
1:02:39 unless you are an accredited investor, unless you know someone
1:02:40 who can get you some shares.
1:02:43 But when you look at the stock performance of these companies,
1:02:47 it does feel as though those are just the names that are
1:02:48 associated with space.
1:02:52 And people who are investing in the public markets just want to
1:02:54 get some sort of exposure to space.
1:02:57 So I do think that these are the kinds of stocks that you want
1:02:58 to be really wary of.
1:03:02 Like, this kind of explosion is not really tied to the fundamentals.
1:03:05 To me, it’s more of a momentum thing where there’s so much
1:03:06 demand for space.
1:03:08 And we all agree, space is a big deal.
1:03:09 It’s going to be important.
1:03:14 But the one company that you’d want to invest in, you can’t invest in.
1:03:19 So you go to these competitors, which, you know, are making some progress.
1:03:22 But let’s be real, they don’t hold a candle to SpaceX.
1:03:24 Mia pulled together some research.
1:03:26 There’s some of the component suppliers for satellites.
1:03:30 Honeywell, they make antennas, high-speed radios, data transmitters, receivers
1:03:33 that allow satellites to send and receive information.
1:03:37 Their clients include Boeing, Airbus, Lockheed Martin, and SpaceX.
1:03:44 It also holds a majority stake in quantum computing firm Quantinuum, which recently raised about
1:03:48 $600 million at a $10 billion valuation from investors, including NVIDIA.
1:03:51 And their backlog grew 10% year-on-year.
1:03:53 It’s down 10% year-to-date.
1:03:56 Trades at three times sales, roughly in line
1:04:01 So there’s a company that appears to be in the space, but hasn’t had the same sort of updraft.
1:04:05 Universal Microwave Technology, a Taiwanese firm that manufactures super high-frequency
1:04:11 electronics that handle the radio waves satellites use, and their components are used in satellites
1:04:13 themselves and also in the ground receivers.
1:04:14 It’s up 39%.
1:04:20 But to your point, space is obviously a really risky business.
1:04:24 Fewer than one in four venture-backed companies even make it to orbit with a vehicle.
1:04:29 I had dinner with a friend of mine who works at a large PE firm, and he said the cheapest
1:04:33 way to invest in SpaceX right now is through EchoStar.
1:04:41 I think EchoStar either owns or has a stake in SpaceX, but he described it.
1:04:48 He basically said that EchoStar, it’s not cheap though, it’s tripled in the last year,
1:04:52 is the cheapest way to own SpaceX.
1:04:54 That sort of describes the dynamic, right?
1:04:59 It’s like everyone wants to get to SpaceX, so they take these interesting diversions to get there.
1:04:59 I love this.
1:05:01 I just love seeing competition.
1:05:07 And the fact that these stocks, what happens is when these stocks go up 25-fold, you’re just going
1:05:13 to see a ton of venture activity in the space, a ton of human capital go in, and people trying to
1:05:17 figure out how to get shit into space and attract capital.
1:05:26 If any of these guys becomes somewhat formidable, gets within striking distance of SpaceX, that’s what probably
1:05:30 would motivate SpaceX to go public so they can run away with it on a capital basis.
1:05:32 But I think it’s super exciting.
1:05:36 And I think, and we were talking about this in the editorial call, I think we should start
1:05:39 following space kind of the same way we have been following AI.
1:05:41 Okay, let’s take a look at the week ahead.
1:05:44 We will see the consumer price index for September, despite the government shutdown.
1:05:48 We’ll also see earnings from Netflix.
1:05:53 We’ll see earnings from Tesla, from P&G, Coca-Cola, and Intel.
1:05:55 Any predictions, Scott?
1:05:58 Really interested in this Netflix deal.
1:05:59 And I know you talked about this.
1:06:00 We talked about it last night.
1:06:05 But basically, Netflix is partnering with Spotify and has essentially said, okay, let’s be honest.
1:06:09 Disney Plus and Hulu, that’s not our competition.
1:06:11 Our competition is YouTube.
1:06:16 And they’ve done a deal with Spotify for some of their original content.
1:06:20 I think it’s The Ringer, Bill Simmons, and some crime drama stuff.
1:06:23 And we’re going to run it on Netflix.
1:06:28 And I think that what you’re going to see in the next 12 months, I think you’re going to
1:06:35 see a lot of podcasts or what started as podcasts running, not only in streaming media, I think
1:06:40 the real home for them or the more opportune home is on cable networks.
1:06:40 Why?
1:06:46 If you think about cable networks, they’re actually still, if you didn’t know what amazing businesses
1:06:51 they were 10 years ago, and you just looked at them today, they still look like good businesses.
1:06:55 They’re declining, but they still spin off a lot of cash flow.
1:07:01 The problem with these businesses is not only on the revenue side, it’s that the expenses haven’t moved.
1:07:03 The expenses haven’t come down.
1:07:08 They dramatically need to decrease the costs of these shows.
1:07:10 And podcasts do that.
1:07:16 I mean, essentially, 25% of our listens are on a TV, streamed through YouTube.
1:07:21 But I can guarantee you, we cost a lot less than your traditional whatever.
1:07:25 If someone watches an hour of this on their TV right now off of YouTube, and then they flip
1:07:33 over and watch an hour of Wednesday or Breaking Bad on AMC or name your hour-long program, you
1:07:35 can bet this costs a lot less.
1:07:42 It won’t get nearly the viewership, but it’ll get more people in the core demo, and it’ll
1:07:46 be on an operating margin or profitability basis much more profitable.
1:07:52 So anyway, long-winded way of saying a dozen to two dozen of the top 50, maybe top 100 podcasts
1:07:57 are going to be running on cable channels and across streaming media.
1:08:03 This episode was produced by Claire Miller and engineered by Benjamin Spencer.
1:08:04 Our associate producer is Alison Weiss.
1:08:06 Mia Silveri is our research lead.
1:08:10 Our research associates are Isabella Kinsel, Dan Chalon, and Kristen O’Donoghue.
1:08:14 Drew Burrows is our technical director, and Catherine Dillon is our executive producer.
1:08:16 Thank you for listening to Prof G Markets from Prof G Media.
1:08:20 Tune in tomorrow for a fresh take on markets.
0:02:10 You know, I was annoyed that you’re 20 minutes late, but you’ve suddenly just papered over that, and now I’m happy.
0:02:11 I’ve redeemed myself.
0:02:11 I was upstairs.
0:02:13 Now I’m happy.
0:02:17 I was upstairs with the dogs, and then finally Drew called, and I have no sense of the calendar.
0:02:19 I apologize for being late.
0:02:24 I was either going to go with that one, or I had a joke on Pippa, which I really like, and that is the—
0:02:25 Do you ever see the movie The Exorcist?
0:02:26 No.
0:02:27 Oh, my God.
0:02:30 It’s an incredible horror film.
0:02:39 I made the mistake of sneaking in to see it when I was 13 with my friend Adam Markman, and when I got up in the morning, I would have to go into the corner and put my socks on in the corner.
0:02:43 I was so frightened of a demon being possessing me.
0:02:57 It’s about a girl, Linda Blair, who gets possessed by the devil, and about all these attempts to exercise, and it’s called The Exorcist because this priest, played by Max von Sydow, and I forget the other guy I played,
0:03:02 coming in to try and, you know, the devil from this poor little girl.
0:03:04 Ellen Burstyn plays the mother way before your time.
0:03:14 Anyways, there’s a rumor that they’re planning a sequel to The Exorcist, but this time it’s about getting the priest out of the little boy.
0:03:24 Oh, how are you, Ed?
0:03:27 What’s going on with NVIDIA?
0:03:29 What’s going on with NVIDIA?
0:03:32 Look at Ed.
0:03:33 Ed loves that joke.
0:03:36 It’s that Princeton pedo humor.
0:03:49 The fact that every joke has to be punctuated with a groan is another piece of this podcast I’ll never quite wrap my head around.
0:03:50 That’s right.
0:03:51 That’s right.
0:03:54 Guess who I’m having dinner with tonight?
0:03:55 I’m totally name dropping.
0:03:59 I probably shouldn’t say that in the same sentence as a pedo joke.
0:04:01 I’m having dinner with the chancellor of UCLA.
0:04:03 To discuss your vocational programming?
0:04:05 Well, Ed, I’m glad you asked.
0:04:07 You know, I don’t know.
0:04:10 I think it’s to discuss how I can give more money.
0:04:11 Probably that, yeah.
0:04:15 The reward you’re giving money to a university is you get called and they want your insight into academia.
0:04:20 And they pretend to care for about 10 minutes.
0:04:21 And then comes this opportunity.
0:04:23 It was presented as an opportunity.
0:04:27 And to match some other gift or do something.
0:04:31 But, yeah, I have something called, let’s talk about the accelerator program.
0:04:32 So I’m a big fan.
0:04:35 Public school trade changed my life.
0:04:36 This isn’t philanthropy.
0:04:38 This is an overdue payback.
0:04:43 And I give money to public universities and I always say, find me the program no one else will give money to.
0:04:46 So I don’t want sexy shit.
0:04:52 I don’t want an engineering school or a school of international relations or a gym or none of that bullshit.
0:04:59 So at UW-Madison, I’m the second donor to a wonderful program where the professors go to local penitentiaries.
0:05:06 And if you’re near your release or like less than five years from your release, you can take courses to work towards a BA.
0:05:07 That’s cool.
0:05:09 And at UCLA, I’m doing adult education.
0:05:12 Adult education, it’s like vocational programming.
0:05:13 No one wants to fund it.
0:05:16 And I love it because it’s very low cost.
0:05:17 No admissions policy.
0:05:24 And I’m doing, I love that this is the, we should call this, this section of the program, Scott Vergey-Signaling.
0:05:25 I did-
0:05:26 Prof G. Philanthropy.
0:05:27 University of Georgia.
0:05:30 I’m doing tenant rights at Berkeley.
0:05:34 I did scholarships for children of immigrants as I am one of them.
0:05:36 The big question is, do you get your name on any of these things?
0:05:38 No, they’ve asked.
0:05:42 They wanted to name one of the programs and I don’t want to be, have my name on anything.
0:05:44 Because you don’t want anyone to know that you’re doing this.
0:05:46 Well, yeah, I’m so shy.
0:05:48 You know me.
0:05:48 I’m very subtle.
0:05:50 I don’t like to do that.
0:05:52 I don’t like to talk about this sort of thing.
0:05:54 No, I’m being very honest now.
0:05:57 They wanted to call this the, whatever, the Galway program.
0:06:02 And I’m like, no, because eventually they’re going to decide that my jokes or something I did was totally inappropriate.
0:06:08 And my kids are going to have to fight to keep my name on this thing as I’m 90 and reheating soup.
0:06:11 And everyone decides I’m a, you know, name your favorite ist.
0:06:17 If they can come up with reasons why they need to tear down Winston Churchill’s statue in London,
0:06:22 they can figure out a million things why they’re going to need to take my name off a program.
0:06:23 So I’m like, no, I don’t.
0:06:26 It’s going to be a giant compilation of every intro to Prof G. Markets.
0:06:27 That’ll be your downfall.
0:06:28 Yeah, that’s it.
0:06:30 They’ll be like, oh, my God, I can’t believe you said that.
0:06:31 You know, national.
0:06:34 Anyway, so, no, I don’t have my name on.
0:06:36 I’m not going to put my name on any of this shit.
0:06:36 Fair enough.
0:06:39 Well, I’m glad that we talked about it and people know anyway.
0:06:40 That’s the most important thing, right?
0:06:41 There you go.
0:06:42 There you go.
0:06:47 Now is the time to cry.
0:06:51 I hope you have plenty of the well at all.
0:06:56 Tech companies are racing to secure data centers to handle the surge in AI computing.
0:07:03 We’ve discussed this trend at length on the show, but more data centers means more demand
0:07:04 for energy.
0:07:10 In fact, S&P Global projects grid demand from data centers will rise 22% in 2025, and it will
0:07:12 nearly triple by 2030.
0:07:18 But the part that is talked about less is who is footing the bill.
0:07:20 Right now, it looks like consumers might be paying.
0:07:28 Bloomberg found electricity costs near major data center hubs are up as much as 267% compared
0:07:30 to five years ago.
0:07:33 So, Scott, everyone’s very excited about data centers.
0:07:35 Everyone’s very excited about chips.
0:07:38 We’ve seen a ton of announcements about these new data centers.
0:07:43 Open AI building trillions of dollars worth of data center capacity.
0:07:46 They want to build 250 gigawatts of data center capacity.
0:07:50 But the part that’s getting, again, less attention is the energy side.
0:07:52 Two big questions for me.
0:07:55 One, how are we actually going to power these things?
0:07:59 And two, if we can power them, what is it going to do to the cost of energy?
0:08:05 Because as that Bloomberg article points out, in areas where data centers have been built,
0:08:08 energy costs have more than tripled in just five years.
0:08:12 And meanwhile, the plan is to build more data centers.
0:08:13 So, a lot to get into here.
0:08:17 I will start or I will pause there and get your reactions.
0:08:24 The greatest arbitrage in history is the arbitrage of fossil fuels and, in general, energy.
0:08:30 The more energy you produce, I mean, there’s just, energy is like broadband.
0:08:33 No matter how much of you have of it, people will find a use for it.
0:08:39 Now, that’s not to say the prices don’t fluctuate, but the spike in this incremental demand and
0:08:42 the kind of price discovery happens at the frontier, right?
0:08:47 So, even though I think it’s increased demand consumption by 6% or 7%, that’s enough to
0:08:50 take prices up 25%, 50% or 100%.
0:08:52 So, what do we do about that?
0:08:58 Do we end up doing what we do with a lot of places and that is invest overseas?
0:09:03 Should we be building the data centers in the Gulf?
0:09:08 And the reason why that probably will happen or that might happen is that for the last 20
0:09:14 years, the dynamic in terms of capital flows have been that a bunch of institutional money
0:09:23 managers fly to Riyadh, Dubai or Doha or Abu Dhabi and say, I run a biotech fund in the U.S.
0:09:28 Please give me $100 million, a billion dollars to invest it and I will return back more capital.
0:09:35 The shift is that now the Gulf investors are, yeah, they’re still looking for alpha.
0:09:36 They’re still looking for good managers.
0:09:42 But even more than that, they’re looking for capital or to fund projects that create an infrastructure
0:09:45 around manufacturing, services, education.
0:09:51 The electronic arts take private was part of that because they realize that while they have
0:09:57 what feels like near infinite capital, the sea of oil beneath them at some point will run out.
0:10:02 Now, I don’t know if that’s 20 or 30 years or 60 or 80, but they know they do have a fuse
0:10:03 around trying to transition this economy.
0:10:08 So, when you go down there, they’re open for business, but they want to invest in companies
0:10:14 that will build some sort of local demand, local infrastructure, local business operations.
0:10:19 So, I can see a situation where these companies make massive investments in a company like OpenAI
0:10:26 in exchange for building those data centers with those processors in somewhere in the Gulf.
0:10:27 What are your thoughts?
0:10:32 I’m just really struck by how everyone is so excited about these data centers and we’re just
0:10:34 seeing this record investment into data centers.
0:10:38 And meanwhile, people are not talking enough about energy.
0:10:40 And the fact that we need to power these things.
0:10:44 And, you know, people were talking about it maybe a year ago.
0:10:50 I saw, you know, op-eds and newspapers and people talking about it publicly on podcasts saying,
0:10:54 oh, you know, this AI thing, it’s going to be a lot of power demand.
0:10:55 So, we’re going to need a lot more power.
0:10:58 I know Bill Gates was talking a lot about that.
0:11:04 But the reality is, right now, there is not nearly enough investment going into energy
0:11:08 compared to the amount of investment that’s going into data centers.
0:11:14 And by the way, that is why all of these construction plans for these data centers keeps on getting
0:11:15 delayed.
0:11:20 Because what happens is they announce these plans, these multi-billion dollar deals.
0:11:21 They say, we’re going to build this thing.
0:11:26 And then they realize that the waiting list to actually connect the data center to the source
0:11:32 of power, the thing that’s going to keep the lights on, is like five, six, seven years long.
0:11:38 And so, essentially, what we have here is a very simple and classic problem.
0:11:44 The demand for energy, because of AI, is way outstripping the supply.
0:11:48 And so, then your mind goes to Econ 101.
0:11:50 What does that mean?
0:11:52 It means energy costs are only going to go way up.
0:11:58 And I just want to sort of back up and highlight some of the numbers that we’re mentioning there.
0:12:05 So, as I mentioned, OpenAI, they want to build out 250 gigawatts worth of compute
0:12:07 over the next few years.
0:12:10 I think it would be by 2033, 2032.
0:12:14 So, let’s just talk about how realistic that is.
0:12:21 Last year, the entire U.S. added about 56 gigawatts worth of energy capacity.
0:12:23 And that was the most ever.
0:12:28 OpenAI wants to build out a chip network that would consume 250.
0:12:36 So, that is going to be equivalent to a quarter of the entire U.S. electric grid capacity.
0:12:43 It would also mean that OpenAI alone would consume more than half of all of the energy capacity
0:12:46 that we add over the next eight years.
0:12:52 If you wanted to supply 250 gigawatts of power, you would need to build 250 nuclear power plants.
0:12:54 It would cost you $12.5 trillion.
0:13:01 And I think what is striking is that we’re all just assuming that this data center build-out is a given.
0:13:09 And yet, there’s this very practical logistical problem, which is how the fuck are you going to power these things?
0:13:15 And, you know, I think then the conversation goes to, okay, well, what do we do about it?
0:13:19 You know, do we just say no to AI?
0:13:22 I don’t think anyone, or I’m sure some people want to do that.
0:13:24 In fact, actually, a lot of Americans want to say that.
0:13:26 But I don’t think the market’s going to let you do that.
0:13:31 So, you can just, like, pull back on AI or build out the grid.
0:13:35 And then it’s a question of how do you actually build out the grid?
0:13:41 And I think it’s a very—I mean, I just—this energy question becomes more and more interesting
0:13:47 because, you know, the administration, there’s been a lot of emphasis on how we need to drill baby drill.
0:13:49 We need to get our energy back.
0:13:54 All this focus on gas-fueled energy, fossil fuel energy.
0:14:01 But I would also like to point out that even if you use gas-fueled energy,
0:14:07 you’re not going to—you’re not even going to come close to the demand that these data centers are projecting.
0:14:11 It’s not even a question.
0:14:12 Maybe you get it with nuclear.
0:14:16 But again, as we’ve discussed, this takes decades and decades to build.
0:14:21 So, increasingly, I’m starting to believe that the route that you need to go is you need to go with solar energy.
0:14:28 But, of course, the administration is gutting all of the investments and the tax credits that fuel the solar industry.
0:14:30 So, I will pause again there.
0:14:36 But, I mean, I think the question is becoming,
0:14:41 clearly, we need more energy in America if we’re going to do this AI thing.
0:14:46 And if we don’t, the consequences are quite simple.
0:14:49 AI is going to absolutely skyrocket your energy bill.
0:14:55 This all feels to me like it’s desperate for some sort of technical breakthrough that reduces the energy demands.
0:15:01 Because the whole notion of—it’s sort of this circular philosophical question.
0:15:08 At what point do you not do a—conduct an AI query for the best place to eat Indian food in London
0:15:14 when the cost of that query is—inhibits you from being able to afford to go eat Indian food?
0:15:20 Or that AI becomes something that is sequestered to wealthy people and corporations,
0:15:27 and then we outsource the expense of AI to households that see their electric bills go up 20, 50.
0:15:29 I’ve heard in some areas it’s gone up 200%.
0:15:33 Or at some point, does the government come in or weigh in and call it an infrastructure investment?
0:15:40 Or do we come up with some sort of tiered pricing system where we charge—instead of having a bulk discount,
0:15:43 if you’re a B2B, if you’re a business that’s buying this capacity,
0:15:47 at some point and you’re purchasing over a certain amount,
0:15:50 which probably means you’re an AI or an AI-related business,
0:15:54 you pay a surcharge, which they then pass on to their consumers,
0:15:55 which would hit their stock price.
0:16:00 But at this point, you would argue that if you need investment or deep pockets,
0:16:05 you’d probably go after the pools of shareholder capital of these organizations,
0:16:11 which are greater than the discretionary income of middle-class households in these areas.
0:16:17 And then it gets very political because every congressperson wants to—
0:16:23 I mean, so it’s no accident that Musk announced a huge infrastructure project
0:16:26 in Speaker Johnson’s district, right?
0:16:31 And so being able to cut the ribbon on a big data center or a colossus or whatever it is
0:16:32 makes that politician look very good
0:16:35 until the electric bills start coming for those consumers.
0:16:41 So I wonder if there’s going to be some sort of legislation that—I don’t know—
0:16:45 either the government weighs in and provides subsidies for people.
0:16:50 But if you think about, okay, we talk constantly about feeding and growing the middle class.
0:16:53 And one way to do that is through tax cuts.
0:16:55 But a better way to do it, I would argue,
0:17:00 because the reality is the middle class in America pays a lot of usage and consumption taxes,
0:17:03 but, distinct from the class warfare rhetoric,
0:17:05 the middle class, as a percentage of their income,
0:17:07 doesn’t pay a ton of federal income tax.
0:17:12 So I think about 70 or 80 percent of federal income tax is paid by the top 10 percent income-earning households.
0:17:17 But how do you—the best investment, I would argue,
0:17:20 or a great way to support the middle class is through infrastructure investments,
0:17:29 whether it’s rail, or power grid, or clean water, highways that help you get to and from your house faster,
0:17:37 so you can spend more time with your kids, or tax credits for housing such that people can have more affordable housing.
0:17:41 But it feels as if there needs to be a rethink around national energy policy.
0:17:51 The problem is that there’s such regulatory capture that it strikes me, if I had to predict what’s going to happen here,
0:17:57 there will be a huge investment in the grid, the majority of which will be absorbed by AI.
0:18:04 So effectively, all that is, is a transfer from all homes and all taxpayers to the shareholders of AI.
0:18:10 Yes. I think this is such an important—the amount that energy costs are going up is so important
0:18:13 because it’s highlighting how this stuff is actually expensive.
0:18:17 Like, it actually costs something, and someone has to pay for it.
0:18:23 And up to this point, it has seemed to most people that it’s just these, like, you know,
0:18:27 Qatari billionaires or these Silicon Valley billionaires who are just funding these things,
0:18:32 and there’s this abundance mindset, and it’s like, oh, a trillion dollars here, a trillion dollars there.
0:18:34 We’re just going to keep investing.
0:18:41 But this is sort of highlighting, actually, we’re not in a place of abundance, really,
0:18:45 or at least not in a place of abundance in the way that Sam Altman seems to think.
0:18:53 We actually have a scarcity of energy to the point that the costs are going to be felt by regular Americans.
0:19:02 And what has been so interesting recently is the local community pushback that has been growing against data center buildouts.
0:19:08 And this is very recent, but in the past few months, basically,
0:19:14 $63 billion worth of data centers have been blocked because local communities have gotten together,
0:19:18 and they’ve said, no, you’re not going to build this data center in our neighborhood.
0:19:22 It’s happened in Ohio, in Texas, in Indiana.
0:19:26 Google just withdrew one of their data center applications because they got all of this pushback.
0:19:33 And the argument from the people saying no is, basically, this actually doesn’t help us.
0:19:37 It doesn’t help our communities because it sends energy prices through the roof.
0:19:40 And then the other point, which I think is also really important that they’ve made,
0:19:44 is that it actually doesn’t even create jobs, really.
0:19:48 Because one thing about data centers, it’s not like a Walmart.
0:19:50 The data center is basically a giant robot.
0:19:55 And you build the thing, and there are construction jobs when you’re building it,
0:19:58 but then once it’s built, you basically just let it sit there,
0:20:00 and it’s only a few people who are servicing the thing.
0:20:04 You look at the data center that Apple built in North Carolina, as an example.
0:20:07 They spent a billion dollars on this data center.
0:20:10 It created less than 100 permanent jobs.
0:20:15 So that’s equivalent to, like, a Series B startup, like, showing up in North Carolina.
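Those two figures imply a striking capital-to-jobs ratio; a rough calculation, using only the numbers quoted in the episode:

```python
# Capex per permanent job for the Apple North Carolina data center,
# using the figures quoted in the episode. "Less than 100" jobs is an
# upper bound, so this is a lower bound on cost per job.
capex_usd = 1_000_000_000  # "a billion dollars on this data center"
permanent_jobs = 100       # upper bound on permanent jobs created

capex_per_job = capex_usd / permanent_jobs
print(capex_per_job)  # 10000000.0 -> at least $10 million of capex per permanent job
```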
0:20:24 So huge costs on energy and temporary job creation through the construction.
0:20:29 But once it is built, actually, the local community doesn’t really benefit from it.
0:20:37 And as you say, it’s a robot that’s building shareholder value for the people who own AI stocks.
0:20:43 And so it’s becoming a really interesting, and I think this is going to become more prominent in the political sphere,
0:20:51 but it’s this NIMBY versus YIMBY debate, where people are saying, you know, not in my backyard.
0:20:54 God, and you’re YIMBY when it comes to housing.
0:20:56 I’m YIMBY when it comes to housing.
0:21:06 But I’ve got to say, I’m starting to find this argument against the data center buildout somewhat compelling,
0:21:09 especially if it’s going to triple people’s energy bills.
0:21:16 I mean, right now in Virginia, I guess about a quarter of the energy is already going to data centers.
0:21:21 And these things sort of typify people’s fears around where the world is going,
0:21:25 and that is these economic centers powering stocks don’t require anybody.
0:21:30 You don’t even need to turn on the lights during the day because there’s nobody working there.
0:21:38 So this is going to be – it is very difficult to spin up new energy sources, right?
0:21:40 It takes 10 years to build a nuclear power plant.
0:21:42 Cost overruns are common. It’s just not easy.
0:21:49 So electricity companies or power generation companies have been some of the best-performing stocks.
0:21:52 Some of them have even outperformed AI stocks.
0:21:58 Constellation, the largest nuclear power operator in the U.S., is up 78% this year, more than any of the Magnificent Seven.
0:22:05 Brookfield Renewables, one of the largest public renewable energy companies, is up 47%, also outpacing the MAG-7.
0:22:08 American Electric Power, up 30%.
0:22:13 Entergy, another utility that focuses on the southern U.S., is up 20% year-to-date.
0:22:21 Because what you want as a business is moats: you want friction in supply,
0:22:24 and you want a lack of friction in demand.
0:22:25 So what do we have?
0:22:29 We have demand that is just, like, accelerating with very little friction, right?
0:22:33 But at the same time, there’s a lot of friction around bringing on supplies.
0:22:36 So if you own the supply, it’s champagne and cocaine.
0:22:45 And the example I would use is that as someone who travels a shit ton and stays in nice places, you can’t spin up a five-star hotel.
0:22:50 The top 1% are garnering so much money, and there’s a, you know, YOLO attitude, live for today.
0:22:52 They’re traveling more.
0:22:53 There’s a lot of pent-up demand to travel.
0:23:01 So the demand at five-star resorts has vastly outstripped supply – five-star resorts are like nuclear power plants.
0:23:11 To find a place you can build a five-star hotel, get their permits, the local residents, plan the thing, raise the capital, and build a world-class hotel, it takes you 10 years.
0:23:21 So in between that lag, as consumption or demand goes up, you have seen hotel prices at nice hotels basically double since pre-COVID.
0:23:28 I think that’s kind of a relatively, you know, a reasonable analogy for what’s about to happen to power consumption.
0:23:36 And that is there is no kind of easy spin-it-up, large-scale, multi-gigawatt power supply available.
0:23:45 And it all feels to me, again, like wherever we head, based on the regulatory capture, based on who the president is dining with,
0:23:54 that Doug Burgum is going to figure out a way to come up with some sort of big, beautiful bill that taxes every American
0:24:07 and transfers wealth, in the form of subsidized energy, to these AI companies the same way we’re subsidizing all sorts of other things for, you know, different politically connected sectors.
0:24:12 I think that the energy build-out is the most important thing.
0:24:18 And just some – I mean, when we talk about like what do we do about it, just some data I would point you to.
0:24:23 I mean, again, there’s been this emphasis on we need gas to fuel this energy.
0:24:33 The most amount of gas-fired energy that has been deployed in a year, the most, has been 12.5 gigawatts.
0:24:35 OpenAI wants 250 gigawatts.
0:24:39 So, you’re not going to do it with gas.
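The gigawatt figures above imply a simple timing gap; a rough back-of-the-envelope check, using only the numbers quoted in the episode:

```python
# Back-of-the-envelope: OpenAI's stated demand vs. the record annual pace
# of gas-fired buildout (both figures as quoted in the episode).
record_gas_gw_per_year = 12.5  # most gas-fired capacity ever deployed in one year
openai_demand_gw = 250         # capacity OpenAI has said it wants

years_at_record_pace = openai_demand_gw / record_gas_gw_per_year
print(years_at_record_pace)  # 20.0 -> two decades, even at an all-time-record pace
```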
0:24:44 As you point out, maybe you get it with nuclear, but it takes decades to build that out.
0:24:49 And as one of our research analysts on the team, Chris Nodonis, said,
0:24:53 the problem isn’t that we need energy supply, it’s that we need energy supply now.
0:24:56 That’s exactly what AI companies are asking for.
0:24:58 We need the energy to come online right now.
0:25:01 So, I think you’ve got to do it with solar.
0:25:06 Solar made up more than half of all of the new energy capacity that was added last year.
0:25:14 But for whatever reason, probably for anti-woke reasons, the administration thinks that solar is hippy-dippy or woke.
0:25:16 Trump has this hatred of solar.
0:25:19 And so, there is this attack on the solar build-out.
0:25:23 And because you mentioned the Big Beautiful Bill, because of the Big Beautiful Bill,
0:25:27 we’re seeing less investment in solar.
0:25:29 We’re seeing elimination of tax credits.
0:25:34 And as a result, it’s going to reduce solar production capacity by about 20%.
0:25:35 Those are the projections right now.
0:25:42 So, all this comes down to, if we want to do AI, we need more focus on the energy side of things.
0:25:45 And we need more attention on how do we actually build out the energy.
0:25:48 And it’s not just a question of drill, baby, drill.
0:25:49 It’s a question of everything.
0:25:51 It’s a question of wind.
0:25:52 It’s a question of solar.
0:25:54 It’s a question of nuclear.
0:26:03 And I just, it seems as if the conversation has moved away from that because people are so excited about the AI thing
0:26:06 that they’ve gotten way out over their skis on this.
0:26:14 And there needs to be more of a rational conversation of like, okay, sounds good, but let’s keep the lights on.
0:26:16 How do we actually power these things?
0:26:22 Problems like this, when they’re this obvious and this big, oftentimes don’t end up happening.
0:26:26 And that is, the shit you’re worried about is the shit that, generally speaking, doesn’t happen.
0:26:30 And I can see a scenario along the lines of the following.
0:26:36 Sam Altman, in order to justify this valuation, is making these just kind of enormous outlandish projections,
0:26:42 including this is how much power I will need because I’m basically saying to the market,
0:26:48 this is how much AI is going to permeate all corners of society.
0:26:51 This is how big my company is going to be.
0:26:53 I need 250 gigawatts, right?
0:26:59 It’s sort of like announcing to the world, we’re going to hire 3 million employees based on our current growth.
0:27:00 Buy my stock.
0:27:01 This is how confident I am.
0:27:12 So one, there’s still a scenario where AI is an interesting technology, but it’s not powering everything in our lives.
0:27:13 We’re not glued to it every day.
0:27:20 It’s not operating every car that picks us up and doing all our finances and in charge of all of our workflow.
0:27:27 That it has some uses, but it’s more niche, and it never really gets the traction that the current valuations justify, too.
0:27:34 That there’s some sort of breakthrough where the energy consumption demands just dramatically diminish.
0:27:41 I mean, to a certain extent, the Civic, the Honda Civic came in, and it was a better car.
0:27:52 But the reason it was a better car was because it cost you $12 a week to power the thing instead of $22 because it just had a more efficient engine and a smaller car wrapped around the engine that just had better mileage.
0:27:57 So I think there’s a lot of things here that could change the calculus.
0:27:59 What’s clear is we’re going to need more energy.
0:28:08 I also think that, like, I’m a drill baby drill person, but I also think we should have kept some of those subsidies for renewables.
0:28:10 Economics ultimately win out.
0:28:19 The number one producer of wind is Texas, and renewables are still right now probably the most cost-efficient incremental capacity to bring on.
0:28:27 Specifically, I believe wind and solar are the least expensive and quickest power generation to deploy, even without government subsidies.
0:28:33 I would also argue there’s a really strong geopolitical reason to keep energy prices low.
0:28:41 I believe that if oil gets—I think the war in Ukraine, in how long it goes, is a function of oil prices.
0:28:49 And that is, if the price goes below $50 a barrel—I think it’s in the low $60s right now—I think Putin has to come to the table because 50% of his economy is based on the price of that.
0:28:56 And at some point, the cost of production is greater than that number, I think, if it gets to, like, $40 or $45.
0:29:06 So I think we have a geopolitical—a lot of reasons to massively invest in energy production across, you know, all the different dimensions.
0:29:09 So is it the old dirty stuff?
0:29:15 You know, I would argue that stuff is not going away as quickly as it should, but it’s also cost-ineffective.
0:29:16 Is it solar?
0:29:17 Is it wind?
0:29:20 Is it, you know, LNG?
0:29:25 It’s like—I think it’s, like, door number four, all of it, right?
0:29:28 And we’re the largest energy producer in the world.
0:29:30 We’re the largest oil producer.
0:29:32 I mean, people just don’t realize how strong we are economically.
0:29:33 Oil independent, food independent.
0:29:37 And we’re the number one producer of oil in the world.
0:29:47 And China is making—and you pointed this out last week—to their credit, they’re going to add, I believe, more solar power generation this year
0:29:54 than we have in total currently in the United States. Or at least that’s a stat you had a couple days ago, which blew my mind.
0:29:55 But solar is woke.
0:29:58 I mean, China understands the issues.
0:30:00 I don’t know how these things got so politicized.
0:30:03 I’m building—by the way, on a ground level, I’m building a home.
0:30:08 And one of the things we’re trying to figure out is how to buy the solar panels now to qualify for the tax credits.
0:30:13 And all of these solar panels, from my understanding, come in from China.
0:30:17 Like, China’s made just a huge bet on renewables.
0:30:24 But my gut tells me that it’s so obvious that there’s just going to be such a demand for energy consumption.
0:30:31 And then Sam Altman is out there saying, I’m going to need the energy of the EU to power my unbelievable business.
0:30:33 I don’t know.
0:30:38 It’s all making me very skeptical that we’re going to need as much energy as Sam Altman is saying, or as much incremental energy.
0:30:51 Whatever the solution is, whether it’s less AI, or more efficient AI, or more energy, or more efficient energy, or cost-effective energy, the point is, something has to give.
0:30:53 Because you look at the numbers.
0:30:56 As you often say, the math ain’t mathin’.
0:31:14 Just to wrap it up, at some point, middle-class households are going to say, look, I’m not willing to not eat and not be able to buy groceries because my electricity bill is so high, such that Scott and Ed can edit the notes for their newsletter using Anthropic.
0:31:19 Or as we’re about to discuss, so that my son can, like, jerk off to AI porn on ChatGPT.
0:31:20 That’s a good segue.
0:31:21 Well done.
0:31:22 Very elegant.
0:31:25 There is a reason you’re not on CNBC.
0:31:26 Okay.
0:31:28 Back to you, Joe.
0:31:33 We’ll be right back after the break.
0:31:39 And if you’re enjoying the show so far, be sure to give Prof.G Markets a follow wherever you get your podcasts.
0:31:49 Support for the show comes from Brex.
0:31:52 These days, every business leader is under pressure to save money.
0:31:55 But you can’t crush the competition just by cutting costs.
0:31:58 To win, you need to spend smarter and move faster.
0:31:59 You need Brex.
0:32:08 Brex is the intelligent finance platform that breaks the tradeoff between control and speed with smart corporate cards, high-yield banking, and AI-powered expense management.
0:32:12 Join the 30,000 companies that spend smarter and move faster with Brex.
0:32:15 Learn more at Brex.com slash grow.
0:32:20 Support for this show comes from Vanguard.
0:32:23 The lineup includes over 80 bond funds.
0:32:26 To all the financial advisors listening, let’s talk about bonds for a minute.
0:32:28 Capturing value and fixed income is not easy.
0:32:31 Bond markets can be massive, murky, and let’s be real.
0:32:35 A lot of firms throw a couple of flashy funds your way and call it a day.
0:32:36 But not Vanguard.
0:32:39 Vanguard bonds are institutional quality.
0:32:43 They’re actively managed by a 200-person global squad of sector specialists, analysts, and traders.
0:32:46 Lots of firms love to highlight their star portfolio managers.
0:32:50 Like, it’s all about that one brilliant mind making the magic happen.
0:32:52 Vanguard’s philosophy is a little different.
0:32:55 They believe the best active strategy should be shared across the team.
0:33:00 That way, every client benefits from the collective brainpower, not just one individual’s take.
0:33:04 So, if you’re looking to give your clients consistent results year in and year out,
0:33:09 go see the record for yourself at Vanguard.com slash markets.
0:33:12 That’s Vanguard.com slash markets.
0:33:14 All investing is subject to risk.
0:33:16 Vanguard Marketing Corporation Distributor.
0:33:22 Hi, this is Kara Swisher, host of On With Kara Swisher.
0:33:25 This week on my podcast, On With Kara Swisher,
0:33:28 I talk with Vice President Kamala Harris about her new book,
0:33:32 107 Days, as well as The Weaponization of the Department of Justice,
0:33:34 the tech industry’s rightward lurch,
0:33:38 and how she’s planning to fight against Trump’s march toward autocracy.
0:33:40 We take it live at the Warner Theater,
0:33:44 which is just a stone’s throw away from the White House,
0:33:45 where Donald Trump now lives.
0:33:46 Have a listen.
0:33:51 I don’t believe that the people who are flattering Donald Trump,
0:33:54 who are offering him favors,
0:33:59 necessarily believe in what he is doing or believe it is right,
0:34:01 but they want power.
0:34:05 And that’s part of what we are dealing with right now.
0:34:07 They want power,
0:34:11 and they’re not willing to compromise their power or access to power
0:34:14 for the sake of the Constitution and our democracy.
0:34:18 Listen to the whole conversation on On With Kara Swisher on YouTube
0:34:20 or wherever you get your podcasts.
0:34:31 We’re back with Prof G Markets.
0:34:34 The conversation around technology and mental health
0:34:36 has taken center stage in recent months,
0:34:38 and some tech giants say they are stepping up.
0:34:42 Meta added new parental controls on Instagram last week
0:34:45 using the PG-13 rating system to limit what teens see.
0:34:46 And back in September,
0:34:49 Sam Altman announced a 120-day plan
0:34:52 to roll out parental controls for chat GPT users.
0:34:56 That move came after the parents of a teen who died by suicide
0:34:57 sued OpenAI,
0:35:00 accusing it of responsibility for their child’s death.
0:35:01 Last week, however,
0:35:04 Altman said that the problems with the chat bot
0:35:05 were, quote,
0:35:06 fully mitigated,
0:35:10 and then rolled many of those changes back.
0:35:12 So, Scott,
0:35:13 I think the question here is,
0:35:14 you know,
0:35:17 these companies are recognizing the problem,
0:35:18 and the question is,
0:35:20 are they really doing anything about it?
0:35:22 I will just first say,
0:35:24 let’s talk about Meta
0:35:26 and this PG-13 policy.
0:35:27 You know,
0:35:29 it sounds like progress
0:35:31 until you kind of remember
0:35:32 that they’ve had,
0:35:33 like,
0:35:35 many child safety announcements
0:35:37 that have happened over the past few years,
0:35:40 and the research has shown
0:35:43 that basically none of these features really work.
0:35:46 They’re really weak at protecting against self-harm content,
0:35:48 bullying content,
0:35:49 sexual content,
0:35:49 et cetera.
0:35:50 So, you know,
0:35:52 whatever efforts they have been making,
0:35:54 they’re not really good enough,
0:35:55 and it’s starting to appear to me
0:35:56 as if they’re more interested
0:35:59 in announcing these safety features
0:36:01 than they are in actually implementing them.
0:36:02 We’ll see,
0:36:04 with this PG-13 rating,
0:36:06 that track record isn’t great.
0:36:06 And then,
0:36:08 with Sam Altman,
0:36:11 he announced that ChatGPT
0:36:12 was, quote,
0:36:13 originally pretty restrictive
0:36:15 to be careful with mental health issues,
0:36:15 but he says
0:36:17 that they are now going to,
0:36:17 quote,
0:36:18 relax the restrictions
0:36:20 in most cases.
0:36:20 So,
0:36:23 some varying responses
0:36:24 to the mental health thing,
0:36:25 but what is clear
0:36:26 is that these companies
0:36:27 have to say something about it,
0:36:29 and they are saying things.
0:36:30 I think the question is,
0:36:32 are they going to really do anything about it?
0:36:33 If I didn’t know better,
0:36:35 I would think there’s a possibility here
0:36:36 that these firms are more concerned
0:36:37 with shareholder value
0:36:38 than the well-being of our children.
0:36:39 It’s just,
0:36:41 I’m starting to get that sense
0:36:42 or that feeling.
0:36:43 So, just as an example,
0:36:47 we downloaded an app called Qustodio, I think.
0:36:51 And essentially,
0:36:51 it’s an app,
0:36:53 and I can control my 15-year-old’s phone,
0:36:55 and I can control
0:36:58 what apps he’s allowed to use.
0:36:59 It alerts me
0:37:01 to sensitive content
0:37:02 he might be searching for,
0:37:02 and I had sort of
0:37:03 a philosophical argument
0:37:05 with myself around,
0:37:05 should I really be
0:37:07 the East German Stasi
0:37:08 following my son?
0:37:09 But I am comfortable
0:37:10 being able to turn off his phone
0:37:12 and say,
0:37:12 okay,
0:37:13 it’s study time,
0:37:14 or you need to wind down,
0:37:15 or beyond a certain time of night,
0:37:17 all your apps are shut off.
0:37:21 And this is a nice little app.
0:37:21 I imagine it’s,
0:37:22 I don’t know if it’s
0:37:23 three engineers,
0:37:23 10 engineers,
0:37:24 20 engineers,
0:37:26 but you know every company
0:37:27 could do a much better job
0:37:28 of this app,
0:37:28 and they choose not to.
0:37:30 So, the fact that
0:37:31 Meta,
0:37:31 Alphabet,
0:37:33 and Apple
0:37:34 don’t have this app
0:37:35 readily available
0:37:36 or this feature
0:37:38 means they don’t want
0:37:39 to make it easy for you.
0:37:41 So, they do these parental controls,
0:37:42 which are mostly hand-waving.
0:37:43 I’ve tried to use
0:37:44 the parental controls on Meta,
0:37:45 and I just,
0:37:46 granted,
0:37:47 I’m in tech,
0:37:48 but I’m not good at this stuff,
0:37:49 but I just give up.
0:37:50 I find it confusing.
0:37:53 And they will create
0:37:54 the illusion of complexity here.
0:37:55 It’s very straightforward.
0:37:57 There should be no social media
0:37:59 for kids under the age of 16.
0:38:01 We just need to age-gate it.
0:38:02 There should be no smartphones
0:38:02 in schools.
0:38:04 And I’m increasingly believing
0:38:06 you cannot have
0:38:07 synthetic relationships
0:38:08 available to anyone
0:38:09 under the age of 18.
0:38:11 Because the collision,
0:38:12 I don’t know if you saw this,
0:38:14 but Sam Altman has,
0:38:15 you know,
0:38:16 announced that
0:38:17 we don’t want to be
0:38:17 in the business
0:38:18 of making moral judgments,
0:38:20 or some such bullshit.
0:38:22 We’re going to allow erotica.
0:38:23 I love that name.
0:38:24 It’s porn.
0:38:25 All right?
0:38:25 It’s not,
0:38:26 this isn’t,
0:38:27 this isn’t,
0:38:28 you know,
0:38:30 nibbling on someone’s ears
0:38:31 and poetry,
0:38:32 and this is all right.
0:38:33 You know,
0:38:34 I like
0:38:36 bukkake and Asian casting porn.
0:38:37 I mean,
0:38:37 it’s porn.
0:38:38 All right?
0:38:40 By the way,
0:38:40 by the way,
0:38:41 this is not my preferences,
0:38:42 just so you know.
0:38:45 My shit gets much darker
0:38:46 than that.
0:38:47 Anyways,
0:38:50 25% of search queries
0:38:50 on Google
0:38:51 are related to porn.
0:38:53 This is an enormous business.
0:38:54 And one of the reasons
0:38:56 this is a difficult sector
0:38:57 to have peer-reviewed research on
0:38:58 is there is no,
0:39:00 very little academic research on it
0:39:01 because no one wants
0:39:01 to be known
0:39:02 as the porn professor.
0:39:04 So what do we know?
0:39:05 We know that teens
0:39:06 are spending about
0:39:07 five hours a day
0:39:07 on social media
0:39:08 or about a third
0:39:09 of their waking hours
0:39:10 outside of school.
0:39:12 According to the CDC,
0:39:13 adolescent depression
0:39:14 is up 60%
0:39:16 since the introduction
0:39:17 of mobile
0:39:19 or social going on mobile.
0:39:20 And then from internal
0:39:21 meta research,
0:39:22 this is research they did,
0:39:23 one in three teenage girls
0:39:24 who experienced
0:39:25 body image issues
0:39:26 reported that Instagram
0:39:27 made them feel worse.
0:39:28 And so
0:39:30 you have
0:39:31 this collision
0:39:32 of very unfortunate things
0:39:32 and that is
0:39:33 having kids
0:39:35 on social media
0:39:36 and on these platforms
0:39:38 is a really profitable business.
0:39:39 Platforms earned
0:39:40 $11 billion
0:39:41 from kids
0:39:42 under the age of 18
0:39:42 in 2022
0:39:43 on pace
0:39:44 for $13 billion
0:39:45 in 2025.
0:39:45 Instagram’s
0:39:46 $4 billion
0:39:46 from teens.
0:39:48 YouTube gets
0:39:48 about a billion,
0:39:49 get this,
0:39:50 a billion in revenues
0:39:51 from viewership
0:39:52 from kids
0:39:52 that are under
0:39:53 the age of 12.
0:39:54 TikTok gets
0:39:55 $2 billion
0:39:56 from teens.
0:39:57 And ad spending
0:39:58 targeting minors
0:39:59 rose 24%
0:40:00 year-on-year
0:40:00 in 2025
0:40:01 reaching $8 billion
0:40:02 in 2025
0:40:03 up from $6.6 billion.
0:40:05 So what do you have?
0:40:06 We know that
0:40:06 the more time
0:40:07 we get kids
0:40:09 on phones,
0:40:09 the more depressed
0:40:10 they are
0:40:11 and the more money
0:40:12 these guys make,
0:40:12 right?
0:40:13 So we have linked
0:40:14 shareholder value
0:40:16 to teen depression.
0:40:17 That is not good.
0:40:19 And then where it’s
0:40:19 going to get
0:40:20 really fucking scary
0:40:22 is you have
0:40:23 a cohort
0:40:25 of young men
0:40:26 and teen boys
0:40:27 who get mixed messages
0:40:28 around what it means
0:40:29 to approach a woman
0:40:32 and are told,
0:40:33 and I think
0:40:34 some of this information
0:40:34 is good,
0:40:35 I think some of it
0:40:35 is not good,
0:40:37 that there’s,
0:40:38 you know,
0:40:39 no young man
0:40:40 in high school
0:40:41 or anywhere else
0:40:41 or, say,
0:40:42 in college
0:40:43 wants to be the guy
0:40:44 that makes
0:40:45 an unwelcome advance
0:40:46 on a woman
0:40:47 no matter how respectful
0:40:48 and then gets a reputation
0:40:49 as that guy,
0:40:49 as the creep,
0:40:50 right?
0:40:52 Or is not confident
0:40:53 or not in shape
0:40:54 or whatever it is
0:40:55 or hasn’t taken risks,
0:40:57 has not developed resilience,
0:40:59 is used to frictionless
0:40:59 online dating
0:41:01 where you just swipe right
0:41:02 and maybe get a date,
0:41:02 usually not.
0:41:04 And the result is,
0:41:04 okay,
0:41:05 if I’m 14,
0:41:06 if I’m a 14-year-old male
0:41:10 and I am not comfortable
0:41:11 around women
0:41:13 and I just haven’t learned
0:41:14 those skills yet,
0:41:15 which describes
0:41:16 most 14- and 15-year-olds;
0:41:17 that would be
0:41:18 a fairly apt description
0:41:20 for most adolescent males.
0:41:21 And then I can go home
0:41:23 and I’m getting
0:41:24 these synthetic,
0:41:27 visually near-perfect
0:41:29 images of a girl
0:41:30 that learns everything
0:41:30 about me
0:41:32 and says the right things
0:41:34 and will perform erotic,
0:41:36 i.e. pornographic acts
0:41:36 for me.
0:41:38 And, essentially,
0:41:44 these kids can’t compete.
0:41:45 There’s no way
0:41:46 they can compete
0:41:47 against these things.
0:41:48 And then the thing
0:41:48 that I find
0:41:50 so upsetting about this
0:41:51 is we’ve been talking
0:41:52 a lot about synthetic
0:41:53 or character AI relationships
0:41:55 is that,
0:41:56 and I’ve been writing
0:41:57 about this,
0:41:58 sexual desire
0:41:59 from young men
0:42:00 is not a bug,
0:42:01 it’s a feature.
0:42:02 And that is,
0:42:02 I think of it
0:42:03 as like fire.
0:42:04 And that is,
0:42:05 fire can be bad.
0:42:05 You can start
0:42:06 objectifying women,
0:42:08 you can believe that,
0:42:09 you can have unreasonable
0:42:10 expectations around
0:42:11 what a relationship is,
0:42:13 you can think of women
0:42:14 in a negative light
0:42:15 from this shit.
0:42:16 That’s when fire
0:42:17 burns a forest down.
0:42:18 For the most part,
0:42:19 sexual desire
0:42:20 fire from young men
0:42:22 is fire that if it’s
0:42:23 captured in an engine
0:42:23 can create,
0:42:24 can fuel cylinders
0:42:25 and move progress forward.
0:42:26 How does it move
0:42:26 progress forward?
0:42:28 You’re a young man,
0:42:29 you would really like
0:42:29 to have a girlfriend
0:42:30 at some point
0:42:31 to be physical with,
0:42:32 romantic with,
0:42:33 and have sex with.
0:42:33 So,
0:42:35 you start working out,
0:42:36 you have a plan,
0:42:38 you practice kindness,
0:42:39 you show resilience,
0:42:40 you show a willingness
0:42:41 to endure rejection,
0:42:43 you learn how to be funny,
0:42:44 you learn how to open,
0:42:45 you learn how to approach,
0:42:46 you learn how to maintain
0:42:46 a rapport,
0:42:48 you learn how to have a plan
0:42:49 and how to articulate
0:42:50 your plan to people.
0:42:52 And you basically learn
0:42:53 all the features
0:42:55 of what it is to be human
0:42:56 and successful
0:42:56 in that environment
0:42:57 and in other environments.
0:42:59 And I wonder
0:43:00 how many of these kids
0:43:01 are going to take
0:43:02 these risks
0:43:03 and feel the need
0:43:04 to take these risks
0:43:05 when they have
0:43:07 the most ridiculously
0:43:08 lifelike,
0:43:08 easy,
0:43:09 frictionless
0:43:11 sexual experiences
0:43:12 available on their phone
0:43:13 and their computer
0:43:13 at home.
0:43:15 And we connected
0:43:17 that attention,
0:43:17 we connected
0:43:19 to massive shareholder value.
0:43:21 And just to,
0:43:22 I mean,
0:43:23 just sort of bring it home
0:43:24 or personalize it,
0:43:26 I barely graduated
0:43:27 from UCLA.
0:43:28 Barely.
0:43:29 Graduated a 2.27 GPA.
0:43:30 And one of the reasons
0:43:31 I graduated
0:43:33 was I loved going to class.
0:43:34 Specifically,
0:43:36 I loved going to campus.
0:43:37 Why did I love
0:43:38 going to campus?
0:43:39 All my fraternity brothers
0:43:40 were there.
0:43:42 UCLA was a ton of fun.
0:43:42 There were people
0:43:43 playing football
0:43:44 and Frisbee
0:42:44 on the quad
0:43:46 and it was something
0:43:46 out of a fucking
0:43:47 Cinemax film.
0:43:48 There were so many
0:43:49 hot women everywhere
0:43:51 and there was a non-zero,
0:43:52 granted near zero,
0:43:54 but a non-zero probability
0:43:55 that I would be able
0:43:56 to meet
0:43:58 an attractive woman,
0:43:59 say,
0:43:59 hey,
0:43:59 we’re having a party
0:44:00 back at the fraternity
0:44:00 come by
0:44:03 and lightning might strike
0:44:03 and,
0:44:03 you know,
0:44:04 at some point later
0:44:05 I might have the opportunity
0:44:06 to be physical
0:44:07 with this person.
0:44:08 That was really
0:44:10 motivating for me.
0:44:11 My first girlfriend,
0:44:12 my first serious
0:44:12 relationship,
0:44:13 was in college.
0:44:15 If I’d had
0:44:16 near-lifelike porn
0:44:18 tested a million
0:44:19 times a minute
0:44:20 to tickle my sensors,
0:44:21 my,
0:44:22 you know,
0:44:23 my mental triggers
0:44:24 around what sexual
0:44:26 predilections I had,
0:44:27 I just don’t think
0:44:28 I would have gone
0:44:29 on campus as much
0:44:30 and I don’t think
0:44:31 young men are going
0:44:32 to go on campus
0:44:32 as much.
0:44:32 I don’t think
0:44:33 they’re going to
0:44:33 leave their house
0:44:34 as much.
0:44:34 I don’t think
0:44:34 they’re going to
0:44:36 want to go have
0:44:36 drinks with friends.
0:44:37 I don’t think
0:44:37 they’re going to
0:44:38 want to go
0:44:39 to football practice.
0:44:39 I don’t think
0:44:40 they’re going to
0:44:41 volunteer at non-profits
0:44:42 or go to church.
0:44:45 So I see this
0:44:45 as,
0:44:46 again,
0:44:46 another attempt.
0:44:47 We have connected
0:44:49 to shareholder value
0:44:50 things that make us
0:44:51 less mammalian,
0:44:52 that make us less,
0:44:54 that quite frankly,
0:44:55 attack our manhood,
0:44:57 attack our risk
0:44:58 aggressiveness,
0:44:59 attack what it means
0:45:00 to experience
0:45:01 real victory
0:45:02 because the thing
0:45:03 I hate about
0:45:04 these fucking
0:45:05 synthetic relationships
0:45:06 is they give
0:45:07 people the sense
0:45:09 that relationships
0:45:09 are supposed to be
0:45:10 easy.
0:45:10 They’re not.
0:45:11 The most wonderful
0:45:11 things,
0:45:12 you’ll see this,
0:45:13 the most wonderful
0:45:14 thing in your life
0:45:16 will be forming
0:45:18 a romantic partnership
0:45:18 with someone,
0:45:19 figuring out how
0:45:20 to develop economic
0:45:21 security such that
0:45:22 you can have kids
0:45:23 and then raising
0:45:24 those children.
0:45:24 And the only thing
0:45:25 I can guarantee you
0:45:26 about all that shit
0:45:27 is it’s really
0:45:28 fucking hard.
0:45:30 And if you don’t
0:45:31 develop the skills
0:45:32 to navigate a
0:45:32 partnership,
0:45:33 navigate romantic
0:45:34 interests,
0:45:35 navigate raising
0:45:35 kids,
0:45:37 navigate the
0:45:37 corporate environment
0:45:39 because you become
0:45:40 used to synthetic
0:45:41 relationships that are
0:45:42 just so fucking
0:45:43 vanilla and easy
0:45:43 and always tell you
0:45:44 you’re great and
0:45:45 laugh at every joke
0:45:45 you make,
0:45:46 you’re never going
0:45:47 to develop those
0:45:48 skills and you’re
0:45:48 going to wake up.
0:45:49 I think we’re going
0:45:50 to just raise a
0:45:51 generation of young
0:45:51 people who wake up
0:45:52 and are like,
0:45:53 I have no ability
0:45:54 to deal with other
0:45:54 people.
0:45:55 I don’t know what
0:45:56 real victory feels
0:45:58 like and I’m anxious,
0:45:59 obese and depressed.
0:46:00 I don’t think many
0:46:01 people would disagree
0:46:02 with anything that
0:46:03 you just said.
0:46:04 And the trouble is
0:46:05 there is so much
0:46:08 money in this.
0:46:09 There is so much
0:46:10 money in children
0:46:12 and advertising to
0:46:12 children and
0:46:13 entertaining children
0:46:14 and keeping them
0:46:14 glued to their
0:46:15 devices.
0:46:16 And there’s also so
0:46:17 much money in
0:46:18 porn.
0:46:19 Porn makes up
0:46:22 25% of global
0:46:23 internet traffic.
0:46:24 People spend
0:46:27 $3,000 on porn
0:46:28 every second.
0:46:29 So this is an
0:46:31 extremely profitable
0:46:33 business, the
0:46:34 business of porn.
0:46:36 And it is
0:46:37 therefore kind of a
0:46:37 question.
0:46:38 I mean, the
0:46:39 markets are going to
0:46:40 roll on and they
0:46:41 are going to go full
0:46:43 steam ahead on
0:46:45 porn and more
0:46:46 specifically AI porn
0:46:47 because AI porn is a
0:46:48 more cost-effective
0:46:49 way of delivering porn
0:46:50 to people.
0:46:51 So it’s a choice
0:46:53 for these CEOs and
0:46:55 for these AI leaders.
0:46:57 And it’s a choice for
0:46:57 Sam Altman.
0:46:59 And I want to point
0:47:02 you to a quote that
0:47:03 he said on a
0:47:05 podcast, I believe this
0:47:06 was a few months ago,
0:47:07 maybe a year ago,
0:47:09 where he addressed
0:47:10 this choice.
0:47:11 There’s a lot of
0:47:11 short-term stuff we
0:47:12 could do that would
0:47:15 like really like juice
0:47:15 growth or revenue or
0:47:17 whatever and be very
0:47:17 misaligned with that
0:47:18 long-term goal.
0:47:20 And I’m proud of the
0:47:21 company and how little
0:47:22 we get distracted by
0:47:23 that, but sometimes we
0:47:23 do get tempted.
0:47:25 Are there specific
0:47:26 examples that come to
0:47:26 mind?
0:47:27 Any like decisions
0:47:27 that you’ve made?
0:47:32 Well, we haven’t put a
0:47:33 sex bot avatar in
0:47:34 ChatGPT yet.
0:47:36 That does seem like it
0:47:38 would get time spent.
0:47:39 Apparently it does.
0:47:40 And now they do.
0:47:41 Yeah, I mean, I hear
0:47:42 that and I kind of
0:47:43 have just one general
0:47:43 feeling.
0:47:45 Fuck you.
0:47:48 You and your hush
0:47:49 tones and your faux
0:47:49 concern.
0:47:51 If we’re waiting on
0:47:52 the better angels of
0:47:53 Sam Altman or Satya
0:47:55 Nadella or Tim Cook
0:47:57 or Mark Zuckerberg or
0:47:58 Elon Musk to show
0:47:59 up, don’t hold your
0:48:00 breath.
0:48:01 Stop being such
0:48:01 fucking idiots.
0:48:03 There have to be
0:48:04 laws.
0:48:05 We live in a
0:48:07 capitalist society where
0:48:08 your power, your
0:48:09 selection set of mates,
0:48:10 your influence, how
0:48:11 much people laugh at
0:48:12 your jokes, your
0:48:13 ability to take care of
0:48:14 your children is all
0:48:16 correlated to wealth.
0:48:17 So what we know with
0:48:20 100% certainty or 99.9%
0:48:22 certainty is the CEOs of
0:48:23 these companies will make
0:48:24 decisions.
0:48:25 They will rationalize
0:48:26 decisions incrementally
0:48:27 regardless of how many
0:48:28 teens start cutting
0:48:30 themselves or how it
0:48:31 separates them from their
0:48:32 parents and key
0:48:33 relationships.
0:48:34 So to a certain extent it’s
0:48:36 not Sam Altman’s fault.
0:48:36 It’s ours.
0:48:38 It’s pretty simple.
0:48:42 Age-gate social media, no
0:48:43 phones in schools, no
0:48:45 synthetic relationships under
0:48:46 the age of 18, no
0:48:47 pornographic material for
0:48:49 kids under the age of 18,
0:48:50 removal of Section 230
0:48:52 protections for
0:48:53 algorithmically elevated
0:48:54 content, and start
0:48:55 fucking fining these
0:48:56 companies a percentage of
0:48:57 their revenues, not a
0:48:59 parking ticket, and here’s
0:48:59 an idea.
0:49:01 Someone does a perp
0:49:03 walk, but this notion
0:49:06 that we keep hoping that
0:49:07 these guys are going to
0:49:08 get it and their better
0:49:09 angels are going to show
0:49:10 up, come on.
0:49:13 Sam’s doing what he’s
0:49:14 supposed to be doing.
0:49:16 He is adding a crazy
0:49:16 amount of shareholder
0:49:18 value to justify the
0:49:19 $500 billion valuation he
0:49:20 just raised money at.
0:49:21 He’s got to figure out a
0:49:23 way to create the GDP of
0:49:24 Finland in the next two
0:49:27 years, and if it means
0:49:28 increasing attention by
0:49:30 10, 20, 30 percent among
0:49:30 young people under the
0:49:32 age of 25, with
0:49:34 lifelike porn, he’s
0:49:36 going to rationalize why
0:49:37 they should do it, and
0:49:38 he’ll put in place some
0:49:39 faux controls that kids
0:49:40 can get around.
0:49:42 Folks, this is up to
0:49:44 us, and the notion of
0:49:45 the illusion of
0:49:46 complexity that gets
0:49:47 weaponized here in these
0:49:48 false or hollow
0:49:49 arguments are just
0:49:49 that.
0:49:50 They’re false
0:49:51 misdirects.
0:49:52 Do you realize how
0:49:53 much trouble a bar gets
0:49:54 in if they let in kids
0:49:55 under the age of 21?
0:49:56 They get in real
0:49:57 trouble.
0:49:58 You can lose your
0:49:59 liquor license.
0:50:02 So why would it be any
0:50:02 different here?
0:50:04 Because, and the reason
0:50:05 it’s more different is
0:50:06 because we have become so
0:50:08 co-opted by money that if
0:50:09 a company can add tens,
0:50:10 hundreds of billions of
0:50:11 dollars in shareholder
0:50:13 value, well, child safety
0:50:15 kind of takes a backseat.
0:50:17 We’re going to nod our
0:50:18 head, and Sam’s going to
0:50:20 talk in hushed tones
0:50:21 about, you know, we don’t
0:50:22 do these things because
0:50:23 we’re more concerned with
0:50:24 the commonwealth than
0:50:24 shareholders.
0:50:25 No.
0:50:26 Sam is going to continue
0:50:28 to do whatever will
0:50:29 increase shareholder value
0:50:32 by a dollar every day.
0:50:33 That is his first, his
0:50:34 second, and his third
0:50:34 priority.
0:50:35 And quite frankly, that’s
0:50:36 his job.
0:50:38 It’s our job to elect
0:50:39 people who say, okay,
0:50:41 our kids should not be
0:50:41 engaging in these
0:50:42 relationships.
0:50:43 Our kids should not be
0:50:44 consuming content that
0:50:46 results in a 60% increase
0:50:47 in self-harm.
0:50:49 And we have to have elected
0:50:50 leaders that aren’t total
0:50:52 whores who sort of nod,
0:50:53 and by the way, this is on
0:50:54 both sides of the aisle,
0:50:56 hands down, and have
0:50:57 thoughtful questions about
0:50:59 privacy and these issues and
0:51:00 want to hear more about it
0:51:02 because they just got money
0:51:05 from, you know, from
0:51:06 Meta or from Alphabet or
0:51:08 whatever PAC is the, is
0:51:09 the false front for these
0:51:11 things or the veneer, and
0:51:12 that we don’t have a
0:51:13 president who basically
0:51:15 wants to hang out with
0:51:15 these guys.
0:51:17 So this is, you know,
0:51:20 this is really, I have, I
0:51:21 mean, I’m always freaked
0:51:22 out and I catastrophize
0:51:23 because I’m angry and
0:51:24 depressed, but I think
0:51:24 there is a legitimate
0:51:26 reason to worry about the
0:51:28 collision of adult
0:51:29 content and synthetic
0:51:30 relationships and
0:51:31 struggling young men.
0:51:32 I think that is a
0:51:33 fucking disaster waiting
0:51:34 to happen.
0:51:34 I think you’re going to
0:51:36 find, we talk about
0:51:37 these, I think they’re
0:51:38 called NEETs, neither
0:51:39 employed.
0:51:39 Yeah, employed or
0:51:40 enrolled or in training.
0:51:42 Yeah, nothing, right?
0:51:43 I think that could
0:51:44 quadruple in the next
0:51:45 five or eight years
0:51:47 because it’s like, well,
0:51:48 if my parents will let
0:51:48 me live in the
0:51:50 basement and I can find
0:51:51 enough chief calories to
0:51:53 survive, why wouldn’t I
0:51:54 stay at home and have
0:51:56 these interesting, exotic,
0:51:58 erotic, pornographic,
0:51:59 intellectually somewhat
0:52:00 stimulating relationships
0:52:02 with friends, mentors, and
0:52:03 girlfriends on an
0:52:04 algorithm and a
0:52:05 screen, and by the
0:52:06 time I emerge from my
0:52:07 fucking cave, I’m
0:52:08 going to have no
0:52:09 skills at all, none
0:52:11 whatsoever, right?
0:52:12 It’s almost like these,
0:52:14 they call them
0:52:15 sexpats, these guys
0:52:16 who’ve just given up on
0:52:17 their local society and
0:52:18 they moved to Thailand or
0:52:20 some other place where
0:52:21 they can basically have a
0:52:22 relationship for a much
0:52:23 lower cost, much lower
0:52:24 effort, much more
0:52:24 frictionless.
0:52:26 Take that times a
0:52:28 hundred, times a
0:52:29 hundred, and it’s not
0:52:31 going to involve any
0:52:31 humans at all.
0:52:34 But this is, and the
0:52:35 thing that pisses me off so
0:52:36 much about this is that we
0:52:37 don’t want to admit and
0:52:38 acknowledge the
0:52:39 solutions are a lot
0:52:41 simpler than these guys
0:52:42 would have you believe.
0:52:43 Just to go over some of
0:52:44 the solutions that other
0:52:45 countries have had,
0:52:46 Norway, they just
0:52:48 implemented a complete
0:52:49 ban on social media for
0:52:50 people under 13, the
0:52:51 plan is to raise it to
0:52:52 15, and Australia just
0:52:52 passed a law which is
0:52:54 going to ban social media
0:52:56 for children under 16, and
0:52:57 that’s going to be going
0:52:58 into effect later this
0:52:58 year.
0:53:00 But I think the point
0:53:01 being, like, as you
0:53:04 always say, money wins,
0:53:05 and money will always
0:53:05 win.
0:53:07 And we can’t keep
0:53:09 expecting and hoping that
0:53:10 people and tech leaders
0:53:11 and business leaders are
0:53:11 going to regulate
0:53:12 themselves.
0:53:13 They might talk about it
0:53:15 for a time, but as we
0:53:17 see here, no, they’re
0:53:17 never going to regulate
0:53:18 themselves.
0:53:19 It’s the government’s job
0:53:21 to do the regulation.
0:53:22 It’s the government’s job
0:53:24 to figure out what the
0:53:25 rules of the road are.
0:53:26 They’re supposed to be
0:53:26 the referee.
0:53:28 They figure out what the
0:53:28 boundaries are.
0:53:29 And so you need the
0:53:30 government to set the
0:53:32 rules here and say what
0:53:33 is okay and what isn’t
0:53:34 okay, make it punishable
0:53:35 by law, and then let
0:53:36 capitalism do its
0:53:37 thing.
0:53:38 I mean, we talk about
0:53:38 this a lot.
0:53:39 We like the
0:53:40 competition, but this
0:53:41 expectation that these
0:53:43 tech leaders are going to
0:53:45 have sort of the moral
0:53:46 bandwidth to regulate
0:53:47 themselves and to do it
0:53:49 in a thoughtful way, it’s
0:53:50 just never going to
0:53:50 happen.
0:53:51 As you say, it’s not
0:53:52 their job.
0:53:56 We’ll be right back.
0:53:58 For even more markets
0:53:59 content, sign up for our
0:54:00 newsletter at
0:54:02 ProfGMarkets.com slash
0:54:03 subscribe.
0:54:14 Scott, we’re hitting the
0:54:16 road, bringing Pivot Live to
0:54:17 the people.
0:54:19 Seven cities, Toronto, Boston,
0:54:21 New York, D.C., Chicago, San
0:54:23 Francisco, and L.A., of course.
0:54:25 You went to Oasis, you
0:54:27 went to Beyoncé, you saw the
0:54:28 remake of Wizard of Oz and
0:54:29 the Spear.
0:54:32 All those suck compared to
0:54:32 the Pivot Tour.
0:54:35 This is the biggest tour.
0:54:37 Same people that are
0:54:38 organizing our tour that
0:54:40 organized Taylor Swift’s
0:54:40 tour.
0:54:41 They are much more excited
0:54:43 about our tour.
0:54:44 All right, that’s enough,
0:54:44 Grandpa.
0:54:46 It’s going to be so good, and
0:54:48 we’re bringing our brand of
0:54:50 whatever we do to the people,
0:54:51 and we’re excited to meet our
0:54:51 fans.
0:54:52 We love our fans.
0:54:53 For tickets, head to
0:54:55 PivotTour.com.
0:54:56 See you there.
0:55:00 Megan Rapinoe here.
0:55:02 This week on A Touch More,
0:55:04 we’ve got WNBA champion
0:55:07 Jackie Young, a.k.a.
0:55:09 IA Jack, on the show.
0:55:10 We’re so excited.
0:55:12 We’ll find out how she’s been
0:55:13 celebrating her third
0:55:15 championship, how the Aces
0:55:16 turned their season around,
0:55:18 and whether they’re the
0:55:19 greatest dynasty ever.
0:55:21 Plus, we’re handing out the
0:55:23 most prestigious awards of the
0:55:25 WNBA season, the Meggies.
0:55:27 From best dressed to gayest
0:55:28 moments of the season, you do
0:55:29 not want to miss it.
0:55:31 Check out the latest episode of
0:55:32 A Touch More wherever you get
0:55:33 your podcasts and on YouTube.
0:55:40 Ever gone on vacation outside the country
0:55:42 and felt like the food was just
0:55:43 better?
0:55:45 When I was in Japan recently,
0:55:47 the produce and meat were amazing,
0:55:49 and even a minor skin issue that I
0:55:51 usually have cleared up by the end
0:55:51 of the two weeks.
0:55:53 What’s up with what we eat here?
0:55:56 Tomatoes are renowned for looking
0:55:58 delicious, but not tasting delicious.
0:56:00 Is some food actually better in other
0:56:02 countries, and what are they doing
0:56:03 differently?
0:56:05 A law passed in 1993
0:56:08 that said, if you’re buying bread
0:56:11 in a boulangerie, it must be made
0:56:13 with four ingredients, essentially.
0:56:15 Find out more this week on
0:56:16 Explain It To Me.
0:56:18 New episodes every Sunday,
0:56:19 wherever you get your podcasts.
0:56:34 We’re back with Prof G Markets.
0:56:37 Back in July, we said SpaceX was the
0:56:38 most important monopoly that no one
0:56:39 is talking about.
0:56:41 We argued that it owns the space
0:56:42 economy and that that’s what makes it
0:56:43 such a powerful investment.
0:56:47 But that monopoly might be under
0:56:48 threat because there are two companies
0:56:51 that are looking to shake up this
0:56:51 space race.
0:56:55 AST Space Mobile is building a space-based
0:56:57 cellular broadband network, and
0:56:59 Rocket Lab is working to offer
0:57:01 reliable launches for hundreds of
0:57:03 small satellites, and those stocks
0:57:07 are up 230% and 500%
0:57:10 respectively in the past year.
0:57:12 That is really why we’re paying
0:57:14 attention to these companies.
0:57:15 The stocks are absolutely
0:57:17 ripping right now, and the idea
0:57:20 could be that they are going to
0:57:22 compete with SpaceX, perhaps one day
0:57:24 dethrone SpaceX.
0:57:26 They’re certainly a way off from that
0:57:26 at the moment.
0:57:28 But Scott, your reactions to the
0:57:30 absolute explosions in the stocks
0:57:33 of these two fledgling space
0:57:33 companies.
0:57:35 I think it’s really exciting because I
0:57:37 think it’ll get everyone’s greed
0:57:40 glands going and people will go into
0:57:40 the space.
0:57:42 And we said earlier in the year that the
0:57:44 most powerful and possibly dangerous
0:57:48 monopoly wasn’t even, you know, YouTube,
0:57:59 Amazon at 50% of e-commerce, Meta at 75% or
0:58:02 two-thirds of all social media, and
0:58:04 Alphabet with 90-plus percent of search.
0:58:08 The fact that Elon Musk or SpaceX is
0:58:10 responsible, I think, for about two-thirds of
0:58:13 all launches, I mean, the way to think of
0:58:15 it is we’re on one of nine
0:58:17 unremarkable rocks with a little bit of
0:58:19 moisture and gas, and one of them
0:58:22 appears to sustain life as far as we know.
0:58:24 It’s one of 100,000 galaxies in a million
0:58:25 different universes.
0:58:28 I mean, the potential of space is pretty
0:58:29 striking, right?
0:58:32 And one company appears to be, or was
0:58:34 developing a monopoly on, well, there’s
0:58:36 Earth and there’s everything else.
0:58:38 And it’s like, well, okay, if you have a
0:58:41 monopoly on everything else, there might
0:58:43 be incredible upside there, but also do
0:58:45 you really want, you want a lot of
0:58:45 competition here.
0:58:49 So, I think it’s super exciting.
0:58:52 Right now, of the 10,200 active
0:58:54 satellites, 8,500 are Starlink.
0:58:56 By the end of 2025, 6 million subscribers
0:58:59 and 62% of global satellite broadband
0:59:02 revenue is going to go to Starlink.
0:59:05 And there are three pure-play public space
0:59:06 companies with market caps larger than
0:59:07 10 billion.
0:59:10 Rocket Lab was up, almost tripled last
0:59:14 year, up 178%, and it’s up six-fold, 500%
0:59:15 over the past year.
0:59:17 Echo Star, a similar satellite communications
0:59:19 company, up 231% this year.
0:59:23 AST Space Mobile, a company building a
0:59:25 Starlink-like satellite broadband network, is
0:59:26 up 356% year to date, up over 2,500% in
0:59:30 the last two years.
0:59:33 Stock’s up 26-fold in the last two years.
0:59:36 They’ve only had six satellites in orbit, but
0:59:37 they’re aiming to get to 45 to 60.
0:59:41 There’s Blue Origin and Kuiper.
0:59:43 What would be just super interesting, and
0:59:45 Mia did this research, is just as we were
0:59:48 excited about GLP-1, you always feel like
0:59:48 we’re late.
0:59:49 No.
0:59:50 You don’t feel like we’re late.
0:59:51 So, the question is, how do we go further
0:59:55 downstream and find the suppliers?
0:59:57 These guys are going to raise so much
0:59:58 capital.
0:59:58 They’re not dumb.
1:00:01 They realize that these stocks have gone
1:00:02 crazy, so they’re going to issue a ton of
1:00:03 stock, and they’re going to start buying.
1:00:07 What I want to know is, what are the O-rings
1:00:11 or the type of metal or the type of plastics,
1:00:15 milled products, specialty components that go
1:00:16 into these things?
1:00:19 And are any of them publicly traded, or could
1:00:20 we buy any of them?
1:00:23 Because the amount of CapEx that’s about to
1:00:27 go into space is only going to be, it’s not
1:00:30 going to rival what’s going into AI, but it
1:00:32 strikes me that this is going to be, just as
1:00:34 we’ve been tracking AI and talking about the
1:00:35 extraordinary amounts of money, I wonder if
1:00:38 we’ll be talking about space and launch
1:00:42 capability with the same type of valuations and
1:00:44 growth over the next couple of years that we’ve
1:00:46 been talking about AI in the last 24 months.
1:00:50 Let’s go over some of the numbers, like AST Space
1:00:53 Mobile, as you say, up 2,500% in the past two years.
1:00:54 It’s just unbelievable.
1:00:58 You look at what they’ve done, they’ve got six
1:00:59 satellites in orbit.
1:01:03 They’re planning to launch another 45 satellites into
1:01:05 orbit by the end of next year.
1:01:10 So they’re getting there, building out the
1:01:11 satellite network.
1:01:15 But remember, you’ve got to keep in mind, 8,500
1:01:17 satellites out there are Starlink.
1:01:22 So, you know, they’re beginning to compete, but not
1:01:22 really.
1:01:26 Then there’s Rocket Lab, which is, Morgan Stanley
1:01:27 calls them the alternative to SpaceX.
1:01:30 They’re not really building a network, but they’re
1:01:33 launching stuff into space.
1:01:37 And, you know, they are becoming sort of the
1:01:40 second alternative to SpaceX if you want to launch
1:01:41 payloads into space.
1:01:43 You might go with Rocket Lab.
1:01:45 They have this Rocket Lab Neutron rocket.
1:01:48 That’s their flagship rocket, which is sort of trying
1:01:50 to compete with the Falcon Heavy.
1:01:53 So that’s another option.
1:01:57 But something I have been thinking about, you know,
1:02:00 this stock performance is absolutely crazy.
1:02:04 I mean, 2,500% in the past two years, Rocket Lab up
1:02:05 600%.
1:02:08 Like, this is crazy town.
1:02:11 And you look at, like, the valuations.
1:02:15 I mean, you’ve got Space Mobile that is trading at 500
1:02:18 times expected sales for 2025.
1:02:22 So these aren’t fully rational valuations.
1:02:26 And it does remind me of this dynamic that we’ve discussed
1:02:28 about private versus public.
1:02:32 And that is, there’s only one space company that people are
1:02:34 really excited about, and it’s SpaceX.
1:02:37 And you can’t invest in it because it’s a private company,
1:02:39 unless you are an accredited investor, unless you know someone
1:02:40 who can get you some shares.
1:02:43 But when you look at the stock performance of these companies,
1:02:47 it does feel as though those are just the names that are
1:02:48 associated with space.
1:02:52 And people who are investing in the public markets just want to
1:02:54 get some sort of exposure to space.
1:02:57 So I do think that these are the kinds of stocks that you want
1:02:58 to be really wary of.
1:03:02 Like, this kind of explosion is not really tied to the fundamentals.
1:03:05 To me, it’s more of a momentum thing where there’s so much
1:03:06 demand for space.
1:03:08 And we all agree, space is a big deal.
1:03:09 It’s going to be important.
1:03:14 But the one company that you’d want to invest in, you can’t invest in.
1:03:19 So you go to these competitors, which, you know, are making some progress.
1:03:22 But let’s be real, they don’t hold a candle to SpaceX.
1:03:24 Mia pulled together some research.
1:03:26 There’s some of the component suppliers for satellites.
1:03:30 Honeywell, they make antennas, high-speed radios, data transmitters, receivers
1:03:33 that allow satellites to send and receive information.
1:03:37 Their clients include Boeing, Airbus, Lockheed Martin, and SpaceX.
1:03:44 It also holds a majority stake in quantum computing firm Quantinuum, which recently raised about
1:03:48 $600 million at a $10 billion valuation from investors, including NVIDIA.
1:03:51 And their backlog grew 10% year-on-year.
1:03:53 It’s down 10% year-to-date.
1:03:56 Trades at three times sales, roughly in line with its five-year historical average.
1:04:01 So there’s a company that appears to be in the space, but hasn’t had the same sort of updraft.
1:04:05 Universal Microwave Technology, a Taiwanese firm that manufactures super high-frequency
1:04:11 electronics that handle the radio waves satellites use, and their components are used in satellites
1:04:13 themselves and also in the ground receivers.
1:04:14 It’s up 39%.
1:04:20 But to your point, space is obviously a really risky business.
1:04:24 Fewer than one in four venture-backed companies even make it to orbit with a vehicle.
1:04:29 I had dinner with a friend of mine who works at a large PE firm, and he said the cheapest
1:04:33 way to invest in SpaceX right now is through EchoStar.
1:04:41 That I think EchoStar either owns, has a stake in SpaceX, but he described it.
1:04:48 He basically said that EchoStar is, it’s not cheap though, it’s tripled in the last year,
1:04:52 is the cheapest way to own SpaceX.
1:04:54 That sort of describes the dynamic, right?
1:04:59 It’s like everyone wants to get to SpaceX, so they take these interesting diversions to get there.
1:04:59 I love this.
1:05:01 I just love seeing competition.
1:05:07 And the fact that these stocks, what happens is when these stocks go up 25-fold, you’re just going
1:05:13 to see a ton of venture activity in the space, a ton of human capital go in, and people trying to
1:05:17 figure out how to get shit into space and attract capital.
1:05:26 If any of these guys becomes somewhat formidable, gets within striking distance of SpaceX, that’s what probably
1:05:30 would motivate SpaceX to go public so they can run away with it on a capital basis.
1:05:32 But I think it’s super exciting.
1:05:36 And I think, and we were talking about this in the editorial call, I think we should start
1:05:39 following space kind of the same way we have been following AI.
1:05:41 Okay, let’s take a look at the week ahead.
1:05:44 We will see the consumer price index for September, despite the government shutdown.
1:05:48 We’ll also see earnings from Netflix.
1:05:53 We’ll see earnings from Tesla, from P&G, Coca-Cola, and Intel.
1:05:55 Any predictions, Scott?
1:05:58 Really interested in this Netflix deal.
1:05:59 And I know you talked about this.
1:06:00 We talked about it last night.
1:06:05 But basically, Netflix is partnering with Spotify and has essentially said, okay, let’s be honest.
1:06:09 Disney Plus and Hulu, that’s not our competition.
1:06:11 Our competition is YouTube.
1:06:16 And they’ve done a deal with Spotify for some of their original content.
1:06:20 I think it’s The Ringer, Bill Simmons, and some crime drama stuff.
1:06:23 And we’re going to run it on Netflix.
1:06:28 And I think that what you’re going to see in the next 12 months, I think you’re going to
1:06:35 see a lot of podcasts or what started as podcasts running, not only in streaming media, I think
1:06:40 the real home for them or the more opportune home is on cable networks.
1:06:40 Why?
1:06:46 If you think about cable networks, they’re actually still, if you didn’t know what amazing businesses
1:06:51 they were 10 years ago, and you just looked at them today, they still look like good businesses.
1:06:55 They’re declining, but they still spin off a lot of cash flow.
1:07:01 The problem with these businesses is not only on the revenue side, it’s that the expenses haven’t moved.
1:07:03 The expenses haven’t come down.
1:07:08 They dramatically need to decrease the costs of these shows.
1:07:10 And podcasts do that.
1:07:16 I mean, essentially, 25% of our listens are on a TV, streamed through YouTube.
1:07:21 But I can guarantee you, we cost a lot less than your traditional whatever.
1:07:25 If someone watches an hour of this on their TV right now off of YouTube, and then they flip
1:07:33 over and watch an hour of Wednesday or Breaking Bad on AMC or name your hour-long program, you
1:07:35 can bet this costs a lot less.
1:07:42 It won’t get nearly the viewership, but it’ll get more people in the core demo, and it’ll
1:07:46 be on an operating margin or profitability basis much more profitable.
1:07:52 So anyway, long-winded way of saying a dozen to two dozen of the top 50, maybe top 100 podcasts
1:07:57 are going to be running on cable channels and across streaming media.
1:08:03 This episode was produced by Claire Miller and engineered by Benjamin Spencer.
1:08:04 Our associate producer is Alison Weiss.
1:08:06 Mia Silveri is our research lead.
1:08:10 Our research associates are Isabella Kinsel, Dan Chalon, and Kristen O’Donoghue.
1:08:14 Drew Burrows is our technical director, and Catherine Dillon is our executive producer.
1:08:16 Thank you for listening to Prof G Markets from Prof G Media.
1:08:20 Tune in tomorrow for a fresh take on markets.
Scott and Ed dig into how America’s data center boom is driving up electricity bills — and what should be done about it. Then, they look at how Meta and OpenAI are responding to rising mental health concerns and consider who’s really responsible for fixing the problem. Finally, they look at two fast-rising challengers to SpaceX and the new investment opportunities emerging in the space race.
Subscribe to the Prof G Markets newsletter
Order “The Algebra of Wealth,” out now
Subscribe to No Mercy / No Malice
Follow the podcast across socials @profgmarkets
Follow Scott on Instagram
Learn more about your ad choices. Visit podcastchoices.com/adchoices
