The Biggest AI Announcements From CES 2025

AI transcript
0:00:04 One of the things at CES was last night: Jensen gave his big keynote,
0:00:06 AKA Tech Taylor Swift, right?
0:00:10 Like, you know, the second they announced him and like he was walking out on this
0:00:14 stage, people were like standing and cheering and screaming.
0:00:17 In my opinion, the bigger news and the things with the bigger
0:00:22 implications for the AI world are Project Digits, or just Digits.
0:00:26 You’re going to have a model that’s good enough to do anything you want.
0:00:28 So you’re saying that basically NVIDIA is going to be the big player and not
0:00:35 OpenAI? Hey, welcome to The Next Wave podcast. I’m Matt Wolf.
0:00:42 I’m here with Nathan Lanz, and I am reporting into this episode from CES in Las Vegas.
0:00:47 You know, in this episode, we’re going to deep dive into the future of AI as we’re
0:00:51 sort of seeing it unfold here at CES this year.
0:00:55 You know, we hear a lot about these companies like OpenAI and Anthropic and
0:00:58 Google and all of the big, massive companies building AI.
0:01:03 But who is the big winner, the company that everybody’s really going to remember when
0:01:04 it comes to AI in the future?
0:01:06 Is it really going to be NVIDIA?
0:01:10 Well, that’s what we’re going to discuss in this episode because I’m spending a
0:01:13 ton of time with NVIDIA out here learning what they’re up to.
0:01:16 They just put on a massive keynote where they dropped some huge announcements.
0:01:20 We’re going to get into all of that as well as some of the coolest tech I’ve
0:01:24 come across at CES and my overall thoughts on CES.
0:01:26 And Nathan and I, we’re just going to nerd out on this one with you.
0:01:28 So let’s just go ahead and dive right in.
0:01:34 I think maybe where we should start is probably the whole Sam Altman thing,
0:01:38 because that kind of came out like right before I got to CES, right?
0:01:42 He put out a blog post and essentially said, we know how to get to AGI.
0:01:43 We’ve got a path to it.
0:01:45 We know we’re going to hit AGI.
0:01:47 It’s going to be 2025 that we hit AGI.
0:01:49 Our new mission is ASI.
0:01:53 It basically makes it sound like AGI is no longer a big deal.
0:01:56 Like we definitely know how to reach AGI.
0:01:56 Yeah.
0:02:00 Yeah, just the tone of the letter was like super confident.
0:02:01 Yeah, right?
0:02:05 Like the next big deal is ASI, you know, artificial super intelligence.
0:02:09 Yeah, with Sam Altman though, I feel like the last like year,
0:02:13 he’s sort of been on this campaign to, like, dampen expectations, right?
0:02:16 Like he’s been kind of going around saying AGI is going to come.
0:02:19 It’s going to pass, and it’s not going to change anybody’s life
0:02:22 as much as, like, they think it’s going to change their lives, right?
0:02:25 So I feel like he’s kind of doing that sort of thing of like, we’re going to hit AGI,
0:02:27 but it’s not as big of a deal as you guys think it is.
0:02:30 That’s kind of the feeling I get from Sam lately.
0:02:32 Maybe, yeah. He did say some stuff too
0:02:35 that echoed what I had said previously, when we were talking about, like, you know,
0:02:37 is OpenAI in a bad spot or not?
0:02:39 I mean, you had one episode where we were kind of arguing back and forth on that.
0:02:42 And I was kind of saying like from a Silicon Valley perspective,
0:02:46 like when you do a startup, there’s always issues between co-founders.
0:02:49 And typically the more successful a company is and the faster things move,
0:02:53 actually, the more issues because just there’s such a divergence of what people want.
0:02:57 And also just egos get bigger and bigger and then the egos clash.
0:03:01 His letter says that it’s been like the best year of my life
0:03:06 and the worst year of my entire life, which is interesting.
0:03:08 Because last year was the year he got booted from OpenAI.
0:03:09 That wasn’t this year.
0:03:11 And he did talk about that with startups.
0:03:14 You know, he was at Y Combinator for a long time.
0:03:15 Startups that grow fast.
0:03:16 They always have team issues.
0:03:20 And typically the faster a startup grows, the more intense those are.
0:03:23 And he said this is possibly the best example of that,
0:03:26 because they’ve been growing faster than almost any company in history.
0:03:29 Right. And so, of course, there’s huge mistakes they’ve made.
0:03:31 And, you know, he didn’t say egos, but, you know, obviously egos got big.
0:03:33 And he mentioned he made mistakes as well.
0:03:37 But he did kind of like throw the people under the bus who had kicked him out, though.
0:03:41 He did make it clear that he thinks that they totally screwed him over
0:03:43 and screwed up by doing what they did.
0:03:47 Well, then now you have this year where he’s been battling with Elon, right?
0:03:51 So that’s like the new thing that he’s dealing with is Elon.
0:03:55 And now they’ve got, you know, Geoffrey Hinton, who’s on board trying to
0:03:58 also stop the for-profit, right?
0:04:03 That’s another piece of news that came out recently is that they are going to try
0:04:06 to switch to a for-profit from a nonprofit.
0:04:09 They’re trying to register it as like a Delaware corporation.
0:04:11 The nonprofit is still going to exist.
0:04:15 It’s just going to hold a ton of equity in the for-profit, right?
0:04:16 But you’ve got Elon Musk.
0:04:17 You’ve got Meta.
0:04:22 You’ve got Geoffrey Hinton, who, you know, is sort of like the godfather of AI.
0:04:25 All of these people pushing back, saying we should not let that company
0:04:27 switch to a for-profit.
0:04:30 He mentioned that whole situation as well, without saying names, talking about how
0:04:34 it’s natural when you’re growing this fast at such an important moment in history
0:04:38 that people, especially people who might be competitive with you,
0:04:42 might be trying to throw up roadblocks and cause issues.
0:04:44 So he kind of alludes to that without saying the names.
0:04:47 Yeah, he also talks about he believes that in 2025,
0:04:52 we are going to have agents that do meaningful work in organizations.
0:04:52 Right.
0:04:56 And he thinks that’s going to be a huge turning point with AGI and for AI in general.
0:04:57 And we’ve been hearing rumors of that,
0:05:01 but it sounds like they must have some kind of examples of that actually working,
0:05:04 of, like, agents going off and doing meaningful work for people.
0:05:06 And so that’s going to be exciting to see that.
0:05:10 And he talked about ASI, you know, like, hey, now that we know AGI is like a given.
0:05:12 We definitely know how to do that.
0:05:13 We need to be aiming for ASI.
0:05:16 He didn’t say that they know completely how to get there,
0:05:20 but his confidence of getting there seemed way higher than when he’s talked in the past.
0:05:21 Right, right.
0:05:23 Which had people on Twitter like, holy crap.
0:05:26 So like, obviously this test-time compute stuff with, like,
0:05:29 o1 going to o3 and how fast that’s improved.
0:05:35 They must be seeing signals internally that there’s no end in sight to those improvements.
0:05:37 That’s probably what’s going on.
0:05:40 Well, it sounds like they know how to get to AGI right now.
0:05:44 It’s probably mostly like an energy cost issue, right?
0:05:50 Like we saw with o3, you know, it got these high benchmarks on all of these tests.
0:05:53 But to do it, they had to spend thousands of dollars per task.
0:05:58 So I think like the real barrier right now is like, OK, maybe they’ve got AGI internally,
0:06:05 but it costs $10,000 per task to do it, which is not feasible for like any individual
0:06:08 and maybe barely any businesses, right?
0:06:12 So it’s like, I think they do actually know how to get to AGI.
0:06:15 I think AGI is probably more on the inference side, right?
0:06:19 Where maybe they have some more trainings in them with, like, a GPT-5 or something like that.
0:06:25 But it’s really that ability to like think and double check and triple check and quadruple
0:06:29 check its work and then give you its answer on the inference phase,
0:06:31 which is going to get us to AGI.
0:06:33 That’s sort of the sense that I’m getting from it all.
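As a rough intuition for what "think and double check and triple check its work" buys you at inference time, here is a toy majority-vote sketch in Python. The model call is mocked with a function that is only right 70% of the time, and real test-time compute in o1/o3 is far more sophisticated than this simple self-consistency voting, so treat this purely as an illustration of the idea:

```python
import random
from collections import Counter
from itertools import count

def answer_with_self_check(n: int = 15) -> int:
    """Toy version of spending extra inference compute: sample the
    (mocked) model several times and keep the majority-vote answer."""
    rng = random.Random(0)
    wrong = count(100)  # each wrong answer is a distinct made-up value

    def mock_model() -> int:
        # Stand-in for a real model call: right 70% of the time.
        return 42 if rng.random() < 0.7 else next(wrong)

    votes = Counter(mock_model() for _ in range(n))
    return votes.most_common(1)[0][0]

# A single sample is wrong 30% of the time, but the majority vote
# across 15 samples recovers the right answer.
assert answer_with_self_check() == 42
```

The point is just that spending more compute per question (more samples, more checking) raises reliability without retraining the model, which matches the inference-side framing above.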
0:06:37 Yeah. And with ASI, he’s talking about like, you know, solving cancer and like
0:06:41 environmental issues and just any kind of problem humanity has,
0:06:45 having almost like Godlike intelligence that can help solve those problems.
0:06:50 So I thought it was exciting that now he’s talking about it in terms of, like, business
0:06:55 use cases and AGI being useful for companies. Like, that’s a given for them now.
0:06:56 AGI is going to be useful.
0:06:58 It’s going to be doing meaningful work for people.
0:07:03 And their main focus is moving on to ASI: how can you now
0:07:05 understand the world around us better?
0:07:07 How can you understand physics better?
0:07:11 How can you invent new things, learn or discover new science?
0:07:14 And so that’s just exciting that like as a company, it’s just like, we know AGI.
0:07:15 It’s like an energy thing.
0:07:17 It’s a little bit more improvements to the algorithm.
0:07:18 How do we do ASI?
0:07:20 And we think we might know. That’s just exciting.
0:07:23 Like the confidence level is so much higher than the previous letters.
0:07:26 To me, it seems like if they can figure out AGI,
0:07:30 ASI doesn’t feel too far off after that, right?
0:07:36 Because like now you have the AGI who’s helping you figure out how to get to ASI.
0:07:38 Right? It’s like at that point, once you have an AGI,
0:07:42 you don’t necessarily need a whole bunch of humans figuring out
0:07:45 how we get to the next phase beyond AGI.
0:07:49 You have an AGI that helps you figure out how do we get to the next phase beyond AGI.
0:07:52 Right. I mean, I would be surprised if internally,
0:07:57 they’re not already using o3 and things like that to help them make progress faster.
0:08:00 I mean, you know, since the last video we did where I showed you my game demo,
0:08:05 I’ve been using o1 Pro more and more and I’m just blown away by, like, what it can do.
0:08:09 Yeah. I wonder what’s going to happen with a lot of the other models, right?
0:08:14 Because it seems like Anthropic and Gemini and all these other tools that are out there,
0:08:18 you know, OpenAI has kind of shown their cards now, right?
0:08:21 So don’t you think that like maybe the next phase of Claude
0:08:26 is going to have more of that like logic and double checking itself?
0:08:28 I think that’s the next phase for all of them.
0:08:32 Well, I think the big question is, did OpenAI come up with some kind of secret sauce
0:08:36 that people are not aware of? Google and other people are kind of saying
0:08:37 that they understand how to do it.
0:08:40 The open source community is saying they understand how to do what OpenAI is doing.
0:08:45 OpenAI has repeatedly kind of hinted that, yeah, maybe you do, maybe you don’t.
0:08:49 We have come up with some interesting techniques that have made this possible.
0:08:51 It’s not just super straightforward.
0:08:54 Is that just messaging or do they really have some kind of secret sauce
0:08:56 where, yeah, the other people think they know how to do it,
0:08:59 but there is something better about the way OpenAI is doing it
0:09:01 that will take the other ones time to discover.
0:09:03 Yeah, I’d say it’s probably a blend.
0:09:05 It’s probably like sort of in between the two.
0:09:06 Right. Yeah.
0:09:08 So like I mentioned, I’m at CES right now.
0:09:11 So if you’re watching this on the video that explains why I’m in a hotel room.
0:09:15 But one of the things at CES was last night, Jensen gave his big keynote,
0:09:18 AKA Tech Taylor Swift, right?
0:09:22 Like everybody like fan girls out whenever Jensen Huang’s in the room.
0:09:26 And, you know, the second they announced him and like he was walking out on this stage,
0:09:31 people were like standing and cheering and screaming and, you know, flashing them.
0:09:33 And, you know, no, not that flash.
0:09:35 Remember, that happened, right?
0:09:38 There are those pictures of Jensen signing a woman’s chest.
0:09:40 So he’s a rock star of our generation.
0:09:41 Like we don’t have many of them.
0:09:43 So, you know, he’s one of them.
0:09:44 Yeah, yeah, yeah, exactly.
0:09:49 But he came out, you know, the big thing that everybody at CES was
0:09:52 I think the most excited about because it’s the consumer electronic show
0:09:57 was probably the fact that we got these NVIDIA RTX 50 series, right?
0:10:05 He said that the NVIDIA 5070 is just as powerful as the 4090s that you get now.
0:10:08 The 4090s cost 1,600 bucks right now.
0:10:15 These new 5070s cost 550 bucks and are just as powerful as the 4090s.
0:10:19 Supposedly, there’s some controversy around that as well.
0:10:22 People are saying it’s like close to being as powerful.
0:10:26 But then there’s some like upscaling that happens on the GPU
0:10:27 that the old ones don’t have.
0:10:31 So it’s not quite as good and there’s some additional latency issues.
0:10:34 I don’t know all of like the technical issues or anything yet.
0:10:38 But a lot of people are going, he’s claiming it’s the same as the 4090s.
0:10:40 But if you look at the specs, it’s not technically the same.
0:10:45 But anyway, that’s what most of the people, I think, were here to hear about.
0:10:47 That got like huge cheers.
0:10:48 Everybody was excited when they talked about that.
0:10:52 In my opinion, the bigger news and the things with the bigger
0:10:58 implications for the AI world are more about Project Cosmos and Project Digits.
0:11:00 So digits, that’s the personal computer.
0:11:01 Is that correct?
0:11:03 That’s the personal like super computer.
0:11:07 OK, and I feel like that is kind of a big deal, right?
0:11:12 Like that is something that over time, I wonder if that’s going to affect
0:11:16 OpenAI’s and Anthropic’s and Google’s business models, right?
0:11:20 Because pretty soon, anybody can have a computer that’s like just as
0:11:23 powerful as these cloud computers that a lot of these other companies are
0:11:25 running, sitting on their desktop.
0:11:28 And, you know, if you’ve got enough money, you can stack them together.
0:11:31 And right now they cost three grand each, you know, whenever they’re released.
0:11:32 I don’t think they’re out yet.
0:11:34 But when they’re released, they’re going to be about three grand each.
0:11:39 And NVIDIA has their own version of, like, Llama that they fine-tuned and
0:11:42 optimized to work with their own computers.
0:11:46 But it makes me wonder, like, is the future of all of this stuff going
0:11:47 to be like the edge computing?
0:11:48 Is it going to skip the cloud?
0:11:51 Are people going to even want to use things like OpenAI when they can
0:11:53 just have their own box?
0:11:55 And right now, yes, they’re three grand.
0:11:58 But, you know, what are they going to be two years from now?
0:11:59 Look at what they did with the RTX series.
0:12:02 The 4090 is sixteen hundred dollars to this day.
0:12:07 The 5070 is now five hundred and fifty dollars and supposedly just as powerful.
0:12:11 So if these things are three grand right now, maybe next year, they’re fifteen
0:12:14 hundred bucks, maybe the year after that, you have a personal super computer
0:12:18 in your house that’s seven hundred bucks that runs models as powerful as
0:12:19 ChatGPT. I don’t know.
0:12:24 So I wonder, like, how does that impact a lot of these big AI companies?
0:12:25 Yeah, I don’t know.
0:12:28 You know, you talked about this before about in the future, are people going
0:12:31 to be using something like ChatGPT or they’re going to have the models
0:12:32 just running in their own house?
0:12:33 Yeah, it’s a good question.
0:12:35 Like, I haven’t really thought enough about that.
0:12:38 It does feel like OpenAI will always have the best models.
0:12:42 But yeah, you might have for like daily usage that the models get so good
0:12:45 and that the compute gets so cheap that, yeah, you could have your own private
0:12:48 one in your house for daily stuff because you really don’t want, you know,
0:12:52 the other companies just knowing like, oh, I’m talking to the AI all day.
0:12:54 We’re like, yeah, they can hear every single thing I’m saying.
0:12:58 You know, to use these little, like, supercomputers,
0:13:00 you don’t even need to be connected to the internet, right?
0:13:03 You could be like completely offline and just wired into it.
0:13:05 And it will run all the inference on there.
0:13:08 You know, maybe someday you’ll even be able to like train your own
0:13:10 AIs and fine-tune on these things as well.
0:13:13 So like, yeah, I don’t know.
0:13:15 I feel like that could be more of the future.
0:13:19 I mean, I do think OpenAI and, you know, maybe Anthropic and Google,
0:13:22 they will always be the more state of the art, the more cutting edge,
0:13:26 the furthest along further than what you could run at your house.
0:13:31 But at some point you get to this level where the models are good
0:13:33 enough for everything you want them to do.
0:13:37 So do you really need the most cutting edge state of the art model, right?
0:13:41 Unless you’re, like, using it for, like, enterprise, unless you have, like,
0:13:44 really, really massive needs where you really want OpenAI or Google
0:13:47 or Anthropic or one of these companies sort of managing it
0:13:49 and making sure you’re always state of the art.
0:13:52 I feel like we’re going to get to a point where you’re going to have
0:13:55 a model that’s good enough to do anything you want.
0:13:58 Run your own agents, pretty much anything you need to do.
0:14:01 You can do on this computer at your house
0:14:03 without actually connecting to one of these cloud companies.
0:14:06 So you’re saying that basically NVIDIA is going to be the big player
0:14:09 and not OpenAI. That is interesting.
0:14:13 I could imagine a future where AI voice like talking to the AI
0:14:16 and being able to interact with your calendar and your email system
0:14:17 and all this kind of stuff.
0:14:19 That’s going to be very simple, very soon.
0:14:23 And so I can imagine you would want all that to be on like a local machine
0:14:25 that you’re running and then you probably could even have the AI help
0:14:29 kind of manage your privacy in terms of like, OK, I want to keep this
0:14:33 information private, but I’m OK with you interacting with ChatGPT.
0:14:37 And I want you to help kind of manage what I’m willing to share with it and what I’m not.
0:14:40 So if you’ve got, like, a private, personal, you know, AI interacting
0:14:43 with the more powerful companies like ChatGPT, I can imagine that.
0:14:45 Like that makes a lot of sense to me.
0:14:48 Yeah, I mean, I do think we’ll get to a point where like the AIs sort of
0:14:51 interact with other AIs. I think that’s kind of inevitable, right?
0:14:56 So it’s like maybe your own personal computer, like in the same way,
0:15:01 if you use the new version of Siri that’s on the iPhone 16 Plus or Pro or whatever.
0:15:07 Right. Like if you use Siri, it’s got its own like edge AI on the phone
0:15:12 where it can answer questions and do some really basic responding right on the phone.
0:15:17 But if it’s a little bit more complex of a question, it’ll say, can we ask OpenAI?
0:15:20 Right. So I think it’ll probably be that type of thing.
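That hand-off pattern, where a small on-device model takes easy requests and a cloud model takes hard ones, can be sketched roughly like this. The complexity heuristic, the threshold, and the function names are all invented for illustration; this is not how Siri or any real edge stack actually decides:

```python
def estimate_complexity(prompt: str) -> float:
    """Crude stand-in for a real difficulty classifier:
    longer, multi-question prompts score higher (capped at 1.0)."""
    score = len(prompt.split()) / 50.0
    score += prompt.count("?") * 0.2
    return min(score, 1.0)

def route(prompt: str, threshold: float = 0.5) -> str:
    """Send easy prompts to the local edge model,
    hard ones to a cloud model (both names are placeholders)."""
    if estimate_complexity(prompt) < threshold:
        return "edge"
    return "cloud"

# A short factual question stays on-device...
assert route("What time is it?") == "edge"
# ...while a long multi-part request escalates to the cloud.
long_prompt = "Can you compare these? " + "word " * 40
assert route(long_prompt) == "cloud"
```

A real system would use an actual classifier (or the edge model's own confidence) rather than word counts, but the shape of the decision is the same: handle locally when you can, escalate when you can't.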
0:15:22 Yeah. But yeah, I think it’s just interesting to think about, like,
0:15:28 do a lot of these big AI companies stay really like consumer tools
0:15:33 far into the future, or does the consumer stuff move more to these like the hardware
0:15:35 that you own at home? I don’t know.
0:15:40 I feel like NVIDIA presented a future where maybe it is more like edge compute
0:15:42 like at home, you have your own devices.
0:15:44 It’s like, the question is, will people ever care about that?
0:15:46 It’s like, yeah, it’s kind of like the thing with like Bitcoin.
0:15:47 Like there’s so many things that Bitcoin is better at,
0:15:49 but there’s also a lot of user experience issues.
0:15:52 And like if people are just, like, used to going on ChatGPT and they don’t have
0:15:56 to buy all the hardware and it’s just, you know, if they can solve that.
0:15:59 Yeah, yeah. I mean, everybody’s going to prioritize different things, right?
0:16:02 Like right now, big companies, if they wanted to,
0:16:07 they can build their own server farms and run everything locally if they wanted to,
0:16:10 or they can cut costs and run it on the cloud.
0:16:13 And if you’re running it on the cloud, there are some inherent risks, right?
0:16:17 If you lose internet connectivity, do all of your systems go down for the day
0:16:19 because you can’t access the cloud?
0:16:25 Do you know exactly what information your cloud provider is able to see on your stuff?
0:16:27 Right? There are like issues with the cloud.
0:16:31 So it really depends on like what the company or the people are going to prioritize.
0:16:36 You know, us being like techie AI type people will probably always continue
0:16:40 to sort of be on the bleeding edge and try it all and, you know,
0:16:43 use both versions of it to see what can do what.
0:16:47 Yeah. I mean, my use case right now, how I’ve been using AI recently,
0:16:49 kind of fits with the whole idea of having your own
0:16:53 private one for just, like, conversational stuff, and then for work,
0:16:56 having the strongest models. Because, like, right now,
0:16:59 when I have been, you know, getting really hands-on with AI
0:17:02 by, like, developing this game demo, what I’ll do is I’ll be having
0:17:06 o1 Pro generate code for, like, new features and things like that.
0:17:08 And sometimes it takes it like five to seven minutes.
0:17:11 And so while it’s doing that, I’m often going to Claude
0:17:13 and then having conversations with it.
0:17:14 You know, I use voice.
0:17:17 Like I have a button that like transcribes what I say in voice to text.
0:17:20 I’ll be sitting there talking and then just like going back and forth
0:17:22 on different ideas and updating that specs document while I’m waiting
0:17:24 for the code to finish.
0:17:26 Right. And so I can definitely imagine like that in the future,
0:17:29 where like for like daily use, you could have your own private computer.
0:17:31 You’re chatting with it, but then when you want to do some hard work,
0:17:35 maybe you send it off to the super ASI, which may take a minute,
0:17:38 may take an hour, you know, if it’s something super complicated.
0:17:40 And so I can see that kind of world.
0:17:42 Yeah. And I think if you’re building things like spec sheets
0:17:45 and you say like, this is how it should perform.
0:17:47 And this is what this screen should have on it.
0:17:49 And this is what this screen should have.
0:17:51 And when you press this button, this is how it should react.
0:17:53 And you have all of that kind of stuff.
0:17:58 I imagine these AIs are going to be able to test to make sure everything works.
0:18:01 And it’s like own virtual environment, which, you know, sort of brings me to
0:18:05 like the other big thing that NVIDIA talked about, which was their project Cosmos.
0:18:09 It wasn’t, you know, really the exact same thing we’re talking about here.
0:18:12 But they were talking about this project Cosmos as like the, you know,
0:18:16 the digital twin concept, the sort of virtual world where you can put
0:18:21 self-driving cars or you can put robots or whatever inside of these
0:18:25 virtual worlds, test them in the virtual worlds, train them in the virtual worlds.
0:18:28 And then once it’s working perfectly in these virtual worlds,
0:18:31 then you deploy them into the real world, right?
0:18:33 And they’ve already sort of announced that.
0:18:37 I was at NVIDIA GTC last year and they were already talking about that
0:18:39 as, like, part of their Project GR00T and stuff.
0:18:42 But I think the new thing about Cosmos, this was one thing I was a little
0:18:46 unclear about. I was talking to somebody after the keynote and we were kind of
0:18:48 confused about like this whole digital twin thing.
0:18:52 Jensen’s been talking about that for the last like, you know, year and a half,
0:18:57 two years now, like how is this Cosmos thing that they announced this week
0:19:02 different? And the major difference that I noticed was that he was actually
0:19:06 entering prompts, just like you’d enter a ChatGPT prompt or just like you’d
0:19:11 enter a prompt in, like, Stable Diffusion or Midjourney, to give it scenarios
0:19:16 to try, right? So he was, let’s say, for example, putting a self-driving car
0:19:21 inside of this virtual environment and then he was prompting things like, oh,
0:19:27 there’s a snowstorm and a ball rolls into the street and what should the car do?
0:19:30 Run every possible scenario that the car can do.
0:19:33 And let’s figure out the optimal scenario, right?
0:19:37 They were actually typing prompts into the prompt box to generate different
0:19:40 scenarios. And to me, that’s really, really interesting.
0:19:43 It’s like, there is a Black Mirror episode that kind of does this with, like,
0:19:48 the whole, like, dating scenario where it plays out every possible scenario and
0:19:51 then finds, like, a match for somebody, right?
0:19:56 That’s essentially what this Cosmos can do in like a self-driving car environment
0:19:59 or robots or, you know, whatever you want to test it with, that’s what it can do.
0:20:05 You build this environment, it plays out every possible scenario it can imagine
0:20:10 and then gives you the optimal solution for, you know, whatever scenario you
0:20:14 enter, which is pretty mind blowing to think about, but that’s kind of what
0:20:18 Project Cosmos is doing and they’re open sourcing it.
0:20:21 They literally put it up on GitHub so any companies can go and use it.
0:20:24 So that’s pretty wild as well.
0:20:25 That is wild.
0:20:29 Like I imagine you’re using Claude or ChatGPT or whatever to help
0:20:32 generate the prompts, even like, what are all the things we should test?
0:20:34 It can help come up with all those prompts.
0:20:36 And so you could have AI actually coming up with all the scenarios to come up
0:20:38 with, you know, a million scenarios to test.
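As a toy illustration of that exhaustive scenario sweep, a script can enumerate every combination of conditions and keep the best-scoring action for each scenario. The weather conditions, events, actions, and safety scores below are all made up for illustration, and none of this uses the actual Cosmos API:

```python
import itertools

# Hypothetical scenario variables you might prompt into a simulator.
weathers = ["clear", "rain", "snowstorm"]
events = ["ball rolls into street", "pedestrian crosses", "nothing"]
actions = ["brake hard", "slow and swerve", "maintain speed"]

def score(weather: str, event: str, action: str) -> float:
    """Invented safety score: braking in a hazard is best,
    maintaining speed through a hazard is worst."""
    if event == "nothing":
        return 1.0 if action == "maintain speed" else 0.5
    base = {"brake hard": 0.9, "slow and swerve": 0.7, "maintain speed": 0.0}[action]
    return base * (0.8 if weather == "snowstorm" else 1.0)

# Exhaustively run every combination and keep the best action per scenario.
best = {}
for weather, event in itertools.product(weathers, events):
    best[(weather, event)] = max(actions, key=lambda a: score(weather, event, a))

assert best[("snowstorm", "ball rolls into street")] == "brake hard"
assert best[("clear", "nothing")] == "maintain speed"
```

In the real thing, the "scenarios" would come from prompts (or from another model generating prompts, as suggested above), the simulator would be a physics-accurate world model, and the scoring would come from the simulation outcome rather than a hand-written table.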
0:20:42 Yeah. Yeah. And I mean, coming back to like the game example that you’re doing,
0:20:47 it’s not necessarily like the same sort of virtual environment, but I would
0:20:52 imagine it could go and play through your game, run all possible scenarios
0:20:56 and find any potential issues with the game that might need to be fixed.
0:21:00 Right. Think about, like, a long, let’s say, like, a platformer game
0:21:03 or something that would typically take somebody 10 hours to beat, you know,
0:21:07 what happens if there’s a bug in the game that’s like nine hours in that maybe
0:21:12 you missed because you didn’t hit a certain part of the level and you never
0:21:16 spotted that bug. Well, is this going to be able to like preempt any potential
0:21:18 bugs that might come out in a game?
0:21:21 Because it can run through every potential scenario of that.
0:21:23 I think you’ll get there.
0:21:25 I was actually talking to my wife about this the other day.
0:21:28 I was like, you know, play testing for games is going to get really easy soon.
0:21:31 For any software, it’s going to get really easy soon because you’ll be
0:21:34 able to have like probably through computer use first, if I had to guess,
0:21:38 because you’ll be able to have like the computer actually interact with the
0:21:41 thing, whether it’s a game or an application and test different scenarios
0:21:44 and then see where bugs are, log them, possibly even then look at the code
0:21:46 base and like identify the bug and solve it.
0:21:49 But I assume at some point you could just simulate the entire thing.
0:21:50 Yeah, you can do that for anything.
0:21:52 You can do that for architecture.
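The automated play-testing loop described above boils down to replaying many input sequences and logging the ones that crash. Here is a minimal fuzz-style sketch, where `game_step` is a made-up stand-in for the real game with a deliberately planted bug:

```python
import itertools

def game_step(state: int, action: str) -> int:
    """Stand-in for the real game: a deliberately planted bug
    crashes when you jump at position 3. Walking moves forward."""
    if action == "jump" and state == 3:
        raise RuntimeError("physics glitch at position 3")
    return state + (1 if action == "walk" else 0)

def fuzz(max_len: int = 5) -> list:
    """Replay every action sequence up to max_len and log the crashers."""
    crashes = []
    for seq in itertools.product(["walk", "jump"], repeat=max_len):
        state = 0
        try:
            for action in seq:
                state = game_step(state, action)
        except RuntimeError:
            crashes.append(seq)
    return crashes

crashes = fuzz()
# Every logged crash walks to position 3 and then jumps there.
assert ("walk", "walk", "walk", "jump", "walk") in crashes
assert all("jump" in seq for seq in crashes)
```

A real agent would drive the actual game through its UI (the "computer use" approach mentioned above) instead of calling a step function directly, and it would prioritize promising sequences rather than brute-forcing all of them, but the core loop of try, detect failure, log is the same.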
0:21:55 I think you still have real human game testers though, right?
0:21:58 Because like you still need people to tell you whether the thing’s fun or not.
0:22:00 Oh, yeah, of course.
0:22:01 Yeah, it’s like, this is super fun.
0:22:03 That’s just because you like hard challenges.
0:22:04 You like hard problems.
0:22:06 Yeah, yeah, yeah.
0:22:10 So right now the day that we’re recording this is actually basically
0:22:15 the first day of CES, but I did go and sort of speed run the convention center
0:22:18 and try to see as much as I can see while I was out there.
0:22:20 Some other interesting stuff.
0:22:25 I did sit in on a Samsung keynote and Samsung is all about like smart devices.
0:22:31 Some of this stuff that they’re talking about feels almost like novelty.
0:22:34 Like that’s cool, but like are people really going to use it once it’s
0:22:36 installed in their house, right?
0:22:40 Like they had stuff like AI in your refrigerator where it’ll take pictures
0:22:45 inside of your refrigerator and then recommend things for you to buy based on
0:22:49 like AI, it’ll be like, oh, it looks like they’re running low on eggs or whatever.
0:22:51 And it’ll let you know when you’re running low on things.
0:22:54 They have, like, digital displays on the refrigerator. The example they were
0:22:56 giving was, as you wake up in the morning,
0:22:59 while you’re making breakfast, you can see right on your refrigerator what
0:23:00 the weather is going to be like.
0:23:03 And you can quickly look at your calendar and they’ve been doing that
0:23:06 stuff for like 15 years now, but now it’s using AI.
0:23:07 Yeah, yeah, yeah.
0:23:10 I mean, they, you know, I remember I’d go to CES like over 10 years ago.
0:23:13 They had that kind of stuff then you see the fridge and you see, you know,
0:23:15 it’s got all this stuff on it and then no one uses it.
0:23:18 I think that stuff eventually will happen, but it has to get so seamless
0:23:20 and so tied into all the systems you’re already using
0:23:22 that you don’t even think about it.
0:23:26 Like, you know, some of the examples they were giving though was like more
0:23:30 of like the agentic style of like it could look in your fridge, see that
0:23:34 you’re low on eggs and then send that to Instacart and then, you know, eggs
0:23:37 will just show up at your door without you having to think about it.
0:23:37 Right.
0:23:40 So like that’s sort of where some of the AI stuff would play in.
0:23:44 I don’t know if you saw that like demo that I did on my live stream
0:23:47 where I told Amazon to buy me some toilet paper and then it racked up
0:23:50 $600 worth of toilet paper on my Amazon card.
0:23:52 There need to be some kind of control system.
0:23:54 Yeah, that’s what I worry about with that kind of stuff.
0:23:58 It needs to seamlessly tie into the existing AI systems people are using.
0:24:00 I’m not using AI voice as much as I expected I would, but I think
0:24:03 eventually I will, if they could tie into some kind of system
0:24:07 like that. And then ChatGPT might say, hey, you have a notification
0:24:10 from your fridge or a notification from email
0:24:13 of things I think are important or that you need to make a decision upon.
0:24:16 And then you can kind of through natural voice decide upon how you want to
0:24:17 deal with things like that.
0:24:18 Yeah. Oh, yeah.
0:24:19 If I need eggs, just freaking get the eggs for me.
0:24:20 Don’t ask me.
0:24:24 If somebody’s asked me, you know, to set up an appointment,
0:24:27 no, I don’t like appointments, or whatever it is.
0:24:29 Just like deal with all those kind of things and like kind of set
0:24:30 the rules set using the AI.
0:24:33 I think we’re, like, probably a year or two away from, like, way more
0:24:36 personalization like that and that kind of stuff can make a lot of sense.
0:24:40 I would say like CES this year from what I’ve seen so far, I haven’t even been
0:24:44 over to the Venetian Expo yet, which is like a whole other thing, right?
0:24:47 The Venetian Expo, they have this thing called Eureka Park, which is only
0:23:49 companies that are a year old or newer, right?
0:24:54 So that’s all the like cutting edge, brand new, like early startups over there.
0:24:55 And I haven’t seen what’s there yet.
0:24:56 So I haven’t explored that yet.
0:25:03 But what I’ve seen so far has been a lot of like self-driving car tech, tons
0:25:05 and tons of self-driving car tech here.
0:25:09 And then like internet of things blended with AI, right?
0:25:15 Taking like the IOT stuff and then adding AI like the example of, oh,
0:25:16 you’re, you’re out of milk.
0:25:19 Send that to Instacart, let Instacart order it for you.
0:25:24 They had some other examples of like a Samsung 2-in-1 washer and dryer and it
0:25:30 could like automatically sense how dry your clothes are and then like make
0:25:33 it wash for longer if they were still wet.
0:25:34 Mine in Japan already does that.
0:25:35 Don’t they already do that in America?
0:25:36 I don’t like mine.
0:25:40 Yeah, there’s like some other AI elements to it where you can like communicate
0:25:46 through like Siri or whatever or like your Amazon Alexa or things like that
0:25:49 to tell it to do stuff and it’s all connected.
0:25:53 I did watch the Samsung keynote, but you can tell how impactful it was because
0:25:56 I’m already forgetting some of the stuff that they were telling us during it.
0:25:58 But it was a lot of that kind of stuff.
0:26:03 It was a lot of like smart washer and dryers, smart refrigerators, you know,
0:26:06 smart lighting systems.
0:26:10 They were talking about systems that control like your heating and air.
0:26:15 Maybe at night you’re wearing like your Samsung Galaxy ring and it notices
0:26:16 that you’re like really, really hot.
0:26:19 So it’ll like actually turn up the AC for you.
0:26:24 And the idea is less and less like prompting and asking AI to do things.
0:26:28 And AI just sort of figuring out what it needs to do and do it for you.
0:26:29 Right.
0:26:31 Like I think that’s where they’re trying to go with a lot of the internet
0:26:36 of things stuff combined with AI is instead of me telling these things what
0:26:42 they should do next, AI will recognize, OK, in the past, he did it this way.
0:26:44 So let’s make sure we do it that way again.
0:26:47 Or he’s looking really hot while he’s sleeping.
0:26:49 Let’s turn on the AC for him.
0:26:54 You have smart blinds and you set an alarm for 8:30 and it raises
0:26:56 your blinds for you at 8:30.
0:26:57 Pretty sure you can already do that.
0:26:59 Different lighting modes would be kind of cool.
0:27:02 Like, you know, you like dim lights at certain times or, you know,
0:27:04 you could turn the lights red at certain times.
0:27:05 Yeah, yeah, yeah.
0:27:09 It’s all like internet of things combined with AI.
0:27:13 But the idea is they want it to be more preemptive.
0:27:17 They want it to be more, you don’t have to go and tell these various
0:27:18 things what to do.
0:27:23 They just know what to do next and give you the results you’re looking
0:27:25 for without you having to go and ask it to do that.
0:27:26 Right.
0:27:28 So there’s been a lot of that kind of stuff.
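The preemptive idea described above, where sensors trigger actions without the user prompting anything, can be sketched as a simple rules engine. The device names, readings, and thresholds here are invented for illustration, not any real smart-home API:

```python
# Hypothetical sketch of "preemptive" home automation: sensors publish
# readings, and rules map them to actions with no prompt from the user.
# All sensor keys and thresholds are made up for this example.

SLEEP_TEMP_THRESHOLD_F = 99.0  # assumed "running hot" body temperature

def preemptive_actions(sensors: dict) -> list[str]:
    """Map current sensor readings to a list of actions to take."""
    actions = []
    # Galaxy-ring-style example: user is hot at night, so cool the room.
    if sensors.get("ring_body_temp_f", 0) > SLEEP_TEMP_THRESHOLD_F \
            and sensors.get("is_night"):
        actions.append("turn_up_ac")
    # Smart-fridge example: out of eggs, so place an Instacart order.
    if sensors.get("fridge_eggs", 0) == 0:
        actions.append("order_eggs_via_instacart")
    # Washer-dryer example: clothes still damp, so extend the cycle.
    if sensors.get("laundry_moisture", 0) > 0.2:
        actions.append("extend_dry_cycle")
    return actions

assert preemptive_actions({
    "ring_body_temp_f": 100.2, "is_night": True,
    "fridge_eggs": 0, "laundry_moisture": 0.05,
}) == ["turn_up_ac", "order_eggs_via_instacart"]
```

In a real product the rules would presumably be learned from past behavior rather than hard-coded, which is exactly the "it did it this way before, do it that way again" behavior described in the episode.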
0:27:31 I think you need the automatic stuff with the ability through natural
0:27:34 language to modify how those things behave is what I think.
0:27:36 I think if it's all automatic,
0:27:38 some people are just going to hate that.
0:27:41 It's like, I want a bit more control over how I can tweak it.
0:27:42 I wonder how that’s going to work.
0:27:45 If you'll be able to do that with like ChatGPT and stuff at some point.
0:27:48 Yeah, at the end of the day, I think the way I felt about it,
0:27:51 sitting through some of these keynotes is I came out of the keynotes going,
0:27:53 that’s really, really cool tech.
0:27:55 I think it’s really, really impressive that we’re at a point
0:27:58 where we can do that stuff now. But other than that,
0:27:59 do I really care that much?
0:28:01 Do I really feel like it’s going to impact my life that much?
0:28:05 Do I really like want to go and buy the latest state of the art
0:28:11 washer and dryer combo or a fridge with a, you know, a giant 30 inch screen on it?
0:28:12 Not really.
0:28:16 Like I’m not sold myself yet, but I am impressed.
0:28:17 Right.
0:28:21 And that’s how I feel about a lot of what I’m seeing at CES this year.
0:28:23 Is I’m like, wow, that’s really, really impressive.
0:28:23 Do I want one?
0:28:26 Not yet, but it’s impressive.
0:28:31 I’m OK with all that stuff as long as we don’t have any kind of oppressive government.
0:28:35 It’s like, hey, you have a point system and you don’t have enough points.
0:28:38 Sorry, we don’t like what you said on social media.
0:28:39 So you can’t really.
0:28:43 It’s like, yeah, yeah, yeah, yeah, we’ve locked your fridge and you can’t eat
0:28:45 until you get your credit score up.
0:28:48 You must say something good about the current president.
0:28:51 Otherwise you’re going to starve.
0:28:51 I’m sorry.
0:28:52 That’s just how it is.
0:28:54 Yeah, yeah, but I don’t know.
0:28:55 Like I’m going to do some more exploring.
0:28:59 I actually encourage people to check out like the Nvidia keynote.
0:29:00 It’s really long.
0:29:03 I mean, it was like, I think his keynote went on for almost two hours last night.
0:29:06 I actually ended up leaving early because I had to be at another event.
0:29:09 So I left like five minutes before the keynote was over.
0:29:12 But the Samsung keynote was really, really interesting.
0:29:15 They did show like a lot of examples of how this stuff will like all tie together
0:29:21 and how, you know, it’s going to try to preempt what your needs are before you even prompt it.
0:29:24 So I think that’s where a lot of the things are going this year.
0:29:27 Last year, there was some really, really cool like TV tech.
0:29:29 I love seeing that kind of stuff.
0:29:31 Like I haven’t really seen much yet.
0:29:33 All I’ve seen is like bigger TVs, right?
0:29:38 I saw like a 127-inch OLED 8K TV or something
0:29:41 that looks really impressive when it’s right in front of you.
0:29:44 But, you know, it’s probably like a sixty thousand dollar TV or something.
0:29:45 Right.
0:29:48 Last year, they had like these transparent TVs that were really cool to see.
0:29:52 Not really a lot of use cases for like in the home,
0:29:54 but really interesting for like businesses.
0:29:57 Like some of the examples they showed last year was you can have
0:30:00 like a display case with like all of your pastries in it.
0:30:02 And then like on the display in front of it,
0:30:05 it’s actually got like digital labels of what each thing is.
0:30:09 And you can play like animations in front, like on the display case.
0:30:12 So it’s almost like using these clear TVs as your display case.
0:30:15 And I thought that was really cool for like businesses,
0:30:17 but who really wants a clear TV in their house?
0:30:20 Like to me, it seems like there’s more downside than upside to that.
0:30:22 Right. But none of that kind of stuff this year.
0:30:24 There’s a couple like flying car techs.
0:30:28 There was a flying motorcycle, like a two wheel motorcycle
0:30:32 that has like the four like drone propellers, like a quad copter
0:30:34 motorcycle that looked really interesting.
0:30:37 I mean, it kind of looks like a death trap because like the blades
0:30:39 are like really like right at face level.
0:30:43 If somebody wasn't paying attention, they'd walk right into the blades.
0:30:48 You know, they had the XPeng car that looks like a giant van.
0:30:52 But out of the back of the van is like a little personal drone that you can fly around.
0:30:55 They have that here, which is really, really cool to see.
0:30:58 Really, really cool to see that kind of stuff in person.
0:31:00 But all of it feels so much like concept still, right?
0:31:02 It feels like, oh, that’s really, really cool.
0:31:06 You can do that, but it's so far off from actually being anything
0:31:08 anybody's actually going to use in real life.
0:31:10 It’s crazy that the tangible stuff is AI.
0:31:16 It's like, if you told me 10 years ago that the tangible stuff at CES was the AI stuff,
0:31:18 It’s like, what are you talking about?
0:31:21 It was always the AI stuff that was just like, oh, yeah, that’s cute.
0:31:23 It doesn’t actually work. None of it works.
0:31:27 What I want to know is when you were coming to CES, let’s say 15 years ago,
0:31:30 did they have like flying car concepts and stuff back then?
0:31:32 Because I feel like they probably did.
0:31:34 It was just a little more of what we’re seeing now.
0:31:36 No, there was no flying car stuff.
0:31:41 You know, actually, I helped put a video game system in the back of a Toyota Prius
0:31:44 concept car in partnership with Alcatel Lucent.
0:31:46 I was doing this startup called GameStreamer at the time.
0:31:48 And so a lot of it was that kind of stuff.
0:31:49 It was like, you know, and it was kind of gimmicky.
0:31:50 It’s like, you got a car.
0:31:52 OK, now there’s a game system in the back of the car.
0:31:54 Yeah, yeah.
0:31:55 A lot of it was that kind of stuff.
0:31:57 And there was a lot of the smart fridges.
0:31:59 They were doing interesting things with lighting.
0:32:02 You know, I did see some stuff there that ended up being used in people’s houses,
0:32:06 like the stuff like controlling your lights with voice and stuff like that.
0:32:09 I did see that kind of stuff first at CES before I ever saw it in anybody’s house.
0:32:11 Yeah, so you did see some kind of things
0:32:13 that ended up being real, you know, that people would use.
0:32:14 Yeah, yeah.
0:32:18 Most of the cars that you see here feel like concepts that we may never see.
0:32:22 Some of them, I think, are actually on the roads, like Waymo is here.
0:32:24 You can take a look at what the Waymo cars are.
0:32:25 But if you go to San Francisco,
0:32:28 you could literally order one up and take a ride in one.
0:32:29 So they have that here.
0:32:34 Zoox is here, which is the Amazon-owned self-driving car company.
0:32:38 Honda showed off some like cool EV self-driving cars.
0:32:41 One of them almost looks like a Lambo or like maybe like a Lotus,
0:32:43 like a cross between like a Lotus and a Lambo or something,
0:32:45 which is really, really interesting.
0:32:47 Honda is kind of going down that path a little bit.
0:32:50 Before we wrap up here, I wanted to like go back to Nvidia for a second,
0:32:53 because there was one thing that I heard Nvidia talk about that I didn’t hear you mention.
0:32:54 It’s kind of hard to understand.
0:32:57 So I’m still trying to process like what exactly it means.
0:33:00 But Jensen talked about with this new graphics card,
0:33:04 the way that they’re able to achieve so much better performance,
0:33:07 partially it’s because they’re using AI for a lot of the processing.
0:33:07 Right.
0:33:09 And my understanding was he was saying almost like 90 percent
0:33:13 of the processing of like the graphics was being given to AI.
0:33:14 Yeah. Yeah.
0:33:17 And that’s why you can have a smaller chip that uses less power was because of that.
0:33:21 Yeah, he was talking about how it’s really interesting
0:34:25 because the GeForce graphics cards led to being able to create AI.
0:33:29 And now he’s using AI to be able to create better and better graphics cards.
0:33:33 Did you hear about the Switch 2, the rumors that have dropped around
0:33:35 the Nintendo Switch 2? No.
0:33:38 So basically what they're saying about the Nintendo Switch 2
0:33:41 is that they're expecting it to be announced in 2025.
0:33:45 And they’re expecting it to be able to play games in 4K.
0:33:48 You’re not going to be actually creating games in 4K.
0:33:52 The game developers are going to be basically creating games in 720p
0:33:56 or 1080p or whatever to keep the file sizes small enough to fit on a cartridge.
0:34:00 And then the switch itself is going to upscale it to 4K
0:34:05 and sort of like essentially guess the pixels in between to scale it up to 4K.
0:34:09 So my understanding of like what NVIDIA is doing is kind of
0:34:13 that same kind of concept is that it can take less data
0:34:17 and then imply the rest of the information using AI.
0:34:20 Yeah, again, this is really an oversimplification
0:34:21 and I don’t totally understand it.
0:34:25 But that’s sort of my understanding is that it can actually work with less
0:34:29 information and then imply and use AI to figure out the gaps.
0:34:31 And that’s sort of how it’s going to work.
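The "guess the pixels in between" idea Matt describes can be illustrated with the classical baseline: bilinear interpolation, where new pixels are just weighted averages of their neighbors. DLSS-style AI upscaling, as far as it's been described publicly, replaces this averaging with a learned model that predicts plausible detail instead. A pure-Python sketch of the baseline, for illustration only:

```python
# Classical 2x upscaling by bilinear interpolation: each new pixel is a
# weighted average of the nearest source pixels. AI upscalers aim to do
# better than this by *predicting* detail rather than averaging it.

def upscale_2x(image):
    """Upscale a 2-D grayscale image (list of lists of floats) by 2x."""
    h, w = len(image), len(image[0])
    result = []
    for y in range(h * 2):
        sy = y / 2                       # position in source coordinates
        y0 = min(int(sy), h - 1)
        y1 = min(y0 + 1, h - 1)
        fy = sy - y0                     # vertical blend weight
        row = []
        for x in range(w * 2):
            sx = x / 2
            x0 = min(int(sx), w - 1)
            x1 = min(x0 + 1, w - 1)
            fx = sx - x0                 # horizontal blend weight
            top = image[y0][x0] * (1 - fx) + image[y0][x1] * fx
            bot = image[y1][x0] * (1 - fx) + image[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        result.append(row)
    return result

small = [[0.0, 1.0],
         [1.0, 0.0]]
big = upscale_2x(small)
assert len(big) == 4 and len(big[0]) == 4
assert big[0][0] == 0.0            # original pixels are preserved
assert 0.0 < big[0][1] < 1.0       # in-between pixels are interpolated
```

The limitation is visible in the sketch: interpolation can only blur between known values, which is why a learned model that has seen millions of images can hallucinate sharper, more plausible detail from the same low-resolution input.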
0:34:36 But again, I’m not an engineer and so I’m not the person to explain that.
0:34:38 That makes sense to me.
0:34:41 But like the way he said it made it sound like it's almost
0:34:43 hard-coded into the hardware, which was kind of surprising to me.
0:34:46 Like he made it sound like this chip actually does that.
0:34:49 Like you don’t actually have to do anything different with the game
0:34:53 code to make that work that the chip itself is doing it in real time.
0:34:57 I was like, if that's true, that is mind-boggling.
0:35:01 Because like you could potentially like really scale up graphics in games.
0:35:05 You know, if you're getting like an order of magnitude improvement from the AI,
0:35:08 we’re going to see games that look like we’re walking around the real world,
0:35:09 like real, real world.
0:35:11 Yeah, I don’t think we’re far off from that already.
0:35:14 Yeah, yeah. But now with this, I mean, like we’ll probably be there
0:35:15 like in a year or two. It’s like it’s crazy.
0:35:18 You know, it might require like the 5090 or something like that
0:35:19 to get that level of realism.
0:35:22 But I bet you the 5090 will get you there, you know?
0:35:26 Yeah, you just need people to develop stuff that’s at that level as well.
0:35:27 Well, yeah, that was the most exciting thing to me.
0:35:30 You have to make it look pretty good, but then the AI fills in the gaps
0:35:31 to make it look better.
0:35:33 So in theory, that makes game development easier.
0:35:34 Yeah, that makes sense.
0:35:38 We've already got things like, let's say, Runway's video-to-video, right?
0:35:42 Where you can give it one video and then it can go and sort of re-skin
0:35:45 that video and make it look like a completely different video.
0:35:46 But still looks good.
0:35:49 Like you get something out of Unreal Engine 5 right now.
0:35:51 It looks pretty damn good, right?
0:35:55 Already. Now imagine it sort of re-skinning it, but with like an
0:35:58 upscaled 8K realism filter on it.
0:36:02 Right. And now that Unreal Engine game looks indistinguishable
0:36:04 from something that was just shot on a video camera.
0:36:07 Right. Imagine you could go back and remaster like old games
0:36:10 or old movies or anything to just like make them look super good.
0:36:13 Yeah, yeah, yeah. I mean, I don’t doubt it, right?
0:36:16 Like there'll probably be systems where you could take your old VHS tapes,
0:36:19 run them through and then upscale them to 4K.
0:36:21 Yeah, make it look, you know,
0:36:24 add some like cinematic stuff to it, to lighting and other stuff.
0:36:25 And just like, yeah, why not?
0:36:28 Yeah, I mean, I feel like with all of this AI stuff,
0:36:31 once it gets cheap enough that it’s like anybody can use it,
0:36:32 anybody can do whatever they want with it.
0:36:33 That’s going to get so weird.
0:36:34 Any of that is on the table.
0:36:38 You know, one thing that they do in Asia, at least I know it’s normal
0:36:42 in Japan, that when you have a wedding, you have somebody filming the wedding.
0:36:46 And at the same time, people are editing the videos during that time period.
0:36:48 So by the time you’re done with the wedding,
0:36:53 they present to you a really nicely edited video of the day in movie format.
0:36:57 So they try to present it like it’s a movie, like your wedding day was a movie.
0:36:59 Yeah, yeah, yeah, I’ve heard of that kind of thing.
0:37:01 But yeah, I don’t think it’s very common.
0:37:02 And there’s always like tears.
0:37:05 They were all like emotional about it because they make it so cinematic.
0:37:08 Like, oh, my God, look, you know, this moment that we were all part of today.
0:37:11 You got that for like your life. That’s going to be so weird.
0:37:14 Like take all the videos of your life and have AI automatically edit them
0:37:16 into this kind of like master film of your life.
0:37:19 Yeah, well, that job that you just mentioned is going to get way easier,
0:37:22 right? Because they're just going to feed it all of the video.
0:37:27 And then AI is going to figure out how to edit it into like a cinematic masterpiece.
0:37:29 Yeah, it’ll make it better, right?
0:37:33 Add cinematic lighting, light the right spots, edit in the right spots,
0:37:35 tie it all together with the right narrative.
0:37:37 Yeah, it’s going to be crazy.
0:37:39 Yeah, we’re entering a wild world.
0:37:41 The last few years I’ve come to CES
0:37:45 and like it definitely like expands my mind into what’s possible, right?
0:37:48 Makes you go, oh, crap, I didn’t know like we’re able to do that yet.
0:37:50 I didn’t know we were capable of that kind of thing yet.
0:37:52 I was still surprised by what Jensen said
0:37:55 about the predicting the different parts of the game.
0:37:57 You know, if that’s what they’re really doing in hardware,
0:37:59 because he had been saying that like, hey,
0:38:02 we’re eventually going to use AI to generate all the pixels.
0:38:04 Yeah. And I was like, well, that sounds kind of farfetched.
0:38:07 Maybe that’s like a five year kind of thing that sounds really hard.
0:38:09 Yeah, there’s already been demos of that.
0:38:12 I feel like we’ve already seen glimpses of it, right?
0:38:13 I really want to learn more about that
0:38:15 and like see if that’s really what’s going on.
0:38:16 But that’s happening.
0:38:18 That’s going to change, you know, so many things.
0:38:20 And also those improvements for the graphics cards,
0:38:22 those go back into the improvements in the AI.
0:38:23 So even for non entertainment,
0:38:26 all of these things end up improving AI in general,
0:38:28 which then goes into all businesses.
0:38:30 It’s interesting that somehow gaming has become
0:38:32 this part of the core loop of all business.
0:38:34 Yeah, I think maybe people who listen to the podcast are like,
0:38:35 why is Nathan talking about gaming?
0:38:38 It’s like, well, all that started from gaming,
0:38:39 like without gaming, we don’t have AI.
0:38:41 Oh, 100%. Right.
0:38:43 I’m just looking for the day to where like as a YouTuber, right?
0:38:47 Like I can record YouTube videos at 720p, right?
0:38:48 Edit the whole video at 720p
0:38:53 because it’s a hell of a lot faster to edit and render at 720p than it is at 4K.
0:38:57 Create my whole video at 720p, render it out, toss it on YouTube.
0:39:00 And then YouTube just upscales it to 4K for me.
0:39:01 You know, we’re like that close to that.
0:39:04 I think YouTube’s already testing AI upscaling.
0:39:06 And, you know, I think we’re going to get to a point
0:39:09 where we can record and edit our videos at 720p
0:39:12 and then post process them up to 4K.
0:39:15 And it’ll look like the whole thing was filmed in 4K.
0:39:16 And I’m excited about that.
0:39:19 I think that’ll make video creation a lot easier.
0:39:22 Yeah, I mean, eventually it’ll get good at editing even.
0:39:24 I think I think we’ll give it like samples
0:39:26 of like different YouTubers you like
0:39:28 or if you’re making a movie, different movies you like or whatever.
0:39:30 And it’ll be able to help you with editing.
0:39:32 Yeah, whoever’s editing this video, don’t worry.
0:39:34 Like, you know, I think it's going to take a few years before they...
0:39:36 I like it and I’m also scared of it.
0:39:38 That's everything with AI. It's like, it's all super exciting
0:39:39 and scary at the same time.
0:39:42 I’m more excited than scared, but I understand if anybody
0:39:45 could make YouTube videos that look like Mr. Beast videos
0:39:49 or whatever, like, does that kind of devalue what Mr. Beast does?
0:39:51 You know, yeah, definitely. Yeah, so I don’t know.
0:39:54 But I think we’re still a little ways off from that, fortunately.
0:39:57 But like you said, AI sort of excites me
0:40:00 and scares me at the same time, but more excites me than scares me.
0:40:02 So I’m going to keep talking about it.
0:40:05 Yeah, cool, man. Well, this has been really fun.
0:40:08 We actually came into this one, not knowing if we had enough to talk about.
0:40:12 And I think this is actually a pretty solid episode all said and done.
0:40:14 That’s kind of been my experience with CES this year.
0:40:20 If I was to sort of recap it, it’s like slightly underwhelming this year.
0:40:24 Between last year and this year, there weren't major leaps.
0:40:28 And what I have seen a lot of is like AI blended with Internet of Things,
0:40:30 a lot of self-driving tech.
0:40:34 And other than that, it’s more of the same as last year, like AI and everything.
0:40:36 We’ve got barbecues that have AI.
0:40:38 We’ve got bird feeders that have AI.
0:40:43 I was talking to Synology, who makes like network-attached storage solutions.
0:40:45 Right. They’re putting AI in those now.
0:40:50 Right. So it's like existing companies adding AI, Internet of Things blended
0:40:55 with AI and a lot of self-driving tech, which relies on AI.
0:41:01 So I mean, CES is really, really AI focused again, but still a whole year
0:41:05 has gone by since the last CES, but it doesn’t feel like the advancements
0:41:08 were as big as I thought they would be in that timeframe.
0:41:10 And that’s probably how I would summarize CES this year.
0:41:12 A lot of it sounds gimmicky.
0:41:13 In Japan,
0:41:16 you have a lot of that right now, too, because there's a huge AI fever.
0:41:19 Like when I bought my washer and dryer, it had a big sign AI.
0:41:22 I talked to my wife and it's like, wait, so just the fact that it's
0:41:24 determining that it's still wet,
0:41:27 they're calling that the AI? And I was like,
0:41:30 the models did that a few years ago in Japan.
0:41:31 Yeah, they already did that.
0:41:35 It’s like, so they’ve just kind of attached AI on to it to say like,
0:41:36 yeah, this is the new AI feature.
0:41:38 Yeah. The Nvidia stuff sounds amazing, though,
0:41:41 like especially if they are doing the thing I said, the hardware predicting
0:41:43 pixels and things like that.
0:41:45 So tomorrow I’m spending all day with Nvidia.
0:41:47 I’m actually meeting up with the Nvidia crew.
0:41:50 I will know more about what Nvidia is doing tomorrow.
0:41:53 Like I’m getting hands on demos with a lot of their new tech tomorrow.
0:41:59 So I will know more about it and feel free to send me any questions you have
0:42:02 about what Nvidia is up to and I’ll ask them when I’m with them tomorrow
0:42:06 because I should know a lot more about what’s going on with Nvidia.
0:42:09 And I’ll ask the questions that you brought up.
0:42:14 I want to understand how the RTX, the new 50 series is leveraging AI.
0:42:16 Like I want to understand that process on a deeper level.
0:42:18 So I will ask them about that.
0:42:22 And maybe there’s a follow up episode to this one where we go deeper
0:42:24 into what Nvidia is up to.
0:42:28 Well, on that note, I’m going to get back down to the conference floor
0:42:30 and see what else I can find.
0:42:32 And again, I’m going to be hanging out with Nvidia tomorrow.
0:42:34 So I’ll know more about what they’re up to.
0:42:39 And if you want to stay looped in, you want to learn more about what I find out
0:42:43 from Nvidia and more about what I come across at CES and learn about some
0:42:47 of these other tools that Nathan’s been sort of teasing us about,
0:42:49 but not going into very much detail yet.
0:42:54 Make sure that you subscribe to this show wherever you subscribe to podcasts.
0:42:56 YouTube is the visual platform.
0:42:59 We’re showing a lot of stuff on our screens, but we’re also available
0:43:01 wherever you listen to your podcast.
0:43:04 So give us a subscribe and thank you so much for tuning in.
0:43:06 Hopefully we’ll see you in the next one.
0:43:06 Thank you.
0:43:07 Yeah.
0:43:13 (upbeat music)

Episode 41: How groundbreaking are the AI-driven advancements showcased at CES 2025? Matt Wolfe (https://x.com/mreflow) and Nathan Lands (https://x.com/NathanLands) take you through the year’s biggest AI announcements and innovations from the Consumer Electronics Show.

This episode dives into the latest in AI’s integration with the Internet of Things, Nvidia’s exciting new projects, advancements in self-driving car technology, and the potential of fully automated game testing. The hosts share their CES experiences, highlight favorite tech demonstrations, and discuss the future of AI, from personal computing to enterprise applications.

Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd

Show Notes:

  • (00:00) Discussing Nvidia’s AI advancements at CES.
  • (04:52) Hopes for AGI, aiming next for ASI.
  • (06:33) AI to revolutionize solving major human challenges.
  • (10:50) Edge computing could disrupt cloud-based business models.
  • (14:04) Nvidia dominates AI; local privacy management evolves.
  • (17:40) Nvidia’s Project Cosmos: Virtual testing for technologies.
  • (20:39) Automated testing identifies potential game bugs efficiently.
  • (24:36) CES highlights: self-driving tech, AI-enhanced IoT.
  • (29:12) Advancing tech predicts needs, massive impressive TVs.
  • (29:54) Flying motorcycle showcased; clear TV criticized.
  • (33:35) Nintendo Switch 2: 2025 release, AI upscales to 4K.
  • (38:41) Excited for easier video creation with AI.
  • (42:17) Stay informed on Nvidia updates; subscribe now.

Mentions:

Get the guide to build your own Custom GPT: https://clickhubspot.com/tnw

Check Out Matt’s Stuff:

• Future Tools – https://futuretools.beehiiv.com/

• Blog – https://www.mattwolfe.com/

• YouTube- https://www.youtube.com/@mreflow

Check Out Nathan’s Stuff:

The Next Wave is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Darren Clarke // Editing by Ezra Bakker Trupiano
