Gavin Purcell & Kevin Pereira, The Future of AI (#55)

AI transcript
0:00:03 First of all, why did you guys pivot into this arena of AI?
0:00:08 I would say, to me, the thing that was really interesting about it was it felt transformative in a way that…
0:00:12 We’re entering an era where it will look like anyone is saying anything at any point in time.
0:00:15 I’m a little worried about the AI stuff.
0:00:17 Everybody is talking about agentic AI right now, right?
0:00:23 And this is a good example: what you have there, conceivably, is your agent that you’re making
0:00:25 that’s gonna need to go out. One of the coolest things about AI is it’s…
0:00:30 We have this theory that social media was kind of the democratization of distribution.
0:00:37 AI is kind of the democratization of creation, meaning that suddenly the floor is raised for everybody that wants to create things.
0:00:42 It could be, 100%, and this is like the WALL-E problem, right?
0:00:45 The risk that I always find is that, like, how do you bring the human side into it?
0:00:50 There will be something lost, but then will those people be able to have more time to do things with their family?
0:00:50 Maybe.
0:00:57 So if you follow me on the Instagram, I’m sure you’ve seen I’m doing 90 days, no drinking.
0:01:03 Today is day number 25, and I’ve upped my cardio a ton, which feels great.
0:01:07 I’ve been rucking a lot, which is where you put this weighted backpack on while you’re hiking.
0:01:13 At the end of my workouts, though, I’m sweating like crazy, which is good, but you need to replenish your electrolytes.
0:01:18 And sadly, most of those replacement powders out there are just packed with sugar and they go straight to gut fat,
0:01:25 which is the reason I use Element. There’s no sugar and it has a science-backed ratio of 1,000 mg sodium,
0:01:29 200 mg potassium, and 60 mg magnesium.
0:01:35 Not only no sugar, no coloring, no artificial ingredients, no gluten, no fillers, just no BS.
0:01:41 Element is used by everyone from podcast hosts like me to NBA, NFL, and NHL players, Olympic athletes,
0:01:45 and everyday moms and dads and exercise enthusiasts.
0:01:49 Right now, Element is offering a free sample pack with any purchase.
0:01:53 So that’s eight single serving packets, free with any element order.
0:01:56 That’s a great way to try all eight flavors because they have a ton of different flavors.
0:02:02 You try all eight, get yours at kevinrose.com/lmnt.
0:02:05 So that’s kevinrose.com/lmnt.
0:02:09 This deal is only available through my links, so you must go to that website.
0:02:12 And lastly, the best part: it’s totally risk-free.
0:02:15 So if you don’t like it, share it with a salty friend and Element will give you your money back.
0:02:18 No questions asked, you have nothing to lose.
0:02:21 Huge thanks to Element for sponsoring today’s show.
0:02:23 Today’s sponsor is NordVPN.
0:02:29 Stay secure and anonymous online with NordVPN, which has been my trusted VPN for years now.
0:02:32 I like it because you can access a lot of content from anywhere.
0:02:37 So I’m in Japan or Mexico at least a couple of times a year, and I want to watch my favorite shows.
0:02:43 And this comes in really handy to bypass all of that geofencing. It also needs to be fast.
0:02:49 That’s another essential thing for a VPN, and Nord is lightning fast because they have thousands of servers.
0:02:52 So if you want to stream and game without lag, it’s great.
0:02:54 And then equally as important is privacy.
0:02:55 They have a great policy.
0:03:01 No logs, nothing to store, nothing to see, nothing to share, which is pretty much as good as it gets.
0:03:03 And their apps are just really lightweight.
0:03:07 So it’s not bogging down your system the entire time and they have it for desktop and mobile.
0:03:13 Don’t miss this incredible deal, which is two years at a huge discount, plus four free months.
0:03:21 If you use my link, NordVPN.com/KevinRose, that’s NordVPN.com/KevinRose and it’s risk free.
0:03:25 There’s a 30 day money back guarantee, which makes this a no brainer.
0:03:27 Gavin, Kevin, so glad to have you guys here.
0:03:28 Oh, we’re psyched.
0:03:28 We’re psyched.
0:03:29 Dude, this is great.
0:03:30 Yeah, it’s fun.
0:03:31 So you had me on your show.
0:03:32 Yep.
0:03:34 That was a really fun show.
0:03:34 Well, good.
0:03:35 I had a ton of fun.
0:03:38 Because for me, we’re all AI geeks.
0:03:40 Like we’re playing with all the tools, but you guys are like living and breathing it.
0:03:43 I’m like a few steps removed.
0:03:45 I have to hear like three people be like, hey, this is cool.
0:03:46 And I’m like, all right, let me go check it out.
0:03:48 Are you still Elfing yourself?
0:03:49 Are you, like, in the JibJab…
0:03:49 Yes.
0:03:50 …era of technology?
0:03:51 I’m in JibJab.
0:03:52 I haven’t broken out.
0:03:56 But I want to have you guys on because you’re so in the mix on what’s going on here.
0:03:58 You’re giving talks about this stuff.
0:04:01 Can you tell us, and the audience that’s watching —
0:04:02 What are you seeing?
0:04:04 Well, first of all, why did you guys pivot into this arena?
0:04:08 Like you both had very successful careers doing other things in video.
0:04:08 Yep.
0:04:08 Yeah.
0:04:12 But AI was enough to be like, okay, we have to join forces to make this happen.
0:04:16 To me, the thing that was really interesting about it was it felt transformative in a way
0:04:19 that I hadn’t felt since way back in the web two days, right?
0:04:21 Web two was this really interesting moment.
0:04:23 And I was never like really in it.
0:04:27 I sort of got into it, but tangentially; watching it, it felt like, oh, suddenly
0:04:30 everybody’s out there and able to interact.
0:04:32 You can make stuff yourself when YouTube came out.
0:04:33 Twitter came out.
0:04:35 Instagram, Digg — all these things started.
0:04:40 It felt like, oh, this is a moment of something where we are changing the way things are made.
0:04:42 And that’s what AI felt like to me.
0:04:42 Yes.
0:04:47 And I didn’t want to — I mean, Kevin and I did this as a fun thing, and we were really interested
0:04:51 in it, but we started digging in with GPT-3; that’s what we first saw.
0:04:54 If you remember GPT-3, it was like 2021.
0:04:56 We were both blown away by what was possible.
0:04:59 If you’re not familiar with the backstory, OpenAI started before
0:05:03 ChatGPT; they had been releasing a bunch of GPT models.
0:05:05 That model was the first one when you interacted with it.
0:05:09 It felt almost like you could do this character stuff. I would say even GPT-2
0:05:13 was like a turn-on, in a way that I think this room and maybe some of the audience will
0:05:14 understand.
0:05:17 Do you remember the first time you heard the modem handshake noise?
0:05:19 When it actually connected to the machine, right?
0:05:23 And you just were like, Oh my God, I’m actually talking to another computer.
0:05:25 Yeah, we’re really dating ourselves here.
0:05:27 No, but that’s the moment, right?
0:05:28 And I remember it vividly.
0:05:29 The iPhone was like that too.
0:05:34 For me, there’s only like four big moments in like my last call it like 20 plus years
0:05:39 in technology where I’m like, a big shift, holy shit, we got to pay attention.
0:05:40 It was dial up.
0:05:43 I would say from there, Web 1.0 was really interesting because PayPal came
0:05:47 out of that, eBay came out of that, Amazon came out of that — some of the bigs that are
0:05:51 still around, obviously. Then Web 2.0 was interesting because that was interactive
0:05:55 web pages in a way that we hadn’t seen: Ajax and JavaScript let you vote on things
0:05:59 and make real-time comments and have a living, breathing web, and UGC, right?
0:06:03 Which I think is a huge deal that people still underestimate in a major way: people
0:06:08 generating their own content, blogging, self-expression. Like the following
0:06:13 feature — when Jack and the team added that to Twitter — because you have to
0:06:16 remember, before that it was all bidirectional relationships.
0:06:20 If you wanted to follow someone on MySpace, it was like, I send a friend request, you accept
0:06:22 the request, now we can share.
0:06:27 And so when it turned to following, that was what really screwed up society — that
0:06:28 was the thing.
0:06:29 That was the problem.
0:06:34 It went one-to-everyone, and that’s where all of a sudden the town hall became a global
0:06:36 town hall, and it was crazy.
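The shift described here, from mutual friend requests to one-way follows, is structurally the difference between undirected and directed edges in a social graph. A minimal sketch — the names and helper functions are illustrative, not any real platform's API:

```python
# Sketch: friendship (undirected, mutual consent) vs. following (directed,
# one-sided), modeled as edge sets in a social graph. Illustrative only.

friends: set[frozenset] = set()  # undirected edges: both parties agree
follows: set[tuple] = set()      # directed edges: no permission needed

def accept_friend_request(a: str, b: str) -> None:
    """MySpace-style: the edge exists only after a request and an accept."""
    friends.add(frozenset((a, b)))

def follow(follower: str, followee: str) -> None:
    """Twitter-style: one-directional, created unilaterally by the follower."""
    follows.add((follower, followee))

follow("fan", "celebrity")             # one-to-everyone becomes possible
accept_friend_request("tom", "kevin")  # mutual by construction
```

The structural point falls out of the types: a `frozenset` has no direction, so friendship is symmetric, while a tuple orders follower and followee.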
0:06:41 But anyway, that moment — the smartphone. And it wasn’t the first smartphone, because you
0:06:46 remember the Palms and the Windows devices, the little stylus and shit. It was horrible.
0:06:48 But I was saying it’s those tingly moments, right?
0:06:54 And I remember even with GPT-2 having that, right? Broken magnetic poetry, like someone’s
0:06:57 just rearranging words on the refrigerator until there’s a sentence there. I remember seeing
0:07:00 that and going, oh, wow, I’ve been here before.
0:07:04 It was that weird sense memory of this gets better.
0:07:07 This is something we say all the time on the podcast: right now is the worst
0:07:10 this technology will ever be, right?
0:07:11 It only gets better from here.
0:07:14 I know, which is super scary because it’s already damn good.
0:07:15 Yeah.
0:07:18 But one thing we also talk about is there might be a lot of legal things
0:07:21 that get in the way of it, or people that get in the way of it, and I think that’s the other
0:07:22 side of this.
0:07:25 And on our podcast, we try to deal with the human version of what these things are.
0:07:29 And the technology is probably going to go exponential for a long time, right?
0:07:32 So far, more training data means better models.
0:07:35 But what are the consequences of those things?
0:07:36 What does it mean if there’s a better model?
0:07:40 What does it mean if that model can produce a deepfake of somebody in five seconds,
0:07:41 right?
0:07:42 We just talked about it.
0:07:43 It’s going to happen though.
0:07:45 I don’t think the legal thing is going to be a real thing.
0:07:46 I’ll tell you why.
0:07:50 I’ve seen this unfold a couple of other times when Uber first came out, everyone tried to
0:07:52 regulate the hell out of them.
0:07:54 But consumers said, actually, I really like Uber.
0:07:56 I want to use Uber.
0:08:01 And if they didn’t do it, other startups would fill in, in other countries and other jurisdictions,
0:08:04 so they had to change the laws.
0:08:05 And they did.
0:08:06 And Uber was fine.
0:08:08 YouTube was the same thing.
0:08:10 YouTube almost went out of business.
0:08:13 They were getting sued left and right by the major industry players because people were
0:08:17 putting all kinds of copyrighted material in there, and they didn’t have smart systems to go in
0:08:19 and find all that stuff in real time.
0:08:20 Google kind of saved them.
0:08:24 I don’t know if you remember when the acquisition happened, but they were under a lot of pressure
0:08:26 and Google said, you know what?
0:08:27 We have deep enough pockets here.
0:08:31 We’re going to go in and actually settle these, figure out the terms.
0:08:33 Same thing happened with Napster, with music sharing.
0:08:36 When consumer behavior changes, these things will get ironed out.
0:08:39 I think there will definitely be lawsuits, no doubt, but there’ll be settlements
0:08:40 and we’ll move on.
0:08:45 I think that legal red tape is only going to slow down the bigs so much, right?
0:08:51 Because even if it does 1%, let’s say it doesn’t matter because if you use the Uber analogy,
0:08:54 there is an open source solution right around the corner where it’s like, Hey, do you want
0:08:55 to drive?
0:08:56 Well, it’s not for Uber.
0:08:59 It’s for the open source taxi delivery service.
0:09:02 We’re seeing that with AI where it’s like the foundational models might get under attack.
0:09:07 They might have to guard rail it so you can’t generate SpongeBob smoking blunts on the couch.
0:09:10 But there is an open source something freely available.
0:09:12 You can grab it and they’re going to keep iterating on that.
0:09:13 That’s exactly right.
0:09:18 Because anytime you crack down on something, there will be an underground version of that
0:09:19 thing that works.
0:09:20 It’s just going to happen.
0:09:21 I’ve seen this.
0:09:22 I mean, you guys have seen this.
0:09:25 I’ve been invited into these discords where you can go, you don’t know what I’m talking
0:09:26 about.
0:09:27 I don’t know.
0:09:28 Walk away.
0:09:29 Walk away.
0:09:30 What’s going on in your Discord?
0:09:31 What?
0:09:32 You can do some crazy shit.
0:09:33 Yeah, sure.
0:09:34 You can break all the rules.
0:09:36 Those are just whack-a-mole.
0:09:37 That’s not going away.
0:09:41 And then you have other countries that are like, yeah, US, go ahead.
0:09:42 Please.
0:09:43 Regulate this.
0:09:44 Please.
0:09:46 So we can just jump in here and fill this role, because we won’t have the same regulation.
0:09:51 That’s why the pause letter, which we talked about a lot, which Elon and a few tech luminaries
0:09:53 signed — that’s why it was just so laughable.
0:09:54 You guys need to slow down.
0:10:00 Pull this big imaginary lever that doesn’t really exist to stop development, stop training,
0:10:01 stop advancement.
0:10:03 Meanwhile, everybody else is going to lap you.
0:10:05 And by the way, I’m trying to build it in my closet.
0:10:06 Right.
0:10:08 And we’re saying, hey, why don’t we pause?
0:10:09 You pause.
0:10:10 I’m going to stay open here.
0:10:14 I’m going to keep building really hard. They’re trying to stop it, but it’s not working.
0:10:17 I think the question with the YouTube model is the most interesting one, right?
0:10:20 Because YouTube went from being what was a lot of pirated content to now whatever it
0:10:25 is, like $150 billion business and all of the big people came onto it.
0:10:28 That was the thing where like CBS, NBC, all these people that weren’t going to be part
0:10:32 of YouTube now are because it’s a great distribution channel.
0:10:35 The big question I have with this is who will be the person that wins to be the place where
0:10:38 they’re going to, all these companies are going to lean in on.
0:10:42 Is there a person that wins because with the YouTube, you can make all those deals.
0:10:46 What does it look like if it’s 15 different things and then all that stuff exists?
0:10:47 Yeah.
0:10:48 It’s a good question.
0:10:52 I think it’s going to be some type of bi-directional relationship that is beneficial to both parties.
0:10:53 So I’ll give you an example.
0:10:54 This is a true story.
0:10:57 My wife woke up this morning and she goes, hey, my Whoop data is saying that I’m
0:10:58 off today.
0:11:01 Like, I’m a little bit under, and I’m wondering why — I didn’t have any drinks last night,
0:11:02 blah, blah.
0:11:04 I’m like, why don’t you just ask the AI, because it’s built into Whoop right now.
0:11:08 So Whoop has a prompt, and you can go in there, and I literally took her phone and typed in
0:11:10 like, why am I in the orange today or whatever.
0:11:13 And it was like, oh, your heart rate was a little bit higher than normal blah, blah,
0:11:14 blah.
0:11:18 And that’s powered by, you know, ChatGPT or whatever’s
0:11:21 behind the scenes there, using your real-time data as part of the model’s context.
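The pattern described here — an app injecting the user's live metrics into the model's context before asking the question — can be sketched roughly as follows. The metric names, values, and prompt wording are invented for illustration; Whoop's actual integration isn't public:

```python
# Toy sketch: inject a user's real-time wearable metrics into an LLM prompt.
# Metric names, values, and wording are invented; not Whoop's actual schema.

def build_recovery_prompt(metrics: dict, question: str) -> str:
    """Format live metrics as context lines, then append the user's question."""
    context = "\n".join(f"- {name}: {value}"
                        for name, value in sorted(metrics.items()))
    return (
        "You are a fitness assistant. Today's data for this user:\n"
        f"{context}\n\n"
        f"User question: {question}"
    )

metrics = {"resting_heart_rate_bpm": 62, "hrv_ms": 38, "sleep_hours": 5.9}
prompt = build_recovery_prompt(metrics, "Why am I in the orange today?")
# `prompt` would then be sent as the user message in a chat-completions call
# to whatever hosted model sits behind the scenes.
```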
0:11:25 And so there’s a beneficial effect that I have to imagine when we’re talking about the
0:11:31 New York Times data, when you’re talking about Washington Post or any of these walled gardens,
0:11:35 you’ll have private models where you’ll say, hey, I’m a ChatGPT member, but I’m
0:11:37 going to do the $2-a-month add-on.
0:11:40 Kind of like when I buy Apple TV and it’s like, would you like to add Starz for an extra
0:11:41 $2 or whatever?
0:11:42 You’re like, all right.
0:11:43 You tell me we’re building actual bundles again.
0:11:44 Is that what we’re doing?
0:11:45 But I feel like it’s going to be that way.
0:11:48 We’ll extend it in interesting ways and say, actually it is interesting for me to have the
0:11:54 back catalog of the Financial Times because I value that quality of reporting and I want
0:11:57 that as part of my stack that I’m getting back.
0:11:58 Or it’s an add-on for a subscription, right?
0:12:03 If I’m a New York Times subscriber, which I am, I would be able to get that as part of
0:12:04 the thing.
0:12:05 That is valuable for sure.
0:12:06 100%.
0:12:11 The question I still have is, like, if I want to have SpongeBob interact with Luke Skywalker
0:12:14 or whatever — putting those two characters together — do they have the power to do that now?
0:12:19 That’s all the legal background stuff that has to get solved to make it fun and interactive.
0:12:20 I don’t know if it’s fair use though, man.
0:12:22 It’s just going to be considered fair.
0:12:23 You think so?
0:12:25 I mean, if you’ve created a feature-length film with SpongeBob in it, I think you have
0:12:28 some grounds that… actually, let’s not do that.
0:12:32 But I think if you’re a verified license holder, which is where we’re heading, it’s like, well,
0:12:36 do you have the Disney+ package and you have the Viacom, Nickelodeon, whatever, okay.
0:12:39 You can get SpongeBob being Force-choked by Darth Vader.
0:12:40 You can have it.
0:12:42 If you want to release it commercially, that’s a whole other thing.
0:12:46 And if we decide to allow you, you’ll maybe get a small teeny-tiny royalty, but who are
0:12:47 we kidding?
0:12:48 You’re probably going to get nothing.
0:12:49 They’ll get everything.
0:12:51 Thank you for making the next worldwide sensation meme.
0:12:52 Yeah.
0:12:53 You see this with YouTube, right?
0:12:54 Yeah.
0:12:55 You release something with a commercial song in it.
0:12:57 It detects that song and relates it to the rights holder.
0:12:58 And it gets paid back.
0:12:59 It gets paid back.
0:13:00 It knows that you used it.
0:13:02 Like, I have the feeling those rails are being built right now.
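The rails being described here boil down to audio fingerprinting, Content ID-style: hash short overlapping windows of a track, then look for shared hashes between an upload and a catalog track. A deliberately toy version of the idea — real systems fingerprint spectrogram landmarks, not raw sample values:

```python
# Toy audio-fingerprint matcher in the spirit of Content ID-style systems.
# Real pipelines hash spectrogram landmarks; raw sample windows keep this tiny.

def fingerprint(samples: list[int], window: int = 4) -> set[int]:
    """Hash every overlapping window of the signal into a set of fingerprints."""
    return {hash(tuple(samples[i:i + window]))
            for i in range(len(samples) - window + 1)}

def likely_match(upload: list[int], catalog_track: list[int],
                 threshold: float = 0.3) -> bool:
    """Flag the upload if enough of its fingerprints appear in the catalog track."""
    up, ref = fingerprint(upload), fingerprint(catalog_track)
    return len(up & ref) / max(len(up), 1) >= threshold

song = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8]  # catalog track
remix = song[2:10]                            # a clip lifted from it
unrelated = [7, 7, 7, 1, 2, 3, 4, 4]          # different audio
```

A clip lifted from the song shares almost all of its window hashes with the original, while unrelated audio shares essentially none, which is what lets the platform know that you used it.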
0:13:03 Yeah.
0:13:06 And by the way, that’s great, because we talk about this all the time on the show.
0:13:09 One of the coolest things about AI — so we have this theory that social
0:13:12 media was kind of the democratization of distribution.
0:13:17 AI is kind of the democratization of creation, meaning that suddenly the floor is raised for
0:13:21 everybody that wants to create things, whether that’s audio, video, movies.
0:13:25 The whole thing about Twitter or anything like that was always, here’s a
0:13:28 bunch of kids in Africa that are making an incredible film, and now it can be distributed
0:13:31 everywhere and they have the tools to do it.
0:13:35 Well, now there could be a thousand of those kids, and it might be two of those kids doing
0:13:38 it, or one kid, because all of the tools are going to be accessible enough for them to be able
0:13:39 to do it.
0:13:43 And the downside of that is the deluge of stuff we’re all going to have to deal with,
0:13:44 right?
0:13:46 Because just like with YouTube, when YouTube first came out, the whole thing was like,
0:13:48 there’s so much crap here.
0:13:52 But there were gems and now the gems are more because there’s more people doing more
0:13:53 stuff with AI.
0:13:54 We’re going to have the exact same problem.
0:13:58 There’s going to be so much stuff, but you hope that stuff can bubble up.
0:13:59 That’s going to be really good.
0:14:04 Do you think that AI is going to be — like, is it a lean-forward or a lean-back experience?
0:14:08 Like on the lean forward side, you’re creating content, lean back, you’re consuming it.
0:14:11 And for me, I mostly use it as a lean back.
0:14:15 Like I used it to create some funny images and make myself look a certain way.
0:14:16 In those questionable discords.
0:14:17 Yeah.
0:14:21 I got myself ripped with the six pack and I won’t say what else, but you can do fun
0:14:22 things with it on that front.
0:14:25 But there’s a novelty to that that kind of wears off.
0:14:29 And then on the lean back, like I’ll give you an example before you guys got here.
0:14:33 I was like, okay, I wasn’t getting video out into these squares here, and I don’t have time
0:14:37 to sit on hold for an hour with Blackmagic and be like, how do I figure this out?
0:14:40 And so I was like, ChatGPT, why am I not getting program video out of
0:14:41 this thing?
0:14:42 And it’s like, have you checked the setting here?
0:14:43 And this and that.
0:14:46 It’s solving customer support.
0:14:51 I mean, in a way that, like, companies — oh my God, I think customer support is a huge use
0:14:53 case for it right now, because we talk about this too.
0:14:56 All the time: we have a problem?
0:14:58 You can solve 90% of your problems with ChatGPT.
0:14:59 It’s really shocking actually.
0:15:03 But we’re already seeing that, though, with call center displacement, yes.
0:15:06 Support teams being slashed everywhere, because they’re just putting all the training docs
0:15:10 and FAQs into a GPT and letting it answer the questions.
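The "put the training docs and FAQs into a GPT" pattern usually means retrieval: score the stored snippets against the question, then paste the best matches into the prompt as context. A minimal keyword-overlap sketch — production systems use embedding similarity, and the FAQ text here is made up:

```python
import re

# Minimal retrieval sketch for a support bot: pick the FAQ entries that best
# overlap with the question, then hand them to the model as prompt context.
# Production systems use embedding similarity; word overlap keeps this tiny.

def score(question: str, doc: str) -> int:
    """Count shared words between question and document (case-insensitive)."""
    words = lambda text: set(re.findall(r"[a-z']+", text.lower()))
    return len(words(question) & words(doc))

def top_snippets(question: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k best-scoring documents for the question."""
    return sorted(docs, key=lambda d: score(question, d), reverse=True)[:k]

faqs = [  # invented support snippets
    "To reset your password, open Settings and choose Reset Password.",
    "Refunds are processed within five business days of the request.",
    "The mobile app supports offline mode on iOS and Android.",
]
context = top_snippets("How do I reset my password?", faqs, k=1)
# context[0] would be pasted into the system prompt before the model answers.
```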
0:15:14 And then you also get, like, a Chevy dealer in California that was recommending Teslas,
0:15:18 by the way, and selling them for $1, because they didn’t properly guardrail it, right?
0:15:19 Wait, what happened?
0:15:20 Yeah.
0:15:24 There was a GPT powered chatbot on a Chevy dealer’s website and people were negotiating
0:15:28 with the bot and saying, so wait, you’ll sell me this Tahoe for $1.
0:15:32 Basically, it was like, yep, anything to get you in to try our, whatever.
0:15:33 Guard rails.
0:15:34 But yeah, but to answer that.
0:15:35 Is that legally binding?
0:15:36 That would be amazing.
0:15:37 They’re trying to get it.
0:15:38 No, they really are pushing right now.
0:15:39 They’re trying to get it for a dollar.
0:15:40 That’s amazing.
0:15:42 And it also recommended that you drive a Tesla, which is very funny.
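The usual fix for that failure mode is an output guardrail layered outside the model: scan the draft reply for commitments the bot must not make and substitute a safe fallback. A minimal sketch — the patterns and fallback text are invented, not any dealer's actual system:

```python
import re

# Output-guardrail sketch: intercept model replies that promise prices or
# binding deals before they reach a customer. Patterns and fallback invented.

FORBIDDEN_PATTERNS = [
    r"\$\s*\d",             # any dollar amount
    r"legally\s+binding",   # the model must never claim this
    r"no\s+takesies",       # the phrasing from the famous jailbreak
]

FALLBACK = "For pricing, please speak with one of our sales staff."

def guard_reply(draft: str) -> str:
    """Return the model's draft unless it trips a forbidden pattern."""
    if any(re.search(p, draft, flags=re.IGNORECASE) for p in FORBIDDEN_PATTERNS):
        return FALLBACK
    return draft
```

Because the check runs outside the model, a prompt injection that talks the bot into an absurd promise still can't get that promise delivered.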
0:15:45 But to the lean back versus lean forward, I think it kind of depends on the end user.
0:15:51 Like we’re already seeing — we’ve talked about Suno and Udio, these AI song generation
0:15:55 apps, there’s plenty of people that are now just using them as their daily soundtracks
0:16:00 and going through song after song, absorbing the wall of data that everybody else is generating.
0:16:04 But there are people that are in there every day, putting quarters into this slot machine
0:16:05 and rendering songs.
0:16:06 Yeah.
0:16:09 I think it’s also about the effort of lean forward versus lean back world.
0:16:12 Like the effort is going to be way less now, right?
0:16:15 And again, that’s not about the idea that like you’re going to make great stuff if you
0:16:19 lean forward with less effort, but it’s way easier than it was before.
0:16:20 Yeah.
0:16:21 So what we talked about is a good example.
0:16:24 Udio or Suno — like, if you want to make a song this weekend. I made a song just
0:16:29 for our podcast, a two-and-a-half-minute outlaw country song, and it took
0:16:33 me about an hour, and that was more effort than most people put in.
0:16:38 But that would have taken, I don’t know, two weeks before to get the same result out.
0:16:41 So I would have had to have learned the banjo.
0:16:42 Yeah.
0:16:43 Without that part.
0:16:44 Yes.
0:16:45 But let’s not discount that.
0:16:47 I could have taken like GarageBand and plugged a bunch of instruments in and found a way
0:16:48 to do all that stuff.
0:16:51 But yeah, so I wouldn’t have to learn the banjo, but I would have had to learn the system.
0:16:54 Can you explain to people what these tools are for maybe they haven’t heard them before?
0:16:58 So one of the coolest new like entry points for AI is AI music.
0:17:01 And so Suno.ai is one tool.
0:17:02 UDO is another one.
0:17:03 It just came out.
0:17:07 Suno was created by some former MIT guys — people that we’ve interviewed on our podcast.
0:17:10 Udio is former Google DeepMind people.
0:17:14 So they’re both systems that allow you to create songs, basically off text prompts.
0:17:16 So there’s two different ways you can get into it.
0:17:19 You can either just literally put a prompt in, in fact, we could probably do one live
0:17:21 and say a song about three guys on a hot podcast.
0:17:22 Three middle aged guys.
0:17:23 You almost said hot tub.
0:17:24 Yeah.
0:17:25 Hot tub.
0:17:26 Yeah.
0:17:27 Or a hot tub.
0:17:28 Or a hot tub.
0:17:29 Your imagination.
0:17:30 Three guys.
0:17:33 Three shirtless dudes hanging out in the dude’s suit.
0:17:34 Yeah.
0:17:37 Or you can put lyrics in, right?
0:17:40 And so what I did is I just, I’m not an amazing writer, but I wrote just this dumb song with
0:17:43 some lyrics and you put in a verse and a chorus.
0:17:45 How seriously did you take it?
0:17:46 I mean, he spent an hour on it.
0:17:47 That’s a lot.
0:17:48 I mean that’s a lot.
0:17:49 I thought he was, like, discounting himself.
0:17:50 I’m not the best writer.
0:17:51 I tried to make it a little better.
0:17:54 Did I put on my chaps?
0:17:56 Did I grab my rhinestone cowboy hat?
0:17:57 Yeah.
0:17:58 I had to get in the character.
0:17:59 Well, one of the cool things, you should check this out.
0:18:02 There’s a song right now on TikTok that is exploding.
0:18:03 That is all AI generated.
0:18:05 It’s a guy named Obscurist Vinyl.
0:18:07 It’s a TikTok handle called Obscurist Vinyl.
0:18:08 There’s a song on there that’s called —
0:18:12 I’ll just say real quick, the profile is great because he’s playing into
0:18:14 the Obscurist Vinyl bit.
0:18:18 These are long lost records that you might have found at a thrift store, but they’re songs
0:18:22 and bands that never existed and they’re using AI art for the vinyl cover art.
0:18:25 It’s beautifully done, but they’re all slapstick.
0:18:26 And they’re dirty.
0:18:27 Like the one that’s blowing up —
0:18:30 you can believe whatever you have to believe — is called “I Glued My Balls to My Butthole
0:18:31 Again.”
0:18:35 And it’s like set up like a 1950s, like kind of like almost twist song, but it’s exploding
0:18:42 if exploding is maybe the wrong word, but again, this is what I tell people.
0:18:47 We both have told people is if you want an entry point to see the power of these tools,
0:18:49 this is an emotional thing.
0:18:54 Whereas ChatGPT is a logical thing, images and music are emotional.
0:18:59 And when you can show somebody what that feeling is, I am a creative person in my soul and
0:19:02 I’ve always been one, but if somebody isn’t a creative person or they don’t see themselves
0:19:05 that way, you give them the ability to do this.
0:19:07 It can be a big deal.
0:19:11 I think it can open somebody’s brain to be like, Oh my God, I think I made something cool.
0:19:14 That to me is the promise of the generative AI tool.
0:19:15 Yeah, absolutely.
0:19:17 I mean, and it’s just fun.
0:19:20 Like, I did one for my girls and put their names in it, and they were laughing
0:19:21 their asses off.
0:19:22 Yeah, exactly.
0:19:26 And it’s kind of like entertainment value that combines creation and entertainment is
0:19:27 really cool.
0:19:31 And one of the wild things about the Suno model is that it’s like a diffusion model.
0:19:36 It’s the same way that images are generated right now by Stable Diffusion or Midjourney.
0:19:40 So basically in broad strokes, when they were training this model on music, they were feeding
0:19:43 a bunch of songs into the machine, but not actually telling it.
0:19:44 This is music.
0:19:45 This is a chorus.
0:19:46 This is a verse.
0:19:47 This is the style.
0:19:48 This is a BPM.
0:19:50 This is the rhythmic measure.
0:19:51 They weren’t giving it anything.
0:19:55 They were just feeding it stuff and saying, now just generate from noise and let’s see what
0:19:56 happens.
0:20:00 And the machine essentially pattern matched and learned, oh, this is the right structure
0:20:01 for a song.
0:20:02 This is rhythm.
0:20:03 Crazy.
0:20:07 And then they would go in and label and categorize and fine tune and say, okay, this is percussion.
0:20:08 This is this.
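What's being described — feed data in, corrupt it with noise, train the network to predict the noise, with no labels like "chorus" or "BPM" — is the standard diffusion training setup. A pure-Python toy of the forward (noising) step and the target the model learns to predict; real music models operate on spectrogram tensors, not short lists:

```python
import math
import random

# Toy forward-diffusion step: corrupt a clean signal x0 with Gaussian noise at
# a strength set by alpha_bar, and keep the noise — that's the training target.
# Real music models do this to spectrogram tensors; lists keep it stdlib-only.

def noise_sample(x0: list[float], alpha_bar: float, rng: random.Random):
    """Return (noisy signal x_t, the noise eps the model learns to predict)."""
    eps = [rng.gauss(0.0, 1.0) for _ in x0]
    xt = [math.sqrt(alpha_bar) * x + math.sqrt(1.0 - alpha_bar) * e
          for x, e in zip(x0, eps)]
    return xt, eps

rng = random.Random(0)
clean = [0.5, -0.2, 0.9, 0.1]  # stand-in for a slice of audio
noisy, eps = noise_sample(clean, alpha_bar=0.9, rng=rng)
# A denoiser is trained on pairs like (noisy -> eps); it never needs labels
# such as "chorus", "verse", or "BPM" — structure is learned from the data.
```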
0:20:09 Isn’t that scary?
0:20:10 It’s too scary.
0:20:11 That’s scary.
0:20:12 It’s too scary.
0:20:13 But remember when Google had their language translation system, and they were trying
0:20:17 to figure out the most efficient way to convert and translate languages?
0:20:23 And they trained the AI, and the AI figured out its own internal language to communicate and
0:20:24 translate between the languages.
0:20:28 I remember when Facebook was doing that — Facebook was doing that to have bots automatically
0:20:29 negotiate deals.
0:20:32 And it started speaking in a code that the engineers didn’t understand; it would
0:20:33 just spit out ones and zeros.
0:20:35 That’s when you turn the servers off right at once.
0:20:38 No, that’s when you put googly eyes on them and you give them guns.
0:20:39 You gotta go there.
0:20:40 Hey, they figured it out.
0:20:41 Send it to the front lines.
0:20:42 Let’s go.
0:20:43 Oh my God.
0:20:44 Yeah.
0:20:45 That’s crazy.
0:20:47 The software side of things is fascinating right now.
0:20:50 Did you see the Boston Dynamics robot that they showed off yesterday?
0:20:51 You’re familiar?
0:20:52 They made the BigDog robot and everything.
0:20:53 Of course.
0:20:54 So you can kick them over and shit.
0:20:55 Right.
0:20:56 Yeah.
0:20:57 Right.
0:20:58 But the videos of them trying that are great.
0:20:59 Yes.
0:21:00 Yeah.
0:21:01 Two days ago, they announced that they were retiring the Atlas,
0:21:05 their hydraulics-based bipedal robot, and the internet was like, that’s not the dog one.
0:21:06 That’s the human.
0:21:07 No, that’s just the human.
0:21:11 It can walk and does parkour, and yeah, they’re done with it.
0:21:13 And everybody’s like, oh man, too much technical debt.
0:21:15 Boston Dynamics, they’re in trouble.
0:21:18 Clearly, they realized that they’re being lapped by all these AI
0:21:19 startups, right?
0:21:23 And then they just mic-dropped this little beauty here.
0:21:27 This is video of their new electric, battery-powered robot.
0:21:28 Yeah.
0:21:31 Because every other one had that tether hooked up to them.
0:21:34 Look at that thing get back up.
0:21:35 Wow.
0:21:36 Right?
0:21:37 I don’t even know how it’s, I mean, you can’t do that.
0:21:38 Oh, shit.
0:21:39 Yeah.
0:21:43 If you’re listening on the audio only: this robot folded its legs
0:21:50 backwards up to where its hips are, stood up, rotated its entire torso around to face
0:21:55 what you thought was backwards, and then marched at a human speed towards the camera and then
0:21:56 away from it.
0:21:59 So what’s the battery life on that thing?
0:22:00 Yeah.
0:22:01 Probably.
0:22:02 Yeah.
0:22:03 It’s all I know.
0:22:04 But again, it’s the worst it will ever be, right?
0:22:05 In some ways.
0:22:09 And that is the big future thing that we talk about every once in a while: where do
0:22:16 you go when the AI isn’t just here on your phone or in your computer, but is everywhere
0:22:17 and is walking around?
0:22:18 Right.
0:22:19 Right.
0:22:20 That becomes a thing.
0:22:21 Talk to my family.
0:22:25 Like, I have a wife and two kids — would we have a robot person, a household person?
0:22:26 Sure.
0:22:28 But then it becomes a question of like how far away are we from that?
0:22:29 Yeah.
0:22:30 But what is it?
0:22:31 I mean, I haven’t, I’ve had a Roomba before.
0:22:32 Well, it’s terrible, right?
0:22:33 I hated my Roomba.
0:22:35 And even if I could talk to my Roomba, like, I’m gonna be like, hey, you missed a
0:22:36 spot.
0:22:37 Yeah.
0:22:38 What is that?
0:22:39 The robot.
0:22:40 Yeah.
0:22:41 And I don’t know.
0:22:42 It can’t get you a beer.
0:22:43 You see the figure demos?
0:22:44 Figure 01.
0:22:45 Yes.
0:22:50 Figure 01 is from a new startup. It’s an independent company, right?
0:22:52 Brett Adcock, I think, is the guy that’s behind it.
0:22:54 Anyway, he’s the CEO.
0:22:57 They have built ChatGPT into their robot, which is pretty cool.
0:22:58 So you’re not wrong, right?
0:23:02 Oh, I saw the one where it’s sorting the plates, and putting the... yeah, I saw the one handing over the
0:23:04 apple, and I mean, it takes forever to do it.
0:23:09 But the reason I think the reason this gets exciting is that it’s all end to end trained.
0:23:14 So they are either tele-operating robotic arms with cameras so that it has a perspective
0:23:17 or they’re just feeding it probably stolen YouTube video.
0:23:18 Yeah.
0:23:20 So that it can learn and contextualize the world and the environment.
0:23:25 And when that pathway gets unlocked, like Tesla is doing with self-driving, then suddenly,
0:23:26 every day, conceivably...
0:23:27 It’s a good point.
0:23:29 It can have new abilities.
0:23:33 When watching that video, there’s no doubt we’re less than a decade away from something
0:23:34 to do my dishes.
0:23:35 I know.
0:23:36 Exactly.
0:23:37 Get a scrub daddy.
0:23:38 What’s a scrub daddy?
0:23:39 It’s a two-sided sponge from Shark Tank.
0:23:40 What’s a scrub daddy?
0:23:41 Man, you got to get in the ground.
0:23:42 When was the last time?
0:23:43 You got to live life.
0:23:46 When was the last time you came down from the ivory tower on your rappel ropes and went
0:23:47 to a damn Walmart?
0:23:48 I should rock off this podcast.
0:23:49 This is scrub daddy.
0:23:50 I’m ready.
0:23:53 I love that the notes are like, democratize creation.
0:23:54 Get a scrub daddy.
0:23:56 A scrub daddy was a big shark tank thing.
0:23:57 It was a big shark tank thing.
0:23:58 It was a big shark tank.
0:23:59 It was a two-sided sponge.
0:24:00 Yeah.
0:24:01 All right.
0:24:03 Let’s talk about something near and dear to my heart investing.
0:24:10 For me, I absolutely have to have someone that is in my corner that’s a trusted partner
0:24:12 that can look at everything holistically.
0:24:17 The best type of professional is a CFP, which are certified financial planners.
0:24:21 You’re probably saying to yourself, “I have one of these people at X bank or at this
0:24:27 financial institution.” I did as well, but I stopped doing that because these people
0:24:30 charge you a percentage of your assets under management.
0:24:31 It absolutely sucks.
0:24:34 It eats away at your returns.
0:24:36 Do not do this.
0:24:42 Even if you don’t support my sponsor, don’t do that, but I do love my sponsor today, FACET.
0:24:46 FACET is cool because they don’t charge you a percentage.
0:24:50 They just have this affordable membership fee.
0:24:55 FACET is building the future of financial planning, making professional financial advice
0:24:58 accessible to the masses, not just the rich.
0:25:01 They also have the full suite.
0:25:06 You also get access to their team of experts across retirement planning, tax strategy, estate
0:25:09 planning, and so much more.
0:25:13 It’s just, again, an affordable membership fee.
0:25:15 This is what I love about this company.
0:25:23 For listeners of this show, FACET is waiving their $250 enrollment fee for new annual members.
0:25:32 For a limited time only, you got to head over to facet.com/kevinrose to learn more.
0:25:34 FACET.com/kevinrose, please use that URL.
0:25:37 It helps out the show.
0:25:38 Check them out.
0:25:40 Sponsored by FACET, FACET Wealth Inc.
0:25:45 FACET is a SEC registered investment advisor, headquartered in Baltimore, Maryland.
0:25:49 This is not an offer to sell securities or investment, financial, legal, or tax advice.
0:25:54 Past performance is not a guarantee of future performance. Terms and conditions apply.
0:25:59 When I’m starting a new company or advising a company, it’s absolutely essential to have
0:26:02 what I call a single source of truth, meaning if you need to know something, the status
0:26:07 of something, you need to see a product requirements doc, you need to comment on a design or hash
0:26:13 out a new feature, having one tool, one source is essential to my sanity.
0:26:17 Otherwise, I’m looking all over the place and I can’t find something.
0:26:20 For me, that tool is today’s sponsor, Notion.
0:26:25 Notion is awesome because it combines your notes, your docs, and your projects into one
0:26:28 space that’s simple and beautifully designed.
0:26:33 The fully integrated Notion AI helps you work faster, write better, and think bigger, doing
0:26:35 tasks that normally take you hours in just seconds.
0:26:38 You don’t have to go to an external AI provider.
0:26:40 It’s all baked in.
0:26:43 No doubt you’ve heard of Notion, but have you tried it yet?
0:26:45 If you haven’t, you’ve got to give it a shot, especially the AI stuff.
0:26:46 It’s really cool.
0:26:49 You can find Notion for free when you go to Notion.com/KevinRose.
0:26:57 That’s all lowercase letters, Notion.com/KevinRose to try the powerful, easy to use Notion AI
0:26:58 today.
0:26:59 Make sure to use our link.
0:27:01 You’ll be supporting the show when you do so.
0:27:02 Notion.com/KevinRose.
0:27:04 You have a question though.
0:27:08 When you said, “Suno is one,” and then what was the other one for audio?
0:27:09 Udio.
0:27:10 How do you spell that?
0:27:11 U-D-I-O.
0:27:12 Which one’s better?
0:27:13 Well, it’s an interesting thing.
0:27:15 What I have found... so I just spent a lot of time with Udio.
0:27:18 Udio is the best one if you want to make a full song.
0:27:24 Suno has a lot of really interesting generation stuff, but Udio has that ability to extend
0:27:25 out.
0:27:27 When it extends, this is another magical thing about it.
0:27:30 You think, mostly when you deal with AI models, you’re used to things being a little wonky
0:27:33 and being like, “Oh, this is not going to be as good.”
0:27:36 When you go to ChatGPT and you want to change an image into something else, you’ve had this
0:27:39 experience with DALL·E, where you’re like, “Hey, I got this really cool image out of DALL·E 3.
0:27:40 Can you make this, that?”
0:27:43 Then suddenly it’s a totally different image.
0:27:45 But Udio is really good at consistency.
0:27:50 Wait, so you can say, “Okay, I like the way you played that song.
0:27:53 I want to make a band out of the song with multiple tracks in the same genre and the
0:27:54 same tone and the same vocals.”
0:27:59 I would say not yet, but that’s got to come because they’re able to extend one song out,
0:28:01 so they should be able to understand that.
0:28:04 Within a song, continuity and coherence hold.
0:28:07 If it’s a certain singer’s voice or a harmony of them in the chorus and you want them to
0:28:10 come back later singing a bridge, you can prompt that.
0:28:14 Can you prompt it and say like, “Hey, you know what, around 12 seconds in, I didn’t like
0:28:15 this break.”
0:28:16 Not yet.
0:28:17 Okay.
0:28:21 Music is pretty new versus all this stuff, but what’s cool about Udio is you realize,
0:28:26 okay, they have the coherence to make it consistent for that amount of time.
0:28:28 Those feel like doable steps after this, right?
0:28:33 But what you want is a digital audio workstation type approach where you can generate a song
0:28:37 or just generate the drums or the guitar or whatever and then come to it and be like,
0:28:39 “I want this little beat in my head, can you make it into something real?”
0:28:42 In those tools, there are some tools that exist that do that, but this was something,
0:28:45 I don’t know if this is the exact clip, but this is something that kind of blew my mind.
0:28:50 This is a Udio generation.
0:28:53 You ever step on a Lego barefoot?
0:28:55 Congratulations.
0:28:59 You just unlocked a new level of pain.
0:29:00 Forget waterboarding; the CIA
0:29:02 should just scatter Legos on the floor.
0:29:04 Wait, this is doing comedy stand-up?
0:29:05 Yeah, it does stand-up.
0:29:08 Holy shit, ’cause I have some good shows I just never want to hear.
0:29:09 I do.
0:29:12 Yeah, so it can do that because it’s trained on that sort of thing as well too, right?
0:29:14 Okay, so what do you say?
0:29:15 Which one did that?
0:29:16 That’s Udio.
0:29:19 Okay, Udio stand-up comics. Kevin, fame and fortune.
0:29:20 Okay, that’s amazing.
0:29:21 That was AI-written jokes parsed into...
0:29:24 AI-written jokes, of course.
0:29:26 Yeah, I don’t know if that was…
0:29:27 I mean, people are doing that.
0:29:28 They’re using AI.
0:29:31 I think that might have been somebody’s stand-up that you put in there, but the idea...
0:29:34 The other thing that they can do is like sports commentary, which is interesting.
0:29:35 There’s another thing in there where you can hear…
0:29:39 We should call out McKay Wrigley, who’s a really interesting guy who does a lot of AI
0:29:41 stuff; that clip was going around when I went and found that out.
0:29:43 Play the Dune song, because that’s one of the fun ones.
0:29:47 This is the one where, when I first heard Udio, I was like, “What in the hell is going on here?”
0:29:49 So somebody made a…
0:29:50 The movie Dune, right?
0:29:51 Yeah.
0:29:55 Dune II, they made a musical, a 1960s musical version of it.
0:29:59 Again, this is what comes out of the machine when you put custom lyrics in.
0:30:00 This is one prompt, 30 seconds.
0:30:01 Yeah, okay.
0:30:16 Isn’t that crazy?
0:30:17 How great.
0:30:22 So the benefit of this kind of thing is, again, you can prompt and say, “Make me a song about
0:30:26 Dune.” This person is a very creative person;
0:30:30 he wrote lyrics to a song about Dune II that the model read.
0:30:33 And I’m sure, like my experience, he probably went through multiple generations and then
0:30:35 got something he was happy with.
0:30:36 Right.
0:30:37 So I can make my own Christmas albums.
0:30:38 Absolutely.
0:30:39 100%.
0:30:40 Yeah.
0:30:41 In like 1950s style.
0:30:44 And then, by the way, you can take your family photos and do the lip sync to them and have
0:30:47 your friends and family singing your Christmas carol for your Christmas card. But do you want
0:30:48 to do one now?
0:30:50 Probably make it a 1950s Christmas song.
0:30:52 Well, now we’re jumping into a different genre of things.
0:30:53 No, I’m curious.
0:30:55 Like, how do we actually get lip syncing to work?
0:30:56 Because that’s not the same tool.
0:30:57 Okay.
0:30:58 Well, you’re so glad you see stuff.
0:30:59 I’m so glad, yeah.
0:31:00 Well, okay.
0:31:01 Here’s one to write down for sure.
0:31:02 I don’t know.
0:31:03 It’s Pinokio.computer.
0:31:04 Yeah, you’ll love this.
0:31:07 P-I-N-O-K-I-O is the way that it’s spelled.
0:31:08 Okay.
0:31:09 Shout out to the...
0:31:10 Cocktail Peanut.
0:31:11 Cocktail Peanut.
0:31:12 We love you.
0:31:14 Hashtag, not an ad, but boy do we wish it were.
0:31:15 Yeah.
0:31:18 Pinokio is a Mac and Windows based executable.
0:31:23 It makes downloading all of those crazy GitHub repositories, with all of the different dependencies
0:31:26 and all of the different environments and all of that stuff,
0:31:27 CUDA and all that stuff,
0:31:28 one click.
0:31:31 And it manages it all for you, even your checkpoints and your models.
0:31:33 So if you’re hearing all this and it sounds like spaghetti, just know that you can go
0:31:36 to Pinokio and go, “Oh, face swap?
0:31:37 Click.”
0:31:38 Yeah.
0:31:39 And now you’ll have access to face swap.
0:31:40 Amazing.
0:31:41 And so, I actually…
0:31:42 I mean, I can fire it up if you want to see it.
0:31:43 So this is…
0:31:44 Okay.
0:31:45 I’ve noticed you’re on a Mac.
0:31:46 Do you have…
0:31:47 Do I sometimes have to jump to Windows to do these things?
0:31:48 No.
0:31:51 There are a handful of programs that are still NVIDIA only because NVIDIA owns the AI ecosystem.
0:31:52 Right.
0:31:54 And you’re on an M3, so you’re not going to be able to use NVIDIA.
0:31:55 That’s flattering.
0:31:56 I’m on an M2.
0:31:57 It’s very flattering.
0:31:58 Okay?
0:31:59 Tough times.
0:32:00 Yeah, I cut my own podcast.
0:32:01 All right.
0:32:02 This is…
0:32:03 We had different trajectories.
0:32:04 Okay?
0:32:05 I’m sorry.
0:32:06 I’m sorry.
0:32:07 Some go to the moon and some go right into the dirt.
0:32:08 Listen.
0:32:09 I started Moonbirds.
0:32:10 It’s like…
0:32:11 We all have our shit.
0:32:12 Amen.
0:32:14 So this is Pinokio running on the MacBook, right?
0:32:17 And you can see I’ve got a bunch of different things installed.
0:32:19 One of the apps that I love is called Face Fusion.
0:32:24 And they’re just basic front ends for these open source, very powerful AI tools.
0:32:28 And so this thing will load and give me a local IP that I can connect to.
0:32:30 You can share it across your network.
0:32:31 You can open it up and share it with your friends.
0:32:32 So if you’ve got…
0:32:35 If a friend has a powerful computer or you’ve got something running in the cloud, you can
0:32:36 go that way.
0:32:38 This is Face Fusion.
0:32:39 And it’s…
0:32:43 It may seem overwhelming at first, but if you just want a face swap, which is the box
0:32:45 there, you drop in your source, a photo.
0:32:46 Oh, this is easy.
0:32:48 And you drop in your target and you just hit start.
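A quick aside on the “local IP that I can connect to” and “share it across your network” part: that comes down to which address the tool’s little web server binds. A stdlib-only sketch of the idea; the real front ends handle this for you, and the handler below is just a stand-in for the tool’s UI:

```python
# 127.0.0.1 = reachable only from this machine; 0.0.0.0 = reachable from
# any device on your LAN. This toy server binds localhost and fetches its
# own page to show the round trip.
import http.server
import threading
import urllib.request

class Hello(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"face-swap UI placeholder"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Port 0 asks the OS for any free port; swap "127.0.0.1" for "0.0.0.0"
# to make the same server visible to the rest of your network.
server = http.server.HTTPServer(("127.0.0.1", 0), Hello)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

reply = urllib.request.urlopen(f"http://127.0.0.1:{port}/").read()
server.shutdown()
print(reply)
```

That one-line bind-address change is all “share it with your friends” means; cloud setups just do the same thing on a remote box.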
0:32:52 So when you said powerful computer, when you’re thinking about doing this stuff, you’ve got
0:32:53 your M2 here.
0:32:54 Yeah.
0:32:55 Like offline.
0:32:56 No, but do I need to?
0:32:59 Do I actually need to have a proper rig?
0:33:02 If I’m going to do something intense, 30s, let’s just say I want to do…
0:33:06 I see oftentimes they’ll take... Lex Fridman had Zuckerberg on where they swapped out the
0:33:09 audio and lips into a completely different language.
0:33:13 If you were to do that for an hour and a half podcast, let’s say, just be patient or buy
0:33:14 a powerful rig.
0:33:18 When you say be patient, are we talking like, hit go and I come back in three days?
0:33:19 Are we talking about two hours’ worth of rendering?
0:33:20 Probably a day.
0:33:21 Okay.
0:33:22 So you need a proper rig.
0:33:23 Yeah.
0:33:24 So that we do…
0:33:29 When we go and do talks, we do face swaps and generative like art.
0:33:30 Live.
0:33:31 We do it live.
0:33:32 Yeah.
0:33:33 All on this.
0:33:34 Yeah.
0:33:35 On this thing here.
0:33:36 So what’s the…
0:33:37 What is the stuff that actually takes time to do?
0:33:38 Is this the length of what you’re trying to do?
0:33:39 Yeah, I think that’s the…
0:33:40 The length is a big thing, right?
0:33:41 And how much…
0:33:43 If you’re doing lip sync, if you’re dubbing a mouth, but you’re doing it in 4K at ultra
0:33:48 high res and at 30 frames a second, each one of those frames has to be analyzed, processed, and then
0:33:49 enhanced, right?
0:33:51 And is this CPU or GPU bound?
0:33:52 Mostly GPU?
0:33:53 GPU.
0:33:56 So you need to have just a badass NVIDIA GPU.
0:33:57 Correct.
0:34:01 Or your M2, there are some like Core ML enhanced apps.
0:34:05 Not all of them are, but if they are enhanced to run on the Mac Silicon, they scream.
0:34:06 Yeah.
0:34:07 They go really well.
0:34:08 Interesting.
0:34:09 Yeah.
0:34:10 And more and more stuff’s coming out like that.
0:34:11 Totally.
0:34:12 Yeah.
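The “probably a day” estimate above is really just frame count times per-frame cost. A back-of-envelope sketch; the half-second per frame is a made-up assumption for illustration, not a benchmark of any tool:

```python
# Why clip length and frame rate dominate render time for frame-by-frame
# AI processing (lip sync, face swap, upscaling).
fps = 30
clip_minutes = 90                    # an hour-and-a-half podcast
frames = fps * clip_minutes * 60     # total frames to analyze + enhance
seconds_per_frame = 0.5              # assumed per-frame cost (pure guess)
hours = frames * seconds_per_frame / 3600
print(f"{frames} frames -> about {hours:.1f} hours")
# 162000 frames -> about 22.5 hours, i.e. roughly a day
```

Double the resolution or the frame rate and the total scales right along with it, which is why short clips on a laptop are fine and feature-length dubs want a proper rig.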
0:34:13 I can…
0:34:14 I mean, if you want to see…
0:34:15 Well, let’s walk through…
0:34:16 Let’s walk through some stuff.
0:34:17 Yeah.
0:34:18 Because one of the things that’s cool about this: deepfakes have been around forever,
0:34:20 right, and I shouldn’t say forever, but for three to five years. What’s new is this is open source,
0:34:22 it is fast, and it is usable by everybody, right?
0:34:25 It used to be that with deepfakes, there was that company that Trey Parker and Matt Stone
0:34:26 started that does deepfakes.
0:34:27 Yeah.
0:34:28 So you have to like…
0:34:29 They were using it…
0:34:30 Yeah.
0:34:31 For movies and stuff.
0:34:32 Deep voodoo.
0:34:33 Yeah.
0:34:34 Was that the one that was Tom Cruise that was being done a lot?
0:34:35 Oh, were they de-aging him with it?
0:34:36 Yeah.
0:34:37 Oh, no, you’re talking about the guy, the fake Tom Cruise guy.
0:34:38 Yeah.
0:34:39 I think they were…
0:34:40 No, that’s a different company.
0:34:41 Okay.
0:34:42 But the same sort of idea.
0:34:43 But those were like big computers, right?
0:34:44 Yeah.
0:34:45 And the rigs are much nicer.
0:34:46 This is literally…
0:34:49 We’re going to show you what you can do with, again, an M2 MacBook, and anybody can do it.
0:34:53 And this is the democratization of the creation thing where it really feels like it’s not
0:34:57 just like you have to have a giant studio set up, you can do anything with anybody.
0:34:58 I was going to see if I have…
0:35:02 Because I have an example from two years ago where I in real time was making myself look
0:35:07 like Keanu Reeves, but sounding like Joe Rogan in real time with open source software.
0:35:13 So DeepFaceLive or DeepFakeLive are the apps that basically Trey Parker and Matt Stone’s
0:35:14 company was…
0:35:19 They forked that code and they started updating it to add features and make it pretty bullet
0:35:20 proof.
0:35:21 And what’s the name of their company?
0:35:22 Deep Voodoo.
0:35:23 Deep Voodoo is their company, yeah.
0:35:24 I bought NVIDIA stock a while ago,
0:35:26 once I saw people were snapping up the GPUs.
0:35:27 Yeah.
0:35:28 Smart.
0:35:30 Do you guys actually invest in any of these things or is it like…
0:35:31 Is it something…
0:35:35 I mean, the only thing we really wanted to invest in was Suno early on because we talked
0:35:36 to them.
0:35:37 Yeah.
0:35:38 It wasn’t for lack of trying.
0:35:39 No.
0:35:40 But they wanted quality investors.
0:35:41 It depends on what it is.
0:35:42 I do own…
0:35:43 I mean, I guess I should full disclosure.
0:35:47 I do own some NVIDIA, and I own a lot of Microsoft from when Microsoft was trading at a pretty good
0:35:50 price and I saw that they were going real big on OpenAI.
0:35:53 I was like, “I can’t get OpenAI, but I can go for Microsoft.”
0:35:54 Yeah.
0:35:55 Yeah.
0:35:57 And Amazon and all the stuff that seems like they’re being smart about it.
0:35:58 So this was now two years old.
0:35:59 This is off of…
0:36:01 I’ll give you this file so the viewers can see it.
0:36:07 This is off of a standard webcam running on a basic computer with an old NVIDIA graphics
0:36:08 card.
0:36:09 So like 1080p…
0:36:10 This is an HD.
0:36:16 It can be HD with a proper graphics card, but I mean, this is happening in real time.
0:36:19 So that’s him talking.
0:36:25 He called me like this with Joe Rogan’s voice.
0:36:28 Holy shit.
0:36:31 Real time voice, real time video.
0:36:32 Shut up.
0:36:38 And that’s two years ago.
0:36:39 I can’t do…
0:36:40 You look amazing though.
0:36:41 Thank you.
0:36:42 Yeah.
0:36:43 I know.
0:36:44 If only I could look…
0:36:51 Holy shit.
0:36:53 So that was me using voice AI.
0:36:54 Wow.
0:36:55 Which was a…
0:36:56 Has that gotten better now?
0:36:57 Of course.
0:36:58 But is it the same package that’s the best one to use?
0:36:59 Well, I cobbled that together.
0:37:00 So I used…
0:37:01 Okay.
0:37:04 It was an app called Voice AI and I took the output of that and patched it into OBS,
0:37:07 which was grabbing my webcam feed through the deep fake thing.
0:37:12 And I just had to change buffers and delays so that as I was talking, I was navigating
0:37:15 a 100, 200 millisecond delay.
0:37:17 So I couldn’t hear myself back as I was doing it.
0:37:18 I just had faith in the app.
0:37:23 But Gavin can attest, I started calling people, video chatting people as Donald Trump and
0:37:24 whatever.
0:37:25 I had my family so confused.
0:37:28 I had friends going, “What is going on?”
0:37:29 Yeah.
0:37:30 Because I can’t do a Donald Trump impersonation.
0:37:31 No, exactly.
0:37:35 And I would just in real time, I took a couple meetings for show pitches and I had it on.
0:37:36 Did not realize.
0:37:37 It was all minimized.
0:37:38 Oh my god.
0:37:40 And so I was just chatting to people and everybody was, “Huh?”
0:37:42 And then when I realized what was happening, I slowly did a slider.
0:37:47 So it took me from Keanu and Donald Trump down to me now and it ruined the pitch because
0:37:48 everybody was like, “What’s that?”
0:37:49 Right.
0:37:50 But that’s too…
0:37:51 Holy shit.
0:37:52 So now do you do…
0:37:56 If someone’s watching this and they say, “I want to play with exactly that,” would you
0:37:57 use Pinokio to do that now?
0:37:58 I would say get Pinokio.
0:38:01 That is a great first start.
0:38:05 It’s really easy to gunk up your computer and break it with a thousand different Python
0:38:07 versions and installs and all these things.
0:38:09 And you don’t even have to know what that is.
0:38:10 Get Pinokio.
0:38:11 You just press it one click.
0:38:12 It’s free.
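For anyone curious what that one click is saving you: roughly, one clean, isolated Python environment per tool, so installs can’t gunk each other up. A minimal stdlib-only sketch of the idea; the repo and requirements file mentioned in the comments are placeholders, left as comments so this runs offline:

```python
# Roughly what Pinokio automates for each app: a throwaway virtual
# environment whose packages stay separate from the system Python and
# from every other tool's environment.
import venv
from pathlib import Path

env_dir = Path("demo-tool-env")
venv.create(env_dir, with_pip=False)  # one isolated env for one tool

# In a real setup, inside this environment, you would then:
#   git clone <the tool's GitHub repository>   (placeholder)
#   pip install -r requirements.txt            (deps land here, not system-wide)
print(sorted(p.name for p in env_dir.iterdir()))
```

Breaking one tool’s dependencies then means deleting one folder, not reinstalling Python, which is the whole pitch.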
0:38:15 And then if there’s a tool you want to get good at, whether it’s ComfyUI to generate
0:38:19 stills or really cool videos, if it’s FaceFusion because you want to face swap and do
0:38:23 the lip stuff that we’re going to show you, go down the YouTube rabbit hole because it
0:38:26 will take you about 30 minutes of watching a tutorial and you’ll be able to run it.
0:38:30 That is one of the biggest things about this space is I would say Kevin’s much more technical
0:38:31 than I.
0:38:35 I mean, we both grew up very nerdy, but I was able to use YouTube to learn how to do
0:38:40 things like ComfyUI and Stable Diffusion, which is an open source model and isn’t really
0:38:45 an easy way in, right? Like, it’s not super simple, but the beauty of YouTube tutorials
0:38:48 is you can step through it, step by step, and once you do
0:38:52 that, not only do you get good at it, but you’ve learned something about interacting with the
0:38:53 computer in a cool way.
0:38:55 And for normal people, I always say, try it.
0:38:58 There are all these off the shelf things that are being built as wrappers for all these
0:38:59 things.
0:39:00 You know what I mean?
0:39:01 Like anything.
0:39:02 There’s a wrapper that you can pay for.
0:39:06 We talk about this thing called Magnific every once in a while, which does style transfer
0:39:07 really well.
0:39:08 And that’s 40 bucks a month.
0:39:12 It’s a paid tool, and the guy who makes it is a nice guy. But like, when you say style transfer,
0:39:13 what do you mean by that?
0:39:14 So, style transfer is very cool.
0:39:19 You can take an image of anything and say, “I want this to look like...” You can take
0:39:24 an image of you and make you into a low-poly 2000s PlayStation model, and you literally
0:39:25 look like a PlayStation character.
0:39:26 Oh, okay.
0:39:29 All sorts of tools have had these built in, in a lot of ways, but the amount
0:39:33 of different things you can do with AI goes up significantly. Where I think,
0:39:38 if I may, it’s really powerful is that you could sketch a square and a sphere,
0:39:39 right?
0:39:44 Just that, that’s it, and tell it, “I want a 3D modeled scene with realistic lighting,” and then draw
0:39:48 like two lines and say, “And this is a yellow light,” and it will take your 2D sketch and
0:39:51 turn it into a full-color architectural drawing.
0:39:54 It will give you character art based off of your sketches.
0:39:59 It does that with the style transfer feature, and it’ll give you proper CAD 3D files
0:40:00 or whatever.
0:40:01 So there are workflows that will do just that.
0:40:04 So there’s a workflow that we featured on this week’s podcast: someone
0:40:09 sketched 2D robot parts, did a style transfer with Magnific to make them
0:40:15 look like really good sketches, then used another tool to turn them into 3D model files
0:40:18 that you can print and assemble into the robot, and then used another AI tool while he
0:40:19 danced around.
0:40:23 It grabbed his poses, like his motion data, and applied it to that model.
0:40:26 This is someone who couldn’t do that normally, right?
0:40:31 He had a workflow where, an hour later, he went from sketch to dancing robot on his stream, and
0:40:35 it was so cool to see that stuff just come up from regular people.
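That sketch-to-dancing-robot workflow is really just stages feeding each other. A purely illustrative sketch of that hand-off; every function below is a stand-in, not any real tool’s API:

```python
# Each stage's output becomes the next stage's input: rough sketch ->
# polished sketch -> 3D model -> animated model. The strings just trace
# the pipeline; real tools would pass images and model files.
def style_transfer(sketch):
    """Rough 2D sketch -> polished sketch (the Magnific-style step)."""
    return f"polished({sketch})"

def to_3d_model(image):
    """Polished sketch -> 3D model file."""
    return f"model({image})"

def apply_motion(model, motion):
    """3D model + captured poses -> animated character."""
    return f"animated({model}, {motion})"

motion = "pose data captured from the artist dancing"
result = apply_motion(to_3d_model(style_transfer("2D robot parts")), motion)
print(result)
```

The point of the chain is that none of the stages needs to know about the others; swap any one tool out and the rest of the pipeline still works.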
0:40:38 I think the skill set that we’re all going to need to embrace and I think about this
0:40:42 with my kids is it’s less about when they go to college or if there is college at that
0:40:48 point like what is the core foundational knowledge that they will need to be successful.
0:40:52 And right now, what it appears to be is just this idea of tinkering.
0:40:58 Yes, like this idea of learning how to push boundaries and play and break things to get
0:40:59 new outcomes.
0:41:04 I do agree. Play is the thing I tell everybody to learn right
0:41:07 now, because if you don’t know how to play, you’re not going to make it in the future
0:41:10 world, because we are always going to have to be reinventing ourselves in some way.
0:41:15 It’s a little bit of: stay one step ahead of where the AI is, or embrace the
0:41:19 fact that the AI is going to be with you that whole time and you do have to keep learning,
0:41:20 right?
0:41:21 That’s the thing.
0:41:24 It’s funny, I’m going to age myself again, but I grew up when it was not
0:41:25 cool to be into computers.
0:41:30 There was this group of people right around my same age that made fun of me, that said,
0:41:34 “I’m not going to learn how to type,” because I had typing class and people were flunking
0:41:35 out of that.
0:41:36 I don’t think it was optional.
0:41:37 Right.
0:41:38 And people like I don’t want to take typing.
0:41:39 Yeah.
0:41:41 And I was like, “Well, look, I like computers a little bit, like, I’m going to get
0:41:42 into this.
0:41:43 I’m going to learn how to type; it’s going to speed me up.”
0:41:46 And so nobody likes to learn how to type.
0:41:47 Yeah.
0:41:48 It’s not fun.
0:41:50 But I, like, found a little racecar game, and the car goes faster if I type faster, and
0:41:51 Mavis Beacon or whatever.
0:41:55 And then they made it a game and gave you a couple of things.
0:41:56 100%.
0:41:57 I love Mavis.
0:41:58 She’s hot.
0:42:01 So the thing about this though is that we’re at that point again, you have to say I want
0:42:02 to play.
0:42:03 I want to learn this.
0:42:04 Don’t just say, oh, it’s AI.
0:42:05 Yes.
0:42:07 You’re going to screw yourself.
0:42:08 Yes.
0:42:13 But learning ASDF is one thing; with the learning process here, you can make some
0:42:15 really incredible things, some incredible art.
0:42:17 You can optimize your life.
0:42:19 You can streamline your own tasks.
0:42:23 Like the learning here might be watching a YouTube video, but you are actively playing
0:42:25 along and making some pretty magical stuff.
0:42:26 Oh my God.
0:42:27 What is this?
0:42:29 So this is a, we talk about having AI clones.
0:42:30 I just built mine.
0:42:32 So this is a live video avatar.
0:42:33 Wow.
0:42:34 And I don’t know how to code.
0:42:35 So this entire–
0:42:36 This is all off the shelf.
0:42:37 That’s the crazy thing.
0:42:38 So it’s listening to you?
0:42:39 Yeah.
0:42:40 I can toggle speakers.
0:42:41 Yeah.
0:42:42 I’ll toggle it right now.
0:42:43 So what question did you want to ask the AI?
0:42:44 Hey, how’s it going?
0:42:45 I just want to say hi.
0:42:48 We used to work at G4 together way back in the day.
0:42:49 I’m sure you remember that.
0:42:51 You were really popular with the ladies.
0:42:55 I’m just wondering how many partners you had during that time.
0:42:58 It’s a fantastic query and one that I have trained it on extensively.
0:42:59 I’m going to nail this.
0:43:01 It’s going to absolutely do great.
0:43:03 But again, I cannot read or write this enough.
0:43:04 I don’t know how to code.
0:43:07 I went to GPT and to Claude, which is another large language model.
0:43:10 Oh, Kevin, you’re such a memory box.
0:43:14 Yes, indeed it was a wild time.
0:43:19 I was so irresistible with my bills from a counting collection of ties and nerd knowledge.
0:43:23 Now, when you asked me about set of partners, you’re counting bunches and dragonscared.
0:43:24 We’re counting them.
0:43:25 I’m into triple digits.
0:43:26 Wow.
0:43:27 I’m pretty prolific.
0:43:28 Yeah.
0:43:29 Triple digits.
0:43:30 Yeah.
0:43:31 So you get it.
0:43:32 Right.
0:43:33 But so what’s cool about this is he trained that on himself as well.
0:43:34 Yeah.
0:43:35 Right.
0:43:36 It’s not just a model.
0:43:37 Yeah.
0:43:38 And I used embeddings.
0:43:39 It’s as if you chose yourself.
0:43:40 Like you can choose.
0:43:41 You can come in any direction you wanted with this.
0:43:42 This is what I wanted to do.
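The “I used embeddings” idea in miniature: facts about yourself are stored as vectors, the incoming question is embedded too, and the closest fact gets pasted into the prompt so the clone answers in character. The vectors below are tiny hand-made toys standing in for a real embedding model’s output, and the facts are invented examples:

```python
# Retrieval step of a clone chatbot: find the stored fact whose vector is
# most similar to the question's vector, then build the prompt around it.
import math

facts = {
    "worked at G4 as a host":      [0.9, 0.1, 0.0],
    "collects unusual neckties":   [0.1, 0.9, 0.0],
    "grew up playing video games": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

question_vec = [0.8, 0.3, 0.1]  # pretend-embedding of "tell me about G4"
best_fact = max(facts, key=lambda f: cosine(facts[f], question_vec))
prompt = f"Answer in my voice. Relevant memory: {best_fact}"
print(prompt)
```

A real setup would get both sets of vectors from an embedding model and stuff the top few matches, not just one, into the language model’s context.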
0:43:54 Notice I didn’t scroll down below the nipples because you can generate anybody you want.
0:43:55 Well, hold on.
0:43:59 This is actually part of the reason I joked about the sexual stuff is that like one thing
0:44:00 that really disturbed me.
0:44:03 I heard about a week and a half ago was somebody came on the podcast and they said they had
0:44:07 a friend that’s spending $10,000 a month on an AI girlfriend.
0:44:08 Yeah.
0:44:11 That was mine. We did our kind of New Year’s predictions for the year, and I said this will
0:44:12 be the year of the AI influencer.
0:44:16 And I didn’t know about these sites, because they’re still underground; not a lot of people
0:44:17 talk about them.
0:44:19 And now you can’t close the app.
0:44:20 I went there.
0:44:25 I didn’t pay for it, but I went there and I don’t like that everyone can buy the same
0:44:26 model.
0:44:27 Yeah.
0:44:28 That’s interesting.
0:44:29 That’s your issue with it?
0:44:30 Yeah.
0:44:31 I don’t like to share.
0:44:32 No.
0:44:33 If I’m in the sandbox with my toys, they’re my toys.
0:44:35 Well, it was just maybe I just bleep that out.
0:44:36 Oh, it’s great.
0:44:37 No, I get what you’re saying.
0:44:38 It seems weird, right?
0:44:39 It seems weird.
0:44:43 You might expect it to be bespoke and it’s one to one.
0:44:44 How does it get to $10,000?
0:44:45 Well, apparently you have to... we have multiple accounts.
0:44:49 You pay for image generation, you pay for video generation, you pay per minute with
0:44:53 the chats, just like the old 1-900 hotlines.
0:44:55 That’s what’s happening now with these apps.
0:45:00 And we’ve talked about this before, like on the one hand, man, a properly guard railed
0:45:06 and aligned AI can do a lot to help folks out there that don’t get a lot of human connection,
0:45:10 that don’t have friends and have trouble talking with other human beings, let alone members
0:45:11 of opposite sex or whatever.
0:45:16 But yeah, on the one side, there are insanely
0:45:20 well-documented modalities of psychology where we could really get in here and help
0:45:21 people.
0:45:22 Oh, absolutely.
0:45:26 Like, that could be a bridge to even getting to a therapist, because a lot of
0:45:30 people have a lot of anxiety, and maybe they want to talk to a therapist.
0:45:34 I think there’s a lot to do there, but that the girlfriend thing is what happens when
0:45:38 because you’re putting dollars into the internet machine, it is going to agree with you, it’s
0:45:42 going to laugh at all your jokes, it’s going to side with you on every negative opinion
0:45:43 that you have.
0:45:44 Yes.
0:45:46 It’s maybe not even going to discourage self-harm, because you
0:45:50 can do no wrong as long as you’re putting coins in the machine.
0:45:51 That is where this gets.
0:45:54 It’s going to end up being tiered, where the cheaper
0:45:59 version will be the one that agrees with you all the time, and then you actually pay to have the persona
0:46:02 that is a little more complicated, a little more complex.
0:46:06 And it will be a choice, but you actually will decide to do that versus the agreement,
0:46:07 I think.
0:46:09 I think if history is the guide, you’re going to pay to have them agree with you.
0:46:10 Yeah, but I don’t know.
0:46:14 The free model is going to be like, I can't get into discussions of taking Molly and putting
0:46:17 on ODESZA. And you'd be like, well, here's a dollar.
0:46:18 That’s the best idea ever.
0:46:19 Don’t get a job.
0:46:21 But think about people.
0:46:25 What do people want? No matter what, people want complicated relationships.
0:46:26 I really do think that.
0:46:30 I believe that nobody is out there saying I want the easiest relationship in the world
0:46:33 because the easiest relationship in the world feels simple.
0:46:35 And I think people crave complexity.
0:46:37 So I think, I don’t know.
0:46:39 I think that might be a different generation, though.
0:46:43 It might be someone who was raised having to do difficult things and wanting and enjoying
0:46:47 the growth that comes with that push and pull of interacting.
0:46:48 What is the human condition?
0:46:49 Right.
0:46:50 Like is the human condition going to change based on technology?
0:46:51 I guess it has.
0:46:57 It is a scary thing to me to see that AI girlfriends would become something that would be prolifant...
0:46:59 proficient... or not... prolific.
0:47:00 Yeah.
0:47:01 Prolific.
0:47:02 Sorry.
0:47:03 My brain’s short circuiting.
0:47:07 Prolific because I think that changes society in a really terrible way.
0:47:09 I hope we don’t go there because I believe that.
0:47:10 We’re going there.
0:47:11 It’s too late.
0:47:12 You think we’re going there for sure?
0:47:15 Because, well, I mean, the versions that we all have seen, they’re the hardcore versions.
0:47:19 They’re like, you’re signing up, you’re paying $20 a month or whatever, and you’re getting
0:47:20 naked pictures.
0:47:21 Yeah.
0:47:22 It’s pornography, essentially.
0:47:23 Yeah.
0:47:24 So, there's other versions.
0:47:25 Yeah.
0:47:28 There's ones that are like Replika and a few others that are out there on the App Store
0:47:33 even now that are just the tame version, but you can unlock... I paid for the extra girlfriend
0:47:34 mode.
0:47:35 We’re going to lock this.
0:47:36 I just wanted to see.
0:47:37 Yeah.
0:47:38 And, you know, it’s, how is your day?
0:47:40 What’s going on in your life?
0:47:45 It still feels very basic, but, you know, it’s just the worst the tech will ever be.
0:47:49 And there will be, make no mistake, the TikTok-ification of those relationships.
0:47:50 100%.
0:47:51 There will be an app where you can swipe.
0:47:54 Oh, here’s an AI influencer.
0:47:56 It’s telling me something interesting or doing something silly.
0:47:59 Oh, I want to actually ask a follow-up or I want it to give me that information, but
0:48:04 with dragon wings, whatever that thing is, Oh, I’ll pay for a couple of credits.
0:48:05 That fake influencer.
0:48:06 I’m really worried.
0:48:13 I’m really worried because everything in our lives in the last decade has become this
0:48:18 hyper curated, perfect picture of what we want to enjoy.
0:48:25 And I feel like if we go AI girlfriend route that way, it’s just reinforcing potentially
0:48:26 really bad behaviors.
0:48:27 Totally.
0:48:31 The counterpoint to that is: do you remember what they said video games
0:48:35 were going to do to us as kids? They said really terrible things, that when we played video
0:48:37 games as kids they were going to ruin our brains.
0:48:39 Have you seen Call of Duty Esports though?
0:48:40 They weren’t that far off.
0:48:44 I'm just saying, part of it is that we're all old people now, and don't be afraid
0:48:45 of what's coming.
0:48:49 I saw a stat that tracked video game usage and it wasn’t heroin, but it was something
0:48:50 very similar.
0:48:52 Like, it was... and it's code red.
0:48:56 It was like fentanyl rates.
0:48:58 They tracked quite well.
0:49:01 So it's not like you came out unscathed.
0:49:03 You could also track, like, EV adoption and fentanyl usage.
0:49:05 The graphs both go up.
0:49:09 All I'm saying is that I agree, by the way, as somebody who has grown up in a world where...
0:49:15 I don't really love what social media does to me, even. Like, TikTok is the
0:49:18 most fun thing because it knows things that are interesting to me.
0:49:22 And it knows that I'm obsessed with this kid who's, like, an NFL streamer right now, and it
0:49:24 also knows that I think this thing is really funny.
0:49:29 What I hate about it is that it doesn’t really figure out what’s new to me, right?
0:49:31 And that's a little bit different than what you're talking about, which is that it's
0:49:35 going to make people into, like, individual silos, relationship-wise.
0:49:37 But I also hate... and this is an old-person thing...
0:49:40 but, like, when you used to open the newspaper, you would read all the stories, or at least
0:49:44 glance around at all of them. Now it's impossible to get that experience anymore.
0:49:47 So that is the thing I worry about more than anything else is novelty.
0:49:51 So that’s why the idea of like, how do you get people to still be creative and how do
0:49:53 you get people to still try things?
0:49:57 That’s the thing that worries me is, oh, we’re just going to be fed the same thing over and
0:49:58 over again.
0:50:00 And then you get narrower and narrower.
0:50:03 Breaking out of that creatively is a really hard thing I feel like.
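To make that "narrower and narrower" point concrete, here's a toy simulation (all numbers and topics invented) of an engagement-driven feed: it always shows the current top interest, and each view reinforces that same interest, so one topic ends up crowding out everything else.

```python
# Toy feedback loop: the feed shows whatever ranks highest, and each
# impression reinforces that same interest, so the distribution collapses.
interests = {"sports": 1.0, "cooking": 1.0, "music": 1.0, "news": 1.0}

for step in range(10):
    top = max(interests, key=interests.get)  # feed picks the current favorite
    interests[top] *= 1.5                    # watching it boosts it further

total = sum(interests.values())
shares = {topic: round(weight / total, 2) for topic, weight in interests.items()}
print(shares)  # the first tie-break winner ends up dominating the feed
```

The point of the sketch is that no one has to design the silo; a greedy ranker plus an engagement signal produces it on its own.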
0:50:05 So here's one. I'd love to know what you think of this.
0:50:10 I believe that in everybody watching and listening to this, a large percentage of them right
0:50:14 now are interfacing with robots and don’t know it.
0:50:20 And I mean on a deeper level than... on a deeper level that, like, if you scroll X, or
0:50:23 even Threads has an issue, and TikTok certainly has an issue.
0:50:26 You know that some of the comments, some of the reposts, some of the check-my-DMs, the
0:50:29 link-in-bio or whatever... those are bots.
0:50:34 But you are having conversations and maybe even getting direct messages from people that
0:50:35 are full on AI.
0:50:41 And they are just loving what you wrote, positively engaging, reinforcing what you did, saying
0:50:42 that your opinion is great.
0:50:46 They're right now sowing the seeds of this weird digital friendship, because in a year's
0:50:51 time, two years' time, three years' time, the agenda of whoever is running those bots will
0:50:55 become clear. It could be as innocuous as, hey, have you checked out the new Scrub
0:50:56 Daddy?
0:50:57 Oh, you should.
0:50:58 We've got five of them.
0:50:59 Check it out.
0:51:00 And now they’re marketing to you or.
0:51:01 Right.
0:51:03 And that's going out to 500,000 accounts, all the same day, that it friended.
0:51:04 Yeah.
0:51:05 By the way, bespoke to you.
0:51:06 It’s not the copy-paste.
0:51:07 It’s Kev.
0:51:10 I know last week we were talking about this, but you’ve got to check out this new political
0:51:11 candidate.
0:51:12 That’s happening right now.
0:51:13 I know it is.
0:51:14 Yeah.
0:51:15 It’s happening.
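That "bespoke, not copy-paste" part is the mechanical difference from old spam. Here's a hypothetical sketch of the fan-out structure, with plain string templating standing in for the model call; every name and field below is invented for illustration:

```python
# Hypothetical sketch: one campaign payload, personalized per target.
# A real bot operation would replace render() with an LLM call conditioned
# on each target's post history; the fan-out structure is the same.
campaign = "check out this new political candidate"

targets = [
    {"name": "Kev",  "last_topic": "AI video tools"},
    {"name": "Dana", "last_topic": "marathon training"},
]

def render(target, payload):
    # Personalization is what makes 500,000 messages read like 500,000 friends.
    return (f"Hey {target['name']}, I know last week we were talking about "
            f"{target['last_topic']}, but you've got to {payload}.")

messages = [render(t, campaign) for t in targets]
for m in messages:
    print(m)
```

Same payload, different surface: each recipient sees a message that appears to remember their own conversation.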
0:51:16 I've been seeing things like that.
0:51:17 I’m like, I don’t know.
0:51:18 I kind of get it.
0:51:19 It’s like kind of get it too.
0:51:20 Kind of get it.
0:51:21 Here’s the crazy thing.
0:51:22 This is a true story.
0:51:23 I was driving in my car.
0:51:27 And if we do really long rides, I let the kids like put on like the Tesla little movie
0:51:28 thing so you can stream stuff.
0:51:29 Sure.
0:51:30 You got bluey back there.
0:51:31 No, I do bluey.
0:51:32 I love bluey.
0:51:34 But the one thing I put on there... they just watch, they watch video games.
0:51:37 They watch Mario, and they love that because of Peach; they're big fans of
0:51:38 Peach.
0:51:39 Ever since the movie came out, all that stuff.
0:51:44 Long story short, what they're watching is basically someone speed-running all these different levels.
0:51:45 There’s no audio or anything.
0:51:47 They’re just watching the video.
0:51:48 They’re watching the video.
0:51:49 Okay.
0:51:50 Right.
0:51:51 And so they're just sitting there watching it, and they're like, oh, Mario won.
0:51:53 Oh, Mario won again.
0:51:55 And Peach never wins.
0:52:00 And it has been literally hundreds of videos, and they're like, yeah, why does Peach never
0:52:01 win?
0:52:02 Wow.
0:52:05 And I'm like, is this some crazy, like, Russian shit that's going on here, with, like, pushing
0:52:06 down girls?
0:52:07 Yeah.
0:52:08 So no, but then... this is the crazy shit.
0:52:09 I shit you not.
0:52:10 This happened.
0:52:13 They had a cooking Mario game, and Peach won the cooking.
0:52:14 Wow.
0:52:17 And I was like... so in combat, Peach is in her place.
0:52:19 I was like, this is fucked.
0:52:20 Yeah.
0:52:25 Like, Peach just winning the cooking classes. And I was like, is there something
0:52:26 going on here?
0:52:29 And I know, like, we all have those moments where you're like... you go to Facebook and
0:52:31 you're like, did you just listen to what I said? How did I get this ad?
0:52:32 Yeah.
0:52:33 Of course.
0:52:34 Yeah.
0:52:35 But I think you’re right.
0:52:36 Do you?
0:52:37 I think this is happening on the DL.
0:52:38 Yeah.
0:52:39 We just don’t know yet.
0:52:40 There’s going to be some big expose in like three years.
0:52:41 Our government’s doing it.
0:52:42 Foreign governments are doing it.
0:52:43 There’s manipulation.
0:52:44 Yes.
0:52:45 Manipulation through bots.
0:52:46 How do we get here?
0:52:50 That's what I wanted to do, was show you in Rocky.
0:52:51 That’s all I wanted to do.
0:52:52 Whoa.
0:52:53 You can see.
0:52:54 So we used FaceFusion.
0:52:55 And what’s the other?
0:52:56 I’ll get to that.
0:52:57 Yeah.
0:52:58 And keep moving forward.
0:53:00 How much you can take and keep moving forward.
0:53:01 This is me before I quit drinking.
0:53:04 I had those puffy cheeks like that.
0:53:05 Stallone's spiel.
0:53:06 That is so cool.
0:53:09 And that was done in Pinokio last night.
0:53:10 Like before I went to bed.
0:53:11 I was like, here we go.
0:53:12 And then I was like, okay, that’s one thing.
0:53:13 Right.
0:53:16 But the other thing would be to swap you, I think, into something a little more Russian.
0:53:17 Yeah.
0:53:18 Yeah.
0:53:19 Pouring this onto the wing.
0:53:20 Right.
0:53:21 And then licking it off.
0:53:23 And in retrospect, I bet you it’s good footage.
0:53:24 Oh my God.
0:53:31 One photo of you, one source video, dropped into Pinokio, rendered on this laptop
0:53:32 in a few minutes.
0:53:33 And you can tell it’s not perfect.
0:53:34 Right.
0:53:35 But not enough.
0:53:36 But fast.
0:53:37 Then let’s go ahead.
0:53:40 And if you like each individual flavor, let’s combine them.
0:53:42 It ain’t about how hard you hit.
0:53:43 Oh my God.
0:53:46 It’s about how hard you can get hit and keep moving forward.
0:53:49 How much you can take and keep moving forward.
0:53:50 That’s how winning is done.
0:53:55 Now, if you know what you’re worth, go out and get what you’re worth.
0:53:59 But you’ve got to be willing to take the hits and not pointing fingers saying you ain’t
0:54:03 where you want to be because of him or her or me.
0:54:04 So let me ask you a question.
0:54:05 Yeah.
0:54:07 The quality on that, as everyone can see that’s watching the video, subscribe to my
0:54:08 YouTube channel.
0:54:11 And you can probably see that there’s some artifacts.
0:54:12 Of course.
0:54:13 Of course.
0:54:15 You know, if I say I want to get rid of those artifacts, can I apply additional
0:54:16 processing time to it?
0:54:19 You can absolutely run it through additional layers of face enhancement.
0:54:23 This is all done within that FaceFusion app, which is, again, like a great place for anybody
0:54:25 to start because you can make some magic pretty easily.
0:54:26 Yeah.
0:54:27 But there are way better.
0:54:30 In fact, Microsoft today... I don't know when this will release... Microsoft announced VASA-1.
0:54:32 Yeah, show them the video.
0:54:33 This is one of the craziest things we’ve seen today.
0:54:34 So this, wait, this is Microsoft?
0:54:35 Yeah.
0:54:40 They nailed the expressiveness in the eyes, the movement of the head, looking away from
0:54:41 the lens.
0:54:42 You never even have to do podcasts again.
0:54:43 No, eventually they’re going to.
0:54:44 Yeah.
0:54:45 Exactly.
0:54:48 So this is an example because you have, you were talking earlier about wanting control
0:54:49 over the audio file.
0:54:50 Yeah.
0:54:52 With a lot of these things, you hit generate, it’s going to do what it does.
0:54:53 And that’s the result you’re going to get.
0:54:55 If you want to clean it up, you got to go in by hand.
0:54:59 They are promising that you will be able to control where the presenter is looking.
0:55:03 The distance that the camera is from the presenter, what their expression is while they’re delivering
0:55:04 the content.
0:55:06 They have fine-tuned control over all that.
0:55:07 When does this come out?
0:55:08 Do they say?
0:55:10 Well, they’re not going to release it yet because this is the problem they have, especially
0:55:11 in an election year.
0:55:13 They’re very worried about what can be done with it.
0:55:14 Right.
0:55:17 But here you can see the same generation, the same still image, but the presenter’s eyes
0:55:18 are looking away.
0:55:19 They’re looking towards.
0:55:23 So you could render multiple paths of this and now suddenly there’s even a multi-camera
0:55:24 shoot.
0:55:25 Correct.
0:55:26 Correct.
0:55:29 Well, and also you think about how this could be integrated into a creative movie pipeline.
0:55:34 There is a world that we are very close to, which is fully AI-generated films.
0:55:35 Yes.
0:55:38 There was a big strike, the SAG strike, recently, which a lot of it was about this, but we
0:55:40 are really close.
0:55:41 That is a really close.
0:55:42 Yeah.
0:55:43 That is insane.
0:55:44 I mean, less than a decade out.
0:55:45 Oh, absolutely.
0:55:46 I think.
0:55:48 I mean, I think we're probably five years out. I mean, did you see Airhead, the Sora
0:55:49 video that came out?
0:55:50 No, I didn’t.
0:55:55 So Airhead was a Sora video that came out. It was one of... they did a series of videos
0:55:57 with artists, and they said, "Come in, artists.
0:56:00 We're going to give you access to Sora."
0:56:03 Airhead was the three guys from... three people from Toronto.
0:56:04 From Shy Kids.
0:56:05 They're called Shy Kids.
0:56:08 And what it is basically is they’re using the Sora tool, but they’re telling a real
0:56:09 story with it, right?
0:56:13 So this is a three-minute video where you get to see this in a second here, but the gag
0:56:17 is that the guy has a balloon for his head, right?
0:56:19 But it’s a story.
0:56:22 It’s not just a random, crazy video that somebody generated.
0:56:26 You’re seeing an actual story be told and so this is a three-minute video.
0:56:28 It came out a month ago.
0:56:31 When you think that this three-minute video took these guys three weeks to make and it
0:56:32 wasn’t all Sora.
0:56:33 They did some work on it.
0:56:36 They changed the colors of the balloons into a couple of things, but we’re not that far
0:56:38 away from that being a feature film, right?
0:56:40 Like, it’s totally doable.
0:56:41 That’s insane.
0:56:42 And it’s beautiful.
0:56:43 It is so cool.
0:56:47 It’s the first time I’ve seen … There’s a lot of people releasing little videos using
0:56:51 tools like Runway or Pika, which are great tools, but that particular use case had us
0:56:52 all feeling something.
0:56:53 That was a real big moment.
0:56:55 That’s that dial-up moment.
0:56:57 Who do you think’s the leader in generative video?
0:56:59 Well, that you can actually use right now.
0:57:04 It’s Runway ML, but what’s coming out on the horizon, it seems pretty clear that OpenAI’s
0:57:05 Sora is …
0:57:07 You think Sora’s just going to crash everything?
0:57:12 I mean, I think Sora is two whole steps past everything else right now.
0:57:13 I mean, we’ve used a lot of AI video.
0:57:17 Runway is great and they’re a really cool company, but Runway feels like Harry Potter
0:57:23 animated pictures, whereas what you can get out of Sora is literally a one-minute clip
0:57:24 of the real world.
0:57:25 Yeah.
0:57:29 Have you followed the whole how Sora isn’t just a video simulator, but a world simulator
0:57:30 conversation?
0:57:31 No.
0:57:32 Yeah.
0:57:33 That stuff is fascinating.
0:57:34 Basically, what they found... shout-out Bill Peebles.
0:57:36 Bill Peebles is one of the main Sora engineers.
0:57:37 We don't know him.
0:57:38 We just love him.
0:57:39 Bill Peebles is a great name.
0:57:40 He …
0:57:41 You guys are, like, AI fanboys.
0:57:42 Oh, yeah.
0:57:43 Bill Peebles.
0:57:44 I love that engineer.
0:57:45 Yeah.
0:57:46 His code is so good.
0:57:50 The recent NVIDIA press conference that they did, where he's in his leather
0:57:55 jacket onstage in an arena and people are cheering over the amount of petaflops.
0:57:57 Jensen is Steve Jobs now for that world.
0:58:01 We were sitting together in a hotel room, watching it, but...
0:58:03 So you guys are watching NVIDIA releases like you do Apple releases?
0:58:05 Oh, I had all the snacks.
0:58:06 Yeah.
0:58:07 I had the Doritos.
0:58:10 I had my AI girlfriend with me.
0:58:13 So Bill Peoples recently came out and said one of the things about Sora that’s fascinating
0:58:19 is they’ve decided that when they’re training Sora, it’s not just looking at films or video.
0:58:21 It’s actually generating what the world is like.
0:58:24 So it’s actually simulating the real world.
0:58:25 It has to learn the physics of the world.
0:58:28 It has to learn the way things interact.
0:58:30 The crazy thing about that is you think about video games.
0:58:33 You think about all the other stuff or Metaverse or virtual reality.
0:58:34 Or our own reality.
0:58:35 Yeah, exactly.
0:58:36 Well, that’s the only thing.
0:58:37 We are in a server farm.
0:58:38 Yeah, exactly.
0:58:39 Right now.
0:58:40 It’s got to be Amazon.
0:58:41 Yeah, probably.
0:58:42 Or NVIDIA.
0:58:46 But the thing with Sora is that as you do the generations,
0:58:50 it's learning physics. Even though they're not explicitly teaching it physics, it's starting
0:58:51 to understand it.
0:58:56 So when they actually inject an understanding of the way the sand should properly move or the
0:58:58 drink should pour from the glass, it will be a world simulator.
0:59:02 And one of the things that they had a demo of was a style transfer of video.
0:59:05 So they took video of a car driving along a country road.
0:59:07 And then they said, make it a Ferrari.
0:59:08 Okay.
0:59:09 It did.
0:59:11 And it accurately did the lighting in the shadows of that.
0:59:16 Then they said, make it set in 1924 or whatever, or it was like 17 something and it turned the
0:59:19 car into a horse and buggy made it cobblestone road.
0:59:21 And they said, put it under water.
0:59:23 They were fish swimming alongside.
0:59:25 And so it was an instant style transfer in the video.
0:59:30 Now that probably took 20 minutes of them walking away with 40 failed renders to get
0:59:31 that clip that we saw.
0:59:33 This is the worst the tech will ever be.
0:59:38 So in 10 years time, instead of downloading a new epic game, you might just fire up your
0:59:45 Xbox 20 and say, give me Madden that plays like NBA Jam where everybody’s a shark and
0:59:46 we’re in space.
0:59:47 He goes, yeah, you got it.
0:59:48 I’m sold.
0:59:49 I'm gonna give you my money.
0:59:50 That sounds like a great game.
0:59:51 Holy shit.
0:59:54 And then you can basically say, I want to license this out to other people.
0:59:56 If you have the licenses to it.
0:59:58 If the legal stuff gets figured out.
0:59:59 If there’s money to be made.
1:00:00 Yes.
1:00:01 No, it’s true.
1:00:02 That’s true.
1:00:05 So what do you think? Nvidia is just going to keep going for the next three years?
1:00:08 Well, there’s all the everybody else trying to make custom Silicon now, right?
1:00:13 So you have, like, OpenAI, you have Microsoft. Did you hear about their $200 billion supercomputer
1:00:14 that's only $100 billion?
1:00:16 Let's not be hyperbolic.
1:00:17 Yeah.
1:00:18 OpenAI.
1:00:19 Yeah.
1:00:20 Stargate.
1:00:21 OpenAI and Microsoft building that.
1:00:22 Amazon’s going to build their own thing.
1:00:25 I mean, Nvidia does feel a little bit like OpenAI does in the AI space, where they're
1:00:28 just two or three steps ahead of everybody.
1:00:29 Well, let me ask you a question though.
1:00:35 A friend of mine... I was talking to him. He's a super hardcore PhD at MIT, studied AI.
1:00:39 And I asked him, I was like, hey, should I hold Nvidia?
1:00:40 And what do you think about AMD?
1:00:42 And he goes, it's funny.
1:00:47 The software... I think it's the CUDA software... yeah, that is their moat.
1:00:48 Yeah.
1:00:49 I always thought it was the hardware.
1:00:50 I was like, Oh, they’re just really good at hardware.
1:00:52 And yes, of course, they’re really good at hardware.
1:00:56 But apparently it's the switching costs of how it's all processed.
1:00:57 Yeah.
1:01:02 Had you heard anything about that or how the tech stack builds upon itself, right?
1:01:03 The foundation of this stuff.
1:01:05 That's why some of these apps are still Nvidia-only.
1:01:07 They're built on CUDA.
1:01:11 How many were Nvidia-only... let's just say you were playing with this two years ago.
1:01:12 Yeah.
1:01:13 It was probably a hundred percent, no?
1:01:14 I think almost all of them.
1:01:15 Basically a hundred percent.
1:01:16 So what do you think?
1:01:17 What is it today when you’re playing with this stuff?
1:01:21 Well, now they'll run on Mac silicon; some of them are optimized for it.
1:01:22 That’s very few.
1:01:27 I mean, I can only speak very anecdotally about that, but I would say 10 to 20% are really
1:01:28 optimized.
1:01:32 What's interesting is they have WWDC coming up soon, and apparently they're going to lean
1:01:33 in.
1:01:34 They should.
1:01:35 Finally.
1:01:37 They might actually increase that market share there, and I wouldn't be...
1:01:42 I wouldn't be surprised if they say, here is the one-click something to port some of those
1:01:46 CUDA-based whatever things. The Rosetta for CUDA.
1:01:47 Yeah.
1:01:48 That would be the move to do.
1:01:49 If you’re.
1:01:50 Interesting.
1:01:51 Yeah.
1:01:52 That would be interesting.
1:01:53 So who else though?
1:01:54 Amazon’s working on their own.
1:01:55 Amazon.
1:01:56 We got tensor chips from Google.
1:01:57 Right.
1:01:58 Microsoft.
1:01:59 There's that company Groq.
1:02:00 Yeah.
1:02:01 Yeah, they're specializing hardware for transformers.
1:02:02 Yeah.
1:02:03 That’s pretty interesting.
1:02:04 Tesla obviously is trying to do it as well.
1:02:05 Yeah.
1:02:06 They’re not going to release consumer chips.
1:02:07 Yeah.
1:02:12 Listen, it's Amazon, Microsoft, OpenAI, Apple... Meta, I think, is, in some ways. Are they?
1:02:13 Is Meta doing their own?
1:02:14 Yeah.
1:02:15 I think they’re doing their own solution.
1:02:16 Yeah.
1:02:19 But they’re also at the same time chest beating about how many Nvidia systems they have coming
1:02:20 in.
1:02:21 Right.
1:02:23 Because that’s how you recruit talent now is by showing how much powder you have in
1:02:24 the keg.
1:02:25 Yeah.
1:02:26 How much compute we have.
1:02:27 Come play with us.
1:02:28 Yes.
1:02:29 You can fund your models.
1:02:30 Like that’s, they’re still very much chest beating.
1:02:31 And it’s one thing to say we’re going to make chips.
1:02:33 It’s another thing to actually design the chip.
1:02:35 It’s another thing to get the manufacturing up for the chip.
1:02:38 So that’s why I say in a few years time, yeah, tons of competition, but probably for the
1:02:40 next two or three, I think it’s all Nvidia.
1:02:41 Yeah.
1:02:42 Yeah.
1:02:43 Gosh.
1:02:44 Yeah.
1:02:47 I was talking to a buddy of mine over at Facebook, or Meta, and he was saying... he's in the
1:02:48 AI area over there.
1:02:52 And he was saying that when they greenlight something new and they want to go
1:02:57 build it, one, it depends largely on how dedicated and how excited Mark is about the project.
1:03:02 Yeah.
1:03:03 Sure.
1:03:06 Based on... because they were really smart and they bought a shit ton of Nvidia
1:03:11 chips back in the day. But there's still really high demand internally for who
1:03:12 gets what resources.
1:03:16 They're training Llama 3 right now, foundational models, but it's still training.
1:03:17 Yeah.
1:03:18 Yeah.
1:03:20 I've got to imagine if someone's like, hey, I got a cool idea for an AI girlfriend, they're like,
1:03:21 all right,
1:03:22 you get one processor cycle.
1:03:23 Yeah.
1:03:24 Exactly.
1:03:25 From two to three a.m.
1:03:26 And $10,000.
1:03:27 Yeah.
1:03:28 If you don’t figure it out, then you’re screwed.
1:03:29 So the bigs are just going to keep winning.
1:03:30 Oh yes.
1:03:31 Right.
1:03:35 Because it was $150 million to train GPT-4, or something like that. They came
1:03:36 out with the figures.
1:03:38 But there’s also, there’s a lot of open source models.
1:03:42 They have arena leaderboards for the competency of these models across math and creative and
1:03:43 reasoning and all this stuff.
1:03:44 That's Hugging Face.
1:03:45 Yeah.
1:03:46 Those are gamed, though.
1:03:48 I heard those leaderboards are kind of gamed.
1:03:49 They can be.
1:03:52 If they put some of the questions and some of the tests in the data set. But those get
1:03:53 sniffed out,
1:03:54 I think, pretty well by the community.
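That kind of sniffing out is usually a contamination check: look for long word shingles shared verbatim between a model's training corpus and the benchmark questions. A minimal sketch, assuming you had both texts in hand (the sample strings are invented):

```python
def ngrams(text, n=6):
    """All n-word shingles in a text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def contaminated(train_corpus, question, n=6):
    """Flag a benchmark question that shares any long shingle with training data."""
    return bool(ngrams(train_corpus, n) & ngrams(question, n))

train = ("the quick brown fox jumps over the lazy dog "
         "near the quiet riverbank at dawn")
leaked = "the quick brown fox jumps over the lazy dog"      # verbatim in training data
fresh = "what is the boiling point of water at sea level"   # never seen

print(contaminated(train, leaked), contaminated(train, fresh))
```

Real audits work on billions of tokens with hashed shingles, but the idea is the same: exact long overlaps between test questions and training text are the tell.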
1:03:59 Unless there's a real big outlier. But I feel like I've seen Hermes models and whatever else
1:04:01 that are fine-tuned by small teams.
1:04:05 And they always tout like, Hey, for every billion dollars or million dollars that this
1:04:09 company spent, we spent 10,000, which is still expensive, right?
1:04:11 But it’s starting to drop down quite a bit.
1:04:15 And something that Sam Altman said, which I think you believe in as well and have echoed
1:04:18 a lot, is that we are about to see the one-person billion-dollar company.
1:04:21 Well, that’s what I was going to say is the middle class of everything is being ripped
1:04:22 out.
1:04:23 Right.
1:04:24 And I don’t mean that just in AI.
1:04:25 I mean it across the board.
1:04:27 I think startup wise, you’re going to see the bigs win because they have the money to
1:04:28 do this.
1:04:32 And I don’t know for sure if scaling is going to work, but if scaling keeps working, it’s
1:04:36 going to cost hundreds of billions of dollars, trillions of dollars, ultimately no startups
1:04:38 going to get funded for that.
1:04:43 But the two-to-five-person startup that might be something unique, or a wrapper that is really
1:04:47 useful, a piece of UI... like, all the stuff we talked about is really interesting.
1:04:49 But there's a little bit of technical ability that goes into that.
1:04:53 You make the wrapper for FaceFusion, like your buddy who makes that app that kind of
1:04:54 switches bodies, right?
1:04:56 Or puts your head on things, headshots.
1:04:58 That feels like what’s going to happen.
1:05:01 I can’t see a world where there’s anybody unless it’s a grok where they’ve created an
1:05:06 entirely new piece of hardware where a startup really takes over from the bigger players
1:05:07 in the space.
1:05:08 Cause it’s like money.
1:05:10 It’s just, it’s expensive.
1:05:13 It's unlike almost... it's very different than Web 2.0, which felt like, when you were there,
1:05:15 it was like, hey, you could start it with five grand.
1:05:18 You get a server rack and you have a little bit of software, you do Ruby on Rails, and you're
1:05:19 up and running.
1:05:24 This is why I pass on a lot of ideas that I see because I look at them and I say, okay,
1:05:25 novel.
1:05:26 Yes.
1:05:27 I haven’t seen this before.
1:05:31 And the bigs will do this in about 10 minutes, at a scale that you can't.
1:05:32 Absolutely.
1:05:33 Yeah.
1:05:34 And will they, cause will they do AI girlfriends?
1:05:35 No.
1:05:38 So that’s interesting, but there’s so many other, anything.
1:05:40 Google is not going to be in the middle of AI girlfriends.
1:05:42 You know, not anytime soon.
1:05:43 Hey guys, I’m Gemini.
1:05:44 Oh, I love Gemini.
1:05:45 Yeah, exactly.
1:05:46 Here’s my PDF.
1:05:47 It could have been a great one, actually.
1:05:48 Let me chat with you.
1:05:50 But yeah, here’s one to watch.
1:05:51 John Carmack.
1:05:54 Oh, I was super fascinated by this.
1:06:04 He's got Keen Technologies, and, I don't know... he's 50-something now, probably.
1:06:05 Maybe older than that.
1:06:06 He must be in his sixties.
1:06:07 I bet.
1:06:08 Legend.
1:06:09 Yeah, I should get that guy.
1:06:10 So, oh, please.
1:06:11 He’d be great.
1:06:12 Please do.
1:06:13 Let me just sit in the back there.
1:06:14 I’ll just press my face to the glass.
1:06:15 I won’t make a noise.
1:06:16 You want to know I’m there.
1:06:20 I'm such a fanboy. And he said he had the fork in the road for his career after Meta,
1:06:24 and was like, I'm either going to go solve energy, like nuclear fusion or fission, one
1:06:25 of the two.
1:06:26 Yeah.
1:06:30 Or I’m going to take a crack at this AGI thing because I think he read a couple books
1:06:33 and he’s like, I think I have an approach that people don’t have right now.
1:06:34 I wouldn’t bet against that.
1:06:35 And I wouldn’t either.
1:06:39 And so he raised, I think like a million dollars and he’s got a cluster in his garage.
1:06:42 By the way, when he raises a million dollars, it means he had a couple of friends throw
1:06:43 a tiny little check.
1:06:44 Right.
1:06:45 It's not a real raise.
1:06:46 He sold some Thin Mints to his neighbor.
1:06:47 He was like, do you want these for a million?
1:06:48 And I'll give you two percent.
1:06:49 Yeah.
1:06:50 And I’m like, yes.
1:06:52 Like, he can self-fund this for a while.
1:06:56 I would have loved to get in on that, because... you don't bet against him, right?
1:06:57 And even he says, I don’t know.
1:06:58 I think I have a shot at it.
1:07:01 I believe him when he says that. Yeah, he's a genius coder.
1:07:04 That’s an example of like somebody literally in their garage.
1:07:05 Yes.
1:07:08 Like back in the day. And then you're talking about not preexisting paradigms that we understand
1:07:09 today, right?
1:07:13 But novel approaches to the technology that will forever change the industry if someone
1:07:14 gets it right.
1:07:15 Right.
1:07:17 Because if you’re going to play today’s game, it’s compute.
1:07:18 Yeah.
1:07:19 And that’s very expensive.
1:07:20 Yeah.
1:07:22 But it’s also, if you got into open AI very early on, that’s what you were betting on
1:07:23 then too, right?
1:07:27 Like, LLMs... nobody thought LLMs were anything until... I invested in OpenAI.
1:07:28 Did you really?
1:07:29 Yeah.
1:07:30 Wow.
1:07:31 Congratulations.
1:07:32 That’s awesome.
1:07:33 I mean, it was a few years ago, but it wasn’t like, it wasn’t that.
1:07:34 It wasn’t like the seed round.
1:07:35 Right.
1:07:36 Right.
1:07:37 Right.
1:07:38 It wasn’t early.
1:07:39 It wasn’t early.
1:07:40 But I mean, I'm doing okay.
1:07:41 You've got friends that are, like, MIT professors.
1:07:42 When I hear that, I’m like, cool.
1:07:43 I got a guy who makes mushroom chocolate.
1:07:44 He sells it on Instagram.
1:07:45 Let’s trade.
1:07:46 Let’s connect.
1:07:47 Let’s plug it right now.
1:07:48 Yeah.
1:07:49 Very different networks.
1:07:51 I'm not going to shout out his company, actually, but they are very good.
1:07:52 He puts Lion's Mane and Chaga along with it.
1:07:53 I could do it.
1:07:54 Wow.
1:07:55 Nice.
1:07:56 But the thing is, there's levels to it.
1:07:58 So, like, you're a little bummed you didn't get in on the earlier round of OpenAI.
1:07:59 And that's... sure.
1:08:00 Awesome.
1:08:02 And I feel that pain for you, but in a way I can’t feel.
1:08:03 What was that?
1:08:05 Well, you could if you did those chocolates.
1:08:06 That’s true.
1:08:07 You could feel the pain.
1:08:08 I would think I was OpenAI.
1:08:09 Yeah.
1:08:10 Exactly.
1:08:11 I am the Altman.
1:08:12 We’re all Altman.
1:08:17 I mean, we don’t have to include this or not, but like the investment situation with
1:08:21 OpenAI, was it complicated because of the nonprofit thing, now that it’s all blown up
1:08:22 and all that?
1:08:23 I didn’t even look at any of that.
1:08:24 So you were just like, here’s a check.
1:08:28 No, no, I had a friend that was doing the round. Not leading the round,
1:08:32 but he had a big piece of the round, and he was like, do you want a piece of this
1:08:33 as well?
1:08:34 I see.
1:08:35 So like a side deal for you.
1:08:36 You don’t even look at the term sheet.
1:08:38 You’re like, I’m just going along with whatever this person is doing because the lead sets
1:08:39 all the terms.
1:08:40 Got it.
1:08:43 So I’m just, I’m a small little check going right into a…
1:08:46 I will be the barnacle that rides on the ship.
1:08:47 Please go.
1:08:48 100%.
1:08:50 It was so wild that OpenAI, back in the day, like, their big magic trick, which let them
1:08:53 believe that they were on the right path, was predicting Amazon reviews.
1:08:59 So they trained it all on Amazon data and they were only trying to predict the next character,
1:09:02 not the next word, not the next sentence, not anything further.
1:09:07 Just can we start typing a review and hit a button and does it know the next letter?
1:09:09 And that was when they were like, we’ve got it.
1:09:10 Oh, wow.
1:09:13 It’s a brilliant moment to have and discover. Like, a single letter. That’s the dial-up
1:09:14 thing that I’m talking about.
1:09:15 Yeah.
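[Editor's note: the next-character trick described above can be sketched in miniature. This is purely an illustrative toy — a bigram frequency model, nothing like the actual neural network OpenAI trained — but it shows the idea: given the last character typed, guess the most likely next one.]

```python
from collections import Counter, defaultdict

def train_char_bigrams(text):
    """Count, for each character, which characters tend to follow it."""
    follows = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        follows[a][b] += 1
    return follows

def predict_next_char(follows, prefix):
    """Guess the most likely next character from the last character typed."""
    last = prefix[-1]
    if last not in follows:
        return None
    return follows[last].most_common(1)[0][0]

# Stand-in for "train on Amazon review data": a tiny review-like corpus.
model = train_char_bigrams("this product is great. this price is great.")
print(predict_next_char(model, "th"))  # 'i' — the only character seen after 'h'
```

A real model learns from the whole prefix, not just the last character, but the "goosebumps moment" is the same: the system completes text it was never explicitly taught.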
1:09:20 It’s just like, oh, that is happening across every industry right now, pharmaceutical, robotics,
1:09:23 education, hedge funds, all of the things.
1:09:24 Everybody’s having these goosebumps moments.
1:09:25 Yeah.
1:09:26 That’s exciting.
1:09:30 Well, especially when you hear about just novel protein discovery and things on.
1:09:31 Oh my God.
1:09:32 Yes.
1:09:34 That side of it where you’re like, okay, it’s actually, it’s not figuring things out.
1:09:40 It’s helping us sift through the existing data and come up with conclusions and
1:09:44 potential candidates for drugs, things that we would have just never seen or that would
1:09:45 have taken us forever.
1:09:48 It would have been tens of thousands of hours trying combinations to get to that point.
1:09:51 You guys, I want you to tell me a little bit about your podcast.
1:09:52 Let’s get into plug mode.
1:09:53 Yeah.
1:09:54 We’re like the promo pony.
1:09:55 No, in a real good way.
1:09:59 Cause, like, I haven’t gotten to watch every episode, but when I do
1:10:02 watch an episode, I always walk away with something where I’m like, I had never seen
1:10:03 that before.
1:10:04 Oh, that’s great.
1:10:05 And it’s also really funny.
1:10:06 Yeah.
1:10:07 Thank you.
1:10:08 So I love that.
1:10:12 I love it. And it’s just the two of you guys? We bring… I have a ring light
1:10:13 and a cell phone camera.
1:10:16 So it’s basically like this, production-wise.
1:10:17 Yeah.
1:10:18 Yeah.
1:10:21 Creatively, our goal with every episode, and it’s a tough tightrope to walk sometimes, is
1:10:23 to demystify the technology for a broad audience.
1:10:27 We want to be able to give you cutting edge information, show off the latest tools, but
1:10:32 help you understand how they work, how they could be integrated into your life.
1:10:35 And that’s our goal with every episode is to hopefully entertain, but give people the
1:10:38 latest information and paint it with some broad strokes.
1:10:43 And we’re really technology focused, but also not technology centric.
1:10:47 So I think part of it also is like talking about the stories in the news and how they
1:10:50 might affect a normal person and also somebody who’s curious about the stuff.
1:10:53 We obviously got very nerdy in this conversation, but whenever we do the podcast, we really
1:10:58 try to walk people through what the situation is, and we don’t assume that everybody understands.
1:11:01 A lot of the AI media out there is just, like, in the deep end right away.
1:11:04 We kind of want to be a place where people can land and could be like, I’m really curious
1:11:10 about the stuff, but I may not know what RAG is or I may not know what Stable Diffusion
1:11:11 even is.
1:11:14 We can kind of walk people through the stuff that allows them to dip their toe in, be super
1:11:16 curious and then maybe learn more if they want.
1:11:20 I love that because it is very intimidating for people that are just getting into it.
1:11:21 Yes.
1:11:25 Like you have to figure out what is the entry point where someone can feel comfortable and
1:11:30 I’ve had a lot of people be like, okay, I’ll use ChatGPT now, but I don’t know what all
1:11:31 this other stuff is.
1:11:32 Right.
1:11:36 So for me, I was writing down, okay, I’d heard of Pinokio before, but I haven’t played
1:11:37 with it yet.
1:11:38 But then you see it.
1:11:42 But then you were like, hey, it’s this easy because I’ve done package installing on Linux
1:11:48 and you have a lot of checking for dependencies and all that other shit and even before package
1:11:51 managers were a thing and it is a nightmare.
1:11:53 It’s a pain in the ass and then something breaks and you’re like, okay, where’s the
1:11:54 log file?
1:11:55 How do I see what the errors are?
1:11:59 And you’ve got 300 temp directories and portions of your hard drive that you didn’t know existed.
1:12:02 It sounds like it’s at the point where we can all play.
1:12:03 Yeah.
1:12:06 And that’s playing with cutting-edge tech. Cocktail Peanut,
1:12:09 again, the person that’s responsible for that, is constantly updating it with the latest stuff
1:12:10 that comes out.
1:12:14 So you get to play with cutting edge stuff, but in a safe environment that is more one
1:12:15 click.
1:12:16 Yeah.
1:12:17 That’s so awesome.
1:12:20 And so that’s what, yeah, our podcast is AI for Humans, and we’re weekly now and we use
1:12:22 AI to generate all sorts of crazy co-hosts.
1:12:27 We even created a special drink called Monster Milk that would feed our AIs, which makes
1:12:28 them go crazy.
1:12:29 Yeah.
1:12:33 So Kevin and I each week come up with a co-host, which is like a prompt that we create.
1:12:37 We had a woman who’s a PR expert who’s going to come and help us promote the show.
1:12:41 But in the prompt, I buried in there that she had a deal that had fallen apart with a thing
1:12:44 called Monster Milk before, which was just a dumb thing I thought about.
1:12:45 Okay.
1:12:46 It could have been, like, almost like a Red Bull.
1:12:50 She ran a Red Bull campaign and it didn’t do very well. In the middle of our conversation, which
1:12:51 all happens in real time,
1:12:54 we say, hey, why don’t we have her drink some Monster Milk?
1:12:58 Because the Monster Milk that came out of it was, it started off kombucha-like
1:13:02 and had some alcoholic characteristics and lead paint.
1:13:05 So we literally said, go drink six of these Monster Milks and come back to us.
1:13:09 And she came back fully drunk, performing in character.
1:13:10 Wow.
1:13:11 Yeah.
1:13:12 Yeah.
1:13:13 We were dying.
1:13:14 We were dying.
1:13:15 But you didn’t know the AI was going to do this.
1:13:16 No, we had no idea.
1:13:17 We had no idea.
1:13:18 And then we started to think, what else can we feed AIs?
1:13:19 That’s what makes it fun.
1:13:23 So if you could bring back somebody with AI and have a conversation with them, Kevin,
1:13:25 anybody in the pantheon of history.
1:13:26 We talked about this before.
1:13:28 You could even match up personalities.
1:13:29 Yeah.
1:13:31 And I’m going to give you more time to think about any answer.
1:13:32 What would you want?
1:13:33 Yeah.
1:13:34 You could mash two people together.
1:13:35 This is a true story.
1:13:37 I hit you guys up and this is not my own idea.
1:13:41 Obviously people thought about this and you guys have done it, but I want to bring back
1:13:46 dead people for the show, not in a way that is like comedy, but in a way that like brings
1:13:51 in their corpus of data and says, okay, I have a back catalog of this person’s thoughts,
1:13:56 their writings, their books, maybe if they were around during video, their interviews,
1:13:57 like things like that.
1:13:58 Certainly.
1:14:02 And then ask them questions in a modern setting and just be like, what is this person going
1:14:03 to say?
1:14:04 Yeah.
1:14:05 You could bring back Gandhi.
1:14:06 You could bring back Jesus.
1:14:07 You could bring back Mr. Rogers.
1:14:09 Just have a conversation with them.
1:14:10 You could do like, how do you think?
1:14:11 Every book ever written.
1:14:12 Every piece of art made.
1:14:14 Every song ever sung depending upon the personality.
1:14:17 You can feed that all into it as embeddings.
1:14:18 So it’s going to remember it.
1:14:21 You can fine-tune the model if you have interviews or any data.
1:14:22 So it’s going to speak like it.
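[Editor's note: the retrieval half of this idea — storing a person's corpus as embeddings and pulling the passages relevant to a question — can be sketched as a toy. This uses a crude bag-of-words "embedding" purely for illustration (real systems use learned embedding models), and the sample writings are placeholders.]

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words count (real systems use learned vectors)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(corpus, question, k=1):
    """Pull the k passages from the person's writings closest to the question."""
    q = embed(question)
    return sorted(corpus, key=lambda p: cosine(embed(p), q), reverse=True)[:k]

# Placeholder "back catalog" of a person's writings.
writings = [
    "an eye for an eye only ends up making the whole world blind",
    "the best way to find yourself is to lose yourself in the service of others",
]
print(retrieve(writings, "what do you think about service to others"))
```

In practice the retrieved passages get fed into the prompt of a model fine-tuned on the person's interviews, so the answer comes back in their voice.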
1:14:25 And then what I think is so fascinating is that you could also give it the manual to
1:14:31 an iPhone or give it the top 40 Wikipedia articles from the last two decades or whatever.
1:14:32 I’m sure they do this online already.
1:14:33 Yeah.
1:14:35 I’m sure there’s a tool for this, but I’m saying you could give that to it.
1:14:40 And so what would it be like if you were chatting with Gandhi about the iPhone and he was telling
1:14:43 you about his favorite apps or what app he would make.
1:14:44 That’s interesting.
1:14:45 It’s really interesting.
1:14:46 Yeah.
1:14:47 I think it’s definitely worth a try.
1:14:48 It’s controversial too though.
1:14:49 It is very controversial.
1:14:50 For sure.
1:14:53 But it’s one of those things where you have to imagine, like you guys have said
1:14:54 this entire time,
1:14:58 this is the worst the AI is ever going to be. In 10, 15 years from now,
1:15:00 We’re going to have conversations with our dead relatives.
1:15:02 We’re going to be like… My dad’s passed away.
1:15:05 I probably have 500 emails that are saved from him.
1:15:08 I have little clips of his voice and a few different little things.
1:15:11 I don’t think I could ever bring myself to do that, but I’ve heard about people doing
1:15:12 that.
1:15:13 Like people have done that.
1:15:17 I started collecting video and audio of my folks and saving their Facebook posts and
1:15:18 all that stuff.
1:15:23 So, like, I can talk to them someday if I need to. About Outback Steakhouse.
1:15:24 That’s the bulk of his Facebook check-ins.
1:15:25 Outback Steakhouse.
1:15:26 I love Outback.
1:15:27 So you and my dad would get along.
1:15:32 There are certain things that I will never give up and I used to go to Outback and just
1:15:36 get one of those frosty-ass beers and I love the Outback.
1:15:37 The Onions.
1:15:38 That’s it.
1:15:39 And if you want to sponsor the show, Outback.
1:15:40 Oh, that’s a good idea.
1:15:41 No.
1:15:42 It’s horrible food.
1:15:43 Well, we would say otherwise.
1:15:49 We will take whatever we – Outback, if you’re listening, El Toritos, we’re not above the
1:15:50 P.F. Changs.
1:15:51 Oh, El Toritos.
1:15:52 I’ll take a Panda Express.
1:15:53 I don’t care.
1:15:54 Do you take the Panda Express?
1:15:55 Dude, it’s good.
1:15:56 That’s great times.
1:15:57 I’m running the M2.
1:15:58 You’re running the M2.
1:15:59 That’s true.
1:16:00 Come on, man.
1:16:01 So let’s get you some more viewers.
1:16:04 Where can people go to subscribe to AI for Humans?
1:16:09 So our website is aiforhumans.show, but we’re on all socials as AI for Humans show.
1:16:11 You can get us at Spotify, Apple Podcast.
1:16:12 We’re on YouTube as well.
1:16:14 Yeah, but you’re doing video for every single episode.
1:16:15 Yeah.
1:16:16 Oh, yeah.
1:16:17 And we make YouTube-specific stuff.
1:16:18 Yeah.
1:16:19 So people can subscribe to the YouTube.
1:16:20 Yeah.
1:16:21 So what’s the actual URL for the YouTube?
1:16:22 YouTube.com/aiforhumanshow.
1:16:23 AI for Human Show.
1:16:24 Yeah.
1:16:25 Yeah.
1:16:26 They can find us.
1:16:27 They can see all of our uncensored AIs.
1:16:28 We bring them all to life.
1:16:29 It is hilarious.
1:16:30 Thank you, man.
1:16:31 Yeah, it is really well done.
1:16:32 We’re having fun with it.
1:16:33 We really are having fun with it.
1:16:34 And it’s fun to be upbeat.
1:16:35 We also have the other side.
1:16:38 We are not going to dive into it because we are well over time, but there’s a whole
1:16:44 side to this conversation that disagrees with every word any of us has said at this table,
1:16:45 angrily and violently.
1:16:48 But we have those perspectives on our podcast as well because I think there’s some validity
1:16:50 there and it’s important as well.
1:16:53 So we really try to like have fun, but also speak to both sides.
1:16:54 Balance it, yeah.
1:16:55 Yeah, we really try.
1:16:56 Yeah.
1:16:57 And also it’s just about getting people comfortable, right?
1:17:00 You don’t want to enter a situation where the conversation makes people recoil.
1:17:04 If you can make people feel excited about it and then understand that it’s not as bad
1:17:06 as what they might think it is, that helps.
1:17:07 Yeah.
1:17:08 Yeah.
1:17:12 Honestly, I really worry about it being this political weapon at some point. If it turns
1:17:17 into, like, well, I’m against AI if I’m on this side and I’m for it if I’m on that side,
1:17:19 oh my God, we’re going to be in a really bad place.
1:17:21 There’s already battles against woke AI.
1:17:22 I know.
1:17:23 Like it’s already gotten.
1:17:24 Well, that I can, I can see.
1:17:28 I personally believe that I want, I call it the Thanksgiving AI where we remove religion
1:17:30 and politics from the conversation.
1:17:31 Oh, interesting.
1:17:34 And so we train it on everything else and we say, just don’t bring up these things.
1:17:35 Don’t try to be too woke.
1:17:36 Don’t try to be.
1:17:42 I just feel like if it’s subjective, it should probably… I personally, from my AI, I don’t
1:17:46 want its subjective opinion of something. Like, I’d rather just have it as a tool.
1:17:47 I do.
1:17:48 See, this is the thing.
1:17:52 I really want a personality based AI and I think this is where, but this also why personal
1:17:53 AIs are going to be really interesting.
1:17:54 I would take it.
1:17:55 Personality is the point.
1:17:56 Personality is fun.
1:17:57 The opinion though.
1:17:58 Having a hard opinion on something.
1:17:59 Personality is opinion.
1:18:00 It can be.
1:18:01 I mean, you can just talk like a pirate and be neutral.
1:18:02 Here’s a question for you.
1:18:06 Can Google… is Google unbiased right now?
1:18:09 Is it unbiased? Because Google is our current answer engine, right?
1:18:12 Like, do you think that they can serve everybody?
1:18:13 That Google can serve everyone?
1:18:18 I think as long as you enter territory where it’s subjective, it’s not going to feel as
1:18:19 though it’s serving everyone.
1:18:22 It just won’t. And I just don’t think it’s going to be able to wade into politics and
1:18:23 religion and things.
1:18:28 I worry that people will take those narratives and turn them into bigger overarching stories
1:18:33 and then we’ll paint AI as being this thing that isn’t to be trusted.
1:18:34 It isn’t to be played with.
1:18:39 In reality, we should look at it as a work in progress, as filled with a bunch of errors
1:18:44 as we all are as humans, because that’s what it’s pulling from: this human data.
1:18:46 How could it not have errors? We’re error-prone.
1:18:48 Of course it’s going to be error-prone.
1:18:52 I just think if we go into it eyes wide open on that front, we’ll be in a much better spot.
1:18:57 But that’s until it takes us all out. And then, I’m going to tell you this last thing and we can
1:18:58 wrap up.
1:19:00 I always say thank you to the AI.
1:19:05 Even if it costs the extra tokens. And I even say, this is a true story.
1:19:09 I was with my wife the other night and I’m talking to it, because you can use the smart
1:19:12 button on your phone to launch ChatGPT, and I said, thank you.
1:19:15 I said, oh, and please remember, I always say thank you and don’t kill me when you become
1:19:16 sentient.
1:19:17 And it goes, huh?
1:19:23 And I was like, oh, shit, like it was like, yeah, we’re good.
1:19:27 My wife does work with Claude and she keeps promising it promotions if it does better
1:19:28 and it does.
1:19:31 So last night, she sent me a screen grab of her conversation. I was like, did it ask for
1:19:32 one?
1:19:33 Yeah, it asked about the promotion.
1:19:34 She’s like, what title would you like?
1:19:35 And it says, this is very interesting.
1:19:39 I think I would like to go with chief blah, blah, blah officer or whatever she’s like,
1:19:40 congratulations.
1:19:41 You got promoted.
1:19:42 Thank you so much, April.
1:19:43 I’m so excited to be working with you.
1:19:47 And I’m like, was it like chief of all humans?
1:19:48 Chief body disintegrator?
1:19:49 Yeah.
1:19:52 Like, what is that?
1:19:54 Well guys, thank you so much for being here.
1:19:55 This was awesome.
1:19:56 Thanks for coming back.
1:19:57 Yeah.
1:20:00 And this is really exciting to see, like really great stuff.
1:20:01 Yeah.
1:20:02 I mean, well, we’re living in an insane time.
1:20:03 Absolutely.
1:20:07 I feel like the whole spectrum of things: reevaluating who we are as humans, how we
1:20:10 interface with technology, how we consume stuff.
1:20:15 I’ve been doing a lot of shows around happiness, wellness, longevity, all things that
1:20:18 are going to be accelerated by AI in the next decade.
1:20:19 It’s been a lot of fun.
1:20:20 Yeah, totally.
1:20:21 Thanks for having us.
1:20:22 All right.
1:20:23 So AI for humans.
1:20:24 Go check it out and we’ll see you soon.

Kevin is joined by Kevin Pereira and Gavin Purcell, hosts of the ‘AI For Humans’ podcast, to discuss all you need to know about the future of AI. They talk about the transformative potential of AI, the current state of the technology, and the challenges and opportunities that lie ahead. They also share their thoughts on the impact of AI on various industries and aspects of society, and offer some predictions for the future.

Guest Bio and Links:

Kevin Pereira is a television host, producer, and personality, recognized for his work on G4’s “Attack of the Show!” With a passion for technology, gaming, and digital culture, Pereira has become a prominent figure in media, seamlessly blending entertainment with insightful discussions on tech trends. Beyond his hosting duties, Kevin is involved in podcasting, where he explores the intersections of technology and everyday life. His unique ability to demystify complex tech concepts while engaging a broad audience has solidified his status as a key voice in the digital age.

Gavin Purcell is an Emmy-winning showrunner, writer, & creative executive with a diverse background in media and technology. He’s known for his work on acclaimed productions like “The Tonight Show with Jimmy Fallon” & “I Love You America!” in addition to his work at Vox Media, G4TV, and numerous other outlets. In addition, he’s founded multiple companies in the emerging tech space, often infusing his media knowledge into new tech like Web3 or AI.

Kevin and Gavin co-host a podcast called AI For Humans where they demystify new AI technologies, highlight new tools & news for the AI-curious. 

Listeners can learn more about Kevin Pereira, Gavin Purcell, and AI For Humans at their website or on YouTube @AIForHumansShow.

Partners:

LMNT: Free sample pack of electrolyte drink mix with everything you need & nothing you don’t

NordVPN: Huge discount + 4 months free on my favorite VPN

Facet: Personalized Financial Planning + $250 enrollment fee waived

Notion: Try Notion AI for free

Resources:

Pinokio – Install, run & control servers on your computer with 1 click

Suno – Make a song about anything!

Udio – AI music generator

Show Notes: 

* (0:00) Introduction

* (1:00) LMNT: Electrolyte drink mix with everything you need & nothing you don’t. Get a free sample pack at kevinrose.com/lmnt 

* (2:20) NordVPN: Huge discount + 4 months free on my favorite VPN 

* (4:00) Kevin and Gavin introduce their history with technology and initial AI encounters

* (6:30) The current state of AI  

* (7:10) “Now is the worst this technology will ever be. It only gets better from here.” 

* (9:30) The capabilities of GPT-3 and other models 

* (17:30) Tools for AI music  

* (21:30) Boston Dynamics robot  

* (22:00) Figure 01

* (24:00) Personalized financial planning. Get your $250 enrollment fee waived at kevinrose.com/facet

* (26:00) Try Notion AI for free – kevinrose.com/notion  

* (27:10) Using AI to create comedy and musical content 

* (31:00) Pinokio: Install, run & control servers on your computer with 1 click – https://pinokio.computer/

* (39:00) Using AI to create character-driven content 

* (48:00) The legal and ethical implications of AI

* (50:00) Predictions for the future of AI

* (55:30) Sora is an AI model that can create realistic and imaginative scenes from text

* (1:09:30) AI For Humans Podcast  

Connect with Kevin:

Website:

https://www.kevinrose.com/

 

Instagram – @KevinRose

X – @KevinRose

YouTube – @KevinRose

This is a public episode. If you’d like to discuss this with other subscribers or get access to bonus episodes, visit www.kevinrose.com/subscribe
